
AI 代理 Wasm 插件对接 Ollama (Integrate the AI Proxy Wasm plugin with Ollama) #956

Closed · Tracked by #940
CH3CHO opened this issue May 15, 2024 · 6 comments · Fixed by #1001

CH3CHO (Collaborator) commented May 15, 2024

https://github.com/ollama/ollama

CH3CHO changed the title from "Ollama(链接)" to "AI 代理 Wasm 插件对接 Ollama" May 15, 2024
@CH3CHO CH3CHO added type/enhancement New feature or request good first issue Good for newcomers help wanted Extra attention is needed level/normal area/plugin sig/wasm labels May 15, 2024
@github-project-automation github-project-automation bot moved this to Todo in Higress May 15, 2024
Claire-w (Contributor)

I'd like to take this task. Please help assign it to me, @CH3CHO.

Claire-w added a commit to Claire-w/higress that referenced this issue May 23, 2024
Claire-w (Contributor)

@CH3CHO Hi, it seems that Ollama can only run locally, with its API called via localhost? I have already developed and tested a version that works with a locally deployed model, but I'm not sure whether local-only operation is a hard constraint, and wanted to confirm with you.

CH3CHO (Collaborator, Author) commented May 25, 2024

> @CH3CHO Hi, it seems that Ollama can only run locally, with its API called via localhost? I have already developed and tested a version that works with a locally deployed model, but I'm not sure whether local-only operation is a hard constraint, and wanted to confirm with you.

Could you try this? ollama/ollama#1179
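For context, the linked issue is about exposing Ollama beyond localhost. A minimal sketch, assuming a stock Ollama install: the bind address is controlled by the documented OLLAMA_HOST environment variable (11434 is Ollama's default port).

```shell
# Bind the Ollama server to all interfaces instead of 127.0.0.1,
# so a gateway running on another host can reach its API.
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# From another machine, the API is then reachable at
# http://<server-ip>:11434 (e.g. the /api/chat endpoint).
```

With this in place, the plugin does not need Ollama to be on the same machine; it only needs network reachability to the configured host and port.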

@github-project-automation github-project-automation bot moved this from Todo to Done in Higress May 28, 2024
@CH3CHO CH3CHO reopened this Nov 10, 2024
CH3CHO (Collaborator, Author) commented Nov 10, 2024

Per my testing, the current integration has problems and does not work correctly. Community members who are interested are welcome to help fix it.

Claire-w (Contributor)

> Per my testing, the current integration has problems and does not work correctly. Community members who are interested are welcome to help fix it.

Hi, Ollama does not have a centralized server. You need to download and run the Ollama server locally (https://github.com/ollama/ollama), and then set ollamaServerHost and ollamaServerPort in the ai-proxy configuration to the IP address and port where the service is running. I wonder if that is what is causing the problem.
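As a sketch, the setup described above would correspond to an ai-proxy provider configuration along these lines. The key names ollamaServerHost and ollamaServerPort come from the comment; the surrounding structure and the host value are assumptions (11434 is Ollama's default port).

```yaml
provider:
  type: ollama                    # route requests to an Ollama backend
  ollamaServerHost: 192.168.2.16  # host where `ollama serve` is running (example value)
  ollamaServerPort: 11434         # Ollama's default listening port
```

The configured host must be reachable from the gateway, so a localhost address only works when Ollama runs on the same machine as the gateway itself.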

CH3CHO (Collaborator, Author) commented Nov 11, 2024

> Hi, Ollama does not have a centralized server. You need to download and run the Ollama server locally.

After verifying again, it was a problem with my earlier test procedure. The plugin works correctly now. Please disregard my previous comment. Sorry for the confusion.
