Connect the AI Proxy Wasm Plugin to Ollama #956
Comments
I'd like to claim this task. Could you please help assign it to me? @CH3CHO
@CH3CHO Hi, it seems that Ollama can only run locally, with its API called through localhost? I have already developed and tested a version that works against a locally deployed model, but I'm not sure whether local-only operation is actually the case, and I'd like to confirm this with you.
How about trying this? ollama/ollama#1179
After testing, the current integration has problems and does not work properly. We hope interested community members will help fix it.
Hi, Ollama does not provide a centralized hosted server; you need to download and run the Ollama server locally (https://github.com/ollama/ollama), then set ollamaServerHost and ollamaServerPort in the ai-proxy configuration file to the IP address and port where the service is running. I wonder whether this is what caused the problem.
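For reference, a minimal sketch of what this configuration might look like. The field names ollamaServerHost and ollamaServerPort come from the comment above; the surrounding provider block, the example address, and the values are assumptions about the plugin's provider-style layout, not a confirmed schema:

```yaml
# Hypothetical ai-proxy plugin configuration for an Ollama backend.
# ollamaServerHost / ollamaServerPort are the field names mentioned above;
# the provider wrapper and the example values are assumptions.
provider:
  type: ollama
  ollamaServerHost: 192.168.0.10   # address where the Ollama server is reachable
  ollamaServerPort: 11434          # Ollama's default listening port
```

Note that Ollama binds to 127.0.0.1 by default, so when the gateway runs on a different host it may be necessary to make the Ollama server listen on a non-loopback address (e.g. via the OLLAMA_HOST environment variable, as discussed in ollama/ollama#1179).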
After verifying again, the problem was with my earlier testing procedure. The plugin now works correctly. Please ignore my previous comment. Sorry about that.