Runs but does not respond [BUG] #76
Comments
@luaonze We have noticed that when using Ollama, all the details related to function calling are dropped before they reach the intended LLM. There is a section in our README that shows how to set up Ollama correctly. Have you tried it?
Once you do set this up successfully, I suspect you will run into this: #75
I tried Ollama, but my interface is not the same as others'; in particular, the robot icon is completely different. I asked questions locally, but there seems to be no response. I saw the solution you sent me and asked the model to perform an operation, but it didn't reply at all. I uninstalled and reinstalled many times, but it still didn't work.
So you have a problem with the Ollama setup then, @luaonze? If that's the case, there is not much I can do.
I think it might be, but I deployed it on several computers and got the same result.
I think my model is not running. The llama3 model that I use with Ollama inside Docker may not actually be running.
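For readers hitting the same symptom, one quick way to check whether a local Ollama instance is actually reachable and serving the model is to query its `/api/tags` endpoint. This is a minimal sketch assuming Ollama's default port 11434; if your Docker container maps a different port, adjust the host accordingly.

```python
import json
import urllib.request


def ollama_model_available(name: str, host: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server is reachable and has pulled `name`."""
    try:
        # /api/tags lists the locally available models on an Ollama server.
        with urllib.request.urlopen(f"{host}/api/tags", timeout=3) as resp:
            tags = json.load(resp)
    except OSError:
        # Connection refused / timed out: the server (or the Docker
        # port mapping) is not up, which matches the symptom above.
        return False
    # Model names include a tag suffix, e.g. "llama3:latest".
    return any(m.get("name", "").startswith(name) for m in tags.get("models", []))


if __name__ == "__main__":
    print("llama3 available:", ollama_model_available("llama3"))
```

If this prints `False`, the problem is the Ollama deployment itself (container not running or port not exposed) rather than this project.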
@luaonze Hi, was this issue resolved in the end?
This agent does not support open-source LLMs that do not implement function calling...
Seems to have gone stale; reopen if needed.
It runs but it doesn't respond, and my interface is different from others'.
After I run llama3 with Ollama locally, the page opens like in the screenshot, which is completely different from what others see. I don't know what to do. I have tried many request methods, but they all fail.
There is nothing wrong with my run command, so I don't know where the problem is. This problem has troubled me for 3 days. Please help me; you have my most sincere thanks.