
It runs but there is no response [BUG] #76

Closed
luaonze opened this issue Jul 5, 2024 · 9 comments
Labels
bug Something isn't working triaged

Comments

@luaonze

luaonze commented Jul 5, 2024

It runs, but it doesn't respond, and my interface looks different from everyone else's.
[Screenshot 2024-07-06 00 23 41]
After running llama3 locally with ollama, the page opens as in the screenshot, which is completely different from what others see. I don't know what to do; I have tried many request methods, but they all fail.
[Screenshot 2024-07-06 00 25 36]
There is nothing wrong with my run command, so I don't know where the problem is. It has troubled me for three days. Please help me; you have my most sincere thanks.

@luaonze luaonze added the bug Something isn't working label Jul 5, 2024
@teaxio
Collaborator

teaxio commented Jul 5, 2024

@luaonze we have noticed that when using ollama, all the details related to function calling are dropped. Meaning ollama strips them before they reach the intended LLM. There is a section in our README that shows how to set up ollama correctly. Have you tried it?
Another point is that the LLM you use must have function-calling capability, as that is what this project relies on.
However, even with that, there is limited success, I suspect due to the verbosity of the planner. We need community members to help us make this work with local LLMs; we have not had time to free up cycles for it :(
Please let me know about the above questions.
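To see whether the function-calling details survive, it can help to test the model directly, outside the agent framework. The sketch below builds an OpenAI-style tool-calling request body of the kind Ollama's OpenAI-compatible `/v1/chat/completions` endpoint accepts; the `get_current_weather` function is a hypothetical example, and a local server at `localhost:11434` with `llama3` pulled is assumed. If the model never returns a `tool_calls` entry for a request like this, the problem is in the model/Ollama layer, not this project.

```python
import json

def make_weather_tool():
    """Return an OpenAI-style tool definition (the format Ollama's
    OpenAI-compatible endpoint accepts for models that support tools)."""
    return {
        "type": "function",
        "function": {
            "name": "get_current_weather",  # hypothetical example function
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }

def build_request(model="llama3"):
    """Assemble the JSON body for a tool-calling chat request."""
    return {
        "model": model,
        "messages": [
            {"role": "user", "content": "What is the weather in Paris?"}
        ],
        "tools": [make_weather_tool()],
    }

if __name__ == "__main__":
    # To actually send it (requires a running Ollama server), pipe the
    # printed JSON to:
    #   curl http://localhost:11434/v1/chat/completions \
    #     -H "Content-Type: application/json" -d @-
    print(json.dumps(build_request(), indent=2))
```

A model that supports tools should answer with a `tool_calls` array naming `get_current_weather`; a plain-text answer (or silence) reproduces the symptom described in this issue.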

@teaxio teaxio added the triaged label Jul 5, 2024
@teaxio
Collaborator

teaxio commented Jul 5, 2024

Once you do get this set up successfully, I suspect that you will run into #75.
As indicated in that bug, I welcome ideas on how to defeat it.

@luaonze
Author

luaonze commented Jul 6, 2024

I tried ollama, and now my interface is the same as everyone else's; before, even the robot icon was completely different. I asked questions locally, but there seems to be no response.

I also saw the solution you sent me, but when I asked the model to perform an operation, it didn't reply to me at all. I uninstalled and reinstalled many times, but it still didn't work.

@teaxio
Collaborator

teaxio commented Jul 6, 2024

So you have a problem with your Ollama setup then, @luaonze? If that's the case, there is not much I can do.

@luaonze
Author

luaonze commented Jul 7, 2024

I think it might be, but I deployed it on several computers and got the same result.

@luaonze
Author

luaonze commented Jul 7, 2024

I think my model is not running. The llama3 model I use, under ollama inside Docker, may not actually be running.
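A quick way to check whether a Docker-hosted Ollama instance is actually serving the model is to query it directly; this is a sketch, and the container name `ollama` and the default port 11434 are assumptions about this particular setup:

```shell
# Confirm the Ollama container is up
docker ps --filter "name=ollama"

# List the models this Ollama instance has pulled (container name assumed)
docker exec ollama ollama list

# Hit the HTTP API directly; a JSON list of models means the server is reachable
curl -s http://localhost:11434/api/tags

# Send a one-off prompt to llama3; no output here means the model itself
# is not responding, independent of the web UI
curl -s http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "hello", "stream": false}'
```

If the last command hangs or errors, the problem is between Docker and Ollama rather than in this project.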

@liuyanmei22

@luaonze Hello, was this problem ever solved?

@bolt163

bolt163 commented Aug 1, 2024

This agent does not support open-source LLMs that do not implement function calling...
This project builds on the AutoGen agent framework, which relies on the LLM in use supporting function calling.
#28

@teaxio
Collaborator

teaxio commented Nov 4, 2024

Seems to have gone stale; reopen if needed.

@teaxio teaxio closed this as completed Nov 4, 2024