How to use open-source models? #52
Comments
Thanks for the advice. Just trying to understand your needs better: do you want local open-source models because of concerns about GPT cost, for privacy reasons, or something else?
I think we should add this. It's an easy win and opens the door for many other efforts. @yuyuan223 @basicmi
More than that, I think we can play with different models in the same game and compare them.
I saw in the video the ability to select open-source models, but when I ran skyagi I found that it was not possible to choose one. Is it not supported yet?
We are working on local/cloud-hosted open-source LLM support, with the help of https://github.com/tensorchord/modelz-llm. It should be available next week.
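For context, modelz-llm exposes an OpenAI-compatible API, so a client can talk to a locally served model by pointing the OpenAI SDK at the local endpoint. The sketch below assumes a modelz-llm server is already running at http://localhost:8000 and uses the pre-1.0 openai Python package; the endpoint URL and model name are illustrative, not a confirmed SkyAGI configuration.

```python
# Sketch: query a locally hosted open-source model through an
# OpenAI-compatible server such as modelz-llm (openai<1.0 style API).
import openai

# Assumption: modelz-llm is serving at this address; local servers
# typically ignore the API key, but the client requires one to be set.
openai.api_base = "http://localhost:8000"
openai.api_key = "not-needed-locally"

response = openai.ChatCompletion.create(
    model="fastchat-t5-3b-v1.0",  # illustrative model name
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response["choices"][0]["message"]["content"])
```

Since the openai library also honors the OPENAI_API_BASE environment variable, setting that variable before launching SkyAGI may be enough to redirect its calls to the local server, depending on the SkyAGI version.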
I successfully ran SkyAGI with lmsys/fastchat-t5-3b-v1.0 on my PC, without a GPU. I have some workarounds to make it work.
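Not the exact workaround referenced above, but a minimal sketch of one way to run lmsys/fastchat-t5-3b-v1.0 on CPU, assuming the Hugging Face transformers and sentencepiece packages are installed:

```python
# Sketch: run lmsys/fastchat-t5-3b-v1.0 on CPU with transformers.
# FastChat-T5 is a T5-style seq2seq model, so we use AutoModelForSeq2SeqLM.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "lmsys/fastchat-t5-3b-v1.0"
# use_fast=False avoids known whitespace quirks with this tokenizer (assumption).
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)  # loads on CPU by default

inputs = tokenizer("Hello, who are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```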
I will write a doc for it.
Thank you @gaocegege for writing a doc, but so far I can't make it work: one command works, but the other does not.
Please add examples using local open-source models, like LLaMA or ChatGLM. Thanks.