It still does not work with some GGML v3 models #11
Comments
@idobric, it seems like there are new updates again 😅 The main branch is pointing to GGML v2, I guess.
@idobric, I updated …
Hi, and thank you very much for doing this. What's really hard about all these breaking changes is that they never keep backward compatibility with previous model versions.

Your bindings are supported in my gpt4all-ui. It would be cool if you could test it and update the binding part whenever you change yours. Your bindings have -a at the end of the name (for abdeladim), since I also support other developers' bindings. Marella has also made some good bindings, and I add -m to distinguish his from yours. I will probably change the bindings list to show a card with who built each binding and a link to their repo, to show some love.

I have created a binding template to show how to interface any new binding with the app. It is very easy: just implement a class and you're good to go (see the sketch below). I finally made a video to show the basic use of the UI. I hope you like it:
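To illustrate the "just implement a class" idea, here is a minimal sketch. LLMBinding, generate(), and the constructor arguments are placeholder names, not the actual gpt4all-ui binding template.

```python
# Hypothetical sketch of a UI binding class; the names are placeholders and
# do not come from the real gpt4all-ui template.

class LLMBinding:
    """Minimal interface a text-generation backend could expose to the UI."""

    def __init__(self, model_path: str):
        # The UI hands the binding the model file selected by the user.
        self.model_path = model_path

    def generate(self, prompt: str, n_predict: int = 128, callback=None) -> str:
        """Produce text for `prompt`, optionally streaming chunks via `callback`."""
        raise NotImplementedError("implement this with your backend of choice")


class EchoBinding(LLMBinding):
    """Trivial backend whose only job is to show the shape of an implementation."""

    def generate(self, prompt: str, n_predict: int = 128, callback=None) -> str:
        text = prompt[:n_predict]  # a real binding would call its model here
        if callback:
            callback(text)
        return text
```

A real binding would replace the body of generate() with calls into whichever backend it wraps.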
@ParisNeo, yeah, that's the big problem: it's hard to know which version goes with which model, especially with the myriad of …

So finally the 2.0 release of …

Yeah, I noticed that you support my bindings. Glad you found them useful. Keep up the good work 💪
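On telling the versions apart: a small sketch that peeks at the file header. The magic constants below are assumed from llama.cpp's legacy ggml/ggmf/ggjt loaders, so treat the output as a hint rather than a guarantee.

```python
# Sketch: guess a model file's GGML container and version from its first bytes.
# Magic values are assumed from llama.cpp's legacy file loaders.
import struct
import sys

MAGICS = {
    0x67676D6C: "ggml (unversioned)",
    0x67676D66: "ggmf (versioned)",
    0x67676A74: "ggjt (versioned, mmap-able)",
}

def inspect_header(path: str) -> None:
    with open(path, "rb") as f:
        (magic,) = struct.unpack("<I", f.read(4))
        name = MAGICS.get(magic, f"unknown magic 0x{magic:08x}")
        if magic in (0x67676D66, 0x67676A74):
            (version,) = struct.unpack("<I", f.read(4))
            print(f"{path}: {name}, version {version}")
        else:
            print(f"{path}: {name}")

if __name__ == "__main__":
    inspect_header(sys.argv[1])
```

A ggjt file that reports version 3 is what usually gets called a GGML v3 model.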
Thanks man :)
For example:
https://huggingface.co/TheBloke/gpt4-alpaca-lora-30B-4bit-GGML/tree/main
The main branch should have GGML v3, but Python crashes when trying to run this model.
I can run it using the llama_cpp package.
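For reference, loading it through llama_cpp (llama-cpp-python) looks roughly like this; the file name is an example placeholder, and this assumes a version of the package from that period that still reads GGML files rather than GGUF.

```python
# Sketch: loading a GGML v3 (ggjt v3) file with llama-cpp-python.
# The model path is a placeholder; adjust it to the downloaded file.
from llama_cpp import Llama

llm = Llama(model_path="./models/gpt4-alpaca-lora-30B.ggmlv3.q4_0.bin")
out = llm("### Instruction:\nSay hello.\n\n### Response:\n", max_tokens=32)
print(out["choices"][0]["text"])
```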
Thanks