Hi, it is possible if you use the ollama binding. You can install ollama on your PC, and I think they already support ROCm on Linux.
That may be true, but none of the models I'm trying to run seem to work with that particular binding. Do you know of any other binding that might be able to take advantage of the GPU? The best example I have is GPT4All: it uses my GPU fine, but it lacks a mountain of features that this project has and that I'm hoping to get working.
Well, ROCm support is really limited on most platforms. I think you may use the Hugging Face binding and then manually install PyTorch with ROCm support; after that you can use all Hugging Face transformers models, as in the sketch below.
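For reference, here is a minimal sketch of that setup, assuming you have installed the ROCm wheel of PyTorch from the pytorch.org install selector (the exact index URL depends on your ROCm version) and that `transformers` is installed; the model name is just a placeholder:

```python
# Minimal sketch: verify the ROCm build of PyTorch sees the GPU, then run a
# Hugging Face transformers model on it. ROCm builds of PyTorch expose the
# GPU through the regular torch.cuda API (via HIP).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

print(torch.__version__)           # ROCm builds usually carry a "+rocmX.Y" suffix
print(torch.cuda.is_available())   # should be True if the ROCm device is visible

model_id = "gpt2"  # placeholder; substitute the model you actually want to run
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16
).to("cuda")

inputs = tokenizer("Hello", return_tensors="pt").to("cuda")
out = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```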
I can't see any way to get this working with ROCm. Is this possible?