-
Can you elaborate on how it is failing? I don't see any immediate issue in the logs you provided.
-
The latest updates to the LM Studio binaries, specifically the llama.cpp Vulkan runtime, are no longer loading LLMs as large as 50B/70B parameters (about 35-40 GB) properly on an AMD RX580 GPU; the load fails with an error message.
Previously, the same models ran perfectly.
Here are a few logs from when it was working properly and from when it stopped working:
Vulkan while working: vulkan-working.txt
Recent vulkan not working: vulkan-not-working.txt
Attached is the llama.cpp Vulkan build that works perfectly:
llama.cpp-linux-x86_64-vulkan-avx2-1.4.0.zip
I have already discussed this with the LM Studio devs, who suspect the issue is related to the llama.cpp Vulkan backend. Could you please help? Any advice is appreciated.
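One way to help narrow this down might be a minimal standalone loader built directly against the llama.cpp Vulkan build, to check whether the failure reproduces outside LM Studio. The sketch below is only an illustration under the assumption of a recent llama.cpp C API; function names have been renamed across versions (e.g. `llama_load_model_from_file` became `llama_model_load_from_file` in newer releases), so it may need adjusting to match the API shipped in the attached runtime:

```cpp
// minimal_load.cpp - minimal model-load test against libllama (Vulkan build).
// Sketch only: the llama.cpp C API changes between releases, so these names
// may need adjusting (e.g. llama_load_model_from_file vs llama_model_load_from_file,
// llama_free_model vs llama_model_free).
//
// Build (assumption, paths depend on your checkout):
//   g++ minimal_load.cpp -I<llama.cpp>/include -L<llama.cpp>/build -lllama -o minimal_load
#include <cstdio>
#include <cstdlib>
#include "llama.h"

int main(int argc, char ** argv) {
    if (argc < 2) {
        fprintf(stderr, "usage: %s <model.gguf> [n_gpu_layers]\n", argv[0]);
        return 1;
    }

    llama_backend_init();  // initializes the ggml backends, including Vulkan when built with it

    llama_model_params mparams = llama_model_default_params();
    // The RX580 has 8 GB of VRAM, so only part of a 35-40 GB model can be offloaded;
    // start small and raise this value to see where loading starts to fail.
    mparams.n_gpu_layers = (argc > 2) ? atoi(argv[2]) : 8;

    llama_model * model = llama_load_model_from_file(argv[1], mparams);
    if (model == nullptr) {
        fprintf(stderr, "model load FAILED for %s\n", argv[1]);
        llama_backend_free();
        return 1;
    }

    fprintf(stderr, "model loaded OK with n_gpu_layers=%d\n", mparams.n_gpu_layers);
    llama_free_model(model);
    llama_backend_free();
    return 0;
}
```

If this small loader fails when linked against the same Vulkan runtime that LM Studio now ships but succeeds against the attached 1.4.0 build, that would point at the backend rather than LM Studio; its output can also be compared against vulkan-working.txt and vulkan-not-working.txt to see where loading diverges.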