Replies: 1 comment
-
I made a foolish mistake, but fortunately I managed to solve the problem. It turns out I hadn't read the instructions carefully. To avoid making the same mistake I did, all you need to do is install the exllama package. That's all there is to it.
-
Traceback (most recent call last):
File "/home/quokka/text-generation-webui/server.py", line 62, in load_model_wrapper
shared.model, shared.tokenizer = load_model(shared.model_name, loader)
File "/home/quokka/text-generation-webui/modules/models.py", line 65, in load_model
output = load_func_map[loader](model_name)
File "/home/quokka/text-generation-webui/modules/models.py", line 275, in ExLlama_loader
from modules.exllama import ExllamaModel
File "/home/quokka/text-generation-webui/modules/exllama.py", line 9, in <module>
from generator import ExLlamaGenerator
ModuleNotFoundError: No module named 'generator'
I'm having trouble loading GPTQ models with ExLlama in oobabooga's text-generation-webui. Whenever I attempt to load a model, an error message like the one above is displayed. I'm not sure how to resolve this issue.
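For anyone hitting a similar ModuleNotFoundError, a quick way to confirm whether the missing module is reachable before launching the webui is Python's importlib machinery. This is just a diagnostic sketch: the module name 'generator' is taken from the traceback above, and the assumption (per the reply) is that it becomes importable once the exllama package is installed.

```python
import importlib.util

def module_available(name: str) -> bool:
    """Return True if `name` can be imported from the current sys.path."""
    return importlib.util.find_spec(name) is not None

# 'generator' is the module the traceback reports as missing; if this
# prints a warning, exllama is not installed on the Python environment
# that runs server.py.
if not module_available("generator"):
    print("exllama's 'generator' module is not on sys.path; install exllama first")
```

Running this inside the same environment (or conda env) that starts server.py tells you whether the error is a missing package or a wrong-environment problem.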