what(): Cannot create socket #87

Open · Slaghton opened this issue Jun 2, 2024 · 2 comments

Slaghton commented Jun 2, 2024

I got local inference working, but when I try to use workers I get this error:

dllama inference --model dllama_model_tinyllama_1_1b_3t_q40.m --tokenizer dllama_tokenizer_tinyllama_1_1b_3t_q40.t --buffer-float-type q80 --prompt "Hello world" --steps 16 --nthreads 4 --workers 192.168.0.1:9998

terminate called after throwing an instance of 'std::runtime_error'
what(): Cannot create socket

My worker is, I believe, happily running on the same computer. Just to test whether I would get any response from it, I tried having SillyTavern connect to it; that obviously failed, but it also crashed the worker, so the port is definitely reachable.

C:\SWARM\distributed-llama>dllama worker --port 9998 --nthreads 4
Listening on 0.0.0.0:9998...
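
For anyone debugging this, a minimal standalone Winsock probe can show whether socket creation and the TCP connection to 192.168.0.1:9998 work at the OS level outside dllama. This is only a sketch, not dllama code; it assumes the root process opens a plain TCP socket to each worker, and the address and port are taken from the command above. One Windows-specific detail: socket() fails until WSAStartup() has been called in the process, so a missing Winsock initialization would surface as exactly this kind of "cannot create socket" failure.

// probe_socket.cpp - standalone Winsock connectivity check (illustrative only,
// not part of dllama). Build with MSVC: cl probe_socket.cpp
#include <winsock2.h>
#include <ws2tcpip.h>
#include <cstdio>
#pragma comment(lib, "ws2_32.lib")

int main() {
    WSADATA wsa;
    // On Windows, socket() returns INVALID_SOCKET (WSANOTINITIALISED)
    // unless WSAStartup() has been called first.
    int err = WSAStartup(MAKEWORD(2, 2), &wsa);
    if (err != 0) {
        std::printf("WSAStartup failed: %d\n", err);
        return 1;
    }

    SOCKET s = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP);
    if (s == INVALID_SOCKET) {
        std::printf("socket() failed: %d\n", WSAGetLastError());
        WSACleanup();
        return 1;
    }

    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(9998);                        // worker port from the report
    inet_pton(AF_INET, "192.168.0.1", &addr.sin_addr);  // worker address from the report

    if (connect(s, (sockaddr*)&addr, (int)sizeof(addr)) == SOCKET_ERROR) {
        std::printf("connect() failed: %d\n", WSAGetLastError());
    } else {
        std::printf("TCP connection to 192.168.0.1:9998 succeeded\n");
    }

    closesocket(s);
    WSACleanup();
    return 0;
}

If socket() already fails here, the problem is at the OS/Winsock level rather than in dllama; if socket() and connect() both succeed, the issue is inside dllama's own connection setup.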

b4rtaz (Owner) commented Jun 4, 2024

@Slaghton could you pull the latest changes and try again?

coljac commented Aug 3, 2024

I have the same issue, on the latest version as of Aug 3.

terminate called after throwing an instance of 'std::runtime_error'
  what():  Cannot read magic value
[1]    4141724 IOT instruction (core dumped)  ./dllama inference --steps 64 --prompt "Hello world" --model  --tokenizer    

I can start a worker; it dies as above when a connection is made.

This happens irrespective of the model used, and I have 4 A100s.
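
For context, a "Cannot read magic value" error usually comes from a file-header sanity check along the lines of the sketch below. This is illustrative only, not dllama's actual loader; MAGIC_EXPECTED and checkMagic are made-up names, and the constant is a placeholder. Such a check fails when the first bytes of the model file cannot be read or do not match the expected format tag. Note that the --model and --tokenizer values appear blank in the pasted command, though that may just be redaction.

// Illustrative file-header check, not dllama's actual loader code.
// Names (MAGIC_EXPECTED, checkMagic) are hypothetical.
#include <cstdint>
#include <cstdio>
#include <stdexcept>

constexpr uint32_t MAGIC_EXPECTED = 0x44434241; // placeholder "ABCD" tag

void checkMagic(const char* path) {
    FILE* f = std::fopen(path, "rb");
    if (!f)
        throw std::runtime_error("Cannot open file");

    uint32_t magic = 0;
    // A truncated or wrong-format file yields fewer than sizeof(magic)
    // readable bytes, or a non-matching tag, so the check below fails.
    size_t items = std::fread(&magic, sizeof(magic), 1, f);
    std::fclose(f);
    if (items != 1 || magic != MAGIC_EXPECTED)
        throw std::runtime_error("Cannot read magic value");
}

int main(int argc, char** argv) {
    if (argc < 2) {
        std::printf("usage: %s <model-file>\n", argv[0]);
        return 1;
    }
    // Left uncaught on purpose: an uncaught std::runtime_error produces the
    // "terminate called after throwing an instance of 'std::runtime_error'"
    // output shown in the report above.
    checkMagic(argv[1]);
    std::printf("magic value OK\n");
    return 0;
}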
