
MPI run on M1 Max #4244

Closed
ageorgios opened this issue Nov 28, 2023 · 3 comments

@ageorgios

Prerequisites

Please answer the following questions for yourself before submitting an issue.

  • [YES] I am running the latest code. Development is very rapid so there are no tagged versions as of now.
  • [YES] I carefully followed the README.md.
  • [YES] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • [YES] I reviewed the Discussions, and have a new bug or useful enhancement to share.

Expected Behavior

Run the model split into 2 chunks (MPI ranks) on the same CPU/GPU.

Current Behavior

ERRORS:
GGML_ASSERT: llama.cpp:8672: false && "not implemented"
GGML_ASSERT: llama.cpp:5443: false && "not implemented"

Environment and Context

MacBook with M1 Max, 32 GB RAM

Steps to Reproduce

  1. Create a hostfile with the following content:
     127.0.0.1:2
  2. mpirun -hostfile hostfile -n 2 ./main -m /Users/ageorgios/.ollama/models/blobs/sha256:29fdb92e57cf0827ded04ae6461b5931d01fa595843f55d36f5b275a52087dd2 -n 128 &> ../output.txt
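For completeness, a minimal sketch of the MPI-enabled build assumed before step 2 (the LLAMA_MPI Makefile flag follows the MPI instructions in the llama.cpp README of that period; installing Open MPI via Homebrew is an assumption, not something stated in this report):

  brew install open-mpi                    # assumption: Open MPI from Homebrew provides mpicc/mpicxx/mpirun
  make clean
  make CC=mpicc CXX=mpicxx LLAMA_MPI=1     # build ./main with MPI support enabled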

Failure Logs

output.txt

@vvsotnikov

Unfortunately, this is expected. You could try an older version of llama.cpp, from before #3228 was merged.
MPI support should be fixed in #3334; more context in #3752.
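In practice, checking out the tree from just before that merge could look roughly like this (a sketch only; the --grep pattern and the assumption that it matches the single squash/merge commit of #3228 are not from this thread):

  cd llama.cpp
  # find the commit that merged PR #3228, then check out its first parent
  git checkout "$(git log -n 1 --grep='#3228' --format=%H)^"
  make clean && make CC=mpicc CXX=mpicxx LLAMA_MPI=1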

@github-actions github-actions bot added the stale label Mar 19, 2024

github-actions bot commented Apr 3, 2024

This issue was closed because it has been inactive for 14 days since being marked as stale.

@github-actions github-actions bot closed this as completed Apr 3, 2024
@SVyatoslavG

This is still an issue. Please reopen.
