Incompatible with macOS #773
Comments
I'm having the same problem and have no idea how to fix it!

@artursapek: Thanks for the help! I will need to spend more time on this, but any help from the FOSS community and FB would be very welcome. Thanks!

WIP PR for Apple support: #504. You can either wait for it to be merged or use the code from that PR. Note that you can't use MPS yet.

@subramen: What about support for Intel Macs? Thanks!

@shyamalschandra yeah, you should be able to run it on the CPU with that PR.

Have you tried it yourself on an Intel macOS machine?
I encounter a similar error on a MacBook Air M1:

Traceback (most recent call last):
  File "/Users/Me/survey/LLM/llama-main/example_text_completion.py", line 4, in <module>
    import fire
ModuleNotFoundError: No module named 'fire'

But I do have the fire module:

$ pip show fire
Name: fire
Version: 0.5.0
Summary: A library for automatically generating command line interfaces.
Home-page: https://github.com/google/python-fire
Author: David Bieber
Author-email: [email protected]
License: Apache Software License
Location: /Users/Me/survey/LLM/LLM_env/lib/python3.11/site-packages
Requires: six, termcolor
Required-by: llama

So this might be related to the fire module not supporting macOS?
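A more likely cause than missing macOS support is an environment mismatch: torchrun may be launching a different Python interpreter than the one whose site-packages contains fire. A minimal diagnostic sketch (the venv path in the comment is just the one from the pip show output above, used as an illustration):

```python
import importlib.util
import sys

# The interpreter actually executing this script. If torchrun was installed
# into a different environment than fire (e.g. not the LLM_env venv shown
# by `pip show fire`), this path will not match that Location.
print("interpreter:", sys.executable)

# Where (if anywhere) this interpreter would import fire from.
spec = importlib.util.find_spec("fire")
print("fire:", spec.origin if spec else "not importable from this interpreter")
```

If the two paths disagree, installing fire into the same environment that provides torchrun (for example with `python -m pip install fire` from that environment) should make the import succeed; the traceback itself says nothing about macOS compatibility.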
Hi,
I just ran the code with torchrun after pip3 install -e ., and this is what I got:
Can you fix this problem ASAP? Also, I don't have a graphics card that is compatible with CUDA. Are you going to release a version for OpenCL/Vulkan/Metal? Thanks again!
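For what it's worth, PyTorch's route to Apple GPUs is its MPS (Metal) backend rather than OpenCL or Vulkan. A hedged sketch of the usual backend-selection fallback, assuming a PyTorch build recent enough to expose torch.backends.mps:

```python
import torch

# Pick the best available backend: CUDA on NVIDIA GPUs, MPS (Metal) on
# Apple Silicon where supported, otherwise plain CPU.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif getattr(torch.backends, "mps", None) is not None and torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")

print("using device:", device)
```

Note that, per the comments above, the Apple-support PR does not enable MPS yet, so on a Mac this would currently fall through to CPU.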