[Installation]: Cannot install with Poetry #8851
Comments
I don't think vLLM supports installation directly via Poetry at the moment. As a workaround, you can manually […]
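A common workaround along these lines is to let Poetry manage the rest of the project and install the prebuilt vLLM wheel into the same virtual environment with pip. A minimal sketch, assuming a platform for which vLLM publishes wheels (the version pin matches the log below):

```shell
# Resolve and install the Poetry-managed dependencies first,
# with vllm left out of pyproject.toml.
poetry install

# Then pull the prebuilt wheel into that same environment with pip,
# bypassing Poetry's PEP 517 source build.
poetry run pip install "vllm==0.6.2"
```

The trade-off is that vllm is not recorded in poetry.lock, so it has to be reinstalled whenever the environment is recreated.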
Has anyone come up with a solution? Thanks!
cc @dtrifiro
Not sure why, but you're building from scratch instead of installing the wheel. If building vllm from scratch is what you're trying to accomplish, you can try setting the […]
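The end of that comment is cut off; it presumably refers to the environment variable vLLM's setup.py uses to pick a build target. Treat the sketch below as an assumption based on the "Unsupported platform, please use CUDA, ROCm, Neuron, OpenVINO, or CPU" error in the log, not as the commenter's exact suggestion:

```shell
# Assumed variable: VLLM_TARGET_DEVICE tells vLLM's setup.py which backend
# to build for, so the platform check does not fall through to "Unsupported platform".
VLLM_TARGET_DEVICE=cpu poetry run pip install "vllm==0.6.2"
```

Even with a target device forced, the CPU backend in that release primarily targeted Linux, so a source build on macOS may still fail further along.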
This issue has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this issue should remain open. Thank you!
Your current environment
45.47 /tmp/tmp4enbmtnv/.venv/lib/python3.12/site-packages/torch/_subclasses/functional_tensor.py:258: UserWarning: Failed to initialize NumPy: No module named 'numpy' (Triggered internally at /pytorch/torch/csrc/utils/tensor_numpy.cpp:84.)
45.47 cpu = _conversion_method_template(device=torch.device("cpu"))
45.47 Traceback (most recent call last):
45.47 File "/usr/local/lib/python3.12/site-packages/pyproject_hooks/_in_process/_in_process.py", line 373, in <module>
45.47 main()
45.47 File "/usr/local/lib/python3.12/site-packages/pyproject_hooks/_in_process/_in_process.py", line 357, in main
45.47 json_out["return_val"] = hook(**hook_input["kwargs"])
45.47 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
45.47 File "/usr/local/lib/python3.12/site-packages/pyproject_hooks/_in_process/_in_process.py", line 134, in get_requires_for_build_wheel
45.47 return hook(config_settings)
45.47 ^^^^^^^^^^^^^^^^^^^^^
45.47 File "/tmp/tmp4enbmtnv/.venv/lib/python3.12/site-packages/setuptools/build_meta.py", line 332, in get_requires_for_build_wheel
45.47 return self._get_build_requires(config_settings, requirements=[])
45.47 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
45.47 File "/tmp/tmp4enbmtnv/.venv/lib/python3.12/site-packages/setuptools/build_meta.py", line 302, in _get_build_requires
45.47 self.run_setup()
45.47 File "/tmp/tmp4enbmtnv/.venv/lib/python3.12/site-packages/setuptools/build_meta.py", line 318, in run_setup
45.47 exec(code, locals())
45.47 File "<string>", line 510, in <module>
45.47 File "<string>", line 450, in get_requirements
45.47 ValueError: Unsupported platform, please use CUDA, ROCm, Neuron, OpenVINO, or CPU.
45.47
45.47
45.47 at /usr/local/lib/python3.12/site-packages/poetry/installation/chef.py:164 in _prepare
45.48 160│
45.48 161│ error = ChefBuildError("\n\n".join(message_parts))
45.48 162│
45.48 163│ if error is not None:
45.48 → 164│ raise error from None
45.48 165│
45.48 166│ return path
45.48 167│
45.48 168│ def _prepare_sdist(self, archive: Path, destination: Path | None = None) -> Path:
45.48
45.48 Note: This error originates from the build backend, and is likely not a problem with poetry but with vllm (0.6.2) not supporting PEP 517 builds. You can verify this by running 'pip wheel --no-cache-dir --use-pep517 "vllm (==0.6.2)"'.
45.48
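To double-check that the failure comes from vLLM's build backend rather than from Poetry itself, the verification command suggested at the end of the log can be run directly (quoting adjusted for the shell); on macOS it should reproduce the same "Unsupported platform" error:

```shell
# Build the wheel through PEP 517 without Poetry involved.
pip wheel --no-cache-dir --use-pep517 "vllm==0.6.2"
```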
How would you like to use vllm
I'm using macOS and want to install vllm with Poetry.
My actions in Docker:
poetry add "numpy=1.26.4"
poetry add vllm
and I hit this problem.
How can I install the vllm library on my Mac?
Before submitting a new issue...