[Feature]: Qwen-VL-Chat not supported #7017
Comments
Qwen-VL is not supported by vLLM yet, but you're very welcome to submit a PR to do so. Please see all supported vision language models here: https://docs.vllm.ai/en/latest/models/supported_models.html#vision-language-models
I am looking into adding support for image inputs for Qwen-VL/Qwen-VL-Chat! Definitely open to collaborating somehow though 😄
Hi, is there any update on supporting Qwen-VL/Qwen-VL-Chat in vLLM?
Hello @zhujinhua! I'm still working on it and have made some progress. I am hoping to have a PR up for it soon 🤞
Closing as duplicate of #962 |
🚀 The feature, motivation and pitch
While trying to execute the following code:
I encountered the following error message:
vllm Version: 0.5.3.post
It appears that the Qwen-VL-Chat model is not supported by the vLLM API. Can you please assist in resolving this issue?
Thank you!
Alternatives
No response
Additional context
No response