
[Feature]: support out tree multimodal models #8667

Closed
1 task done
Jack47 opened this issue Sep 20, 2024 · 11 comments · Fixed by #8717

Comments

@Jack47

Jack47 commented Sep 20, 2024

🚀 The feature, motivation and pitch

vLLM >= 0.6 doesn't support out-of-tree multimodal models:

[rank0]:   File "/usr/local/lib/python3.8/dist-packages/vllm/entrypoints/llm.py", line 177, in __init__
[rank0]:     self.llm_engine = LLMEngine.from_engine_args(
[rank0]:   File "/usr/local/lib/python3.8/dist-packages/vllm/engine/llm_engine.py", line 535, in from_engine_args
[rank0]:     engine_config = engine_args.create_engine_config()
[rank0]:   File "/usr/local/lib/python3.8/dist-packages/vllm/engine/arg_utils.py", line 792, in create_engine_config
[rank0]:     model_config = ModelConfig(
[rank0]:   File "/usr/local/lib/python3.8/dist-packages/vllm/config.py", line 230, in __init__
[rank0]:     self.multimodal_config = self._init_multimodal_config(
[rank0]:   File "/usr/local/lib/python3.8/dist-packages/vllm/config.py", line 251, in _init_multimodal_config
[rank0]:     raise ValueError(
[rank0]: ValueError: limit_mm_per_prompt is only supported for multimodal models.
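For context, the check behind this error can be sketched as follows. This is a self-contained simplification, not vLLM's actual code: the real logic lives in `ModelConfig._init_multimodal_config` in `vllm/config.py`, and the function below only mirrors its observable behavior.

```python
# Simplified sketch of the validation that raises the error above.
# An out-of-tree model whose architecture is absent from vLLM's multimodal
# registry is treated as text-only, so passing limit_mm_per_prompt fails.
def init_multimodal_config(is_multimodal_model: bool, limit_mm_per_prompt):
    if limit_mm_per_prompt and not is_multimodal_model:
        raise ValueError(
            "limit_mm_per_prompt is only supported for multimodal models.")
    return limit_mm_per_prompt or {}
```

This is why registering the model class alone is not enough: vLLM must also recognize the architecture as multimodal for the config to validate.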

Alternatives

No response

Additional context

No response

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
@youkaichao
Member

how do you register the out of tree model?

@Jack47
Author

Jack47 commented Sep 21, 2024

how do you register the out of tree model?

This is the feature I want. Currently, vLLM 0.6 doesn't provide a way to register out-of-tree multimodal models.

One possible hack: add my model type and class to _MULTIMODAL_MODELS in vLLM.

@youkaichao
Member

we do support out of tree registration, see https://docs.vllm.ai/en/latest/models/adding_model.html#out-of-tree-model-integration

@Jack47
Author

Jack47 commented Sep 21, 2024

we do support out of tree registration, see https://docs.vllm.ai/en/latest/models/adding_model.html#out-of-tree-model-integration

Sure, and I've registered my multimodal model using that function, but the error I posted still occurs.

So vLLM 0.6 doesn't provide a way to register out-of-tree multimodal models?

@youkaichao
Member

cc @DarkLight1337 @ywang96
do multimodal models support out-of-tree registration?

@DarkLight1337
Member

Currently, you can do it by adding the model to _MULTIMODAL_MODELS. Just like regular OOT models, there's no public API for this.

@youkaichao
Member

@DarkLight1337 we do have a public api for this:

def register_model(model_arch: str, model_cls: Type[nn.Module]):
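To illustrate the shape of that API, here is a minimal, self-contained stand-in for the registry. The class and method names here are illustrative only; the real public entry point is `vllm.ModelRegistry.register_model`, and the real classes are `nn.Module` subclasses.

```python
from typing import Dict

class ModelRegistrySketch:
    """Toy stand-in for vLLM's ModelRegistry; not the real implementation."""
    _models: Dict[str, type] = {}

    @classmethod
    def register_model(cls, model_arch: str, model_cls: type) -> None:
        # Map an architecture name (as it appears in the HF config's
        # "architectures" field) to the implementing class.
        cls._models[model_arch] = model_cls

    @classmethod
    def load_model_cls(cls, model_arch: str) -> type:
        return cls._models[model_arch]

class MyMultiModalModel:
    """Stand-in for an out-of-tree model class."""

# Register before constructing the engine, so architecture resolution
# can find the out-of-tree class by name.
ModelRegistrySketch.register_model("MyMultiModalModel", MyMultiModalModel)
```

The key point of the discussion below is that registering via this API must also mark the model as multimodal, otherwise the `limit_mm_per_prompt` check still fails.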

@DarkLight1337
Member

DarkLight1337 commented Sep 21, 2024

@DarkLight1337 we do have a public api for this:

def register_model(model_arch: str, model_cls: Type[nn.Module]):

The docs currently suggest modifying _MODELS directly.

@youkaichao
Member

@DarkLight1337 that is for registering your model when you fork vllm.

this issue is about out-of-tree registration, i.e. adding new model to vllm without a fork.

@ywang96
Member

ywang96 commented Sep 22, 2024

@Jack47 Can you take a look and try #8717 to see if it fixes your issue?

@Jack47
Author

Jack47 commented Sep 27, 2024

I'll try it later, thanks. Will post the result there.
