
[Misc][Utils] allow get_open_port to be called for multiple times #5333

Merged
merged 4 commits into vllm-project:main from multiple_ports
Jun 7, 2024

Conversation

youkaichao
Member

#4914 allows users to specify a port to use, but is restricted to a single port. When we want to use multiple ports (e.g. multiple LLM instances in offline inference for interactive experimentation), this causes a problem.

This PR makes it possible to call get_open_port multiple times, and return different ports. In a safety-constrained system, the admin can open a port range for vLLM to use.
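To illustrate the idea, here is a minimal sketch (not vLLM's actual implementation) of a `get_open_port` that prefers a user-specified base port on the first call and scans upward on later calls, so repeated calls return distinct ports within a range the admin controls; the `base_port` parameter and the module-level cursor are assumptions for this illustration:

```python
# Hypothetical sketch: return a different open port on each call,
# starting from a user-specified base port.
import socket

_next_port = None  # remembers where the previous search left off


def get_open_port(base_port=None):
    """Return an open TCP port.

    Prefers base_port on the first call; subsequent calls continue
    scanning upward from the last port handed out.
    """
    global _next_port
    start = _next_port if _next_port is not None else (base_port or 0)
    if start == 0:
        # No preference given: let the OS pick any free port.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.bind(("", 0))
            return s.getsockname()[1]
    port = start
    while port < 65536:
        try:
            # Probe the port by binding to it, then release it.
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.bind(("", port))
            _next_port = port + 1
            return port
        except OSError:
            port += 1  # port in use, try the next one
    raise RuntimeError("no open port found")
```

Note the probe-then-release pattern has an inherent race (another process may grab the port before it is actually used), which is why the port range should be reserved for vLLM in a safety-constrained system.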

@DarkLight1337
Member

DarkLight1337 commented Jun 7, 2024

Should we really make this the default behaviour? It may be a bit confusing when the port number does not match what was originally specified in the command. Perhaps we should require another env flag to enable this. If get_open_port is called multiple times without this env flag being set, we can log a message to inform the user about this option.

@youkaichao
Member Author

It may be a bit confusing when the port number does not match what was originally specified in the command. Perhaps we should require another env flag to enable this.

The first port returned will be the specified one, but subsequent ports will not.

We cannot achieve this via flags. Users can create an arbitrary number of LLM instances, and each one may require its own port.

In addition, vLLM itself might also use more than one port. Currently some code avoids this by using other protocols, e.g. a shared file as the rendezvous point:

init_method = f"file://{temp_path}"
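The file-based alternative mentioned above can be sketched as follows; this is a simplified illustration of how a `file://` init method string (as accepted by `torch.distributed.init_process_group`) might be constructed, not vLLM's exact code, and `make_file_init_method` is a hypothetical helper name:

```python
# Illustrative sketch: rendezvous via a shared file instead of a TCP port.
import os
import tempfile


def make_file_init_method():
    # Reserve a unique temporary path for the rendezvous file.
    fd, temp_path = tempfile.mkstemp()
    os.close(fd)
    # Remove the file: the distributed backend expects to create it itself.
    os.unlink(temp_path)
    return f"file://{temp_path}"
```

This sidesteps port allocation entirely, but only works for processes that share a filesystem, which is why it cannot replace ports in general.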

This is the best approach I can come up with. After all, users just want to control the port range vLLM uses; they do not want to limit vLLM to a single port.

@youkaichao
Member Author

@DarkLight1337 I added some logging so that users will know new ports are used.

@DarkLight1337
Member

In that case, I guess it's fine as long as you inform the user.

@youkaichao youkaichao enabled auto-merge (squash) June 7, 2024 02:23
@WoosukKwon WoosukKwon disabled auto-merge June 7, 2024 05:15
@WoosukKwon WoosukKwon merged commit 388596c into vllm-project:main Jun 7, 2024
89 of 90 checks passed
@youkaichao youkaichao deleted the multiple_ports branch June 7, 2024 05:16