Work around error reading VLLM_PORT tcp://a.b.c.d:8000
For some reason VLLM_PORT gets set as a full URL.
jamt9000 committed Jun 10, 2024
1 parent 7b13e03 commit de3439b
Showing 1 changed file with 1 addition and 1 deletion.
vllm/envs.py (1 addition, 1 deletion)
@@ -105,7 +105,7 @@
     # by incrementing the VLLM_PORT value.
     # '0' is used to make mypy happy
     'VLLM_PORT':
-    lambda: int(os.getenv('VLLM_PORT', '0'))
+    lambda: int(os.getenv('VLLM_PORT', '0').split(":")[-1])
     if 'VLLM_PORT' in os.environ else None,
 
     # If true, will load models from ModelScope instead of Hugging Face Hub.
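
The one-line change keeps only the text after the final colon before converting to an integer, so a value like the tcp://a.b.c.d:8000 from the commit title (likely injected by Kubernetes-style service links, which export <SERVICE>_PORT as tcp://host:port) still yields a usable port, while a plain numeric value is unaffected. A minimal sketch of that behavior, using the placeholder address from the commit title purely for illustration:

    import os

    # Hypothetical injected value matching the commit title;
    # "a.b.c.d" stands in for an IP address.
    os.environ["VLLM_PORT"] = "tcp://a.b.c.d:8000"

    # Same expression as the patched envs.py entry: keep only the text
    # after the last ":" before converting to int.
    port = int(os.environ["VLLM_PORT"].split(":")[-1])
    assert port == 8000

    # A plain port survives unchanged, since split(":") returns the whole
    # string when there is no colon.
    os.environ["VLLM_PORT"] = "8000"
    assert int(os.environ["VLLM_PORT"].split(":")[-1]) == 8000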
