
Integrate vllm Error, TypeError: top_k must be an integer, got float #3501

Open
alanhsu777 opened this issue Aug 28, 2024 · 1 comment · May be fixed by #3533

Comments

@alanhsu777
When I use FastChat to integrate vLLM, I get the error "TypeError: top_k must be an integer, got float". The reason is that vLLM 0.5.5 added a bugfix (#7227) that checks that top_k is an integer, while in FastChat the default value of top_k in vllm_worker.py is set to -1.0, a float.

@surak
Collaborator

surak commented Sep 23, 2024

I can confirm. Changing the value from a float (-1.0) to an int (-1) fixes this.
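A minimal sketch of the kind of coercion that resolves the error, assuming the worker passes sampling parameters on to vLLM. The helper name is illustrative, not FastChat's actual code:

```python
def normalize_top_k(top_k):
    """Coerce a float-valued top_k (e.g. the default -1.0) to the int vLLM expects.

    vLLM >= 0.5.5 validates that top_k is an integer (-1 means "disabled"),
    so passing a float such as -1.0 raises TypeError.
    """
    if isinstance(top_k, float):
        if not top_k.is_integer():
            raise ValueError(f"top_k must be a whole number, got {top_k}")
        top_k = int(top_k)
    return top_k
```

The simpler fix adopted in the linked PR is to change the default itself from `-1.0` to `-1`; the coercion above additionally guards against clients that send float values over the API.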

surak added a commit to HelmholtzAI-FZJ/FastChat that referenced this issue Sep 23, 2024
vLLM needs top_k to be int, not float
@surak surak linked a pull request Sep 23, 2024 that will close this issue