
When will the vllm PR be merged to the main branch? #19

Open
zuxin666 opened this issue Jun 22, 2024 · 8 comments

Comments

@zuxin666

Thank you for your impressive work on this project. I'm eager to try this model, but I've noticed that the vllm deployment pull request has conflicts with the main branch, and building vllm from scratch is challenging for my development environment.

Is there an active effort to resolve these conflicts and merge the PR into the main branch? If possible, could you provide an estimated timeline for this merge? I greatly appreciate your work and look forward to using this implementation. Thank you for your time.

@i-love-doufunao

I'm just as eager to try this model! But this PR has been pending for a while now. @zwd003, could you give us a hand?

@fengyang95

Thank you for your impressive work on this project. I'm eager to try this model, but I've noticed that the vllm deployment pull request has conflicts with the main branch, and building vllm from scratch is challenging for my development environment.

Is there an active effort to resolve these conflicts and merge the PR into the main branch? If possible, could you provide an estimated timeline for this merge? I greatly appreciate your work and look forward to using this implementation. Thank you for your time.

+1

@lwaekfjlk

same here.

@viktara

viktara commented Jul 8, 2024

It's been merged, but it doesn't work.

@halexan

halexan commented Jul 16, 2024

vLLM v0.5.1 supports DeepSeek-V2.
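
For anyone landing here, a minimal sketch of what that looks like with vLLM's offline Python API on v0.5.1 or later. The Hugging Face repo id, context cap, and sampling settings below are assumptions for illustration, not details from this thread:

```python
# Sketch: running DeepSeek-V2-Lite on vLLM >= 0.5.1 (repo id and max_model_len are assumptions).
from vllm import LLM, SamplingParams

llm = LLM(
    model="deepseek-ai/DeepSeek-V2-Lite",  # assumed repo id; swap in a local path if needed
    trust_remote_code=True,                # DeepSeek-V2 configs ship custom code on the Hub
    max_model_len=8192,                    # cap the context so the KV cache fits in GPU memory
)

sampling = SamplingParams(temperature=0.7, max_tokens=128)
outputs = llm.generate(["Write a quicksort function in Python."], sampling)
print(outputs[0].outputs[0].text)
```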

@viktara

viktara commented Jul 17, 2024

vLLM v0.5.1 supports DeepSeek-V2.

Are you using the 236B model or the Lite one?

@halexan

halexan commented Jul 20, 2024

vLLM v0.5.1 supports DeepSeek-V2.

Are you using the 236B model or the Lite one?

Not tested yet.

@halexan

halexan commented Jul 27, 2024

vLLM v0.5.1 supports DeepSeek-V2.

Are you using the 236B model or the Lite one?

I used the 236B model, and it works.
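
For reference, the full 236B model does not fit on a single GPU, so a multi-GPU launch along these lines is what a working setup typically looks like. This is a sketch only; the repo id, tensor_parallel_size, and context cap are assumptions about the hardware, not details reported in this thread:

```python
# Sketch: serving the full DeepSeek-V2 (236B) with tensor parallelism on vLLM >= 0.5.1.
# tensor_parallel_size=8 assumes an 8-GPU node; repo id and max_model_len are also assumptions.
from vllm import LLM, SamplingParams

llm = LLM(
    model="deepseek-ai/DeepSeek-V2",  # assumed Hugging Face repo id
    trust_remote_code=True,
    tensor_parallel_size=8,           # shard the weights across 8 GPUs
    max_model_len=8192,               # shrink the context window to leave room for the KV cache
)

print(llm.generate(["Hello"], SamplingParams(max_tokens=32))[0].outputs[0].text)
```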
