
Can a 1B model be trained on 8× 16 GB T4 GPUs? #797

Open
northeastsquare opened this issue Dec 25, 2024 · 0 comments

Comments


northeastsquare commented Dec 25, 2024

Thanks to the authors for making the model perform so well. I have been testing the 1B model: inference on a T4 works fine, but is LoRA fine-tuning with 8× T4 GPUs possible?
I ran into many errors when running the training script, so I wanted to ask directly.
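As a rough sanity check on the question, a back-of-the-envelope memory estimate suggests the model weights and LoRA optimizer state should fit comfortably in 16 GB per card. This is a minimal sketch with assumed numbers (fp16 frozen base weights, an adapter roughly 1% of base parameters, Adam optimizer states in fp32), and it deliberately excludes activation memory, which depends on batch size and sequence length:

```python
# Rough per-GPU memory estimate for LoRA fine-tuning a 1B-parameter model.
# All ratios and byte sizes are illustrative assumptions, not measured values.

def lora_memory_gib(n_params=1e9, adapter_ratio=0.01,
                    bytes_weight=2, bytes_grad=2, bytes_adam_state=8):
    base = n_params * bytes_weight                       # frozen fp16 base weights
    adapter = n_params * adapter_ratio * bytes_weight    # trainable LoRA weights
    grads = n_params * adapter_ratio * bytes_grad        # gradients (adapters only)
    optim = n_params * adapter_ratio * bytes_adam_state  # Adam m and v states (fp32)
    return (base + adapter + grads + optim) / 2**30

print(f"{lora_memory_gib():.1f} GiB")  # ~2 GiB before activations
```

Under these assumptions the static footprint is only about 2 GiB, so the out-of-memory risk on a 16 GB T4 would come from activations and framework overhead rather than the model itself; the training-script errors mentioned above may have a different cause.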


1 participant