Are only LoRA and QLoRA fine-tuning currently supported? Will full-parameter fine-tuning be opened up later? #257
Comments
I've found the full-parameter fine-tuning script, but running it fails with a parameter-mismatch error. What could be the cause? This is my script setup: deepspeed --num_gpus 8 dbgpt_hub/train/sft_train.py
I hit the same problem today. After investigating, I found that I had earlier followed section 3.2 Quick Start and run `pip install dbgpt-hub`, and the `preprocess_dataset()` function in the pip-released version no longer matches the current source. So `from dbgpt_hub.data_process.data_utils import xxx` was actually importing the old package. Running `pip uninstall dbgpt-hub` should fix it.
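The shadowing described above can be confirmed before uninstalling anything by asking Python where it will actually load the package from. Below is a minimal sketch of that check using only the standard library; the `dbgpt_hub` name is taken from the thread, and the exact site-packages path will vary by environment:

```python
import importlib.util


def module_origin(name):
    """Return the file path a module would be imported from, or None if it is not installed."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None


# If this prints a path under site-packages instead of your repo checkout,
# a stale pip-installed copy is shadowing the local source tree, and
# `pip uninstall dbgpt-hub` (as suggested above) should resolve it.
print(module_origin("dbgpt_hub"))
```

Running the training script from the repository root after the stale package is removed makes the `from dbgpt_hub...` imports resolve to the checked-out source.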
Thanks 🙏 In my case, commenting out the final sft parameter also made it work :)
Thank you, authors, for the excellent work!
Have any of the authors run experiments comparing model performance after full-parameter fine-tuning versus PEFT? I'd appreciate some insight.