
Can Pippy be combined with PEFT LoRA? #1122

Open
Songjw133 opened this issue Jun 6, 2024 · 1 comment

Comments

@Songjw133

Songjw133 commented Jun 6, 2024

I'm not very familiar with pipeline parallelism. Can it work if most of the model's parameters are frozen?

@kwen2501 (Contributor)

Hi, good question. I haven't tried it myself and don't have much experience with PEFT. Do you have a use case in hand?
For the forward pass, it should still work if you provide the PEFT'ed model to PP's API.
For the backward pass, we rely on the assumption that the backward flow of gradients has the same size as the forward flow of activations. Do you think this assumption still holds in the PEFT case?
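As a quick sanity check on that assumption, here is a minimal sketch (not PiPPy code; the layer sizes and the LoRA-style adapter are illustrative assumptions). Even when the base weights are frozen, as PEFT/LoRA does, the activation crossing a hypothetical stage boundary still receives a gradient of the same shape, because the trainable adapter parameters keep the activation on the autograd graph:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical stage: a frozen base Linear plus a trainable
# LoRA-style low-rank adapter (sizes chosen for illustration).
base = nn.Linear(8, 8)
for p in base.parameters():
    p.requires_grad = False  # freeze base weights, as PEFT does

lora_a = nn.Linear(8, 2, bias=False)  # trainable down-projection
lora_b = nn.Linear(2, 8, bias=False)  # trainable up-projection

x = torch.randn(4, 8)
# Activation that would flow across a pipeline-stage boundary.
h = base(x) + lora_b(lora_a(x))
h.retain_grad()  # keep the gradient on this non-leaf tensor

h.sum().backward()

# The backward gradient has the same shape as the forward
# activation, even though the base weights are frozen.
print(h.grad.shape == h.shape)
```

Under this toy setup, the size assumption holds: freezing parameters changes which *weight* gradients are computed, not the shape of the *activation* gradients flowing between stages.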
