
FEAT: add LoRALinearGeneral #4718


Open · wants to merge 1 commit into main

Conversation

zhengzl18

  1. Add LoRALinearGeneral, since MultiHeadAttention uses LinearGeneral.
  2. Modify the default value of axis in LinearGeneral.__init__() so that it is compatible with an in_features that is a sequence of ints (see the sketch after this list).
  3. Add a lora_base_module param to LoRALinear so that it's consistent with LoRA.
  4. Update the docs in lora.py.
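For point 2, a minimal sketch of what a sequence-aware default for axis could look like; the helper default_axis below is hypothetical and not the PR's actual code:

```python
from typing import Sequence, Union

def default_axis(in_features: Union[int, Sequence[int]]):
    # A single int maps to the conventional last axis; a sequence of n ints
    # maps to the last n axes, so the contraction covers all feature dims.
    if isinstance(in_features, int):
        return -1
    return tuple(range(-len(in_features), 0))

assert default_axis(32) == -1
assert default_axis((8, 64)) == (-2, -1)  # e.g. (num_heads, head_dim)
```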

Since LoRA currently doesn't support dot_general, I reshape the input before calling LoRA and reshape the output back afterwards (a sketch of this trick follows below). I also added a check on the value of the axis param in LinearGeneral, so that it's only valid to apply the linear transformation to the last axes of the input.
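Here is a minimal, self-contained sketch of that reshape trick in plain JAX; the names lora_delta_general, lora_a, and lora_b are illustrative, not taken from the PR:

```python
import math
import jax
import jax.numpy as jnp

def lora_delta_general(x, lora_a, lora_b, in_features, out_features):
    # x: (..., *in_features); lora_a: (prod(in_features), rank);
    # lora_b: (rank, prod(out_features)).
    batch_shape = x.shape[: x.ndim - len(in_features)]
    # Collapse the contracted feature axes into one, so a plain matmul works.
    x2d = x.reshape(*batch_shape, math.prod(in_features))
    delta = x2d @ lora_a @ lora_b  # low-rank LoRA update
    # Restore the multi-axis output shape expected by LinearGeneral.
    return delta.reshape(*batch_shape, *out_features)

key = jax.random.PRNGKey(0)
x = jnp.ones((2, 5, 4, 16))                       # two trailing feature axes
lora_a = jax.random.normal(key, (4 * 16, 8))
lora_b = jnp.zeros((8, 3 * 32))                   # zero-init: delta starts at 0
out = lora_delta_general(x, lora_a, lora_b, (4, 16), (3, 32))
assert out.shape == (2, 5, 3, 32)
```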

I had trouble running the tests, so I couldn't add one.


google-cla bot commented Apr 19, 2025

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up to date status, view the checks section at the bottom of the pull request.
