Add validation for lora_dropout #316
Conversation
183d0b1 to cae49ed
Don't forget to bump up the version before merging!
@@ -101,6 +101,11 @@ def create_finetune_request(
         raise ValueError(
             f"LoRA adapters are not supported for the selected model ({model_or_checkpoint})."
         )
+
+    if lora_dropout is not None:
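The added hunk is truncated above; a minimal sketch of what the body of the check might look like, assuming the 0-1 bound described in the linked issue (the helper name, exact bounds, and error message are illustrative, not the PR's actual code):

```python
def _validate_lora_dropout(lora_dropout):
    # Hypothetical helper mirroring the truncated check above, not the PR's code.
    if lora_dropout is not None:
        # Assuming a half-open [0, 1) range, as is conventional for dropout.
        if not 0 <= lora_dropout < 1:
            raise ValueError(
                f"LoRA dropout must be in the [0, 1) range, got {lora_dropout}."
            )
```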
this can be simplified to if lora_dropout
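A quick sketch of why the suggested simplification is safe here: the two guards differ only when lora_dropout is 0.0, which is falsy but already inside the valid range, so skipping the check for it changes nothing.

```python
lora_dropout = 0.0
assert not lora_dropout          # `if lora_dropout:` skips the range check for 0.0...
assert lora_dropout is not None  # ...while `is not None` still runs it; 0.0 passes either way.
```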
Just left a few comments but nothing major. LGTM otherwise
create_finetune_request(
    model_limits=_MODEL_LIMITS,
    model=_MODEL_NAME,
    training_file=_TRAINING_FILE,
    lora=True,
    lora_dropout=lora_dropout,
)
I'm not sure why we need this, but if it successfully detects out-of-range dropouts, that's fine with me.
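For reference, a sketch of how a parametrized case might exercise the out-of-range path, assuming pytest; the invalid values are illustrative, and create_finetune_request plus the _MODEL_LIMITS, _MODEL_NAME, and _TRAINING_FILE constants are taken to be imported/defined in the surrounding test module:

```python
import pytest

@pytest.mark.parametrize("lora_dropout", [-0.1, 1.5])  # illustrative out-of-range values
def test_lora_dropout_out_of_range(lora_dropout):
    # create_finetune_request and the _MODEL_* constants are assumed to come
    # from the surrounding test module, as in the snippet above.
    with pytest.raises(ValueError):
        create_finetune_request(
            model_limits=_MODEL_LIMITS,
            model=_MODEL_NAME,
            training_file=_TRAINING_FILE,
            lora=True,
            lora_dropout=lora_dropout,
        )
```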
Have you read the Contributing Guidelines?
Issue: LoRA dropout can be set to values outside the 0-1 range
Describe your changes
Add range validation for the lora_dropout parameter