Best LoRA settings that you have found - so many settings so little explanation #992
FurkanGozukara asked this question in Q&A (unanswered, 0 replies)
I compared settings with the well-known kohya trainer. In kohya, the default learning rates are:

```python
unet_lr = 1e-4  #@param {'type':'number'}
text_encoder_lr = 5e-5  #@param {'type':'number'}
```

I think these come from the official LoRA paper.
Now there are settings here that I couldn't match to anything in kohya: Lora UNET Rank and Lora Text Encoder Rank. The description says the learning rate changes according to these ranks. What do you set these ranks to for faces or for styles?
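For background on why the learning rate might interact with rank: in the original LoRA formulation the weight update is applied as ΔW = (α/r)·BA, so the effective step magnitude depends on the rank r and the scaling α. A minimal numpy sketch of this (all dimensions and variable names are illustrative, not taken from this trainer):

```python
import numpy as np

# Illustrative LoRA parameterization: W_eff = W + (alpha / rank) * B @ A.
# Shapes: W is (d_out, d_in); B is (d_out, rank); A is (rank, d_in).
d_out, d_in, rank, alpha = 8, 8, 4, 4.0
rng = np.random.default_rng(0)

W = rng.standard_normal((d_out, d_in))  # frozen base weight
B = np.zeros((d_out, rank))             # B is initialized to zero,
A = rng.standard_normal((rank, d_in))   # so the update starts at zero

delta_W = (alpha / rank) * (B @ A)      # scaled low-rank update
W_eff = W + delta_W

print(np.allclose(W_eff, W))  # True at initialization, since B is zero
```

Because α/r rescales the update, a higher rank at fixed α shrinks each step, which is one plausible reason a trainer would adjust the learning rate based on rank.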
What is Calculate Split Loss in the testing tab?
Should we check Unfreeze Model when creating a model? We are already training the UNet and the text encoder (CLIP), so what else is there to train? I can't make sense of Unfreeze Model.
Do you pick the Lion or the 8-bit Adam optimizer, and which learning rates accordingly?
Do you use classification images with LoRA to improve results, or not?
Does bf16 or fp16 yield better results?
Do you use Freeze CLIP Normalization Layers? If this is used, does that mean only the token vectors are trained, not CLIP itself?
Do you use AdamW Weight Decay? If so, what value?
Do AdamW Weight Decay and Lion work at the same time?
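For context on what the weight-decay knob does: AdamW applies decay directly to the weights, decoupled from the gradient step, and the Lion paper uses the same decoupled form, so the setting is meaningful for both optimizers. A minimal numpy sketch of one decoupled decay step (the `lr` and `weight_decay` values are illustrative only):

```python
import numpy as np

def decoupled_weight_decay(params, lr=1e-4, weight_decay=1e-2):
    """One decoupled weight-decay step (the 'W' in AdamW):
    params <- params - lr * weight_decay * params,
    applied separately from the optimizer's gradient update."""
    return params * (1.0 - lr * weight_decay)

p = decoupled_weight_decay(np.ones(3), lr=0.1, weight_decay=0.1)
print(p)  # each weight shrinks by lr * weight_decay = 1%
```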
Do you use the new, popular Offset Noise? If so, what value?
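For reference, offset noise adds a small per-channel constant on top of the usual Gaussian noise during training. A numpy sketch of the idea (the function name and the `offset=0.1` default are illustrative, not settings recommended in this thread):

```python
import numpy as np

def offset_noise(shape, offset=0.1, rng=None):
    """Gaussian noise plus a per-(batch, channel) constant bias.

    shape is (batch, channels, height, width). The bias is constant
    across the spatial dimensions, which is what lets the model learn
    to shift overall image brightness.
    """
    rng = rng if rng is not None else np.random.default_rng()
    noise = rng.standard_normal(shape)
    bias = offset * rng.standard_normal(shape[:2] + (1, 1))
    return noise + bias  # bias broadcasts over height and width

n = offset_noise((2, 4, 8, 8), offset=0.1, rng=np.random.default_rng(0))
print(n.shape)  # (2, 4, 8, 8)
```

With `offset=0.0` this reduces to plain Gaussian noise, so the knob only controls how strong the per-channel shift is.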
Do you use the Use DEIS for noise scheduler option in the new testing tab?
Thank you so much for any answers. I am frequently asked for good LoRA settings, but there are just too many options to test blindly.