Replies: 3 comments 1 reply
-
Just save the LoRA, not the full checkpoint, and reduce the UNet and text encoder rank; 32 or 64 is enough.
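To see why the LoRA alone is so much smaller: a full checkpoint stores every weight of the model, while a LoRA stores only two low-rank factors per adapted layer. The sketch below illustrates the per-layer parameter counts; the layer width is an assumed illustrative value, and the function names are hypothetical, not part of kohya_ss.

```python
# Rough illustration of why a LoRA file is much smaller than a full
# checkpoint. The layer size below is an assumption for illustration,
# not an exact Stable Diffusion shape.

def full_params(d_in: int, d_out: int) -> int:
    # A fully fine-tuned weight matrix stores every element.
    return d_in * d_out

def lora_params(d_in: int, d_out: int, rank: int) -> int:
    # A LoRA stores two low-rank factors: A (d_in x rank) and B (rank x d_out).
    return rank * (d_in + d_out)

d = 1280   # assumed projection width
rank = 32  # the rank suggested above

full = full_params(d, d)
lora = lora_params(d, d, rank)
print(full // lora)  # -> 20: the full matrix is ~20x larger per layer
```

At rank 32 the low-rank factors are a small fraction of each layer, and only the adapted layers are saved at all, which is how a multi-gigabyte model yields a LoRA of ~100 MB.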
-
In the save tab,
-
I tried enabling the LoRA option, but the training result file is almost 8 GB.
LoRA files from the kohya tool are usually only about 120 MB. Can you tell me why?