Training LoRAs doesn't require a lot of computing power and can be done on a desktop PC with a single modern GPU. Python, PyTorch, and CUDA officially support Windows, and popular tools like ComfyUI also run fine on Windows with a single GPU.
It would be nice if the official training code (like train/train_c_lora.py) and launching scripts (like train/example_train.sh) weren't hardcoded to Linux machines with Slurm, and instead also supported training on desktop PCs (e.g. a single Windows machine with one GPU and no Slurm); a possible approach is sketched below. This way people wouldn't have to run into issues like #69 and figure out workarounds on their own.
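As a rough illustration of what such support could look like (a minimal sketch, not the repository's actual code: the function name `setup_distributed` and the env-var fallbacks are assumptions), the launcher could read Slurm variables when they exist and otherwise fall back to a single-process, single-GPU setup that also works on Windows:

```python
import os
import torch
import torch.distributed as dist

def setup_distributed():
    """Initialize torch.distributed from Slurm if present, otherwise fall back
    to a single-process, single-GPU setup (e.g. a Windows desktop)."""
    if "SLURM_PROCID" in os.environ:
        # Running under Slurm: derive rank/world size from the scheduler.
        rank = int(os.environ["SLURM_PROCID"])
        world_size = int(os.environ["SLURM_NPROCS"])
        backend = "nccl"
    else:
        # No Slurm: assume one process and one GPU.
        rank, world_size = 0, 1
        # NCCL is not available on Windows, so use gloo there.
        backend = "gloo" if os.name == "nt" else "nccl"

    os.environ.setdefault("MASTER_ADDR", "localhost")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group(backend=backend, rank=rank, world_size=world_size)

    if torch.cuda.is_available():
        device = torch.device("cuda", rank % torch.cuda.device_count())
    else:
        device = torch.device("cpu")
    return rank, world_size, device
```

The rest of the training script could then use the returned rank/world size unchanged, so the Slurm path keeps working while single-GPU users just run `python train/train_c_lora.py` directly without a launch script.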