
Is there a way to run Stable Diffusion fine-tuning on an RTX 4090? For all configurations, I keep facing a CUDA out-of-memory error. #55

Open
garg-aayush opened this issue Feb 27, 2023 · 2 comments

Comments

@garg-aayush

I would like to fine-tune Stable Diffusion on a custom dataset, similar to the Lambda Labs post.

However, whatever configuration I use, I keep running into the CUDA out-of-memory error.

Has anyone managed to run fine-tuning on a 24 GB GPU, say a 3090 or 4090?
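Not an official answer, but a workaround people commonly report for 24 GB cards is to use a training script that exposes memory-saving options: batch size 1 with gradient accumulation, gradient checkpointing, fp16 mixed precision, and an 8-bit optimizer. The sketch below assumes the Hugging Face `diffusers` `train_text_to_image.py` example (not necessarily the script from the Lambda Labs post); the model name, dataset path, and output directory are placeholders.

```shell
# Sketch, assuming huggingface/diffusers' train_text_to_image.py example.
# Memory-saving flags: batch size 1 + gradient accumulation, gradient
# checkpointing, fp16, and 8-bit Adam (requires bitsandbytes).
accelerate launch train_text_to_image.py \
  --pretrained_model_name_or_path="CompVis/stable-diffusion-v1-4" \
  --train_data_dir="./my_dataset" \
  --resolution=512 \
  --train_batch_size=1 \
  --gradient_accumulation_steps=4 \
  --gradient_checkpointing \
  --mixed_precision="fp16" \
  --use_8bit_adam \
  --max_train_steps=15000 \
  --output_dir="./sd-finetuned"
```

If it still runs out of memory, lowering `--resolution` (e.g. to 256) or freezing the text encoder are further options some users report trying.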

@youluexx

youluexx commented Mar 6, 2023

Same question here. If there are two 4090 graphics cards, can training run on both, and how would they need to be configured?
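For two GPUs, a common approach is data parallelism via Hugging Face `accelerate`. One caveat worth noting: distributed data parallelism replicates the full model on each card, so two 24 GB GPUs do not pool into 48 GB, and the per-GPU memory-saving flags are still needed. A minimal sketch, assuming the same hypothetical `train_text_to_image.py` script:

```shell
# Sketch: run `accelerate config` once to set up the machine, then launch
# the same script across two GPUs (data parallel; each GPU holds a full
# copy of the model, so per-GPU memory requirements are unchanged).
accelerate launch --multi_gpu --num_processes=2 train_text_to_image.py \
  --train_batch_size=1 \
  --gradient_accumulation_steps=4 \
  --gradient_checkpointing \
  --mixed_precision="fp16"
```

To actually split the model or optimizer state across the two cards, something like DeepSpeed ZeRO (also configurable through `accelerate config`) would be needed instead of plain data parallelism.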

@jnulzl

jnulzl commented Mar 31, 2023

The same question
