
How are you able to host your model on GPU with < 16 GB RAM #63

Open

matemato opened this issue May 4, 2023 · 0 comments

Comments

matemato commented May 4, 2023

Hello,

I am trying to reproduce your text-to-Pokémon work and compare it to my own model trained on Bulbapedia descriptions.
When I finish my work, I would love to host it on Google Colab like you did.
However, when I try to generate images the way you did, I get CUDA out-of-memory errors on GPUs with less than 16 GB of RAM, yet I see that your model is hosted on Hugging Face with a Tesla T4 GPU (16 GB RAM). I also see that you created a Google Colab (although it no longer works) that ran on that same GPU.

Does anyone have any idea how to generate 512×512 images with the GPUs available on the Google Colab free tier?

Thank you so much for your help!
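
For anyone hitting the same limit, here is a minimal sketch of the standard diffusers memory-saving options (fp16 weights plus attention slicing), which are often enough for 512×512 generation on a 16 GB T4. It assumes the model is a Stable Diffusion fine-tune loadable with diffusers; the model id and prompt below are placeholders, not the repository's actual checkpoint:

```python
import torch
from diffusers import StableDiffusionPipeline

# Placeholder model id -- substitute the actual fine-tuned checkpoint.
model_id = "your-username/text-to-pokemon"

# Load the weights in half precision to roughly halve GPU memory use.
pipe = StableDiffusionPipeline.from_pretrained(
    model_id, torch_dtype=torch.float16
)
pipe = pipe.to("cuda")

# Compute attention in slices instead of all at once,
# trading a little speed for a large drop in peak memory.
pipe.enable_attention_slicing()

image = pipe("a cute dragon pokemon", height=512, width=512).images[0]
image.save("pokemon.png")
```

If that still runs out of memory, diffusers also provides pipe.enable_sequential_cpu_offload() (requires accelerate), which keeps submodules on the CPU and moves each to the GPU only while it runs, at a further cost in speed.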
