
How much GPU memory is needed at a minimum? #7

Open
bilemond opened this issue Jul 24, 2024 · 1 comment
@bilemond

Hi, great work!
I tried running the training code on 4090 GPUs, which have 24 GB of memory. Even with the batch size set to 1, it exceeded the available memory. Can you confirm whether this is expected? And how much memory does the A100 GPU you used have?
Thank you very much!

@yeqinglin
Collaborator

Thank you very much. We trained our model on A100 GPUs with 40 GB of memory, with a batch size of 6 per GPU. When training with a batch size of 1, the memory usage is ~8 GB, which should fit on a GPU with 24 GB of memory. How did you update the batch size? Was it via the configuration file, or in genie/config.py? Note that batch_size in genie/config.py is overwritten by batchSize from the configuration file (i.e. runs/example/configuration).
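The override described above can be illustrated with a minimal sketch. This is not the project's actual code: the `Config` class, the `apply_run_configuration` helper, and the `key value` file format are hypothetical, chosen only to show why editing `batch_size` in `genie/config.py` has no effect when `batchSize` is also set in the run configuration file.

```python
from dataclasses import dataclass

@dataclass
class Config:
    # Default batch size, as batch_size in genie/config.py would be
    # (the thread states the authors trained with 6 per GPU).
    batch_size: int = 6

def apply_run_configuration(config: Config, text: str) -> Config:
    """Parse simple 'key value' lines from a run configuration file.

    A batchSize entry, if present, overwrites the batch_size default,
    mirroring the precedence described in the comment above.
    """
    for line in text.splitlines():
        parts = line.split()
        if len(parts) == 2 and parts[0] == "batchSize":
            config.batch_size = int(parts[1])  # file value wins over the default
    return config

# Even though the default is 6, the configuration file sets it to 1.
cfg = apply_run_configuration(Config(), "batchSize 1")
print(cfg.batch_size)  # 1
```

So to actually reduce memory usage, the change must be made in the run's configuration file (e.g. `runs/example/configuration`), not only in `genie/config.py`.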
