Running on GPU with less memory - CUDA out of memory #69
Comments
Hi, I met this problem in training. How can I solve it?
Hi, I met this problem in training. Can you tell me what batch size you used for training?
Reduce the batch size for training. For testing, passing the following parameter on the command line made it run successfully: --test-crops=1. Note that the default value for test crops is 10. Best of luck.
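For context on why that flag helps, here is an illustrative sketch (not this repo's actual pipeline): many action-recognition test scripts evaluate several spatial crops per frame (ten-crop testing), so the tensor pushed through the GPU is roughly that many times larger than with a single crop, which is presumably what --test-crops controls.

```python
import torch
from PIL import Image
from torchvision import transforms

# Illustrative sketch only: ten-crop testing turns every frame into 10 crops,
# so the batch handed to the network is ~10x larger than with a single crop,
# which is what --test-crops=1 avoids.
frame = Image.new('RGB', (340, 256))          # dummy stand-in for a video frame

ten_crop = transforms.Compose([
    transforms.TenCrop(224),                                   # 4 corners + center, original + flipped
    transforms.Lambda(lambda crops: torch.stack(
        [transforms.ToTensor()(c) for c in crops])),           # shape: (10, 3, 224, 224)
])

crops = ten_crop(frame)
print(crops.shape)   # torch.Size([10, 3, 224, 224]) -> ~10x the activation memory of one crop
```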
Half precision is also helpful for reducing memory consumption.
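If you want to try half precision at test time, a minimal, generic PyTorch sketch looks like the following; the toy model is a placeholder, not this project's network. The idea is to cast both the weights and the inputs to float16 and run inference under torch.no_grad().

```python
import torch
import torch.nn as nn

# Minimal FP16-inference sketch with a toy model (placeholder, not the repo's net).
# Casting weights and inputs to float16 roughly halves GPU memory for inference.
model = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(64, 101),          # e.g. 101 classes, as in UCF-101
).cuda().half()                  # move to GPU and convert weights to float16

frames = torch.randn(4, 3, 224, 224, device='cuda').half()   # dummy batch in float16

with torch.no_grad():            # no gradients at test time, which also saves memory
    scores = model(frames)

print(scores.shape, scores.dtype)   # torch.Size([4, 101]) torch.float16
```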
I was trying this model on a GTX 1080 Ti (12GB memory). All training runs have worked fine for me, albeit with a smaller batch size.
However, testing the I-frame model gives a CUDA out-of-memory error, which is strange because testing should not require that much memory, especially when training worked fine.
What is the best way to resolve this issue?