Training TAPIR PyTorch version script? #90
Internally we train using JAX. Maintaining a second training codebase would be non-trivial work for us and is beyond the capacity of our team. Sorry about that.
Thanks for your response. Yes, I understood that. I tried to train with the existing PyTorch TAPIR code but failed to optimize it, since the forward pass isn't integrated with autograd. It seems I need to spend more time on this. Anyway, thanks for the nice work.
Hi @riponazad, could you please explain what you mean by the non-integration structure? Thanks!
Hi @justachetan, the provided PyTorch script doesn't track the computation during the forward pass. Hence, autograd can't compute gradients and the optimizer doesn't update the parameters of the model. That is my understanding; I didn't dig any further, so I may be wrong or describing it wrongly. Please let me know if you are able to train it.
Thanks! Yeah, I realized that too. But I guess if you remove the calls to
@justachetan Thanks a lot :). That's it. Saved a lot of time from digging. Now it's being optimized.
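For anyone hitting the same problem, here is a minimal sketch of the failure mode described above. The module below is hypothetical (it is not TAPIR's actual code); it just shows how a `detach()` inside the forward pass cuts the computation graph, so autograd never reaches the parameters and the optimizer has nothing to update:

```python
import torch
import torch.nn as nn

# Hypothetical toy module, NOT the real TAPIR model: the flag controls
# whether the forward pass detaches its output from the autograd graph.
class TinyModel(nn.Module):
    def __init__(self, detach_in_forward: bool):
        super().__init__()
        self.linear = nn.Linear(4, 1)
        self.detach_in_forward = detach_in_forward

    def forward(self, x):
        out = self.linear(x)
        if self.detach_in_forward:
            # Cuts the graph: autograd can no longer reach self.linear.
            out = out.detach()
        return out

def loss_is_trainable(detach_in_forward: bool) -> bool:
    torch.manual_seed(0)
    model = TinyModel(detach_in_forward)
    loss = model(torch.randn(8, 4)).pow(2).mean()
    # If the graph was cut, the loss carries no gradient information.
    return loss.requires_grad

print(loss_is_trainable(True))   # False: graph cut, parameters never update
print(loss_is_trainable(False))  # True: gradients flow, training works
```

Removing the graph-cutting calls from the inference script (as suggested above) is what restores the second behavior.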
Hi @justachetan @riponazad, in case you have written a training script in torch, I wanted to ask whether it would be possible to make it available? (I wanted to compare it with what I have.) Thank you!
Is there any script for training TAPIR pytorch version?