
Training TAPIR PyTorch version script? #90

Open
riponazad opened this issue Apr 17, 2024 · 7 comments

Comments

@riponazad

Is there any script for training the PyTorch version of TAPIR?

@cdoersch
Collaborator

Internally we train using JAX. Maintaining a second training codebase would be non-trivial work for us and is beyond the capacity of our team. Sorry about that.

@riponazad
Author

Thanks for your response. Yes, I understand. I tried to train with the existing PyTorch TAPIR code but failed to optimize it due to the non-integrated structure. It seems I need to spend more time on this now. Anyway, thanks for the nice work.

@justachetan

Hi @riponazad could you please explain what you mean by the non-integration structure? Thanks!

@riponazad
Author

Hi @justachetan, the provided PyTorch script doesn't track the computation graph during the forward pass. Hence, autograd can't compute gradients and the optimizer doesn't update the model's parameters. That is my understanding; I didn't dig any further, so I may be wrong or describing it incorrectly. Please let me know if you are able to train it.

@justachetan

Thanks! Yeah I realized that too. But I guess if you remove the calls to .detach() in the code that should be fixed, right @cdoersch?

@riponazad
Author

@justachetan Thanks a lot :). That's it. Saved a lot of time from digging. Now it's being optimized.
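The `.detach()` issue discussed above can be illustrated with a minimal, self-contained PyTorch sketch (a toy linear model, not TAPIR itself; the variable names are hypothetical and purely for demonstration):

```python
import torch

# Toy model standing in for the network; not the actual TAPIR code.
model = torch.nn.Linear(4, 1)
x = torch.randn(8, 4)
target = torch.zeros(8, 1)

# With .detach(): the output is cut from the autograd graph, so the
# resulting loss carries no grad_fn and backward() would fail.
out_detached = model(x).detach()
loss_detached = torch.nn.functional.mse_loss(out_detached, target)
print(loss_detached.requires_grad)  # False

# Without .detach(): the forward pass is tracked, gradients flow,
# and an optimizer step can update the parameters.
out = model(x)
loss = torch.nn.functional.mse_loss(out, target)
loss.backward()
print(model.weight.grad is not None)  # True
```

This is why removing the `.detach()` calls in the inference-oriented script lets the optimizer start updating the model's parameters.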

@shivanimall

Hi @justachetan @riponazad, in case you have written a training script in torch, would it be possible to make it available? (I wanted to compare it with what I have.) Thank you!
