
Should I add '--epochs 200' in the provided pre-training command? #2

Open
yanjk3 opened this issue Aug 10, 2022 · 0 comments
Comments
yanjk3 commented Aug 10, 2022

Thanks for your exciting work!!
The provided pre-training command seems to be for training 300 epochs on ImageNet. I wonder whether the provided checkpoint file 'dino_selfpatch.pth' was pre-trained for 200 epochs or 300 epochs. As mentioned in the paper, you pre-train SelfPatch on ImageNet for 200 epochs. If I want to reproduce your results, should I add '--epochs 200' to the provided pre-training command?

@yanjk3 yanjk3 changed the title Show I add '--epochs 200' in the provided pre-training command? Should I add '--epochs 200' in the provided pre-training command? Aug 10, 2022