Question regarding the learning rate update strategy #30

Open
zhitao-guo opened this issue Aug 23, 2024 · 0 comments
Thank you for your beautiful work~

I noticed that the learning rate is updated after every batch by default, which differs from the more common strategy of updating it once per epoch.

```python
self.accum_iter = 1

if data_iter_step % accum_iter == 0:
```

Could you please explain the reasoning behind this approach? If possible, could you also share some references or links on this topic? Looking forward to your reply~
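For context, a minimal sketch of what a per-iteration schedule typically looks like (this is my own illustration, not the repo's code; the function name `per_iter_lr` and all hyperparameter values are hypothetical). The key idea is that the schedule takes a fractional epoch, `epoch + data_iter_step / len(data_loader)`, so the learning rate changes smoothly after every batch instead of in per-epoch steps:

```python
import math

def per_iter_lr(epoch_frac, base_lr=1e-3, min_lr=1e-6,
                warmup_epochs=5, total_epochs=100):
    """Cosine schedule with linear warmup, evaluated per iteration.

    epoch_frac is a float like epoch + data_iter_step / len(data_loader),
    so calling this every batch yields a smooth LR curve.
    (All default values here are illustrative, not from the repo.)
    """
    if epoch_frac < warmup_epochs:
        # Linear warmup from 0 to base_lr over the first warmup_epochs.
        return base_lr * epoch_frac / warmup_epochs
    # Cosine decay from base_lr down to min_lr over the remaining epochs.
    progress = (epoch_frac - warmup_epochs) / (total_epochs - warmup_epochs)
    return min_lr + (base_lr - min_lr) * 0.5 * (1.0 + math.cos(math.pi * progress))
```

With a per-epoch schedule, every batch in an epoch uses the same LR; with the per-iteration version above, `per_iter_lr(2.0)` and `per_iter_lr(2.5)` already differ, which avoids the abrupt jumps at epoch boundaries.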
