[FEAT] Add option to support user defined learning rate scheduler for NeuralForecast Models #998
Conversation
Omit unused import in itransformer.py
Force-pushed from e3de57f to 2a7d761.
Fix assertion check; fix iterable issue; fix parameter passed to user-defined lr_scheduler
Force-pushed from 2a7d761 to 17a895b.
@jmoralez Please review. I hope this will be useful to the community too.
Can you please add warnings if the user provides …
@jmoralez
If you have the link/reference to the request from the sktime folks, we can mention it in this PR too. Thanks.
The `lr_scheduler` kwargs should be checked with `if obj`.
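For illustration only, a small sketch of what that truthiness check implies (the variable name is hypothetical, not the PR's actual code):

```python
# Hypothetical illustration of the review comment: with a plain truthiness check,
# both None and an empty dict fall through to the default scheduler behavior.
lr_scheduler_kwargs = {}  # could also be None if the user passed nothing

if lr_scheduler_kwargs:  # `if obj` is False for None and for {}
    print("forward the kwargs to the user-defined scheduler")
else:
    print("build the scheduler with its default arguments")
```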
I was facing the same issue when I wanted to use a custom scheduler, and your solution is pretty much what I have. The only thing I am missing is that I think we should also be able to change the other arguments of `lr_scheduler_config`, like the frequency or the interval (see https://lightning.ai/docs/pytorch/stable/common/lightning_module.html#configure-optimizers for the list of other parameters). My solution was in fact to add two new kwargs, …
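For context, a minimal sketch of the Lightning return shape that comment refers to, where `interval` and `frequency` live inside the scheduler config; the module, optimizer, and scheduler choices here are illustrative, not the PR's code:

```python
import torch
import pytorch_lightning as pl


class SketchModule(pl.LightningModule):
    def configure_optimizers(self):
        # Returning a dict lets you set extra fields such as "interval" and
        # "frequency", per the Lightning docs linked above.
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)
        return {
            "optimizer": optimizer,
            "lr_scheduler": {
                "scheduler": scheduler,
                "interval": "epoch",  # step once per epoch (or "step")
                "frequency": 1,       # number of intervals between steps
            },
        }
```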
@BrunoBelucci On top of the prepared PR, I think I may need an additional arg called …
I'm not really a fan of adding so many arguments to all models. Would it be possible to instead provide a function that overrides the `configure_optimizers` method, either by calling it or by monkey patching?
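A rough sketch of what that monkey-patching alternative could look like, assuming an NHITS model; this is not what the PR implements, just an illustration of the suggestion:

```python
import types

import torch
from neuralforecast.models import NHITS


def custom_configure_optimizers(self):
    # User-supplied replacement: full control over optimizer and scheduler.
    optimizer = torch.optim.SGD(self.parameters(), lr=1e-2, momentum=0.9)
    scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=50)
    return {
        "optimizer": optimizer,
        "lr_scheduler": {"scheduler": scheduler, "interval": "step"},
    }


model = NHITS(h=12, input_size=24, max_steps=100)
# Bind the function to this instance so Lightning calls it instead of the default.
model.configure_optimizers = types.MethodType(custom_configure_optimizers, model)
```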
Honestly, I think that in this case, the cleanest solution is to add one more argument. We already have arguments for the trainer, optimizer, and scheduler. Introducing a different approach for configuring the scheduler, which is part of the same system, could confuse users and require them to write additional boilerplate code to achieve their goals. Here are some scenarios to further develop my point:
I see no reason for users to have to handle each scenario above differently. Additionally, this approach would allow us to document the default behavior clearly.
What I mean is that we currently have two arguments (for the optimizer); this adds two more (for the scheduler), and you're proposing adding a fifth, when all of these are going to be used in the same method (`configure_optimizers`). Given that sktime is already using the optimizer arguments, it would be a bad experience to deprecate them and introduce a new one that takes a function, so I think we should move forward with adding more rather than deprecate something we recently introduced. I just wish we'd done the single-argument approach from the start.
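For reference, a hedged sketch of how the call site might look once both argument pairs exist; the `lr_scheduler`/`lr_scheduler_kwargs` names are assumed from this PR's title and may differ from the released API:

```python
import torch
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS

# Assumed argument names: the scheduler pair mirrors the existing
# optimizer/optimizer_kwargs pair discussed above.
model = NHITS(
    h=12,
    input_size=24,
    max_steps=100,
    optimizer=torch.optim.Adam,
    optimizer_kwargs={"lr": 1e-3},
    lr_scheduler=torch.optim.lr_scheduler.StepLR,
    lr_scheduler_kwargs={"step_size": 10, "gamma": 0.5},
)
nf = NeuralForecast(models=[model], freq="M")
```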
Thanks for the thorough tests, as always!
@jmoralez By the way, shall we reconsider this implementation and revert this PR? I implemented the option of modifying `configure_optimizers` behavior via …
Thank you so much for the implementation of a custom lr scheduler! @BrunoBelucci: …
@MLfreakPy …
Rationale