Added SVRG optimizer documentation and tests for stability and convergence #2
This PR introduces a new Stochastic Variance Reduced Gradient (SVRG) optimizer to the `statsmodels_sgd` repository, including comprehensive documentation and unit tests for accuracy and stability.

**Key Changes**
**SVRG Optimizer Implementation:**
- Adds an `SVRG` optimizer in `optimizers.py`, leveraging variance reduction for improved convergence stability and speed compared to standard SGD (see the sketch after this list).
- Exposes a configurable learning rate (`lr`) and number of inner-loop iterations (`n_inner`), providing flexibility for various regression and classification tasks.
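For reviewers unfamiliar with the method, the following is a minimal sketch of the SVRG update rule: an outer loop takes a full-gradient snapshot, and the inner loop corrects each stochastic gradient with it. It is illustrative only and not the implementation in `optimizers.py`; the `svrg` function, `grad_i`, `n_outer`, and `seed` are hypothetical names, while `lr` and `n_inner` mirror the parameters exposed by this PR.

```python
import numpy as np

def svrg(grad_i, w0, n_samples, lr=0.01, n_inner=100, n_outer=20, seed=None):
    """Minimal SVRG sketch. grad_i(w, i) returns the gradient of the i-th loss term at w."""
    rng = np.random.default_rng(seed)
    w_snapshot = np.asarray(w0, dtype=float).copy()
    for _ in range(n_outer):
        # Full gradient at the snapshot point: the variance-reduction anchor.
        full_grad = np.mean([grad_i(w_snapshot, i) for i in range(n_samples)], axis=0)
        w = w_snapshot.copy()
        for _ in range(n_inner):
            i = rng.integers(n_samples)
            # Variance-reduced stochastic gradient: unbiased, and its variance
            # shrinks as w and the snapshot approach the optimum.
            g = grad_i(w, i) - grad_i(w_snapshot, i) + full_grad
            w = w - lr * g
        w_snapshot = w  # refresh the snapshot after each inner loop
    return w_snapshot

# Usage on a small synthetic least-squares problem.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true

def grad(w, i):
    return (X[i] @ w - y[i]) * X[i]

w_hat = svrg(grad, np.zeros(3), n_samples=200, lr=0.02, n_inner=200, n_outer=30)
```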
**Documentation:**
- Adds `svrg_optimizer.md` in the `docs` folder.

**Unit Tests:**
- Adds tests in `tests/test_optimizers.py` to verify parameter updates, stability, and convergence properties of SVRG (see the illustrative check below).
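As a rough illustration of the kind of convergence check such tests can perform (the actual assertions in `tests/test_optimizers.py` may differ; `test_svrg_recovers_least_squares_solution` is a hypothetical name and `svrg` refers to the sketch above):

```python
import numpy as np

def test_svrg_recovers_least_squares_solution():
    # Hypothetical check: on a noiseless linear model, SVRG should recover
    # the true weights to within a small tolerance.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(300, 4))
    w_true = np.array([2.0, -1.0, 0.5, 3.0])
    y = X @ w_true

    def grad(w, i):
        return (X[i] @ w - y[i]) * X[i]

    # Uses the `svrg` helper from the sketch above.
    w_hat = svrg(grad, np.zeros(4), n_samples=300, lr=0.02, n_inner=300, n_outer=40)
    assert np.allclose(w_hat, w_true, atol=1e-2)
```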
This addition enhances `statsmodels_sgd` by introducing a more efficient optimizer option, making it suitable for large datasets where variance reduction can significantly improve results.

Please review and merge to make this improvement available in the main repository.