
Added SVRG optimizer documentation and tests for stability and convergence #2

Open
wants to merge 1 commit into base: main
Conversation

Vishal-sys-code

This PR introduces a new Stochastic Variance Reduced Gradient (SVRG) optimizer to the statsmodels_sgd repository, including comprehensive documentation and unit tests for accuracy and stability.

Key Changes

  1. SVRG Optimizer Implementation:

    • Added an SVRG optimizer in optimizers.py, leveraging variance reduction for improved convergence stability and speed compared to standard SGD.
    • Supports configurable learning rate (lr) and inner-loop iterations (n_inner), providing flexibility for various regression and classification tasks.
  2. Documentation:

    • Created svrg_optimizer.md in the docs folder.
    • Detailed documentation covering the optimizer’s purpose, configuration options, usage examples, and integration steps.
  3. Unit Tests:

    • Added thorough tests in tests/test_optimizers.py to verify parameter updates, stability, and convergence properties of SVRG.
    • Tests cover a range of scenarios to ensure the implementation is robust.
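For reference, the core SVRG scheme the optimizer implements works in two nested loops: each outer epoch snapshots the iterate and computes the full gradient there; each inner step then applies a variance-reduced stochastic gradient. The sketch below is illustrative only (the function name, signature, and demo problem are not from the PR; the actual implementation lives in `optimizers.py`):

```python
import numpy as np

def svrg(grad_i, w0, n, lr=0.05, n_outer=20, n_inner=100, seed=0):
    """Minimize (1/n) * sum_i f_i(w) via SVRG (illustrative sketch).

    grad_i(w, i) returns the gradient of the i-th component at w.
    """
    rng = np.random.default_rng(seed)
    w = w0.copy()
    for _ in range(n_outer):
        w_snap = w.copy()
        # Full gradient at the snapshot: the variance-reduction anchor.
        mu = np.mean([grad_i(w_snap, i) for i in range(n)], axis=0)
        for _ in range(n_inner):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient: unbiased, and its
            # variance shrinks as w approaches the snapshot.
            g = grad_i(w, i) - grad_i(w_snap, i) + mu
            w -= lr * g
    return w

# Demo on noiseless least squares: f_i(w) = 0.5 * (x_i @ w - y_i)**2
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
grad = lambda w, i: (X[i] @ w - y[i]) * X[i]
w_hat = svrg(grad, np.zeros(3), n=200)
```

Unlike plain SGD, no decaying step size is needed here: the correction term drives the gradient noise to zero near the optimum, which is what enables linear convergence on strongly convex problems.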

This addition enhances statsmodels_sgd by introducing a more efficient optimizer option, making it suitable for large datasets where variance reduction can significantly speed up convergence.

Please review and merge to make this improvement available in the main repository.

Added svrg_optimizer.md to the docs folder, providing an overview, usage example, parameters, and testing instructions for the SVRG optimizer. Updated tests/test_optimizers.py to include test cases validating SVRG’s variance reduction and parameter update functionality. Ensured that SVRG implementation in optimizers.py supports variance-reduced updates for enhanced convergence.
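A variance-reduction test can be written directly against the SVRG gradient estimator: near the snapshot, the corrected gradient should have strictly lower variance than the raw per-sample gradient. The check below is a sketch in the spirit of the tests described above; the function name and problem setup are illustrative, not the actual contents of `tests/test_optimizers.py`:

```python
import numpy as np

def test_svrg_gradient_has_lower_variance():
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))
    w_true = rng.normal(size=4)
    y = X @ w_true
    # Per-sample least-squares gradients, one row per sample.
    grads = lambda w: (X @ w - y)[:, None] * X

    w_snap = w_true + 0.5 * rng.normal(size=4)   # snapshot iterate
    w = w_snap + 0.01 * rng.normal(size=4)       # current iterate, near snapshot
    mu = grads(w_snap).mean(axis=0)              # full gradient at snapshot

    sgd_grads = grads(w)                         # raw stochastic gradients
    svrg_grads = grads(w) - grads(w_snap) + mu   # variance-reduced gradients

    # Total variance across components must drop under the SVRG correction.
    assert svrg_grads.var(axis=0).sum() < sgd_grads.var(axis=0).sum()
```

Both estimators have the same mean (the correction term `mu - grads(w_snap)` averages to zero), so the test isolates exactly the property the optimizer relies on.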
@Vishal-sys-code Vishal-sys-code changed the title Add SVRG optimizer documentation and tests for stability and convergence Added SVRG optimizer documentation and tests for stability and convergence Nov 3, 2024