
Parameter-Free Optimization Algorithms #81

Merged: 18 commits into TuringLang:master on Aug 12, 2024

Conversation

@Red-Portal (Member) commented Aug 10, 2024

This PR adds some recent parameter-free optimization algorithms. These algorithms should provide close-to-optimal performance without any tuning. The hope is to choose one of these algorithms as the default strategy so that Turing users don't need to perform any tuning when using AdvancedVI. In particular, the COCOB algorithm has been reported to be very effective for particle variational inference algorithms.
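For reference, here is a minimal per-coordinate sketch of COCOB-Backprop as described by Orabona & Tommasi (2017). It is written in Python/NumPy purely for illustration, not the Julia implementation added by this PR, and the names `grad`, `w1`, and `n_steps` are hypothetical:

```python
import numpy as np

def cocob_backprop(grad, w1, n_steps, alpha=100.0, eps=1e-8):
    """Sketch of per-coordinate COCOB-Backprop (Orabona & Tommasi, 2017).

    Treats the negative gradient as a coin-flip outcome and bets a fraction
    of the accumulated "reward"; no learning rate appears anywhere.
    """
    w1 = np.asarray(w1, dtype=float)
    w = w1.copy()
    L = np.full_like(w1, eps)    # running max |g| per coordinate
    G = np.zeros_like(w1)        # running sum of |g|
    reward = np.zeros_like(w1)   # accumulated betting reward
    theta = np.zeros_like(w1)    # running sum of negative gradients
    for _ in range(n_steps):
        g = -grad(w)             # the "coin outcome"
        L = np.maximum(L, np.abs(g))
        G = G + np.abs(g)
        reward = np.maximum(reward + (w - w1) * g, 0.0)
        theta = theta + g
        w = w1 + theta / (L * np.maximum(G + L, alpha * L)) * (L + reward)
    return w
```

The key point is that the effective step size is derived entirely from the accumulated reward of the betting game, which is what makes the rule parameter-free.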

The PR also adds parameter-averaging strategies. Some of the parameter-free algorithms report the best performance when combined with parameter averaging. This paper also previously suggested that VI is best combined with parameter averaging (albeit determining when to start averaging via the $\widehat{R}$ measure).
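To illustrate what polynomial averaging does, here is a generic Python sketch of one common form (the polynomial-decay weighting in the style of Shamir & Zhang); it is not the averaging API added by this PR, and `gamma` is a hypothetical knob. Each new iterate enters the average with weight $(\gamma+1)/(t+\gamma+1)$, so larger `gamma` favors recent iterates, and `gamma = 0` recovers the plain running mean:

```python
def polynomial_average(iterates, gamma=8.0):
    """Polynomial-decay averaging:
    xbar_t = (1 - w_t) * xbar_{t-1} + w_t * x_t,
    with w_t = (gamma + 1) / (t + gamma + 1)."""
    xbar = None
    for t, x in enumerate(iterates):
        w = (gamma + 1.0) / (t + gamma + 1.0)
        xbar = x if xbar is None else (1.0 - w) * xbar + w * x
    return xbar
```

Down-weighting early iterates this way avoids the main drawback of the plain running mean, which lets poor early iterates dominate the average for a long time.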

Once the PosteriorDB project is done, we can probably run some large-scale experiments to determine which one is best.

  • Add parameter-free optimization algorithms
    • DoG
    • DoWG
    • COCOB
  • Parameter averaging strategies
    • No averaging
    • Polynomial averaging
    • (maybe?) $\widehat{R}$-based adaptive averaging strategy
  • Documentation
  • Unit tests
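As a flavor of how simple these rules can be, here is an illustrative Python/NumPy sketch of the DoG ("Distance over Gradients") step size: the step is the largest distance travelled from the initial point so far, divided by the root of the accumulated squared gradient norms. This is a sketch under those assumptions, not the PR's Julia code; `grad`, `x0`, and `r_eps` are hypothetical names.

```python
import numpy as np

def dog_sgd(grad, x0, n_steps, r_eps=1e-6):
    """Sketch of SGD with the DoG step-size rule:
    eta_t = rbar_t / sqrt(sum_{i<=t} ||g_i||^2),
    where rbar_t is the max distance from x0 seen so far."""
    x0 = np.asarray(x0, dtype=float)
    x = x0.copy()
    rbar = r_eps        # small initial "distance" estimate
    g_sq_sum = 0.0      # running sum of squared gradient norms
    for _ in range(n_steps):
        g = grad(x)
        g_sq_sum += float(np.dot(g, g))
        eta = rbar / np.sqrt(g_sq_sum)
        x = x - eta * g
        rbar = max(rbar, float(np.linalg.norm(x - x0)))
    return x
```

The only remaining knob is `r_eps`, which merely needs to be a small underestimate of the distance to the optimum; the rule then self-calibrates the step size.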

codecov bot commented Aug 10, 2024

Codecov Report

Attention: Patch coverage is 94.44444% with 3 lines in your changes missing coverage. Please review.

Project coverage is 95.76%. Comparing base (1b36c6e) to head (107e793).
Report is 12 commits behind head on master.

Files Patch % Lines
src/AdvancedVI.jl 0.00% 3 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master      #81      +/-   ##
==========================================
- Coverage   96.09%   95.76%   -0.33%     
==========================================
  Files          11       13       +2     
  Lines         205      260      +55     
==========================================
+ Hits          197      249      +52     
- Misses          8       11       +3     


@Red-Portal (Member, Author) commented Aug 10, 2024

This PR is now ready. @yebai @mhauru @sunxd3 If anybody could take a look, it would be great! Running the formatter unfortunately messed up the diff a little bit. Apologies for this in advance. The main breaking change in this PR is that optimize now returns both the last iterate and the iterate average of SGD. (Previously it returned only the last iterate.)


@sunxd3 sunxd3 left a comment


This is great! I learned a lot by reviewing.
I went through the algorithms in the paper, and the implementations look correct.
A couple of really minor comments.

src/optimization/rules.jl (resolved)
src/optimization/rules.jl (resolved)
src/optimize.jl (resolved)
@Red-Portal Red-Portal added this to the v0.3.0 milestone Aug 12, 2024
@Red-Portal Red-Portal merged commit 8ba6cb6 into TuringLang:master Aug 12, 2024
8 of 11 checks passed
@Red-Portal Red-Portal deleted the parameterfree branch September 10, 2024 04:02
3 participants