The default algorithm of the Optim minimizer is NelderMead(), which is a gradient-free method. I'd like to explore gradient-based algorithms, such as GradientDescent() or LBFGS().
Motivation:
We sometimes get weird fit results for calibration gamma peaks
When testing combined fits or comparing the different peak-shape models, the fit is sometimes unreliable (local minimum?)
The choice of some pseudo-priors (e.g. the skew fraction) has an unrealistically strong impact on the fit result
When moving to a gradient-based algorithm, it would make sense to provide the minimizer with analytical gradients. @oschulz
do you have experience with how to do that in Julia? A rough sketch of what I have in mind is below.
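A minimal sketch of passing an analytical gradient to Optim.jl: `optimize` accepts an in-place gradient function alongside the objective. The quadratic objective here is just a hypothetical stand-in for the actual peak-shape likelihood, not the package's fit code.

```julia
using Optim

# Toy objective standing in for the negative log-likelihood of the peak fit.
f(x) = (x[1] - 1.0)^2 + 4.0 * (x[2] + 2.0)^2

# Analytical gradient, written into G in place.
function g!(G, x)
    G[1] = 2.0 * (x[1] - 1.0)
    G[2] = 8.0 * (x[2] + 2.0)
end

x0 = zeros(2)
res = Optim.optimize(f, g!, x0, LBFGS())
Optim.minimizer(res)  # ≈ [1.0, -2.0]
```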
In principle we can use automatic differentiation (ForwardDiff, if it's just a handful of parameters) to enable LBFGS and so on. We have to make sure the likelihood is differentiable in principle, but for fitting a function to a histogram that's typically not a problem.
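A minimal sketch of that approach: instead of hand-written gradients, Optim.jl can build them via ForwardDiff with the `autodiff = :forward` keyword. Again, the objective is a hypothetical toy stand-in for the binned likelihood.

```julia
using Optim

# Toy stand-in for the negative log-likelihood; any smooth Julia function
# of the parameter vector works, as long as it is ForwardDiff-compatible
# (i.e. written generically, not hard-coded to Float64).
f(x) = (x[1] - 1.0)^2 + 4.0 * (x[2] + 2.0)^2

x0 = zeros(2)

# Gradients are computed by ForwardDiff under the hood.
res = Optim.optimize(f, x0, LBFGS(); autodiff = :forward)
Optim.minimizer(res)
```

Equivalently, one can call `ForwardDiff.gradient!` inside an explicit `g!` and pass that to `optimize(f, g!, x0, LBFGS())`, which is useful if the gradient should be reused elsewhere.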