- The convergence tolerance computations have changed slightly
- Optional scaling for inverse Hessian in L-BFGS
- Support for initial step length guesses via LineSearches
- Support for optimization on Riemannian manifolds
- Support for optimization of functions of complex variables
- New experimental KrylovTrustRegion method useful when cheap Hessian-vector products are available
- Improved support for BigFloats
- Add docstrings to methods
- Drop support for Julia versions less than v0.6.0-pre
- Fminbox: if the initial guess lies on the boundary of the box, the guess is moved inside the box and a warning is produced, instead of crashing with an error.
- Significant changes to the NonDifferentiable, OnceDifferentiable, and TwiceDifferentiable setup; these now hold temporaries relevant to the evaluation of objectives, gradients, and Hessians. They also hold f_calls, g_calls, and h_calls counters
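The reworked setup can be sketched roughly as follows (a minimal sketch, assuming the storage-first gradient signature and the `Optim.f_calls`/`Optim.g_calls` accessors on the result; treat the details as illustrative):

```julia
using Optim

# Rosenbrock objective with an in-place gradient (storage argument first).
f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
function g!(storage, x)
    storage[1] = -2.0 * (1.0 - x[1]) - 400.0 * x[1] * (x[2] - x[1]^2)
    storage[2] = 200.0 * (x[2] - x[1]^2)
end

# OnceDifferentiable now carries its own temporaries and call counters.
od = OnceDifferentiable(f, g!, zeros(2))
res = optimize(od, zeros(2), BFGS())

# The accumulated counters travel with the result.
println(Optim.f_calls(res), " ", Optim.g_calls(res))
```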
- Refactor tests
- Drop v0.4 support
- Add limits on the number of f_calls, g_calls, and h_calls
- Improve trace for univariate optimization
- Changed order of storage arrays and evaluation point arrays in gradient and Hessian calls
- Skip v0.8.0 to allow fixes on Julia v0.5.0
- Fix deprecations for *Function constructors
- Fix depwarns on Julia master (v0.6)
- Update references to new JuliaNLSolvers home for Optim+family
- Various bug fixes
- Deprecate DifferentiableFunction and TwiceDifferentiableFunction in favor of OnceDifferentiable and TwiceDifferentiable
- Widen some type annotations (e.g. allow multidimensional arrays as inputs again)
- Introduce the allow_f_increases keyword in Optim.Options to allow the objective to increase between iterations
- New option in Optim.Options: allow_f_increases. Defaults to false, but if set to true, the solver will not stop even if a step leads to an increase in the objective.
- Newton and BFGS: set initial step length to one. See #328.
- OptimizationOptions is now unexported, and has been renamed to Options. Must be accessed as Optim.Options as a result.
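Under the new name, options such as allow_f_increases are passed along these lines (a minimal sketch; the keyword values are illustrative):

```julia
using Optim

f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# OptimizationOptions is no longer exported; construct Optim.Options instead.
# allow_f_increases defaults to false.
opts = Optim.Options(allow_f_increases = true, iterations = 500)
res = optimize(f, zeros(2), NelderMead(), opts)
println(Optim.minimum(res))
```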
- Bug fixes to Nelder-Mead tracing.
- Keywords with ! in them have been deprecated in favor of versions without. For example, linesearch! -> linesearch.
- Failures in a line search now terminate the optimization with a warning and a status of non-convergence. The results can still be accessed, but `minimizer(res)` will not represent a local minimum. See #275.
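Given that behavior, callers should check convergence before trusting the minimizer; a minimal sketch using the `Optim.converged` accessor on the result object:

```julia
using Optim

f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
res = optimize(f, zeros(2), BFGS())

if Optim.converged(res)
    x_min = Optim.minimizer(res)   # safe to treat as a candidate local minimum
else
    # A failed line search lands here: results exist, but minimizer(res)
    # need not be a local minimum.
    println("optimization did not converge")
end
```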
- Refactor code internally to clean it up and allow more flexible use of Optim
- Switch to the new (v0.3) ForwardDiff
- Make minimizer/minimum transition final
- Fix dispatch bug for univariate optimization.
- The line search functionality has been separated into a new package, LineSearches.jl; see #277.
- Make move to minimizer (the argmin) and minimum (objective at argmin) final: field names are now in sync with function based API.
- Assess convergence in g before iterating to avoid line search errors if `initial_x` is a stationary point
- Fix trace bug in LevenbergMarquardt.
- Add ForwardDiff AD functionality to NewtonTrustRegion
- Make documentation even more noticeable in README.md
- Various bug fixes
- Added box constraints to Levenberg Marquardt
- Changed the old Nelder-Mead solver to use adaptive parameters
- Julia v0.5 deprecation fixes
- Added NewtonTrustRegion solver (#238, #245)
- Added ParticleSwarm solver (#218)
- Added documentation generated by Documenter.jl, see PR #225.
- Added NEWS.md