In certain situations, you want to transform your optimisation space. E.g. when fitting rates for chemical reaction networks (a parameter fitting problem), you typically fit the log10 of the rates, not the rates themselves (two advantages: it naturally imposes non-negativity, and it also makes fitting easier, as the parameters typically vary over several orders of magnitude).
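For concreteness, here is a minimal sketch of the manual pattern this describes (the toy model, data, and function names are made up purely for illustration): the optimiser only ever sees θ = log10.(k), and the rates are recovered by exponentiating.

```julia
# Hypothetical toy model and data, just to illustrate fitting in log10 space.
model(k, t) = k[1] .* exp.(-k[2] .* t)
ts = 0.0:0.1:1.0
data = model([2.0, 0.5], ts)

rate_loss(k) = sum(abs2, model(k, ts) .- data)   # loss in natural (rate) space
log10_loss(θ, p = nothing) = rate_loss(10 .^ θ)  # what the optimiser actually sees

θ0 = log10.([1.0, 1.0])                          # unconstrained initial guess
# pass log10_loss and θ0 to the optimiser of choice, then recover k = 10 .^ θ_opt
```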
This is obviously something that the user can encode in their cost function, and e.g. something PEtab can handle automatically (https://sebapersson.github.io/PEtab.jl/stable/API/#PEtab.PEtabParameter). However, if it were possible to encode such transforms in Optimization itself, this would make things way easier (it would also save the user from having to juggle different scales back and forth, which is an utter mess).
It would be useful to have both ready-made transforms and the option for the user to define their own (both across all optimisation variables and for individual ones). Some useful ones (a sketch follows the list) would be:
Log10 transform.
Sigmoidal transform (basically a way to turn a bounded value (u1 < u < u2) into an unbounded one, e.g. for Adam optimisation).
Some kind of tanh transform (combining the log10 transform with encoding of the bounds into the optimisation variable).
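To make the list above concrete, here is a rough sketch of what such forward/inverse pairs could look like (the function names are my own, not an existing API):

```julia
# Each pair maps an unbounded optimisation variable θ to the model variable u and back.

# Log10: u > 0  <->  θ ∈ ℝ
to_u_log10(θ) = 10.0^θ
to_θ_log10(u) = log10(u)

# Sigmoidal (logistic): u1 < u < u2  <->  θ ∈ ℝ
to_u_sigmoid(θ, u1, u2) = u1 + (u2 - u1) / (1 + exp(-θ))
to_θ_sigmoid(u, u1, u2) = log((u - u1) / (u2 - u))

# tanh-style: bounds imposed in log10 space, log10(u1) < log10(u) < log10(u2)
to_u_tanh(θ, u1, u2) = 10.0^(log10(u1) + (log10(u2) - log10(u1)) * (tanh(θ) + 1) / 2)
to_θ_tanh(u, u1, u2) = atanh(2 * (log10(u) - log10(u1)) / (log10(u2) - log10(u1)) - 1)
```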
Obviously this is no trivial undertaking, but it would definitely be very useful.
I recently had a similar set of thoughts and ended up refactoring my code base to use TransformVariables.jl, so my cost function is written with a transform as an argument.
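For reference, this is roughly what that pattern looks like (a sketch assuming the TransformVariables.jl API with `as`, `asℝ₊`, `transform`, and `inverse`; the parameter names and cost function are hypothetical):

```julia
using TransformVariables

# k1 must be positive, k2 must lie in (0, 1)
t = as((k1 = asℝ₊, k2 = as(Real, 0.0, 1.0)))

# Cost written in the natural parameter space, taking the transform as an argument.
cost(x, trans) = (p = transform(trans, x); (p.k1 - 2.0)^2 + (p.k2 - 0.3)^2)

x0 = inverse(t, (k1 = 1.0, k2 = 0.5))   # unconstrained start, length dimension(t)
cost(x0, t)                              # the optimiser only ever sees x ∈ ℝ^dimension(t)
```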
Other things I liked about TransformVariables:
After optimizing, you can use the same transform on your optimized values to get out the parameter values of interest.
With my recently-merged PR, the transform can tack on units; the same PR also added composable transforms, which make fine-tuning easier.
The transform can take a flat vector and give a NamedTuple, which can in turn be mapped onto a subset of the fields of a struct with setproperties from ConstructionBase (see the sketch below).
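Something like the following, assuming ConstructionBase's setproperties and a hypothetical struct:

```julia
using TransformVariables, ConstructionBase

Base.@kwdef struct Params
    k1::Float64 = 1.0
    k2::Float64 = 0.5
    label::String = "default"      # field not touched by the fit
end

t  = as((k1 = asℝ₊, k2 = as(Real, 0.0, 1.0)))
x  = randn(dimension(t))           # flat unconstrained vector from the optimiser
nt = transform(t, x)               # NamedTuple (k1 = ..., k2 = ...)
p  = setproperties(Params(), nt)   # only k1 and k2 are replaced
```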
(Sorry to be advertising my own work, but it solved a lot of problems for me!)