Newton solver to find the mode of the Gaussian (for INLA) #342

Part of INLA roadmap #340.

We want to do the Laplace approximation around $\mu$. We need to find it first.
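For orientation, a minimal sketch of the step this mode-finding feeds into, assuming hypothetical inputs `mu` (the mode) and `H` (the Hessian of the log-density at the mode); this is not the project's implementation:

```python
import numpy as np

def laplace_approx(mu, H):
    """Gaussian approximation N(mu, -H^{-1}) to a density exp(logp(x)),
    where mu is the mode of logp and H is its (negative-definite)
    Hessian evaluated at mu."""
    cov = np.linalg.inv(-H)  # covariance is the inverse negative Hessian
    return mu, cov
```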
Comments
Specific optimizers used in the referenced implementations:
Implementing …
You shouldn't be calling … Also, you don't need to approximate Hessians in pytensor; you should just be able to compute them, unless the thing is really massive. I've done models with ~100k variables and used symbolic inverse Jacobians successfully.
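A minimal sketch of what computing an exact (symbolic) Hessian in pytensor looks like; the objective `f` here is an arbitrary stand-in, not anything from the issue:

```python
import numpy as np
import pytensor
import pytensor.tensor as pt

x = pt.vector("x")
# Hypothetical scalar objective standing in for a negative log-likelihood.
f = 0.5 * pt.sum(x**2) + pt.sum(pt.log1p(pt.exp(x)))

# Exact symbolic gradient and Hessian, no finite-difference approximation.
g = pytensor.gradient.grad(f, x)
H = pytensor.gradient.hessian(f, x)

fn = pytensor.function([x], [f, g, H])
val, grad_val, hess_val = fn(np.zeros(3))
```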
Thanks for the advice! I'm essentially trying to find the mode of a function (that is, maximise the log-likelihood). The INLA notes all point to using a fixed-point iterator with Newton's method to find the root of the derivative (approximated up to second order), but given that we need to calculate the jac and hess at each step anyway, I don't see why we shouldn't simply call …
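For concreteness, a sketch of the Newton iteration described above, assuming hypothetical callables `grad` and `hess` for the objective's derivatives; this is an illustration, not the proposed implementation:

```python
import numpy as np

def newton_mode(grad, hess, x0, tol=1e-10, max_iter=100):
    """Find a mode by running Newton's method on grad(x) = 0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Newton step: solve H(x) dx = g(x) rather than forming H^{-1}.
        x = x - np.linalg.solve(hess(x), g)
    return x

# Toy check on a Gaussian log-density with mode at 1: f(x) = -0.5 * (x - 1)^2
mode = newton_mode(grad=lambda x: -(x - 1.0),
                   hess=lambda x: -np.ones((1, 1)),
                   x0=np.zeros(1))
```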
At the moment I mainly need to figure out how to pass in the NLL inputs and specify the latent field in some neat way. I've been a bit busy these past two days, so I haven't had a chance to play around with INLA/PyMC in Jupyter notebooks to get a feel for it yet. I'm part way through the …
As you point out, any minimization problem can be transformed into a root-finding problem by using the system of equations defined by the gradients, $\nabla f(x) = 0$.

If you have access to a contraction operator $g$, you can find its fixed point by iterating $x_{k+1} = g(x_k)$.

And if you have a system of equations $F(x) = 0$, you can hand it to a root finder such as Newton's method.

Which method to use is entirely up to you.
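A sketch contrasting the two routes above on a toy quadratic objective $f(x) = \tfrac{1}{2} x^\top A x - b^\top x$, whose mode solves $Ax = b$; the names `A`, `b`, and `alpha` are illustrative choices, not from the issue:

```python
import numpy as np
from scipy import optimize

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])

grad = lambda x: A @ x - b  # gradient of f; the mode satisfies grad(x) = 0

# 1) Root finding on the gradient system F(x) = grad(x) = 0.
sol_root = optimize.root(grad, x0=np.zeros(2), jac=lambda x: A)

# 2) Fixed-point iteration: rewrite grad(x) = 0 as x = g(x), where
#    g(x) = x - alpha * grad(x) is a contraction for small enough alpha.
alpha = 0.1
g = lambda x: x - alpha * grad(x)
sol_fp = optimize.fixed_point(g, np.zeros(2), method="iteration")

assert np.allclose(sol_root.x, sol_fp, atol=1e-6)
```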