
Cross derivatives of a multivariate function? #205

Open
daisukeadachi opened this issue Jul 6, 2022 · 3 comments

@daisukeadachi

Hello,

I hope all is well. I have a question, as per the title: I am interested in computing an $M\times N \times N$ three-way array of derivatives:

$\frac{\partial^{2}f_{i}}{\partial x_{j}\partial x_{k}}$

where $f$ is an $M$-vector of functions $f_i \ (i=1,\ldots,M)$, and each $f_i$ takes $x \in \mathbb{R}^N$ as its argument. What is the best way to compute this object in FiniteDifferences.jl?

This is clearly not a Jacobian, but I tried jacobian(central_fdm(5, 2), myfun, myval) just to see what it gives. It raised no error, but it returned an $M\times N$ array rather than an $M\times N \times N$ array. So what is it supposed to compute?

Best,
Daisuke


wesselb commented Aug 9, 2022

Hey @daisukeadachi,

If jacobian receives a matrix-valued function, it vectorises the output before computing the Jacobian, so the result is a matrix. I think this example should be what you're after:

using LinearAlgebra
using FiniteDifferences

m = central_fdm(5, 1)
# Don't use adaptation for the inner method.
m_no_adapt = central_fdm(5, 1, adapt=0)

f(x) = [dot(x, x), 2dot(x, x)]

x = randn(3)  # example input in R^3

# Jacobian of the (vectorised) Jacobian of f:
d = jacobian(m, x -> jacobian(m_no_adapt, f, x)[1], x)[1]
d = reshape(d, 2, 3, 3)  # Undo the vectorisation. The two outputs come first.

# Hessian for the first output:
@assert isapprox(d[1, :, :], 2I)

# Hessian for the second output:
@assert isapprox(d[2, :, :], 4I)


paulc-git commented Oct 13, 2022

Does this generalise straightforwardly to higher dimensions? E.g. I have a 5x5x5 array where each entry depends on 5 parameters, and I want to differentiate it w.r.t. those 5 parameters to get a 5x5x5x5 array in which, e.g., the derivative w.r.t. the first parameter is stored in the first sub-array:

d[:, :, :, 1]

etc.

I would assume this is achieved in the same way, i.e.

fun  # function depending on 5 parameters with a 5x5x5 array as output
d = jacobian(central_fdm(5, 1), fun, val)[1]
d = reshape(d, 5, 5, 5, 5)

I am unsure because I fear that the indices could get all messed up in the process of vectorising and reshaping at higher dimensions.


wesselb commented Oct 14, 2022

Hey @paulc-git! I believe this readily generalises to higher dimensions in the way you describe. You're right that it's not obvious that the indexing just works out, but it does: Julia's vec and reshape both use column-major order, so reshape(d, 5, 5, 5, 5) exactly undoes the vectorisation, with the output indices first and the parameter index last.
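For what it's worth, the column-major convention is easy to check with a small stdlib-only sketch (no FiniteDifferences.jl needed; `g` and `naive_jacobian` below are illustrative stand-ins, not part of the package):

```julia
using LinearAlgebra

# A hypothetical matrix-valued function from R^2 to 2x2 matrices.
g(p) = [p[1]^2 p[1]*p[2]; p[2] p[1]+p[2]]

# Naive central-difference Jacobian of vec(g):
# rows index vec(output), columns index the parameters.
function naive_jacobian(g, p; h=1e-6)
    y = vec(g(p))
    J = zeros(length(y), length(p))
    for j in 1:length(p)
        e = zeros(length(p)); e[j] = h
        J[:, j] = (vec(g(p + e)) - vec(g(p - e))) / (2h)
    end
    return J
end

p = [2.0, 3.0]
J = naive_jacobian(g, p)
d = reshape(J, 2, 2, 2)  # output dims first, parameter dim last

# d[:, :, 1] is dg/dp1; analytically [2p1 p2; 0 1].
@assert isapprox(d[:, :, 1], [2p[1] p[2]; 0 1]; atol=1e-5)
# d[:, :, 2] is dg/dp2; analytically [0 p1; 1 1].
@assert isapprox(d[:, :, 2], [0.0 p[1]; 1.0 1.0]; atol=1e-5)
```

The same pattern applies to your 5x5x5 case: vec flattens the output in column-major order, and reshape with the output shape followed by the parameter count puts the derivative w.r.t. parameter j in d[:, :, :, j].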
