Support for sparsity #63
SparseDiffTools has an interface to use different ADs. It would be good to have a benchmark against it (or maybe replace the current implementation with DifferentiationInterface). Its interface:
VecJac(f, v; autodiff = autodiff_backend) * x
JacVec(f, v; autodiff = autodiff_backend) * x
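For concreteness, here is a minimal usage sketch of the operator interface quoted above, assuming ADTypes' AutoForwardDiff as the autodiff backend; the function, point, and direction are made up for illustration.

using SparseDiffTools, ADTypes

f(x) = [x[1]^2 + x[2], x[1] * x[2]]  # toy function, made up for illustration
u = [1.0, 2.0]                       # point at which the Jacobian is taken
v = [0.1, -0.3]                      # direction vector

# Lazy operator: multiplying by v computes J(u) * v via forward-mode AD,
# without ever materializing the Jacobian matrix.
Jop = JacVec(f, u; autodiff = ADTypes.AutoForwardDiff())
Jop * v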
It only makes sense to compare identical backends, so the first step would be to support the sparse AD backends defined in ADTypes.jl.
Or do these sparse backends make use of SparseDiffTools, @gdalle?
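As background, a sketch of what the sparse AD backends defined in ADTypes.jl looked like before v1.0: standalone types per backend (names assumed from the pre-1.0 API; they were later replaced by the AutoSparse wrapper shown further down).

using ADTypes

dense_backend  = ADTypes.AutoForwardDiff()          # dense forward mode
sparse_forward = ADTypes.AutoSparseForwardDiff()    # sparse forward mode
sparse_fd      = ADTypes.AutoSparseFiniteDiff()     # sparse finite differences

# "Comparing identical backends" then means benchmarking, e.g., AutoSparseForwardDiff
# in DifferentiationInterface against the same backend choice passed to SparseDiffTools.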
Also, I wouldn't bother too much about the operators in SparseDiffTools; I have been meaning to rewrite them for quite some time. If you want to benchmark, https://github.com/SciML/NonlinearSolve.jl/blob/master/src/internal/operators.jl is a much better operator-based implementation that will (hopefully) one day replace the SparseDiffTools ones.
Yeah @adrhill, the idea is that …
@prbzrg I plan to add support for sparsity (#7), but the preparation phase is a bit tricky, so I haven't gotten around to it yet. Basically we would need to provide a config object + a cache object + a coloring, if I understand correctly. I'll look into that.
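As a rough illustration of that preparation (config + cache + coloring), here is how the corresponding SparseDiffTools building blocks fit together; the in-place function and sizes are made up, and Symbolics.jacobian_sparsity is assumed for pattern detection.

using SparseDiffTools, SparseArrays, Symbolics

f!(y, x) = (y[1] = x[1]^2; y[2] = x[1] * x[2]; y[3] = x[3]; nothing)  # toy in-place function
x = rand(3)
y = zeros(3)

# Preparation: detect the sparsity pattern, color it, and build a reusable cache.
pattern = Symbolics.jacobian_sparsity(f!, y, x)   # sparse Bool pattern
J = Float64.(sparse(pattern))                     # storage for the Jacobian
colors = matrix_colors(J)                         # column coloring
cache = ForwardColorJacCache(f!, x; colorvec = colors, sparsity = pattern)

# Execution: every call reuses the coloring and cache prepared above.
forwarddiff_color_jacobian!(J, f!, x, cache)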
@avik-pal what's wrong with the SparseDiffTools operators?
FunctionOperator has been kind of broken in SciMLOperators (in terms of type stability and interface) for quite some time now. VecJac and JacVec construct that internally. In the long term, we want to fix FunctionOperator, but currently there is a void regarding who wants to pick that up. The NonlinearSolve JacobianOperator essentially implements a type-stable interface for what FunctionOperator should look like (but it misses some of the necessary functionality to make it general beyond Jacobians).
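To make concrete what these operators compute, independently of FunctionOperator or JacobianOperator: a Jacobian-vector product is just a directional derivative, so a matrix-free version can be sketched in a few lines (this is an illustration, not the SciMLOperators or NonlinearSolve implementation).

using ForwardDiff

# J(x) * v as the directional derivative d/dt f(x + t*v) at t = 0;
# no Jacobian matrix is ever formed.
jvp(f, x, v) = ForwardDiff.derivative(t -> f(x .+ t .* v), 0.0)

g(x) = [x[1]^2 + x[2], sin(x[2])]
x = [1.0, 2.0]
v = [0.5, -1.0]
jvp(g, x, v)  # equals ForwardDiff.jacobian(g, x) * v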
Re interface issues: see SciML/SciMLOperators.jl#223
Once #135 is merged, sparse Jacobians and Hessians will be supported, with the precomputation of the sparsity pattern happening in the preparation step. |
Update: the upcoming v1.0 of ADTypes.jl will allow us to decouple sparsity pattern detection and coloring from the underlying AD backend.
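Concretely, under the v1.0 design any dense backend can be wrapped with sparsity machinery chosen separately; the detector and coloring packages below are assumptions for illustration.

using ADTypes
using SparseConnectivityTracer: TracerSparsityDetector
using SparseMatrixColorings: GreedyColoringAlgorithm

# Sparsity detection and coloring are specified independently of the AD backend.
sparse_backend = ADTypes.AutoSparse(
    ADTypes.AutoForwardDiff();
    sparsity_detector = TracerSparsityDetector(),
    coloring_algorithm = GreedyColoringAlgorithm(),
)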
As of right now, the main branch supports row-wise and column-wise estimation of sparse Jacobians, and column-wise estimation of sparse Hessians, using what is basically a clean reimplementation of SparseDiffTools. Closing this issue; feel free to reopen more specific ones.
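For later readers, a usage sketch of the resulting workflow in DifferentiationInterface: preparation detects and colors the sparsity pattern once, and execution reuses it. The exact argument order has changed across versions, so treat this as indicative rather than definitive.

using DifferentiationInterface, ADTypes
using SparseConnectivityTracer: TracerSparsityDetector
using SparseMatrixColorings: GreedyColoringAlgorithm

h(x) = [x[1]^2 + x[2], x[2] * x[3], x[3]]  # toy function, made up for illustration
x = rand(3)

backend = AutoSparse(
    AutoForwardDiff();
    sparsity_detector = TracerSparsityDetector(),
    coloring_algorithm = GreedyColoringAlgorithm(),
)

prep = prepare_jacobian(h, backend, x)   # sparsity detection + coloring happen here
J = jacobian(h, prep, backend, x)        # returns a sparse Jacobian, reusing `prep`
# Sparse Hessians follow the same pattern with prepare_hessian / hessian.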