
Commit

Use SparseConnectivityTracer.jl
amontoison authored Jun 5, 2024
1 parent 15d897d commit b5b61f9
Showing 20 changed files with 94 additions and 448 deletions.
4 changes: 4 additions & 0 deletions Project.toml
@@ -3,16 +3,20 @@ uuid = "54578032-b7ea-4c30-94aa-7cbd1cce6c9a"
 version = "0.7.2"
 
 [deps]
+ADTypes = "47edcb42-4c32-4615-8424-f2b9edc5f35b"
 ColPack = "ffa27691-3a59-46ab-a8d4-551f45b8d401"
 ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
 LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
 NLPModels = "a4795742-8479-5a88-8948-cc11e1c8c1a6"
 Requires = "ae029012-a4dd-5104-9daa-d747884805df"
 ReverseDiff = "37e2e3b7-166d-5795-8a7a-e32c996b4267"
 SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"
+SparseConnectivityTracer = "9f842d2f-2579-4b1d-911e-f412cf18a3f5"
 
 [compat]
+ADTypes = "1.2.1"
 ColPack = "0.4"
+SparseConnectivityTracer = "0.5"
 ForwardDiff = "0.9.0, 0.10.0"
 NLPModels = "0.18, 0.19, 0.20, 0.21"
 Requires = "1"
2 changes: 0 additions & 2 deletions README.md
@@ -96,8 +96,6 @@ The following AD packages are supported:
 and as optional dependencies (you must load the package before):
 
 - `Enzyme.jl`;
-- `SparseDiffTools.jl`;
-- `Symbolics.jl`;
 - `Zygote.jl`.
 
 ## Bug reports and discussions
4 changes: 0 additions & 4 deletions docs/Project.toml
@@ -10,8 +10,6 @@ OptimizationProblems = "5049e819-d29b-5fba-b941-0eee7e64c1c6"
 Percival = "01435c0c-c90d-11e9-3788-63660f8fbccc"
 Plots = "91a5bcdd-55d7-5caf-9e0b-520d859cae80"
 SolverBenchmark = "581a75fa-a23a-52d0-a590-d6201de2218a"
-SymbolicUtils = "d1185830-fcd6-423d-90d6-eec64667417b"
-Symbolics = "0c5d862f-8b57-4792-8d23-62f2024744c7"
 Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"
 
 [compat]
@@ -24,6 +22,4 @@ OptimizationProblems = "0.7"
 Percival = "0.7"
 Plots = "1"
 SolverBenchmark = "0.5"
-SymbolicUtils = "=1.5.1"
-Symbolics = "5.3"
 Zygote = "0.6.62"
20 changes: 10 additions & 10 deletions docs/src/backend.md
@@ -7,14 +7,14 @@ The backend information is in a structure [`ADNLPModels.ADModelBackend`](@ref) i
 
 The functions used internally to define the NLPModel API and the possible backends are defined in the following table:
 
-| Functions | FowardDiff backends | ReverseDiff backends | Zygote backends | Enzyme backend | SparseDiffTools backend | Symbolics backend |
+| Functions | FowardDiff backends | ReverseDiff backends | Zygote backends | Enzyme backend | Sparse backend |
 | ----------- | ----------- | ----------- | ----------- | ----------- | ----------- | ----------- |
-| `gradient` and `gradient!` | `ForwardDiffADGradient`/`GenericForwardDiffADGradient` | `ReverseDiffADGradient`/`GenericReverseDiffADGradient` | `ZygoteADGradient` | `EnzymeADGradient` | -- | -- |
-| `jacobian` | `ForwardDiffADJacobian` | `ReverseDiffADJacobian` | `ZygoteADJacobian` | -- | `SDTSparseADJacobian` | `SparseADJacobian`/`SparseSymbolicsADJacobian` |
-| `hessian` | `ForwardDiffADHessian` | `ReverseDiffADHessian` | `ZygoteADHessian` | -- | -- | `SparseADHessian`/`SparseSymbolicsADHessian` |
-| `Jprod` | `ForwardDiffADJprod`/`GenericForwardDiffADJprod` | `ReverseDiffADJprod`/`GenericReverseDiffADJprod` | `ZygoteADJprod` | -- | `SDTForwardDiffADJprod` | -- |
-| `Jtprod` | `ForwardDiffADJtprod`/`GenericForwardDiffADJtprod` | `ReverseDiffADJtprod`/`GenericReverseDiffADJtprod` | `ZygoteADJtprod` | -- | -- | -- |
-| `Hvprod` | `ForwardDiffADHvprod`/`GenericForwardDiffADHvprod` | `ReverseDiffADHvprod`/`GenericReverseDiffADHvprod` | -- | -- | `SDTForwardDiffADHvprod` | -- |
+| `gradient` and `gradient!` | `ForwardDiffADGradient`/`GenericForwardDiffADGradient` | `ReverseDiffADGradient`/`GenericReverseDiffADGradient` | `ZygoteADGradient` | `EnzymeADGradient` | -- |
+| `jacobian` | `ForwardDiffADJacobian` | `ReverseDiffADJacobian` | `ZygoteADJacobian` | -- | `SparseADJacobian` |
+| `hessian` | `ForwardDiffADHessian` | `ReverseDiffADHessian` | `ZygoteADHessian` | -- | `SparseADHessian` |
+| `Jprod` | `ForwardDiffADJprod`/`GenericForwardDiffADJprod` | `ReverseDiffADJprod`/`GenericReverseDiffADJprod` | `ZygoteADJprod` | -- |
+| `Jtprod` | `ForwardDiffADJtprod`/`GenericForwardDiffADJtprod` | `ReverseDiffADJtprod`/`GenericReverseDiffADJtprod` | `ZygoteADJtprod` | -- |
+| `Hvprod` | `ForwardDiffADHvprod`/`GenericForwardDiffADHvprod` | `ReverseDiffADHvprod`/`GenericReverseDiffADHvprod` | -- |
 | `directional_second_derivative` | `ForwardDiffADGHjvprod` | -- | -- | -- | -- |
 
 The functions `hess_structure!`, `hess_coord!`, `jac_structure!` and `jac_coord!` defined in `ad.jl` are generic to all the backends for now.
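As an aside, the split this line of documentation describes — a `*_structure!` function returning fixed row/column indices once, and a `*_coord!` function refilling only the values at each iterate — is the classic coordinate (COO) sparse format. A minimal Python sketch of the idea (illustrative only: the function names and the example constraint below are invented, not the NLPModels API):

```python
# Sketch of the structure/coord split (hypothetical names, not NLPModels).
# The sparsity pattern of the Jacobian is fixed, so the indices are computed
# once at model creation; only the values are recomputed at each new point x.

def jac_structure():
    """Row/column indices of the nonzeros (1-based, NLPModels-style)."""
    # Example constraint c(x) = [x1 * x2, x2 + x3]:
    # row 1 depends on x1 and x2; row 2 depends on x2 and x3.
    rows = [1, 1, 2, 2]
    cols = [1, 2, 2, 3]
    return rows, cols

def jac_coord(x):
    """Nonzero values of the Jacobian at x, in the same order as the structure."""
    x1, x2, x3 = x
    # d(x1*x2)/dx1, d(x1*x2)/dx2, d(x2+x3)/dx2, d(x2+x3)/dx3
    return [x2, x1, 1.0, 1.0]

rows, cols = jac_structure()       # computed once
vals = jac_coord([2.0, 3.0, 5.0])  # refilled at every iterate
print(list(zip(rows, cols, vals)))  # [(1, 1, 3.0), (1, 2, 2.0), (2, 2, 1.0), (2, 3, 1.0)]
```

Keeping the structure and value passes separate is what lets a solver reuse one symbolic analysis across many Jacobian evaluations.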
@@ -49,7 +49,7 @@ Thanks to the backends inside `ADNLPModels.jl`, it is easy to change the backend
 
 ```@example adnlp
 nlp = ADNLPModel(f, x0, gradient_backend = ADNLPModels.ReverseDiffADGradient)
 grad(nlp, nlp.meta.x0) # returns the gradient at x0 using `ReverseDiff`
 ```

It is also possible to try some new implementation for each function. First, we define a new `ADBackend` structure.
Expand Down Expand Up @@ -81,7 +81,7 @@ Finally, we use the homemade backend to compute the gradient.

```@example adnlp
nlp = ADNLPModel(sum, ones(3), gradient_backend = NewADGradient)
grad(nlp, nlp.meta.x0) # returns the gradient at x0 using `NewADGradient`
grad(nlp, nlp.meta.x0) # returns the gradient at x0 using `NewADGradient`
```
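The `NewADGradient` example above relies on Julia multiple dispatch: the model stores a backend object and `grad` dispatches on its type. The same pluggable-backend pattern can be sketched in Python (all names here are invented for illustration; this is not the ADNLPModels API, and the finite-difference "backend" is a stand-in for real AD):

```python
# Sketch of the pluggable gradient-backend idea (hypothetical names).
from dataclasses import dataclass

def finite_diff_gradient(f, x, h=1e-6):
    """Forward finite differences: a cheap stand-in for an AD backend."""
    fx = f(x)
    g = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h
        g.append((f(xp) - fx) / h)
    return g

@dataclass
class Model:
    f: callable
    x0: list
    gradient_backend: callable = finite_diff_gradient  # swappable, as via kwargs

def grad(model, x):
    # Dispatch to whichever backend the model was built with.
    return model.gradient_backend(model.f, x)

nlp = Model(f=sum, x0=[1.0, 1.0, 1.0])
print(grad(nlp, nlp.x0))  # ≈ [1.0, 1.0, 1.0]

# Swap in a different backend, as ADNLPModel does via `gradient_backend = ...`:
nlp2 = Model(f=sum, x0=[1.0, 1.0, 1.0],
             gradient_backend=lambda f, x: [1.0] * len(x))  # exact gradient of sum
print(grad(nlp2, nlp2.x0))  # [1.0, 1.0, 1.0]
```

The design point is the same in both languages: the solver-facing call (`grad`) never changes; only the backend object stored in the model does.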

### Change backend
@@ -104,7 +104,7 @@ set_adbackend!(nlp, adback)
 get_adbackend(nlp)
 ```
 
-The alternative is to use ``set_adbackend!` and pass the new backends via `kwargs`. In the second approach, it is possible to pass either the type of the desired backend or an instance as shown below.
+The alternative is to use `set_adbackend!` and pass the new backends via `kwargs`. In the second approach, it is possible to pass either the type of the desired backend or an instance as shown below.
 
 ```@example adnlp2
 set_adbackend!(
2 changes: 1 addition & 1 deletion docs/src/performance.md
@@ -84,7 +84,7 @@ v = ones(2)
 It is tempting to define the most generic and efficient `ADNLPModel` from the start.
 
 ```@example ex2
-using ADNLPModels, NLPModels, Symbolics
+using ADNLPModels, NLPModels
 f(x) = (x[1] - x[2])^2
 x0 = ones(2)
 lcon = ucon = ones(1)
26 changes: 5 additions & 21 deletions docs/src/predefined.md
@@ -49,30 +49,12 @@ get_adbackend(nlp)
 
 ## Hessian and Jacobian computations
 
-It is to be noted that by default the Jacobian and Hessian matrices are dense.
+It is to be noted that by default the Jacobian and Hessian matrices are sparse.
 
 ```@example ex1
 (get_nnzj(nlp), get_nnzh(nlp)) # number of nonzeros elements in the Jacobian and Hessian
 ```
 
-To enable sparse computations of these entries, one needs to first load the package [`Symbolics.jl`](https://github.com/JuliaSymbolics/Symbolics.jl)
-
-```@example ex1
-using Symbolics
-```
-
-and now
-
-```@example ex1
-ADNLPModels.predefined_backend[:optimized][:jacobian_backend]
-```
-
-```@example ex1
-ADNLPModels.predefined_backend[:optimized][:hessian_backend]
-```
-
-Choosing another optimization problem with the optimized backend will compute sparse Jacobian and Hessian matrices.
-
 ```@example ex1
 f(x) = (x[1] - 1)^2
 T = Float64
@@ -92,4 +74,6 @@ x = rand(T, 2)
 jac(nlp, x)
 ```
 
-The package [`Symbolics.jl`](https://github.com/JuliaSymbolics/Symbolics.jl) is used to compute the sparsity pattern of the sparse matrix. The evaluation of the number of directional derivatives needed to evaluate the matrix is done by [`ColPack.jl`](https://github.com/michel2323/ColPack.jl).
+The package [`SparseConnectivityTracer.jl`](https://github.com/adrhill/SparseConnectivityTracer.jl) is used to compute the sparsity pattern of Jacobians and Hessians.
+The evaluation of the number of directional derivatives and the seeds needed to evaluate the compressed Jacobians and Hessians is done by [`ColPack.jl`](https://github.com/exanauts/ColPack.jl).
+We acknowledge Guillaume Dalle (@gdalle), Adrian Hill (@adrhill), and Michel Schanen (@michel2323) for the development of these packages.
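The paragraph above describes a two-step pipeline: first detect the sparsity pattern, then color structurally orthogonal columns so that one directional derivative per color recovers the whole compressed Jacobian. Both steps can be sketched in a few lines of Python; this is an illustration of the idea only, not the actual algorithms or APIs of SparseConnectivityTracer.jl or ColPack.jl:

```python
# Step 1 (in the spirit of SparseConnectivityTracer.jl): evaluate the
# constraint on "tracers" that carry index sets instead of numbers, so the
# output reveals which inputs each row depends on.
class Tracer:
    def __init__(self, deps):
        self.deps = frozenset(deps)

    @staticmethod
    def _lift(other):
        return other.deps if isinstance(other, Tracer) else frozenset()

    def __add__(self, other):
        return Tracer(self.deps | Tracer._lift(other))
    __radd__ = __add__

    def __sub__(self, other):
        return Tracer(self.deps | Tracer._lift(other))
    __rsub__ = __sub__

    def __mul__(self, other):
        return Tracer(self.deps | Tracer._lift(other))
    __rmul__ = __mul__

    def __pow__(self, p):
        return Tracer(self.deps)

def jacobian_pattern(c, n):
    out = c([Tracer({i}) for i in range(n)])
    return [sorted(t.deps) for t in out]

# Step 2 (in the spirit of ColPack.jl): greedy distance-1 column coloring.
# Columns that share a nonzero row must receive different colors; columns
# with the same color can be probed with a single directional derivative.
def greedy_column_coloring(pattern, n):
    color = [None] * n
    for j in range(n):
        forbidden = {color[k] for row in pattern if j in row
                     for k in row if color[k] is not None}
        c = 0
        while c in forbidden:
            c += 1
        color[j] = c
    return color

c = lambda x: [x[0] * x[1], (x[1] - x[2]) ** 2, x[3] + 1]
pat = jacobian_pattern(c, 4)
print(pat)                  # [[0, 1], [1, 2], [3]]
colors = greedy_column_coloring(pat, 4)
print(max(colors) + 1)      # 2 directional derivatives instead of 4 dense columns
```

Here four dense columns compress to two colors; on large, genuinely sparse problems this compression is what makes `SparseADJacobian` and `SparseADHessian` affordable defaults.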
12 changes: 6 additions & 6 deletions docs/src/reference.md
@@ -1,17 +1,17 @@
# Reference

## Contents

```@contents
Pages = ["reference.md"]
```

## Index

```@index
Pages = ["reference.md"]
```

```@autodocs
Modules = [ADNLPModels]
```
47 changes: 5 additions & 42 deletions src/ADNLPModels.jl
@@ -2,8 +2,11 @@ module ADNLPModels
 
 # stdlib
 using LinearAlgebra, SparseArrays
 
 # external
-using ColPack, ForwardDiff, ReverseDiff
+using ADTypes: ADTypes, AbstractSparsityDetector
+using SparseConnectivityTracer, ColPack, ForwardDiff, ReverseDiff
 
 # JSO
 using NLPModels
 using Requires
@@ -16,39 +19,13 @@ const ADModel{T, S} = Union{AbstractADNLPModel{T, S}, AbstractADNLSModel{T, S}}
 include("ad.jl")
 include("ad_api.jl")
 
-"""
-    compute_jacobian_sparsity(c!, cx, x0)
-
-Return a sparse matrix.
-"""
-function compute_jacobian_sparsity(args...)
-  throw(
-    ArgumentError(
-      "Please load Symbolics.jl to enable sparse Jacobian or implement `compute_jacobian_sparsity`.",
-    ),
-  )
-end
-
-"""
-    compute_hessian_sparsity(f, nvar, c!, ncon)
-
-Return a sparse matrix.
-"""
-function compute_hessian_sparsity(args...)
-  throw(
-    ArgumentError(
-      "Please load Symbolics.jl to enable sparse Hessian or implement `compute_hessian_sparsity`.",
-    ),
-  )
-end
-
+include("sparsity_pattern.jl")
 include("sparse_jacobian.jl")
 include("sparse_hessian.jl")
 
 include("forward.jl")
 include("reverse.jl")
 include("enzyme.jl")
-include("sparse_diff_tools.jl")
 include("zygote.jl")
 include("predefined_backend.jl")
 include("nlp.jl")
@@ -181,20 +158,6 @@ function ADNLSModel!(model::AbstractNLSModel; kwargs...)
   end
 end
 
-@init begin
-  @require Symbolics = "0c5d862f-8b57-4792-8d23-62f2024744c7" begin
-    include("sparse_sym.jl")
-
-    predefined_backend[:default][:jacobian_backend] = SparseADJacobian
-    predefined_backend[:default][:jacobian_residual_backend] = SparseADJacobian
-    predefined_backend[:optimized][:jacobian_backend] = SparseADJacobian
-    predefined_backend[:optimized][:jacobian_residual_backend] = SparseADJacobian
-
-    predefined_backend[:default][:hessian_backend] = SparseADHessian
-    predefined_backend[:optimized][:hessian_backend] = SparseReverseADHessian
-  end
-end
-
 export get_adbackend, set_adbackend!
 
 """
12 changes: 6 additions & 6 deletions src/predefined_backend.jl
@@ -3,13 +3,13 @@ default_backend = Dict(
   :hprod_backend => ForwardDiffADHvprod,
   :jprod_backend => ForwardDiffADJprod,
   :jtprod_backend => ForwardDiffADJtprod,
-  :jacobian_backend => ForwardDiffADJacobian,
-  :hessian_backend => ForwardDiffADHessian,
+  :jacobian_backend => SparseADJacobian, # ForwardDiffADJacobian
+  :hessian_backend => SparseADHessian, # ForwardDiffADHessian
   :ghjvprod_backend => ForwardDiffADGHjvprod,
   :hprod_residual_backend => ForwardDiffADHvprod,
   :jprod_residual_backend => ForwardDiffADJprod,
   :jtprod_residual_backend => ForwardDiffADJtprod,
-  :jacobian_residual_backend => ForwardDiffADJacobian,
+  :jacobian_residual_backend => SparseADJacobian, # ForwardDiffADJacobian,
   :hessian_residual_backend => ForwardDiffADHessian,
 )

@@ -18,13 +18,13 @@ optimized = Dict(
   :hprod_backend => ReverseDiffADHvprod,
   :jprod_backend => ForwardDiffADJprod,
   :jtprod_backend => ReverseDiffADJtprod,
-  :jacobian_backend => ForwardDiffADJacobian,
-  :hessian_backend => ForwardDiffADHessian,
+  :jacobian_backend => SparseADJacobian, # ForwardDiffADJacobian
+  :hessian_backend => SparseReverseADHessian, # ForwardDiffADHessian,
   :ghjvprod_backend => ForwardDiffADGHjvprod,
   :hprod_residual_backend => ReverseDiffADHvprod,
   :jprod_residual_backend => ForwardDiffADJprod,
   :jtprod_residual_backend => ReverseDiffADJtprod,
-  :jacobian_residual_backend => ForwardDiffADJacobian,
+  :jacobian_residual_backend => SparseADJacobian, # ForwardDiffADJacobian
   :hessian_residual_backend => ForwardDiffADHessian,
 )

