diff --git a/previews/PR302/.documenter-siteinfo.json b/previews/PR302/.documenter-siteinfo.json
index fa1ed821..8fb146b4 100644
--- a/previews/PR302/.documenter-siteinfo.json
+++ b/previews/PR302/.documenter-siteinfo.json
@@ -1 +1 @@
-{"documenter":{"julia_version":"1.10.5","generation_timestamp":"2024-09-26T00:29:38","documenter_version":"1.7.0"}}
\ No newline at end of file
+{"documenter":{"julia_version":"1.10.5","generation_timestamp":"2024-09-26T00:29:47","documenter_version":"1.7.0"}}
\ No newline at end of file
diff --git a/previews/PR302/backend/index.html b/previews/PR302/backend/index.html
index 6331f2e9..35b6b3cc 100644
--- a/previews/PR302/backend/index.html
+++ b/previews/PR302/backend/index.html
@@ -57,9 +57,9 @@
return g
end
Finally, we use the homemade backend to compute the gradient.
nlp = ADNLPModel(sum, ones(3), gradient_backend = NewADGradient)
grad(nlp, nlp.meta.x0) # returns the gradient at x0 using `NewADGradient`
3-element Vector{Float64}:
- 0.437855944376617
- 0.9275335801536463
- 0.3879876758647781
+ 0.25134352937048154
+ 0.014934949634762829
+ 0.5167014826880357
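Note that the values above are not the true gradient of `sum` (which is a vector of ones); they reflect whatever the demonstration backend `NewADGradient` computes. As a rough plain-Julia sketch of the kind of computation a homemade gradient backend can wrap (a hypothetical stand-alone helper, not part of the ADNLPModels API), a forward finite-difference approximation looks like:

```julia
# Hypothetical helper, independent of ADNLPModels: approximate the gradient
# of f at x with forward finite differences of step h.
function fd_gradient(f, x; h = 1e-8)
    g = similar(x, Float64)
    fx = f(x)
    for i in eachindex(x)
        xh = copy(x)
        xh[i] += h          # perturb one coordinate at a time
        g[i] = (f(xh) - fx) / h
    end
    return g
end

fd_gradient(sum, ones(3))  # ≈ [1.0, 1.0, 1.0], since ∇(sum)(x) = ones
```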
Once an instance of an ADNLPModel has been created, it is possible to change the backends without re-instantiating the model.
using ADNLPModels, NLPModels
f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
x0 = 3 * ones(2)
nlp = ADNLPModel(f, x0)
@@ -128,10 +128,10 @@
jhess: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0 jhprod: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
Then, the gradient will return a vector of Float64.
x64 = rand(2)
grad(nlp, x64)
2-element Vector{Float64}:
- -22.997214205244017
- 195.12883522056904
+ -47.401657140104184
+ 41.130394589412724
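Since the gradient of this objective is available in closed form, the backend's output can be sanity-checked by hand. A minimal plain-Julia sketch (independent of ADNLPModels):

```julia
# f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2 has the closed-form gradient
# ∇f(x) = [-400 x₁ (x₂ - x₁²) + 2 (x₁ - 1), 200 (x₂ - x₁²)]
f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
∇f(x) = [-400 * x[1] * (x[2] - x[1]^2) + 2 * (x[1] - 1),
          200 * (x[2] - x[1]^2)]

# At a point where x[2] == x[1]^2, the quadratic penalty term vanishes:
∇f([0.5, 0.25])  # returns [-1.0, 0.0]
```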
It is now possible to move to a different type, for instance Float32, while keeping the instance nlp.
x0_32 = ones(Float32, 2)
set_adbackend!(nlp, gradient_backend = ADNLPModels.ForwardDiffADGradient, x0 = x0_32)
x32 = rand(Float32, 2)
grad(nlp, x32)
2-element Vector{Float64}:
- -118.4972152709961
- 147.8224334716797
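Whether a backend preserves the input precision depends on how it was configured. As a plain-Julia illustration with the closed-form gradient of the same objective (independent of the backend machinery, so only a sketch of the precision behavior), Float32 inputs yield a Float32 result:

```julia
# Closed-form gradient of the objective above; integer literals like 400
# do not promote the element type of the result.
∇f(x) = [-400 * x[1] * (x[2] - x[1]^2) + 2 * (x[1] - 1),
          200 * (x[2] - x[1]^2)]

g32 = ∇f(rand(Float32, 2))
eltype(g32)  # Float32
```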
This document was generated with Documenter.jl version 1.7.0 on Thursday 26 September 2024. Using Julia version 1.10.5.