diff --git a/previews/PR314/.documenter-siteinfo.json b/previews/PR314/.documenter-siteinfo.json
index 1bcf43b8..332c5853 100644
--- a/previews/PR314/.documenter-siteinfo.json
+++ b/previews/PR314/.documenter-siteinfo.json
@@ -1 +1 @@
-{"documenter":{"julia_version":"1.10.7","generation_timestamp":"2024-12-20T16:32:04","documenter_version":"1.8.0"}}
\ No newline at end of file
+{"documenter":{"julia_version":"1.11.2","generation_timestamp":"2024-12-20T16:33:59","documenter_version":"1.8.0"}}
\ No newline at end of file
diff --git a/previews/PR314/backend/index.html b/previews/PR314/backend/index.html
index 779867e2..a9d09352 100644
--- a/previews/PR314/backend/index.html
+++ b/previews/PR314/backend/index.html
@@ -57,9 +57,9 @@
return g
end
Finally, we use the homemade backend to compute the gradient.
nlp = ADNLPModel(sum, ones(3), gradient_backend = NewADGradient)
grad(nlp, nlp.meta.x0) # returns the gradient at x0 using `NewADGradient`
3-element Vector{Float64}:
- 0.6071726394174236
- 0.14717406903623043
- 0.8576568096739905
Once an instance of an ADNLPModel has been created, it is possible to change the backends without re-instantiating the model.
using ADNLPModels, NLPModels
+ 0.030188194174360583
+ 0.42187939956508747
+ 0.930689998188221
Once an instance of an ADNLPModel has been created, it is possible to change the backends without re-instantiating the model.
using ADNLPModels, NLPModels
f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
x0 = 3 * ones(2)
nlp = ADNLPModel(f, x0)
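For comparison with the AD backends, the gradient of this Rosenbrock-type objective can also be written out by hand in plain Julia. This is a minimal, package-free sketch; the helper name `rosenbrock_grad` is ours and is not part of ADNLPModels.

```julia
# Objective from the example above:
#   f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
# Analytic gradient, derived by hand (helper name is ours):
#   ∂f/∂x₁ = -400 x₁ (x₂ - x₁²) + 2 (x₁ - 1)
#   ∂f/∂x₂ =  200 (x₂ - x₁²)
function rosenbrock_grad(x::AbstractVector)
    t = x[2] - x[1]^2
    return [-400 * x[1] * t + 2 * (x[1] - 1), 200 * t]
end

rosenbrock_grad(3 * ones(2))  # at x0 = [3.0, 3.0] this is [7204.0, -1200.0]
```

Evaluating it at the same points as `grad(nlp, x)` gives a quick sanity check on any backend's output.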
@@ -128,10 +128,10 @@
jhess: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
jhprod: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
Then, the gradient will return a vector of Float64.
x64 = rand(2)
grad(nlp, x64)
2-element Vector{Float64}:
- -7.704737284893528
- 66.80970051291837
It is now possible to move to a different type, for instance Float32, while keeping the instance nlp.
x0_32 = ones(Float32, 2)
+ -11.752369025605399
+ 11.622097426192173
It is now possible to move to a different type, for instance Float32, while keeping the instance nlp.
x0_32 = ones(Float32, 2)
set_adbackend!(nlp, gradient_backend = ADNLPModels.ForwardDiffADGradient, x0 = x0_32)
x32 = rand(Float32, 2)
grad(nlp, x32)
2-element Vector{Float64}:
- 175.5325927734375
- -115.24372100830078
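Note the element type printed above: the doc build reports a Vector{Float64} even for the Float32 input. In plain Julia, a gradient written generically, without hard-coded Float64 literals, preserves the input's element type. A package-free sketch (the helper name `rosen_grad` is ours, not an ADNLPModels function):

```julia
# Generic analytic gradient of f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2.
# Integer literals promote to the input's element type, so no Float64 creeps in.
rosen_grad(x) = [-400 * x[1] * (x[2] - x[1]^2) + 2 * (x[1] - 1),
                 200 * (x[2] - x[1]^2)]

g32 = rosen_grad(ones(Float32, 2))
eltype(g32)  # Float32
```

This is one way to check whether a promotion to Float64 comes from the model's backend or from the function being differentiated.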
This document was generated with Documenter.jl version 1.8.0 on Friday 20 December 2024. Using Julia version 1.10.7.