diff --git a/previews/PR314/.documenter-siteinfo.json b/previews/PR314/.documenter-siteinfo.json
index ed582b31..817a7168 100644
--- a/previews/PR314/.documenter-siteinfo.json
+++ b/previews/PR314/.documenter-siteinfo.json
@@ -1 +1 @@
-{"documenter":{"julia_version":"1.10.7","generation_timestamp":"2024-12-21T20:07:42","documenter_version":"1.8.0"}}
\ No newline at end of file
+{"documenter":{"julia_version":"1.11.2","generation_timestamp":"2024-12-21T20:09:39","documenter_version":"1.8.0"}}
\ No newline at end of file
diff --git a/previews/PR314/backend/index.html b/previews/PR314/backend/index.html
index bfabceed..64b0e6a9 100644
--- a/previews/PR314/backend/index.html
+++ b/previews/PR314/backend/index.html
@@ -57,9 +57,9 @@
return g
end
Finally, we use the homemade backend to compute the gradient.
nlp = ADNLPModel(sum, ones(3), gradient_backend = NewADGradient)
grad(nlp, nlp.meta.x0) # returns the gradient at x0 using `NewADGradient`
3-element Vector{Float64}:
- 0.28670930527875793
- 0.07017176511439704
- 0.5556853370806676
+ 0.14741584220291293
+ 0.5016581544176959
+ 0.9463651629874223
Once an instance of an ADNLPModel has been created, it is possible to change the backends without re-instantiating the model.
using ADNLPModels, NLPModels
f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
x0 = 3 * ones(2)
nlp = ADNLPModel(f, x0)
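As a plain-Julia sanity check (independent of ADNLPModels), the gradient of this objective can be derived by hand; `grad_f` below is an illustrative helper, not part of the package:

```julia
# Hand-derived gradient of f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2,
# useful for cross-checking AD output (plain Julia, no packages needed).
f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
grad_f(x) = [
    -400 * x[1] * (x[2] - x[1]^2) + 2 * (x[1] - 1),  # ∂f/∂x₁
    200 * (x[2] - x[1]^2),                           # ∂f/∂x₂
]
grad_f(3 * ones(2))  # [7204.0, -1200.0] at x0 = 3 * ones(2)
```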
@@ -128,10 +128,10 @@
jhess: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
jhprod: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
Then, the gradient will return a vector of Float64.
x64 = rand(2)
grad(nlp, x64)
2-element Vector{Float64}:
- 104.42640703309738
- -73.86987557112867
+ 58.99263057208681
+ -40.916174541610964
It is now possible to move to a different type, for instance Float32, while keeping the instance nlp.
x0_32 = ones(Float32, 2)
set_adbackend!(nlp, gradient_backend = ADNLPModels.ForwardDiffADGradient, x0 = x0_32)
x32 = rand(Float32, 2)
grad(nlp, x32)
2-element Vector{Float64}:
- -67.99763488769531
- 50.18924331665039
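The precision-switching idea above can be illustrated in plain Julia without ADNLPModels: a gradient routine written generically over the element type keeps Float32 inputs in Float32 throughout. `fd_grad` is a hypothetical finite-difference helper, not part of the package:

```julia
# Generic central-difference gradient: the step size and all arithmetic
# follow the element type T of the input, so Float32 stays Float32.
f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
function fd_grad(f, x::AbstractVector{T}) where {T<:AbstractFloat}
    h = sqrt(eps(T))            # step sized to the working precision
    g = similar(x)
    for i in eachindex(x)
        e = zeros(T, length(x))
        e[i] = one(T)
        g[i] = (f(x + h * e) - f(x - h * e)) / (2h)  # central difference
    end
    return g
end
eltype(fd_grad(f, ones(Float32, 2)))  # Float32
```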
This document was generated with Documenter.jl version 1.8.0 on Saturday 21 December 2024. Using Julia version 1.10.7.