Add NLPModels testing #83

Merged · 78 commits · May 28, 2024
Commits
77d09c2
Add conj, real, transpose, adjoint on numbers
gdalle May 19, 2024
9cc2866
first order not second
gdalle May 19, 2024
1877394
remove imag
gdalle May 19, 2024
adb3c7d
Merge remote-tracking branch 'origin/main' into gd/conj
gdalle May 21, 2024
4adf1f0
Add dot test
gdalle May 21, 2024
a0225d0
Local test for `dot`
gdalle May 21, 2024
4f0b377
Use ==
gdalle May 21, 2024
9d71d86
First attempt at extension, problems with eval and precompilation
gdalle May 21, 2024
313a899
Merge remote-tracking branch 'origin/main' into gd/adnlp
gdalle May 21, 2024
3b897bf
Add SpecialFunctions to deps
gdalle May 21, 2024
4483ebb
Compat
gdalle May 21, 2024
2b7f3fd
Fix extension
gdalle May 22, 2024
15f9448
Merge remote-tracking branch 'origin/main' into gd/adnlp
gdalle May 22, 2024
b2dc872
Add tests
gdalle May 22, 2024
65c65c3
Fix tests
gdalle May 22, 2024
d523e07
Refix tests
gdalle May 22, 2024
a792463
Add NLPModels testing
gdalle May 22, 2024
b8f1576
Merge remote-tracking branch 'origin/main' into gd/adnlp
gdalle May 22, 2024
0a088f6
Merge remote-tracking branch 'origin/gd/adnlp' into gd/adnlp_tests
gdalle May 22, 2024
0ee996a
Fix
gdalle May 22, 2024
3c4262c
Merge branch 'gd/adnlp' into gd/adnlp_tests
gdalle May 22, 2024
65ba47d
Merge remote-tracking branch 'origin/main' into gd/adnlp_tests
gdalle May 22, 2024
6042367
Hessians in NLPModels contain only lower triangle
gdalle May 22, 2024
65758e5
Fix constraint handling
gdalle May 22, 2024
b4cfefd
Global first, local then
gdalle May 22, 2024
93cd1cb
Skip tests with errors
gdalle May 22, 2024
9624b2c
Debugging
gdalle May 22, 2024
f2175a1
Mark disagreements as broken
gdalle May 22, 2024
c208d0a
Verbose
gdalle May 22, 2024
6eb4a5a
Merge branch 'main' into gd/adnlp_tests
gdalle May 22, 2024
4208ea5
Non verbose
gdalle May 22, 2024
9e76cf0
Support `ifelse`
adrhill May 22, 2024
5c94d73
Add tests
adrhill May 22, 2024
f7a0504
Fix Hessian test
adrhill May 22, 2024
fff9f1d
Fix Hessian test v2
adrhill May 22, 2024
478c586
Limit Dual to Real primals
gdalle May 23, 2024
2ca865f
Remove more Complex-ity (pun intended)
gdalle May 23, 2024
07531d7
Remove im
gdalle May 23, 2024
c36f620
Support f(x) = 0
gdalle May 23, 2024
a88883f
Fix
gdalle May 23, 2024
34551a8
Merge remote-tracking branch 'origin/main' into gd/adnlp_tests
gdalle May 23, 2024
d404753
Narrow down discrepancy tests
gdalle May 23, 2024
27653ba
Replace :Number with :Real
gdalle May 23, 2024
46c9879
Fix
gdalle May 23, 2024
d861b69
Merge branch 'gd/dual_real' into gd/adnlp_tests
gdalle May 23, 2024
89873a4
Verbosity
gdalle May 23, 2024
67c7263
Merge branch 'gd/trivial_functions_hessian' into gd/adnlp_tests
gdalle May 23, 2024
05558cb
Pkg dep
gdalle May 23, 2024
4be6662
Time
gdalle May 23, 2024
0a76ba8
Overload comparisons
adrhill May 23, 2024
fed38f9
Test on ADNLPProblems' AMPGO07 problem
adrhill May 23, 2024
53d8df7
Update `MissingPrimalError` tests
adrhill May 23, 2024
e36f434
Remove duplicate method
adrhill May 23, 2024
a864d35
Merge branch 'main' into ah/ifelse
adrhill May 23, 2024
fff25b8
Fix type ambig.
adrhill May 23, 2024
ae653ae
Fix typo from merge
adrhill May 23, 2024
03bc455
Fix Hessian overload on comparisons
adrhill May 23, 2024
ef610ee
Split overload functions into AbstractTracers/Dual
adrhill May 23, 2024
fe8e849
Subtype `Dual <: Real`, not `AbstractTracer`
adrhill May 23, 2024
6a2a591
Simplify conversions
adrhill May 23, 2024
4c9413f
Update `ifelse` overload
adrhill May 23, 2024
1effc46
Update type restriction on `Dual`
adrhill May 23, 2024
adf9375
Add tests for errors on ternary expressions
adrhill May 23, 2024
f5c6352
Merge remote-tracking branch 'origin/ah/ifelse' into gd/adnlp_tests
gdalle May 24, 2024
b7e4ea0
Split tests
gdalle May 24, 2024
f72364d
Remove test deps
gdalle May 24, 2024
54e9494
Only global
gdalle May 24, 2024
f504b34
No local tracing
gdalle May 24, 2024
2ffa6f6
No NLPModels testing on 1.6
gdalle May 24, 2024
470e4a4
Merge branch 'main' into gd/adnlp_tests
gdalle May 25, 2024
f23d546
Fix diff
gdalle May 25, 2024
7d6a87c
No exclude on 1.6
gdalle May 25, 2024
d6bde37
Fix duplicate include
gdalle May 25, 2024
5006e07
No skip
gdalle May 25, 2024
217a598
Final tests
gdalle May 25, 2024
83a7aad
Summary at the end
gdalle May 25, 2024
745119c
Fix
gdalle May 25, 2024
277b528
Exclude 1.6 NLPModels
gdalle May 25, 2024
20 changes: 12 additions & 8 deletions .github/workflows/CI.yml
@@ -13,8 +13,8 @@ concurrency:
   cancel-in-progress: ${{ startsWith(github.ref, 'refs/pull/') }}
 jobs:
   test:
-    name: Julia ${{ matrix.version }} - ${{ matrix.os }} - ${{ matrix.arch }} - ${{ github.event_name }}
-    runs-on: ${{ matrix.os }}
+    name: Julia ${{ matrix.version }} (${{ matrix.group }})
+    runs-on: ubuntu-latest
     timeout-minutes: 60
     permissions: # needed to allow julia-actions/cache to proactively delete old caches that it has created
       actions: write
@@ -25,19 +25,23 @@ jobs:
       version:
         - '1.6'
         - '1'
-      os:
-        - ubuntu-latest
-      arch:
-        - x64
+      group:
+        - Core
+        - NLPModels
+      exclude:
+        - version: '1.6'
+          group: NLPModels
     steps:
       - uses: actions/checkout@v4
       - uses: julia-actions/setup-julia@v2
         with:
           version: ${{ matrix.version }}
-          arch: ${{ matrix.arch }}
+          arch: x64
       - uses: julia-actions/cache@v2
       - uses: julia-actions/julia-buildpkg@v1
       - uses: julia-actions/julia-runtest@v1
+        env:
+          JULIA_SCT_TEST_GROUP: ${{ matrix.group }}
       - uses: julia-actions/julia-processcoverage@v1
       - uses: codecov/codecov-action@v4
         with:
@@ -55,7 +59,7 @@ jobs:
       - uses: actions/checkout@v4
       - uses: julia-actions/setup-julia@v2
         with:
-          version: '1.10'
+          version: '1'
       - uses: julia-actions/cache@v2
       - name: Configure doc environment
         shell: julia --project=docs --color=yes {0}
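Judging from the workflow change above, the new test group is selected through the `JULIA_SCT_TEST_GROUP` environment variable (assumption: the package's test runner reads this variable, since the workflow only sets it). A local run of the NLPModels group could presumably be reproduced like this:

```shell
# Hypothetical local reproduction of the CI matrix entry: select the
# NLPModels test group via the environment variable that CI sets.
export JULIA_SCT_TEST_GROUP=NLPModels
echo "Selected test group: $JULIA_SCT_TEST_GROUP"
# Then run the package tests (requires a local Julia install):
#   julia --project -e 'using Pkg; Pkg.test()'
```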
1 change: 1 addition & 0 deletions test/Project.toml
@@ -9,6 +9,7 @@ JET = "c3a54625-cd67-489e-a8e7-0a5a0ff4e31b"
 JuliaFormatter = "98e50ef6-434e-11e9-1051-2b60c6c9e899"
 LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
 NNlib = "872c559c-99b0-510c-b3b7-b6c96a88d5cd"
+Pkg = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"
 Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
 ReferenceTests = "324d217c-45ce-50fc-942e-d289b448e8cf"
 SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"
Expand Down
207 changes: 207 additions & 0 deletions test/nlpmodels.jl
@@ -0,0 +1,207 @@
using ADTypes: ADTypes
using LinearAlgebra
using SparseArrays
using SparseConnectivityTracer
import SparseConnectivityTracer as SCT
using Test

using Pkg
Pkg.add([
    "ADNLPModels", "ForwardDiff", "OptimizationProblems", "NLPModels", "NLPModelsJuMP"
])

using ADNLPModels
using ForwardDiff: ForwardDiff
using NLPModels
using NLPModelsJuMP
using OptimizationProblems

## ForwardDiff reference

function directional_derivative(f, x::AbstractVector, d::AbstractVector)
    return ForwardDiff.derivative(t -> f(x + t * d), zero(eltype(x)))
end

function second_directional_derivative(
    f, x::AbstractVector, din::AbstractVector, dout::AbstractVector
)
    f_din(x) = directional_derivative(f, x, din)
    return directional_derivative(f_din, x, dout)
end

function jac_coeff(f, x::AbstractVector, i::Integer, j::Integer)
    d = zero(x)
    d[j] = 1
    return directional_derivative(f, x, d)[i]
end

function hess_coeff(f, x::AbstractVector, i::Integer, j::Integer)
    din = zero(x)
    din[i] = 1
    dout = zero(x)
    dout[j] = 1
    return second_directional_derivative(f, x, din, dout)
end

## Function wrappers

function mycons(nlp, x)
    c = similar(x, nlp.meta.ncon)
    cons!(nlp, x, c)
    return c
end

function mylag(nlp, x)
    o = obj(nlp, x)
    c = mycons(nlp, x)
    λ = randn(length(c))
    return o + dot(λ, c)
end

## Jacobian sparsity

function jac_coeff(name::Symbol, i::Integer, j::Integer)
    nlp = OptimizationProblems.ADNLPProblems.eval(name)()
    f = Base.Fix1(mycons, nlp)
    x = nlp.meta.x0
    return jac_coeff(f, x, i, j)
end

function jac_sparsity_sct(name::Symbol)
    nlp = OptimizationProblems.ADNLPProblems.eval(name)()
    f = Base.Fix1(mycons, nlp)
    x = nlp.meta.x0
    return ADTypes.jacobian_sparsity(f, x, TracerSparsityDetector())
end

function jac_sparsity_jump(name::Symbol)
    jump_model = OptimizationProblems.PureJuMP.eval(name)()
    nlp = MathOptNLPModel(jump_model)
    jrows, jcols = jac_structure(nlp)
    nnzj = length(jrows)
    jvals = ones(Bool, nnzj)
    return sparse(jrows, jcols, jvals, nlp.meta.ncon, nlp.meta.nvar)
end

## Hessian sparsity

function hess_coeff(name::Symbol, i::Integer, j::Integer)
    nlp = OptimizationProblems.ADNLPProblems.eval(name)()
    f = Base.Fix1(mylag, nlp)
    x = nlp.meta.x0
    return hess_coeff(f, x, i, j)
end

function hess_sparsity_sct(name::Symbol)
    nlp = OptimizationProblems.ADNLPProblems.eval(name)()
    f = Base.Fix1(mylag, nlp)
    x = nlp.meta.x0
    return ADTypes.hessian_sparsity(f, x, TracerSparsityDetector())
end

function hess_sparsity_jump(name::Symbol)
    jump_model = OptimizationProblems.PureJuMP.eval(name)()
    nlp = MathOptNLPModel(jump_model)
    hrows, hcols = hess_structure(nlp)
    nnzh = length(hrows)
    hvals = ones(Bool, nnzh)
    H_L = sparse(hrows, hcols, hvals, nlp.meta.nvar, nlp.meta.nvar)
    # only the lower triangular part is stored
    return sparse(Symmetric(H_L, :L))
end

## Comparison

function compare_patterns(; sct, jump)
    A_diff = jump - sct
    nnz_sct = nnz(sct)
    nnz_jump = nnz(jump)

    diagonal = if A_diff == Diagonal(A_diff)
        "[diagonal difference only]"
    else
        ""
    end
    message = if all(>(0), nonzeros(A_diff))
        "SCT ($nnz_sct nz) ⊂ JuMP ($nnz_jump nz) $diagonal"
    elseif all(<(0), nonzeros(A_diff))
        "SCT ($nnz_sct nz) ⊃ JuMP ($nnz_jump nz) $diagonal"
    else
        "SCT ($nnz_sct nz) ≠ JuMP ($nnz_jump nz) $diagonal"
    end
    return message
end

## Actual tests

@testset verbose = true "ForwardDiff reference" begin
    f(x) = sin.(x) .* cos.(reverse(x)) .* exp(x[1]) .* log(x[end])
    g(x) = sum(f(x))
    x = rand(6)
    @testset "Jacobian" begin
        J = ForwardDiff.jacobian(f, x)
        for i in axes(J, 1), j in axes(J, 2)
            @test J[i, j] == jac_coeff(f, x, i, j)
        end
    end
    @testset "Hessian" begin
        H = ForwardDiff.hessian(g, x)
        for i in axes(H, 1), j in axes(H, 2)
            @test H[i, j] == hess_coeff(g, x, i, j)
        end
    end
end;

jac_inconsistencies = []

@testset verbose = true "Jacobian comparison" begin
    @testset "$name" for name in Symbol.(OptimizationProblems.meta[!, :name])
        @info "Testing Jacobian sparsity for $name"
        J_sct = jac_sparsity_sct(name)
        J_jump = jac_sparsity_jump(name)
        if J_sct == J_jump
            @test J_sct == J_jump
        else
            @test_broken J_sct == J_jump
            J_diff = J_jump - J_sct
            message = compare_patterns(; sct=J_sct, jump=J_jump)
            # @warn "Inconsistency for Jacobian of $name: $message"
            push!(jac_inconsistencies, (name, message))
            @test all(zip(findnz(J_diff)...)) do (i, j, _)
                iszero(jac_coeff(name, i, j))
            end
        end
    end
end;

hess_inconsistencies = []

@testset verbose = true "Hessian comparison" begin
    @testset "$name" for name in Symbol.(OptimizationProblems.meta[!, :name])
        @info "Testing Hessian sparsity for $name"
        H_sct = hess_sparsity_sct(name)
        H_jump = hess_sparsity_jump(name)
        if H_sct == H_jump
            @test H_sct == H_jump
        else
            @test_broken H_sct == H_jump
            message = compare_patterns(; sct=H_sct, jump=H_jump)
            # @warn "Inconsistency for Hessian of $name: $message"
            push!(hess_inconsistencies, (name, message))
            H_diff = H_jump - H_sct
            @test all(zip(findnz(H_diff)...)) do (i, j, _)
                iszero(hess_coeff(name, i, j))
            end
        end
    end
end;

if !isempty(jac_inconsistencies) || !isempty(hess_inconsistencies)
    @warn "Inconsistencies were detected"
    for (name, message) in jac_inconsistencies
        @warn "Inconsistency for Jacobian of $name: $message"
    end
    for (name, message) in hess_inconsistencies
        @warn "Inconsistency for Hessian of $name: $message"
    end
end
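For reference, the `jac_coeff` and `hess_coeff` helpers in `test/nlpmodels.jl` above rest on the standard identities relating matrix entries to directional derivatives along basis vectors (with $e_j$ the $j$-th standard basis vector, and using symmetry of the Hessian):

```latex
J_{ij} = \frac{\partial f_i}{\partial x_j}
       = \left.\frac{\mathrm{d}}{\mathrm{d}t}\, f_i(x + t\, e_j)\right|_{t=0},
\qquad
H_{ij} = \frac{\partial^2 f}{\partial x_i \,\partial x_j}
       = \left.\frac{\partial^2}{\partial s \,\partial t}\, f(x + t\, e_i + s\, e_j)\right|_{t=s=0}.
```

This is why the tests can check a single coefficient of a sparsity-pattern discrepancy against ForwardDiff without computing the full Jacobian or Hessian.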