Add NLPModels testing #83

Merged — 78 commits into main from gd/adnlp_tests, May 28, 2024
Conversation

@gdalle (Collaborator) commented May 22, 2024

Fix #69

There are still disagreements between SCT and JuMP, but I'm not sure we're wrong, so I've labeled them as broken tests.

I also skip a few tests for which the NLP pipeline errors out due to undefined variables.

@gdalle (Collaborator, Author) commented May 22, 2024

Here are some of the errors I'm seeing.

On the SCT side

@adrhill can you take a look at those?

  MethodError: no method matching hessian_pattern_to_mat(::Vector{SparseConnectivityTracer.Dual{Float64, SparseConnectivityTracer.HessianTracer{BitSet, Set{Tuple{Int64, Int64}}}}}, ::Float64)
  
  Closest candidates are:
    hessian_pattern_to_mat(::AbstractArray{D}, ::D) where {P, T<:SparseConnectivityTracer.HessianTracer, D<:SparseConnectivityTracer.Dual{P, T}}
     @ SparseConnectivityTracer ~/Work/GitHub/Julia/SparseConnectivityTracer.jl/src/pattern.jl:387
  
  Stacktrace:
    [1] local_hessian_pattern(f::Function, x::Vector{Float64}, ::Type{BitSet}, ::Type{Set{Tuple{Int64, Int64}}})
      @ SparseConnectivityTracer ~/Work/GitHub/Julia/SparseConnectivityTracer.jl/src/pattern.jl:369
    [2] hessian_sparsity
      @ ~/Work/GitHub/Julia/SparseConnectivityTracer.jl/src/adtypes.jl:112 [inlined]
  MethodError: ^(::SparseConnectivityTracer.Dual{Float64, SparseConnectivityTracer.HessianTracer{BitSet, Set{Tuple{Int64, Int64}}}}, ::Int64) is ambiguous.
  
  Candidates:
    ^(dx::D, y::Number) where {P, T<:SparseConnectivityTracer.HessianTracer, D<:SparseConnectivityTracer.Dual{P, T}}
      @ SparseConnectivityTracer ~/Work/GitHub/Julia/SparseConnectivityTracer.jl/src/overload_hessian.jl:120
    ^(x::Number, p::Integer)
      @ Base intfuncs.jl:311
  
  Possible fix, define
    ^(::D, ::Integer) where {P, T<:SparseConnectivityTracer.HessianTracer, D<:SparseConnectivityTracer.Dual{P, T}}
  
  Stacktrace:
    [1] literal_pow
      @ ./intfuncs.jl:351 [inlined]
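The error message above already spells out the fix. A minimal, self-contained reproduction of this kind of dispatch ambiguity, using a hypothetical `MyDual` type standing in for SCT's actual `Dual`, might look like:

```julia
# Hypothetical MyDual standing in for SparseConnectivityTracer.Dual.
struct MyDual{P} <: Number
    primal::P
end

# A broad method like SCT's `^(dx::D, y::Number)`...
Base.:^(dx::MyDual, y::Number) = MyDual(dx.primal^y)

# ...is ambiguous with Base's `^(x::Number, p::Integer)` for integer
# exponents: neither method is more specific than the other.
# The fix suggested in the error message is a more specific method
# that beats both candidates:
Base.:^(dx::MyDual, p::Integer) = MyDual(dx.primal^p)

p = 3
MyDual(2.0)^p  # now dispatches to the Integer method without ambiguity
```

With only the first method defined, `MyDual(2.0)^p` would raise the same `MethodError: ... is ambiguous` seen in the log; adding the `Integer` method resolves it.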

On the other side

@amontoison those would be for you to investigate

  UndefVarError: `xe_turtle` not defined
  Stacktrace:
    [1] triangle_turtle(; kwargs::@Kwargs{})
      @ OptimizationProblems.ADNLPProblems ~/.julia/packages/OptimizationProblems/nfPUU/src/ADNLPProblems/triangle.jl:72

As well as plenty of errors of the form "expected Float64, got Dual" for the in-place constraints.
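Those "expected Float64, got Dual" errors typically come from in-place functions that allocate a workspace with a hard-coded element type, which cannot hold tracer or dual numbers. A hedged sketch with a hypothetical constraint function (not actual code from OptimizationProblems):

```julia
# Fails under tracing: a Float64-only workspace cannot store Dual numbers.
function c_bad!(cx, x)
    buffer = zeros(Float64, 2)   # hard-coded eltype
    buffer[1] = x[1] * x[2]      # "expected Float64, got Dual" when x is traced
    buffer[2] = x[2]^2
    cx .= buffer
    return cx
end

# Generic version: derive the workspace eltype from the input instead.
function c_good!(cx, x)
    buffer = zeros(eltype(x), 2)
    buffer[1] = x[1] * x[2]
    buffer[2] = x[2]^2
    cx .= buffer
    return cx
end
```

Both versions behave identically on plain `Float64` inputs; only the generic one survives being called with tracer inputs.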

@adrhill (Owner) commented May 22, 2024

@adrhill can you take a look at those?

  MethodError: no method matching hessian_pattern_to_mat(::Vector{SparseConnectivityTracer.Dual{Float64, SparseConnectivityTracer.HessianTracer{BitSet, Set{Tuple{Int64, Int64}}}}}, ::Float64)

Looks like the output isn't a tracer and therefore doesn't contain information about the Hessian.

 MethodError: ^(::SparseConnectivityTracer.Dual{Float64, SparseConnectivityTracer.HessianTracer{BitSet, Set{Tuple{Int64, Int64}}}}, ::Int64) is ambiguous.

Will fix. (EDIT: see #84)

Base automatically changed from gd/adnlp to main May 22, 2024 12:12
@gdalle (Collaborator, Author) commented May 27, 2024

@amontoison I did some more investigation on the sparsity pattern discrepancies between us and JuMP:

  • After double-checking with ForwardDiff: every discrepancy is a non-structural zero of the matrix in question, so there is no correctness issue on either side.
  • In all cases but one (the Jacobian of lincon), SCT is more precise than JuMP, that is, the SCT nonzero pattern is included in the JuMP nonzero pattern.
  • The diagonal differences don't matter for coloring Hessians, but there are 4 cases in which non-diagonal differences occur: britgas, channel, hs114 and marine. I haven't checked what it means for coloring, but it could possibly lead to fewer colors.

In addition, we made it possible (in v0.5.0) to use global sparsity detection with control flow like x[1] > 1 or ifelse. Thus, the patterns detected by SCT should not depend on the input values (unless you do something unusual with closures or global variables).
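The branch-union behavior behind input-independent patterns can be illustrated with a toy tracer (a hypothetical `Tracer` type; SCT's real tracers are far more elaborate):

```julia
# Toy dependency tracer: records which input indices a value depends on.
struct Tracer
    inputs::BitSet
end

# Arithmetic propagates dependencies by taking unions:
Base.:*(a::Tracer, b::Tracer) = Tracer(union(a.inputs, b.inputs))

# `ifelse` on tracers keeps the union of BOTH branches, so the detected
# pattern does not depend on the runtime value of the condition:
Base.ifelse(cond::Bool, a::Tracer, b::Tracer) = Tracer(union(a.inputs, b.inputs))

x = [Tracer(BitSet([i])) for i in 1:3]
y = ifelse(false, x[1] * x[2], x[3])  # depends on inputs 1, 2 AND 3
```

Here `y` records dependencies on all three inputs even though only one branch would execute at runtime, which is exactly why the global pattern stays the same for any input values.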

To sum up, I think SCT is ready for use in NLPModels, with the caveat that some Hessian sparsity detections are very slow: the largest of the tetra_* instances takes around 2 minutes.
That is something I hope to fix with #103 and some other improvements.

Jacobian

Inconsistency for Jacobian of lincon: SCT (19 nz) ⊃ JuMP (17 nz) 

Hessian

Inconsistency for Hessian of aircrfta: SCT (16 nz) ⊂ JuMP (21 nz) [diagonal difference only]
Inconsistency for Hessian of argauss: SCT (8 nz) ⊂ JuMP (9 nz) [diagonal difference only]
Inconsistency for Hessian of biggs5: SCT (9 nz) ⊂ JuMP (12 nz) [diagonal difference only]
Inconsistency for Hessian of biggs6: SCT (9 nz) ⊂ JuMP (12 nz) [diagonal difference only]
Inconsistency for Hessian of britgas: SCT (1087 nz) ⊂ JuMP (1415 nz) 
Inconsistency for Hessian of camshape: SCT (395 nz) ⊂ JuMP (494 nz) [diagonal difference only]
Inconsistency for Hessian of chain: SCT (75 nz) ⊂ JuMP (100 nz) [diagonal difference only]
Inconsistency for Hessian of channel: SCT (696 nz) ⊂ JuMP (768 nz) 
Inconsistency for Hessian of controlinvestment: SCT (100 nz) ⊂ JuMP (200 nz) [diagonal difference only]
Inconsistency for Hessian of hs106: SCT (10 nz) ⊂ JuMP (18 nz) [diagonal difference only]
Inconsistency for Hessian of hs114: SCT (19 nz) ⊂ JuMP (24 nz) 
Inconsistency for Hessian of hs116: SCT (27 nz) ⊂ JuMP (34 nz) [diagonal difference only]
Inconsistency for Hessian of hs250: SCT (6 nz) ⊂ JuMP (9 nz) [diagonal difference only]
Inconsistency for Hessian of hs251: SCT (6 nz) ⊂ JuMP (9 nz) [diagonal difference only]
Inconsistency for Hessian of hs36: SCT (6 nz) ⊂ JuMP (9 nz) [diagonal difference only]
Inconsistency for Hessian of hs37: SCT (6 nz) ⊂ JuMP (9 nz) [diagonal difference only]
Inconsistency for Hessian of hs40: SCT (15 nz) ⊂ JuMP (16 nz) [diagonal difference only]
Inconsistency for Hessian of hs41: SCT (6 nz) ⊂ JuMP (9 nz) [diagonal difference only]
Inconsistency for Hessian of hs44: SCT (8 nz) ⊂ JuMP (12 nz) [diagonal difference only]
Inconsistency for Hessian of hs45: SCT (20 nz) ⊂ JuMP (25 nz) [diagonal difference only]
Inconsistency for Hessian of hs56: SCT (10 nz) ⊂ JuMP (13 nz) [diagonal difference only]
Inconsistency for Hessian of hs68: SCT (9 nz) ⊂ JuMP (10 nz) [diagonal difference only]
Inconsistency for Hessian of hs69: SCT (9 nz) ⊂ JuMP (10 nz) [diagonal difference only]
Inconsistency for Hessian of hs83: SCT (15 nz) ⊂ JuMP (19 nz) [diagonal difference only]
Inconsistency for Hessian of hs84: SCT (8 nz) ⊂ JuMP (13 nz) [diagonal difference only]
Inconsistency for Hessian of hs87: SCT (9 nz) ⊂ JuMP (11 nz) [diagonal difference only]
Inconsistency for Hessian of hs93: SCT (34 nz) ⊂ JuMP (36 nz) [diagonal difference only]
Inconsistency for Hessian of hs95: SCT (12 nz) ⊂ JuMP (17 nz) [diagonal difference only]
Inconsistency for Hessian of hs96: SCT (12 nz) ⊂ JuMP (17 nz) [diagonal difference only]
Inconsistency for Hessian of hs97: SCT (12 nz) ⊂ JuMP (17 nz) [diagonal difference only]
Inconsistency for Hessian of hs98: SCT (12 nz) ⊂ JuMP (17 nz) [diagonal difference only]
Inconsistency for Hessian of marine: SCT (186 nz) ⊂ JuMP (239 nz) 
Inconsistency for Hessian of polygon1: SCT (550 nz) ⊂ JuMP (600 nz) [diagonal difference only]
Inconsistency for Hessian of polygon2: SCT (350 nz) ⊂ JuMP (400 nz) [diagonal difference only]
Inconsistency for Hessian of robotarm: SCT (252 nz) ⊂ JuMP (321 nz) [diagonal difference only]
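The labels above ("⊂", "[diagonal difference only]") amount to set operations on (row, col) pairs of structural nonzeros. A hypothetical sketch of those checks (not the actual test code):

```julia
# Patterns represented as Sets of (row, col) tuples of structural nonzeros.
is_subpattern(a, b) = issubset(a, b)

# "Diagonal difference only": every entry where the patterns disagree
# lies on the diagonal.
diagonal_difference_only(a, b) = all(ij -> ij[1] == ij[2], symdiff(a, b))

sct  = Set([(1, 1), (1, 2), (2, 1)])
jump = Set([(1, 1), (1, 2), (2, 1), (2, 2)])

is_subpattern(sct, jump)            # SCT ⊂ JuMP
diagonal_difference_only(sct, jump) # only the diagonal entry (2, 2) differs
```

With such helpers, each row of the list reduces to a subset test plus a diagonal check on the symmetric difference.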

@adrhill (Owner) commented May 27, 2024

with the caveat that some Hessian sparsity detections are very slow

This caveat should come with the meta-caveat that we haven't done any performance optimizations on Hessians yet. We have plenty of ideas that will speed things up significantly (e.g. #105, #80, #34).

Resolved review threads on: test/nlpmodels.jl, test/runtests.jl, .github/workflows/CI.yml
@adrhill (Owner) commented May 28, 2024

@gdalle can this be merged?

@gdalle (Collaborator, Author) commented May 28, 2024

Yep

@adrhill adrhill merged commit 851ba51 into main May 28, 2024
5 checks passed
@adrhill adrhill deleted the gd/adnlp_tests branch May 28, 2024 16:19
Merging this pull request closed: Add optimization problems to "real-world" tests (#69)