
Can't dualize generic models #167

Closed
araujoms opened this issue Dec 11, 2024 · 15 comments · Fixed by #168
Comments

@araujoms

I don't know if this is not yet implemented or a bug, but when I try to dualize a model built with anything other than Float64, for example GenericModel{BigFloat}(), I get an error. If it's a bug I'll be happy to provide an MWE.

In any case, it's a feature that would be nice to have.

@guilhermebodin
Collaborator

At some point it was fixed to Float64 because JuMP did not support other types. We have since changed a lot of the code to make it generic, but some parts must have been missed.

If it is not working now, it should be considered a bug. Could you provide the MWE?

@araujoms
Author

Sure, here it is:

using JuMP
import Dualization
using LinearAlgebra
using GenericLinearAlgebra
import Hypatia

function mineig(::Type{T}) where {T}
    model = GenericModel{T}()
    d = 3
    c = Hermitian(randn(T,d,d))
    @variable(model, x[1:d,1:d] in PSDCone())
    @constraint(model, tr(x) == 1)
    @objective(model, Min, real(dot(c,x)))
    set_attribute(model, "verbose", true)   
    set_optimizer(model, Dualization.dual_optimizer(Hypatia.Optimizer{T}))    
    #set_optimizer(model, Hypatia.Optimizer{T})
    optimize!(model)
    display(minimum(eigvals(c)))
    return objective_value(model)
end

If you call mineig(Float64) it works, but mineig(BigFloat) (or anything else, even Float32) fails.

@guilhermebodin
Collaborator

guilhermebodin commented Dec 11, 2024

@araujoms indeed this is a bug. I have managed to build a workaround without the function Dualization.dual_optimizer:

function mineig(::Type{T}) where {T}
    model = GenericModel{T}()
    d = 3
    c = Hermitian(randn(T,d,d))
    @variable(model, x[1:d,1:d] in PSDCone())
    @constraint(model, tr(x) == 1)
    @objective(model, Min, real(dot(c,x)))
    set_attribute(model, "verbose", true)
    optimizer_constructor = Hypatia.Optimizer{T}
    dual_optimizer = () -> Dualization.DualOptimizer{T}(MOI.instantiate(optimizer_constructor))
    set_optimizer(model, dual_optimizer)    
    # set_optimizer(model, Hypatia.Optimizer{T})
    optimize!(model)
    display(minimum(eigvals(c)))
    return objective_value(model)
end

@show mineig(Float64)
@show mineig(Float32)

The problem is that the default function for Dualization.dual_optimizer

function dual_optimizer(optimizer_constructor)
    return () -> DualOptimizer(MOI.instantiate(optimizer_constructor))
end

falls back to the default constructor of Dualization.DualOptimizer, which has a hard-coded Float64. There is another version that receives the parametric type and builds everything correctly:

function DualOptimizer(dual_optimizer::OT) where {OT<:MOI.ModelLike}
    return DualOptimizer{Float64}(dual_optimizer)
end

To be able to do this automatically we should be able to query the precision the solver is working on.

cc @blegat I haven't found any API to query the precision of a solver in the docs, do we have one?

@araujoms
Author

Thanks a lot! Now that I know what is going on, it is easy to fix: we can just extract the number type from the solver type. The following implementation of dual_optimizer works:

using JuMP
using Dualization
using LinearAlgebra
using GenericLinearAlgebra
import Hypatia

function mineig(::Type{T}) where {T}
    model = GenericModel{T}()
    d = 3
    c = Hermitian(randn(T,d,d))
    @variable(model, x[1:d,1:d] in PSDCone())
    @constraint(model, tr(x) == 1)
    @objective(model, Min, real(dot(c,x)))
    set_attribute(model, "verbose", true)   
    set_optimizer(model, dual_optimizer2(Hypatia.Optimizer{T}))    
    optimize!(model)
    display(minimum(eigvals(c)))
    return objective_value(model)
end

function dual_optimizer2(optimizer_constructor)
    T = optimizer_constructor.parameters[1]
    return () -> DualOptimizer{T}(MOI.instantiate(optimizer_constructor))
end

I have a feeling this is not what you wanted, though?

@odow
Member

odow commented Dec 12, 2024

@araujoms your code won't work for things like dual_optimizer2(() -> Hypatia.Optimizer{T}()).

@guilhermebodin I don't think there's a good way. Just implement something like:

function DualOptimizer{T}(optimizer_constructor) where {T}
    return () -> DualOptimizer{T}(MOI.instantiate(optimizer_constructor))
end

set_optimizer(model, DualOptimizer{T}(Hypatia.Optimizer{T}))

or

function dual_optimizer(optimizer_constructor; number_type = Float64)
    return () -> DualOptimizer{number_type}(MOI.instantiate(optimizer_constructor))
end

set_optimizer(model, dual_optimizer(Hypatia.Optimizer{T}; number_type = T))

@blegat
Member

blegat commented Dec 12, 2024

I think DualOptimizer{T}(optimizer_constructor) would be the most standard. You can assume that if what you received is not a MOI.ModelLike, it must be an optimizer_constructor.

@araujoms
Author

@odow In this case one can just test whether optimizer_constructor is a function, and then extract the type as typeof(optimizer_constructor()).parameters[1]. It just seems silly to ask for a keyword argument with the type when it's already included in optimizer_constructor.
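A minimal sketch of this reflection idea (the helper name dual_optimizer_reflect is hypothetical, not part of Dualization's API, and it assumes the number type is the solver type's first type parameter):

```julia
using JuMP          # re-exports MOI
using Dualization   # exports DualOptimizer

# Hypothetical helper: extract the number type from the constructor itself.
function dual_optimizer_reflect(optimizer_constructor)
    T = if optimizer_constructor isa Type
        # Handed a type like Hypatia.Optimizer{T}: read T off its parameters.
        optimizer_constructor.parameters[1]
    else
        # Handed a zero-argument function: call it once and inspect the
        # resulting optimizer's type instead.
        typeof(optimizer_constructor()).parameters[1]
    end
    return () -> DualOptimizer{T}(MOI.instantiate(optimizer_constructor))
end
```

Note this only works for solvers whose first type parameter happens to be the number type, which is exactly the fragility @blegat points out below.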

@blegat
Member

blegat commented Dec 12, 2024

If it is a function, then what is the type? There is no rule for optimizer constructors except that, when called with no arguments, they must return an instance of MOI.ModelLike; you don't know that the number type will be the first type parameter. So if you access internal type parameters like that, it will only work for some solvers and will break for others.

@araujoms
Author

Well, you can make this a rule. Since the only arbitrary-precision solvers we have are Hypatia and Clarabel, and they already respect this rule, it wouldn't be a breaking change. I think it is a good idea anyway to have a standard way to query the type.

(Technically speaking, COSMO also accepts a type argument, but it doesn't work for anything other than Float64.)

@blegat
Member

blegat commented Dec 12, 2024

If we had a rule, I would prefer it to be something like MOI.get(model, MOI.ValueType()) or MOI.get(model, MOI.CoefficientType()) that we would add to MOI.

@araujoms
Author

That would be great, but I don't see how it would help with the problem at hand, since dual_optimizer takes an optimizer as input, not a model.

@odow
Member

odow commented Dec 12, 2024

takes an optimizer as input, not a model

They're the same thing

@blegat
Member

blegat commented Dec 12, 2024

Yes, but then DualOptimizer receives an instance, so you would just do:

function DualOptimizer(dual_optimizer::OT) where {OT<:MOI.ModelLike}
    return DualOptimizer{MOI.get(dual_optimizer, MOI.ValueType())}(dual_optimizer)
end

@araujoms
Author

Ah, I see: if I do a = Hypatia.Optimizer{T}, then MOI.instantiate(a) is a MOI.ModelLike, and then your function works.

Well, there's nothing I can do except encourage you to implement it.
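For concreteness, the chain described above can be checked directly; this is a sketch assuming Hypatia is installed:

```julia
using JuMP   # re-exports MOI
import Hypatia

a = Hypatia.Optimizer{BigFloat}  # an optimizer_constructor (here, a type)
inner = MOI.instantiate(a)       # calls a() to build an optimizer instance
# The instance is a MOI.ModelLike, so the ModelLike method of
# DualOptimizer (the one that would query the value type) applies to it.
@show inner isa MOI.ModelLike
```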

@guilhermebodin
Collaborator

cc @odow @blegat: not sure which path you prefer, but I made this PR: #168

odow closed this as completed in #168 Dec 21, 2024