Can't dualize generic models #167
At some point it was fixed to be `Float64` because JuMP did not support other types. We have since changed a lot of the code to make it generic, but some parts must have been missed. If it is not working now, it should be considered a bug. Could you provide an MWE?
Sure, here it is:

```julia
using JuMP
import Dualization
using LinearAlgebra
using GenericLinearAlgebra
import Hypatia

function mineig(::Type{T}) where {T}
    model = GenericModel{T}()
    d = 3
    c = Hermitian(randn(T, d, d))
    @variable(model, x[1:d, 1:d] in PSDCone())
    @constraint(model, tr(x) == 1)
    @objective(model, Min, real(dot(c, x)))
    set_attribute(model, "verbose", true)
    set_optimizer(model, Dualization.dual_optimizer(Hypatia.Optimizer{T}))
    # set_optimizer(model, Hypatia.Optimizer{T})
    optimize!(model)
    display(minimum(eigvals(c)))
    return objective_value(model)
end
```

If you call …
@araujoms indeed this is a bug. I have managed to build a workaround without that function:

```julia
function mineig(::Type{T}) where {T}
    model = GenericModel{T}()
    d = 3
    c = Hermitian(randn(T, d, d))
    @variable(model, x[1:d, 1:d] in PSDCone())
    @constraint(model, tr(x) == 1)
    @objective(model, Min, real(dot(c, x)))
    set_attribute(model, "verbose", true)
    optimizer_constructor = Hypatia.Optimizer{T}
    dual_optimizer = () -> Dualization.DualOptimizer{T}(MOI.instantiate(optimizer_constructor))
    set_optimizer(model, dual_optimizer)
    # set_optimizer(model, Hypatia.Optimizer{T})
    optimize!(model)
    display(minimum(eigvals(c)))
    return objective_value(model)
end

@show mineig(Float64)
@show mineig(Float32)
```

The problem is that the default `dual_optimizer` function,

```julia
function dual_optimizer(optimizer_constructor)
    return () -> DualOptimizer(MOI.instantiate(optimizer_constructor))
end
```

falls back to the default constructor of `DualOptimizer`:

```julia
function DualOptimizer(dual_optimizer::OT) where {OT<:MOI.ModelLike}
    return DualOptimizer{Float64}(dual_optimizer)
end
```

To be able to do this automatically, we should be able to query the precision the solver is working in.

cc @blegat I haven't found any API to query the precision of a solver in the docs, do we have one?
Thanks a lot! Now that I know what is going on, it is easy to fix: we can just extract the type from the solver. The following implementation:

```julia
using JuMP
using Dualization
using LinearAlgebra
using GenericLinearAlgebra
import Hypatia

function mineig(::Type{T}) where {T}
    model = GenericModel{T}()
    d = 3
    c = Hermitian(randn(T, d, d))
    @variable(model, x[1:d, 1:d] in PSDCone())
    @constraint(model, tr(x) == 1)
    @objective(model, Min, real(dot(c, x)))
    set_attribute(model, "verbose", true)
    set_optimizer(model, dual_optimizer2(Hypatia.Optimizer{T}))
    optimize!(model)
    display(minimum(eigvals(c)))
    return objective_value(model)
end

function dual_optimizer2(optimizer_constructor)
    T = optimizer_constructor.parameters[1]
    return () -> DualOptimizer{T}(MOI.instantiate(optimizer_constructor))
end
```

I have a feeling this is not what you wanted, though?
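The `.parameters[1]` trick above relies on the optimizer constructor being a parametric `DataType`. A minimal sketch of why that assumption is fragile, using a hypothetical `FakeOptimizer` type as a stand-in for a real solver (not part of any package):

```julia
# Hypothetical stand-in for a parametric solver type such as Hypatia.Optimizer{T}.
struct FakeOptimizer{T} end

# When the constructor is the DataType itself, its type parameters are inspectable:
constructor = FakeOptimizer{Float32}
T = constructor.parameters[1]
println(T)  # Float32

# But an optimizer constructor is allowed to be any zero-argument callable,
# and an anonymous function carries no `parameters` field to inspect:
closure = () -> FakeOptimizer{Float32}()
println(hasproperty(closure, :parameters))  # false
```

This is why the approach works for `Hypatia.Optimizer{T}` but cannot work for every legal optimizer constructor.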
@araujoms your code won't work for things like …

@guilhermebodin I don't think there's a good way. Just implement something like:

```julia
function DualOptimizer{T}(optimizer_constructor)
    return () -> DualOptimizer{T}(MOI.instantiate(optimizer_constructor))
end

set_optimizer(model, DualOptimizer{T}(Hypatia.Optimizer{T}))
```

or:

```julia
function dual_optimizer(optimizer_constructor; number_type = Float64)
    return () -> DualOptimizer{number_type}(MOI.instantiate(optimizer_constructor))
end

set_optimizer(model, dual_optimizer(Hypatia.Optimizer{T}; number_type = T))
```
I think …

@odow In this case one can just test whether …

If it is a function, then what is the type? There is no rule on optimizer constructors, except that when called with no arguments it should return an instance of …

Well, you can make this a rule. Since the only arbitrary-precision solvers we have are Hypatia and Clarabel, and they already respect this rule, it won't be breaking. I think it is in any case a good idea to have a standard way to query the type. (Technically speaking, COSMO also accepts a type argument, but it doesn't work for anything other than …

If we had a rule, I would prefer it to be something like …

That would be great, but I don't see how it would help with the problem at hand, since …

They're the same thing.
Yes, but then:

```julia
function DualOptimizer(dual_optimizer::OT) where {OT<:MOI.ModelLike}
    return DualOptimizer{MOI.get(dual_optimizer, MOI.ValueType())}(dual_optimizer)
end
```
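Note that `MOI.ValueType` is a proposed attribute, not something MathOptInterface currently defines. The shape of such a query can be sketched without MOI; every name below (`AbstractOptimizerAttribute`, `ValueType`, `ToyOptimizer`, `attr_get`) is hypothetical and only illustrates the pattern:

```julia
# Hedged sketch of a solver-precision attribute; these names do not exist in
# MathOptInterface and stand in for the proposed API.
abstract type AbstractOptimizerAttribute end
struct ValueType <: AbstractOptimizerAttribute end

# Hypothetical solver parameterized by its working precision.
struct ToyOptimizer{T} end

# A solver implementing the attribute advertises the numeric type it works in:
attr_get(::ToyOptimizer{T}, ::ValueType) where {T} = T

println(attr_get(ToyOptimizer{BigFloat}(), ValueType()))  # BigFloat
```

With such a query in place, the `Float64` fallback above could dispatch on the solver's actual precision instead of hardcoding it.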
Ah I see, if I do …

Well, there's nothing I can do except encourage you to implement it.
I don't know if this is not yet implemented or a bug, but when I try to dualize a model that's built with anything other than `Float64`, for example `GenericModel{BigFloat}()`, I get an error. If it's a bug, I'll be happy to provide an MWE. In any case, it's a feature that would be nice to have.