
Can't use DIRECT method #29

Closed
fedebenelli opened this issue Sep 11, 2024 · 3 comments

@fedebenelli

Hello! Thank you so much for developing this project; it is helping me a lot in my research.

Until now I have used local optimizers with no problem, but now that I'm trying to use a global optimizer I'm running into trouble that I can't find a solution to.

I'm trying to use the DIRECT method (the same problem also occurs with other global optimizers). No matter how many parameters I set, I always get an NLOPT_INVALID_ARGS code. Here is a minimal script to replicate it. In the script I only set bounds, but I've tried every method that starts with set_. Am I doing something wrong?

module myfoo
  use iso_fortran_env, only: wp => real64
  implicit none

contains

  function foo(x, gradient, func_data) result(f)
    real(wp), intent(in) :: x(:)
    real(wp), intent(in out), optional :: gradient(:)
    class(*), intent(in), optional :: func_data
    real(wp) :: f

    f = x(1)**2 + x(2)**2

  end function foo
end module


program main
  use myfoo, only: foo, wp
  use nlopt_wrap, only: create, destroy, nlopt_opt, nlopt_algorithm_enum
  use nlopt_callback, only: nlopt_func, create_nlopt_func
  implicit none

  type(nlopt_opt) :: opt
  type(nlopt_func) :: f

  real(wp) :: x(2), fval
  integer :: stat

  X = [2, 5]

  f = create_nlopt_func(foo)
  ! opt = nlopt_opt(nlopt_algorithm_enum%LN_NELDERMEAD, 2)
  opt = nlopt_opt(nlopt_algorithm_enum%GN_DIRECT, 2)

  call opt%set_min_objective(f)

  call opt%set_lower_bounds([-1._wp, -1._wp])
  call opt%set_upper_bounds([1._wp, 1._wp])


  call opt%optimize(x, fval, stat)
  print *, X
  print *, fval, stat

end program main
@awvwgk (Member) commented Sep 12, 2024

From the example code posted, the main issue I can spot is that the trial vector X = [2, 5] already lies above the optimizer's upper bounds. For this particular example, that could be the reason for the failure.

I will try to find some time over the weekend to check in more detail.
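For context, here is a minimal sketch (in Python, with a hypothetical helper name, not NLopt's actual internals) of the kind of feasibility check that makes an out-of-bounds starting point fail up front in a bound-constrained global optimizer such as DIRECT:

```python
# Hypothetical sketch: a bound-constrained global optimizer checks that the
# starting point lies inside the box [lower, upper] before doing any work.
# If any component is outside, the run is rejected immediately (NLopt
# reports this class of error as NLOPT_INVALID_ARGS).

def initial_point_in_bounds(x, lower, upper):
    """Return True if every component of x lies within its bounds."""
    return all(lo <= xi <= hi for xi, lo, hi in zip(x, lower, upper))

# The reported case: x = [2, 5] against the box [-1, 1] x [-1, 1].
print(initial_point_in_bounds([2.0, 5.0], [-1.0, -1.0], [1.0, 1.0]))  # False -> rejected
# The corrected starting point from the fixed example below:
print(initial_point_in_bounds([0.2, 0.5], [-1.0, -1.0], [1.0, 1.0]))  # True  -> accepted
```

The practical takeaway is simply that the starting vector passed to `optimize` must lie inside the box defined by `set_lower_bounds` and `set_upper_bounds`.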

@awvwgk (Member) commented Sep 16, 2024

This works fine for me:

module myfoo
  use iso_fortran_env, only: wp => real64
  implicit none

contains

  function foo(x, gradient, func_data) result(f)
    real(wp), intent(in) :: x(:)
    real(wp), intent(in out), optional :: gradient(:)
    class(*), intent(in), optional :: func_data
    real(wp) :: f

    f = x(1)**2 + x(2)**2
    print *, "[", x, "]", f

  end function foo
end module


program main
  use myfoo, only: foo, wp
  use nlopt_wrap, only: create, destroy, nlopt_opt, nlopt_algorithm_enum
  use nlopt_callback, only: nlopt_func, create_nlopt_func
  use nlopt_enum, only: result_to_string
  implicit none

  type(nlopt_opt) :: opt
  type(nlopt_func) :: f

  real(wp) :: x(2), fval
  integer :: stat

  X = [.2, .5]

  f = create_nlopt_func(foo)
  ! opt = nlopt_opt(nlopt_algorithm_enum%LN_NELDERMEAD, 2)
  opt = nlopt_opt(nlopt_algorithm_enum%GN_DIRECT, 2)

  call opt%set_min_objective(f)

  call opt%set_lower_bounds([-1._wp, -1._wp])
  call opt%set_upper_bounds([1._wp, 1._wp])

  call opt%set_maxeval(200)

  call opt%optimize(x, fval, stat)
  print *, X
  print *, fval, result_to_string(stat)

end program main

Closing this as resolved; feel free to reopen.

awvwgk closed this as completed Sep 16, 2024

@fedebenelli (Author)

Thanks for the help! I did not know that detail about the initial vector; it makes sense.
