Problem with 53 variables to be optimized using PyNOMAD; PSD_NOMAD not working; What other options do I have? #165
Comments
For now PSD-Mads cannot be used when running the Python interface version. However, you can use the regular version of Nomad built with OpenMP support (see README) and use a standalone Python blackbox evaluation. I put an example in the develop branch: examples/advanced/batch/PSDMadsWithPythonBB.
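Such a standalone blackbox for NOMAD's batch mode is typically a small script that NOMAD invokes with the path of a file containing one evaluation point and whose outputs are read from stdout. A minimal sketch (the sphere objective is only a placeholder, not from the thread; see the PSDMadsWithPythonBB example for the real setup):

```python
#!/usr/bin/env python
# Hypothetical standalone blackbox for NOMAD's batch mode: NOMAD calls this
# script with one argument, the path of a file holding whitespace-separated
# coordinates of the point to evaluate, and parses the printed outputs.
import sys

def evaluate(point):
    # Placeholder objective (sphere); replace with the real, costly simulation.
    return sum(v * v for v in point)

def main(path):
    with open(path) as f:
        point = [float(v) for v in f.read().split()]
    # NOMAD expects whitespace-separated outputs: objective first, then constraints.
    print(evaluate(point))

if __name__ == "__main__" and len(sys.argv) > 1:
    main(sys.argv[1])
```

The script is then declared to NOMAD via the BB_EXE parameter in the batch-mode parameter file.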
Thanks for your quick reply. Truly appreciate it. For me, it has to be entirely in Python for now. So PSD-Mads would not work in Python yet, and you are not recommending SSD-Mads (although, after posting here, I triggered a run with SSD-Mads; it was comparatively very fast and the outputs also improved, but they are still far from the optimum). What other options do I have to improve the results? Which other parameters can I try?
Here are a few things you can try (one by one or combined):
1- Use DIRECTION_TYPE ORTHO 2N. The default, Ortho n+1 quad, is usually more efficient, but not always!
3- Enable the VNS search: VNS_MADS_SEARCH yes. This enforces a more global exploration by trying to escape local minima. It is a costly strategy, but if you see that many evaluations are spent without improvement, it could be an interesting one.
4- Spend some evaluations exploring the design space globally at the beginning: LH_SEARCH x 0, with a reasonable x value for an initial Latin Hypercube (LH) search. LH will sample points within the bounds of the variables (bounds are required for all variables). This strategy can be interesting if your initial point is not very good. It also provides information to build better global quadratic models of the objective and constraints to guide the quadratic model search.
5- If some of your variables "work" together to affect the objective and constraints, you can group them. For example, the (x,y,z) coordinates of points in space can be grouped -> VARIABLE_GROUP 0-2. Several groups can be defined. PSD-Mads (and SSD-Mads) makes groups of variables (groups of 2 by default) randomly at each super-iteration. I am not sure that you can force group definitions with PSD-Mads.
This advice (except #5) is already in Chapter 5, "Tricks of the trade", of the user guide.
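For reference, these suggestions map onto NOMAD parameter strings along the following lines (a hedged sketch: the parameter names come from the NOMAD user guide, but the budget and LH point count are illustrative; with PyNomad such strings are passed in the parameter list):

```python
# Illustrative NOMAD parameter strings for the suggestions above.
# Values (budget, LH count, group indices) are placeholders to adapt.
params = [
    "DIMENSION 53",
    "MAX_BB_EVAL 2000",         # evaluation budget; illustrative value
    "DIRECTION_TYPE ORTHO 2N",  # instead of the default Ortho n+1 quad
    "VNS_MADS_SEARCH yes",      # more global, but costly, exploration
    "LH_SEARCH 100 0",          # 100 initial Latin Hypercube points, 0 later
    "VARIABLE_GROUP 0-2",       # group variables that act together
]
```

In PyNomad these strings would form the last argument of the optimize call.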
Thanks, especially for the explanation of what is actually affected by changing each of these parameters, which doesn't usually come out very clearly in the user guide. I am new to this entire concept of black-box optimization, which is probably why I have so many basic questions. Do you have any other resource you can point me to that would help me understand these concepts better? Also, I tried a few more runs with SSD-Mads, with different values of SSD_MADS_NB_VAR_IN_SUBPROBLEM, and it failed (gave very off answers) for some of those values, without any apparent pattern. For example, it worked for a value of 8 but performed miserably with 10; it worked well with 2 but performed very badly with 5. Do you have any insight on this, or would you say I should stay away from SSD-Mads for the time being?
A complete introduction to derivative-free and blackbox optimization can be found in the textbook:
From our experience, 2 is a good value for the subproblems (random variable groups of size 2). You can play with SSD_MADS_ITER_OPPORTUNISTIC (set it to false) and SSD_MADS_SUBPROBLEM_MAX_BB_EVAL (10 if you use 2 for the subproblem size).
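As parameter strings, these SSD-Mads settings would look roughly like this (a hedged sketch; the boolean form of SSD_MADS_OPTIMIZATION is an assumption based on the error message earlier in the thread):

```python
# Illustrative SSD-Mads tuning, following the values suggested above.
ssd_params = [
    "SSD_MADS_OPTIMIZATION true",         # assumed boolean form
    "SSD_MADS_NB_VAR_IN_SUBPROBLEM 2",    # random variable groups of size 2
    "SSD_MADS_ITER_OPPORTUNISTIC false",  # as suggested above
    "SSD_MADS_SUBPROBLEM_MAX_BB_EVAL 10", # ~10 evals per size-2 subproblem
]
```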
Sure, thanks for the further suggestions and the book recommendation. I may be naive in asking this, but could you suggest any other algorithm/optimizer (Python-based) that I should definitely explore apart from PyNOMAD for problems with a large number of variables? Regards
You can try CMA-ES.
Sure, thanks for all the guidance.
Hi again. I then tried setting the parameter ANISOTROPIC_MESH to FALSE, and that worked, but the run times have increased. Could you please shed some light on this? Why is this happening? Is it recommended?
When letting Nomad run with the default stopping criteria, we have some mathematical guarantees of reaching a local optimum. Nothing more. But in some cases the convergence can be extremely slow, and we stop because the mesh is too fine. By default, the mesh refinement is managed differently for each variable (mesh anisotropy). In the presence of many local optima, Nomad can converge to different points for different settings. Even changing only the seed will affect the result.
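The two settings discussed here, expressed as parameter strings (a hedged sketch; the seed value is illustrative):

```python
# Illustrative parameter strings for the mesh/seed discussion above.
mesh_params = [
    "ANISOTROPIC_MESH false",  # refine the mesh uniformly across variables
    "SEED 42",                 # changing only the seed can change the final point
]
```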
Makes sense. I am running a few more experiments and am now trying pycma (CMA-ES) as well. Thanks, I will keep you posted here. Your inputs and guidance are very helpful. Thanks again!!
Hi all,
I am working on optimizing a black box function. The problem has 52 variables.
PyNOMAD, running for around 2.5 hours (the BB function evaluation is costly), is giving inferior answers compared to a Genetic Algorithm implementation. While going through the documentation, I came across this page - https://nomad-4-user-guide.readthedocs.io/en/latest/TricksOfTheTrade.html - which suggests using PSD-Mads.
Trying to use PSD-Mads in python is giving this error -
NOMAD Parameter Error:
NOMAD::Exception thrown (D:\a\nomad\nomad\src\Param\RunParameters.cpp, 506) Error: PSD_MADS_OPTIMIZATION can only be used when OpenMP is available. If that is not the case, use SSD_MADS_OPTIMIZATION.
It suggests using SSD_MADS, but I cannot find any documentation on it. Is it comparable to PSD-Mads? How can I use PSD-Mads in Python?
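For context, a fallback along the lines the error message suggests might look like this in PyNomad (a hedged sketch, not from the thread: the blackbox signature follows the PyNomad examples, the boolean form of SSD_MADS_OPTIMIZATION is an assumption, and the sphere objective is a placeholder for the real blackbox):

```python
# Hypothetical PyNomad setup falling back to SSD-Mads when OpenMP
# (and hence PSD_MADS_OPTIMIZATION) is unavailable.
def bb(x):
    # x is a PyNomad evaluation point; compute the objective (placeholder
    # sphere here) and attach it as the blackbox output string.
    f = sum(x.get_coord(i) ** 2 for i in range(x.size()))
    x.setBBO(str(f).encode("UTF-8"))
    return 1  # evaluation succeeded

dim = 52
params = [
    f"DIMENSION {dim}",
    "BB_OUTPUT_TYPE OBJ",
    "MAX_BB_EVAL 5000",            # budget; illustrative value
    "SSD_MADS_OPTIMIZATION true",  # PSD-Mads requires OpenMP; SSD does not
]

# To run (requires the PyNomad package):
#   import PyNomad
#   result = PyNomad.optimize(bb, dim * [0.5], dim * [0.0], dim * [1.0], params)
```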