
PhoenicsParseError: 'parameters' #5

Open
comnGuy opened this issue May 25, 2020 · 4 comments

comnGuy commented May 25, 2020

Hi,

I tried to execute one of the examples from your repository, e.g. the sequential optimization example, and ran into the following error.

Traceback (most recent call last):
  File "/opt/conda/lib/python3.7/site-packages/phoenics/utilities/decorators.py", line 36, in wrapper
    function(*args, **kwargs)
  File "/opt/conda/lib/python3.7/site-packages/phoenics/utilities/config_parser.py", line 368, in parse_config_file
    self._parse(self.config_dict)
  File "/opt/conda/lib/python3.7/site-packages/phoenics/utilities/config_parser.py", line 356, in _parse
    self._parse_parameters(self.config['parameters'])

PhoenicsParseError: 'parameters'

It looks like the error has something to do with config.json.

Steps

  1. Pulled the Docker image jupyter/datascience-notebook (docker pull jupyter/datascience-notebook)
  2. Cloned your repository
  3. Navigated to phoenics/examples/optimization_sequential/
  4. Ran python optimize_branin.py (minimal reproduction sketched below)
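
For reference, the error already appears at the config-parsing step, so a minimal reproduction from inside that directory (just a sketch, assuming the example's config.json is left as shipped) is:

from phoenics import Phoenics

# constructing the optimizer parses config.json; with the shipped config
# this raises PhoenicsParseError: 'parameters'
phoenics = Phoenics('config.json')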

Container Information

  • Ubuntu 18.04.4 LTS
  • Python 3.7.6
  • GCC 7.3.0

Thanks!

If you need additional information, let me know.

Best Regards
Bernhard

FreakyJ commented Jun 13, 2020

I've got it working using this config file:

{
	"general": {
		"num_batches": 1,
		"batch_size":  2,
		"backend": "tfprob",
		"parallel_evaluations": "True"
	},
	"parameters": [{"name": "param_0", "type": "continuous", "low":  -4.0, "high": 4.0, "size": 1},
				  {"name": "param_1", "type": "continuous", "low": -4.0, "high":  4.0, "size": 1}],
	"objectives": [{"name": "obj_0", "goal": "minimize", "hierarchy": 0, "tolerance": 0.2}, 
				   {"name": "obj_1", "goal": "maximize", "hierarchy": 1, "tolerance": 0.2}]
}
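
With this saved as config.json next to the example script, a quick sanity check that the file parses (just a minimal sketch) is:

from phoenics import Phoenics

# show the config that will be parsed
with open('config.json') as config_file:
    print(config_file.read())

# Phoenics parses the config when the optimizer is constructed;
# a missing required key would raise PhoenicsParseError here
phoenics = Phoenics('config.json')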

However, the rest of the example code seems to be out of date, too. Here is the version I ended up with:

import sys 
sys.path.append('../../phoenics')
import pickle
from phoenics import Phoenics 
import numpy as np

def fonseca(params):
    vector = np.array([params[key_name][0] for key_name in ['param_0', 'param_1']])
    obj_0  = 1 - np.exp( - np.sum((vector - 1. / np.sqrt(len(vector)))**2))
    obj_1  = 1 - np.exp( - np.sum((vector + 1. / np.sqrt(len(vector)))**2))
    params['obj_0'] = obj_0
    params['obj_1'] = obj_1
    return params


class OptimizationManager(object):

    def __init__(self, config_file, loss_function):

        # creates instance of Phoenics optimizer
        self.phoenics      = Phoenics(config_file)
        self.loss_function = loss_function 


    def optimize(self, max_iter = 100):

        observations = []

        for num_iter in range(max_iter):

            # query for new parameters based on prior observations
            params = self.phoenics.recommend(observations = observations)
            print('LEN_PARAMS', len(params))

            # use parameters for evaluation ...
            # ... experimentally or computationally
            for param in params:
                observation = self.loss_function(param)
                observations.append(observation)

            # log observations in a pickle file for future analysis
            pickle.dump(observations, open('observations.pkl', 'wb'))

            # print observations to file 
            logfile = open('logfile.dat', 'a')
            for param in params:
                new_line = ''
                for var_name in sorted(self.phoenics.config.parameters.name):
                    for param_value in param[var_name]:
                        new_line += '%.5e\t' % (param_value)
                for obj_name in sorted(self.phoenics.config.objectives.name):
                    new_line += '%.5e\t' % (param[obj_name])
                logfile.write(new_line + '\n')
            logfile.close()

#========================================================================

if __name__ == '__main__':

    logfile = open('config.json', 'r')
    print(logfile.read())
    logfile.close()

    manager = OptimizationManager('config.json', fonseca)
    manager.optimize()

FreakyJ commented Jun 13, 2020

And you can plot it like this:

import pickle
import pandas as pd
import matplotlib.pyplot as plt

observations = pickle.load(open('observations.pkl', 'rb'))
df_observations = pd.DataFrame(observations)

fig = plt.figure(figsize=[10, 10])
ax = fig.add_subplot(111, projection='3d')
ax.scatter(df_observations["param_0"], df_observations["param_1"], df_observations["obj_0"], zdir='z', s=20, c=None, depthshade=True)
ax.scatter(df_observations["param_0"], df_observations["param_1"], df_observations["obj_1"], zdir='z', s=20, c=None, depthshade=True)

FreakyJ commented Jun 13, 2020

And in case you would like to plot the "problem" itself:

import pandas as pd
import matplotlib.pyplot as plt

df_params = pd.DataFrame({'x': [], 'y': [], 'obj_0': [], 'obj_1': []})
for x in range(-400, 400, 10):
    for y in range(-400, 400, 10):
        res = fonseca({"param_0": [x/100], "param_1": [y/100]})
        df_params = df_params.append({'x': x/100, 'y': y/100, 'obj_0': res["obj_0"], 'obj_1': res["obj_1"]}, ignore_index=True)
        

fig = plt.figure(figsize=[10, 10])
ax = fig.add_subplot(111, projection='3d')
ax.scatter(df_params["x"], df_params["y"], df_params["obj_0"], zdir='z', s=20, c=None, depthshade=True)
ax.scatter(df_params["x"], df_params["y"], df_params["obj_1"], zdir='z', s=20, c=None, depthshade=True)

@rikoimade

I tried to run with the following config file (basically replacing the 'variables' key with a 'parameters' key):

{
	"general": {
		"num_batches": 1,
		"batch_size": 1,
		"backend": "tfprob",
		"parallel_evaluations": "True"
	},
	"parameters": [{"x": {"low": -5.0, "high": 10.0, "type": "float", "size": 1}},
	               {"y": {"low": 0.0, "high": 15.0, "type": "float", "size": 1}}],
	"objectives": [{"branin": {"hierarchy": 0, "type": "minimum", "tolerance": 0.0}}]
}

and I get another error, this time about 'size':

from phoenics import Phoenics
phoenics = Phoenics('config.json')

Traceback (most recent call last):
  File "C:\Users\xxx\anaconda3\envs\chimera\lib\site-packages\phoenics\utilities\decorators.py", line 36, in wrapper
    function(*args, **kwargs)
  File "C:\Users\xxx\anaconda3\envs\chimera\lib\site-packages\phoenics\utilities\config_parser.py", line 369, in parse_config_file
    self._parse(self.config_dict)
  File "C:\Users\xxx\anaconda3\envs\chimera\lib\site-packages\phoenics\utilities\config_parser.py", line 356, in _parse
    self._parse_parameters(self.config['parameters'])
  File "C:\Users\xxx\anaconda3\envs\chimera\lib\site-packages\phoenics\utilities\config_parser.py", line 139, in _parse_parameters
    size = setting['size']

PhoenicsParseError: 'size'
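
From the traceback, _parse_parameters seems to expect each entry in "parameters" to be a flat dictionary with explicit "name", "type", "low", "high" and "size" keys (as in the working config posted above), rather than nesting the settings under the parameter name. A sketch of the Branin config in that flat format, reusing the bounds from the attempt above (the "continuous" type and the "goal": "minimize" objective layout are copied from the working config, so treat this as an untested guess):

{
	"general": {
		"num_batches": 1,
		"batch_size": 1,
		"backend": "tfprob",
		"parallel_evaluations": "True"
	},
	"parameters": [{"name": "x", "type": "continuous", "low": -5.0, "high": 10.0, "size": 1},
	               {"name": "y", "type": "continuous", "low": 0.0, "high": 15.0, "size": 1}],
	"objectives": [{"name": "branin", "goal": "minimize", "hierarchy": 0, "tolerance": 0.0}]
}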
