Classification example broken in gpflow 2.7.0 #2

Open
Jack-Sandberg opened this issue Feb 7, 2023 · 0 comments

I ran into some issues when trying to run examples/classification.ipynb using gpflow 2.7.0. I am looking to adapt the graph Matérn kernel to my own work and would love to use the scalable methods you have tried here.

Specifically, I get an error when running the function `optimize_SVGP` using `gpflow.models.SVGP` with `GPInducingVariables`. If I instead use `GraphSVGP` with `inducing_variable=[0] * num_eigenpairs`, I get a dimension error. The full error messages are at the end of this post.

Both issues can be resolved by giving the inducing variable two axes: `inducing_variable = np.zeros((num_eigenpairs, 1))`. But for the SVGP model the accuracy then drops to 16%, so I suspect it's not a proper fix.
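For reference, here is that workaround as a minimal sketch (assuming `GraphSVGP`, `kernel`, `cls_number`, and `num_eigenpairs` from the notebook are already defined):

```python
import numpy as np
import gpflow

# Workaround described above: give the inducing variable two axes, [M, D]
# with D=1, instead of a flat list of length num_eigenpairs.
inducing_variable = np.zeros((num_eigenpairs, 1))

model = GraphSVGP(
    kernel=kernel,
    likelihood=gpflow.likelihoods.MultiClass(cls_number),
    inducing_variable=inducing_variable,
    num_latent_gps=cls_number,
    whiten=True,
    q_diag=True,
)
```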

By downgrading to gpflow 2.6.3 I could get the notebook to run without errors for the SVGP model, but the GraphSVGP model still raises the same error. I suspect the GraphSVGP path in classification.ipynb was not tested after 3dd6b9c.

Error with `SVGP` and `GPInducingVariables`:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[51], line 16
     13 adam_opt = tf.optimizers.Adam(0.001)
     14 natgrad_opt = gpflow.optimizers.NaturalGradient(gamma=0.001)
---> 16 optimize_SVGP(model, (adam_opt, natgrad_opt), 1000, True)
     17 gpflow.utilities.print_summary(model)

Cell In[48], line 35, in optimize_SVGP(model, optimizers, steps, q_diag)
     33     opt_step(natgrad_opt, loss, natgrad_params)
     34 if step % 200 == 0:
---> 35     likelihood = model.elbo((x_train, y_train))
     36     t.set_postfix({'ELBO': likelihood.numpy()})

File ~/miniconda3/envs/bandits-gpflow/lib/python3.10/site-packages/check_shapes/integration/tf.py:76, in install_tf_integration.<locals>.TfWrapperPostProcessor.on_wrap.<locals>.wrapped_method(self, *args, **kwargs)
     75 def wrapped_method(self: Any, *args: Any, **kwargs: Any) -> Any:
---> 76     return wrapped_function(self, *args, **kwargs)

File ~/miniconda3/envs/bandits-gpflow/lib/python3.10/site-packages/check_shapes/decorator.py:185, in check_shapes.<locals>._check_shapes.<locals>.wrapped_function(*args, **kwargs)
    182 _check_specs(pre_specs)
    184 with set_shape_checker(checker):
--> 185     result = func(*args, **kwargs)
    186 arg_map[RESULT_TOKEN] = result
    188 _check_specs(post_specs)
...
    100     dtype = dtypes.as_dtype(dtype).as_datatype_enum
    101 ctx.ensure_initialized()
--> 102 return ops.EagerTensor(value, ctx.device_name, dtype)

ValueError: Attempt to convert a value (<object object at 0x7f8bf9647dc0>) with an unsupported type (<class 'object'>) to a Tensor.

Error with `GraphSVGP` and `inducing_variable=[0] * num_eigenpairs`:

ShapeMismatchError                        Traceback (most recent call last)
Cell In[52], line 3
      1 # To use GraphSVGP change the strings with comments.
      2 # model = gpflow.models.SVGP(  
----> 3 model = GraphSVGP(
      4     kernel=kernel,
      5     likelihood=gpflow.likelihoods.MultiClass(cls_number),
      6     # inducing_variable=inducing_points, 
      7     inducing_variable=[0]*num_eigenpairs,
      8     # inducing_variable = np.zeros((num_eigenpairs, 1)),
      9     num_latent_gps=cls_number,
     10     whiten=True,
     11     q_diag=True,
     12 )
     13 adam_opt = tf.optimizers.Adam(0.001)
     14 natgrad_opt = gpflow.optimizers.NaturalGradient(gamma=0.001)

File ~/Documents/Master-Thesis/Graph-Gaussian-Processes/graph_matern/svgp.py:12, in GraphSVGP.__init__(self, *args, **kwargs)
     11 def __init__(self, *args, **kwargs):
---> 12     super().__init__(*args, **kwargs)

File ~/miniconda3/envs/bandits-gpflow/lib/python3.10/site-packages/check_shapes/integration/tf.py:76, in install_tf_integration.<locals>.TfWrapperPostProcessor.on_wrap.<locals>.wrapped_method(self, *args, **kwargs)
     75 def wrapped_method(self: Any, *args: Any, **kwargs: Any) -> Any:
---> 76     return wrapped_function(self, *args, **kwargs)
...
    Declared: /home/jack/miniconda3/envs/bandits-gpflow/lib/python3.10/site-packages/gpflow/inducing_variables/inducing_variables.py:64
    Argument: Z
      Expected: [M, D]
      Actual:   [500]
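If I read the shape check correctly, `Z` is expected to have two axes `[M, D]`, whereas `[0] * num_eigenpairs` is interpreted as a 1-D tensor of shape `[500]`. A quick sketch of why the reshaped array passes (using gpflow's `InducingPoints` only to illustrate the expected shape):

```python
import numpy as np
import gpflow

Z_flat = np.zeros(500)      # shape [500]: rejected by the [M, D] shape check
Z_2d = np.zeros((500, 1))   # shape [500, 1]: matches [M, D] with D=1
iv = gpflow.inducing_variables.InducingPoints(Z_2d)
```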