Commit

Merge remote-tracking branch 'origin/develop' into uri/optimizer_result
Uri Granta committed Aug 1, 2023
2 parents db1d9cd + 9a531f0 commit ef48ec0
Showing 10 changed files with 13 additions and 13 deletions.
2 changes: 1 addition & 1 deletion docs/notebooks/active_learning.pct.py
@@ -1,5 +1,5 @@
# %% [markdown]
-# # Active Learning
+# # Active learning

# %% [markdown]
# Sometimes, we may just want to learn a black-box function, rather than optimizing it. This goal is known as active learning and corresponds to choosing query points that reduce our model uncertainty. This notebook demonstrates how to perform Bayesian active learning using Trieste.
@@ -1,5 +1,5 @@
# %% [markdown]
-# # Active Learning for binary classification
+# # Active learning for binary classification

# %%
import gpflow
2 changes: 1 addition & 1 deletion docs/notebooks/ask_tell_optimization.pct.py
@@ -1,5 +1,5 @@
# %% [markdown]
-# # Ask-Tell Optimization Interface
+# # Ask-Tell optimization interface

# %% [markdown]
# In this notebook we will illustrate the use of an Ask-Tell interface in Trieste. It is useful for cases where you want to have greater control of the optimization loop, or when letting Trieste manage this loop is impossible.
2 changes: 1 addition & 1 deletion docs/notebooks/batch_optimization.pct.py
@@ -1,5 +1,5 @@
# %% [markdown]
-# # Batch Bayesian Optimization
+# # Batch Bayesian optimization

# %% [markdown]
# Sometimes it is practically convenient to query several points at a time. This notebook demonstrates four ways to perform batch Bayesian optimization with Trieste.
2 changes: 1 addition & 1 deletion docs/notebooks/expected_improvement.pct.py
@@ -1,5 +1,5 @@
# %% [markdown]
-# # Introduction to Bayesian Optimization
+# # Introduction to Bayesian optimization

# %%
import numpy as np
2 changes: 1 addition & 1 deletion docs/notebooks/multifidelity_modelling.pct.py
@@ -16,7 +16,7 @@
import gpflow.kernels

# %% [markdown]
-# # Multifidelity Modelling
+# # Multifidelity modelling
#
# This tutorial demonstrates the usage of the `MultifidelityAutoregressive` model for fitting multifidelity data. This is an implementation of the AR1 model initially described in <cite data-cite="Kennedy2000"/>.

2 changes: 1 addition & 1 deletion docs/notebooks/rembo.pct.py
@@ -1,5 +1,5 @@
# %% [markdown]
-# # High-dimensional Bayesian Optimization
+# # High-dimensional Bayesian optimization
# This notebook demonstrates a simple method for optimizing a high-dimensional (100-D) problem, where standard BO methods have trouble.

# %%
@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
# %% [markdown]
-# # Scalable Thompson Sampling
+# # Scalable Thompson sampling

# %% [markdown]
# In our other [Thompson sampling notebook](thompson_sampling.pct.py) we demonstrate how to perform batch optimization using a traditional implementation of Thompson sampling that samples exactly from an underlying Gaussian Process surrogate model. Unfortunately, this approach incurs a large computational overhead that scales polynomially with the optimization budget and so cannot be applied to settings with larger optimization budgets, e.g. those where large batches (>>10) of points can be collected.
2 changes: 1 addition & 1 deletion docs/notebooks/thompson_sampling.pct.py
@@ -1,5 +1,5 @@
# %% [markdown]
-# # Thompson Sampling
+# # Thompson sampling

# %%
import numpy as np
8 changes: 4 additions & 4 deletions docs/tutorials.rst
@@ -15,10 +15,10 @@
Tutorials
=========

-Example optimization problems
------------------------------
+Optimization problems
+---------------------

-The following tutorials explore various types of optimization problems using Trieste.
+The following tutorials illustrate solving different types of optimization problems using Trieste.

.. toctree::
:maxdepth: 1
@@ -44,7 +44,7 @@ The following tutorials explore various types of optimization problems using Trieste.
Frequently asked questions
--------------------------

-The following tutorials (or sections thereof) explain how to use and extend specific Trieste functionality.
+The following tutorials explain how to use and extend specific Trieste functionality.

* :doc:`How do I set up a basic Bayesian optimization routine?<notebooks/expected_improvement>`
* :doc:`How do I set up a batch Bayesian optimization routine?<notebooks/batch_optimization>`
