
Commit 0f754e6 (1 parent: 9604137)

docs: add Spearmint example

File tree: 3 files changed (+42, -1 lines)

`.gitignore` (+1, -1)

```diff
@@ -30,4 +30,4 @@ node_modules
 bower_components
 
 # Environment configuration variables
-.env
+/.env
```
New file: +31 lines

# Bayesian Optimisation (FGLab)

## Introduction

Bayesian optimisation is a global optimisation technique that treats the function to be optimised as a random function. It places a prior over the function and evaluates the function to collect data points. Each evaluation is used to update the posterior distribution over the function, which in turn is used to select the next point to evaluate. This makes Bayesian optimisation data-efficient, and hence a suitable technique for optimising the hyperparameters of another system. This example uses the Spearmint library [1-5] to optimise the Branin-Hoo function.
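The Branin-Hoo function is a standard benchmark for global optimisers. For reference, a minimal sketch of its noiseless form (the Spearmint example evaluates a noisy variant; the usual domain is x ∈ [-5, 10], y ∈ [0, 15]):

```python
import math

def branin(x, y):
    """Branin-Hoo benchmark function (noiseless form).

    Its global minimum of ~0.397887 is attained at three points:
    (-pi, 12.275), (pi, 2.275) and (9.42478, 2.475).
    """
    a = 1.0
    b = 5.1 / (4.0 * math.pi ** 2)
    c = 5.0 / math.pi
    r = 6.0
    s = 10.0
    t = 1.0 / (8.0 * math.pi)
    return a * (y - b * x ** 2 + c * x - r) ** 2 + s * (1 - t) * math.cos(x) + s
```

Because the minimum value and its locations are known, the function makes it easy to check whether an optimiser such as Spearmint is converging to the right place.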
## Requirements

- [MongoDB](https://www.mongodb.org/)
- [Spearmint](https://github.com/HIPS/Spearmint)
- [Flask](http://flask.pocoo.org/)
- [Requests](http://python-requests.org/)
## Instructions

This example has been adapted from the [noisy Branin-Hoo example](https://github.com/HIPS/Spearmint/tree/master/examples/noisy). `branin_noisy.py` has been set up to take command line arguments and save its results in a JSON file, whilst `fglab.py` acts as an intermediary between Spearmint and the function to optimise, using FGLab's API.

1. Create a new project from [bayesian-optimisation.json](https://github.com/Kaixhin/FGLab/blob/master/examples/Bayesian-Optimisation/bayesian-optimisation.json).
1. Set up [FGMachine](https://github.com/Kaixhin/FGMachine/blob/master/examples/Bayesian-Optimisation) and run Spearmint.
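To illustrate the "command line arguments in, JSON results out" pattern that `branin_noisy.py` follows, here is a minimal sketch. The flag names, output filename, and result key below are illustrative assumptions, not the actual interface of the file in this commit:

```python
import argparse
import json
import math
import random

def branin(x, y):
    # Branin-Hoo function; Spearmint tries to minimise this
    b = 5.1 / (4.0 * math.pi ** 2)
    c = 5.0 / math.pi
    t = 1.0 / (8.0 * math.pi)
    return (y - b * x ** 2 + c * x - 6.0) ** 2 + 10.0 * (1 - t) * math.cos(x) + 10.0

def main():
    # Hyperparameter defaults match the parameter spec in this commit
    parser = argparse.ArgumentParser()
    parser.add_argument("--x", type=float, default=2)
    parser.add_argument("--y", type=float, default=7)
    parser.add_argument("--out", default="results.json")
    args, _ = parser.parse_known_args()

    # Add Gaussian noise, as in Spearmint's noisy Branin-Hoo example
    value = branin(args.x, args.y) + random.gauss(0, 1)

    # Persist the objective value as JSON for the intermediary to read
    with open(args.out, "w") as f:
        json.dump({"value": value}, f)

if __name__ == "__main__":
    main()
```

A script like this would be invoked as e.g. `python branin_noisy.py --x 2 --y 7`, after which the intermediary reads the JSON file and reports the value back.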
20+
21+
## Citations
22+
23+
[1] Snoek, J., Larochelle, H., & Adams, R. P. (2012). Practical Bayesian optimization of machine learning algorithms. In *Advances in neural information processing systems* (pp. 2951-2959).
24+
25+
[2] Swersky, K., Snoek, J., & Adams, R. P. (2013). Multi-task bayesian optimization. In *Advances in Neural Information Processing Systems* (pp. 2004-2012).
26+
27+
[3] Snoek, J., Swersky, K., Zemel, R. S., & Adams, R. P. (2014). Input warping for Bayesian optimization of non-stationary functions. *arXiv preprint* arXiv:1402.0929.
28+
29+
[4] Snoek, J. (2013). *Bayesian Optimization and Semiparametric Models with Applications to Assistive Technology* (Doctoral dissertation, University of Toronto).
30+
31+
[5] Gelbart, M. A., Snoek, J., & Adams, R. P. (2014). Bayesian optimization with unknown constraints. *arXiv preprint* arXiv:1403.5607.
New file: +10 lines

```json
{
  "x": {
    "type": "float",
    "default": 2
  },
  "y": {
    "type": "float",
    "default": 7
  }
}
```
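The JSON above is a parameter specification: each hyperparameter gets a type and a default value. A small sketch of consuming such a spec (the spec is inlined here rather than read from the file, whose name is not shown in this view):

```python
import json

# Parameter spec as added in this commit (inlined for the sketch)
spec = json.loads("""
{
  "x": {"type": "float", "default": 2},
  "y": {"type": "float", "default": 7}
}
""")

# Build a default hyperparameter assignment from the spec
defaults = {name: float(field["default"])
            for name, field in spec.items()
            if field["type"] == "float"}
print(defaults)  # prints {'x': 2.0, 'y': 7.0}
```

These defaults are what an experiment would run with before the optimiser starts proposing new values for `x` and `y`.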
