Releases: pytorch/botorch
High Order GP model, multi-step look-ahead acquisition function
Compatibility
New Features
- `HigherOrderGP` - High-Order Gaussian Process (HOGP) model for high-dimensional output regression (#631, #646, #648, #680).
- `qMultiStepLookahead` acquisition function for general look-ahead optimization approaches (#611, #659).
- `ScalarizedPosteriorMean` and `project_to_sample_points` for more advanced MFKG functionality (#645).
- Large-scale Thompson sampling tutorial (#654, #713).
- Tutorial for optimizing mixed continuous/discrete domains (application to multi-fidelity KG with discrete fidelities) (#716).
- `GPDraw` utility for sampling from (exact) GP priors (#655).
- Add `X` as optional arg to call signature of `MCAcquisitionObjective` (#487).
- `OSY` synthetic test problem (#679).
Bug Fixes
- Fix matrix multiplication in `scalarize_posterior` (#638).
- Set `X_pending` in `get_acquisition_function` in `qEHVI` (#662).
- Make contextual kernel device-aware (#666).
- Do not use an `MCSampler` in `MaxPosteriorSampling` (#701).
- Add ability to subset outcome transforms (#711).
Performance Improvements
- Batchify box decomposition for 2d case (#642).
Other Changes
- Use scipy distribution in MES quantile bisect (#633).
- Use new closure definition for GPyTorch priors (#634).
- Allow enabling of approximate root decomposition in `posterior` calls (#652).
- Support for upcoming 21201-dimensional PyTorch `SobolEngine` (#672, #674).
- Refactored various MOO utilities to allow future additions (#656, #657, #658, #661).
- Support `input_transform` in `PairwiseGP` (#632).
- Output shape checks for `t_batch_mode_transform` (#577).
- Check for NaN in `gen_candidates_scipy` (#688).
- Introduce `base_sample_shape` property to `Posterior` objects (#718).
Contextual Bayesian Optimization, Input Warping, TuRBO, sampling from polytopes.
Compatibility
New Features
- Models (LCE-A, LCE-M and SAC) for Contextual Bayesian Optimization (#581).
  - Implements core models from: High-Dimensional Contextual Policy Search with Unknown Context Rewards using Bayesian Optimization. Q. Feng, B. Letham, H. Mao, E. Bakshy. NeurIPS 2020.
  - See Ax for usage of these models.
- Hit and run sampler for uniform sampling from a polytope (#592).
- Input warping:
- TuRBO-1 tutorial (#598).
  - Implements the method from: Scalable Global Optimization via Local Bayesian Optimization. D. Eriksson, M. Pearce, J. Gardner, R. D. Turner, M. Poloczek. NeurIPS 2019.
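The hit-and-run sampler above draws approximately uniform samples from a polytope {x : Ax <= b}. The following stdlib-only sketch shows the general technique (random chord direction, then a uniform step along the feasible segment); the function name and details are illustrative and not BoTorch's implementation:

```python
import random

def hit_and_run(A, b, x0, n_samples, burnin=100, seed=0):
    """Approximately uniform samples from {x : A @ x <= b} via hit-and-run.
    A is a list of constraint rows, b the right-hand sides, x0 a strictly
    feasible starting point. Pure-Python sketch only."""
    rng = random.Random(seed)
    d = len(x0)
    x = list(x0)
    samples = []
    for it in range(burnin + n_samples):
        # Random direction on the unit sphere via normalized Gaussians.
        u = [rng.gauss(0.0, 1.0) for _ in range(d)]
        norm = sum(ui * ui for ui in u) ** 0.5
        u = [ui / norm for ui in u]
        # Intersect the line {x + t*u} with each half-space a.x <= b_i:
        # t * (a.u) <= b_i - a.x bounds t above or below depending on sign.
        t_lo, t_hi = float("-inf"), float("inf")
        for a_row, b_i in zip(A, b):
            au = sum(ai * ui for ai, ui in zip(a_row, u))
            slack = b_i - sum(ai * xi for ai, xi in zip(a_row, x))
            if abs(au) < 1e-12:
                continue
            t = slack / au
            if au > 0:
                t_hi = min(t_hi, t)
            else:
                t_lo = max(t_lo, t)
        # Jump to a uniform point on the feasible chord.
        step = rng.uniform(t_lo, t_hi)
        x = [xi + step * ui for xi, ui in zip(x, u)]
        if it >= burnin:
            samples.append(list(x))
    return samples
```

For a bounded polytope every chord is finite, so `t_lo` and `t_hi` are always finite; the burn-in lets the chain forget the starting point.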
Bug fixes
Other changes
- Add `train_inputs` option to `qMaxValueEntropy` (#593).
- Enable gpytorch settings to override BoTorch defaults for `fast_pred_var` and `debug` (#595).
- Rename `set_train_data_transform` -> `preprocess_transform` (#575).
- Modify `_expand_bounds()` shape checks to work with >2-dim bounds (#604).
- Add `batch_shape` property to models (#588).
- Modify `qMultiFidelityKnowledgeGradient.evaluate()` to work with `project`, `expand` and `cost_aware_utility` (#594).
- Add list of papers using BoTorch to website docs (#617).
Maintenance Release
New Features
- Add `PenalizedAcquisitionFunction` wrapper (#585)
- Input transforms
- Differentiable approximate rounding for integers (#561)
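Differentiable approximate rounding typically replaces the hard `round` step with a temperature-controlled sigmoid on the fractional part. A generic sketch of that idea (the function name and the `tau` parameter are illustrative, not the API from #561):

```python
import math

def approx_round(x, tau=0.1):
    """Smooth surrogate for round(x): floor(x) plus a sigmoid step on the
    fractional part. As tau -> 0 this approaches hard rounding while
    remaining differentiable in x between integers."""
    frac = x - math.floor(x)
    return math.floor(x) + 1.0 / (1.0 + math.exp(-(frac - 0.5) / tau))
```

With a small temperature the surrogate is numerically indistinguishable from hard rounding away from half-integers, yet gradient-based optimizers can still push a continuous relaxation toward integer values.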
Bug fixes
- Fix sign error in UCB when `maximize=False` (a4bfacbfb2109d3b89107d171d2101e1995822bb)
- Fix batch_range sample shape logic (#574)
Other changes
- Better support for two stage sampling in preference learning (0cd13d0)
- Remove noise term in `PairwiseGP` and add `ScaleKernel` by default (#571)
- Rename `prior` to `task_covar_prior` in `MultiTaskGP` and `FixedNoiseMultiTaskGP` (16573fe)
- Support only transforming inputs on training or evaluation (#551)
- Add `equals` method for `InputTransform` (#552)
Maintenance Release
New Features
- Constrained Multi-Objective tutorial (#493)
- Multi-fidelity Knowledge Gradient tutorial (#509)
- Support for batch qMC sampling (#510)
- New `evaluate` method for `qKnowledgeGradient` (#515)
Compatibility
- Require PyTorch >=1.6 (#535)
- Require GPyTorch >=1.2 (#535)
- Remove deprecated `botorch.gen` module (#532)
Bug fixes
- Fix bad backward-indexing of task_feature in `MultiTaskGP` (#485)
- Fix bounds in constrained Branin-Currin test function (#491)
- Fix max_hv for C2DTLZ2 and make Hypervolume always return a float (#494)
- Fix bug in `draw_sobol_samples` that did not use the proper effective dimension (#505)
- Fix constraints for `q>1` in `qExpectedHypervolumeImprovement` (c80c4fd)
- Only use feasible observations in partitioning for `qExpectedHypervolumeImprovement` in `get_acquisition_function` (#523)
- Improved GPU compatibility for `PairwiseGP` (#537)
Performance Improvements
- Reduce memory footprint in `qExpectedHypervolumeImprovement` (#522)
- Add `(q)ExpectedHypervolumeImprovement` to nonnegative functions [for better initialization] (#496)
Other changes
- Support batched `best_f` in `qExpectedImprovement` (#487)
- Allow to return full tree of solutions in `OneShotAcquisitionFunction` (#488)
- Added `construct_inputs` class method to models to programmatically construct the inputs to the constructor from a standardized `TrainingData` representation (#477, #482, 3621198)
- Acquisition function constructors now accept catch-all `**kwargs` options (#478, e5b6935)
- Use `psd_safe_cholesky` in `qMaxValueEntropy` for better numerical stability (#518)
- Added `WeightedMCMultiOutputObjective` (81d91fd)
- Add ability to specify `outcomes` to all multi-output objectives (#524)
- Return optimization output in `info_dict` for `fit_gpytorch_scipy` (#534)
- Use `setuptools_scm` for versioning (#539)
Multi-Objective Bayesian Optimization
New Features
- Multi-Objective Acquisition Functions (#466)
- q-Expected Hypervolume Improvement
- q-ParEGO
- Analytic Expected Hypervolume Improvement with auto-differentiation
- Multi-Objective Utilities (#466)
- Pareto Computation
- Hypervolume Calculation
- Box Decomposition algorithm
- Multi-Objective Test Functions (#466)
- Suite of synthetic test functions for multi-objective, constrained optimization
- Multi-Objective Tutorial (#468)
- Abstract ConstrainedBaseTestProblem (#454)
- Add `optimize_acqf_list` method for sequentially, greedily optimizing 1 candidate from each provided acquisition function (d10aec9)
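For intuition, the Pareto and hypervolume utilities in this release are easy to sketch in the two-objective maximization case. The following stdlib-only functions are illustrative toys, not BoTorch's batched, general-dimension implementations:

```python
def pareto_front_2d(points):
    """Pareto-optimal subset for 2-objective maximization: sweep in
    decreasing x and keep points whose y strictly improves the running best."""
    front, best_y = [], float("-inf")
    for x, y in sorted(points, key=lambda p: (-p[0], -p[1])):
        if y > best_y:
            front.append((x, y))
            best_y = y
    return front

def hypervolume_2d(front, ref):
    """Hypervolume dominated by a 2-objective maximization front w.r.t. a
    reference point, accumulated as disjoint rectangle strips."""
    hv, prev_x = 0.0, ref[0]
    # On a maximization front, decreasing y means increasing x, so each
    # point contributes the strip (x - prev_x) * (y - ref_y).
    for x, y in sorted(front, key=lambda p: -p[1]):
        hv += (x - prev_x) * (y - ref[1])
        prev_x = x
    return hv
```

For example, the front {(1, 3), (2, 2), (3, 1)} with reference point (0, 0) dominates a staircase region of area 6.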
Bug fixes
- Fixed re-arranging mean in MultiTask multi-output models (#450).
Other changes
Bugfix Release
Bugfix Release
Bug fixes
- There was a mysterious issue with the 0.2.3 wheel on PyPI, where part of the `botorch/optim/utils.py` file was not included, which resulted in an `ImportError` for many central components of the code. Interestingly, the source dist (built with the same command) did not have this issue.
- Preserve order in `ChainedOutcomeTransform` (#440).
New Features
- Utilities for estimating the feasible volume under outcome constraints (#437).
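Feasible-volume estimation can be illustrated by simple Monte Carlo: sample uniformly in the bounded design space and count how often all constraints hold. A generic stdlib sketch (the function name and signature are hypothetical, not the API from #437):

```python
import random

def estimate_feasible_volume(constraints, bounds, n=10_000, seed=0):
    """Monte Carlo estimate of the fraction of the bounded design space
    where all constraints c(x) <= 0 hold. `constraints` is a list of
    callables; `bounds` a list of (lo, hi) pairs per dimension."""
    rng = random.Random(seed)
    feasible = 0
    for _ in range(n):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        if all(c(x) <= 0 for c in constraints):
            feasible += 1
    return feasible / n
```

The estimate converges at the usual Monte Carlo rate of O(1/sqrt(n)), independent of dimension; BoTorch's utilities additionally account for model uncertainty in the outcomes.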
Pairwise GP for Preference Learning, Sampling Strategies
Introduces a new `PairwiseGP` model for preference learning with pairwise comparison feedback, as well as a `SamplingStrategy` abstraction for generating candidates from a discrete candidate set.
Compatibility
New Features
- Add `PairwiseGP` for preference learning with pair-wise comparison data (#388).
- Add `SamplingStrategy` abstraction for sampling-based generation strategies, including `MaxPosteriorSampling` (i.e. Thompson Sampling) and `BoltzmannSampling` (#218, #407).
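`MaxPosteriorSampling` implements Thompson-sampling-style selection over a discrete candidate set: draw a joint sample from the model posterior at all candidates and pick the argmax. A minimal sketch of that selection rule (the function name and the `draw_posterior_sample` callable are illustrative assumptions; BoTorch's class wraps a fitted model):

```python
def max_posterior_sampling(candidates, draw_posterior_sample, num_draws):
    """Select one candidate per posterior draw by maximizing the sampled
    objective values. `draw_posterior_sample(candidates)` should return a
    list of sampled function values, one per candidate."""
    selected = []
    for _ in range(num_draws):
        values = draw_posterior_sample(candidates)
        best = max(range(len(candidates)), key=lambda i: values[i])
        selected.append(candidates[best])
    return selected
```

With a stochastic posterior, different draws can favor different candidates, which is what gives Thompson sampling its exploration behavior; a deterministic draw degenerates to repeatedly picking the posterior maximizer.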
Deprecations
- The existing `botorch.gen` module is moved to `botorch.generation.gen` and imports from `botorch.gen` will raise a warning (an error in the next release) (#218).
Bug fixes
- Fix & update a number of tutorials (#394, #398, #393, #399, #403).
- Fix CUDA tests (#404).
- Fix sobol maxdim limitation in `prune_baseline` (#419).
Other changes
- Better stopping criteria for stochastic optimization (#392).
- Improve numerical stability of `LinearTruncatedFidelityKernel` (#409).
- Allow batched `best_f` in `qExpectedImprovement` and `qProbabilityOfImprovement` (#411).
- Introduce new logger framework (#412).
- Faster indexing in some situations (#414).
- More generic `BaseTestProblem` (9e604fe).
Require Python 3.7 and new features
Requires Python 3.7 and adds new features for active learning and multi-fidelity optimization, along with a number of bug fixes.
Compatibility
New Features
- Add `qNegIntegratedPosteriorVariance` for Bayesian active learning (#377).
- Add `FixedNoiseMultiFidelityGP`, analogous to `SingleTaskMultiFidelityGP` (#386).
- Support `scalarize_posterior` for m>1 and q>1 posteriors (#374).
- Support `subset_output` method on multi-fidelity models (#372).
- Add utilities for sampling from simplex and hypersphere (#369).
Bug fixes
- Fix `TestLoader` local test discovery (#376).
- Fix batch-list conversion of `SingleTaskMultiFidelityGP` (#370).
- Validate tensor args before checking input scaling for more informative error messages (#368).
- Fix flaky `qNoisyExpectedImprovement` test (#362).
- Fix test function in closed-loop tutorial (#360).
- Fix num_output attribute in BoTorch/Ax tutorial (#355).
Other changes
Compatibility Release
Minor bug fix release.
New Features
- Add a static method for getting batch shapes for batched MO models (#346).
Bug fixes
- Revamp qKG constructor to avoid issue with missing objective (#351).
- Make sure MVES can support sampled costs like KG (#352).
Other changes
- Allow custom module-to-array handling in fit_gpytorch_scipy (#341).