VBLL introduces a deterministic variational formulation for training Bayesian last layers in neural networks, providing a computationally efficient way to improve uncertainty estimation in deep learning models. VBLL layers can be trained and evaluated with only quadratic complexity in the last-layer width, making them nearly free to add to standard architectures. Across both regression and classification, VBLL improves predictive accuracy, calibration, and out-of-distribution detection over baselines.
The easiest way to install VBLL is with pip:

```bash
pip install vbll
```
You can also install VBLL by cloning the GitHub repo:

```bash
# Clone the repository
git clone https://github.com/VectorInstitute/vbll.git

# Navigate into the repository directory
cd vbll

# Install the package (editable mode) along with its dependencies
pip install -e .
```
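As a rough sketch of how a VBLL head drops into a standard PyTorch model, the snippet below replaces an MLP's final linear layer with `vbll.Regression` and trains on the variational loss returned by the layer. The constructor arguments and the `train_loss_fn` attribute follow the usage shown in the project tutorials, while the layer sizes, dataset size, and regularization weight here are illustrative assumptions; consult the documentation for the exact API.

```python
# Minimal sketch: a regression MLP with a variational Bayesian last layer.
# The vbll.Regression signature and the output's train_loss_fn follow the
# repository tutorials; sizes and REG_WEIGHT below are illustrative assumptions.
import torch
import torch.nn as nn
import vbll

IN_DIM, HIDDEN_DIM, OUT_DIM = 8, 64, 1
DATASET_SIZE = 1000                 # assumed training set size
REG_WEIGHT = 1.0 / DATASET_SIZE     # KL regularization weight (assumption)

class VBLLMLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(IN_DIM, HIDDEN_DIM), nn.ELU(),
            nn.Linear(HIDDEN_DIM, HIDDEN_DIM), nn.ELU(),
        )
        # The VBLL head replaces the usual nn.Linear output layer.
        self.head = vbll.Regression(HIDDEN_DIM, OUT_DIM, REG_WEIGHT)

    def forward(self, x):
        return self.head(self.backbone(x))

model = VBLLMLP()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

x = torch.randn(32, IN_DIM)   # dummy batch for illustration
y = torch.randn(32, OUT_DIM)

optimizer.zero_grad()
out = model(x)                 # VBLL output object
loss = out.train_loss_fn(y)    # variational training loss for this batch
loss.backward()
optimizer.step()
```

The same forward pass also exposes the layer's predictive distribution, so uncertainty estimates come at essentially the cost of a standard forward pass; see the tutorials for evaluation and classification examples.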
Documentation is available here.
You can also check out our tutorial colabs.
Contributions to the VBLL project are welcome. If you're interested in contributing, please read the contribution guidelines in the repository.
If you find VBLL useful in your research, please consider citing our paper:
```bibtex
@inproceedings{harrison2024vbll,
  title     = {Variational Bayesian Last Layers},
  author    = {Harrison, James and Willes, John and Snoek, Jasper},
  booktitle = {International Conference on Learning Representations (ICLR)},
  year      = {2024}
}
```