Releases: ENSTA-U2IS-AI/torch-uncertainty
v0.2.0 Lightning 2.0, RegressionRoutine & SegmentationRoutine
🚀 TorchUncertainty 0.2.0 Released! 🚀
We're thrilled to unveil TorchUncertainty 0.2.0!
This update brings a complete overhaul centered around our uncertainty-aware routines. Highlights include:
- Lightning 2.0: full support, along with a complete overhaul of the command-line interface.
- RegressionRoutine: fully functional, now supporting probabilistic regression with PyTorch distributions (see the sketch below).
- SegmentationRoutine: introduces semantic segmentation support for datasets like Cityscapes and MUAD.
Stay tuned for even more (Monocular depth estimation!) in TorchUncertainty 0.2.1!
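If you are curious what probabilistic regression with PyTorch distributions can look like, here is a minimal, library-agnostic sketch: a toy model returns a `torch.distributions.Normal` and is trained with the negative log-likelihood. The `GaussianRegressor` class and its layers are illustrative assumptions, not the `RegressionRoutine` API.

```python
import torch
from torch import nn
from torch.nn import functional as F


class GaussianRegressor(nn.Module):
    """Toy regressor that returns a Normal distribution per input (illustrative only)."""

    def __init__(self, in_features: int) -> None:
        super().__init__()
        self.mean = nn.Linear(in_features, 1)
        self.raw_scale = nn.Linear(in_features, 1)

    def forward(self, x: torch.Tensor) -> torch.distributions.Normal:
        # Predict a mean and a strictly positive scale, then wrap them in a distribution.
        return torch.distributions.Normal(
            loc=self.mean(x),
            scale=F.softplus(self.raw_scale(x)) + 1e-6,
        )


model = GaussianRegressor(in_features=8)
x, y = torch.randn(32, 8), torch.randn(32, 1)
dist = model(x)
loss = -dist.log_prob(y).mean()  # negative log-likelihood of the targets
loss.backward()
```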
Breaking Changes
As we are still in pre-release, this version breaks a large part of the routine and CLI components of TorchUncertainty 0.1.6.
CLI
The behavior of the CLI has completely changed and is now based on the configuration files from Lightning 2.0. We provide a new page that explains how to leverage Baselines using the CLI for easy benchmarking.
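For readers unfamiliar with the Lightning 2.0 approach, here is a minimal, generic `LightningCLI` sketch showing how a YAML configuration file drives training; the `ToyModule` below is a placeholder and does not reflect TorchUncertainty's actual entry point or baselines.

```python
# cli_main.py -- generic Lightning 2.0 LightningCLI sketch
# (placeholder module, not TorchUncertainty's entry point).
import torch
from torch import nn
from lightning.pytorch import LightningModule
from lightning.pytorch.cli import LightningCLI


class ToyModule(LightningModule):
    def __init__(self, hidden_dim: int = 16, lr: float = 1e-3) -> None:
        super().__init__()
        self.save_hyperparameters()
        self.net = nn.Sequential(nn.Linear(8, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, 1))

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.mse_loss(self.net(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.hparams.lr)


if __name__ == "__main__":
    # Hyperparameters, trainer options, and the datamodule all come from YAML
    # configuration files or command-line overrides, e.g.:
    #   python cli_main.py fit --config config.yaml --model.hidden_dim 32
    LightningCLI(ToyModule)
```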
Routines
Notably, there is no longer a distinction between ensemble and single routines, to reduce code entropy: single routines are simply ensemble routines with 1 estimator. Furthermore, the routines' loss parameter now takes an instantiated loss instead of a type, the optimization_procedure argument has been renamed optim_recipe and is now a dictionary rather than a callable, and the ood_criterion and calibration sets are now specified as strings.
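To make the shift concrete, here is a hedged sketch of the new argument style. Only the changes named above (an instantiated loss, an optim_recipe dictionary, string criteria) come from the release notes; the dictionary keys, the "msp" value, and the helper itself are illustrative assumptions rather than the documented signature.

```python
import torch
from torch import nn, optim


def routine_kwargs_sketch(model: nn.Module) -> dict:
    """Illustrative only: mirrors the 0.2.0-style arguments described above.

    The dictionary keys and criterion name are assumptions, not the documented API.
    """
    return {
        "loss": nn.CrossEntropyLoss(),  # an instantiated loss, no longer the nn.CrossEntropyLoss type
        "optim_recipe": {               # a dictionary rather than a callable
            "optimizer": optim.SGD(model.parameters(), lr=0.05, momentum=0.9),
        },
        "ood_criterion": "msp",         # OOD criteria are now referenced by string
    }


model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
print({key: type(value).__name__ for key, value in routine_kwargs_sketch(model).items()})
```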
Metrics
The NegativeLogLikelihood metric is renamed CategoricalNLL.
Baselines
All baselines have been renamed to explicitly contain "Baselines" in their name.
Tutorials
We have rewritten and updated the tutorials; they should now be clearer. Send us your feedback!
What's Changed
- ➖ Avoid using ArgvContext in tutorials by @o-laurent in #82
- 🚀 Upgrade to Lightning 2.0 by @alafage in #79
- 🚀 Update to Lightning 2.0, Add Segmentation, & Rework Regression by @o-laurent in #85
Full Changelog: v0.1.6...v0.2.0
v0.1.6 Add Grouping Loss, MC-BN, OpenImage-O & MUAD
What's Changed
- ⬆️ Bump tj-actions/changed-files from 34 to 41 in /.github/workflows by @dependabot in #75
- ✨ Add ResNet-20, corruptions and improve docs by @o-laurent in #76
- ✨ Add the grouping loss to single model training by @o-laurent in #77
- ✨ Add Monte-Carlo Batch Normalization by @o-laurent in #78
- ✨ Add grouping loss, Monte-Carlo Batch Normalization, OpenImage-O, MUAD & Improve code quality by @o-laurent in #80
Full Changelog: v0.1.5...v0.1.6
v0.1.5 Add Evidential Classification & Regression, add Mixup and variants, fix MC-Dropout, add Sparsification metric and plots for calibration, switch to flit & ruff
What's Changed
- ✨ Improve documentation, Enable BNN on GPUs, improve BNN code & code overall quality by @o-laurent in #44
- ✨ Continue improving Bayesian layers, & Fix MI in rare cases by @o-laurent in #45
- ✨ Add Deep Evidential Regression, generalize Packed layers, and refactor the datasets by @badrmarani in #46 and @o-laurent in #48
- ✨ Add visualization tools by @o-laurent in #50
- ⬆️ Bump urllib3 from 2.0.6 to 2.0.7 by @dependabot in #51
- ✨ Add MC-Dropout & Visualization tools by @badrmarani in #49 and @o-laurent in #53
- ⬆️ Bump werkzeug from 3.0.0 to 3.0.1 by @dependabot in #62
- 🔧 Switch to ruff instead of black + isort + flake8 by @o-laurent in #64
- ✨ Add Evidential Classification, switch to ruff, update packages by @xuanlongORZ in #56 and @o-laurent in #65
- ✨ Add beta-NLL, a modified GaussianNLL, by @xuanlongORZ in #66
- ✨ Mixup variants + Cross validation + Temperature scaling in routines by @qbouniot in #63
- 👕 Extend ruff rules by @o-laurent in #67
- 🔥 Remove poetry & add flit by @o-laurent in #69
- 👕 Improve the code and the documentation by @o-laurent in #71
- ✨ Sparsification metric and plot methods for Calibration Error by @alafage in #68
- 🐛 Fix Monte-Carlo Dropout by @o-laurent in #73
- ✨ Remove Poetry, add Mixup, rework metrics, & improve code quality by @o-laurent in #70
- 🔧 Update sphinx dependency by @o-laurent in #74
New Contributors
- @badrmarani made their first contribution in #46
- @dependabot made their first contribution in #51
- @xuanlongORZ made their first contribution in #56
- @qbouniot made their first contribution in #63
Huge thanks to them!
Full Changelog: v0.1.4...v0.1.5
v0.1.4 MIMO, Scalers, & more datasets
In this release, we added a new preprocessing function for MIMO-like networks. We added several post-hoc scaling methods to improve calibration, and added MNIST-C and TinyImageNet-C to the corrupted datasets. Furthermore, we refined the GitHub workflows to improve the development process. Finally, we also added a tutorial on temperature scaling.
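As a reminder of how post-hoc temperature scaling works, here is a short, library-agnostic sketch that fits a single temperature on held-out logits by minimizing the negative log-likelihood; fit_temperature is an illustrative helper, not the scaler API shipped in this release.

```python
import torch
from torch import nn


def fit_temperature(logits: torch.Tensor, targets: torch.Tensor) -> float:
    """Fit one temperature on held-out logits by minimizing the NLL (illustrative only)."""
    log_t = torch.zeros(1, requires_grad=True)  # optimize log(T) so that T stays positive
    optimizer = torch.optim.LBFGS([log_t], lr=0.1, max_iter=50)

    def closure():
        optimizer.zero_grad()
        loss = nn.functional.cross_entropy(logits / log_t.exp(), targets)
        loss.backward()
        return loss

    optimizer.step(closure)
    return float(log_t.exp())


# Toy usage on random "validation" logits.
val_logits, val_targets = torch.randn(128, 10), torch.randint(0, 10, (128,))
temperature = fit_temperature(val_logits, val_targets)
calibrated_probs = torch.softmax(val_logits / temperature, dim=-1)
```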
What's Changed
- Add scalers, TinyImageNet-C & various improvements by @o-laurent in #37
- Add MNIST-C, a tutorial, & small fixes by @o-laurent in #38
- 📖 Improve the scaling tutorial & misc by @o-laurent in #39
- ✨ Add MIMO by @alafage in #41
- Add MIMO, NotMNIST, improve coverage, and Misc by @o-laurent in #42
Full Changelog: v0.1.3...v0.1.4
v0.1.3 Add Regression, Bayesian Neural Networks, & Deep Ensembles
New functionalities
- Regression training and testing (routine, dataset, experiment, etc.)
- Bayesian Neural Networks (Linear, Conv, Sampler, ELBO, tutorial)
- Deep Ensembles (Routine & model; see the sketch after this list)
- Add VGG
- Relax constraints on the packages
- Improve coverage
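As promised in the list above, here is a generic sketch of the Deep Ensembles idea, averaging the softmax predictions of independently trained members; the helper below is illustrative and not the ensemble wrapper added in this release.

```python
import torch
from torch import nn


def deep_ensemble_predict(members: list[nn.Module], x: torch.Tensor) -> torch.Tensor:
    """Average the softmax outputs of independently trained members (illustrative only)."""
    with torch.no_grad():
        probs = torch.stack([torch.softmax(member(x), dim=-1) for member in members])
    return probs.mean(dim=0)


# Toy usage: three "independently trained" classifiers on random inputs.
members = [nn.Linear(16, 5) for _ in range(3)]
mean_probs = deep_ensemble_predict(members, torch.randn(4, 16))
print(mean_probs.shape)  # torch.Size([4, 5])
```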
Pull Requests merged
- Add contribution page and rework readme 📖 in #26
- ✨ Enable regression & misc in #27
- ✨ Add support for regression & BNNs in #30
Full Changelog: v0.1.2...v0.1.3
v0.1.2 Rework baselines & Add datasets
What's Changed
- Rework baselines to increase code reuse and simplify their use
- Add post-hoc temperature scaling
- Add ImageNet variations (A, O, R)
- Add dataset autodownload
- Add transforms for PixMix
- Update packages
Pull Requests
- Rework baselines by @alafage in #25
- Merge dev in main ✨ by @o-laurent in #24
Full Changelog: v0.1.1...v0.1.2
v0.1.1 Add baselines & Improve documentation
What's Changed
- Doc theme by @alafage in #12
- Add logos 🎨
- Add WR baselines for BatchEnsembles and Masksembles ✨
- Switch to ImageNet structure by default and add the corresponding parameter 🔨
- Add CIFAR-10H support ✨
- Refactor Packed layers to ease understanding 🔨
- Start handling HF weights ✨
- Improve the documentation & ReadMe 📚
- Add a reference page 📖
- Improve test coverage ✔️
- Update packages ⚡
- Make the logo's background transparent 👕
- Remove old implementations of the metrics 🔥
- Add channel-last support ✨
- Major metrics updates 🐛 🎨
- Improve tests
Full Changelog: v0.1.0...v0.1.1
v0.1.0 TorchUncertainty First Release
- Add first baselines
- Add metrics
- Add tests
- Add documentation