TorchUncertainty is a package designed to help you leverage uncertainty quantification techniques and make your deep neural networks more reliable. It aims to be collaborative and to include as many methods as possible, so reach out to add yours!
:construction: TorchUncertainty is in early development :construction: - expect changes, but reach out and contribute if you are interested in the project! Please raise an issue if you encounter any bugs or difficulties, and join our Discord server.
:books: Our webpage and documentation are available here: torch-uncertainty.github.io. :books:
TorchUncertainty contains the official implementations of multiple papers from major machine-learning and computer-vision conferences and has been or will be featured in tutorials at WACV 2024, HAICON 2024, and ECCV 2024.
This package provides a multi-level API, including:
- easy-to-use :zap: Lightning uncertainty-aware training & evaluation routines for 4 tasks: classification, probabilistic and pointwise regression, and segmentation (a minimal usage sketch follows this list)
- ready-to-train baselines on research datasets, such as ImageNet and CIFAR
- pretrained weights for these baselines on ImageNet and CIFAR (:construction: work in progress :construction:)
- layers, models, metrics, & losses available for use in your networks
- scikit-learn-style post-processing methods such as Temperature Scaling
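As a quick illustration of the routine-based workflow, here is a minimal sketch of wrapping a plain PyTorch classifier in an uncertainty-aware routine and handing it to a Lightning `Trainer`. The `ClassificationRoutine` class and its `model`/`num_classes`/`loss` arguments are assumptions based on the task list above; check the API reference for the exact signature.

```python
# Hypothetical sketch of the routine-based workflow. The routine class
# and argument names are assumptions; see the API reference for the
# exact interface.
import torch
from torch import nn
from lightning.pytorch import Trainer
from torch_uncertainty.routines import ClassificationRoutine  # assumed name

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
routine = ClassificationRoutine(
    model=model,
    num_classes=10,
    loss=nn.CrossEntropyLoss(),
)
trainer = Trainer(max_epochs=1)
# trainer.fit(routine, train_dataloaders=...)  # plug in your DataLoader
```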
Have a look at the Reference page or the API reference for a more exhaustive list of the implemented methods, datasets, metrics, etc.
## :gear: Installation
TorchUncertainty requires Python 3.10 or greater. Install the desired PyTorch version in your environment. Then, install the package from PyPI:
```sh
pip install torch-uncertainty
```
The installation procedure for contributors is different: have a look at the contribution page.
## :racehorse: Quickstart
We make a quickstart available at torch-uncertainty.github.io/quickstart.
## :books: Implemented methods
TorchUncertainty currently supports classification, probabilistic and pointwise regression, segmentation, and pixelwise regression (such as monocular depth estimation). It includes the official implementations of the following papers:
- LP-BNN: Encoding the latent posterior of Bayesian Neural Networks for uncertainty quantification - IEEE TPAMI
- Packed-Ensembles for Efficient Uncertainty Estimation - ICLR 2023 - Tutorial
- MUAD: Multiple Uncertainties for Autonomous Driving, a benchmark for multiple uncertainty types and tasks - BMVC 2022
We also provide the following methods:
### Baselines
To date, the following deep learning baselines have been implemented (a generic deep-ensembles sketch follows the list). Click the :inbox_tray: icon next to a method to access its tutorial:
- Deep Ensembles, BatchEnsemble, Masksembles, & MIMO
- MC-Dropout
- Packed-Ensembles (see Blog post)
- Variational Bayesian Neural Networks
- Checkpoint Ensembles & Snapshot Ensembles
- Stochastic Weight Averaging & Stochastic Weight Averaging Gaussian
- Regression with Beta Gaussian NLL Loss
- Deep Evidential Classification & Regression
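To make the first entry concrete, here is a generic sketch of the deep-ensembles idea in plain PyTorch, not TorchUncertainty's own implementation: several independently initialized members are trained separately, and their softmax outputs are averaged at inference time.

```python
# Generic deep-ensembles illustration (plain PyTorch, not the
# TorchUncertainty API). Training of each member is omitted.
import torch
from torch import nn

def make_member() -> nn.Module:
    return nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))

ensemble = [make_member() for _ in range(5)]  # 5 independent members
x = torch.randn(8, 1, 28, 28)

with torch.no_grad():
    probs = torch.stack([m(x).softmax(dim=-1) for m in ensemble])
    mean_probs = probs.mean(dim=0)  # predictive distribution, shape (8, 10)
    # Member disagreement is a simple proxy for epistemic uncertainty.
    disagreement = probs.var(dim=0).sum(dim=-1)
```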
### Augmentation methods
The following data augmentation methods have been implemented (a generic Mixup sketch follows the list):
- Mixup, MixupIO, RegMixup, WarpingMixup
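For reference, here is a generic sketch of vanilla Mixup (Zhang et al., 2018), shown in plain PyTorch rather than through the library's transforms: inputs and targets are combined convexly with a Beta-distributed coefficient.

```python
# Generic Mixup sketch (not the library's implementation): mix pairs of
# examples with a coefficient lam ~ Beta(alpha, alpha).
import torch

def mixup(x: torch.Tensor, y: torch.Tensor, alpha: float = 0.2):
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    mixed_x = lam * x + (1.0 - lam) * x[perm]
    # Train with: lam * loss(out, y) + (1 - lam) * loss(out, y[perm])
    return mixed_x, y, y[perm], lam
```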
### Post-processing methods
To date, the following post-processing methods have been implemented:
- Temperature, Vector, & Matrix scaling (see the generic temperature-scaling sketch after this list)
- Monte Carlo Batch Normalization
- Laplace approximation using the Laplace library
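As a concrete reference for the first entry, here is a generic temperature-scaling sketch (Guo et al., 2017) in plain PyTorch, rather than the library's scikit-learn-style API: a single scalar temperature T is fitted on held-out logits by minimizing the negative log-likelihood.

```python
# Generic temperature-scaling sketch (not the library's API): fit a
# scalar temperature T on validation logits by minimizing the NLL.
import torch
from torch import nn

def fit_temperature(logits: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    log_t = nn.Parameter(torch.zeros(()))  # optimize log T so T stays positive
    optimizer = torch.optim.LBFGS([log_t], lr=0.1, max_iter=50)

    def closure():
        optimizer.zero_grad()
        loss = nn.functional.cross_entropy(logits / log_t.exp(), labels)
        loss.backward()
        return loss

    optimizer.step(closure)
    return log_t.exp().detach()  # calibrated probabilities: softmax(logits / T)
```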
### Tutorials
Check out our tutorials at torch-uncertainty.github.io/auto_tutorials.
## :telescope: Projects using TorchUncertainty
The following projects use TorchUncertainty:
- A Symmetry-Aware Exploration of Bayesian Neural Network Posteriors - ICLR 2024
If you are using TorchUncertainty in your project, please let us know and we will add it to this list!