# [ECCV 2024] Uncertainty Calibration with Energy Based Instance-wise Scaling in the Wild Dataset
Official implementation of "Uncertainty Calibration with Energy Based Instance-wise Scaling in the Wild Dataset" (ECCV 2024) by Mijoo Kim and Junseok Kwon.
## Abstract
<p align='center'> <img src='./figures/pipeline.png' width='900'/> </p>

In this paper, we investigate robust post-hoc uncertainty calibration methods for DNNs within the context of multi-class classification tasks. While previous studies have made notable progress, they still face challenges in achieving robust calibration, particularly in scenarios involving out-of-distribution (OOD) data. We identify that previous methods lack adaptability to individual input data and struggle to accurately estimate uncertainty when processing inputs drawn from the wild dataset. To address this issue, we introduce a novel instance-wise calibration method based on an energy model. Our method incorporates energy scores instead of softmax confidence scores, allowing for adaptive consideration of DNN uncertainty for each prediction within a logit space.
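For reference, the energy score that replaces the softmax confidence is computed directly from the classifier's logits, as in the energy-based OOD detection literature. Below is a minimal PyTorch sketch of this score with an illustrative temperature `T`; the instance-wise scaling described in the paper is applied on top of such scores, and this snippet is not the repository's exact implementation.

```python
import torch

def energy_score(logits: torch.Tensor, T: float = 1.0) -> torch.Tensor:
    """Energy score E(x) = -T * logsumexp(logits / T).

    Lower energy indicates a more confident (more in-distribution-like)
    prediction; unlike the maximum softmax probability, the score lives in
    logit space and is not squashed into [0, 1].
    """
    return -T * torch.logsumexp(logits / T, dim=-1)

# Example: energy scores for a batch of 4 predictions over 10 classes.
logits = torch.randn(4, 10)
print(energy_score(logits))
```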
## Run demo
### (1) Download datasets
- ID dataset (e.g., cifar10, cifar100, imagenet)
- OOD dataset
  - Corrupted dataset (e.g., cifar10C, cifar100C, imagenetC) (link)
  - Two semantic OOD datasets (e.g., SVHN, Texture)
    - one for tuning / one for evaluation
```
📦Energy-Calibration
 ├── 📂data
 │   ├── 📂cifar10
 │   ├── 📂cifar10C
 │   ├── 📂SVHN
 │   └── 📂dtd
 └── ...
```
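Most of these datasets can be fetched with torchvision; the sketch below assumes the directory layout above and is only illustrative (cifar10C has no torchvision downloader and must be placed manually, and the exact folder names expected by `main.py` may differ).

```python
from torchvision import datasets

# ID dataset: CIFAR-10 train/test splits under ./data/cifar10
datasets.CIFAR10(root="./data/cifar10", train=True, download=True)
datasets.CIFAR10(root="./data/cifar10", train=False, download=True)

# Semantic OOD datasets: SVHN (e.g., for tuning) and DTD/Texture (e.g., for evaluation)
datasets.SVHN(root="./data/SVHN", split="test", download=True)
datasets.DTD(root="./data/dtd", split="test", download=True)

# Corrupted dataset (cifar10C): download the released .npy files manually
# and place them under ./data/cifar10C.
```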
### (2) Environment setup
```bash
git clone https://github.com/mijoo308/Energy-Calibration.git
cd Energy-Calibration
conda create -n energycal python=3.8
conda activate energycal
pip install -r requirements.txt
```
### (3) Run
```bash
python main.py --gpu --ddp
```
If you already have logit files, you can use the following flags:

- `--id_train_inferenced`: already have all ID logit files for tuning
- `--ood_train_inferenced`: already have all OOD logit files for tuning
- `--test_inferenced`: already have all test logit files for evaluation
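For example, if every logit file has already been computed, the flags can presumably be combined to skip inference entirely (hypothetical invocation):

```bash
python main.py --gpu --ddp --id_train_inferenced --ood_train_inferenced --test_inferenced
```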
### (4) Result
After running the main script, the results are printed to the terminal and saved in the `result` folder:
```
📦Energy-Calibration
 ├── 📂result
 │   └── 📂cifar10
 │       └── 📂densenet201
 │           └── 📜densenet201_cifar10_{corruption_type}_{severity_level}_result.pkl
 └── ...
```
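A quick way to inspect one of these files is sketched below; the filename follows the pattern above with an example corruption type and severity level, and the exact object stored in the pickle depends on `main.py`.

```python
import pickle

# Example filename following the pattern above (corruption type and severity
# level are illustrative); the stored object depends on main.py.
path = "./result/cifar10/densenet201/densenet201_cifar10_gaussian_noise_3_result.pkl"

with open(path, "rb") as f:
    result = pickle.load(f)

print(type(result))
print(result)  # e.g., calibration metrics before/after energy-based scaling
```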
## Run using your own classifier
Place the network architecture files in the `./models/` directory and the pretrained weight files in the `./weights/` directory.
```
📦Energy-Calibration
 ├── 📂data
 │   ├── 📂cifar10
 │   ├── 📂cifar10C
 │   ├── 📂SVHN
 │   └── 📂dtd
 ├── 📂models
 │   └── 📜{network}.py      /* place your own network */
 ├── 📂result
 ├── 📂source
 ├── 📂weights
 │   └── 📜{network}.pth     /* place your own weight file */
 ├── 📜main.py
 └── ...
```
### Run
```bash
python main.py --gpu --net {network} --weight_path {weight file path}
```
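For reference, a `./models/{network}.py` file is expected to define a standard PyTorch classifier that returns raw logits. The skeleton below is only a hypothetical example; the class name, constructor arguments, and how `main.py` instantiates it depend on the repository's conventions.

```python
import torch
import torch.nn as nn

class MyNet(nn.Module):
    """Hypothetical classifier skeleton; outputs raw logits (no softmax)."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)
        return self.classifier(h)  # raw logits, used for energy-based calibration
```

The matching weight file can then be produced with `torch.save(model.state_dict(), "./weights/mynet.pth")` (name illustrative) and passed via `--weight_path`.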
## Acknowledgements
This repository benefits from the following repositories:
- energy_ood
- pytorch-cifar100
- temperature_scaling
- focal_calibration
- DensityAwareCalibration
- Mix-n-Match-Calibration
- spline-calibration
We greatly appreciate their outstanding work and contributions to the community!
## Citation

If you find this repository useful, please consider citing:
```bibtex
@inproceedings{kim2024uncertainty,
  title={Uncertainty Calibration with Energy Based Instance-wise Scaling in the Wild Dataset},
  author={Kim, Mijoo and Kwon, Junseok},
  booktitle={ECCV},
  year={2024}
}
```