regression_uncertainty

overview image

Official implementation (PyTorch) of the paper:
How Reliable is Your Regression Model's Uncertainty Under Real-World Distribution Shifts?, 2023 [arXiv] [project].
Fredrik K. Gustafsson, Martin Danelljan, Thomas B. Schön.
We propose an extensive benchmark for testing the reliability of regression uncertainty estimation methods under real-world distribution shifts. It consists of 8 publicly available image-based regression datasets with different types of challenging distribution shifts. We use our benchmark to evaluate many of the most common uncertainty estimation methods, as well as two state-of-the-art uncertainty scores from OOD detection. We find that while methods are well calibrated when there is no distribution shift, they all become highly overconfident on many of the benchmark datasets. This uncovers important limitations of current uncertainty estimation methods, and the proposed benchmark thus serves as a challenge to the research community.
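The notion of calibration used above can be made concrete: a regression model with a Gaussian predictive distribution is well calibrated if its central prediction intervals actually contain the target at the stated rate, and overconfident if the empirical coverage falls below that rate. The following is a minimal illustrative sketch of this coverage check (not the paper's evaluation code); the function name and the synthetic data are our own for demonstration.

```python
import numpy as np
from statistics import NormalDist

def interval_coverage(y, mu, sigma, alpha=0.05):
    """Fraction of targets inside the central (1 - alpha) Gaussian
    prediction interval [mu - z*sigma, mu + z*sigma]."""
    z = NormalDist().inv_cdf(1.0 - alpha / 2.0)  # ~1.96 for alpha=0.05
    return float(np.mean(np.abs(y - mu) <= z * sigma))

# Synthetic example: targets drawn from N(0, 1), predictions mu = 0.
rng = np.random.default_rng(0)
y = rng.normal(0.0, 1.0, size=10_000)
mu = np.zeros_like(y)

# Well-calibrated predictive std (sigma = 1): coverage close to 0.95.
cov_calibrated = interval_coverage(y, mu, np.ones_like(y))

# Overconfident predictive std (sigma too small): coverage drops well
# below 0.95 -- the failure mode the benchmark exposes under shift.
cov_overconfident = interval_coverage(y, mu, 0.5 * np.ones_like(y))
```

Under distribution shift, the benchmark observes exactly the second pattern: predicted uncertainties stay small while errors grow, so empirical coverage collapses below the nominal level.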

If you find this work useful, please consider citing:

@article{gustafsson2023how,
  title={How Reliable is Your Regression Model's Uncertainty Under Real-World Distribution Shifts?},
  author={Gustafsson, Fredrik K and Danelljan, Martin and Sch{\"o}n, Thomas B},
  journal={Transactions on Machine Learning Research (TMLR)},
  year={2023}
}


Datasets:

- Cells
- Cells-Tails
- Cells-Gap
- ChairAngle
- ChairAngle-Tails
- ChairAngle-Gap
- AssetWealth
- VentricularVolume
- BrainTumourPixels
- SkinLesionPixels
- HistologyNucleiPixels
- AerialBuildingPixels