PRIME: A Few Primitives Can Boost Robustness to Common Corruptions

This is the official repository of PRIME, the data augmentation method introduced in the ECCV 2022 paper "PRIME: A Few Primitives Can Boost Robustness to Common Corruptions". PRIME is a generic, plug-and-play data augmentation scheme that composes simple families of max-entropy image transformations to confer robustness against common corruptions, and it yields significant improvements in corruption robustness across multiple benchmarks.
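As a rough conceptual illustration of the idea, and not the repository's actual transform families (PRIME samples from max-entropy distributions over spectral, spatial, and color primitives; every function name below is illustrative), one can sketch "compose a few random primitives" in NumPy as follows:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_color_jitter(img):
    """Toy 'color' primitive: randomly scale and shift each channel."""
    scale = rng.uniform(0.8, 1.2, size=(1, 1, img.shape[2]))
    shift = rng.uniform(-0.1, 0.1, size=(1, 1, img.shape[2]))
    return np.clip(img * scale + shift, 0.0, 1.0)

def random_filter(img):
    """Toy 'spectral' primitive: convolve each pixel neighborhood
    with a small random kernel (normalized to keep values bounded)."""
    k = rng.normal(size=(3, 3))
    k = k / np.abs(k).sum()
    pad = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            # Contract the 3x3 kernel against the 3x3 spatial patch, per channel.
            out[i, j, :] = np.tensordot(k, pad[i:i+3, j:j+3, :],
                                        axes=([0, 1], [0, 1]))
    return np.clip(out, 0.0, 1.0)

def prime_like_augment(img, primitives=(random_color_jitter, random_filter),
                       depth=2):
    """Apply a random composition of `depth` primitive transforms
    to an image with values in [0, 1]."""
    for _ in range(depth):
        transform = primitives[rng.integers(len(primitives))]
        img = transform(img)
    return img

img = rng.uniform(size=(8, 8, 3))
aug = prime_like_augment(img)
```

The output stays the same shape and range as the input, so the augmentation can be dropped into any training pipeline in place of (or alongside) standard transforms.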

<p align="center"> <img src="misc/prime-augmentations.png"/> </p>

Pre-trained models

We provide different models trained with PRIME on CIFAR-10/100 and ImageNet datasets. You can download them from here.

Setup

This code has been tested with Python 3.8.5 and PyTorch 1.9.1. To install the required dependencies, run:

$ pip install -r requirements.txt

For corruption robustness evaluation, download and extract the CIFAR-10-C, CIFAR-100-C and ImageNet-C datasets from here.

Usage

We provide a script train.py for PRIME training on CIFAR-10/100, ImageNet-100 and ImageNet. For example, to train a ResNet-50 network on ImageNet with PRIME, run:

$ python -u train.py --config=config/imagenet_cfg.py \
    --config.save_dir=<save_dir> \
    --config.data_dir=<data_dir> \
    --config.cc_dir=<common_corr_dir> \
    --config.use_prime=True

Detailed configuration options for all the datasets can be found in config.
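The `--config=<file>.py --config.<field>=<value>` syntax suggests `ml_collections`-style config flags (an assumption based on the command above, not something the repository documents here). A config file in that style might look like the following sketch; the four fields match the command-line overrides shown above, and everything else about the schema is assumed:

```python
# Hypothetical sketch of a config file in the ml_collections style.
# Field names beyond the four overridden in the command above are assumptions.
import ml_collections

def get_config():
    config = ml_collections.ConfigDict()
    config.save_dir = ''     # where checkpoints and logs are written
    config.data_dir = ''     # path to the clean training dataset
    config.cc_dir = ''       # path to the common-corruptions dataset
    config.use_prime = True  # enable PRIME augmentation during training
    return config
```

Any `config.<field>` passed on the command line overrides the corresponding default returned by `get_config()`.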

Results

Results on ImageNet/ImageNet-100 with a ResNet-50/ResNet-18 (†: without JSD loss)

| Dataset      | Method      | Clean ↑ | CC Acc ↑ | mCE ↓ |
|--------------|-------------|---------|----------|-------|
| ImageNet     | Standard    | 76.1    | 38.1     | 76.1  |
| ImageNet     | AugMix      | 77.5    | 48.3     | 65.3  |
| ImageNet     | DeepAugment | 76.7    | 52.6     | 60.4  |
| ImageNet     | PRIME†      | 77.0    | 55.0     | 57.5  |
| ImageNet     | PRIME       | 75.3    | 56.4     | 55.5  |
| ImageNet-100 | Standard    | 88.0    | 49.7     | 100.0 |
| ImageNet-100 | AugMix      | 88.7    | 60.7     | 79.1  |
| ImageNet-100 | DeepAugment | 86.3    | 67.7     | 68.1  |
| ImageNet-100 | PRIME       | 85.9    | 71.6     | 61.0  |

Results on CIFAR-10/100 with a ResNet-18

| Dataset   | Method   | Clean ↑ | CC Acc ↑ | mCE ↓ |
|-----------|----------|---------|----------|-------|
| CIFAR-10  | Standard | 95.0    | 74.0     | 24.0  |
| CIFAR-10  | AugMix   | 95.2    | 88.6     | 11.4  |
| CIFAR-10  | PRIME    | 93.1    | 89.0     | 11.0  |
| CIFAR-100 | Standard | 76.7    | 51.9     | 48.1  |
| CIFAR-100 | AugMix   | 78.2    | 64.9     | 35.1  |
| CIFAR-100 | PRIME    | 77.6    | 68.3     | 31.7  |

Citing this work

@inproceedings{PRIME2022,
    title = {PRIME: A Few Primitives Can Boost Robustness to Common Corruptions}, 
    author = {Apostolos Modas and Rahul Rade and Guillermo {Ortiz-Jim\'enez} and Seyed-Mohsen {Moosavi-Dezfooli} and Pascal Frossard},
    year = {2022},
    booktitle = {European Conference on Computer Vision (ECCV)}
}