This repository is the official implementation of the ECCV 2022 paper: Efficient One Pass Self-distillation with Zipf's Label Smoothing.

Zipf's LS: Efficient One Pass Self-distillation with Zipf's Label Smoothing

Framework & Comparison

<div style="text-align:center"><img src="megengine_zipfls/pics/framework.png" width="100%" ></div> <div style="text-align:center"><img src="megengine_zipfls/pics/comparison.png" width="100%" ></div>
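At a glance, Zipf's label smoothing is a one-pass self-distillation scheme: instead of a separate teacher network, the soft supervision comes from the model's own prediction order, with non-target classes assigned probability mass that decays like a Zipf (power-law) distribution over their ranks. The sketch below illustrates that construction for a batch of logits; the function name `zipf_soft_labels` and the `power` exponent are our own illustrative choices, not this repository's API, and the actual training code differs in details (e.g., the `--dense` option used in the commands further down).

```python
import torch

def zipf_soft_labels(logits, targets, power=1.0):
    """Illustrative sketch (not this repo's API): Zipf-distributed soft
    labels over non-target classes, ranked by the model's own logits."""
    # 1-based rank of every class per sample; rank 1 = highest logit.
    ranks = logits.argsort(dim=1, descending=True).argsort(dim=1) + 1
    # Power-law (Zipf) mass ~ 1 / rank^power ...
    weights = ranks.float().pow(-power)
    # ... with the ground-truth class zeroed out: it is supervised by the
    # ordinary one-hot cross-entropy term instead.
    weights.scatter_(1, targets.unsqueeze(1), 0.0)
    # Normalize so each row is a distribution over the non-target classes.
    return weights / weights.sum(dim=1, keepdim=True)

# Example: soft labels for a batch of 4 samples and 100 classes.
soft = zipf_soft_labels(torch.randn(4, 100), torch.randint(0, 100, (4,)))
```

How this distribution is weighted against the one-hot label appears to be what the `--loss_lambda` and `--alpha` flags in the training commands below control; see the sketch after those commands.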

[2022.9] The PyTorch implementation of Zipf's label smoothing has been released! Our training code now supports CIFAR, TinyImageNet, ImageNet, and iNat21.

[2022.7] The MegEngine implementation of Zipf's label smoothing has been released!

Main Results

Top-1 accuracy (%), mean±std:

| Method | DenseNet121 / CIFAR100 | DenseNet121 / TinyImageNet | ResNet18 / CIFAR100 | ResNet18 / TinyImageNet |
| --- | --- | --- | --- | --- |
| PyTorch Baseline | 77.86±0.26 | 60.31±0.36 | 75.51±0.28 | 56.41±0.20 |
| PyTorch Zipf's LS | 79.03±0.32 | 62.64±0.30 | 77.38±0.32 | 59.25±0.20 |
| MegEngine Baseline | 77.97±0.18 | 60.78±0.31 | 75.29±0.29 | 56.03±0.34 |
| MegEngine Zipf's LS | 79.85±0.27 | 62.35±0.32 | 77.08±0.28 | 59.01±0.23 |

Training

train_baseline_cifar100_resnet18:

```bash
python3 train.py --ngpus 1 --dataset CIFAR100 --data_dir cifar100_data --arch CIFAR_ResNet18 --loss_lambda 0.0 --alpha 0.0 --dense
```

train_ZipfsLS_cifar100_resnet18:

```bash
python3 train.py --ngpus 1 --dataset CIFAR100 --data_dir cifar100_data --arch CIFAR_ResNet18 --loss_lambda 0.1 --alpha 0.1 --dense
```

See the Makefile for more examples.
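The baseline and Zipf's LS commands above differ only in `--loss_lambda` and `--alpha`. As a rough mental model (an assumption on our part, not the repo's exact formulation), `loss_lambda` weights a soft-label term against the usual one-hot cross-entropy, so `0.0` degenerates to the plain baseline:

```python
import torch.nn.functional as F

def training_loss(logits, targets, zipf_target, loss_lambda=0.1):
    """Hedged sketch of the combined objective; `zipf_target` is a soft
    label such as the one produced by `zipf_soft_labels` above."""
    # Standard one-hot cross-entropy (the whole loss when loss_lambda=0.0).
    ce = F.cross_entropy(logits, targets)
    # Soft-label cross-entropy against the Zipf distribution; classes with
    # zero Zipf mass contribute nothing to the sum.
    zipf_term = -(zipf_target * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
    return ce + loss_lambda * zipf_term
```

For the exact loss, the role of `--alpha`, and the dense per-location ranking enabled by `--dense`, consult the paper and the training code in this repository.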

License

Zipf's LS is released under the Apache 2.0 license. See LICENSE for details.