Awesome-Long-Tailed-Classification-Leaderboard

Date: 2021/3/3 (updated 2021/3/3)
Authors: YW, YSZ

1. Introduction

List of abbreviations:

| Abbreviation | Full name |
| --- | --- |
| ReW | Re-weighting |
| TrL | Transfer Learning |
| MeL | Meta Learning |
| DeL | Decoupling Learning |
| Aug | Data Augmentation |
| SeSu | Self-Supervised Learning |
| OtM | Other methods |
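As an illustration of the ReW family, the Class-Balanced Loss (CVPR 2019, listed in the leaderboards below) weights each class by the inverse of its "effective number" of samples. A minimal sketch under that formula (the class counts in the example are made up for illustration):

```python
import numpy as np

def class_balanced_weights(samples_per_class, beta=0.9999):
    """Per-class weights from the 'effective number of samples':
    E_n = (1 - beta**n) / (1 - beta), weight proportional to 1 / E_n,
    normalized so the weights sum to the number of classes."""
    counts = np.asarray(samples_per_class, dtype=np.float64)
    effective_num = (1.0 - np.power(beta, counts)) / (1.0 - beta)
    weights = 1.0 / effective_num
    return weights / weights.sum() * len(counts)

# Toy 5-class long-tailed distribution: rare classes get the largest weights
w = class_balanced_weights([5000, 1500, 450, 130, 50])
print(w)
```

These weights typically multiply a standard softmax or focal loss per-class; larger `beta` pushes the weighting closer to inverse class frequency.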

2. Benchmark datasets

| Dataset | Year | Images (Train/Val/Test) | Classes | Max images | Min images | Imbalance factor | Reported by |
| --- | --- | --- | --- | --- | --- | --- | --- |
| CIFAR-LT-10 | 2019 | 50000–11203/--/10000 | 10 | 5,000 | 500–25 | 1.0–200.0 | Source |
| CIFAR-LT-100 | 2019 | 50000–9502/--/10000 | 100 | 500 | 500–2 | 1.0–200.0 | Source |
| ImageNet-LT | 2019 | 115846/20000/50000 | 1000 | 1280 | 5 | 256.0 | Source |
| Places-LT | 2019 | 62500/7300/36500 | 365 | 4980 | 5 | 996.0 | Source |
| iNat 2017 | 2017 | 579184/95986/-- | 5089 | 3919 | 9 | 435.4 | Source |
| iNat 2018 | 2018 | 437513/24426/-- | 8142 | 1000 | 2 | 500.0 | Source |

3. Leaderboard

3.1 CIFAR-10-LT

Evaluation metric: classification error rate.

IF denotes the imbalance factor, i.e. the ratio between the largest and the smallest per-class training-set sizes.
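The CIFAR-LT variants at each IF are built by subsampling the balanced training set along an exponential profile across classes. A minimal sketch, assuming the common convention n_i = n_max · IF^(−i/(C−1)) used by the LDAM/BBN line of work:

```python
def long_tailed_counts(n_max, num_classes, imbalance_factor):
    """Per-class training counts n_i = n_max * IF**(-i / (C - 1)),
    so class 0 keeps n_max images and the last class keeps n_max / IF."""
    return [int(n_max * imbalance_factor ** (-i / (num_classes - 1)))
            for i in range(num_classes)]

# CIFAR-10-LT with IF=100: head class keeps 5000 images, tail class 50
counts = long_tailed_counts(n_max=5000, num_classes=10, imbalance_factor=100)
print(counts[0], counts[-1])       # 5000 50
print(max(counts) / min(counts))   # 100.0
```

The Max/Min image columns in the dataset table above follow directly from this profile.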

Backbone: ResNet-32

| Method | Venue | Year | Backbone | Type | IF=10 | IF=50 | IF=100 | Code | Reported by |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Class-Balanced Loss | CVPR | 2019 | ResNet-32 | ReW | 12.51 | 20.73 | 25.43 | -- | Source |
| LDAM-DRW | NeurIPS | 2019 | ResNet-32 | ReW | 11.84 | -- | 22.97 | -- | Source |
| MW-Net | NeurIPS | 2019 | ResNet-32 | ReW/MeL | 12.16 | 19.94 | 24.79 | -- | Source |
| LDAM-M2m | CVPR | 2020 | ResNet-32 | TrL | 12.5 | -- | 20.9 | -- | Source |
| BBN | CVPR | 2020 | ResNet-32 | DeL | 11.68 | 17.82 | 20.18 | -- | Source |
| CBasDA-LDAM | CVPR | 2020 | ResNet-32 | ReW | 12.6 | 17.77 | 22.77 | -- | Source |
| Balanced Softmax | ECCV-Workshop | 2020 | ResNet-32 | OtM | 9.1 | -- | 16.9 | -- | Source |
| De-confound-TDE | NeurIPS | 2020 | ResNet-32 | OtM | 11.5 | 16.4 | 19.4 | -- | Source |
| BALMS | NeurIPS | 2020 | ResNet-32 | ReW | 8.7 | -- | 15.1 | -- | Source |
| LDAM-DRW + SSP | NeurIPS | 2020 | ResNet-32 | SeSu | 11.47 | 17.87 | 22.17 | -- | Source |
| Baseline + tricks | AAAI | 2021 | ResNet-32 | OtM | -- | 16.41 | 19.97 | -- | Source |
| Remix-DRW | ECCV-Workshop | 2020 | ResNet-32 | Aug | 10.98 | -- | 20.24 | -- | Source |

Backbone: ResNet-18

| Method | Venue | Year | Backbone | Type | IF=10 | IF=50 | IF=100 | Code | Reported by |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| FSA | ECCV | 2020 | ResNet-18 | Aug | 8.25 | 15.29 | 19.43 | -- | Source |

Backbone: ResNet-34

| Method | Venue | Year | Backbone | Type | IF=10 | IF=50 | IF=100 | Code | Reported by |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| FSA | ECCV | 2020 | ResNet-34 | Aug | 8.8 | 15.51 | 17.94 | -- | Source |

3.2 CIFAR-100-LT

Evaluation metric: classification error rate.

Backbone: ResNet-32

| Method | Venue | Year | Backbone | Type | IF=10 | IF=50 | IF=100 | Code | Reported by |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Class-Balanced Loss | CVPR | 2019 | ResNet-32 | ReW | 42.01 | 54.68 | 60.40 | -- | Source |
| LDAM-DRW | NeurIPS | 2019 | ResNet-32 | ReW | 41.29 | -- | 57.96 | -- | Source |
| MW-Net | NeurIPS | 2019 | ResNet-32 | ReW/MeL | 41.54 | 53.26 | 57.91 | -- | Source |
| LDAM-M2m | CVPR | 2020 | ResNet-32 | TrL | 42.4 | -- | 56.5 | -- | Source |
| BBN | CVPR | 2020 | ResNet-32 | DeL | 40.88 | 52.98 | 57.44 | -- | Source |
| CBasDA-LDAM | CVPR | 2020 | ResNet-32 | ReW | 42.0 | 50.84 | 60.47 | -- | Source |
| LFME+LDAM | ECCV | 2020 | ResNet-32 | TrL | -- | -- | 56.2 | -- | Source |
| Balanced Softmax | ECCV-Workshop | 2020 | ResNet-32 | OtM | 36.9 | -- | 49.7 | -- | Source |
| De-confound-TDE | NeurIPS | 2020 | ResNet-32 | OtM | 40.4 | 49.7 | 55.9 | -- | Source |
| BALMS | NeurIPS | 2020 | ResNet-32 | ReW | 37.0 | -- | 49.2 | -- | Source |
| LDAM-DRW + SSP | NeurIPS | 2020 | ResNet-32 | SeSu | 41.09 | 52.89 | 56.57 | -- | Source |
| Baseline + tricks | AAAI | 2021 | ResNet-32 | OtM | -- | 48.31 | 52.17 | -- | Source |
| Remix-DRW | ECCV-Workshop | 2020 | ResNet-32 | Aug | 38.77 | -- | 53.23 | -- | Source |

Backbone: ResNet-18

| Method | Venue | Year | Backbone | Type | IF=10 | IF=50 | IF=100 | Code | Reported by |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| FSA | ECCV | 2020 | ResNet-18 | Aug | 34.92 | 48.1 | 53.43 | -- | Source |

Backbone: ResNet-34

| Method | Venue | Year | Backbone | Type | IF=10 | IF=50 | IF=100 | Code | Reported by |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| FSA | ECCV | 2020 | ResNet-34 | Aug | 34.71 | 47.83 | 51.49 | -- | Source |

3.3 ImageNet-LT

Evaluation metric: Top-1 classification accuracy in the closed-set setting.
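The many-/medium-/few-shot columns below follow the OLTR protocol, which groups classes by their number of training images: many-shot (>100), medium-shot (20–100), and few-shot (<20). A minimal sketch of that breakdown (function and variable names are mine):

```python
import numpy as np

def shot_split_accuracy(y_true, y_pred, train_counts):
    """Top-1 accuracy over many-/medium-/few-shot class groups,
    where groups are defined by training images per class:
    many > 100, 20 <= medium <= 100, few < 20."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    counts = np.asarray(train_counts)
    groups = {
        "many":   counts > 100,
        "medium": (counts >= 20) & (counts <= 100),
        "few":    counts < 20,
    }
    out = {"all": float((y_true == y_pred).mean())}
    for name, mask in groups.items():
        sel = mask[y_true]  # test samples whose true class falls in this group
        out[name] = float((y_true[sel] == y_pred[sel]).mean()) if sel.any() else float("nan")
    return out

# Toy example: 3 classes with 500/50/5 training images each
acc = shot_split_accuracy([0, 0, 1, 2], [0, 1, 1, 2], train_counts=[500, 50, 5])
print(acc)
```

Note that the split is computed on a balanced test set, so the "All" column is not simply a weighted average of the three groups by test frequency.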

Backbone: ResNet-10

| Method | Venue | Year | Backbone | Type | Many-shot | Medium-shot | Few-shot | All | Code | Reported by |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| OLTR | CVPR | 2019 | ResNet-10 | TrL | 43.2 | 35.1 | 18.5 | 35.6 | -- | Source |
| LWS | ICLR | 2020 | ResNet-10 | DeL | -- | -- | -- | 41.4 | -- | Source |
| IEM | CVPR | 2020 | ResNet-10 | OtM | 48.9 | 44.0 | 24.4 | 43.2 | -- | Source |
| LFME+OLTR | ECCV | 2020 | ResNet-10 | TrL | 47.0 | 37.9 | 19.2 | 38.8 | -- | Source |
| FSA | ECCV | 2020 | ResNet-10 | Aug | 47.3 | 31.6 | 14.7 | 35.2 | -- | Source |
| BALMS | NeurIPS | 2020 | ResNet-10 | ReW | 50.3 | 39.5 | 25.3 | 41.8 | -- | Source |
| cRT + SSP | NeurIPS | 2020 | ResNet-10 | SeSu | -- | -- | -- | 43.2 | -- | Source |
| Baseline + tricks | AAAI | 2021 | ResNet-10 | OtM | -- | -- | -- | 43.31 | -- | Source |

Backbone: ResNeXt-50

| Method | Venue | Year | Backbone | Type | Many-shot | Medium-shot | Few-shot | All | Code | Reported by |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| LWS | ICLR | 2020 | ResNeXt-50 | DeL | 60.2 | 47.2 | 30.3 | 49.9 | -- | Source |

Backbone: ResNeXt-152

| Method | Venue | Year | Backbone | Type | Many-shot | Medium-shot | Few-shot | All | Code | Reported by |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| LWS | ICLR | 2020 | ResNeXt-152 | DeL | 63.5 | 50.4 | 34.2 | 53.3 | -- | Source |

3.4 Places-LT

Evaluation metric: Top-1 classification accuracy in the closed-set setting.

Backbone: ResNet-152

| Method | Venue | Year | Backbone | Type | Many-shot | Medium-shot | Few-shot | All | Code | Reported by |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| OLTR | CVPR | 2019 | ResNet-152 | TrL | 44.7 | 37.0 | 25.3 | 35.9 | -- | Source |
| LWS | ICLR | 2020 | ResNet-152 | DeL | 40.6 | 39.1 | 28.6 | 37.6 | -- | Source |
| τ-normalized | ICLR | 2020 | ResNet-152 | DeL | 37.8 | 40.7 | 31.8 | 37.9 | -- | Source |
| IEM | CVPR | 2020 | ResNet-152 | OtM | 46.8 | 39.2 | 28.0 | 39.7 | -- | Source |
| LFME+OLTR | ECCV | 2020 | ResNet-152 | TrL | 39.3 | 39.6 | 24.2 | 36.2 | -- | Source |
| FSA | ECCV | 2020 | ResNet-152 | Aug | 42.8 | 37.5 | 22.7 | 36.4 | -- | Source |

Backbone: ResNet-10

| Method | Venue | Year | Backbone | Type | Many-shot | Medium-shot | Few-shot | All | Code | Reported by |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| BALMS | NeurIPS | 2020 | ResNet-10 | ReW | 41.2 | 39.8 | 31.6 | 38.7 | -- | -- |
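Several of the DeL entries above (LWS, τ-normalized) come from the decoupling line of work (ICLR 2020), which freezes the learned representation and only rescales the classifier weights, since head classes tend to learn larger weight norms than tail classes. A minimal sketch of τ-normalization under that reading:

```python
import numpy as np

def tau_normalize(W, tau=1.0, eps=1e-12):
    """Rescale each class's classifier weight vector: w_i <- w_i / ||w_i||**tau.
    tau=0 leaves W unchanged; tau=1 fully normalizes every row."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    return W / np.maximum(norms, eps) ** tau

# Head classes typically carry larger weight norms than tail classes;
# after tau-normalization the rows have comparable scale.
W = np.array([[3.0, 4.0],    # head class, norm 5.0
              [0.3, 0.4]])   # tail class, norm 0.5
print(np.linalg.norm(tau_normalize(W, tau=1.0), axis=1))  # [1. 1.]
```

In practice τ is a hyperparameter chosen on a validation set; intermediate values trade off head- and tail-class accuracy.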

3.5 iNaturalist

Evaluation metric: Top-1 classification accuracy.

Backbone: ResNet-50

| Method | Venue | Year | Backbone | Type | iNat-2017 (Top-1) | iNat-2018 (Top-1) | Code | Reported by |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| CB Focal | CVPR | 2019 | ResNet-50 | ReW | 58.08 | 61.12 | -- | Source |
| LWS | ICLR | 2020 | ResNet-50 | DeL | -- | 65.9/69.5 (90/200) | -- | Source |
| IEM | CVPR | 2020 | ResNet-50 | OtM | -- | 70.2 | -- | Source |
| BBN | CVPR | 2020 | ResNet-50 | DeL | 63.39 | 66.29 | -- | Source |
| BBN (2×) | CVPR | 2020 | ResNet-50 | DeL | 65.75 | 69.62 | -- | Source |
| CBasDA-CE | CVPR | 2020 | ResNet-50 | ReW | 59.38 | 67.55 | -- | Source |
| FSA | ECCV | 2020 | ResNet-50 | Aug | 61.96 | 65.91 | -- | Source |
| cRT + SSP | NeurIPS | 2020 | ResNet-50 | SeSu | -- | 68.1 | -- | Source |
| Baseline + tricks | AAAI | 2021 | ResNet-50 | OtM | -- | 70.87 | -- | Source |
| Remix-DRS | ECCV-Workshop | 2020 | ResNet-50 | Aug | -- | 70.74 | -- | Source |

Backbone: ResNet-152

| Method | Venue | Year | Backbone | Type | iNat-2017 (Top-1) | iNat-2018 (Top-1) | Code | Reported by |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| CB Focal | CVPR | 2019 | ResNet-152 | ReW | 61.84 | 64.16 | -- | Source |
| LWS | ICLR | 2020 | ResNet-152 | DeL | -- | 69.1/72.1 (90/200) | -- | Source |
| FSA | ECCV | 2020 | ResNet-152 | Aug | 66.58 | 69.08 | -- | Source |

4. Contact

Yan Wang : yanwang@smail.nju.edu.cn

Yongshun Zhang: zhangys@lamda.nju.edu.cn
