ResLT: Residual Learning for Long-tailed Recognition (TPAMI 2022)

This repository contains the implementation code for the paper:
Residual Learning for Long-tailed Recognition https://arxiv.org/abs/2101.10633

Updates

We further verify that the proposed ResLT is complementary to ensemble-based methods. Equipped with RIDEResNeXt, our model achieves better results. All experiments are conducted without knowledge distillation for a fair comparison. For RIDE, we use their public code and train for 180 epochs.

ImageNet-LT

| Model | Top-1 Acc | Download | log |
| --- | --- | --- | --- |
| RIDEResNeXt (3 experts) | 55.1 | - | log |
| RIDEResNeXt-ResLT (3 experts) | 57.6 | model | log |

iNaturalist 2018

| Model | Top-1 Acc | Download | log |
| --- | --- | --- | --- |
| RIDEResNeXt (3 experts) | 70.8 | - | log |
| RIDEResNeXt-ResLT (3 experts) | 72.9 | model | log |

Overview

In this paper, we propose a residual learning method for long-tailed recognition, which consists of a Residual Fusion Module and a Parameter Specialization Mechanism. Extensive ablation studies demonstrate the effectiveness of our method.

[Figure: overview of the ResLT framework]
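To make the residual-fusion idea concrete, below is a minimal NumPy sketch of fusing the logits of three branches by addition, so the extra branches only need to learn a residual correction on top of the main branch. The branch names and the plain sum are simplifying assumptions for illustration, not the exact repository implementation.

```python
import numpy as np

def residual_fusion(z_main, z_medium, z_tail):
    """Fuse three branches' logits by summation (illustrative sketch).

    z_main covers all classes; the medium- and tail-focused branches
    contribute residual corrections, so they can stay near zero for
    head-class samples. Branch roles here are our assumption, not the
    repository's exact design.
    """
    return z_main + z_medium + z_tail

# Toy logits: batch of 2 samples, 4 classes.
z_main = np.array([[2.0, 0.5, 0.1, 0.0],
                   [0.2, 1.5, 0.3, 0.1]])
z_medium = np.zeros_like(z_main)          # no correction needed here
z_tail = np.array([[0.0, 0.0, 0.5, 0.8],  # boosts tail-class logits
                   [0.0, 0.0, 0.4, 0.9]])

fused = residual_fusion(z_main, z_medium, z_tail)
print(fused.shape)  # (2, 4)
```

Because the fusion is a plain sum, a residual branch that outputs zeros leaves the main branch's prediction untouched, which is the sense in which the extra branches learn only a "residual".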

Get Started

ResLT Training

For CIFAR, due to the small dataset size, different experimental environments can yield noticeably different results. To reproduce the reported numbers, you may need to slightly tune $\alpha$.

bash sh/CIFAR100/CIFAR100LT_imf0.01_resnet32sx1_beta0.9950.sh
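As a rough illustration of what $\alpha$ trades off, here is a hedged NumPy sketch of an $\alpha$-weighted combination of a fusion loss and per-branch losses. The exact weighting, the `alpha=0.9` value, and the function names are our assumptions for illustration; consult the training scripts for the actual formulation.

```python
import numpy as np

def cross_entropy(logits, labels):
    # Numerically stable softmax cross-entropy, averaged over the batch.
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def reslt_style_loss(fused_logits, branch_logits_list, labels, alpha=0.9):
    """Illustrative alpha-weighted loss: alpha balances the loss on the
    fused prediction against the per-branch losses. This weighting is a
    simplifying assumption, not the repository's exact objective."""
    fusion_loss = cross_entropy(fused_logits, labels)
    branch_loss = sum(cross_entropy(z, labels) for z in branch_logits_list)
    return alpha * fusion_loss + (1.0 - alpha) * branch_loss

# Toy example: batch of 2 samples, 4 classes, two auxiliary branches.
z_fused = np.array([[2.0, 0.5, 0.6, 0.8],
                    [0.2, 1.5, 0.7, 1.0]])
branches = [0.5 * z_fused, 0.3 * z_fused]
labels = np.array([0, 1])
loss = reslt_style_loss(z_fused, branches, labels, alpha=0.9)
print(f"combined loss: {loss:.4f}")
```

On small datasets such as CIFAR, the balance between these two terms can shift noticeably with the training environment, which is why the reported results may require slightly retuning $\alpha$.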

For ImageNet-LT,

bash sh/X50.sh

For iNaturalist 2018,

bash sh/R50.sh

Results and Models

ImageNet-LT

| Model | Download | log |
| --- | --- | --- |
| ResNet-10 | model | log |
| ResNeXt-50 | model | log |
| ResNeXt-101 | model | log |

iNaturalist 2018

| Model | Download | log |
| --- | --- | --- |
| ResNet-50 | model | log |

Places-LT

| Model | Download | log |
| --- | --- | --- |
| ResNet-152 | - | - |

Acknowledgements

This code is partly based on the open-source implementations from the official PyTorch examples and LDAM-DRW.

Contact

If you have any questions, feel free to contact us through email (jiequancui@link.cuhk.edu.hk) or GitHub issues. Enjoy!

BibTex

If you find this code or idea useful, please consider citing our works:

@article{9774921,
  author={Cui, Jiequan and Liu, Shu and Tian, Zhuotao and Zhong, Zhisheng and Jia, Jiaya},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  title={ResLT: Residual Learning for Long-Tailed Recognition},
  year={2023},
  volume={45},
  number={3},
  pages={3695-3706},
  doi={10.1109/TPAMI.2022.3174892}}

@article{10130611,
  author={Cui, Jiequan and Zhong, Zhisheng and Tian, Zhuotao and Liu, Shu and Yu, Bei and Jia, Jiaya},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  title={Generalized Parametric Contrastive Learning},
  year={2023},
  volume={},
  number={},
  pages={1-12},
  doi={10.1109/TPAMI.2023.3278694}}


@inproceedings{cui2021parametric,
  title={Parametric contrastive learning},
  author={Cui, Jiequan and Zhong, Zhisheng and Liu, Shu and Yu, Bei and Jia, Jiaya},
  booktitle={Proceedings of the IEEE/CVF international conference on computer vision},
  pages={715--724},
  year={2021}
}
  
@article{cui2022region,
  title={Region Rebalance for Long-Tailed Semantic Segmentation},
  author={Cui, Jiequan and Yuan, Yuhui and Zhong, Zhisheng and Tian, Zhuotao and Hu, Han and Lin, Stephen and Jia, Jiaya},
  journal={arXiv preprint arXiv:2204.01969},
  year={2022}
}

@article{zhong2023understanding,
  title={Understanding Imbalanced Semantic Segmentation Through Neural Collapse},
  author={Zhong, Zhisheng and Cui, Jiequan and Yang, Yibo and Wu, Xiaoyang and Qi, Xiaojuan and Zhang, Xiangyu and Jia, Jiaya},
  journal={arXiv preprint arXiv:2301.01100},
  year={2023}
}