CLearning is a general continual learning framework.

1. Features

2. Methods

This repository currently implements the following continual learning methods:

3. Environment

3.1. Code Environment

The required packages are listed in 'config/requirements.txt' and 'config/env.yml'. We recommend using Anaconda to manage the environment, for example:

conda create -n torch1.11.0 python=3.9
conda activate torch1.11.0
conda install pytorch==1.11.0 torchvision==0.12.0 torchaudio==0.11.0 cudatoolkit=11.3 -c pytorch
cd config
pip install -r requirements.txt
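
Alternatively, if 'config/env.yml' fully specifies the environment (name, channels, and dependencies), it can be created in a single step; the name of the environment to activate afterwards depends on the file's contents:

conda env create -f config/env.yml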

3.2. Dataset

Folder Structure

├── data
│   ├── cifar-100-python
│   ├── ImageNet
│   ├── ImageNet100

The CIFAR-100 dataset is downloaded automatically when any CIFAR-100 experiment script is run. The ImageNet dataset must be downloaded in advance. The ImageNet-100 dataset can be generated from ImageNet with the script 'tool/gen_imagenet100.py'; a rough illustration of the idea is sketched below.
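
For reference, the following is a minimal, hypothetical sketch of how a 100-class subset could be assembled by symlinking class folders from the full ImageNet directory. It is not the repository's 'tool/gen_imagenet100.py', and the class-list file name 'imagenet100_classes.txt' is a placeholder:

# Hypothetical sketch, not the repository's tool/gen_imagenet100.py.
# Builds data/ImageNet100 by symlinking 100 class folders from data/ImageNet.
import os

SRC_ROOT = "data/ImageNet"              # full ImageNet with train/ and val/ splits
DST_ROOT = "data/ImageNet100"           # destination for the 100-class subset
CLASS_LIST = "imagenet100_classes.txt"  # placeholder: one WordNet ID (e.g. n01440764) per line

with open(CLASS_LIST) as f:
    wnids = [line.strip() for line in f if line.strip()]

for split in ("train", "val"):
    for wnid in wnids:
        src = os.path.join(SRC_ROOT, split, wnid)
        dst = os.path.join(DST_ROOT, split, wnid)
        if not os.path.isdir(src):
            raise FileNotFoundError(f"missing class folder: {src}")
        os.makedirs(os.path.dirname(dst), exist_ok=True)
        # symlink instead of copying to avoid duplicating the images on disk
        if not os.path.exists(dst):
            os.symlink(os.path.abspath(src), dst)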

4. Command

python main.py --cfg config/baseline/finetune/finetune_cifar100.yaml --device 0 --note test
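
Here, our reading of the flags (not documented on this page) is: --cfg selects the YAML experiment configuration, --device selects the GPU index, and --note is a free-form tag for the run. A minimal sketch of how such an interface could be parsed follows; the repository's actual main.py may differ and accept additional options:

# Minimal sketch of the command-line interface above; the repository's
# actual main.py may parse additional options.
import argparse

def parse_args():
    parser = argparse.ArgumentParser(description="CLearning entry point (sketch)")
    parser.add_argument("--cfg", type=str, required=True,
                        help="path to a YAML experiment config, e.g. config/baseline/finetune/finetune_cifar100.yaml")
    parser.add_argument("--device", type=int, default=0,
                        help="CUDA device index to run on")
    parser.add_argument("--note", type=str, default="",
                        help="short tag used to label logs and checkpoints")
    return parser.parse_args()

if __name__ == "__main__":
    args = parse_args()
    print(f"cfg={args.cfg}, device={args.device}, note={args.note}")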

5. Acknowledgements

We are grateful to the following GitHub repositories for their valuable code bases:

6. Citation

We hope that our research helps you and promotes the development of continual learning. If you find this work useful, please consider citing the corresponding papers:

@inproceedings{wen2024class,
  title={Class Incremental Learning with Multi-Teacher Distillation},
  author={Wen, Haitao and Pan, Lili and Dai, Yu and Qiu, Heqian and Wang, Lanxiao and Wu, Qingbo and Li, Hongliang},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={28443--28452},
  year={2024}
}

@inproceedings{pmlr-v202-wen23b,
  title={Optimizing Mode Connectivity for Class Incremental Learning},
  author={Wen, Haitao and Cheng, Haoyang and Qiu, Heqian and Wang, Lanxiao and Pan, Lili and Li, Hongliang},
  booktitle={Proceedings of the 40th International Conference on Machine Learning},
  pages={36940--36957},
  year={2023},
  volume={202},
  publisher={PMLR}
}