
Learning Fast, Learning Slow

Official Repository for ICLR'22 Paper "Learning Fast, Learning Slow: A General Continual Learning Method based on Complementary Learning System"


We extended the Mammoth framework with our method (CLS-ER) and the GCIL-CIFAR-100 dataset.
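CLS-ER keeps a fast-learning working model alongside semantic memories that are slowly updated as exponential moving averages (EMAs) of its weights, and regularizes the working model towards them on samples replayed from an episodic buffer. The snippet below is a minimal, illustrative sketch of that dual-memory idea and not the repository's implementation: the class name, hyperparameter values, and the choice to take the consistency target from the stable memory alone are simplifications made for brevity.

```python
# Illustrative sketch of a CLS-ER-style update (names and hyperparameters are
# assumptions, not the repository's exact code).
import copy
import torch
import torch.nn.functional as F


class DualMemoryLearner:
    def __init__(self, net, lr=0.03, alpha_stable=0.999, alpha_plastic=0.99,
                 prob_stable=0.1, prob_plastic=0.9, reg_weight=0.1):
        self.net = net                                  # fast-learning working model
        self.stable = copy.deepcopy(net)                # slow semantic memory (EMA)
        self.plastic = copy.deepcopy(net)               # fast semantic memory (EMA)
        self.opt = torch.optim.SGD(net.parameters(), lr=lr)
        self.alphas = {"stable": alpha_stable, "plastic": alpha_plastic}
        self.probs = {"stable": prob_stable, "plastic": prob_plastic}
        self.reg_weight = reg_weight

    def _ema_update(self, memory, alpha):
        # memory <- alpha * memory + (1 - alpha) * working model
        for p_mem, p_net in zip(memory.parameters(), self.net.parameters()):
            p_mem.data.mul_(alpha).add_(p_net.data, alpha=1 - alpha)

    def observe(self, x, y, buf_x=None, buf_y=None):
        loss = F.cross_entropy(self.net(x), y)          # learn the current task
        if buf_x is not None:
            buf_logits = self.net(buf_x)
            with torch.no_grad():
                target = self.stable(buf_x)             # consistency target from the slow memory
            loss = loss + F.cross_entropy(buf_logits, buf_y) \
                        + self.reg_weight * F.mse_loss(buf_logits, target)
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
        # stochastically update the two semantic memories at different rates
        for name, memory in (("stable", self.stable), ("plastic", self.plastic)):
            if torch.rand(1).item() < self.probs[name]:
                self._ema_update(memory, self.alphas[name])
        return loss.item()
```

The two memories differ only in how often and how strongly they track the working model: the plastic memory follows it closely, while the stable memory consolidates knowledge over a longer horizon.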

Additional Results

For a more extensive evaluation and benchmarking of our method, we also evaluate CLS-ER on S-CIFAR-100 with 5 tasks and report Task-IL results for all settings. Note that, as in DER, the Task-IL results merely apply logit masking at inference.

| Buffer Size | S-MNIST Class-IL | S-MNIST Task-IL | S-CIFAR-10 Class-IL | S-CIFAR-10 Task-IL | S-CIFAR-100 Class-IL | S-CIFAR-100 Task-IL | S-TinyImg Class-IL | S-TinyImg Task-IL |
|---|---|---|---|---|---|---|---|---|
| 200 | 89.54±0.21 | 97.97±0.17 | 66.19±0.75 | 93.90±0.60 | 43.80±1.89 | 73.49±1.04 | 23.47±0.80 | 49.60±0.72 |
| 500 | 92.05±0.32 | 98.95±0.10 | 75.22±0.71 | 94.94±0.53 | 51.40±1.00 | 78.12±0.24 | 31.03±0.56 | 60.41±0.50 |
| 5120 | 95.73±0.11 | 99.40±0.04 | 86.78±0.17 | 97.08±0.09 | 65.77±0.49 | 84.46±0.45 | 46.74±0.31 | 75.81±0.35 |
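As noted above, the Task-IL numbers only mask out, at inference time, the output logits that do not belong to the evaluated task. The helper below is a minimal sketch of that masking; it assumes classes are assigned to tasks in contiguous, equally sized blocks, which is how the sequential benchmarks above are split.

```python
import torch


def task_il_predict(logits, task_id, classes_per_task):
    """Mask logits outside the evaluated task's class range, then take the argmax.

    Assumes task 0 owns classes [0, classes_per_task), task 1 the next block, etc.
    """
    start = task_id * classes_per_task
    end = start + classes_per_task
    masked = torch.full_like(logits, float("-inf"))
    masked[:, start:end] = logits[:, start:end]
    return masked.argmax(dim=1)
```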

Setup

Examples (standard continual learning benchmarks):

python main.py --dataset seq-mnist --model clser --buffer_size 500 --load_best_args

python main.py --dataset seq-cifar10 --model clser --buffer_size 500 --load_best_args

python main.py --dataset seq-tinyimg --model clser --buffer_size 500 --load_best_args

python main.py --dataset perm-mnist --model clser --buffer_size 500 --load_best_args

python main.py --dataset rot-mnist --model clser --buffer_size 500 --load_best_args

python main.py --dataset mnist-360 --model clser --buffer_size 500 --load_best_args
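The `--buffer_size` flag used above fixes the capacity of the episodic memory that rehearsal draws from. In Mammoth-style setups this buffer is typically filled with reservoir sampling, so every example seen in the stream has an equal chance of being stored. The snippet below is a generic sketch of that strategy, not the repository's buffer class.

```python
import random


class ReservoirBuffer:
    """Fixed-capacity memory filled with reservoir sampling (illustrative only)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []        # stored (x, y) pairs
        self.num_seen = 0     # examples observed in the stream so far

    def add(self, x, y):
        self.num_seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, y))
        else:
            # keep the new example with probability capacity / num_seen
            idx = random.randrange(self.num_seen)
            if idx < self.capacity:
                self.data[idx] = (x, y)

    def sample(self, batch_size):
        return random.sample(self.data, min(batch_size, len(self.data)))
```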

Examples (GCIL-CIFAR-100):

python main.py --dataset gcil-cifar100 --weight_dist unif --model clser --buffer_size 500 --load_best_args

python main.py --dataset gcil-cifar100 --weight_dist longtail --model clser --buffer_size 500 --load_best_args
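The `--weight_dist` flag controls how per-class sample counts are drawn when the GCIL-CIFAR-100 stream is generated: roughly balanced (`unif`) or skewed towards a few head classes (`longtail`). The sketch below only illustrates the difference between the two regimes; the function name and the particular distributions are assumptions, not the dataset generator's code.

```python
import numpy as np


def draw_class_sample_counts(num_classes, total_samples, weight_dist="unif", seed=0):
    """Illustrative only: draw per-class sample counts for one task.

    'unif' yields roughly balanced classes; 'longtail' skews the counts so a few
    classes dominate (here via exponential decay; the real generator may differ).
    """
    rng = np.random.default_rng(seed)
    if weight_dist == "unif":
        weights = rng.uniform(0.5, 1.5, size=num_classes)
    else:  # "longtail"
        weights = np.exp(-np.linspace(0, 5, num_classes))
        rng.shuffle(weights)
    weights = weights / weights.sum()
    return np.round(weights * total_samples).astype(int)
```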

Requirements

Cite Our Work

If you find the code useful in your research, please consider citing our paper:

@inproceedings{
  arani2022learning,
  title={Learning Fast, Learning Slow: A General Continual Learning Method based on Complementary Learning System},
  author={Elahe Arani and Fahad Sarfraz and Bahram Zonooz},
  booktitle={International Conference on Learning Representations},
  year={2022},
  url={https://openreview.net/forum?id=uxxFrDwrE7Y}
}