Consistency is the Key to Further Mitigating Catastrophic Forgetting in Continual Learning

Official repository for the CoLLAs 2022 paper "Consistency is the Key to Further Mitigating Catastrophic Forgetting in Continual Learning".

This repo is built on top of the Mammoth continual learning framework.

Setup

Examples:

python main.py --seed 10 --dataset seq-cifar10 --img_size 32 --model cr --buffer_size 200 --load_best_args --pretext_task mse

python main.py --seed 10 --dataset seq-cifar100 --img_size 32 --model cr --buffer_size 500 --load_best_args --pretext_task linf

python main.py --seed 10 --dataset seq-tinyimg --img_size 64 --model cr --buffer_size 5120 --load_best_args --pretext_task l1
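The three example commands differ only in dataset, image size, buffer size, and pretext task. A small shell loop (a sketch; the `run` helper is hypothetical and not part of the repo, while the flag values are taken verbatim from the commands above) can sweep all three:

```shell
#!/usr/bin/env sh
# Sketch: print the three example configurations listed above.
# The `run` helper is hypothetical; flag values are copied from this README.
seed=10

run() {
  # Arguments: dataset, img_size, buffer_size, pretext_task
  echo "python main.py --seed $seed --dataset $1 --img_size $2 --model cr --buffer_size $3 --load_best_args --pretext_task $4"
}

run seq-cifar10  32 200  mse
run seq-cifar100 32 500  linf
run seq-tinyimg  64 5120 l1
```

Dropping the `echo` inside `run` would execute the experiments instead of printing the commands.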

Requirements

Cite Our Work

If you find the code useful in your research, please consider citing our paper:

@InProceedings{pmlr-v199-bhat22b,
  title     = {Consistency is the Key to Further Mitigating Catastrophic Forgetting in Continual Learning},
  author    = {Bhat, Prashant Shivaram and Zonooz, Bahram and Arani, Elahe},
  booktitle = {Proceedings of The 1st Conference on Lifelong Learning Agents},
  pages     = {1195--1212},
  year      = {2022},
  editor    = {Chandar, Sarath and Pascanu, Razvan and Precup, Doina},
  volume    = {199},
  series    = {Proceedings of Machine Learning Research},
  month     = {22--24 Aug},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v199/bhat22b/bhat22b.pdf},
  url       = {https://proceedings.mlr.press/v199/bhat22b.html}
}