Awesome

This codebase has evolved into a new, well-maintained project that includes more SOTA methods. Please refer to PyCIL: A Python Toolbox for Class-Incremental Learning for more information.

Implementation of continual learning methods

This repository implements several continual / incremental / lifelong learning methods in PyTorch, with a particular focus on memory-replay-based methods.

Dependencies

  1. torch 1.7.1
  2. torchvision 0.8.2
  3. tqdm
  4. numpy
  5. scipy

Usage

Run experiment

  1. Edit the config.json file for global settings.
  2. Edit the hyperparameters in the corresponding .py file (e.g., models/icarl.py).
  3. Run:
python main.py
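
The field names below are purely illustrative — check the actual config.json in the repository for the real keys. Global settings for this kind of class-incremental setup usually cover the dataset, the incremental split, the memory budget, and the device:

```json
{
  "dataset": "cifar100",
  "init_cls": 10,
  "increment": 10,
  "memory_size": 2000,
  "device": "0"
}
```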

Add datasets

  1. Add corresponding classes to utils/data.py.
  2. Modify the _get_idata function in utils/data_manager.py.

Results

iCaRL

CIFAR100

<img src="https://github.com/zhchuu/continual-learning-reproduce/blob/master/resources/iCaRL_cifar100_10.png" width = "325"/><img src="https://github.com/zhchuu/continual-learning-reproduce/blob/master/resources/iCaRL_cifar100_20.png" width = "325"/><img src="https://github.com/zhchuu/continual-learning-reproduce/blob/master/resources/iCaRL_cifar100_50.png" width = "325"/>

Average accuracies of CIFAR-100 (iCaRL):

| Increments | Paper reported | Reproduced |
| --- | --- | --- |
| 10 classes | 64.1 | 63.10 |
| 20 classes | 67.2 | 65.25 |
| 50 classes | 68.6 | 67.69 |
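
For reference, the "average accuracy" reported here is the average incremental accuracy: the top-1 accuracy measured after each incremental step, averaged over all steps. A minimal sketch (the per-step accuracies below are made-up numbers, not results from this repository):

```python
def average_incremental_accuracy(step_accuracies):
    """Mean of the top-1 accuracies measured after each incremental step."""
    return sum(step_accuracies) / len(step_accuracies)

# e.g. a 10-step run with 10 new classes per step (illustrative numbers):
accs = [88.0, 80.5, 74.2, 70.1, 67.9, 65.0, 62.3, 60.8, 58.4, 56.1]
print(round(average_incremental_accuracy(accs), 2))  # -> 68.33
```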

UCIR

CIFAR100

<img src="https://github.com/zhchuu/continual-learning-reproduce/blob/master/resources/UCIR_CNN_cifar100_5.png" width = "325"/><img src="https://github.com/zhchuu/continual-learning-reproduce/blob/master/resources/UCIR_NCM_cifar100_5.png" width = "325"/><img src="https://github.com/zhchuu/continual-learning-reproduce/blob/master/resources/UCIR_CNN_cifar100_10.png" width = "325"/><img src="https://github.com/zhchuu/continual-learning-reproduce/blob/master/resources/UCIR_NCM_cifar100_10.png" width = "325"/>

ImageNet-Subset

<img src="https://github.com/zhchuu/continual-learning-reproduce/blob/master/resources/UCIR_CNN_imagenet_subset_5.png" width = "325"/><img src="https://github.com/zhchuu/continual-learning-reproduce/blob/master/resources/UCIR_NME_imagenet_subset_5.png" width = "325"/><img src="https://github.com/zhchuu/continual-learning-reproduce/blob/master/resources/UCIR_CNN_imagenet_subset_10.png" width = "325"/><img src="https://github.com/zhchuu/continual-learning-reproduce/blob/master/resources/UCIR_NME_imagenet_subset_10.png" width = "325"/>

BiC

ImageNet-1000

| Classes | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Paper reported (BiC) | 94.1 | 92.5 | 89.6 | 89.1 | 85.7 | 83.2 | 80.2 | 77.5 | 75.0 | 73.2 |
| Reproduce | 94.3 | 91.6 | 89.6 | 87.5 | 85.6 | 84.3 | 82.2 | 79.4 | 76.7 | 74.1 |

PODNet

CIFAR100

NME results are shown below; the reproduced results are not in line with the reported ones. Maybe I missed something...

| Classifier | Steps | Reported (%) | Reproduced (%) |
| --- | --- | --- | --- |
| Cosine (k=1) | 50 | 56.69 | 55.49 |
| LSC-CE (k=10) | 50 | 59.86 | 55.69 |
| LSC-NCA (k=10) | 50 | 61.40 | 56.50 |
| LSC-CE (k=10) | 25 | ----- | 59.16 |
| LSC-NCA (k=10) | 25 | 62.71 | 59.79 |
| LSC-CE (k=10) | 10 | ----- | 62.59 |
| LSC-NCA (k=10) | 10 | 64.03 | 62.81 |
| LSC-CE (k=10) | 5 | ----- | 64.16 |
| LSC-NCA (k=10) | 5 | 64.48 | 64.37 |

Change log

Some problems

Q: Why can't I reproduce the results of the paper by this repository?

A: In my opinion, the results of these methods can be affected by the incremental class order. You can either generate more orders and average their results, or increase the number of training iterations (adjust the hyperparameters).
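
Averaging over incremental orders can be done by shuffling the class list under several seeds and running the experiment once per order. A minimal sketch (the function name and seed values are illustrative, not part of this repository):

```python
import random

def make_class_orders(num_classes=100, seeds=(1993, 1994, 1995)):
    """Generate one shuffled class order per seed; run the experiment once
    per order and average the resulting accuracy curves."""
    orders = []
    for seed in seeds:
        order = list(range(num_classes))
        random.Random(seed).shuffle(order)  # deterministic per-seed shuffle
        orders.append(order)
    return orders
```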

References

https://github.com/arthurdouillard/incremental_learning.pytorch