# GDumb

This repository contains simplified code for the paper:

GDumb: A Simple Approach that Questions Our Progress in Continual Learning, ECCV 2020 (Oral: Top 2%)
Ameya Prabhu, Philip Torr, Puneet Dokania

[PDF] [Slides] [Bibtex]

<p align="center"> <a href="url"><img src="https://github.com/drimpossible/GDumb/blob/master/Model.png" height="300" width="381" ></a> </p>
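
GDumb combines two deliberately simple ingredients: a greedy sampler that maintains a class-balanced memory of past samples, and a learner trained from scratch on that memory alone at test time. For intuition, here is a minimal sketch of the sampler; the names are illustrative, not this repository's API:

```python
import random

class GreedyBalancingSampler:
    """Sketch of GDumb's greedy class-balancing memory."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.memory = {}  # class label -> list of stored samples

    def __len__(self):
        return sum(len(samples) for samples in self.memory.values())

    def add(self, sample, label):
        stored = self.memory.setdefault(label, [])
        # Each seen class gets roughly an equal share of the memory budget.
        quota = self.capacity / len(self.memory)
        if len(self) < self.capacity:
            stored.append(sample)  # memory not yet full: always store
        elif len(stored) < quota:
            # Memory full but this class is under-represented: make room by
            # evicting a random sample from the currently largest class.
            largest = max(self.memory, key=lambda c: len(self.memory[c]))
            victims = self.memory[largest]
            victims.pop(random.randrange(len(victims)))
            stored.append(sample)
        # Otherwise the class is already at its quota and the sample is dropped.
```

At evaluation time the model is trained only on the contents of this memory, which is what makes GDumb agnostic to task boundaries during streaming.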

## Installation and Dependencies

```bash
# First, activate a new virtual environment
$ pip3 install -r requirements.txt
```

## Usage

```bash
$ python main.py --dataset CIFAR100 --num_classes_per_task 5 --num_tasks 20 --memory_size 500 --num_passes 256 --regularization cutmix --model ResNet --depth 32 --exp_name my_experiment_name
```

Given a dataset and model, you can freely tweak arguments such as the number of classes per task (`--num_classes_per_task`), the number of tasks (`--num_tasks`), the memory size (`--memory_size`), the number of passes over the memory (`--num_passes`), and the regularization (`--regularization`).

To add your favorite dataset, follow the pattern used for the existing dataset options. Additional details and default hyperparameters can be found in `src/opts.py`.
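
The example command above enables `--regularization cutmix`. CutMix (Yun et al., 2019) pastes a random patch from one image in the batch onto another and mixes the labels in proportion to the patch area. Below is a minimal sketch of forming a CutMix batch, for illustration rather than this repository's exact implementation:

```python
import numpy as np
import torch

def cutmix_batch(images, labels, alpha=1.0):
    """Mix each image with a random partner by pasting a rectangular patch.

    Returns the mixed images, both label tensors, and the mixing weight lam,
    so the loss can be computed as lam * loss(labels_a) + (1 - lam) * loss(labels_b).
    """
    lam = np.random.beta(alpha, alpha)
    perm = torch.randperm(images.size(0))
    h, w = images.shape[2], images.shape[3]
    # Sample a patch whose area is roughly (1 - lam) of the image.
    cut_h, cut_w = int(h * np.sqrt(1 - lam)), int(w * np.sqrt(1 - lam))
    cy, cx = np.random.randint(h), np.random.randint(w)
    y1, y2 = np.clip(cy - cut_h // 2, 0, h), np.clip(cy + cut_h // 2, 0, h)
    x1, x2 = np.clip(cx - cut_w // 2, 0, w), np.clip(cx + cut_w // 2, 0, w)
    images[:, :, y1:y2, x1:x2] = images[perm, :, y1:y2, x1:x2]
    # Correct lam to the exact area of the pasted patch.
    lam = 1.0 - ((y2 - y1) * (x2 - x1) / (h * w))
    return images, labels, labels[perm], lam
```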

To replicate the results reported in the paper, run:

```bash
$ bash replicate.sh $SEED
```

Similarly, other scripts can replicate results for specific formulations.
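
Depending on the formulation being replicated, inference may be restricted to a subset of classes, e.g. only the classes of the queried task in task-incremental evaluation, or all classes seen so far in class-incremental evaluation. GDumb handles this by masking the classifier's logits at inference. A minimal PyTorch sketch with illustrative names:

```python
import torch

def masked_predict(logits, allowed_classes):
    """Predict only among `allowed_classes` by masking all other logits.

    logits: (batch, num_total_classes) output of the classifier.
    allowed_classes: indices of classes valid under the current evaluation.
    """
    mask = torch.full_like(logits, float('-inf'))
    mask[:, allowed_classes] = 0.0
    return (logits + mask).argmax(dim=1)
```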

## Results

After running `replicate.sh`, you should get results similar to these:

| Model | Memory size (k) | Paper table(s) | Accuracy (%) |
|---|---|---|---|
| MNIST-MLP-100 | 300 | 3, 8 | 89.1 ± 0.4 |
| MNIST-MLP-100 | 500 | 3 | 90.2 ± 0.4 |
| MNIST-MLP-400 | 500 | 4 | 91.9 ± 0.5 |
| MNIST-MLP-400 | 4400 | 5, 6 | 97.8 ± 0.1 |
| SVHN-ResNet18 | 4400 | 3 | 93.4 ± 0.1 |
| CIFAR10-ResNet18 | 200 | 3 | 35.0 ± 0.4 |
| CIFAR10-ResNet18 | 500 | 3, 4, 8 | 45.4 ± 1.9 |
| CIFAR10-ResNet18 | 1000 | 3, 4 | 61.2 ± 1.0 |
| CIFAR100-ResNet32 | 2000 | 5 | 24.3 ± 0.4 |
| TinyImageNet-DenseNet-100-12-BC | 9000 | 6 | 57.32 (best of 3) |

## Extensibility to other setups

If you discover any bugs in the code, please contact me; I will cross-check them with my nightmares.

## Citation

We hope GDumb serves as a strong baseline for comparison, and that the sampler and masking introduced here are useful for your cool CL formulation! To cite our work:

```bibtex
@inproceedings{prabhu2020greedy,
  title={GDumb: A Simple Approach that Questions Our Progress in Continual Learning},
  author={Prabhu, Ameya and Torr, Philip and Dokania, Puneet},
  booktitle={The European Conference on Computer Vision (ECCV)},
  month={August},
  year={2020}
}
```