When Prompt-based Incremental Learning Does Not Meet Strong Pretraining (ICCV2023)

Official PyTorch implementation of our ICCV 2023 paper “When Prompt-based Incremental Learning Does Not Meet Strong Pretraining” [paper]

Environments

Data preparation

Download Datasets

Path structure

After downloading, the datasets should be organized as follows:

```
datasets
├── imagenet
│   ├── train
│   │   ├── n01440764
│   │   ├── n01443537
│   │   └── ...
│   ├── val
│   │   ├── n01440764
│   │   ├── n01443537
│   │   └── ...
│   ├── train_100.txt
│   ├── train_900.txt
│   └── val_100.txt
├── cifar-100-python
│   └── ...
├── my-imagenet-r
│   └── ...
├── my-EuroSAT_RGB
│   └── ...
└── my-NWPU-RESISC45
    └── ...
```
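As a quick sanity check, the expected layout can be verified with a short script. This is a sketch, not part of the repo; the `datasets` root and entry names are taken from the tree above.

```python
# Sketch: check that the dataset tree above is in place before training.
# The REQUIRED entries mirror the tree in this README; adjust if your layout differs.
from pathlib import Path

REQUIRED = [
    "imagenet/train", "imagenet/val",
    "imagenet/train_100.txt", "imagenet/train_900.txt", "imagenet/val_100.txt",
    "cifar-100-python", "my-imagenet-r", "my-EuroSAT_RGB", "my-NWPU-RESISC45",
]

def check_layout(root="datasets"):
    """Return the entries from the tree that are missing under `root`."""
    root = Path(root)
    return [entry for entry in REQUIRED if not (root / entry).exists()]

if __name__ == "__main__":
    missing = check_layout()
    print("layout OK" if not missing else f"missing entries: {missing}")
```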

Training

Non-pretrained incremental learning

ImageNet-100

This is the default class-incremental learning setting, in which the network is trained from scratch.

Notes for step 1:

Since this setting does not use any pretrained weights, we treat the first task as a pretraining task. For the ViT we use, it is difficult to train the model from scratch to match the first-task performance of CNNs (ResNets), so we use a ResNet teacher to assist the first-task training. The ResNet teacher can be trained with repos such as PODNet or our previous work Imagine.
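The teacher-assisted first-task training described above is a form of knowledge distillation. As a rough illustration only (this is not the repo's actual code; the function names, the KL form, and the temperature value are assumptions), a standard temperature-scaled distillation loss looks like:

```python
# Illustrative knowledge-distillation loss: a CNN teacher's softened class
# distribution supervises the ViT student. The repo's exact objective may differ.
import math

def softmax(logits, temperature=1.0):
    """Numerically stable softmax over temperature-scaled logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions."""
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's predictions
    # Scaled by T^2 (as in Hinton et al.) so gradient magnitudes stay comparable.
    return temperature ** 2 * sum(pi * math.log(pi / qi)
                                  for pi, qi in zip(p, q) if pi > 0)
```

Identical student and teacher logits give zero loss; any mismatch gives a positive loss that pushes the student toward the teacher's distribution.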

For simplicity, we provide the ViT weights trained on the first task, which can be found at the link above. Running the script above will load these weights. If you want to train the ViT on the first task yourself, run:

```
bash runs/non_pretrained/imagnet_pretrain/run_nonPretrained_imageNetSub_B50_teacher.sh 0,1,2,3 10241
```

After training, put the checkpoint in the chkpts folder and modify the path in the script above (line 28).

CIFAR100

Notes for step 1:

As with ImageNet, here is the script for training the first stage:

```
bash runs/non_pretrained/cifar_pretrain/run_nonPretrained_CIFAR100_B50_teacher.sh 0,1,2,3 10241
```

Pretrained incremental learning

To compare our method with other prompt-based methods, we also conduct experiments with pretrained weights (ImageNet-21k and TinyImageNet pretraining).

Download the pretrained checkpoints

Download the weights from the following links and put them in the chkpts folder. We use the pretrained weight (deit_base_patch16_224-b5f2ef4d.pth) from the DeiT repo and the Tiny-ImageNet weight from this repo.
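Before launching training, it can help to confirm the weights are actually in place. A minimal sketch (not part of the repo): the DeiT filename is the one named above, while the Tiny-ImageNet weight's filename depends on its source repo, so only the former is listed here.

```python
# Sketch: list which expected pretrained checkpoints are still missing from chkpts/.
# Only the DeiT filename is known from the README; extend EXPECTED as needed.
from pathlib import Path

CHKPT_DIR = Path("chkpts")
EXPECTED = ["deit_base_patch16_224-b5f2ef4d.pth"]

def missing_checkpoints(chkpt_dir=CHKPT_DIR, expected=EXPECTED):
    """Return the expected checkpoint files not yet present in `chkpt_dir`."""
    return [name for name in expected if not (Path(chkpt_dir) / name).exists()]

if __name__ == "__main__":
    missing = missing_checkpoints()
    print("all checkpoints present" if not missing else f"download: {missing}")
```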

ImageNet-R

CIFAR100

EuroSAT and RESISC45

Todo list

- [x] Non-pretrained incremental learning on CIFAR100
- [x] Pretrained incremental learning on ImageNetR
- [x] Pretrained incremental learning on EuroSAT and RESISC45

Acknowledgement

Citation

```
@inproceedings{tang2022learning,
  title={When Prompt-based Incremental Learning Does Not Meet Strong Pretraining},
  author={Tang, Yu-Ming and Peng, Yi-Xing and Zheng, Wei-Shi},
  booktitle={Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
  year={2023}
}
```