Self-Promoted Supervision for Few-Shot Transformer
Yet another Few-Shot ViT training framework.
Our code is mainly built on Meta-Baseline; SUN-F and SUN-D are built on the FEAT and DeepEMD codebases, respectively. We sincerely thank the authors for their contributions.
Update
(March 24th, 2023) We uploaded the template for visualizing attention masks; this code generates the attention masks shown in the paper.
Requirements
- PyTorch (>= 1.9)
- TorchVision
- timm (latest)
- einops
- tqdm
- numpy
- scikit-learn
- scipy
- argparse
- tensorboardx
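The list above could be captured in a requirements.txt along these lines (only the PyTorch >= 1.9 floor comes from this README; the rest are unpinned and illustrative):

```text
torch>=1.9
torchvision
timm
einops
tqdm
numpy
scikit-learn
scipy
tensorboardx
```

Note that argparse ships with Python's standard library, so it needs no entry.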
Update
(June 4th, 2022) We uploaded the meta-training and meta-tuning phases of SUN-D.
(June 4th, 2022) We uploaded the teacher training code for the meta-training phase of SUN.
(June 3rd, 2022) We uploaded the meta-tuning phase of SUN-M.
Pretrained Checkpoints
Currently we provide SUN-M (Visformer) checkpoints trained on miniImageNet (5-way 1-shot and 5-way 5-shot); see Google Drive for details.
More pretrained checkpoints coming soon.
Evaluate the Pretrained Checkpoints
Prepare data
For example, miniImageNet:
cd test_phase
Download the miniImageNet dataset (courtesy of Spyros Gidaris)
Unzip the package into materials/mini-imagenet; the directory should then contain the pickle files.
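As a quick sanity check that the dataset landed in the right place, a minimal sketch (the exact pickle filenames vary by dataset release, so this only verifies that pickle files exist under the expected directory; `check_dataset` is a hypothetical helper, not part of this repo):

```python
from pathlib import Path

def check_dataset(root="materials/mini-imagenet"):
    """Return the names of pickle files found under the dataset directory."""
    root = Path(root)
    pickles = sorted(root.glob("*.pickle")) + sorted(root.glob("*.pkl"))
    if not pickles:
        raise FileNotFoundError(
            f"No pickle files found in {root}; unzip the dataset there first.")
    return [p.name for p in pickles]
```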
Prepare pretrained checkpoints
Download the corresponding checkpoints from Google Drive and store them in the test_phase/ directory.
Evaluation
cd test_phase
python test_few_shot.py --config configs/test_1_shot.yaml --shot 1 --gpu 1 # for 1-shot
python test_few_shot.py --config configs/test_5_shot.yaml --shot 5 --gpu 1 # for 5-shot
For 1-shot, you should obtain: test epoch 1: acc=67.80 +- 0.45 (%)
For 5-shot, you should obtain: test epoch 1: acc=83.25 +- 0.28 (%)
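The ± values above are confidence intervals over test episodes; few-shot evaluation commonly reports mean episode accuracy with a 95% interval of 1.96·std/√n. A minimal sketch of that computation (the per-episode accuracies below are made-up numbers, not results from this repo):

```python
import math

def mean_ci95(accs):
    """Mean episode accuracy and 95% confidence half-width (normal approximation)."""
    n = len(accs)
    mean = sum(accs) / n
    var = sum((a - mean) ** 2 for a in accs) / (n - 1)  # sample variance
    half_width = 1.96 * math.sqrt(var / n)
    return mean, half_width

# Example with dummy per-episode accuracies (in %):
accs = [66.0, 70.0, 68.0, 67.0]
m, h = mean_ci95(accs)
print(f"acc={m:.2f} +- {h:.2f} (%)")  # acc=67.75 +- 1.67 (%)
```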
Test accuracy may vary slightly across PyTorch/CUDA versions and hardware.
Citation
@inproceedings{dong2022self,
title={Self-Promoted Supervision for Few-Shot Transformer},
author={Dong, Bowen and Zhou, Pan and Yan, Shuicheng and Zuo, Wangmeng},
booktitle={European Conference on Computer Vision (ECCV)},
year={2022}
}
TODO
- more checkpoints