Self-Supervision Can Be a Good Few-Shot Learner

<p align="center"> <img src="https://user-images.githubusercontent.com/60600462/179650280-165db647-31d9-42b1-a69f-f726f2f0c12d.png" width="700"> </p>

This is a PyTorch re-implementation of the paper Self-Supervision Can Be a Good Few-Shot Learner (ECCV 2022). If you find this repository useful, please consider citing:

@inproceedings{Lu2022Self,
	title={Self-Supervision Can Be a Good Few-Shot Learner},
	author={Lu, Yuning and Wen, Liangjian and Liu, Jianzhuang and Liu, Yajing and Tian, Xinmei},
	booktitle={European Conference on Computer Vision (ECCV)},
	year={2022}
}

Data Preparation

mini-ImageNet: download the dataset and pass its folder to --data_path.

tiered-ImageNet: generate the split from your ImageNet train folder with the provided script:

python ./split/tiered_split.py \
  --imagenet_path [your imagenet-train-folder]
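
The split script expects a standard ImageNet train layout, i.e. one sub-directory per class. Below is a hypothetical sanity check (not part of the repo; the path is a placeholder you would replace) to confirm the folder you pass as --imagenet_path looks right before running the split:

# Hypothetical sanity check (not part of the repo): confirm the folder passed
# as --imagenet_path contains one sub-directory per ImageNet class.
from pathlib import Path

imagenet_train = Path("/path/to/imagenet/train")  # replace with your folder
class_dirs = [d for d in imagenet_train.iterdir() if d.is_dir()]
print(f"found {len(class_dirs)} class folders")   # expect 1000 for ImageNet-1k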

Unsupervised Training

Only DataParallel training is supported.

Run python ./train.py --data_path [your DATA FOLDER] --dataset [DATASET NAME] --backbone [BACKBONE] [optional arguments]

For example, to train a UniSiam model with a ResNet-50 backbone and strong data augmentations on the mini-ImageNet dataset with 4 V100 GPUs:

python train.py \
  --dataset miniImageNet \
  --backbone resnet50 \
  --lrd_step \
  --data_path [your mini-imagenet-folder] \
  --save_path [your save-folder]

More configs can be found in ./config.
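
For orientation, the command above optimizes a negative-free self-supervised objective over two augmented views of each image. The snippet below is only a conceptual sketch of the SimSiam-style core that this family of losses builds on, not the repo's exact UniSiam loss; the batch size and 2048-d feature dimension are illustrative assumptions.

import torch
import torch.nn.functional as F

def neg_cosine(p, z):
    # negative cosine similarity with a stop-gradient on the target branch z
    return -F.cosine_similarity(p, z.detach(), dim=-1).mean()

# z1, z2: projector outputs of two augmented views (random tensors for illustration)
# p1, p2: predictor outputs of the corresponding views
z1, z2 = torch.randn(8, 2048), torch.randn(8, 2048)
p1, p2 = torch.randn(8, 2048), torch.randn(8, 2048)
loss = 0.5 * (neg_cosine(p1, z2) + neg_cosine(p2, z1))
print(loss)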

Unsupervised Training with Distillation

Run python ./train.py --teacher_path [your TEACHER MODEL] --data_path [your DATA FOLDER] --dataset [DATASET NAME] --backbone [BACKBONE] [optional arguments]

With a pre-trained UniSiam (teacher) model, to train a UniSiam model with a ResNet-18 backbone on the mini-ImageNet dataset with 4 V100 GPUs:

python train.py \
  --dataset miniImageNet \
  --backbone resnet18 \
  --lrd_step \
  --data_path [your mini-imagenet-folder] \
  --save_path [your save-folder] \
  --teacher_path [your teacher-model-path]

More configs can be found in ./config. You can train a ResNet-50 teacher model yourself, or simply download the provided pre-trained model and use it as the teacher.
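
Conceptually, the distillation run keeps the ResNet-50 teacher frozen and trains the ResNet-18 student to match the teacher's embeddings alongside its own self-supervised objective. The sketch below only illustrates that idea; the loss form and the shared 512-d feature dimension are assumptions, not the repo's exact implementation.

import torch
import torch.nn.functional as F

@torch.no_grad()
def teacher_targets(teacher, images):
    # the frozen teacher produces L2-normalized target embeddings
    teacher.eval()
    return F.normalize(teacher(images), dim=-1)

def distill_loss(student_feats, teacher_feats):
    # align normalized student features with the teacher's targets (negative cosine)
    return -(F.normalize(student_feats, dim=-1) * teacher_feats).sum(dim=-1).mean()

# illustrative usage with random tensors; in practice a projection head maps
# the student and teacher features to a common dimension
s = torch.randn(8, 512)
t = F.normalize(torch.randn(8, 512), dim=-1)
print(distill_loss(s, t))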

Models

Our pre-trained ResNet models (trained with 224x224 images and strong augmentations) can be downloaded from Google Drive.
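
A minimal sketch of loading a downloaded checkpoint for feature extraction, assuming the weights fit a standard torchvision ResNet-50; the file name, checkpoint key names, and "encoder." prefix below are guesses, so check the repo's saving/loading code for the actual format.

import torch
import torchvision

ckpt = torch.load("/path/to/downloaded_checkpoint.pth", map_location="cpu")  # hypothetical file name
state_dict = ckpt.get("model", ckpt)  # some checkpoints nest the weights under "model"

backbone = torchvision.models.resnet50()
backbone.fc = torch.nn.Identity()  # keep the 2048-d pooled features
# strip a possible "encoder." prefix; strict=False tolerates projector/predictor keys
msg = backbone.load_state_dict(
    {k.replace("encoder.", ""): v for k, v in state_dict.items()}, strict=False)
print(msg.missing_keys, msg.unexpected_keys)

backbone.eval()
with torch.no_grad():
    feats = backbone(torch.randn(2, 3, 224, 224))
print(feats.shape)  # torch.Size([2, 2048])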

Acknowledgements

Some code is borrowed from SimSiam, SupContrast, (unofficial) SimCLR, and RFS.