Leveraging the Feature Distribution in Transfer-based Few-Shot Learning

This repository is the official implementation of Leveraging the Feature Distribution in Transfer-based Few-Shot Learning.

Requirements

To install requirements:

pip install -r requirements.txt

Download the datasets and create the base/val/novel splits (a sanity check on the resulting split files is sketched after this list):

miniImageNet

CUB

CIFAR-FS
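If you want to verify the splits, the sketch below assumes they end up as JSON filelists (base.json / val.json / novel.json) under filelists/<dataset>/ with "image_names" and "image_labels" keys, a layout common in few-shot codebases; adjust the paths and keys to whatever the split scripts actually produce.

```python
# Sanity-check sketch: paths and JSON keys below are assumptions about the
# split files, not guaranteed by this repository.
import json

for split in ('base', 'val', 'novel'):
    with open(f'filelists/miniImagenet/{split}.json') as f:
        data = json.load(f)
    n_classes = len(set(data['image_labels']))
    print(f'{split}: {len(data["image_names"])} images across {n_classes} classes')
```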

Training

To train the feature extractors in the paper, run this command:

For miniImageNet/CUB:

python train.py --dataset [miniImagenet/CUB] --method [S2M2_R/rotation] --model [WideResNet28_10/ResNet18] --train_aug

For CIFAR-FS:

python train_cifar.py --dataset cifar --method [S2M2_R/rotation] --model [WideResNet28_10/ResNet18] --train_aug
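For reference, the rotation self-supervision that the rotation and S2M2_R methods rely on can be sketched as follows. This is a generic illustration rather than the repository's code: `backbone`, `rot_head`, and `rotation_batch` are hypothetical names, and the auxiliary loss is shown in isolation from the usual class-label loss.

```python
# Sketch of a rotation self-supervision auxiliary loss: each image is rotated
# by 0/90/180/270 degrees and an auxiliary head predicts which rotation was
# applied. Names and structure are illustrative, not the repository's API.
import torch
import torch.nn as nn

def rotation_batch(x):
    """Return the batch repeated with 0/90/180/270-degree rotations plus rotation labels."""
    rotated = [torch.rot90(x, k, dims=(2, 3)) for k in range(4)]
    rot_labels = torch.arange(4, device=x.device).repeat_interleave(x.size(0))
    return torch.cat(rotated, dim=0), rot_labels

def rotation_aux_loss(backbone, rot_head, x):
    """Cross-entropy on the 4-way rotation prediction (auxiliary task only)."""
    x_rot, rot_labels = rotation_batch(x)
    feats = backbone(x_rot)      # (4*B, feat_dim)
    logits = rot_head(feats)     # (4*B, 4)
    return nn.functional.cross_entropy(logits, rot_labels)
```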

Evaluation

To evaluate the trained models on miniImageNet/CUB/CIFAR-FS/cross-domain, run:

python test_standard.py
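As background for what test_standard.py evaluates, here is a minimal sketch of the two feature-distribution ingredients suggested by the paper title and the references below: a power transform that makes the extracted features closer to Gaussian, and Sinkhorn-style scaling that produces a balanced soft assignment of query features to class centers. Function names, hyperparameters, and the exact update rule are illustrative assumptions, not the repository's implementation.

```python
# Illustrative sketch only: power transform + Sinkhorn-style balanced assignment.
import torch

def power_transform(features, beta=0.5, eps=1e-6):
    """Element-wise power transform on non-negative features, then L2-normalize."""
    f = torch.pow(features + eps, beta)
    return f / f.norm(dim=-1, keepdim=True)

def sinkhorn_assignment(cost, n_iters=20, reg=0.1):
    """Balanced soft assignment of queries (rows) to classes (cols) via Sinkhorn scaling."""
    n_q, n_c = cost.shape
    P = torch.exp(-cost / reg)
    for _ in range(n_iters):
        P = P / P.sum(dim=1, keepdim=True) / n_q   # row marginals: each query carries mass 1/n_q
        P = P / P.sum(dim=0, keepdim=True) / n_c   # column marginals: each class receives mass 1/n_c
    return P * n_q                                  # rows now sum (approximately) to 1: per-query class probabilities
```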

Pre-trained Models

You can download pretrained models and extracted features here:

To extract and save the novel-class features of a newly trained backbone, run:

python save_plk.py --dataset [miniImagenet/CUB] --method S2M2_R --model [trainedmodel]
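The snippet below is a hypothetical way to inspect the saved features. The file path is a placeholder and the pickle layout (a dict mapping each novel-class label to its feature vectors) is an assumption, so adapt both to what save_plk.py actually writes in your checkout.

```python
# Hypothetical loader for the features written by save_plk.py; path and
# pickle structure are assumptions, not guaranteed by the repository.
import pickle
import numpy as np

with open('path/to/novel_features.plk', 'rb') as f:   # placeholder path
    features = pickle.load(f)

for label, vecs in features.items():
    vecs = np.asarray(vecs)
    print(f'class {label}: {vecs.shape[0]} features of dimension {vecs.shape[1]}')
```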

Results

Our model (WideResNet28_10 backbone) achieves the following performance:

| Dataset | 1-shot Accuracy | 5-shot Accuracy |
| --- | --- | --- |
| miniImageNet | 82.92 ± 0.26% | 88.82 ± 0.13% |
| tieredImageNet | 85.41 ± 0.25% | 90.44 ± 0.14% |
| CUB | 91.55 ± 0.19% | 93.99 ± 0.10% |
| CIFAR-FS | 87.69 ± 0.23% | 90.68 ± 0.15% |
| cross-domain | 62.49 ± 0.32% | 76.51 ± 0.18% |

References

A Closer Look at Few-shot Classification

Charting the Right Manifold: Manifold Mixup for Few-shot Learning

Manifold Mixup: Better Representations by Interpolating Hidden States

Sinkhorn Distances: Lightspeed Computation of Optimal Transport

SimpleShot: Revisiting Nearest-Neighbor Classification for Few-Shot Learning

Notes on optimal transport