LaplacianShot: Laplacian Regularized Few Shot Learning

This repository contains the code for LaplacianShot. The code is adapted from the SimpleShot GitHub repository.

More details can be found in the following ICML 2020 paper:

Laplacian Regularized Few-shot Learning
Imtiaz Masud Ziko, Jose Dolz, Eric Granger and Ismail Ben Ayed
In ICML 2020.

Introduction

We propose LaplacianShot for few-shot learning tasks. It integrates two types of potentials: (1) a unary potential assigning each query sample to the nearest class prototype, and (2) pairwise Laplacian potentials encouraging nearby query samples to have consistent label assignments.
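Concretely, as we understand the paper's formulation, LaplacianShot minimizes the following transductive objective over soft query assignments $\mathbf{y}_q$ (points on the probability simplex), where $a_{qc}$ is the distance from query feature $\mathbf{x}_q$ to class prototype $\mathbf{m}_c$ and $w_{qp}$ is an affinity between query features:

```latex
\min_{Y} \; \sum_{q} \mathbf{y}_q^{\top} \mathbf{a}_q
  \;+\; \frac{\lambda}{2} \sum_{q,p} w_{qp}\, \lVert \mathbf{y}_q - \mathbf{y}_p \rVert^{2},
\qquad a_{qc} = d(\mathbf{x}_q, \mathbf{m}_c).
```

The first term pulls each query toward its nearest prototype; the second (Laplacian) term smooths the assignments of neighboring queries.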

LaplacianShot is applied at inference time in few-shot scenarios, after standard cross-entropy training of a deep convolutional network on the base classes. In fact, it can be applied on top of any learned feature embeddings.
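To illustrate the idea, here is a minimal NumPy sketch of this kind of inference: a unary nearest-prototype term combined with a kNN Laplacian term, relaxed with iterative closed-form soft-assignment updates. The function name, the binary kNN affinity, and the hyperparameter values are our own illustrative choices, not the repository's actual API; see `src/` and `configuration.py` for the real implementation.

```python
import numpy as np

def laplacian_shot(query, prototypes, knn=3, lam=1.0, n_iter=20):
    """Toy transductive assignment: unary prototype term + kNN Laplacian term.

    query: (Q, D) query embeddings; prototypes: (C, D) class prototypes.
    Returns a (Q,) array of predicted class indices.
    """
    # Unary term: squared Euclidean distance to each class prototype.
    d = ((query[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)  # (Q, C)

    # Pairwise term: symmetric binary kNN affinity among query points.
    dist_q = ((query[:, None, :] - query[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(dist_q, np.inf)
    idx = np.argsort(dist_q, axis=1)[:, :knn]          # indices of kNN
    W = np.zeros_like(dist_q)
    rows = np.repeat(np.arange(len(query)), knn)
    W[rows, idx.ravel()] = 1.0
    W = (W + W.T) / 2                                  # symmetrize

    # Initialize soft assignments from the unary term alone.
    Y = np.exp(-d)
    Y /= Y.sum(1, keepdims=True)

    # Iterative updates: each step balances closeness to the prototypes
    # against agreement with the current labels of the kNN neighbours.
    for _ in range(n_iter):
        pairwise = W @ Y                               # (Q, C) neighbour votes
        logits = -d + lam * pairwise
        Y = np.exp(logits - logits.max(1, keepdims=True))
        Y /= Y.sum(1, keepdims=True)
    return Y.argmax(1)
```

With well-separated clusters, the Laplacian term simply reinforces the prototype assignments; its benefit shows when some queries fall near the decision boundary and are pulled toward the label of their neighbors.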

Usage

1. Dependencies

The code was tested with the dependencies listed in requirements.txt. Install them with:

pip install -r requirements.txt

2. Datasets

2.1 Mini-ImageNet

You can download the dataset from here. Unpack it into the data/ directory.

2.2 Tiered-ImageNet

You can download Tiered-ImageNet from here. Unpack the dataset into the data/ directory, then run the following script to generate the split files:

python src/utils/tieredImagenet.py --data path-to-tiered --split split/tiered/

2.3 CUB

Download and unpack CUB 200-2011 from here into the data/ directory. Then run the following script to generate the split files:

python src/utils/cub.py --data path-to-cub --split split/cub/

2.4 iNat2017

We follow the instructions from https://github.com/daviswer/fewshotlocal. Download and unpack the iNat2017 training and validation images, along with the training bounding-box annotations, into the data/iNat directory from here. Also download traincatlist.pth and testcatlist.pth into the same directory from here. Then run the following to set up the dataset:

cd ./data/iNat
python iNat_setup.py

Then run the following script to generate the split files:

python ./src/inatural_split.py --data path-to-inat/setup --split ./split/inatural/

3. Train and Test

You can download the pretrained network models from here.

Alternatively, to train the network on the base classes from scratch, remove the "--evaluate" option in the following script. To test LaplacianShot, run:

sh run.sh

You can change the commented options accordingly for each dataset. All of the options are described in the configuration.py file.

Results

We obtain the following results on different few-shot benchmarks:

On mini-ImageNet

With the WRN network:

| Methods | 1-shot | 5-shot |
| --- | --- | --- |
| ProtoNet (Snell et al., 2017) | 62.60 | 79.97 |
| CC+rot (Gidaris et al., 2019) | 62.93 | 79.87 |
| MatchingNet (Vinyals et al., 2016) | 64.03 | 76.32 |
| FEAT (Ye et al., 2020) | 65.10 | 81.11 |
| Transductive tuning (Dhillon et al., 2020) | 65.73 | 78.40 |
| SimpleShot (Wang et al., 2019) | 65.87 | 82.09 |
| SIB (Hu et al., 2020) | 70.0 | 79.2 |
| BD-CSPN (Liu et al., 2019) | 70.31 | 81.89 |
| LaplacianShot (ours) | 73.44 | 83.93 |

On tiered-ImageNet

With the WRN network:

| Methods | 1-shot | 5-shot |
| --- | --- | --- |
| CC+rot (Gidaris et al., 2019) | 70.53 | 84.98 |
| FEAT (Ye et al., 2020) | 70.41 | 84.38 |
| Transductive tuning (Dhillon et al., 2020) | 73.34 | 85.50 |
| SimpleShot (Wang et al., 2019) | 70.90 | 85.76 |
| BD-CSPN (Liu et al., 2019) | 78.74 | 86.92 |
| LaplacianShot (ours) | 78.80 | 87.48 |

On CUB

With the ResNet-18 network:

| Methods | 1-shot | 5-shot |
| --- | --- | --- |
| MatchingNet (Vinyals et al., 2016) | 73.49 | 84.45 |
| MAML (Finn et al., 2017) | 68.42 | 83.47 |
| ProtoNet (Snell et al., 2017) | 72.99 | 86.64 |
| RelationNet (Sung et al., 2018) | 68.58 | 84.05 |
| Chen (Chen et al., 2019) | 67.02 | 83.58 |
| SimpleShot (Wang et al., 2019) | 70.28 | 86.37 |
| LaplacianShot (ours) | 79.90 | 88.69 |

On iNat

With the WRN network, Top-1 accuracy per class and Top-1 mean accuracy:

| Methods | Per Class | Mean |
| --- | --- | --- |
| SimpleShot (Wang et al., 2019) | 62.44 | 65.08 |
| LaplacianShot (ours) | 71.55 | 74.97 |