# LaplacianShot: Laplacian Regularized Few-Shot Learning

This repository contains the code for LaplacianShot. The code is adapted from the SimpleShot GitHub repository.

For more details, see the following ICML 2020 paper:

**Laplacian Regularized Few-Shot Learning**
Imtiaz Masud Ziko, Jose Dolz, Eric Granger and Ismail Ben Ayed
In ICML 2020.
## Introduction
We propose LaplacianShot for few-shot learning tasks. It integrates two types of potentials: (1) unary potentials assigning each query sample to the nearest class prototype, and (2) pairwise Laplacian potentials encouraging nearby query samples to have consistent predictions.

LaplacianShot is applied purely at inference, after standard training of a deep convolutional network on the base classes with the cross-entropy loss. In fact, it can be used at inference on top of any learned feature embedding.
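The inference step above can be sketched as follows. This is a minimal NumPy illustration of the idea, not the repository's implementation: the binary kNN affinity, the regularization weight `lam`, and the iteration count are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with the usual max-shift for numerical stability."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def laplacian_shot(query_feats, prototypes, knn=3, lam=1.0, n_iter=20):
    """Laplacian-regularized soft assignment of queries to class prototypes.

    query_feats: (n_query, d) query embeddings
    prototypes:  (n_way, d) class prototypes (e.g., support-set means)
    Returns soft assignments y of shape (n_query, n_way).
    """
    n_query = len(query_feats)
    # Unary potentials: squared Euclidean distance to each class prototype.
    a = ((query_feats[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    # Pairwise term: binary kNN affinity over the query set, symmetrized.
    d2 = ((query_feats[:, None, :] - query_feats[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    nn = np.argsort(d2, axis=1)[:, :knn]
    W = np.zeros((n_query, n_query))
    W[np.repeat(np.arange(n_query), knn), nn.ravel()] = 1.0
    W = np.maximum(W, W.T)
    # Bound-optimization iterations: each update decouples over the queries,
    # pulling each query's assignment toward those of its neighbours.
    y = softmax(-a)  # initialize from the unary potentials alone
    for _ in range(n_iter):
        y = softmax(-a + lam * (W @ y))
    return y
```

With `lam=0` this reduces to nearest-prototype classification; the Laplacian term smooths the predictions over the query neighbourhood graph.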
## Usage
### 1. Dependencies

The code has been tested with:

- Python 3.6
- PyTorch 1.2

Install the remaining dependencies by running:

```
pip install -r requirements.txt
```
### 2. Datasets
#### 2.1 Mini-ImageNet

You can download the dataset from here. Unpack the dataset into the `data/` directory.
#### 2.2 Tiered-ImageNet

You can download Tiered-ImageNet from here. Unpack this dataset in the `data/` directory, then run the following script to generate the split files:

```
python src/utils/tieredImagenet.py --data path-to-tiered --split split/tiered/
```
#### 2.3 CUB

Download and unpack CUB 200-2011 from here into the `data/` directory, then run the following script to generate the split files:

```
python src/utils/cub.py --data path-to-cub --split split/cub/
```
#### 2.4 iNat2017

We follow the instructions from https://github.com/daviswer/fewshotlocal. Download and unpack the iNat2017 training and validation images, along with the training bounding-box annotations, into the `data/iNat` directory from here. Also download traincatlist.pth and testcatlist.pth into the same directory from here. Then run the following to set up the dataset:

```
cd ./data/iNat
python iNat_setup.py
```

And run the following script to generate the split files:

```
python ./src/inatural_split.py --data path-to-inat/setup --split ./split/inatural/
```
### 3. Train and test

You can download the pretrained network models from here.

Alternatively, to train the network on the base classes from scratch, remove the `--evaluate` option in the script below. To test LaplacianShot, run:

```
sh run.sh
```

Change the commented options in the script as needed for each dataset. All of the options are described in the `configuration.py` file.
## Results

LaplacianShot obtains the following results on standard few-shot benchmarks.
### On mini-ImageNet

With the WRN network:

| Method | 1-shot | 5-shot |
|---|---|---|
| ProtoNet (Snell et al., 2017) | 62.60 | 79.97 |
| CC+rot (Gidaris et al., 2019) | 62.93 | 79.87 |
| MatchingNet (Vinyals et al., 2016) | 64.03 | 76.32 |
| FEAT (Ye et al., 2020) | 65.10 | 81.11 |
| Transductive tuning (Dhillon et al., 2020) | 65.73 | 78.40 |
| SimpleShot (Wang et al., 2019) | 65.87 | 82.09 |
| SIB (Hu et al., 2020) | 70.0 | 79.2 |
| BD-CSPN (Liu et al., 2019) | 70.31 | 81.89 |
| LaplacianShot (ours) | 73.44 | 83.93 |
### On tiered-ImageNet

With the WRN network:

| Method | 1-shot | 5-shot |
|---|---|---|
| CC+rot (Gidaris et al., 2019) | 70.53 | 84.98 |
| FEAT (Ye et al., 2020) | 70.41 | 84.38 |
| Transductive tuning (Dhillon et al., 2020) | 73.34 | 85.50 |
| SimpleShot (Wang et al., 2019) | 70.90 | 85.76 |
| BD-CSPN (Liu et al., 2019) | 78.74 | 86.92 |
| LaplacianShot (ours) | 78.80 | 87.48 |
### On CUB

With the ResNet-18 network:

| Method | 1-shot | 5-shot |
|---|---|---|
| MatchingNet (Vinyals et al., 2016) | 73.49 | 84.45 |
| MAML (Finn et al., 2017) | 68.42 | 83.47 |
| ProtoNet (Snell et al., 2017) | 72.99 | 86.64 |
| RelationNet (Sung et al., 2018) | 68.58 | 84.05 |
| Chen (Chen et al., 2019) | 67.02 | 83.58 |
| SimpleShot (Wang et al., 2019) | 70.28 | 86.37 |
| LaplacianShot (ours) | 79.90 | 88.69 |
### On iNat

With the WRN network, we report Top-1 accuracy per class and Top-1 mean accuracy:

| Method | Per Class | Mean |
|---|---|---|
| SimpleShot (Wang et al., 2019) | 62.44 | 65.08 |
| LaplacianShot (ours) | 71.55 | 74.97 |