Hubs and Hyperspheres

This repository contains the code for the paper "Hubs and Hyperspheres: Reducing Hubness and Improving Transductive Few-shot Learning with Hyperspherical Embeddings", CVPR 2023.

Abstract - Distance-based classification is frequently used in transductive few-shot learning (FSL). However, due to the high dimensionality of image representations, FSL classifiers are prone to suffer from the hubness problem, where a few points (hubs) occur frequently in multiple nearest-neighbour lists of other points. Hubness negatively impacts distance-based classification when hubs from one class appear often among the nearest neighbours of points from another class, degrading the classifier's performance. To address the hubness problem in FSL, we first prove that hubness can be eliminated by distributing representations uniformly on the hypersphere. We then propose two new approaches to embed representations on the hypersphere, which we prove optimize a tradeoff between uniformity and local similarity preservation -- reducing hubness while retaining class structure. Our experiments show that the proposed methods reduce hubness and significantly improve transductive FSL accuracy for a wide range of classifiers.
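The hubness effect described in the abstract is easy to reproduce. The sketch below is an illustration only (not the paper's noHub method): it measures the skewness of the k-occurrence distribution N_k for raw high-dimensional Gaussian features, and again after projecting the same features onto the unit hypersphere with plain L2 normalisation (the L2 baseline from the tables below). A strongly right-skewed N_k indicates hubs.

```python
import numpy as np

def k_occurrence(X, k=10):
    """N_k(x): how often each point appears in other points' k-NN lists."""
    sq = (X ** 2).sum(axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T  # pairwise squared distances
    np.fill_diagonal(d2, np.inf)                    # exclude self-matches
    knn = np.argsort(d2, axis=1)[:, :k]             # each point's k nearest neighbours
    return np.bincount(knn.ravel(), minlength=len(X))

def skewness(x):
    """Third standardised moment; large positive values indicate hubs."""
    x = x - x.mean()
    return (x ** 3).mean() / (x.std() ** 3)

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 100))    # stand-in for high-dimensional features

s_raw = skewness(k_occurrence(X))

# Project onto the unit hypersphere (simple L2 normalisation)
X_sphere = X / np.linalg.norm(X, axis=1, keepdims=True)
s_sphere = skewness(k_occurrence(X_sphere))

print(f"k-occurrence skewness: raw={s_raw:.2f}, hypersphere={s_sphere:.2f}")
```

On data like this, the skewness drops noticeably after the hyperspherical embedding, consistent with the paper's claim that uniformity on the sphere suppresses hubness.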

Results aggregated over classifiers

1-shot

| Backbone | Embedding | mini Acc | mini Score | tiered Acc | tiered Score | CUB Acc | CUB Score |
|---|---|---|---|---|---|---|---|
| ResNet18 | None | 55.74 | 0.17 | 62.61 | 0.0 | 63.78 | 0.17 |
| ResNet18 | L2 | 68.22 | 2.33 | 75.94 | 2.17 | 78.09 | 2.33 |
| ResNet18 | CL2 | 69.56 | 2.83 | 76.97 | 3.0 | 78.26 | 2.83 |
| ResNet18 | ZN | 60.0 | 2.33 | 66.21 | 2.5 | 67.43 | 2.67 |
| ResNet18 | ReRep | 60.76 | 4.0 | 67.07 | 3.67 | 69.6 | 4.17 |
| ResNet18 | EASE | 69.63 | 3.67 | 77.05 | 4.0 | 78.84 | 3.67 |
| ResNet18 | TCPR | 69.97 | 4.0 | 77.18 | 3.33 | 78.83 | 4.0 |
| ResNet18 | noHub (Ours) | 72.58 | 6.83 | 79.77 | 6.83 | 81.91 | 6.83 |
| ResNet18 | noHub-S (Ours) | 73.64 | 7.67 | 80.6 | 7.67 | 83.1 | 7.67 |
| WideRes28-10 | None | 63.59 | 1.0 | 71.29 | 0.83 | 79.23 | 1.17 |
| WideRes28-10 | L2 | 74.3 | 3.0 | 76.19 | 2.67 | 88.61 | 3.5 |
| WideRes28-10 | CL2 | 71.32 | 1.33 | 75.17 | 2.0 | 88.52 | 3.33 |
| WideRes28-10 | ZN | 64.27 | 2.5 | 65.64 | 2.5 | 76.0 | 1.5 |
| WideRes28-10 | ReRep | 65.51 | 3.0 | 71.83 | 3.17 | 83.1 | 3.5 |
| WideRes28-10 | EASE | 74.95 | 4.33 | 76.59 | 3.67 | 88.51 | 3.5 |
| WideRes28-10 | TCPR | 75.64 | 4.83 | 76.51 | 4.0 | 88.22 | 2.5 |
| WideRes28-10 | noHub (Ours) | 78.22 | 7.0 | 79.76 | 7.0 | 90.25 | 5.67 |
| WideRes28-10 | noHub-S (Ours) | 79.24 | 7.67 | 80.46 | 7.67 | 90.82 | 7.67 |

5-shot

| Backbone | Embedding | mini Acc | mini Score | tiered Acc | tiered Score | CUB Acc | CUB Score |
|---|---|---|---|---|---|---|---|
| ResNet18 | None | 69.83 | 0.83 | 74.38 | 0.67 | 76.01 | 1.17 |
| ResNet18 | L2 | 81.58 | 2.33 | 86.05 | 1.83 | 88.43 | 2.83 |
| ResNet18 | CL2 | 81.95 | 2.67 | 86.43 | 3.0 | 88.49 | 2.5 |
| ResNet18 | ZN | 71.49 | 4.0 | 75.32 | 3.83 | 76.92 | 3.5 |
| ResNet18 | ReRep | 70.25 | 2.5 | 74.52 | 1.83 | 76.43 | 2.5 |
| ResNet18 | EASE | 81.84 | 3.5 | 86.4 | 3.17 | 88.57 | 3.5 |
| ResNet18 | TCPR | 82.1 | 4.0 | 86.54 | 3.83 | 88.79 | 4.33 |
| ResNet18 | noHub (Ours) | 82.58 | 5.5 | 86.9 | 4.5 | 89.13 | 6.0 |
| ResNet18 | noHub-S (Ours) | 82.61 | 6.5 | 87.13 | 6.67 | 88.93 | 5.33 |
| WideRes28-10 | None | 78.77 | 1.5 | 84.1 | 1.67 | 89.49 | 1.67 |
| WideRes28-10 | L2 | 85.65 | 4.0 | 86.29 | 3.83 | 93.47 | 3.67 |
| WideRes28-10 | CL2 | 83.14 | 1.33 | 85.47 | 1.5 | 93.49 | 4.0 |
| WideRes28-10 | ZN | 74.61 | 4.33 | 75.34 | 5.0 | 81.02 | 3.17 |
| WideRes28-10 | ReRep | 73.86 | 1.83 | 81.51 | 1.67 | 87.2 | 2.0 |
| WideRes28-10 | EASE | 85.51 | 3.5 | 86.29 | 3.33 | 93.34 | 3.5 |
| WideRes28-10 | TCPR | 86.03 | 6.0 | 86.37 | 4.0 | 93.3 | 3.0 |
| WideRes28-10 | noHub (Ours) | 86.44 | 5.67 | 87.07 | 5.5 | 93.65 | 4.17 |
| WideRes28-10 | noHub-S (Ours) | 85.95 | 5.5 | 87.05 | 5.83 | 93.76 | 5.0 |

Datasets

The datasets can be downloaded by following the instructions in the Realistic evaluation of transductive few-shot learning repository.

After downloading the datasets, use the files in `data/split` to separate the images into the directories `data/[mini|tiered|cub]/[train|val|test]`.

Feature extractors

Download the checkpoints from:

Then place them in the `models` directory.

Feature caching

When executing the evaluation script, passing `--cache_dir /path/to/cached/features` saves the computed features to the specified directory. Setting `--use_cached True` in subsequent runs then makes repeated evaluations with the same dataset and feature extractor much faster.

Installing dependencies

Run

```shell
conda env create -f environment.yml
```

to create a conda environment with the base requirements.

Then, inside the newly created environment, install the other dependencies with pip:

```shell
pip install -r requirements.txt
```

Running with Docker

The `docker` directory contains a build script and a Dockerfile to build a Docker image with all required dependencies.

Running evaluation

Evaluation is performed by running

```shell
python evaluate.py --checkpoint "<path/to/feature/extractor/checkpoint.ckpt>" \
    --n_shots "<shots>" --dataset "[mini|tiered|cub]" \
    --classifier "<classifier>" --embedding "<embedding>" "<optional arguments>"
```

where "<classifier>" and "<embedding>" are one of the implemented classifiers and embedding methods, respectively (see below).

Alternatively, the arguments can be provided in a yaml file:

```shell
python evaluate.py -c path/to/config/file.yml "<optional arguments>"
```

See `src/config/templates` for examples of config files.
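A config file collects the same arguments as the command line. The fragment below is a hypothetical sketch: the keys simply mirror the CLI flags above and the checkpoint path is illustrative; consult the templates in `src/config/templates` for the actual schema.

```yaml
# Hypothetical config sketch -- keys mirror the CLI flags; check
# src/config/templates for the real field names.
checkpoint: models/resnet18_mini.ckpt
dataset: mini
n_shots: 1
classifier: simpleshot
embedding: nohubs
cache_dir: cached_features/
use_cached: True
```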

Implemented embedding methods

| Name | `--embedding` |
|---|---|
| None | `none` |
| L2 | `l2` |
| CL2 | `cl2` |
| ZN | `zn` |
| ReRep | `rr` |
| EASE | `ease` |
| TCPR | `tcpr` |
| noHub (Ours) | `nohub` |
| noHub-S (Ours) | `nohubs` |

Implemented classifiers

| Name | `--classifier` |
|---|---|
| SimpleShot | `simpleshot` |
| LaplacianShot | `laplacianshot` |
| α-TIM | `alpha_tim` |
| iLPC | `ilpc` |
| Oblique Manifold | `om` |
| SIAMESE | `siamese` |
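As a concrete example of the simplest classifier in the list, SimpleShot is a nearest-centroid classifier in feature space. The sketch below shows the core idea only; the repository's implementation adds feature transforms and the embeddings listed above, so treat this as an illustration, not the repo's code.

```python
import numpy as np

def simpleshot(support, support_labels, query, n_way):
    """Nearest-centroid few-shot classification (the idea behind SimpleShot).

    support: (n_support, d) features, support_labels: (n_support,) ints in [0, n_way),
    query: (n_query, d) features. Returns predicted labels for the queries.
    """
    # One centroid per class, averaged over that class's support features
    centroids = np.stack([support[support_labels == c].mean(axis=0)
                          for c in range(n_way)])
    # Squared Euclidean distance from each query to each centroid
    d2 = ((query[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    return d2.argmin(axis=1)
```

Because classification reduces to distances against a handful of centroids, this is exactly the kind of distance-based classifier whose accuracy the embeddings above aim to improve.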