## About
This repo contains the code for the Class Normalization for Continual Zero-Shot Learning paper from ICLR 2021:
- the code to reproduce ZSL and CZSL results
- the proposed CZSL metrics (located in `src/utils/metrics.py`)
- a fast Python implementation of the AUSUC metric (sketched below)
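For reference, here is a minimal sketch of how AUSUC (Area Under the Seen-Unseen accuracy Curve) is commonly computed: a calibration bias is swept over the seen-class logits and the area under the resulting seen/unseen accuracy trade-off curve is integrated. This is an illustrative NumPy version with our own function and variable names, not the implementation in `src/utils/metrics.py`:

```python
import numpy as np

def ausuc(logits, labels, seen_classes, n_steps=200):
    """Illustrative AUSUC: sweep a calibration bias over the seen-class logits
    and integrate the seen/unseen accuracy trade-off curve.
    Note: per-sample accuracy is used here for brevity; per-class (balanced)
    accuracy is also common in the GZSL literature."""
    seen_mask = np.zeros(logits.shape[1], dtype=bool)
    seen_mask[list(seen_classes)] = True
    is_seen_sample = seen_mask[labels]  # which test samples belong to seen classes

    # Sweep the bias over a range wide enough to cover both extremes.
    gammas = np.linspace(logits.min() - logits.max(),
                         logits.max() - logits.min(), n_steps)
    accs_seen, accs_unseen = [], []
    for gamma in gammas:
        shifted = logits.copy()
        shifted[:, seen_mask] -= gamma  # penalize (or boost) seen classes
        preds = shifted.argmax(axis=1)
        correct = preds == labels
        accs_seen.append(correct[is_seen_sample].mean())
        accs_unseen.append(correct[~is_seen_sample].mean())

    # Area under the (unseen, seen) accuracy curve via the trapezoidal rule.
    order = np.argsort(accs_unseen)
    return np.trapz(np.asarray(accs_seen)[order], np.asarray(accs_unseen)[order])
```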
In this project, we explored different normalization strategies used in ZSL and proposed a new one (class normalization) that is suited for deep attribute embedders. This allowed us to outperform existing ZSL models with a simple 3-layer MLP trained in just 30 seconds. We also extended ZSL ideas into a more general setting, Continual Zero-Shot Learning, proposed a set of metrics for it, and tested several baselines.
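As a rough illustration of the idea (a simplified sketch based on our reading of the description above, not the repo's actual code), class normalization rescales the class embeddings produced by the attribute embedder to a fixed norm before logits are computed. All sizes below are hypothetical:

```python
import torch
import torch.nn.functional as F

def class_normalize(class_embeddings: torch.Tensor) -> torch.Tensor:
    """Rescale each class embedding to a fixed norm (sqrt of its dimensionality).
    This is a simplified reading of class normalization; see the paper for the
    exact formulation."""
    d = class_embeddings.shape[1]
    return (d ** 0.5) * F.normalize(class_embeddings, dim=1)

# Toy usage: a 3-layer MLP maps per-class attribute vectors to class embeddings,
# which are normalized before being matched against image features.
attrs = torch.randn(50, 312)                # 50 classes, 312 attributes (hypothetical)
embedder = torch.nn.Sequential(
    torch.nn.Linear(312, 1024), torch.nn.ReLU(),
    torch.nn.Linear(1024, 1024), torch.nn.ReLU(),
    torch.nn.Linear(1024, 2048),
)
protos = class_normalize(embedder(attrs))   # [50, 2048]
feats = torch.randn(8, 2048)                # a batch of image features
logits = feats @ protos.t()                 # [8, 50]
```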
<div style="text-align:center"> <img src="images/class-norm-illustration.jpg" alt="Class Normalization illustration" width="500"/> </div>

## Installation & training
### Data preparation
#### For ZSL
For ZSL, we tested our method on the standard GBU datasets, which you can download from the original website. The easiest way to reproduce the results is to follow our Google Colab notebook.
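If you prefer to load the GBU data manually rather than through the notebook, the proposed-split files ship as MATLAB matrices. Below is a hedged sketch of reading them; the file and field names follow the common GBU release layout and are assumptions on our part, so check them against your copy:

```python
import scipy.io as sio

# File and field names below follow the standard GBU release layout (an assumption).
res = sio.loadmat('data/CUB/res101.mat')          # ResNet-101 features + labels
splits = sio.loadmat('data/CUB/att_splits.mat')   # class attributes + proposed splits

features = res['features'].T                      # [num_images, 2048]
labels = res['labels'].squeeze() - 1              # MATLAB indices are 1-based
attributes = splits['att'].T                      # [num_classes, num_attributes]
trainval_idx = splits['trainval_loc'].squeeze() - 1
test_unseen_idx = splits['test_unseen_loc'].squeeze() - 1
```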
#### For CZSL
For CZSL, we tested our method on the SUN and CUB datasets. In contrast to ZSL, in CZSL we used raw images as inputs instead of features from an ImageNet-pretrained model. For CUB, please follow the instructions in the A-GEM repo. Note that the CUB images now have to be downloaded manually from here, but we used the same splits as A-GEM. Put the A-GEM splits into the CUB data folder.
For SUN, download the data from the official website, put it under `data/SUN`, and then follow the instructions in `scripts/sun_data_preprocessing.py`.
### Installing the `firelab` dependency
You will need to install the `firelab` library to run the training:

```bash
pip install firelab
```
### Running ZSL training
Please refer to this Google Colab notebook: it contains the code to reproduce our results.
### Running CZSL training
To run CZSL training, you will need to run the command:

```bash
python src/run.py -c basic|agem|mas|joint -d cub|sun
```
Please note that by default we load all the data into memory (to speed things up).
This behaviour is controlled by the `in_memory` flag in the config.
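For context, such a flag usually toggles between eagerly caching all decoded images and reading them lazily from disk, roughly as in the generic sketch below (this is not the repo's actual dataset class):

```python
from PIL import Image
from torch.utils.data import Dataset

class ImageListDataset(Dataset):
    """Generic illustration of an `in_memory`-style switch: either preload and
    cache all decoded images up front, or read each file lazily from disk."""
    def __init__(self, paths, labels, in_memory: bool = True):
        self.paths, self.labels, self.in_memory = paths, labels, in_memory
        if in_memory:
            # Pay the I/O cost once; subsequent epochs read from RAM.
            self.images = [Image.open(p).convert('RGB') for p in paths]

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, i):
        img = self.images[i] if self.in_memory else Image.open(self.paths[i]).convert('RGB')
        return img, self.labels[i]
```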