Recall@k Surrogate Loss with Large Batches and Similarity Mixup

Recall@k Surrogate Loss with Large Batches and Similarity Mixup, Yash Patel, Giorgos Tolias, Jiri Matas, IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022.

Citation

If you make use of the code in this repository for scientific purposes, we appreciate it if you cite our paper:

@inproceedings{patel2022recall,
  title={Recall@k surrogate loss with large batches and similarity mixup},
  author={Patel, Yash and Tolias, Giorgos and Matas, Ji{\v{r}}{\'\i}},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={7502--7511},
  year={2022}
}

Dependencies

All dependencies used in our setup are listed in requirements.txt. Note that not all of them are strictly necessary; some key dependencies are as follows:
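The pinned packages can be installed directly from the requirements file. A minimal sketch, assuming a standard pip environment (a virtual environment or conda environment is recommended but not required):

pip install -r requirements.txt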

Setting up datasets

Recall@k Surrogate is evaluated on five publicly available datasets: iNaturalist, Stanford Online Products, PKU VehicleID, Stanford Cars, and Caltech CUB. Download each of these datasets from its respective source.

Download links

File structure

Place the dataset folders directly in the RecallatK_surrogate folder. An example of the resulting file structure is shown below.
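A minimal sketch of one possible layout; the dataset folder names here are illustrative only, and the exact names expected by the data loaders are defined in src/main.py:

RecallatK_surrogate/
├── src/
├── Inaturalist/
├── online_products/
├── vehicle_id/
├── cars196/
└── cub200/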

Training

Some hyper-parameters are hard-coded in src/main.py. For training with <dataset>, use the following command:

python src/main.py --source_path <path_to_RecallatK_surrogate> --loss recallatk --dataset <dataset> --mixup 0 --samples_per_class 4 --embed_dim 512 --fc_lr_mul 0

For training with SiMix, use the following command:

python src/main.py --source_path <path_to_RecallatK_surrogate> --loss recallatk --dataset <dataset> --mixup 1 --samples_per_class 4 --embed_dim 512 --fc_lr_mul 0
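For example, a concrete invocation training on iNaturalist with SiMix enabled might look as follows; this assumes Inaturalist is an accepted value for --dataset (check the argument parser in src/main.py for the exact dataset identifiers) and uses a placeholder source path:

python src/main.py --source_path /path/to/RecallatK_surrogate --loss recallatk --dataset Inaturalist --mixup 1 --samples_per_class 4 --embed_dim 512 --fc_lr_mul 0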

Keep the following in mind: