# MSCLNet for VI-ReID (ECCV 2022)

Modality Synergy Complement Learning with Cascaded Aggregation for Visible-Infrared Person Re-Identification
<p align="left"> <br> <a href='https://www.ecva.net/papers/eccv_2022/papers_ECCV/papers/136740450.pdf'> <img src='https://img.shields.io/badge/Paper-PDF-green?style=flat&logo=arXiv&logoColor=green' alt='Paper PDF'> </a> </p>

<img src="asset/pipeline.png">

## Getting Started

- Clone this repo:

  ```bash
  git clone https://github.com/bitreidgroup/VI-ReID-MSCLNet.git
  cd VI-ReID-MSCLNet
  ```
- Create a conda environment and activate it:

  ```bash
  conda env create -f environment.yml
  conda activate mscl
  ```

  We recommend Python 3.6, CUDA 10.0, cuDNN 7.6.5, PyTorch 1.2, and CUDA Toolkit 10.0.130 for this environment; an optional version check is sketched after this list.
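If you want to confirm that the environment resolved to the intended versions, a minimal check such as the one below can help. It assumes only a working `torch` install; nothing in it comes from this repository.

```python
# check_env.py -- optional sanity check for the conda environment (not part of this repo)
import torch

print("PyTorch version :", torch.__version__)               # expected ~1.2.x
print("CUDA available  :", torch.cuda.is_available())       # should be True on a GPU machine
print("CUDA version    :", torch.version.cuda)              # expected ~10.0
print("cuDNN version   :", torch.backends.cudnn.version())  # expected ~7605
if torch.cuda.is_available():
    print("GPU             :", torch.cuda.get_device_name(0))
```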
## Preparing Datasets

- RegDB Dataset: The RegDB dataset can be downloaded from this website by submitting a copyright form (it is listed as "Dongguk Body-based Person Recognition Database (DBPerson-Recog-DB1)" on that website). We do not preprocess the RegDB dataset.
- SYSU-MM01 Dataset: The SYSU-MM01 dataset can be downloaded from this website.
- We preprocess the SYSU-MM01 dataset to speed up the training process.

- If you do not need the camera identities, run the preprocessing script:

  ```bash
  python pre_process_sysu.py
  ```

  After running, the training data will be stored in ".npy" format.

- If you need the camera identities, run:

  ```bash
  python pre_process_sysu_cam.py
  ```

  The camera identities will also be stored in ".npy" format. A quick way to inspect the generated files is sketched after this list.
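To verify the preprocessing output, a minimal sketch like the one below loads the generated arrays and prints their shapes. The file names (e.g. `train_rgb_resized_img.npy`) and the dataset root are assumptions for illustration; substitute whatever `pre_process_sysu.py` actually writes on your machine.

```python
# inspect_npy.py -- optional check of the preprocessing output.
# The file names and data_dir below are assumptions; adjust them to the
# actual output of pre_process_sysu.py / pre_process_sysu_cam.py.
import os
import numpy as np

data_dir = "./SYSU-MM01"  # assumed dataset root; change to your path
for name in ["train_rgb_resized_img.npy", "train_rgb_resized_label.npy"]:
    path = os.path.join(data_dir, name)
    if os.path.exists(path):
        arr = np.load(path)
        print(f"{name}: shape={arr.shape}, dtype={arr.dtype}")
    else:
        print(f"{name} not found -- check the preprocessing output directory")
```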
## Pre-trained Models and Reproducing Our Experimental Results

You may need to manually define the data paths in `utils/data_loader.py` and `utils/data_manager.py` first, then run:

```bash
bash scripts/reproduce.sh
```
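Before launching `scripts/reproduce.sh`, a quick pre-flight check like the sketch below can catch a misconfigured data path early. The directory names are assumptions for illustration; use the paths you set in `utils/data_loader.py` and `utils/data_manager.py`.

```python
# check_paths.py -- optional pre-flight check; the paths below are assumptions,
# replace them with the paths configured in utils/data_loader.py and utils/data_manager.py.
import os

expected = {
    "SYSU-MM01": "./SYSU-MM01",
    "RegDB": "./RegDB",
}
for name, path in expected.items():
    status = "found" if os.path.isdir(path) else "MISSING"
    print(f"{name}: {path} -> {status}")
```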
## Citation

If this repository helps your research, please cite:
```bibtex
@inproceedings{zhang2022modality,
  title={Modality Synergy Complement Learning with Cascaded Aggregation for Visible-Infrared Person Re-Identification},
  author={Zhang, Yiyuan and Zhao, Sanyuan and Kang, Yuhao and Shen, Jianbing},
  booktitle={European Conference on Computer Vision},
  pages={462--479},
  year={2022},
  organization={Springer}
}
```
## Acknowledgement

Many thanks to the authors of AGW.