Python >= 3.7 | PyTorch >= 1.8

PASS: Part-Aware Self-Supervised Pre-Training for Person Re-Identification [pdf]

The official repository for *PASS: Part-Aware Self-Supervised Pre-Training for Person Re-Identification* (ECCV 2022).

Requirements

Installation

pip install -r requirements.txt

We recommend torch==1.8.0, torchvision==0.9.0, timm==0.3.4, faiss-gpu==1.7.2, CUDA >= 11.1, and an A100 GPU for training and evaluation. If you find that some packages are missing, please install them manually. You can refer to DINO, TransReID, and cluster-contrast-reid to set up the environments for pre-training, supervised ReID, and unsupervised ReID, respectively. You can also refer to TransReID-SSL to set up the whole environment.
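The recommended versions above can be captured in a pinned requirements file, which keeps the environment reproducible; this is just a sketch restating the pins from this README (the file name `requirements-pinned.txt` is our choice, not part of the repo):

```shell
# Write a pinned requirements file matching the versions recommended above.
# Note: faiss-gpu==1.7.2 additionally assumes CUDA >= 11.1 at runtime.
cat > requirements-pinned.txt <<'EOF'
torch==1.8.0
torchvision==0.9.0
timm==0.3.4
faiss-gpu==1.7.2
EOF

# Then install from the pinned file (network access required):
# pip install -r requirements-pinned.txt
cat requirements-pinned.txt
```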

Prepare Datasets

mkdir data

Download the datasets:

Then unzip them and rename the folders so that the directory tree looks like:

data
├── market1501
│   ├── bounding_box_train
│   ├── bounding_box_test
│   └── ..
├── MSMT17
│   ├── train
│   ├── test
│   └── ..
└── LUP
    └── images

Pre-trained Models

Model     Download
ViT-S/16  link
ViT-B/16  link

Please download the pre-trained models and place them in a directory of your choice.
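One common convention is to collect the downloaded checkpoints under a single folder and point your training/evaluation configs at it; a sketch (the folder and file names are assumptions, not fixed by this repo):

```shell
# Collect downloaded checkpoints under one folder (names are illustrative).
mkdir -p pretrained
# mv ~/Downloads/pass_vit_small.pth pretrained/   # hypothetical file name
# mv ~/Downloads/pass_vit_base.pth  pretrained/   # hypothetical file name
ls pretrained
```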

ReID performance

We have re-run the experiments to verify reproducibility. The reproduced results may differ from the numbers in the paper by about 0.1–0.2%.

Supervised ReID

Market-1501

Model     Image Size  mAP   Rank-1  Download
ViT-S/16  256*128     92.2  96.3    model / log
ViT-S/16  384*128     92.6  96.8    model / log
ViT-B/16  256*128     93.0  96.8    model / log
ViT-B/16  384*128     93.3  96.9    model / log

MSMT17

Model     Image Size  mAP   Rank-1  Download
ViT-S/16  256*128     69.1  86.5    model / log
ViT-S/16  384*128     71.7  87.9    model / log
ViT-B/16  256*128     71.8  88.2    model / log
ViT-B/16  384*128     74.3  89.7    model / log

UDA ReID

MSMT2Market

Model     Image Size  mAP   Rank-1  Download
ViT-S/16  256*128     90.2  95.8    model / log

Market2MSMT

Model     Image Size  mAP   Rank-1  Download
ViT-S/16  256*128     49.1  72.7    model / log

USL ReID

Market-1501

Model     Image Size  mAP   Rank-1  Download
ViT-S/16  256*128     88.7  95.0    model / log

MSMT17

Model     Image Size  mAP   Rank-1  Download
ViT-S/16  256*128     41.0  67.0    model / log

Acknowledgment

Our implementation is mainly based on the following codebases. We sincerely thank the authors for their excellent work.

TransReID-SSL, LUPerson, DINO, TransReID, cluster-contrast-reid.

Citation

If you find this code useful for your research, please cite our paper:

@article{zhu2022part,
  title={PASS: Part-Aware Self-Supervised Pre-Training for Person Re-Identification},
  author={Zhu, Kuan and Guo, Haiyun and Yan, Tianyi and Zhu, Yousong and Wang, Jinqiao and Tang, Ming},
  journal={arXiv preprint arXiv:2203.03931},
  year={2022}
}

Contact

If you have any questions, please feel free to contact us by e-mail: kuan.zhu@nlpr.ia.ac.cn.