PatchKD
Code for the ACM MM 2022 paper "Patch-based Knowledge Distillation for Lifelong Person Re-Identification".
Installation
git clone https://github.com/feifeiobama/PatchKD
cd PatchKD
pip install -r requirements.txt
python setup.py develop
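As a quick sanity check after installation, the core dependencies can be imported directly (this assumes PyTorch and visdom are among the pinned requirements):

python -c "import torch, visdom; print(torch.__version__)"   # should print the installed PyTorch version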
Please follow Torchreid_Datasets_Doc to download the datasets and unzip them to your data path (referred to as 'machine_dataset_path' in train_test.py). Alternatively, some of the datasets can be downloaded from light-reid and DualNorm.
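For reference, an unzipped data root might look roughly like the sketch below; the folder names are illustrative, and the exact names expected by the data loaders may differ:

/path/to/your/data        # pointed to by 'machine_dataset_path' in train_test.py
├── market1501/           # Market-1501
├── dukemtmcreid/         # DukeMTMC-reID
├── msmt17/               # MSMT17
├── cuhksysu/             # CUHK-SYSU
└── cuhk03/               # CUHK03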
Quick Start
Training + evaluation. Make sure the visdom server is listening.
python train_test.py
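If no visdom server is running yet, one can be started in a separate terminal via visdom's standard entry point (the default port is 8097):

python -m visdom.server   # start the visdom server before launching training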
Evaluation from checkpoint:
python train_test.py --mode test --resume_test_model /path/to/pretrained/model
Visualization from checkpoint:
python train_test.py --mode visualize --resume_visualize_model /path/to/pretrained/model
To experiment with a different training order:
python train_test.py --train_dataset duke msmt17 market subcuhksysu cuhk03
Results
The following results were obtained with a single NVIDIA 2080 Ti GPU:
Citation
If you find this code useful for your research, please cite our paper:
@inproceedings{sun2022patch,
author = {Sun, Zhicheng and Mu, Yadong},
title = {Patch-based Knowledge Distillation for Lifelong Person Re-Identification},
booktitle = {Proceedings of the 30th ACM International Conference on Multimedia},
pages = {696--707},
year = {2022}
}
Acknowledgement
Our code is based on the PyTorch implementation of LifelongReID.