H-TransTrack

Modified files

Relative to the original TransTrack code, the modified files add the hybrid matching branch from H-DETR (a one-to-many matching branch alongside the standard one-to-one matching); a conceptual sketch follows below.
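As a rough illustration of the idea (not this repository's actual code), the sketch below shows hybrid matching as described in the H-DETR paper: one query group is matched one-to-one to the ground truth via the Hungarian algorithm, while a second group is matched against the ground truth repeated k times, so each object can supervise several extra queries. All names (`hungarian_match`, `hybrid_match`, `k=6`) and the L1-only cost are illustrative assumptions; the real matcher also uses classification and GIoU cost terms.

```python
# Conceptual sketch of hybrid matching (H-DETR style); NOT the code in this
# repository. Names and the L1-only cost are illustrative assumptions.
import numpy as np
from scipy.optimize import linear_sum_assignment


def hungarian_match(pred_boxes, gt_boxes):
    """One-to-one matching: each ground-truth box gets exactly one query."""
    # Cost = pairwise L1 distance between predicted and ground-truth boxes.
    cost = np.abs(pred_boxes[:, None, :] - gt_boxes[None, :, :]).sum(-1)
    query_idx, gt_idx = linear_sum_assignment(cost)
    return query_idx, gt_idx


def hybrid_match(pred_one2one, pred_one2many, gt_boxes, k=6):
    """Hybrid branch: a one-to-one group plus a one-to-many group.

    The one-to-many group is matched against the ground truth repeated
    k times, so each object can supervise up to k extra queries.
    """
    match_o2o = hungarian_match(pred_one2one, gt_boxes)
    repeated_gt = np.tile(gt_boxes, (k, 1))  # each GT box appears k times
    match_o2m = hungarian_match(pred_one2many, repeated_gt)
    return match_o2o, match_o2m


# Toy usage: 10 one-to-one queries, 60 one-to-many queries, 3 objects.
rng = np.random.default_rng(0)
gt = rng.random((3, 4))
(o2o_q, o2o_g), (o2m_q, o2m_g) = hybrid_match(
    rng.random((10, 4)), rng.random((60, 4)), gt, k=6)
print(len(o2o_g), len(o2m_g))  # 3 one-to-one matches vs. 18 one-to-many
```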

Model Zoo

CrowdHuman Pre-training

pretrained-model

MOT17 Validation

| Model | MOTA (%) | IDF1 (%) | FN | Checkpoint |
| --- | --- | --- | --- | --- |
| TransTrack | 67.1 | 70.3 | 15820 | model |
| TransTrack (Our repro.) | 67.1 | 68.1 | 15680 | model |
| H-TransTrack | 68.7 | 68.3 | 13657 | model |

MOT17 Test

| Model | MOTA (%) | IDF1 (%) | FN | Checkpoint |
| --- | --- | --- | --- | --- |
| TransTrack | 74.5 | 63.9 | 112137 | model |
| H-TransTrack | 75.7 | 64.4 | 91155 | model |

Usage

  1. Prepare datasets and annotations:

```bash
mkdir crowdhuman
cp -r /path_to_crowdhuman_dataset/CrowdHuman_train crowdhuman/CrowdHuman_train
cp -r /path_to_crowdhuman_dataset/CrowdHuman_val crowdhuman/CrowdHuman_val
mkdir mot
cp -r /path_to_mot_dataset/train mot/train
cp -r /path_to_mot_dataset/test mot/test
```
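Before running the converters below, the layout above can be verified with a few lines of Python (a convenience check, not part of the repository):

```python
# Quick check that the directory layout from step 1 is in place before
# running the annotation converters.
from pathlib import Path

expected = [
    "crowdhuman/CrowdHuman_train",
    "crowdhuman/CrowdHuman_val",
    "mot/train",
    "mot/test",
]
missing = [p for p in expected if not Path(p).is_dir()]
assert not missing, f"missing dataset directories: {missing}"
```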

The CrowdHuman dataset is available in CrowdHuman. Convert its annotations to COCO format:

```bash
python3 track_tools/convert_crowdhuman_to_coco.py
```

The MOT dataset is available in MOT. Convert its annotations to COCO format:

```bash
python3 track_tools/convert_mot_to_coco.py
```
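After both conversions, a quick sanity check can confirm the annotation files load as valid COCO JSON. The output paths below (`crowdhuman/annotations/train.json`, `mot/annotations/train.json`) are assumptions about where the converters write; adjust them to the paths the scripts actually report.

```python
# Hedged sanity check: assumes the converters wrote COCO-format JSON to the
# paths below -- verify against the scripts' output, then adjust.
from pycocotools.coco import COCO

for ann_file in ["crowdhuman/annotations/train.json",
                 "mot/annotations/train.json"]:
    coco = COCO(ann_file)  # parses and indexes the COCO-format annotations
    print(f"{ann_file}: {len(coco.imgs)} images, "
          f"{len(coco.anns)} boxes, {len(coco.cats)} categories")
```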
  2. Pre-train on CrowdHuman:

```bash
sh configs/<path_to_config_file>.sh
```

  3. Train H-TransTrack:

```bash
sh configs/<path_to_config_file>.sh
```

  4. Evaluate TransTrack:

```bash
sh configs/<path_to_config_file>.sh
```

  5. Visualize TransTrack:

```bash
python3 track_tools/txt2video.py
```
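For reference, the sketch below shows the kind of processing a txt-to-video tool performs: it reads MOT-format tracking results (frame, id, x, y, w, h, conf, ...) and draws each track onto its frame with OpenCV. It is a minimal stand-in, not the actual `track_tools/txt2video.py`; the result path, image path template, and frame rate are assumptions.

```python
# Minimal sketch of txt2video-style visualization; NOT the repository's
# script. Paths and the 20-fps setting are illustrative assumptions.
import csv
from collections import defaultdict
import cv2

RESULTS = "output/val/tracks.txt"     # MOT format: frame,id,x,y,w,h,conf,...
FRAME_TMPL = "mot/train/MOT17-02/img1/{:06d}.jpg"

# Group boxes by frame number.
boxes = defaultdict(list)
with open(RESULTS) as f:
    for row in csv.reader(f):
        frame, tid = int(row[0]), int(row[1])
        x, y, w, h = (int(float(v)) for v in row[2:6])
        boxes[frame].append((tid, x, y, w, h))

writer = None
for frame in sorted(boxes):
    img = cv2.imread(FRAME_TMPL.format(frame))
    if writer is None:  # size the video from the first frame
        h_img, w_img = img.shape[:2]
        writer = cv2.VideoWriter("tracks.mp4",
                                 cv2.VideoWriter_fourcc(*"mp4v"),
                                 20, (w_img, h_img))
    for tid, x, y, w, h in boxes[frame]:
        # Derive a stable per-track color from the track id.
        color = ((37 * tid) % 255, (17 * tid) % 255, (97 * tid) % 255)
        cv2.rectangle(img, (x, y), (x + w, y + h), color, 2)
        cv2.putText(img, str(tid), (x, y - 4),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, color, 2)
    writer.write(img)
writer.release()
```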

Citation

```bibtex
@article{jia2022detrs,
  title={DETRs with Hybrid Matching},
  author={Jia, Ding and Yuan, Yuhui and He, Haodi and Wu, Xiaopei and Yu, Haojun and Lin, Weihong and Sun, Lei and Zhang, Chao and Hu, Han},
  journal={arXiv preprint arXiv:2207.13080},
  year={2022}
}

@article{sun2020transtrack,
  title={Transtrack: Multiple object tracking with transformer},
  author={Sun, Peize and Cao, Jinkun and Jiang, Yi and Zhang, Rufeng and Xie, Enze and Yuan, Zehuan and Wang, Changhu and Luo, Ping},
  journal={arXiv preprint arXiv:2012.15460},
  year={2020}
}
```