Gait Recognition in the Wild: A Large-scale Benchmark and NAS-based Baseline

Paper

Gait Recognition in the Wild: A Large-scale Benchmark and NAS-based Baseline

Xianda Guo, Zheng Zhu, Tian Yang, Beibei Lin, Junjie Huang, Jiankang Deng, Guan Huang, Jie Zhou, Jiwen Lu.

News

Getting Started

0. Prepare datasets

We provide the following tutorials for your reference:

1. Supernet Training

CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 python -u -m torch.distributed.launch --nproc_per_node=8 opengait/main.py --cfgs configs/sposgait/sposgait_large_GREW_supertraining_triplet.yaml --phase train
<!-- - `--iter` You can specify a number of iterations or use `restore_hint` in the config file and resume training from there. -->

You can run the commands in train.sh to train different models.
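If fewer GPUs are available, the same launcher works with a smaller process count. For example, a two-GPU run (the GPU indices and process count below are illustrative; note that using fewer processes also reduces the effective batch size, so results may differ from the 8-GPU setting):

CUDA_VISIBLE_DEVICES=0,1 python -u -m torch.distributed.launch --nproc_per_node=2 opengait/main.py --cfgs configs/sposgait/sposgait_large_GREW_supertraining_triplet.yaml --phase train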

2. Search

Multi-GPU search:
CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 python -u -m torch.distributed.launch --nproc_per_node=8  opengait/search.py --cfgs ./configs/sposgait/sposgait_large_GREW_supertraining_triplet.yaml --max-epochs 20
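The search reuses the supernet-training config from step 1, and --max-epochs presumably sets the search budget. For a quick functional check you can lower it, e.g. (the value 5 below is illustrative; 20 epochs is the setting used above):

CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 python -u -m torch.distributed.launch --nproc_per_node=8 opengait/search.py --cfgs ./configs/sposgait/sposgait_large_GREW_supertraining_triplet.yaml --max-epochs 5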

3. Retrain

Train a model by

CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 python -u -m torch.distributed.launch --nproc_per_node=8 opengait/main.py --cfgs ./configs/sposgait/retrain/sposgait_large_GREW-train20000id_retrain.yaml --phase train
<!-- - `--iter` You can specify a number of iterations or use `restore_hint` in the config file and resume training from there. -->

You can run the commands in train.sh to train different models.

4. Test

Evaluate the trained model by

CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 python -m torch.distributed.launch --nproc_per_node=8 opengait/main.py --cfgs ./configs/sposgait/retrain/sposgait_large_GREW-train20000id_retrain.yaml --phase test

You can run the commands in test.sh to test different models.

Participants must package submission.csv for submission with zip xxx.zip $CSV_PATH and then upload the resulting archive to CodaLab.
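For example, assuming the test phase wrote its results to ./output/submission.csv (the path and archive name below are illustrative; substitute your actual CSV path):

zip grew_submission.zip ./output/submission.csv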

5. Calculate FLOPs and Params

CUDA_VISIBLE_DEVICES=0 python -u -m torch.distributed.launch --nproc_per_node=1 opengait/calculate_flops_and_params.py --cfgs configs/sposgait/retrain/sposgait_large_GREW-train20000id_retrain.yaml

Acknowledgement

Citation

If this work is helpful for your research, please consider citing the following BibTeX entries.

@inproceedings{zhu2021gait,
  title={Gait recognition in the wild: A benchmark},
  author={Zhu, Zheng and Guo, Xianda and Yang, Tian and Huang, Junjie and Deng, Jiankang and Huang, Guan and Du, Dalong and Lu, Jiwen and Zhou, Jie},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  pages={14789--14799},
  year={2021}
}
@article{guo2022gait,
  title={Gait Recognition in the Wild: A Large-scale Benchmark and NAS-based Baseline},
  author={Guo, Xianda and Zhu, Zheng and Yang, Tian and Lin, Beibei and Huang, Junjie and Deng, Jiankang and Huang, Guan and Zhou, Jie and Lu, Jiwen},
  journal={arXiv e-prints},
  pages={arXiv--2205},
  year={2022}
}

Note: This code is intended for academic use only and must not be used for any commercial purposes.