# Gait3D-Benchmark
This repository contains the code and models for our CVPR 2022, ACM MM 2022, ACM MM 2023, and ACM MM 2024 papers. The Gait3D-Benchmark project is now maintained by Jinkai Zheng and Xinchen Liu. Thanks to all of our co-authors for their help, as well as the great repositories that we list in the Acknowledgement.
| <h2 align="center"> Gait3D (SMPLGait) </h2> | <h2 align="center"> MTSGait </h2> | <h2 align="center"> Gait3D-Parsing (ParsingGait) </h2> | <h2 align="center"> XGait </h2> |
| --- | --- | --- | --- |
| Gait Recognition in the Wild with Dense 3D Representations and A Benchmark (CVPR 2022) | Gait Recognition in the Wild with Multi-hop Temporal Switch (ACM MM 2022) | Parsing is All You Need for Accurate Gait Recognition in the Wild (ACM MM 2023) | It Takes Two: Accurate Gait Recognition in the Wild via Cross-granularity Alignment (ACM MM 2024) |
| [Project Page] [Paper] | [Paper] | [Project Page] [Paper] | [Paper] |
## What's New
- [Dec 2024] Our XGait method is released.
- [July 2024] The ACM MM'24 Multimodal Gait Recognition (MGR) Challenge has been launched. You can get started quickly here.
- [Sept 2023] The code and model of CDGNet-Parsing are released here; you can use it to extract parsing results from your own data.
- [Sept 2023] Our Gait3D-Parsing dataset and ParsingGait method are released.
- [Sept 2022] Our MTSGait method is released.
- [Mar 2022] Our Gait3D dataset and SMPLGait method are released.
## Model Zoo
Results and models are available in the model zoo.
## Requirements and Installation
The requirements and installation procedure can be found here.
## Data Downloading
Please download the Gait3D dataset by signing this agreement.
Please download the Gait3D-Parsing dataset by signing this agreement.
We ask for your information only to make sure the dataset is used for non-commercial purposes. We will not give it to any third party or publish it publicly anywhere.
## Data Pretreatment
The data pretreatment can be found here.
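For orientation, benchmarks of this kind usually pack the raw silhouette frames into per-sequence `.pkl` files before training. The sketch below shows what such a call tends to look like; the script path, data paths, and flags are assumptions for illustration, so follow the linked pretreatment guide for the exact interface:

```bash
# Hypothetical pretreatment invocation (script path, data paths, and flags are
# assumptions, not this repository's confirmed interface).
python misc/pretreatment.py \
    --input_path  /data/Gait3D/2D_Silhouettes \
    --output_path /data/Gait3D-sils-64-64-pkl \
    --img_size 64   # resize silhouettes before packing them into per-sequence .pkl files
```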
## Train
Run the following command:
```bash
sh train.sh
```
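In repositories of this style, `train.sh` is typically a thin wrapper around a distributed PyTorch launch. The entry point, config path, and GPU count below are assumptions, so check the script itself for the exact arguments:

```bash
# A sketch of what train.sh typically wraps (lib/main.py, the config path,
# and the GPU count are assumptions for illustration).
CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --nproc_per_node=4 \
    lib/main.py --cfgs ./config/smplgait.yaml --phase train
```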
## Test
Run the following command:
```bash
sh test.sh
```
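Testing usually reuses the same entry point with the phase switched so that a trained checkpoint is restored and evaluated; as with training, the flags below are illustrative assumptions rather than the script's confirmed contents:

```bash
# A sketch of what test.sh typically wraps (entry point and config path are assumptions).
CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --nproc_per_node=4 \
    lib/main.py --cfgs ./config/smplgait.yaml --phase test
```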
## Citation
Please cite these papers in your publications if they help your research:
```bibtex
@inproceedings{zheng2022gait3d,
  title={Gait Recognition in the Wild with Dense 3D Representations and A Benchmark},
  author={Jinkai Zheng and Xinchen Liu and Wu Liu and Lingxiao He and Chenggang Yan and Tao Mei},
  booktitle={IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2022}
}

@inproceedings{zheng2022mtsgait,
  title={Gait Recognition in the Wild with Multi-hop Temporal Switch},
  author={Jinkai Zheng and Xinchen Liu and Xiaoyan Gu and Yaoqi Sun and Chuang Gan and Jiyong Zhang and Wu Liu and Chenggang Yan},
  booktitle={ACM International Conference on Multimedia (ACM MM)},
  year={2022}
}

@inproceedings{zheng2023parsinggait,
  title={Parsing is All You Need for Accurate Gait Recognition in the Wild},
  author={Jinkai Zheng and Xinchen Liu and Shuai Wang and Lihao Wang and Chenggang Yan and Wu Liu},
  booktitle={ACM International Conference on Multimedia (ACM MM)},
  year={2023}
}

@inproceedings{zheng2024xgait,
  title={It Takes Two: Accurate Gait Recognition in the Wild via Cross-granularity Alignment},
  author={Jinkai Zheng and Xinchen Liu and Boyue Zhang and Chenggang Yan and Jiyong Zhang and Wu Liu and Yongdong Zhang},
  booktitle={ACM International Conference on Multimedia (ACM MM)},
  year={2024}
}
```
## Acknowledgement
Here are some great resources we benefit from: