<img src="./assets/logo2.png" width="320" height="110" alt="logo" />

<div align="center"><img src="./assets/nm.gif" width="100" height="100" alt="nm" /><img src="./assets/bg.gif" width="100" height="100" alt="bg" /><img src="./assets/cl.gif" width="100" height="100" alt="cl" /></div>
OpenGait is a flexible and extensible gait analysis project provided by the Shiqi Yu Group and supported in part by WATRIX.AI. The corresponding paper has been accepted by CVPR2023 as a highlight paper.
## What's New
- [Dec 2024] The multimodal MultiGait++ has been accepted to AAAI2025🎉 Congratulations to Dongyang! This is his FIRST paper!
- [Jun 2024] The first large-scale gait-based scoliosis screening benchmark ScoNet has been accepted to MICCAI2024🎉 Congratulations to Zirui! This is his FIRST paper! The code is released here, and you can refer to the project homepage for details.
- [May 2024] The code of the Large Vision Model based method BigGait is available here, along with the checkpoints for CCPG.
- [Apr 2024] Our team's latest checkpoints for projects such as DeepGaitv2, SkeletonGait, SkeletonGait++, and SwinGait will be released on Hugging Face. Previously released checkpoints will also be gradually made available there.
- [Mar 2024] Chao gave a talk on 'Progress in Gait Recognition'. The video and slides are both available😊
- [Mar 2024] The code of SkeletonGait++ is released here, and you can refer to the README for details.
- [Mar 2024] BigGait has been accepted to CVPR2024🎉 Congratulations to Dingqiang! This is his FIRST paper!
- [Jan 2024] The code of the transformer-based SwinGait is available here.
## Our Publications
- [TBIOM'24] A Comprehensive Survey on Deep Gait Recognition: Algorithms, Datasets, and Challenges, Survey Paper.
- [AAAI'25] Exploring More from Multiple Gait Modalities for Human Identification, Paper and MultiGait++ Code (Coming soon).
- [MICCAI'24] Gait Patterns as Biomarkers: A Video-Based Approach for Classifying Scoliosis, Paper, Dataset, and ScoNet Code.
- [CVPR'24] BigGait: Learning Gait Representation You Want by Large Vision Models. Paper, and BigGait Code.
- [AAAI'24] SkeletonGait++: Gait Recognition Using Skeleton Maps. Paper, and SkeletonGait++ Code.
- [AAAI'24] Cross-Covariate Gait Recognition: A Benchmark. Paper, CCGR Dataset, and ParsingGait Code.
- [arXiv'23] Exploring Deep Models for Practical Gait Recognition. Paper, DeepGaitV2 Code, and SwinGait Code.
- [PAMI'23] Learning Gait Representation from Massive Unlabelled Walking Videos: A Benchmark, Paper, GaitLU-1M Dataset, and GaitSSB Code.
- [CVPR'23] LidarGait: Benchmarking 3D Gait Recognition with Point Clouds, Paper, SUSTech1K Dataset and LidarGait Code.
- [CVPR'23] OpenGait: Revisiting Gait Recognition Towards Better Practicality, Highlight Paper, and GaitBase Code.
- [ECCV'22] GaitEdge: Beyond Plain End-to-end Gait Recognition for Better Practicality, Paper, and GaitEdge Code.
## A Real Gait Recognition System: All-in-One-Gait
<div align="center"> <img src="./assets/probe1-After.gif" width="455" height="256" alt="probe1-After" /> </div>

The workflow of All-in-One-Gait involves the processes of pedestrian tracking, segmentation, and recognition. See here for details.
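For intuition only, here is a minimal, hypothetical sketch of such a tracking → segmentation → recognition pipeline. Every function and shape below is an illustrative placeholder, not the actual All-in-One-Gait API; see the link above for the real implementation.

```python
# Hypothetical, simplified sketch of a tracking -> segmentation -> recognition pipeline.
# Every function below is an illustrative placeholder, NOT the actual All-in-One-Gait API.
from typing import Dict, List

import numpy as np


def track_pedestrians(frames: List[np.ndarray]) -> Dict[int, List[np.ndarray]]:
    """Placeholder tracker: pretend every frame crop belongs to a single track (ID 0)."""
    return {0: frames}


def segment_silhouettes(crops: List[np.ndarray]) -> List[np.ndarray]:
    """Placeholder segmenter: a crude threshold standing in for a real segmentation model."""
    return [(crop.mean(axis=-1) > 128).astype(np.uint8) for crop in crops]


def recognize_gait(silhouettes: List[np.ndarray]) -> np.ndarray:
    """Placeholder recognizer: temporal pooling standing in for a gait embedding network."""
    stacked = np.stack([s.astype(np.float32) for s in silhouettes])
    return stacked.mean(axis=0).flatten()


if __name__ == "__main__":
    # A fake 30-frame "video" of 128x88 RGB crops.
    video = [np.random.randint(0, 255, (128, 88, 3), dtype=np.uint8) for _ in range(30)]
    for track_id, crops in track_pedestrians(video).items():
        embedding = recognize_gait(segment_silhouettes(crops))
        print(f"track {track_id}: embedding of length {embedding.shape[0]}")
```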
## Highlighted features
- Multiple Datasets Supported: CASIA-B, OUMVLP, SUSTech1K, HID, GREW, Gait3D, CCPG, CASIA-E, and GaitLU-1M.
- Multiple Models Supported: We have reproduced several SOTA methods and achieved the same or even better performance.
- DDP Support: The officially recommended Distributed Data Parallel (DDP) mode is used during both the training and testing phases (see the minimal sketch after this list).
- AMP Support: The Auto Mixed Precision (AMP) option is available.
- Nice log: We use `tensorboard` and `logging` to log everything, which looks pretty.
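To make the DDP and AMP bullets above concrete, here is a minimal, self-contained PyTorch sketch of that training pattern. It is only an illustration under assumed names (the toy linear model, the loss, and the `ddp_amp_sketch.py` launch file are invented for the example) and is not OpenGait's actual training code; refer to 0.get_started.md for the real commands and configs.

```python
# Minimal PyTorch sketch of the DDP + AMP training pattern referenced above.
# Illustrative only, NOT OpenGait's training loop. Launch with e.g.:
#   torchrun --nproc_per_node=2 ddp_amp_sketch.py
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    # One process per GPU; torchrun sets RANK / LOCAL_RANK / WORLD_SIZE for us.
    dist.init_process_group(backend="nccl" if torch.cuda.is_available() else "gloo")
    local_rank = int(os.environ.get("LOCAL_RANK", 0))
    device = torch.device(f"cuda:{local_rank}" if torch.cuda.is_available() else "cpu")
    if torch.cuda.is_available():
        torch.cuda.set_device(device)

    model = DDP(torch.nn.Linear(128, 74).to(device))  # toy model standing in for a gait network
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    scaler = torch.cuda.amp.GradScaler(enabled=torch.cuda.is_available())

    for step in range(10):
        # Fake batch standing in for real gait features and identity labels.
        x = torch.randn(32, 128, device=device)
        y = torch.randint(0, 74, (32,), device=device)
        optimizer.zero_grad()
        with torch.cuda.amp.autocast(enabled=torch.cuda.is_available()):  # AMP forward pass
            loss = torch.nn.functional.cross_entropy(model(x), y)
        scaler.scale(loss).backward()  # scaled backward to avoid fp16 underflow
        scaler.step(optimizer)
        scaler.update()
        if dist.get_rank() == 0 and step % 5 == 0:
            print(f"step {step}: loss {loss.item():.4f}")

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```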
## Getting Started
Please see 0.get_started.md. We also provide a set of tutorials for your reference.
## Model Zoo
✨✨✨ You can find all the checkpoint files on our Hugging Face page! ✨✨✨
The result list of appearance-based gait recognition is available here.
The result list of pose-based gait recognition is available here.
## Authors
- Chao Fan (樊超), 12131100@mail.sustech.edu.cn
- Chuanfu Shen (沈川福), 11950016@mail.sustech.edu.cn
- Junhao Liang (梁峻豪), 12132342@mail.sustech.edu.cn
OpenGait is now mainly maintained by Dongyang Jin (金冬阳), 11911221@mail.sustech.edu.cn.
## Acknowledgement
- GLN: Saihui Hou (侯赛辉)
- GaitGL: Beibei Lin (林贝贝)
- GREW: GREW Team
- FastPoseGait: FastPoseGait Team
- Gait3D: Gait3D Team
## Citation
```bibtex
@InProceedings{Fan_2023_CVPR,
    author    = {Fan, Chao and Liang, Junhao and Shen, Chuanfu and Hou, Saihui and Huang, Yongzhen and Yu, Shiqi},
    title     = {OpenGait: Revisiting Gait Recognition Towards Better Practicality},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {9707-9716}
}
```
Note: This code is intended for academic purposes only and may not be used for anything that might be considered commercial use.