<div align="center"><img src="resources/logo.png" alt="logo" width="428" height="250" /></div>

FastPoseGait is a user-friendly and flexible repository that aims to help researchers get started on pose-based gait recognition quickly. This repository is provided by BNU-IVC and supported in part by WATRIX.AI.
News
- [Nov 2023] SUSTech1K and CCPG are now supported in our project! The results of GPGait on these two benchmarks can be found in the Model Zoo.
- [Sep 2023] Our technical report FastPoseGait: A Toolbox and Benchmark for Efficient Pose-based Gait Recognition and the code of the <i>Improved Version</i> are released! Check out the Model Zoo.
- [Aug 2023] The official PyTorch implementation of <i>GPGait: Generalized Pose-based Gait Recognition</i> is released! Check out the code.
- [July 2023] Our paper GPGait: Generalized Pose-based Gait Recognition has been accepted by ICCV 2023! Check out the paper, poster and video (Bilibili / YouTube).
Supports
Supported Algorithms
Supported Datasets
Getting Started
For the basic usage of FastPoseGait
git clone https://github.com/BNU-IVC/FastPoseGait
cd FastPoseGait
1. Installation
- python >= 3.9
- torch >= 1.8
- tqdm
- pyyaml
- tensorboard
- pytorch_metric_learning
Install the dependencies with pip:
pip install pyyaml tqdm tensorboard pytorch_metric_learning
pip install torch==1.8 torchvision==0.9
Or install the dependencies with Anaconda:
conda create -n fastposegait python=3.9
conda install pytorch==1.8 torchvision -c pytorch
conda install pyyaml tqdm tensorboard -c conda-forge
pip install pytorch_metric_learning
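After installation, you may want a quick sanity check that the core dependencies import and that PyTorch can see your GPUs. The snippet below is a minimal sketch (not part of the toolbox); the exact versions and GPU count depend on your machine:

```python
# check_env.py -- quick sanity check for the FastPoseGait dependencies.
# Not part of the toolbox; it only confirms the environment is usable.
import torch
import yaml                      # installed as pyyaml
import tqdm
import pytorch_metric_learning

print("torch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
print("visible GPUs:", torch.cuda.device_count())
print("pytorch_metric_learning:", pytorch_metric_learning.__version__)
```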
2. Data Preparation
- CASIA-B Pose can be downloaded from this link.
- Or, to obtain the official human keypoint annotations, you can apply for them:
- Once you have downloaded the official annotations, use our provided script to generate the processed pickle files.
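If you want to double-check the processed data, a rough sketch like the one below can load and inspect a single pickle file. The example path and the assumed `[frames, joints, channels]` layout are illustrative only; the preprocessing script provided in this repository defines the authoritative format:

```python
# inspect_pose_pkl.py -- hedged sketch: inspect one processed pose sequence.
# The path and the assumed array layout are hypothetical; consult the
# provided preprocessing script for the actual on-disk format.
import pickle
import numpy as np

pkl_path = "CASIA-B-Pose/001/nm-01/000/000.pkl"  # hypothetical example path

with open(pkl_path, "rb") as f:
    seq = pickle.load(f)

seq = np.asarray(seq)
print("sequence shape:", seq.shape)  # expected: frames x joints x channels
print("dtype:", seq.dtype)
```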
3. Training & Testing
Train a model by
CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --nproc_per_node=4 fastposegait/main.py --cfgs ./configs/gaittr/gaittr.yaml --phase train
- `python -m torch.distributed.launch`: DDP launch instruction.
- `--nproc_per_node`: The number of GPUs to use; it must equal the number of GPUs listed in `CUDA_VISIBLE_DEVICES`.
- `--cfgs`: The path to the config file.
- `--phase`: Specified as `train`.
- `--log_to_file`: If specified, the terminal log will also be written to disk.
You can run commands in dist_train.sh to train different models.
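Since `--nproc_per_node` has to match the number of GPUs exposed through `CUDA_VISIBLE_DEVICES`, a small helper like the sketch below (not part of FastPoseGait) can verify the match before launching:

```python
# check_gpus.py -- hedged helper (not part of FastPoseGait): verify that the
# planned --nproc_per_node matches the GPUs exposed via CUDA_VISIBLE_DEVICES.
import os
import torch

nproc_per_node = 4  # the value you plan to pass to torch.distributed.launch

visible = os.environ.get("CUDA_VISIBLE_DEVICES", "")
if visible:
    n_visible = len([d for d in visible.split(",") if d.strip()])
else:
    n_visible = torch.cuda.device_count()

assert nproc_per_node == n_visible, (
    f"--nproc_per_node={nproc_per_node}, but {n_visible} GPU(s) are visible"
)
print(f"OK: launching {nproc_per_node} processes on {n_visible} GPU(s)")
```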
Test a model by
CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --nproc_per_node=4 fastposegait/main.py --cfgs ./configs/gaittr/gaittr.yaml --phase test
- `--phase`: Specified as `test`.
You can run commands in dist_test.sh for testing different models.
For developers who wish to build on top of FastPoseGait
Model Zoo
Results and models are available in the model zoo. [Google Drive] [Baidu Netdisk, extraction code: s4jj]
Acknowledgement
- GaitGraph/GaitGraph2: Torben Teepe
- GaitTR: Cun Zhang
- OpenGait Team
- CASIA-B Team
- OUMVLP-Pose Team
- GREW Team
- Gait3D Team
- SUSTech1K Team
- CCPG Team
Citation
If you find this project useful in your research, please consider citing:
@article{meng2023fastposegait,
title={FastPoseGait: A Toolbox and Benchmark for Efficient Pose-based Gait Recognition},
author={Meng, Shibei and Fu, Yang and Hou, Saihui and Cao, Chunshui and Liu, Xu and Huang, Yongzhen},
journal={arXiv preprint arXiv:2309.00794},
year={2023}
}
Note: This code is strictly intended for academic purposes and cannot be used for any form of commercial application.
Authors
This project is built and maintained by ShiBei Meng and Yang Fu. We built this project based on the open-source project OpenGait.
We will keep up with the latest progress of the community, and support more popular algorithms and frameworks. We also appreciate all contributions to improve FastPoseGait. If you have any feature requests, please feel free to leave a comment, file an issue or contact the authors:
- ShiBei Meng, mengshibei@mail.bnu.edu.cn
- Yang Fu, aleeyanger@gmail.com