Shift-GCN

The implementation of "Skeleton-Based Action Recognition with Shift Graph Convolutional Network" (CVPR 2020 oral). Shift-GCN is a lightweight skeleton-based action recognition model that outperforms state-of-the-art methods with more than 10x fewer FLOPs.


Prerequisite

Compile the CUDA extensions:

```shell
cd ./model/Temporal_shift
bash run.sh
```

Data Preparation

Training & Testing

Multi-stream ensemble

To ensemble the results of the 4 streams, change the model names in ensemble.py according to your experiment setting, then run python ensemble.py.
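The ensemble step boils down to a weighted sum of the per-stream class scores followed by a top-1 accuracy check. A minimal sketch of that idea, using NumPy with synthetic scores (the function names, weights, and toy data here are illustrative, not the repository's ensemble.py):

```python
import numpy as np

def ensemble_scores(stream_scores, weights):
    """Weighted sum of per-stream score arrays, each shaped (N, num_classes)."""
    fused = np.zeros_like(np.asarray(stream_scores[0], dtype=np.float64))
    for scores, w in zip(stream_scores, weights):
        fused += w * np.asarray(scores, dtype=np.float64)
    return fused

def top1_accuracy(scores, labels):
    """Fraction of samples whose highest-scoring class matches the label."""
    return float(np.mean(np.argmax(scores, axis=1) == np.asarray(labels)))

# Toy demo: 4 streams of random scores for 5 samples over 3 classes.
# The weights are hypothetical; tune them per experiment setting.
rng = np.random.default_rng(0)
streams = [rng.standard_normal((5, 3)) for _ in range(4)]
labels = np.array([0, 1, 2, 0, 1])
fused = ensemble_scores(streams, weights=[0.6, 0.6, 0.4, 0.4])
acc = top1_accuracy(fused, labels)
```

In practice each stream's scores would be loaded from that stream's saved evaluation output before fusing.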

Trained models

We release several trained models:

| Model | Dataset | Setting | Top1 (%) |
| --- | --- | --- | --- |
| ./save_models/ntu_ShiftGCN_joint_xview.pt | NTU-RGBD | X-view | 95.1 |
| ./save_models/ntu_ShiftGCN_joint_xsub.pt | NTU-RGBD | X-sub | 87.8 |
| ./save_models/ntu120_ShiftGCN_joint_xsetup.pt | NTU-RGBD120 | X-setup | 83.2 |
| ./save_models/ntu120_ShiftGCN_joint_xsub.pt | NTU-RGBD120 | X-sub | 80.9 |

Citation

If you find this model useful for your research, please use the following BibTeX entry.

```bibtex
@inproceedings{cheng2020shiftgcn,
  title     = {Skeleton-Based Action Recognition with Shift Graph Convolutional Network},
  author    = {Ke Cheng and Yifan Zhang and Xiangyu He and Weihan Chen and Jian Cheng and Hanqing Lu},
  booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2020},
}
```