trt_pose

Want to detect hand poses? Check out the new trt_pose_hand project for real-time hand pose and gesture recognition!

<img src="https://user-images.githubusercontent.com/4212806/67125332-71a64580-f1a9-11e9-8ee1-e759a38de215.gif" height=256/>

trt_pose is aimed at enabling real-time pose estimation on NVIDIA Jetson. You may find it useful for other NVIDIA platforms as well. Currently the project includes pre-trained models for real-time human pose estimation (see the table below) and training scripts for keypoint detection tasks on data in MSCOCO format.

To get started, follow the instructions below. If you run into any issues, please let us know.

Getting Started

To get started with trt_pose, follow these steps.

Step 1 - Install Dependencies

  1. Install PyTorch and Torchvision. To do this on NVIDIA Jetson, we recommend following this guide.

  2. Install torch2trt (a quick sanity check is sketched after this list)

    git clone https://github.com/NVIDIA-AI-IOT/torch2trt
    cd torch2trt
    sudo python3 setup.py install --plugins
    
  3. Install other miscellaneous packages

    sudo pip3 install tqdm cython pycocotools
    sudo apt-get install python3-matplotlib
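
As mentioned in step 2, a quick way to sanity-check the torch2trt install is a minimal conversion, along the lines of the usage example in the torch2trt README: convert a small torchvision model and compare outputs. The choice of resnet18 and the input size here are just for illustration.

    import torch
    import torchvision
    from torch2trt import torch2trt

    # Build a small model in eval mode on the GPU
    # (weights don't matter for this check)
    model = torchvision.models.resnet18().cuda().eval()

    # Example input; torch2trt traces and optimizes the model with it
    x = torch.ones((1, 3, 224, 224)).cuda()
    model_trt = torch2trt(model, [x])

    # The TensorRT module should agree with PyTorch to within small error
    print(torch.max(torch.abs(model(x) - model_trt(x))))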
    

Step 2 - Install trt_pose

    git clone https://github.com/NVIDIA-AI-IOT/trt_pose
    cd trt_pose
    sudo python3 setup.py install
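
If the install succeeds, the package should be importable from Python. A quick check, assuming the module layout used by this repository (trt_pose.coco, trt_pose.models):

    # Run from outside the cloned source trees so the installed
    # packages are imported
    import torch
    import torch2trt
    import trt_pose.coco
    import trt_pose.models

    print(torch.__version__, torch.cuda.is_available())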

Step 3 - Run the example notebook

We provide a couple of human pose estimation models pre-trained on the MSCOCO dataset. The throughput in FPS on Jetson Nano and Jetson Xavier is shown for each model below.

| Model | Jetson Nano (FPS) | Jetson Xavier (FPS) | Weights |
|-------|-------------------|---------------------|---------|
| resnet18_baseline_att_224x224_A | 22 | 251 | download (81MB) |
| densenet121_baseline_att_256x256_B | 12 | 101 | download (84MB) |
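
For reference, the following sketch shows how a model from the table is typically constructed, loaded, and optimized, assuming the trt_pose API used by the bundled notebook (trt_pose.coco.coco_category_to_topology, trt_pose.models.resnet18_baseline_att); the weight filename is a placeholder for the file you downloaded.

    import json
    import torch
    import trt_pose.coco
    import trt_pose.models
    from torch2trt import torch2trt

    # human_pose.json (in tasks/human_pose) describes the MSCOCO keypoints
    # and the skeleton links between them
    with open('human_pose.json', 'r') as f:
        human_pose = json.load(f)
    topology = trt_pose.coco.coco_category_to_topology(human_pose)

    num_parts = len(human_pose['keypoints'])
    num_links = len(human_pose['skeleton'])

    # Build the resnet18 variant from the table above; each link is
    # predicted as two part-affinity channels, hence 2 * num_links
    model = trt_pose.models.resnet18_baseline_att(num_parts, 2 * num_links).cuda().eval()
    model.load_state_dict(torch.load('WEIGHTS_FROM_TABLE.pth'))  # placeholder filename

    # Optimize with torch2trt; FP16 mode gives a large speedup on Jetson
    data = torch.zeros((1, 3, 224, 224)).cuda()
    model_trt = torch2trt(model, [data], fp16_mode=True, max_workspace_size=1 << 25)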

To run the live Jupyter Notebook demo on real-time camera input, follow these steps.

  1. Download the model weights using the link in the above table.

  2. Place the downloaded weights in the tasks/human_pose directory.

  3. Open and follow the live_demo.ipynb notebook.

    You may need to modify the notebook depending on which model you use; a sketch of the camera-frame preprocessing the notebook performs follows these steps.
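
For orientation, here is a minimal sketch of the preprocessing applied to each camera frame before inference, assuming the ImageNet normalization used by the bundled notebook; the preprocess helper name is illustrative.

    import cv2
    import PIL.Image
    import torch
    import torchvision.transforms as transforms

    # ImageNet normalization constants (assumed to match the notebook)
    mean = torch.Tensor([0.485, 0.456, 0.406]).cuda()
    std = torch.Tensor([0.229, 0.224, 0.225]).cuda()

    def preprocess(image_bgr):
        # Convert a BGR OpenCV frame, already resized to the model's input
        # size (224x224 or 256x256 depending on the model), to a normalized
        # 1x3xHxW CUDA tensor
        image = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB)
        image = PIL.Image.fromarray(image)
        image = transforms.functional.to_tensor(image).cuda()
        image.sub_(mean[:, None, None]).div_(std[:, None, None])
        return image[None, ...]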

See also

- trt_pose_hand - real-time hand pose and gesture recognition based on trt_pose
- torch2trt - an easy-to-use PyTorch to TensorRT converter, used by this project

References

The trt_pose model architectures listed above are inspired by prior work on real-time human pose estimation, but are not direct replicas. Please review the open-source code and configuration files in this repository for architecture details. If you have any questions, feel free to reach out.