Cross-Modal Deep Variational Hand Pose Estimation

[Teaser figure]

Project page

This repository provides a code base to evaluate the trained models of the paper Cross-Modal Deep Variational Hand Pose Estimation and reproduce the numbers of Table 2. It is a modified version of the code found here by Christian Zimmermann, adapted to run our model.

Recommended system

Recommended system (tested):

Python packages used by the example provided and their recommended version:
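A quick way to confirm that the required packages are importable is a short version check, as sketched below. The concrete package names (TensorFlow, NumPy, SciPy, Matplotlib) are an assumption based on the code base this repository builds on, not a statement from this README.

# Hedged sketch: print the versions of the packages assumed to be required.
# Adjust the list to the actual requirements of this repository.
import importlib

for pkg in ["tensorflow", "numpy", "scipy", "matplotlib"]:
    try:
        module = importlib.import_module(pkg)
        print("{}: {}".format(pkg, getattr(module, "__version__", "unknown")))
    except ImportError:
        print("{}: NOT INSTALLED".format(pkg))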

Preprocessing for training and evaluation

In order to use the training and evaluation scripts you need to download and preprocess the datasets.

Rendered Hand Pose Dataset (RHD)

python create_binary_db.py
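For reference, the step above can be wrapped in a small driver script, as in the sketch below. The dataset directory name ./data/RHD_published_v2 and the assumption that create_binary_db.py is run from the repository root are not taken from this README and may need adjusting.

# Minimal sketch: verify the RHD data is in place, then run the preprocessing
# script shipped with this repository. The directory "./data/RHD_published_v2"
# is an assumed extraction path, not taken from this README.
import os
import subprocess
import sys

rhd_dir = "./data/RHD_published_v2"  # assumed location of the extracted RHD dataset
if not os.path.isdir(rhd_dir):
    sys.exit("RHD dataset not found at {}; download and extract it first.".format(rhd_dir))

subprocess.check_call([sys.executable, "create_binary_db.py"])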

Stereo Tracking Benchmark Dataset (STB)

cd ./data/stb/
matlab -nodesktop -nosplash -r "create_db"
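The same step can also be scripted from Python, as in the sketch below, which only wraps the two commands above; it assumes the matlab executable is on the PATH.

# Minimal sketch: run the STB preprocessing (create_db) non-interactively from
# ./data/stb/. The trailing "exit" closes MATLAB once the script finishes.
import subprocess

subprocess.check_call(
    ["matlab", "-nodesktop", "-nosplash", "-r", "create_db; exit"],
    cwd="./data/stb/",
)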

Trained models

Evaluation

[Table: numbers reported in the paper (Paper) vs. those reproduced by this code (Code), under the hand_side_invariance (H) and scale_invariance (S) settings]
python evaluate_model.py
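To keep a record of the reproduced numbers, the evaluation can be run from a small wrapper that captures its console output, as sketched below; it assumes evaluate_model.py prints its results to standard output and needs no mandatory arguments.

# Minimal sketch: run the evaluation and save its output so the reproduced
# numbers can be compared against Table 2 of the paper. evaluate_model.py is
# assumed to print its results and to require no arguments.
import subprocess
import sys

with open("evaluation_log.txt", "w") as log:
    subprocess.check_call([sys.executable, "evaluate_model.py"], stdout=log)
print("Results written to evaluation_log.txt")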

License and Citation

This project is licensed under the terms of the GPL v2 license. By using the software, you are agreeing to the terms of the license agreement.

If you use this code in your research, please cite us as follows:

@inproceedings{spurr2018cvpr,
    author = {Spurr, Adrian and Song, Jie and Park, Seonwook and Hilliges, Otmar},
    title = {Cross-modal Deep Variational Hand Pose Estimation},
    booktitle = {CVPR},
    year = {2018},
    location = {Salt Lake City, USA},
}