HandDGP: Camera-Space Hand Mesh Predictions with Differentiable Global Positioning (ECCV 2024)

This is the reference PyTorch implementation for the method described in

HandDGP: Camera-Space Hand Mesh Predictions with Differentiable Global Positioning

Eugene Valassakis and Guillermo Garcia-Hernando

Project Page, Video

<p align="center"> <img src="media/teaser.png" width="720" /> </p>

πŸ—ΊοΈ Overview

We provide the model implementation of HandDGP, model weights trained on the FreiHAND dataset, and the code to reproduce our main paper results.

βš™οΈ Setup

We are going to create a new Mamba environment called handdgp. If you don't have Mamba, you can install it with:

make install-mamba

Then setup the environment with:

make mamba-env
mamba activate handdgp

In the code directory, install the repo as a pip package:

pip install -e .

Accept the licences of MobRecon and MANO, then add them to the third_party folder:

mkdir third_party
cd third_party
git clone https://github.com/SeanChenxy/HandMesh.git

πŸ“¦ Trained Models

We provide the model trained on the FreiHAND dataset here. Download the model and place it in the weights directory. Alternatively, you can run the following bash script: scripts/download_weights.sh.
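If you want to confirm the download landed in the right place, a quick check like the one below can help. The checkpoint file extensions (.pt, .pth, .ckpt) are an assumption about the released files, not something specified by this repository:

```python
from pathlib import Path

def find_weights(weights_dir="weights"):
    """List checkpoint-like files under `weights_dir`.

    The extension set is an assumption about the released weight
    files; adjust it if the release uses another format.
    """
    d = Path(weights_dir)
    if not d.is_dir():
        return []
    exts = {".pt", ".pth", ".ckpt"}
    return sorted(p.name for p in d.iterdir() if p.suffix in exts)

if __name__ == "__main__":
    found = find_weights()
    print(found if found else "no checkpoint files found in ./weights")
```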

πŸ’Ύ Data

Accept the licence and download the FreiHAND eval data from here, then extract it to data/freihand/FreiHAND_pub_v2_eval. Alternatively, you can run the following script: scripts/download_data.sh.

🌳 Folder structure

Please make sure to re-create this folder structure:

β”œβ”€β”€ configs
β”œβ”€β”€ data
β”‚   β”œβ”€β”€ freihand
β”‚   β”‚   β”œβ”€β”€ FreiHAND_pub_v2_eval
β”œβ”€β”€ scripts
β”œβ”€β”€ outputs
β”‚   β”œβ”€β”€ <experiment output folders>
β”œβ”€β”€ src
β”‚   β”œβ”€β”€ <source files>
β”œβ”€β”€ weights
β”‚   β”œβ”€β”€ <HandDGP weight files>
β”œβ”€β”€ third_party
β”‚   β”œβ”€β”€ HandMesh
β”œβ”€β”€ LICENSE
β”œβ”€β”€ Makefile
β”œβ”€β”€ pyproject.toml
β”œβ”€β”€ environment.yml
β”œβ”€β”€ README.md
β”œβ”€β”€ setup.py

πŸš€ Running HandDGP

To run HandDGP, execute the following command from the root folder:

python -m src.run --config_file configs/test_freihand.gin

This will generate an output file in the outputs directory with the test results on the FreiHAND dataset in JSON format, which you can use directly with the FreiHAND evaluation code.
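Before submitting the results to the FreiHAND evaluation, a quick sanity check can catch malformed files. The two-list layout sketched below ([21-joint xyz predictions, 778-vertex MANO mesh predictions] per sample) is the standard FreiHAND submission format, and the file path in the usage note is a placeholder; neither is read from this repo's code:

```python
import json

def check_predictions(path):
    """Sanity-check a FreiHAND-style predictions file.

    Assumes the standard FreiHAND submission layout: a JSON array
    [xyz_list, verts_list] with 21 joints and 778 MANO vertices
    per sample. This layout is an assumption about the output,
    not something verified against this repo.
    """
    with open(path) as f:
        xyz_list, verts_list = json.load(f)
    assert len(xyz_list) == len(verts_list), "joint/vertex lists must align"
    assert all(len(j) == 21 for j in xyz_list), "expected 21 joints per sample"
    assert all(len(v) == 778 for v in verts_list), "expected 778 vertices per sample"
    return len(xyz_list)
```

Call it as, e.g., `check_predictions("outputs/<experiment>/pred.json")`, substituting the file your run actually produced.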

πŸ™ Acknowledgements

We would like to thank the authors of the following repositories for their code, models and datasets:

πŸ“œ Citation

If you find our work useful in your research please consider citing our paper:

@inproceedings{handdgp2024,
  title={{HandDGP}: Camera-Space Hand Mesh Prediction with Differentiable Global Positioning},
  author={Valassakis, Eugene and Garcia-Hernando, Guillermo},
  booktitle={Proceedings of the European Conference on Computer Vision (ECCV)},
  year={2024},
}

πŸ‘©β€βš–οΈ License

Copyright Β© Niantic, Inc. 2024. Patent Pending. All rights reserved. Please see the license file for terms.