# HandDGP: Camera-Space Hand Mesh Predictions with Differentiable Global Positioning (ECCV 2024)
This is the reference PyTorch implementation for the method described in

> **HandDGP: Camera-Space Hand Mesh Predictions with Differentiable Global Positioning**

<p align="center"> <img src="media/teaser.png" width="720" /> </p>
## 🗺️ Overview
We provide the model implementation of HandDGP, model weights trained on the FreiHAND dataset, and the code to reproduce the main results of our paper.
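At its core, HandDGP lifts root-relative mesh predictions into camera space via a differentiable global positioning step driven by 2D–3D correspondences. As a rough, self-contained sketch of that idea (not the repository's actual module; the function below and its interface are illustrative), recovering the camera-space root translation under a pinhole camera reduces to a linear least-squares problem that PyTorch can solve, and back-propagate through, in closed form:

```python
import torch

def recover_root_translation(uv, xyz_rel, K):
    """Least-squares root translation from 2D-3D correspondences (sketch).

    uv:      (N, 2) predicted 2D keypoints in pixels
    xyz_rel: (N, 3) predicted root-relative 3D keypoints
    K:       (3, 3) camera intrinsics

    Projecting (xyz_rel + t) with a pinhole camera and rearranging gives,
    per point, two equations that are linear in t = (tx, ty, tz):
        tx - x * tz = x * Z - X
        ty - y * tz = y * Z - Y
    where (x, y) are normalized image coordinates. Stacking all points
    yields an overdetermined system A @ t = b.
    """
    ones = torch.ones(uv.shape[0], 1, dtype=uv.dtype, device=uv.device)
    uv_h = torch.cat([uv, ones], dim=1)              # (N, 3) homogeneous pixels
    xy = (uv_h @ torch.linalg.inv(K).T)[:, :2]       # normalized image coords
    x, y = xy[:, 0], xy[:, 1]
    X, Y, Z = xyz_rel[:, 0], xyz_rel[:, 1], xyz_rel[:, 2]

    zero, one = torch.zeros_like(x), torch.ones_like(x)
    A = torch.stack([
        torch.stack([one, zero, -x], dim=1),         # tx - x * tz = x*Z - X
        torch.stack([zero, one, -y], dim=1),         # ty - y * tz = y*Z - Y
    ], dim=1).reshape(-1, 3)                         # (2N, 3)
    b = torch.stack([x * Z - X, y * Z - Y], dim=1).reshape(-1, 1)

    # torch.linalg.lstsq is differentiable (for full-rank A), so losses on the
    # camera-space mesh can back-propagate into the 2D and 3D predictions.
    return torch.linalg.lstsq(A, b).solution.squeeze(1)   # (3,) translation
```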
## ⚙️ Setup
We are going to create a new Mamba environment called `handdgp`. If you don't have Mamba, you can install it with:

```bash
make install-mamba
```

Then set up the environment with:

```bash
make mamba-env
mamba activate handdgp
```

In the code directory, install the repo as a pip package:

```bash
pip install -e .
```
Accept the licenses of MobRecon and MANO, then clone HandMesh into the `third_party` folder:

```bash
mkdir third_party
cd third_party
git clone https://github.com/SeanChenxy/HandMesh.git
```
## 📦 Trained Models
We provide the model trained on the FreiHAND dataset here. Download the model and place it in the `weights` directory. Alternatively, you can run the bash script `scripts/download_weights.sh`.
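As a quick sanity check of the download, you can inspect the checkpoint with PyTorch. The filename below is a hypothetical placeholder; substitute whatever file the script places in `weights`:

```python
import torch

# Hypothetical filename -- substitute the actual file in weights/.
ckpt = torch.load("weights/handdgp_freihand.pt", map_location="cpu")

# Checkpoints are commonly either a bare state_dict or a dict wrapping one.
state = ckpt.get("state_dict", ckpt) if isinstance(ckpt, dict) else ckpt
print(f"{len(state)} tensors")
for name, tensor in list(state.items())[:5]:
    print(name, tuple(tensor.shape))
```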
## 💾 Data
Accept the license and download the FreiHAND evaluation data from here, then extract it into `data/freihand/FreiHAND_pub_v2_eval`. Alternatively, you can run the script `scripts/download_data.sh`.
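Optionally, a short script can verify the extraction. This assumes the standard FreiHAND v2 evaluation layout (an `evaluation/rgb` image folder next to `evaluation_K.json` with per-sample intrinsics); adjust the paths if your archive differs:

```python
import json
from pathlib import Path

root = Path("data/freihand/FreiHAND_pub_v2_eval")

# Assumed FreiHAND v2 eval layout: JPEG frames plus per-sample intrinsics.
images = sorted((root / "evaluation" / "rgb").glob("*.jpg"))
K_list = json.loads((root / "evaluation_K.json").read_text())

print(f"{len(images)} images, {len(K_list)} intrinsics")
assert len(images) == len(K_list), "image / intrinsics count mismatch"
```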
## 🌳 Folder structure
Please make sure to re-create this folder structure:
```
├── configs
├── data
│   └── freihand
│       └── FreiHAND_pub_v2_eval
├── scripts
├── outputs
│   └── <experiment output folders>
├── src
│   └── <source files>
├── weights
│   └── <HandDGP weight files>
├── third_party
│   └── HandMesh
├── LICENSE
├── Makefile
├── pyproject.toml
├── environment.yml
├── README.md
└── setup.py
```
## 🏃 Running HandDGP
To run HandDGP, please run the following command from the root folder:

```bash
python -m src.run --config_file configs/test_freihand.gin
```

This will generate an output file in the `outputs` directory with the test results on the FreiHAND dataset in JSON format, which you can use directly with the FreiHAND evaluation code.
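The FreiHAND benchmark code expects a JSON file holding a two-element list `[xyz_list, verts_list]`, with a 21×3 keypoint array and a 778×3 vertex array per sample. Assuming the generated file follows that convention (the path below is a hypothetical placeholder), a quick shape check looks like:

```python
import json
from pathlib import Path

# Hypothetical path -- use the file actually written to outputs/.
xyz_list, verts_list = json.loads(Path("outputs/pred.json").read_text())

print(f"{len(xyz_list)} samples")
print(f"first sample: {len(xyz_list[0])}x{len(xyz_list[0][0])} joints, "
      f"{len(verts_list[0])}x{len(verts_list[0][0])} vertices")  # 21x3, 778x3
```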
## 🙏 Acknowledgements
We would like to thank the authors of the following repositories for their code, models, and datasets:

- [HandMesh / MobRecon](https://github.com/SeanChenxy/HandMesh)
- MANO
- FreiHAND
## 📜 Citation
If you find our work useful in your research, please consider citing our paper:

```bibtex
@inproceedings{handdgp2024,
  title={{HandDGP}: Camera-Space Hand Mesh Prediction with Differentiable Global Positioning},
  author={Valassakis, Eugene and Garcia-Hernando, Guillermo},
  booktitle={Proceedings of the European Conference on Computer Vision (ECCV)},
  year={2024},
}
```
## 👩‍⚖️ License
Copyright © Niantic, Inc. 2024. Patent Pending. All rights reserved. Please see the license file for terms.