Handy: Towards a High Fidelity 3D Hand Shape and Appearance Model
<H4 align="center"> Rolandos Alexandros Potamias, Stylianos Ploumpis, Stylianos Moschoglou, Vasileios Triantafyllou, Stefanos Zafeiriou </H4> <p align="center"> Imperial College London</p> <p align="center"> Published in the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2023)</p> <p align="center"><img width="100%" src="figures/teaser_fig.png" /></p>Abstract
Over the last few years, with the advent of virtual and augmented reality, an enormous amount of research has been focused on modelling, tracking and reconstructing human hands. Given their power to express human behaviour, hands have been a very important but challenging component of the human body. Currently, most state-of-the-art reconstruction and pose estimation methods rely on the low-polygon MANO model. Apart from its low polygon count, the MANO model was trained on only 31 adult subjects, which not only limits its expressive power but also imposes unnecessary shape reconstruction constraints on pose estimation methods. Moreover, hand appearance remains almost unexplored and is neglected by the majority of hand reconstruction methods. In this work, we propose "Handy", a large-scale model of the human hand, modelling both shape and appearance, built from over 1200 subjects, which we make publicly available for the benefit of the research community. In contrast to current models, our proposed hand model was trained on a dataset with large diversity in age, gender and ethnicity, which tackles the limitations of MANO and accurately reconstructs out-of-distribution samples. In order to create a high-quality texture model, we trained a powerful GAN which preserves high-frequency details and is able to generate high-resolution hand textures. To showcase the capabilities of the proposed model, we built a synthetic dataset of textured hands and trained a hand pose estimation network to reconstruct both shape and appearance from single images. As demonstrated in an extensive series of quantitative as well as qualitative experiments, our model proves robust against the state of the art and realistically captures the 3D hand shape and pose along with a high-frequency detailed texture, even in adverse "in-the-wild" conditions.
Hand Model
The proposed Handy model is composed of a shape and an appearance model. Both models were constructed after non-rigid registration of the hand template to the raw scans.
Shape Model
After bringing all raw 3D hand scans into dense correspondence with the template, we normalise them to a canonical open-palm pose to avoid capturing unnecessary pose deformations in our final shape model. We construct a deformable hand shape model described by a linear basis of shapes. In particular, using PCA, we build a hand model with $N$ vertices that is described by an orthonormal basis formed by the first $n_c$ principal components $\mathbf{U} \in \mathbb{R}^{3N \times n_c}$ and their associated eigenvalues $\lambda_i$. New hand instances can then be generated from the shape parameters $\boldsymbol{\beta} = [\beta_1, \beta_2, \ldots, \beta_{n_c}] \in \mathbb{R}^{n_c}$ as:
$$\mathcal{B}_s(\boldsymbol{\beta}) = \mathbf{T} + \sum_{i=1}^{n_c} \beta_i \mathbf{U}_i \in \mathbb{R}^{3N},$$ where $\mathbf{T} \in \mathbb{R}^{3N}$ denotes the mean hand shape.
<p align="center"><img width="80%" src="figures/components.jpg" /></p>Additional to the high resolution hand template used in the paper, we release the Handy shape model using MANO template so that it can be directly adapted to any project that uses MANO model.
Appearance Model
To model the hand textures we utilised the powerful StyleGAN-v3 architecture. Such a GAN-based architecture preserves high-frequency details such as veins, wrinkles and nail polish, whilst avoiding the over-smoothing produced by PCA-based texture models.
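For reference, sampling a texture from a trained StyleGAN3 generator follows the standard StyleGAN3 procedure. The sketch below is illustrative only and assumes a hypothetical network pickle handy_texture.pkl; the released sample_uv helper (used in the Usage section below) wraps this step, so the exact file names and loading code may differ. Unpickling the generator requires the StyleGAN3 code base on the Python path.

```python
import pickle
import torch
import PIL.Image

# Illustrative only: 'handy_texture.pkl' is a hypothetical checkpoint name.
# Unpickling requires the StyleGAN3 repository (dnnlib, torch_utils) on the path.
with open('handy_texture.pkl', 'rb') as f:
    G = pickle.load(f)['G_ema'].cuda()    # generator as a torch.nn.Module

z = torch.randn([1, G.z_dim]).cuda()      # random latent code
img = G(z, None)                          # NCHW float32 image in [-1, 1], no class labels

# Convert to an 8-bit RGB UV texture and save it
img = (img.permute(0, 2, 3, 1) * 127.5 + 128).clamp(0, 255).to(torch.uint8)
PIL.Image.fromarray(img[0].cpu().numpy(), 'RGB').save('texture_uv.png')
```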
<p align="center"><img width="80%" src="figures/generated_uvs.jpg" /></p> </br>Public release models
Both the shape and appearance models are publicly available for research and education purposes. To obtain access to the models, you need to complete and sign the user agreement form (user_agreement.pdf, found in this repository). The agreement should be completed by a full-time academic member, signed, and emailed to Rolandos Alexandros Potamias (r.potamias@imperial.ac.uk). We will verify your request and contact you with instructions on how to download the model package. Note that the agreement requires that:
- The models, along with their corresponding derivatives, are used for non-commercial research and education purposes only.
- You agree not to copy, sell, trade, or exploit the models for any commercial purposes.
- In any published research using the models, you cite the following paper:
Handy: Towards a High Fidelity 3D Hand Shape and Appearance Model, R. A. Potamias, S. Ploumpis, S. Moschoglou, V. Triantafyllou and S. Zafeiriou, Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), June 2023.
Usage
Once you have downloaded the hand shape and appearance models, place them under the models folder. The following scripts require the numpy, pickle and trimesh packages.
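If the third-party packages are not already available in your environment, they can be installed with pip (pickle is part of the Python standard library):

pip install numpy trimesh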
Shape
To sample random shapes from the model, use the following script:
import numpy as np
import pickle
import trimesh
with open("./models/Left_Hand_Shape.pkl", 'rb') as f:
hand_model = pickle.load(f)
num_comp = 30
sigma = 2
e = hand_model['eigenvalues'][:num_comp] ## Eigenvalues of the Covariance
T = hand_model['v_template'] ## Mean template
U = hand_model['components'][:num_comp] ## Eigenvectors of the Covariance
w = (np.random.rand(num_comp) - 0.5) * sigma * np.sqrt(e) ## Random sample shape parameters scaled by the eigenvalues
generated_hand = T + (U.T @ w ).reshape(T.shape[0] , 3)
trimesh.Trimesh(generated_hand, hand_model['f'] , process=False).export('./hand_mesh.obj')
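In the script above, sigma = 2 together with the uniform samples in [-0.5, 0.5) keeps each shape parameter within one standard deviation (the square root of the corresponding eigenvalue) of the mean. A common alternative, shown below as an optional variation rather than part of the released scripts, is to draw the parameters from the Gaussian prior implied by PCA, reusing the variables defined above:

```python
# Optional variation: Gaussian sampling of the shape parameters, beta_i ~ N(0, lambda_i).
# Reuses num_comp, e, T and U from the script above.
w = np.random.randn(num_comp) * np.sqrt(e)
generated_hand = T + (U.T @ w).reshape(T.shape[0], 3)
```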
Shape and Texture
In order to generate textured meshes, make sure you have installed the libraries required by StyleGAN3. Once installed, you can sample textured meshes with the following script:
import numpy as np
import pickle
import trimesh
from sample_uv import sample_uv
with open("./models/Left_Hand_Shape.pkl", 'rb') as f:
hand_model = pickle.load(f)
num_comp = 30
sigma = 2
e = hand_model['eigenvalues'][:num_comp] ## Eigenvalues of the Covariance
T = hand_model['v_template'] ## Mean template
U = hand_model['components'][:num_comp] ## Eigenvectors of the Covariance
w = (np.random.rand(num_comp) - 0.5) * sigma * np.sqrt(e) ## Random sample shape parameters scaled by the eigenvalues
generated_hand = T + (U.T @ w ).reshape(T.shape[0] , 3)
texture_uv = sample_uv(np.random.randint(1e5)) # Random seed
generated_hand = generated_hand[hand_model['transform_to_uv']] ## Some vertices are repeated to zip the texture around the hand
trimesh.Trimesh(generated_hand, hand_model['f_uv'],
visual = trimesh.visual.texture.TextureVisuals(hand_model['uv_coords'] , image = texture_uv),
process=False).export('./hand_mesh.obj')
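The Wavefront .obj export stores the texture in separate material/image files. As an optional extra (standard trimesh functionality, not part of the released scripts), the textured mesh can also be previewed interactively or exported to a single self-contained GLB file:

```python
# Optional: interactive preview and single-file export of the textured mesh.
mesh = trimesh.Trimesh(generated_hand, hand_model['f_uv'],
                       visual=trimesh.visual.texture.TextureVisuals(hand_model['uv_coords'], image=texture_uv),
                       process=False)
mesh.show()                     # opens a simple viewer (requires the pyglet package)
mesh.export('./hand_mesh.glb')  # GLB embeds geometry and texture in one binary file
```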
Articulation
Although Handy is a shape and appearance model, it can be articulated using the Linear Blend Skinning of MANO. To use the MANO articulation, download the MANO model from the MANO website (you will need to create an account). All code and data from the MANO website are subject to the MANO license.
Once downloaded, you can run the following command to generate a pickle ('.pkl') file containing the articulated Handy model, in a format similar to 'MANO_RIGHT.pkl':
python transfer_articulation_from_MANO.py --mano_model /PATH_TO_MANO/MANO_RIGHT.pkl --handy_model /PATH_TO_HANDY/Right_Hand_Shape.pkl --output_model ./HANDY_RIGHT.pkl
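Since the output is intended to follow the MANO pickle layout, a quick sanity check is to load it and inspect its contents; the snippet below is a minimal sketch and does not assume any particular set of keys:

```python
import pickle

# Minimal sanity check of the converted model (sketch only).
# encoding='latin1' is only needed for pickles created under Python 2 (e.g. the original MANO files).
with open('./HANDY_RIGHT.pkl', 'rb') as f:
    handy_right = pickle.load(f, encoding='latin1')

for key, value in handy_right.items():
    print(key, getattr(value, 'shape', type(value)))
```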
Citation
If you find this work useful for your research, please consider citing our paper:
@InProceedings{Potamias_2023_CVPR,
author = {Potamias, Rolandos Alexandros and Ploumpis, Stylianos and Moschoglou, Stylianos and Triantafyllou, Vasileios and Zafeiriou, Stefanos},
title = {Handy: Towards a High Fidelity 3D Hand Shape and Appearance Model},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2023},
pages = {4670-4680}
}