# GraspTTA
Hand-Object Contact Consistency Reasoning for Human Grasps Generation (ICCV 2021).
## Demo
### Quick Results Visualization
We provide generated grasps on the out-of-domain HO-3D dataset (saved at `./diverse_grasp/ho3d`). You can visualize the results with:

```bash
python vis_diverse_grasp.py --obj_id=6
```
The visualization will look like this:

*Generated diverse grasps on the out-of-domain HO-3D dataset (the model is trained on the ObMan dataset).*
### You can also generate the grasps by yourself
- First, download the pretrained weights, unzip them, and put them into `checkpoints`.
- Second, download the MANO model files (`mano_v1_2.zip`) from the MANO website. Unzip it and put `mano/models/MANO_RIGHT.pkl` into `models/mano`. Please use this MANO repo.
- Third, download the HO-3D object models, unzip them, and put them into `models/HO3D_Object_models`.
- The structure should look like this:
```
GraspTTA/
  checkpoints/
    model_affordance_best_full.pth
    model_cmap_best.pth
  models/
    HO3D_Object_models/
      003_cracker_box/
        points.xyz
        textured_simple.obj
        resampled.npy
      ......
    mano/
      MANO_RIGHT.pkl
```
- Then, install V-HACD, which is used to build the simulation of grasp displacement (see the sketch after this list). Change this line to your own path.
- Finally, run `run.sh` to install the other dependencies and start generating grasps.
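For reference, V-HACD computes a convex decomposition of the object mesh so the physics simulator can handle it when measuring grasp displacement. If building the standalone V-HACD binary is inconvenient, `pybullet` bundles a V-HACD binding; below is a minimal sketch, not the exact path this repo's code calls, and the file names are placeholders:

```python
import pybullet as p

# Convex decomposition with pybullet's bundled V-HACD.
# Input/output paths are placeholders; point them at your own meshes.
p.vhacd(
    "models/HO3D_Object_models/003_cracker_box/textured_simple.obj",  # input mesh
    "003_cracker_box_vhacd.obj",                                      # decomposed output mesh
    "vhacd_log.txt",                                                  # log file
)
```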
## Generate grasps on custom objects
- First, resample 3000 points on the object surface as the input of the network. You can use this function (a sketch is given after this list).
- Second, write your own dataloader and related code in `gen_diverse_grasp_ho3d.py`.
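If you'd rather not reuse the repo's helpers, the sketch below illustrates both steps under stated assumptions: uniform surface sampling with `trimesh` (the 3000-point count comes from the step above), plus a minimal PyTorch `Dataset` wrapper. `my_object.obj`, `CustomObjectDataset`, and the returned dictionary keys are hypothetical; adapt them to whatever interface `gen_diverse_grasp_ho3d.py` actually expects.

```python
import numpy as np
import trimesh
from torch.utils.data import Dataset


def resample_surface_points(mesh_path, n_points=3000):
    """Uniformly sample n_points on the mesh surface (network input)."""
    mesh = trimesh.load(mesh_path, force='mesh')
    points, _ = trimesh.sample.sample_surface(mesh, n_points)
    return np.asarray(points, dtype=np.float32)  # shape (n_points, 3)


class CustomObjectDataset(Dataset):
    """Hypothetical loader: yields resampled point clouds for custom meshes."""

    def __init__(self, mesh_paths, n_points=3000):
        self.mesh_paths = mesh_paths
        self.n_points = n_points

    def __len__(self):
        return len(self.mesh_paths)

    def __getitem__(self, idx):
        points = resample_surface_points(self.mesh_paths[idx], self.n_points)
        # The key name is an assumption; match the HO-3D loader in this repo.
        return {'obj_points': points, 'mesh_path': self.mesh_paths[idx]}


# Example: resample one custom object and save the points for later use.
pts = resample_surface_points('my_object.obj')
np.save('my_object_resampled.npy', pts)
```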
## Training code
Please email me if you are interested in the training code. I don't have the bandwidth to clean it up right now, but I am happy to provide a raw version of it.
Citation
@inproceedings{jiang2021graspTTA,
title={Hand-Object Contact Consistency Reasoning for Human Grasps Generation},
author={Jiang, Hanwen and Liu, Shaowei and Wang, Jiashun and Wang, Xiaolong},
booktitle={Proceedings of the International Conference on Computer Vision},
year={2021}
}
## Acknowledgments
We thank: