# Learning Joint Reconstruction of Hands and Manipulated Objects - ObMan dataset

Yana Hasson, Gül Varol, Dimitris Tzionas, Igor Kalevatykh, Michael J. Black, Ivan Laptev, Cordelia Schmid, CVPR 2019
# Download required files

## Download dataset images and data
- Request the dataset on the ObMan webpage. Note that use of the data is subject to the license terms stated there.
- Once you have accepted the license, send an e-mail to yana.hasson.inria@gmail.com with "ObMan data request" as the subject.
- Unzip `obman.zip` to `/path/to/obman`.
- Your dataset structure should look like:

```
obman/
  test/
    rgb/
    rgb_obj/
    meta/
    ...
  val/
    rgb/
    rgb_obj/
    meta/
    ...
```
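After unzipping, it can be worth verifying that the expected split/subfolder layout is in place before running anything. A minimal sketch (the helper name is hypothetical, not part of the obman repo):

```python
from pathlib import Path

# Subfolders expected under each split, per the layout above.
EXPECTED_SUBDIRS = ("rgb", "rgb_obj", "meta")

def check_obman_layout(root, splits=("test", "val")):
    """Return a list of expected directories missing under `root`."""
    root = Path(root)
    missing = []
    for split in splits:
        for sub in EXPECTED_SUBDIRS:
            d = root / split / sub
            if not d.is_dir():
                missing.append(str(d))
    return missing

if __name__ == "__main__":
    missing = check_obman_layout("/path/to/obman")
    print("Missing directories:" if missing else "Dataset layout looks OK", missing or "")
```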
## Download object meshes

- Download object models from ShapeNet:
  - Create an account on shapenet.org
  - Download the models from the download page
## Download code

```sh
git clone https://github.com/hassony2/obman
cd obman
```
# Load samples

```sh
python readataset.py --root /path/to/obman --shapenet_root /path/to/ShapeNetCore.v2 --split test --viz
```

Options you might be interested in:

- `--segment`: keep only the foreground
- `--mini_factor 0.01`: load only 1% of the data (to speed up loading)
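If you prefer to read individual samples directly rather than go through the loading script, the sketch below shows the general idea, assuming one annotation pickle per frame in `meta/` named like its `rgb/` image; the function name, file naming, and extensions are assumptions to adapt to the actual data:

```python
import pickle
from pathlib import Path

def load_sample(root, split, idx):
    """Return (image path, metadata dict) for one frame.

    Assumes meta/ holds one pickle per frame, named like the rgb
    frames (e.g. 00000042.pkl / 00000042.jpg) -- adjust as needed.
    """
    root = Path(root)
    frame = f"{idx:08d}"
    img_path = root / split / "rgb" / f"{frame}.jpg"
    with open(root / split / "meta" / f"{frame}.pkl", "rb") as f:
        meta = pickle.load(f)
    return img_path, meta
```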
# Preprocess ShapeNet objects for training

Sample points on the external surface of the object:

```sh
python shapenet_samplepoints.py
```
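The standard technique for sampling points uniformly on a mesh surface is area-weighted triangle selection followed by uniform barycentric coordinates. A generic NumPy sketch of that technique (not the repo's actual implementation):

```python
import numpy as np

def sample_surface(vertices, faces, n_points, rng=None):
    """Uniformly sample n_points on a triangle mesh surface."""
    rng = np.random.default_rng() if rng is None else rng
    v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
    # Triangle areas from the cross product of two edge vectors.
    areas = 0.5 * np.linalg.norm(np.cross(v1 - v0, v2 - v0), axis=1)
    # Pick triangles with probability proportional to their area.
    tri = rng.choice(len(faces), size=n_points, p=areas / areas.sum())
    # Uniform barycentric coordinates; reflect samples that fall
    # outside the triangle back into it.
    u, v = rng.random(n_points), rng.random(n_points)
    flip = u + v > 1.0
    u[flip], v[flip] = 1.0 - u[flip], 1.0 - v[flip]
    w = 1.0 - u - v
    return w[:, None] * v0[tri] + u[:, None] * v1[tri] + v[:, None] * v2[tri]
```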
# Visualizations

Hand and object meshes in camera coordinates.

Projected in pixel space: hand vertices in blue, object vertices in red.
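Projecting the hand and object vertices from camera coordinates into pixel space is a standard pinhole projection: multiply by the 3x3 intrinsics matrix, then divide by depth. A minimal sketch (function and argument names are assumptions, not the repo's API):

```python
import numpy as np

def project_points(points_3d, cam_intr):
    """Project Nx3 camera-coordinate points to Nx2 pixel coordinates.

    Pinhole model: p_hom = K @ X, then divide by depth.
    """
    proj = points_3d @ cam_intr.T  # (N, 3) homogeneous pixel coords
    return proj[:, :2] / proj[:, 2:3]
```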
# Citations

If you find this dataset useful for your research, consider citing:

```
@INPROCEEDINGS{hasson19_obman,
  title = {Learning joint reconstruction of hands and manipulated objects},
  author = {Hasson, Yana and Varol, G{\"u}l and Tzionas, Dimitris and Kalevatykh, Igor and Black, Michael J. and Laptev, Ivan and Schmid, Cordelia},
  booktitle = {CVPR},
  year = {2019}
}
```