Bringing Inputs to Shared Domains for 3D Interacting Hands Recovery in the Wild
Our new Re:InterHand dataset has been released, which provides much more diverse image appearances and more stable 3D GT. Check it out here!
Introduction
This repo is the official PyTorch implementation of Bringing Inputs to Shared Domains for 3D Interacting Hands Recovery in the Wild (CVPR 2023).
<p align="middle"> <img src="assets/teaser.png" width="1200" height="250"> </p> <p align="middle"> <img src="assets/demo1.png" width="250" height="150"><img src="assets/demo2.png" width="250" height="150"><img src="assets/demo3.png" width="250" height="150"><img src="assets/demo4.png" width="250" height="150"><img src="assets/demo5.png" width="250" height="150"><img src="assets/demo6.png" width="250" height="150"> </p>
Demo
- Prepare the `human_model_files` folder following the below `Directory` part and place it at `common/utils/human_model_files`.
- Move to the `demo` folder.
- Download the pre-trained InterWild from here.
- Put input images at `images`. Each image should be a cropped image that contains a single human, for example obtained with a human detector (see the cropping sketch below). We have a hand detection network, so no need to worry about the hand positions!
- Run `python demo.py --gpu $GPU_ID`.
- Boxes, meshes, MANO parameters, and renderings are saved at `boxes`, `meshes`, `params`, and `renders`, respectively.
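The human detector is not part of this repo; the sketch below (assuming `torchvision` and Pillow are installed, with hypothetical file names) shows one way to produce such single-person crops before copying them into `images`:

```python
# Hypothetical pre-processing sketch: crop each detected person with an
# off-the-shelf torchvision detector before placing the crops in demo/images.
# This script is NOT part of InterWild; any human detector that outputs
# person boxes works equally well.
import os
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights='DEFAULT')
detector.eval()

os.makedirs('images', exist_ok=True)
img = Image.open('full_scene.jpg').convert('RGB')  # hypothetical input image
with torch.no_grad():
    out = detector([to_tensor(img)])[0]

for i, (box, label, score) in enumerate(zip(out['boxes'], out['labels'], out['scores'])):
    if label.item() == 1 and score.item() > 0.9:  # COCO class 1 = person
        x1, y1, x2, y2 = box.int().tolist()
        img.crop((x1, y1, x2, y2)).save(f'images/person_{i}.png')
```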
Directory
Root
The `${ROOT}` is described as below.
```
${ROOT}
|-- data
|-- demo
|-- common
|-- main
|-- output
```
- `data` contains data loading codes and soft links to images and annotations directories.
- `demo` contains the demo code.
- `common` contains kernel code. You should put `MANO_RIGHT.pkl` and `MANO_LEFT.pkl` at `common/utils/human_model_files/mano`, which are available here (a sanity-check sketch follows this list).
- `main` contains high-level codes for training or testing the network.
- `output` contains logs, trained models, visualized outputs, and test results.
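As a quick sanity check, the sketch below (an illustration only; it assumes the `smplx` package and is not required by the repo) verifies that the two MANO files are in place and loadable:

```python
# Sketch: verify the MANO model files are where the code expects them.
# smplx is used here only as one common way to load MANO; not part of this repo's instructions.
import os
import smplx

mano_dir = 'common/utils/human_model_files/mano'
for fname in ('MANO_RIGHT.pkl', 'MANO_LEFT.pkl'):
    assert os.path.isfile(os.path.join(mano_dir, fname)), f'missing {fname}'

# smplx.create expects the parent folder that contains the "mano" subfolder
mano = smplx.create('common/utils/human_model_files', 'mano',
                    is_rhand=True, use_pca=False)
print(mano)  # prints the MANO layer if the files load correctly
```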
Data
You need to follow the directory structure of the `data` folder as below.
```
${ROOT}
|-- data
| |-- InterHand26M
| | |-- annotations
| | | |-- train
| | | |-- test
| | |-- images
| |-- MSCOCO
| | |-- annotations
| | | |-- coco_wholebody_train_v1.0.json
| | | |-- coco_wholebody_val_v1.0.json
| | | |-- MSCOCO_train_MANO_NeuralAnnot.json
| | |-- images
| | | |-- train2017
| | | |-- val2017
| |-- HIC
| | |-- data
| | | |-- HIC.json
| |-- ReInterHand
| | |-- data
| | | |-- m--*
```
- Download InterHand2.6M [HOMEPAGE]. `images` contains images in 5 fps, and `annotations` contains the `H+M` subset.
- Download the whole-body version of MSCOCO [HOMEPAGE]. `MSCOCO_train_MANO_NeuralAnnot.json` can be downloaded from [here].
- Download HIC [HOMEPAGE] [annotations]. You need to download 1) all `Hand-Hand Interaction` sequences (`01.zip`-`14.zip`), 2) some of the `Hand-Object Interaction` sequences (`15.zip`-`21.zip`), and 3) MANO fits. Or you can simply run `python download.py` in the `data/HIC` folder.
- Download ReInterHand [HOMEPAGE] at `data/ReInterHand/data` (a layout-check sketch follows this list).
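After downloading everything, a small check like the sketch below (paths taken from the tree above; the script itself is not part of the repo) confirms the `data` layout:

```python
# Sketch: confirm the expected data layout described above. Not part of the repo.
import os
import glob

expected = [
    'data/InterHand26M/annotations/train',
    'data/InterHand26M/annotations/test',
    'data/InterHand26M/images',
    'data/MSCOCO/annotations/coco_wholebody_train_v1.0.json',
    'data/MSCOCO/annotations/coco_wholebody_val_v1.0.json',
    'data/MSCOCO/annotations/MSCOCO_train_MANO_NeuralAnnot.json',
    'data/MSCOCO/images/train2017',
    'data/MSCOCO/images/val2017',
    'data/HIC/data/HIC.json',
]
for path in expected:
    print(('OK  ' if os.path.exists(path) else 'MISS') + ' ' + path)

# ReInterHand capture folders are named m--* (see the tree above)
print('ReInterHand captures:', glob.glob('data/ReInterHand/data/m--*'))
```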
Output
You need to follow the directory structure of the `output` folder as below.
```
${ROOT}
|-- output
| |-- log
| |-- model_dump
| |-- result
| |-- vis
```
- `log` folder contains training log files.
- `model_dump` folder contains saved checkpoints for each epoch.
- `result` folder contains final estimation files generated in the testing stage.
- `vis` folder contains visualized results. (See the folder-creation sketch below.)
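If any of these folders are missing, they can be created up front with a short script like the sketch below (the training code may also create some of them on its own; this is just a convenience):

```python
# Sketch: create the output folder structure described above.
import os

for sub in ('log', 'model_dump', 'result', 'vis'):
    os.makedirs(os.path.join('output', sub), exist_ok=True)
```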
Running InterWild
Start
- Prepare the `human_model_files` folder following the above `Directory` part and place it at `common/utils/human_model_files`.
Train
In the `main` folder, run `python train.py --gpu 0-3` to train the network on GPUs 0,1,2,3. `--gpu 0,1,2,3` can be used instead of `--gpu 0-3`. If you want to continue an experiment, use `--continue`.
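Both GPU formats work because the range form can be expanded into an explicit id list; the sketch below only illustrates the idea and may differ from the repo's actual argument handling:

```python
# Illustrative sketch of expanding the --gpu flag; the repo's own
# argument handling may differ in details.
def expand_gpu_ids(gpu_arg: str) -> str:
    """Turn '0-3' into '0,1,2,3'; pass '0,1,2,3' through unchanged."""
    if '-' in gpu_arg:
        start, end = map(int, gpu_arg.split('-'))
        return ','.join(str(i) for i in range(start, end + 1))
    return gpu_arg

assert expand_gpu_ids('0-3') == '0,1,2,3'
assert expand_gpu_ids('0,1,2,3') == '0,1,2,3'
```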
Test
- Checkpoint trained on IH26M (H+M) + MSCOCO. FYI, all experimental results in the paper are from a checkpoint trained on IH26M (H) + MSCOCO.
- Checkpoint trained on IH26M (H+M) + MSCOCO + ReInterHand (Mugsy_cameras).
- Checkpoint trained on IH26M (H+M) + MSCOCO + ReInterHand (Ego_cameras).
- Place the checkpoint at `output/model_dump`.
- Or, if you want to test with your own trained model, place your model at `output/model_dump`.
- For the evaluation on the InterHand2.6M dataset, we evaluated all methods in the paper on the `human_annot` subset of InterHand2.6M using `data/InterHand26M/aid_human_annot_test.txt`.
In the `main` folder, run `python test.py --gpu 0-3 --test_epoch 6` to test the network on GPUs 0,1,2,3 with `snapshot_6.pth`. `--gpu 0,1,2,3` can be used instead of `--gpu 0-3`.
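`--test_epoch N` selects `snapshot_N.pth` from `output/model_dump`, so it can help to confirm the file exists before launching the test (a sketch, not part of the repo):

```python
# Sketch: confirm the checkpoint selected by --test_epoch exists before testing.
import os

test_epoch = 6  # matches --test_epoch 6
ckpt = os.path.join('output', 'model_dump', f'snapshot_{test_epoch}.pth')
assert os.path.isfile(ckpt), f'{ckpt} not found; place the checkpoint there first'
print('Will test with', ckpt)
```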
Reference
```
@inproceedings{moon2023interwild,
  author = {Moon, Gyeongsik},
  title = {Bringing Inputs to Shared Domains for {3D} Interacting Hands Recovery in the Wild},
  booktitle = {CVPR},
  year = {2023}
}

@inproceedings{moon2023reinterhand,
  title = {A Dataset of Relighted {3D} Interacting Hands},
  author = {Moon, Gyeongsik and Saito, Shunsuke and Xu, Weipeng and Joshi, Rohan and Buffalini, Julia and Bellan, Harley and Rosen, Nicholas and Richardson, Jesse and Mize, Mallorie and Bree, Philippe and Simon, Tomas and Peng, Bo and Garg, Shubham and McPhail, Kevyn and Shiratori, Takaaki},
  booktitle = {NeurIPS Track on Datasets and Benchmarks},
  year = {2023},
}
```
License
This repo is CC-BY-NC 4.0 licensed, as found in the LICENSE file.