BiHand - 3D Hand Mesh Reconstruction

This repo contains the model, demo, and training code for our paper: "BiHand: Recovering Hand Mesh with Multi-stage Bisected Hourglass Networks" (PDF) (BMVC 2020)

<img src="assets/teaser.png">

Get the code

git clone --recursive https://github.com/lixiny/bihand.git
cd bihand

Install Requirements

Install the dependencies listed in environment.yml through conda:
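Assuming the standard conda workflow (the environment name is declared inside environment.yml; `bihand` below is a guess):

```shell
# create the environment from the repo's spec, then activate it
conda env create -f environment.yml
conda activate bihand  # substitute the name declared in environment.yml
```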

If the environment resolves cleanly, you are done. However, we found that installing opendr is tricky; we resolved its build errors by installing the required Mesa development packages first:

sudo apt-get install libglu1-mesa-dev freeglut3-dev mesa-common-dev
sudo apt-get install libosmesa6-dev
# then reinstall opendr
pip install opendr

Download and Prepare Datasets

Your data folder structure should now look like this:

data/
    RHD/
        RHD_published_v2/
            evaluation/
            training/
            view_sample.py
            ...

    STB/
        images/
            B1Counting/
                SK_color_0.png
                SK_depth_0.png
                SK_depth_seg_0.png  <-- merged from STB_supp
                ...
            ...
        labels/
            B1Counting_BB.mat
            ...
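To catch path mistakes before launching training, a small sanity check like the following can verify the layout above (the entries mirror the tree; the `data` root location is an assumption):

```python
import os

# Expected sub-paths under the data/ root, mirroring the tree above.
EXPECTED = [
    "RHD/RHD_published_v2/evaluation",
    "RHD/RHD_published_v2/training",
    "STB/images/B1Counting",
    "STB/labels/B1Counting_BB.mat",
]

def missing_entries(data_root):
    """Return the expected entries that are absent under data_root."""
    return [p for p in EXPECTED if not os.path.exists(os.path.join(data_root, p))]

# Example: report anything missing before launching training.
# missing = missing_entries("data")
# if missing:
#     raise FileNotFoundError("missing dataset entries: %s" % missing)
```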

Download and Prepare model files

MANO model

Your manopth folder structure should now look like this:

manopth/
  mano/
    models/
      MANO_LEFT.pkl
      MANO_RIGHT.pkl
      ...
  manopth/
    __init__.py
    ...

BiHand models

Now your bihand folder should look like this:

BiHand-test/
    bihand/
    released_checkpoints/
        ├── ckp_seednet_all.pth.tar
        ├── ckp_siknet_synth.pth.tar
        ├── rhd/
        │   ├── ckp_liftnet_rhd.pth.tar
        │   └── ckp_siknet_rhd.pth.tar
        └── stb/
            ├── ckp_liftnet_stb.pth.tar
            └── ckp_siknet_stb.pth.tar
    data/
    ...

Launch Demo & Eval

export PYTHONPATH=/path/to/bihand:$PYTHONPATH
python run.py \
    --batch_size 8 --fine_tune rhd --checkpoint released_checkpoints --data_root data
python run.py \
    --batch_size 8 --fine_tune stb --checkpoint released_checkpoints --data_root data
<img src="assets/stb_demo.gif" width="480">
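The `export PYTHONPATH` line simply puts the repo root on Python's module search path so that `import bihand` resolves. The same effect can be achieved from inside a script (the path shown is a placeholder, as above):

```python
import sys

# Equivalent of `export PYTHONPATH=/path/to/bihand:$PYTHONPATH`:
# prepend the repository root so `import bihand` finds the package.
repo_root = "/path/to/bihand"  # adjust to your checkout
if repo_root not in sys.path:
    sys.path.insert(0, repo_root)
```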

Training

Following the multi-stage training scheme, we first train SeedNet for 100 epochs:

python training/train_seednet.py --net_modules seed --datasets stb rhd --ups_loss

and then exploit its outputs to train LiftNet for another 100 epochs:

python training/train_liftnet.py \
    --net_modules seed lift \
    --datasets stb rhd \
    --resume_seednet_pth ${path_to_your_SeedNet_checkpoints (xxx.pth.tar)} \
    --ups_loss \
    --train_batch 16

For SIKNet, first pre-train it on the synthetic SIK-1M dataset, then fine-tune it with SeedNet and LiftNet frozen:

python training/train_siknet_sik1m.py
python training/train_siknet.py \
    --fine_tune ${stb, rhd} \
    --frozen_seednet_pth ${path_to_your_SeedNet_checkpoints} \
    --frozen_liftnet_pth ${path_to_your_LiftNet_checkpoints} \
    --resume_siknet_pth ${path_to_your_SIKNet_SIK-1M_checkpoints}

# e.g.
python training/train_siknet.py \
    --fine_tune rhd \
    --frozen_seednet_pth released_checkpoints/ckp_seednet_all.pth.tar \
    --frozen_liftnet_pth released_checkpoints/rhd/ckp_liftnet_rhd.pth.tar \
    --resume_siknet_pth released_checkpoints/ckp_siknet_synth.pth.tar
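The flag names above encode the staging rule: `--resume_*` loads a checkpoint and keeps training it, while `--frozen_*` loads a checkpoint and locks its weights. A minimal pure-Python sketch of that wiring (the stage classes and step counts are illustrative, not the repo's API):

```python
class Stage:
    """A toy network stage holding a 'weight' and a trainable flag."""
    def __init__(self, name):
        self.name = name
        self.weight = 0.0
        self.trainable = True

    def load(self, checkpoint):
        self.weight = checkpoint[self.name]

    def freeze(self):
        self.trainable = False

def train(stages, steps=1):
    """Nudge only the trainable stages, mimicking optimizer steps."""
    for _ in range(steps):
        for s in stages:
            if s.trainable:
                s.weight += 1.0

# Stage 1: train SeedNet alone, then snapshot it.
seed = Stage("seed")
train([seed], steps=100)
ckpt_seed = {"seed": seed.weight}

# Stage 2: LiftNet trains on SeedNet outputs; SeedNet is resumed
# (--resume_seednet_pth), so it keeps training here.
lift = Stage("lift")
seed.load(ckpt_seed)
train([seed, lift], steps=100)

# Stage 3: SIKNet trains with SeedNet and LiftNet frozen
# (--frozen_seednet_pth / --frozen_liftnet_pth).
sik = Stage("sik")
seed.freeze()
lift.freeze()
frozen = (seed.weight, lift.weight)
train([seed, lift, sik], steps=10)
assert (seed.weight, lift.weight) == frozen  # frozen stages unchanged
```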

Limitation

Currently, the released version of BiHand requires camera intrinsics, root depth, and bone length as inputs, and thus cannot be applied to in-the-wild images.

Citation

If you find this work helpful, please consider citing us:

@inproceedings{yang2020bihand,
  title     = {BiHand: Recovering Hand Mesh with Multi-stage Bisected Hourglass Networks},
  author    = {Yang, Lixin and Li, Jiasen and Xu, Wenqiang and Diao, Yiqun and Lu, Cewu},
  booktitle = {BMVC},
  year      = {2020}
}

Acknowledgement