Introduction
This repository holds the code for running the MM-HAND model proposed in <b>MM-HAND: 3D-Aware Multi-Modal Guided Hand Generation for Pose Data Augmentation</b>, submitted to the ACM-MM 2020 conference.
Quick Start
Environment
We tested our code on Ubuntu 19.10 with CUDA 10.1 and PyTorch v1.4.0.
- Clone this repository:
$ git clone https://github.com/ScottHoang/mm-hand.git
$ cd ./mm-hand
- Create a new Python environment for PyTorch (e.g., with conda or virtualenv).
- Install PyTorch.
- Install NVIDIA's APEX by following the official instructions. Do not clone the APEX repo inside this repository's directory; clone it somewhere else, as sketched below.
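A minimal sketch of the APEX install, assuming it is cloned into a sibling directory; the build flags below follow NVIDIA's APEX README at the time, so check the official instructions for the currently recommended command:
$ cd ..
$ git clone https://github.com/NVIDIA/apex
$ cd apex
$ pip install -v --no-cache-dir --global-option="--cpp_ext" --global-option="--cuda_ext" ./
$ cd ../mm-hand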
- Install the remaining dependencies:
$ pip install -r requirements.txt
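As an optional quick check (not part of the original instructions), verify that PyTorch sees the GPU and that APEX imports cleanly:
$ python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
$ python -c "from apex import amp; print('apex ok')"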
Data
- Create a datasets folder (assuming the current working directory is the root of this repo):
$ mkdir ./datasets
- Download the Rendered Hand Pose (RHD) dataset.
- Download the Stereo Hand Pose Tracking Benchmark (STB) dataset.
- Unzip the downloaded datasets.
- Run the conversion scripts to build the processed datasets:
$ python ./tools/create_STB_DB.py [Path to downloaded STB dataset] ./datasets/stb_dataset 256
$ python ./tools/create_RHD_DB.py [PATH to downloaded RHD dataset] ./datasets/rhd_dataset 256
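As an optional sanity check (the exact directory layout depends on the conversion scripts), confirm that the processed datasets were written and are non-empty:
$ ls ./datasets
$ find ./datasets/rhd_dataset ./datasets/stb_dataset -type f | wc -l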
Run Script
<b>Be sure to read the options available within the scripts.</b>
$ bash ./scripts/mm-train-ratio.sh
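To see which scripts are provided and to review their options, inspect the scripts directory directly (any pager or editor works), e.g.:
$ ls ./scripts
$ less ./scripts/mm-train-ratio.sh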