## Intro

This is the official repository for the following paper:

**MotionAug: Augmentation with Physical Correction for Human Motion Prediction**, CVPR 2022

Download the paper here.
## Prerequisites

- Python 3.6.9
## Dependencies

- Refer to the DeepMimic repo to install the following:
  - BulletPhysics
  - Eigen
  - OpenGL
  - freeglut
  - glew
  - swig
  - MPI

  For the BulletPhysics installation, do not forget the option `-DUSE_DOUBLE_PRECISION=OFF` in `build_cmake_pybullet_double.sh`.
  If building BulletPhysics from source fails, you can try `sudo apt install libbullet-dev` instead.
  Edit `DeepMimicCore/Makefile` to specify the paths to these libraries.
- JAVA installation

  We also use the caliko IK library, which implements the FABRIK algorithm.
  The IK package for the CMU Mocap bone is included in `/lib/caliko`. To install Java, run `sudo apt install default-jre`.
  The Java library refers to `$JAVA_HOME`, so please set it.
  (In my case, `export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64/`.)
  (You may also need `sudo ldconfig /usr/lib/jvm/java-11-openjdk-amd64/lib/server /usr/lib64`.)
- To compile the simulation environment:

  ```
  cd DeepMimicCore
  make python -j8
  ```
- To create the Python environment:

  ```
  pip install -r requirements.txt
  ```
## Data preparation

```
bash prepare_data.sh
```

This command unzips the HDM05 motion dataset, aligns the left/right orientation of the motions, splits them into individual motion clips, and converts the motions into NPZ format.
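To sanity-check the generated `.npz` files, a minimal sketch may help. The array names stored by `prepare_data.sh` are not documented here, so this hypothetical helper simply reports whatever the archive contains:

```python
import numpy as np

def inspect_npz(path):
    """Return a {array_name: shape} mapping for a .npz archive.

    The exact array names written by prepare_data.sh are not documented
    here, so this reports whatever the archive happens to contain.
    """
    with np.load(path, allow_pickle=True) as archive:
        return {name: archive[name].shape for name in archive.files}
```

For example, `inspect_npz("dataset/dataset_walk.npz")` (hypothetical filename) returns the name and shape of every array stored in that file.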
## Augmentations

Currently, we support the following action classes:

<p align="center"> {kick, punch, walk, jog, sneak, grab, deposit, throw} </p>

You can skip these augmentation steps and download the datasets from here.
- IK without motion correction

  ```
  python generate_bvh_dataset.py --aug IK_kin --act_class {action class}
  ```
- VAE without motion correction

  ```
  python vae_script.py --act_class {action_class} --gpu {gpu id}
  python generate_bvh_dataset.py --aug VAE_kin --act_class {action class}
  ```
- IK with physical correction (takes several days to finish)

  ```
  python train_ik.py --act_class {act_class} --num_threads {total cpu threads to use}
  python generate_bvh_dataset.py --aug IK_phys --act_class {act_class}
  ```
- VAE with physical correction (takes several days to finish)

  ```
  python vae_script.py --act_class {action_class} --gpu {gpu id}
  python train_vae.py --act_class {act_class} --num_threads {total cpu threads to use}
  python generate_bvh_dataset.py --aug VAE_phys --act_class {act_class}
  ```
- IK & VAE with physical correction & motion debiasing (takes several days to finish; proposed method)

  ```
  python train_ik.py --act_class {act_class} --num_threads {total cpu threads to use}
  python vae_script.py --act_class {action_class} --gpu {gpu id}
  python train_vae.py --act_class {act_class} --num_threads {total cpu threads to use}
  python generate_bvh_dataset.py --aug VAE_phys --act_class {act_class}
  python generate_bvh_dataset.py --aug IK_phys --act_class {act_class}
  python generate_bvh_dataset.py --aug Fixed_phys --act_class {act_class}
  cd evaluate/DTW
  python debias.py --debiaser_type NN --phys_data_npz ../dataset/dataset_Fixed_phys_{act_class}.npz --aug_data_npz ../dataset/dataset_VAE_phys_{act_class}.npz --act_class {act_class}
  python debias.py --debiaser_type NN --phys_data_npz ../dataset/dataset_Fixed_phys_{act_class}.npz --aug_data_npz ../dataset/dataset_IK_phys_{act_class}.npz --act_class {act_class}
  ```
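Each augmentation has to be run once per action class. As a purely illustrative sketch (not part of the repository), a small Python driver could loop over all classes; the script names and flags are copied verbatim from the commands above, shown here for the IK-with-physical-correction variant:

```python
import subprocess

# Action classes supported by the augmentation scripts (from this README).
ACTION_CLASSES = ["kick", "punch", "walk", "jog", "sneak",
                  "grab", "deposit", "throw"]

def ik_phys_commands(act_class, num_threads=8):
    """Build the 'IK with physical correction' command sequence for one class."""
    return [
        ["python", "train_ik.py",
         "--act_class", act_class, "--num_threads", str(num_threads)],
        ["python", "generate_bvh_dataset.py",
         "--aug", "IK_phys", "--act_class", act_class],
    ]

def run_all(dry_run=True):
    """Print (dry run) or execute the commands for every action class."""
    for act_class in ACTION_CLASSES:
        for cmd in ik_phys_commands(act_class):
            if dry_run:
                print(" ".join(cmd))
            else:
                subprocess.run(cmd, check=True)
```

Calling `run_all(dry_run=False)` would run the full pipeline sequentially; note the README warns each class can take several days.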
## Evaluation

So far, we have prepared the following augmentation options:

```
NOAUG, NOISE                                # previous methods
VAE, IK, VAE_IK                             # augmentation without motion correction
VAE_PHYSICAL, IK_PHYSICAL, VAE_IK_PHYSICAL  # augmentation with physical correction
VAE_PHYSICAL_OFFSET_NN, IK_PHYSICAL_OFFSET_NN, VAE_IK_PHYSICAL_OFFSET_NN
                                            # augmentation with physical correction & motion debiasing
```
You can choose a human motion prediction model from RNN, GCN, and Transformer (RNN: `seq2seq`, GCN: `GCN`, Transformer: `transformer`).
Please edit `aug_mode`, `actions`, and `model_types` in `evaluate/LearnTrajDep/run_train.py`.

```
cd evaluate/LearnTrajDep/
python run_train.py
```
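The exact layout of those variables inside `run_train.py` is not reproduced here; as a hypothetical illustration only, the edit might look like the following (variable names from the README, option and model strings from the lists above):

```python
# Hypothetical sketch of the three variables to edit in
# evaluate/LearnTrajDep/run_train.py; the real file's format may differ.
AUG_OPTIONS = [
    "NOAUG", "NOISE",                                  # previous methods
    "VAE", "IK", "VAE_IK",                             # without motion correction
    "VAE_PHYSICAL", "IK_PHYSICAL", "VAE_IK_PHYSICAL",  # with physical correction
    "VAE_PHYSICAL_OFFSET_NN", "IK_PHYSICAL_OFFSET_NN",
    "VAE_IK_PHYSICAL_OFFSET_NN",                       # + motion debiasing
]

aug_mode = "VAE_IK_PHYSICAL_OFFSET_NN"  # the proposed method
actions = ["kick", "walk"]              # any subset of the action classes
model_types = ["seq2seq", "GCN", "transformer"]
```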
## Acknowledgement

This work is based on DeepMimic, which enables a physically simulated character to mimic various motions. We greatly appreciate their hard work and great research perspective.
We also included the human motion prediction models (RNN, GCN, Transformer) and appreciate those great works as well.