PoSynDA: Multi-Hypothesis Pose Synthesis Domain Adaptation for Robust 3D Human Pose Estimation

PoSynDA is a novel framework for 3D Human Pose Estimation (3D HPE) that addresses the challenge of adapting to new datasets when 2D-3D pose pairs are scarce in the target-domain training set. This repository contains the official PyTorch implementation of the PoSynDA method as described in our paper.

Key Features

  * Multi-hypothesis pose synthesis for domain-adaptive 3D human pose estimation.
  * Adapts to a target dataset without using any 3D labels from the target domain.
  * Evaluated on Human3.6M and MPI-INF-3DHP, with accuracy comparable to the target-specific MixSTE model.

Prerequisites

MATLAB is required if you want to evaluate our model on the MPI-INF-3DHP dataset.

Installation

  1. Clone this repository:

    git clone https://github.com/hbing-l/PoSynDA.git
    cd PoSynDA
    
  2. Install the required packages:

    pip install -r requirements.txt
    

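After installing the requirements, you can optionally run a quick environment check. The snippet below is a minimal sketch; it only assumes that PyTorch (listed in requirements.txt) imports correctly and reports whether a CUDA device is visible for the -gpu flag used later.

    # Quick environment check: confirm PyTorch imports and whether CUDA is visible.
    import torch

    print("PyTorch version:", torch.__version__)
    print("CUDA available:", torch.cuda.is_available())
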
Dataset Preparation

Our model is evaluated on the Human3.6M and MPI-INF-3DHP datasets.
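
Before training, you may want to confirm the preprocessed data files are in place. The snippet below is only a sketch; the file names are an assumption based on the VideoPose3D-style convention suggested by the -k cpn_ft_h36m_dbb flag used in the training command, so adjust the paths to your actual layout.

    # Sketch: check that preprocessed data files exist before training.
    # NOTE: these file names are assumed (VideoPose3D-style convention
    # implied by the -k cpn_ft_h36m_dbb flag); adjust to your setup.
    import os

    expected_files = [
        "data/data_3d_h36m.npz",                  # 3D poses, Human3.6M
        "data/data_2d_h36m_cpn_ft_h36m_dbb.npz",  # 2D CPN detections
    ]
    for path in expected_files:
        status = "found" if os.path.exists(path) else "MISSING"
        print(f"{path}: {status}")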

Training

h36m_transfer.py transfers within Human3.6M from subject S1 to subjects S5, S6, S7, and S8, while h36m_3dhp_transfer.py transfers from the Human3.6M dataset to the 3DHP dataset. To train the PoSynDA model on a target dataset (e.g., 3DHP), run:

python h36m_3dhp_transfer.py -k cpn_ft_h36m_dbb -num_proposals 3 -timestep 1000 -c checkpoint/ -gpu 0 --nolog
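
Here, -num_proposals controls how many candidate 3D pose hypotheses are generated per input. The snippet below is an illustrative sketch of the multi-hypothesis idea only, reducing several candidate poses to a single prediction by averaging; it is not the aggregation used inside PoSynDA, whose strategy may differ.

    # Illustrative sketch only: one simple way to reduce multiple 3D pose
    # hypotheses to a single prediction (averaging). This is NOT PoSynDA's
    # actual aggregation; it just shows the tensor shapes involved.
    import torch

    def average_hypotheses(hypotheses: torch.Tensor) -> torch.Tensor:
        # hypotheses: (num_proposals, num_frames, num_joints, 3)
        return hypotheses.mean(dim=0)

    proposals = torch.randn(3, 243, 17, 3)  # e.g. 3 proposals, 243 frames, 17 joints
    print(average_hypotheses(proposals).shape)  # torch.Size([243, 17, 3])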

Evaluation

To evaluate the provided model, run:

python h36m_3dhp_transfer.py -c checkpoint -gpu 0 --nolog --evaluate best_epoch.bin

Results

Our method achieves an MPJPE of 58.2 mm on the Human3.6M dataset without using any 3D labels from the target domain, comparable to the target-specific MixSTE model (58.2 mm vs. 57.9 mm).
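
For reference, MPJPE (Mean Per Joint Position Error) is the mean Euclidean distance, in millimetres, between predicted and ground-truth 3D joint positions. A minimal sketch of the metric:

    # Sketch of the MPJPE metric quoted above: the mean Euclidean distance
    # (in millimetres) between predicted and ground-truth joint positions.
    import torch

    def mpjpe(pred: torch.Tensor, gt: torch.Tensor) -> torch.Tensor:
        # pred, gt: (num_frames, num_joints, 3), coordinates in millimetres
        return torch.norm(pred - gt, dim=-1).mean()

    pred = torch.randn(100, 17, 3) * 10.0
    gt = pred + torch.randn(100, 17, 3)  # ground truth with small offsets
    print(f"MPJPE: {mpjpe(pred, gt).item():.1f} mm")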

Citation

If you find this work useful for your research, please consider citing our paper:

@article{liu2023posynda,
  title={PoSynDA: Multi-Hypothesis Pose Synthesis Domain Adaptation for Robust 3D Human Pose Estimation},
  author={Liu, Hanbing and He, Jun-Yan and Cheng, Zhi-Qi and Xiang, Wangmeng and Yang, Qize and Chai, Wenhao and Wang, Gaoang and Bao, Xu and Luo, Bin and Geng, Yifeng and others},
  journal={arXiv preprint arXiv:2308.09678},
  year={2023}
}

Acknowledgments

We would like to thank all the contributors and researchers who made this project possible.

License

This project is licensed under the MIT License - see the LICENSE.md file for details.