EgoPoseFormer

<p align="center"> <img src="assets/network.png" style="width:960px;"/> </p>

This repository contains the official PyTorch implementation of our paper:

EgoPoseFormer: A Simple Baseline for Stereo Egocentric 3D Human Pose Estimation. Chenhongyi Yang, Anastasia Tkach, Shreyas Hampali, Linguang Zhang, Elliot J. Crowley, Cem Keskin. ECCV 2024.

Usage

Environment Setup

conda create -n egoposeformer python=3.10 -y
conda activate egoposeformer

pip install torch==1.13.1+cu116 torchvision==0.14.1+cu116 -f https://download.pytorch.org/whl/torch_stable.html
pip install pytorch-lightning==2.1.0 
pip install numba==0.56.4
pip install numpy==1.23.5
pip install mmcv-full==1.6.0

git clone https://github.com/ChenhongyiYang/EgoPoseFormer.git
cd EgoPoseFormer
pip install -e .

Dataset Setup

We support our main dataset, UnrealEgo. Please refer to its official instructions to download the dataset; you only need the UnrealEgoData_impl split. You also need to download pelvis_pos.pkl, which is extracted from the UnrealEgo metadata and used to compute the 3D-to-2D projection. The file structure should be:

EgoPoseFormer
|-- configs
|-- pose_estimation
|-- ...
|-- data
|   |-- unrealego
|   |   |-- unrealego_impl
|   |   |    |-- ArchVisInterior_ArchVis_RT
|   |   |    |-- ...
|   |   |-- pelvis_pos.pkl
|   |   |-- train.txt
|   |   |-- validation.txt
|   |   |-- test.txt  

Training and Testing

You can run an experiment with the following commands:

# train
python run.py fit --config $CONFIG
# test
python run.py test --config $CONFIG --ckpt_path $PATH

For example, you can run a full UnrealEgo experiment by:

# 2D heatmap pre-training
python run.py fit --config ./configs/unrealego_r18_heatmap.yaml

# training EgoPoseFormer
# Note: You need to set the `encoder_pretrained` entry in the
#       config file to the path of the pre-trained encoder
python run.py fit --config ./configs/unrealego_r18_pose3d.yaml

# testing EgoPoseFormer
python run.py test --config ./configs/unrealego_r18_pose3d.yaml --ckpt_path path/to/ckpt
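For reference, the `encoder_pretrained` entry in the pose config would look roughly like the fragment below. Both the key nesting and the checkpoint path are assumptions for illustration; check the actual config file.

```yaml
model:
  # Checkpoint produced by the 2D heatmap pre-training stage
  # (hypothetical path; point this at wherever your checkpoints are saved).
  encoder_pretrained: path/to/heatmap_pretrain.ckpt
```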

Results

| Backbone  | MPJPE (mm) | PA-MPJPE (mm) | Config           | Weights |
|-----------|------------|---------------|------------------|---------|
| ResNet-18 | 34.5       | 33.4          | Pre-train / Pose | Link    |

Note: The numbers are measured using newly trained models, so they are slightly different from the numbers reported in the paper.
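The metrics above are the standard ones for 3D pose estimation: MPJPE is the mean per-joint Euclidean distance between predicted and ground-truth joints, and PA-MPJPE is the same error after rigidly aligning the prediction to the ground truth (Procrustes alignment over scale, rotation, and translation). A minimal NumPy sketch of both, for reference only (not the repository's evaluation code):

```python
import numpy as np

def mpjpe(pred, gt):
    """Mean Per-Joint Position Error: average Euclidean distance, in input units.

    pred, gt: arrays of shape (num_joints, 3).
    """
    return np.linalg.norm(pred - gt, axis=-1).mean()

def pa_mpjpe(pred, gt):
    """MPJPE after Procrustes alignment (scale, rotation, translation) of pred to gt."""
    mu_p, mu_g = pred.mean(axis=0), gt.mean(axis=0)
    p, g = pred - mu_p, gt - mu_g
    # Optimal rotation from the SVD of the cross-covariance (orthogonal Procrustes).
    U, S, Vt = np.linalg.svd(p.T @ g)
    R = U @ Vt
    if np.linalg.det(R) < 0:  # avoid reflections
        Vt[-1] *= -1
        S[-1] *= -1
        R = U @ Vt
    scale = S.sum() / (p ** 2).sum()
    aligned = scale * p @ R + mu_g
    return mpjpe(aligned, gt)
```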

Citation

@inproceedings{yang2024egoposeformer,
  title={EgoPoseFormer: A Simple Baseline for Stereo Egocentric 3D Human Pose Estimation},
  author={Yang, Chenhongyi and Tkach, Anastasia and Hampali, Shreyas and Zhang, Linguang and Crowley, Elliot J and Keskin, Cem},
  booktitle={European Conference on Computer Vision},
  year={2024},
  organization={Springer}
}

Acknowledgement

This codebase is partially inspired by the UnrealEgo implementation.