# Human Pose Estimation Core Library

A library of functions for human pose estimation with event-driven cameras.
Please cite:

```bibtex
@InProceedings{Goyal_2023_CVPR,
    author    = {Goyal, Gaurvi and Di Pietro, Franco and Carissimi, Nicolo and Glover, Arren and Bartolozzi, Chiara},
    title     = {MoveEnet: Online High-Frequency Human Pose Estimation With an Event Camera},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2023},
    pages     = {4023-4032}
}
```
Article: MoveEnet: Online High-Frequency Human Pose Estimation With an Event Camera

Please also cite the eH3.6m dataset (https://zenodo.org/record/7842598), and check out our other publications.
Please contribute your event-driven HPE applications and datasets to enable comparisons!
## Core (C++)

Compile and link the core C++ library into your application to use the event-based human pose estimation functions, including:
- joint detectors: OpenPose applied to greyscale images formed from events
- joint velocity estimation at >500 Hz
- asynchronous fusion of joint velocities and joint detections
- event representation methods compatible with convolutional neural networks
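To illustrate what the last point refers to, below is a minimal sketch (not the library's actual implementation, which is in C++) of accumulating sparse events into a fixed-size count image that a convolutional network can consume; the function name and event-tuple layout are assumptions for illustration:

```python
import numpy as np

def events_to_count_image(events, height, width):
    """Accumulate event coordinates into a per-pixel count image.

    events: iterable of (timestamp, x, y, polarity) tuples.
    Returns a (height, width) float32 image normalised to [0, 1].
    """
    img = np.zeros((height, width), dtype=np.float32)
    for _, x, y, _ in events:
        img[y, x] += 1.0
    # Normalise so the representation is less sensitive to event rate
    if img.max() > 0:
        img /= img.max()
    return img

# Three toy events: two at pixel (x=5, y=3), one at (x=7, y=1)
events = [(0.01, 5, 3, 1), (0.02, 5, 3, 0), (0.03, 7, 1, 1)]
frame = events_to_count_image(events, height=10, width=10)
```

Real representations in the event-camera literature (time surfaces, exponentially decaying surfaces such as EROS) refine this idea by weighting events by recency rather than simple counts.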
## PyCore

Importable Python libraries for joint detection:
- event-based MoveNet: MoveEnet, built on PyTorch
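MoveNet-style networks regress one confidence heatmap per joint; a minimal NumPy sketch of decoding such heatmaps into pixel coordinates is shown below (illustrative only — the function name and heatmap layout are assumptions, not the PyCore API):

```python
import numpy as np

def decode_heatmaps(heatmaps):
    """Take the argmax of each joint's heatmap as that joint's location.

    heatmaps: (num_joints, H, W) array of per-joint confidence maps.
    Returns a (num_joints, 2) integer array of (x, y) pixel coordinates.
    """
    num_joints, h, w = heatmaps.shape
    flat = heatmaps.reshape(num_joints, -1).argmax(axis=1)
    ys, xs = np.divmod(flat, w)
    return np.stack([xs, ys], axis=1)

# Toy heatmaps: joint 0 peaks at (x=4, y=2), joint 1 at (x=1, y=3)
hm = np.zeros((2, 5, 6))
hm[0, 2, 4] = 1.0
hm[1, 3, 1] = 1.0
joints = decode_heatmaps(hm)
```

In practice the peak location is often refined with a sub-pixel offset predicted by the network, but the argmax step above is the core of the decoding.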
## Examples

Example applications are available to give ideas on how to use the HPE-core libraries:
- MoveEnet inference code in the corresponding example
## Evaluation

Python scripts are available to compare different combinations of joint detectors and velocity estimators.
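As an illustration of the kind of metric such comparisons rely on, here is a minimal sketch of PCK (Percentage of Correct Keypoints), a standard HPE accuracy measure; the function name and threshold convention are illustrative, not the repository's scripts:

```python
import numpy as np

def pck(pred, gt, threshold):
    """Percentage of Correct Keypoints: a predicted joint counts as
    correct if it lies within `threshold` pixels of the ground truth.

    pred, gt: (num_joints, 2) arrays of (x, y) pixel coordinates.
    """
    dists = np.linalg.norm(pred - gt, axis=1)
    return float(np.mean(dists < threshold))

# Toy example with two joints: the first is 3.6 px off, the second 22.4 px off
gt = np.array([[100.0, 50.0], [40.0, 80.0]])
pred = np.array([[103.0, 52.0], [60.0, 90.0]])
score = pck(pred, gt, threshold=10.0)  # only the first joint is within 10 px
```

Published variants normalise the threshold by head or torso size (PCKh) so that scores are comparable across subjects and image resolutions.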
## Datasets and Conversion

Scripts to convert datasets into common formats, facilitating valid and consistent comparisons.
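A common interchange format for event data is a flat array of (timestamp, x, y, polarity) records; the sketch below shows one way to express that with a NumPy structured array that can be saved and reloaded losslessly (the dtype and field names are assumptions for illustration, not the repository's exact format):

```python
import io
import numpy as np

# Hypothetical common format: one record per event, fields (t, x, y, p)
event_dtype = np.dtype([('t', 'f8'), ('x', 'u2'), ('y', 'u2'), ('p', 'u1')])

def to_structured(raw_events):
    """Convert a list of (t, x, y, polarity) tuples into a structured
    array that np.save can write and other toolchains can read back."""
    return np.array(raw_events, dtype=event_dtype)

events = to_structured([(0.001, 120, 64, 1), (0.002, 121, 64, 0)])

# Round-trip through the .npy format (in memory here; a file path works too)
buf = io.BytesIO()
np.save(buf, events)
buf.seek(0)
loaded = np.load(buf)
```

Fixing field names and widths up front is what makes cross-dataset comparisons consistent: every converter targets the same schema instead of each dataset's native encoding.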
## Authors

Event-driven Perception for Robotics
## See Also

```bibtex
@InProceedings{9845526,
    author    = {Carissimi, Nicolò and Goyal, Gaurvi and Pietro, Franco Di and Bartolozzi, Chiara and Glover, Arren},
    booktitle = {2022 8th International Conference on Event-Based Control, Communication, and Signal Processing (EBCCSP)},
    title     = {Unlocking Static Images for Training Event-driven Neural Networks},
    year      = {2022},
    pages     = {1-4},
    doi       = {10.1109/EBCCSP56922.2022.9845526}
}
```