Anipose

Anipose is an open-source toolkit for robust, markerless 3D pose estimation of animal behavior from multiple camera views. It leverages the machine learning toolbox DeepLabCut to track keypoints in 2D, then triangulates across camera views to estimate 3D pose.
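The triangulation step described above can be sketched with a standard direct linear transform (DLT). This is a minimal illustration of the idea, not Anipose's actual API; `triangulate_dlt` is a hypothetical name, and real use involves calibrated camera parameters from Anipose's calibration step.

```python
import numpy as np

def triangulate_dlt(proj_mats, points_2d):
    """Recover a 3D point from its 2D projections in several cameras.

    proj_mats: list of 3x4 camera projection matrices
    points_2d: list of (x, y) pixel coordinates, one per camera
    """
    rows = []
    for P, (x, y) in zip(proj_mats, points_2d):
        # Each camera view contributes two linear constraints on the
        # homogeneous 3D point X: x*(P[2]@X) = P[0]@X and similarly for y.
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.array(rows)
    # Homogeneous least-squares solution: the right singular vector of A
    # with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

With two or more views of the same keypoint, the overdetermined system averages out 2D tracking noise, which is one reason multi-camera setups yield more robust pose estimates.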

Check out the Anipose paper for more information.

The name Anipose comes from Animal Pose, but it also sounds like "any pose".

Documentation

Up-to-date documentation may be found at anipose.org.

Demos

<p align="center">
<img src="https://raw.githubusercontent.com/lambdaloop/anipose-docs/master/tracking_3cams_full_slower5.gif" width="70%">
</p>
<p align="center">
Videos of flies by Evyn Dickinson (slowed 5x), <a href="http://faculty.washington.edu/tuthill/">Tuthill Lab</a>
</p>
<p align="center">
<img src="https://raw.githubusercontent.com/lambdaloop/anipose-docs/master/hand-demo.gif" width="70%">
</p>
<p align="center">
Videos of hand by Katie Rupp
</p>

References

Here are some references for DeepLabCut and other tools this project builds on: