
ViSP: Open source Visual Servoing Platform


| Platform | Build Status |
|----------|--------------|
| Ubuntu 20.04, 22.04 (amd64) | ubuntu dep apt workflow, ubuntu dep src workflow |
| macOS 13 and 14 | macos workflow |
| iOS on macOS 11.0 | ios workflow |
| Windows 10 | Build status |
| Other arch Ubuntu 22.04 (aarch64, s390x) | other arch workflow |
| ROS1 Noetic Ubuntu 20.04 Focal | Build Status |
| ROS2 Humble Ubuntu 22.04 Jammy | Build Status |
| ROS2 Iron Ubuntu 22.04 Jammy | Build Status |
| ROS2 Rolling Ubuntu 22.04 Jammy | Build Status |
| Valgrind | valgrind workflow |
| Sanitizer | sanitizers workflow |
| Code coverage | Code coverage |
| Other projects | Build Status |
|----------------|--------------|
| UsTK | macOS, Ubuntu |
| visp_contrib | Ubuntu |
| visp_sample | macos workflow, ubuntu dep apt workflow |
| camera_localization | ubuntu_3rdparty_workflow |
| visp_started | ubuntu_3rdparty_workflow |

ViSP is a cross-platform library (Linux, Windows, macOS, iOS, Android) that allows prototyping and developing applications using the visual tracking and visual servoing techniques at the heart of the research carried out today by the Inria <a href="https://team.inria.fr/rainbow">Rainbow team</a> and, before 2018, by the <a href="https://team.inria.fr/lagadic">Lagadic team</a>. ViSP is able to compute control laws that can be applied to robotic systems. It provides a set of visual features that can be tracked using real-time image processing or computer vision algorithms. ViSP also provides simulation capabilities. ViSP can be useful in robotics, computer vision, augmented reality and computer animation. Our <a href="https://www.youtube.com/user/VispTeam">YouTube channel</a> gives an overview of the applications that can be tackled.
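To give a feel for how such a control law is set up, here is a minimal, hedged sketch of an image-based visual servoing task built with ViSP's `vpServo` and `vpFeaturePoint` classes. The feature coordinates, depth and gain below are placeholder values chosen for illustration; a real application would track several features from live images and send the resulting velocity to a robot controller.

```cpp
#include <iostream>

#include <visp3/core/vpColVector.h>
#include <visp3/visual_features/vpFeaturePoint.h>
#include <visp3/vs/vpServo.h>

int main()
{
  // Current and desired visual features: one 2D image point (x, y) with depth Z.
  // The numeric values are placeholders for this sketch.
  vpFeaturePoint s, s_star;
  s.buildFrom(0.1, 0.2, 1.0);      // current point, off-center
  s_star.buildFrom(0.0, 0.0, 1.0); // desired point at the image center

  // Image-based visual servoing task: eye-in-hand configuration,
  // interaction matrix computed from the current feature, proportional gain.
  vpServo task;
  task.setServo(vpServo::EYEINHAND_CAMERA);
  task.setInteractionMatrixType(vpServo::CURRENT);
  task.setLambda(0.5);
  task.addFeature(s, s_star);

  // 6-dof camera velocity (vx, vy, vz, wx, wy, wz) that reduces the error s - s*
  vpColVector v = task.computeControlLaw();
  std::cout << "Camera velocity: " << v.t() << std::endl;

  return 0;
}
```

In practice the current feature `s` is updated at every iteration from a tracker or pose estimator, and `computeControlLaw()` is called in a loop until the feature error becomes small enough.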

Citing ViSP

Please cite <a href="https://inria.hal.science/inria-00351899">ViSP</a> in your publications if it helps your research:

@article{Marchand05b,
   Author = {Marchand, E. and Spindler, F. and Chaumette, F.},
   Title = {ViSP for visual servoing: a generic software platform with a wide class of robot control skills},
   Journal = {IEEE Robotics and Automation Magazine},
   Volume = {12},
   Number = {4},
   Pages = {40--52},
   Publisher = {IEEE},
   Month = {December},
   Year = {2005}
}

To cite the <a href="https://inria.hal.science/hal-01853972v1">generic model-based tracker</a>:

@InProceedings{Trinh18a,
   Author = {Trinh, S. and Spindler, F. and Marchand, E. and Chaumette, F.},
   Title = {A modular framework for model-based visual tracking using edge, texture and depth features},
   BookTitle = {{IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, IROS'18}},
   Address = {Madrid, Spain},
   Month = {October},
   Year = {2018}
}

To cite <a href="https://hal.science/hal-01246370v1">pose estimation algorithms and hands-on survey</a> illustrated with <a href="https://github.com/lagadic/camera_localization">ViSP examples</a>:

@article{Marchand16a,
   Author = {Marchand, E. and Uchiyama, H. and Spindler, F.},
   Title = {Pose estimation for augmented reality: a hands-on survey},
   Journal = {IEEE Trans. on Visualization and Computer Graphics},
   Volume = {22},
   Number = {12},
   Pages = {2633--2651},
   Month = {December},
   Year = {2016}
}

Resources

Contributing

Please read before starting work on a pull request: https://visp.inria.fr/contributing-code/