# Awesome-LiDAR-Camera-Calibration
A Collection of LiDAR-Camera-Calibration Papers, Toolboxes and Notes.
## Outline

- [0. Introduction](#0-introduction)
- [1. Target-based methods](#1-target-based-methods)
- [2. Targetless methods](#2-targetless-methods)
- [3. Other toolboxes](#3-other-toolboxes)
## 0. Introduction
For applications such as autonomous driving, robotics, navigation, and 3-D scene reconstruction, the same scene is often captured by both a LiDAR and a camera. To interpret the objects in a scene accurately, the LiDAR and camera outputs must be fused. LiDAR-camera calibration estimates a rigid transformation (the extrinsics: rotation + translation, 6 DoF) that establishes the correspondence between points in the 3-D LiDAR coordinate frame and pixels in the image plane.
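Once the extrinsics (R, t) and the camera intrinsics K are known, LiDAR points can be projected into the image to inspect or apply a calibration. Below is a minimal NumPy sketch of that projection; the function name and the pinhole-only model are illustrative assumptions, not taken from any toolbox listed here.

```python
import numpy as np

def project_lidar_to_image(points_lidar, R, t, K):
    """Project 3-D LiDAR points into the image plane.

    points_lidar : (N, 3) points in the LiDAR frame
    R, t         : (3, 3) rotation and (3,) translation, LiDAR frame -> camera frame
    K            : (3, 3) camera intrinsic matrix
    Returns (N, 2) pixel coordinates and a mask of points in front of the camera.
    """
    points_cam = points_lidar @ R.T + t          # rigid transform into the camera frame
    in_front = points_cam[:, 2] > 0              # keep only points with positive depth
    pixels_h = points_cam @ K.T                  # pinhole projection, homogeneous coordinates
    pixels = pixels_h[:, :2] / pixels_h[:, 2:3]  # divide by depth to get (u, v)
    return pixels, in_front
```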
## 1. Target-based methods
C: camera, L: LiDAR, a: automatic, m: manual
## 2. Targetless methods
### 2.1. Motion-based methods
Paper | Feature | Optimization | Toolbox | Note |
---|---|---|---|---|
LiDAR and Camera Calibration Using Motions Estimated by Sensor Fusion Odometry, 2018 | C: motion (VO), L: motion (ICP) | hand-eye calibration | * | * |
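Motion-based methods such as the paper above reduce to the classic hand-eye problem AX = XB, where A and B are per-frame camera and LiDAR motions and X is the unknown extrinsic. Below is a minimal sketch using OpenCV's `calibrateHandEye`, assuming per-frame LiDAR poses (e.g. from ICP-based odometry) and camera poses (e.g. from visual odometry) are already available; mapping the LiDAR/camera poses onto OpenCV's gripper/target arguments is an assumption of this sketch, not the paper's implementation.

```python
import cv2

def hand_eye_from_motions(R_lidar, t_lidar, R_cam, t_cam):
    """Solve AX = XB for the camera-LiDAR extrinsic from paired per-frame poses.

    R_lidar, t_lidar : lists of (3, 3) rotations and (3, 1) translations of the
                       LiDAR in a fixed frame (e.g. from ICP-based odometry)
    R_cam, t_cam     : lists of (3, 3) rotations and (3, 1) translations of the
                       camera, in OpenCV's "target -> camera" convention
                       (i.e. the inverse of camera-to-world poses from VO)
    Returns the rotation and translation of the camera expressed in the LiDAR frame.
    """
    # OpenCV's hand-eye solver: the LiDAR poses stand in for the "gripper -> base"
    # chain and the camera poses for the "target -> camera" chain.
    R_x, t_x = cv2.calibrateHandEye(
        R_gripper2base=R_lidar, t_gripper2base=t_lidar,
        R_target2cam=R_cam, t_target2cam=t_cam,
        method=cv2.CALIB_HAND_EYE_TSAI,
    )
    return R_x, t_x.reshape(3)
```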
### 2.2. Scene-based methods
#### 2.2.1. Traditional methods
#### 2.2.2. Deep-learning methods
## 3. Other toolboxes
Toolbox | Introduction | Note |
---|---|---|
Apollo sensor calibration tools | targetless method, no source code | CN |
Autoware camera lidar calibrator | pick points manually, PnP | * |
Autoware calibration camera lidar | checkerboard, similar to LCCT | CN |
livox_camera_lidar_calibration | pick points manually, PnP | * |
OpenCalib | target-based, targetless, manual | OpenCalib: A Multi-sensor Calibration Toolbox for Autonomous Driving |
tier4/CalibrationTools | target-based, manual | * |
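Several of the toolboxes above (e.g. Autoware's camera lidar calibrator and livox_camera_lidar_calibration) share the same manual workflow: pick a few 3-D LiDAR points and the matching image pixels, then solve a PnP problem for the extrinsics. Below is a minimal OpenCV sketch of that final PnP step, with the picked correspondences and the intrinsic matrix K assumed to be given; the function name and argument layout are illustrative, not the toolboxes' actual APIs.

```python
import numpy as np
import cv2

def extrinsics_from_picked_points(points_lidar, pixels, K, dist=None):
    """Estimate LiDAR -> camera extrinsics from hand-picked correspondences.

    points_lidar : (N, 3) picked points in the LiDAR frame (N >= 6, well spread)
    pixels       : (N, 2) corresponding pixels clicked in the image
    K            : (3, 3) camera intrinsic matrix
    """
    if dist is None:
        dist = np.zeros(5)  # assume the image is already undistorted
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(points_lidar, dtype=np.float64),
        np.asarray(pixels, dtype=np.float64),
        np.asarray(K, dtype=np.float64),
        dist,
    )
    R, _ = cv2.Rodrigues(rvec)   # (3, 3) rotation, LiDAR frame -> camera frame
    return R, tvec.reshape(3)    # 6-DoF extrinsics (R, t)
```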