Calibration and Sensor Fusion

Perform lidar-camera calibration by estimating the extrinsic parameters between the sensors, and fuse data across sensors

Most modern autonomous systems in applications such as manufacturing, transportation, and construction employ multiple sensors. Sensor fusion is the process of bringing together data from multiple sensors, such as lidar sensors and cameras. The fused data enables greater accuracy because it leverages the strengths of each sensor to overcome the limitations of the others. Lidar Toolbox™ functions support sensor fusion processes such as projecting lidar points on images, fusing color information in lidar point clouds, and projecting bounding boxes from images to point clouds and from point clouds to images.
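As a sketch of this fusion workflow, the two projection-and-fusion functions listed on this page can be chained as follows. The file names, intrinsics, and transform are placeholders standing in for your own data and a prior calibration, and the exact function signatures can vary by release, so treat this as an outline rather than a definitive recipe:

```matlab
% Load a point cloud and the matching camera image (placeholder file names).
ptCloud = pcread("lidarScan.pcd");
I = imread("cameraFrame.png");

% Camera intrinsics (a cameraIntrinsics object) and the lidar-to-camera
% rigid transform are assumed to come from a prior calibration.
load("calibrationResults.mat","intrinsics","tform");

% Project the 3-D lidar points into the image coordinate frame.
imPts = projectLidarPointsOnImage(ptCloud,intrinsics,tform);

% Fuse the image color information onto the point cloud and display it.
ptCloudColored = fuseCameraToLidar(I,ptCloud,intrinsics,tform);
pcshow(ptCloudColored)
```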

To understand and correlate the data from individual sensors, you must develop a geometric correspondence between them. Calibration is the process of developing this correspondence. Use Lidar Toolbox functions to perform lidar-camera calibration. To get started, see What Is Lidar-Camera Calibration?
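A typical checkerboard-based calibration can be sketched with the toolbox functions below. The folder names, square size, and intrinsics are placeholders, and the signatures shown here (particularly for estimateLidarCameraTransform) have changed across releases, so check the reference pages for your version:

```matlab
% Hypothetical calibration sketch. Assumes paired images and point clouds
% of a checkerboard target, plus camera intrinsics from camera calibration.
imageFiles = dir(fullfile("calibImages","*.png"));
pcFiles    = dir(fullfile("calibScans","*.pcd"));
squareSize = 81;  % edge length of one checkerboard square, in millimeters

% Detect the checkerboard corners in the camera frames ...
[imageCorners3d,boardSize] = estimateCheckerboardCorners3d( ...
    fullfile({imageFiles.folder},{imageFiles.name}),intrinsics,squareSize);

% ... and the corresponding planar board region in the lidar scans.
lidarPlanes = detectRectangularPlanePoints( ...
    fullfile({pcFiles.folder},{pcFiles.name}),boardSize);

% Estimate the rigid transformation from the lidar sensor to the camera.
tform = estimateLidarCameraTransform(lidarPlanes,imageCorners3d);
```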

You can also interactively calibrate the sensors by using the Lidar Camera Calibrator app. For more information, see Get Started with Lidar Camera Calibrator.

Apps

Lidar Camera Calibrator: Find rotation and translation between lidar and camera

Functions


estimateBoardCornersCamera: Estimate corners of calibration board in camera frame (Since R2026a)
estimateBoardCornersLidar: Estimate corners of calibration board in lidar frame (Since R2026a)
estimateLidarCameraTransform: Estimate rigid transformation from lidar sensor to camera
projectLidarPointsOnImage: Project lidar point cloud data onto image coordinate frame
fuseCameraToLidar: Fuse image information to lidar point cloud
bboxCameraToLidar: Estimate 3-D bounding boxes in point cloud from 2-D bounding boxes in image
bboxLidarToCamera: Estimate 2-D bounding box in camera frame using 3-D bounding box in lidar frame
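The two bounding-box functions transfer detections between the sensors once calibration is done. The detection below and the loaded calibration results are hypothetical placeholders, and signatures may differ slightly by release:

```matlab
% Assumed inputs: a point cloud plus intrinsics and a lidar-to-camera
% transform from a prior calibration (placeholder file names).
ptCloud = pcread("lidarScan.pcd");
load("calibrationResults.mat","intrinsics","tform");

% A hypothetical 2-D detection [x y w h] from any image object detector.
bboxesCamera = [100 150 80 60];

% Lift the 2-D image detection to a 3-D cuboid in the point cloud.
bboxesLidar = bboxCameraToLidar(bboxesCamera,ptCloud,intrinsics,tform);

% The reverse direction projects 3-D cuboids back into the camera frame.
bboxesImage = bboxLidarToCamera(bboxesLidar,intrinsics,tform);
```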

Topics

Featured Examples