Laser & Optoelectronics Progress, Volume 59, Issue 2, 0215003 (2022)
Spatial Pose Calibration Method for Lidar and Camera Based on Intensity Information
A lidar–camera fusion system can perceive both the geometric dimensions and the color information of an environment and has been widely used in many fields. To fuse the two kinds of information accurately, we propose an extrinsic calibration method for a lidar and a camera based on natural feature points. First, building on the lidar's self-correction, a gray image is generated by central projection of the point cloud using the intensity information of the lidar data. Then, the scale-invariant feature transform (SIFT) algorithm is used to extract and match feature points between the projected gray image and the camera image. Finally, a calibration mathematical model is established from the corresponding feature points, and the data are optimized to calibrate the extrinsic parameters between the three-dimensional lidar system and the camera system. Experimental results show that the reprojection error from the point cloud to image pixels computed by this method is 2.3 pixels, which verifies the effectiveness and accuracy of the proposed pose calibration method.
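The central-projection step in the abstract (mapping each 3D lidar point with its intensity onto an image plane to form a gray image) can be sketched with a pinhole model. This is a minimal illustration, not the authors' implementation; the intrinsic matrix, image size, and nearest-pixel splatting are assumptions made for the example.

```python
import numpy as np

def project_intensity_image(points, intensities, K, width, height):
    """Centrally project lidar points (N, 3) into a grayscale intensity image.

    points      : N x 3 array in the projection (camera-like) frame, Z forward.
    K           : 3 x 3 pinhole intrinsic matrix (hypothetical values here).
    intensities : N lidar return intensities, written as pixel gray values.
    """
    img = np.zeros((height, width), dtype=np.float64)

    # Keep only points in front of the projection center.
    in_front = points[:, 2] > 0
    pts, vals = points[in_front], intensities[in_front]

    # Perspective projection: homogeneous pixel coords, then divide by depth.
    uvw = (K @ pts.T).T
    uv = uvw[:, :2] / uvw[:, 2:3]
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)

    # Discard points that fall outside the image bounds.
    ok = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    img[v[ok], u[ok]] = vals[ok]  # nearest-pixel splat; real code would z-buffer

    # Normalize intensities to 8-bit gray levels for SIFT extraction.
    if img.max() > 0:
        img = img / img.max() * 255.0
    return img.astype(np.uint8)
```

The resulting 8-bit image can then be fed, together with the camera image, to a SIFT extractor and matcher to obtain the corresponding feature points used in the calibration model.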
Wensong Song, Zonghua Zhang, Nan Gao, Zhaozong Meng. Spatial Pose Calibration Method for Lidar and Camera Based on Intensity Information[J]. Laser & Optoelectronics Progress, 2022, 59(2): 0215003
Category: Machine Vision
Received: Feb. 3, 2021
Accepted: Mar. 11, 2021
Published Online: Dec. 29, 2021
Corresponding Author Email: Zonghua Zhang (zhzhang@hebut.edu.cn)