Infrared and Laser Engineering, Volume 52, Issue 8, 20230427 (2023)

Advancements in fusion calibration technology of lidar and camera

Shiqiang Wang, Zhaozong Meng, Nan Gao, and Zonghua Zhang
Author Affiliations
  • School of Mechanical Engineering, Hebei University of Technology, Tianjin 300130, China

Significance

The data gathered by a single sensor is inherently incomplete. For instance, the point clouds obtained by lidar lack texture and color information, while the images captured by a camera lack depth information. Fusing lidar and camera data harnesses the complementary information between the two sensors and yields precise three-dimensional (3D) spatial perception, which is widely applied in fields such as autonomous driving and mobile robotics. In recent years, many researchers in China and abroad have made significant advances in sensor fusion, especially in the fusion of lidar and camera. However, a comprehensive paper summarizing these research achievements from scholars of various backgrounds has been lacking. This paper provides a comprehensive summary of research on calibration methods for lidar-camera fusion, serving as a valuable reference for future researchers in this field. It is also a helpful resource for beginners seeking a concise introduction to the subject, allowing them to quickly familiarize themselves with lidar-camera calibration methods.

Progress

First, the fundamental principles and techniques involved in the calibration of lidar and camera systems are presented. The fundamental principles of camera calibration are introduced, together with a succinct overview of existing camera calibration methods and their individual characteristics. The principle and classification of lidar are also introduced, and the characteristics of different types of lidar are analyzed. A mathematical model for mechanical lidar is established, and calibration methods for the intrinsic parameters of mechanical lidar are summarized. Furthermore, the principle of joint calibration of lidar and camera is introduced; an illustrative projection sketch is given below. Second, the calibration of lidar and camera systems involves two main stages: feature extraction and feature matching. The processing methods for point clouds and images are briefly introduced, and then the extrinsic calibration methods for lidar and camera are discussed in detail. These extrinsic calibration methods can be categorized into target-based calibration, targetless calibration, motion-based calibration, and deep learning-based calibration, and the existing research results of each category are summarized. The target-based approach achieves high precision, but it entails a complex calibration process. The targetless approach is simple and convenient and allows online calibration, but its accuracy is lower than that of target-based calibration. Motion-based and deep learning-based calibration are considered pivotal research directions for future advancements. Finally, we conclude the paper and highlight future development trends. Feature extraction and matching are the key steps in the calibration of lidar and camera. Although many calibration methods for lidar and camera have been proposed, better approaches are still needed to improve the accuracy and robustness of the calibration results.
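The joint calibration principle mentioned above can be illustrated with the standard pinhole projection model: a lidar point P_L is transformed into the camera frame by the extrinsic rotation R and translation t, and then projected onto the image plane by the intrinsic matrix K, i.e. s[u, v, 1]^T = K(R P_L + t). The following Python sketch (function and variable names such as project_lidar_to_image are illustrative, not taken from the paper) shows this projection for an ideal, distortion-free camera.

import numpy as np

def project_lidar_to_image(points_lidar, K, R, t):
    """Project an N x 3 array of lidar points into pixel coordinates.

    Implements s * [u, v, 1]^T = K @ (R @ P_lidar + t) for an ideal,
    distortion-free pinhole camera: K is 3 x 3, R is 3 x 3, t is 3 x 1.
    """
    # Transform the points from the lidar frame into the camera frame.
    points_cam = R @ points_lidar.T + t.reshape(3, 1)      # 3 x N
    # Keep only points with positive depth (in front of the camera).
    points_cam = points_cam[:, points_cam[2, :] > 0]
    # Perspective projection with the intrinsic matrix K.
    pixels_h = K @ points_cam                               # homogeneous 3 x N
    pixels = pixels_h[:2, :] / pixels_h[2, :]               # divide by depth
    return pixels.T                                         # N x 2 array of (u, v)

In practice, extrinsic calibration amounts to finding the R and t that minimize the residual between such projected lidar features and their matched image features.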
In recent years, the development of deep learning technology has provided new opportunities for the fusion of lidar and camera data and has opened new directions for online calibration in natural scenes.

Conclusions and Prospects

Lidar-camera calibration has emerged as a significant research area, aiming to compensate for the limitations of individual sensors and to enable accurate perception of 3D information. The calibration technology primarily encompasses point cloud processing, image processing, and calibration methods. The crux of the calibration process lies in identifying corresponding features and subsequently matching them. In this paper, the characteristics of four distinct methods, namely target-based calibration, targetless calibration, motion-based calibration, and deep learning-based calibration, are summarized. Accurate online calibration in diverse scenarios emerges as a prominent research focus for the future. In conclusion, future research on calibration will focus on enhancing accuracy, improving robustness, enabling online calibration, automating the calibration process, and establishing a unified verification standard. These advances aim to further improve the calibration process and its applicability in various domains.
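To make the target-based category summarized above concrete, the following Python sketch recovers the lidar-camera extrinsics from a set of 3D-2D correspondences on a planar calibration target using OpenCV's PnP solver. The corner coordinates, pixel detections, and intrinsic matrix below are hypothetical example values, not data from the reviewed works; a real pipeline would first detect the target in both the point cloud and the image.

import cv2
import numpy as np

# Hypothetical correspondences: four corners of a planar calibration board,
# measured in the lidar frame (metres), and their detected pixel positions.
corners_lidar = np.array([
    [2.0, -0.5,  0.3],
    [2.0,  0.5,  0.3],
    [2.0,  0.5, -0.3],
    [2.0, -0.5, -0.3],
], dtype=np.float64)
corners_pixel = np.array([
    [410.0, 220.0],
    [830.0, 225.0],
    [825.0, 560.0],
    [415.0, 555.0],
], dtype=np.float64)

# Assumed camera intrinsics (focal lengths and principal point), no distortion.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)

# Solve the perspective-n-point problem: the returned pose maps lidar-frame
# points into the camera frame, i.e. it is the lidar-to-camera extrinsics.
success, rvec, tvec = cv2.solvePnP(corners_lidar, corners_pixel, K, dist)
R, _ = cv2.Rodrigues(rvec)
print("Rotation (lidar -> camera):\n", R)
print("Translation (lidar -> camera):", tvec.ravel())

Targetless, motion-based, and deep learning-based methods replace these hand-picked target correspondences with features found in natural scenes, in the sensors' ego-motion, or by a learned network, respectively.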

    Citation: Shiqiang Wang, Zhaozong Meng, Nan Gao, Zonghua Zhang. Advancements in fusion calibration technology of lidar and camera[J]. Infrared and Laser Engineering, 2023, 52(8): 20230427

    Paper Information

    Received: Jun. 15, 2023

    Published Online: Oct. 19, 2023

    DOI: 10.3788/IRLA20230427
