Laser & Optoelectronics Progress, Volume 61, Issue 8, 0828004 (2024)
Target Localization and Tracking Method Based on Camera and LiDAR Fusion
Environmental perception is a key technology for autonomous driving. However, cameras lack the depth information needed to localize detected targets and offer limited tracking accuracy; therefore, a target localization and tracking algorithm based on the fusion of camera and LiDAR data is proposed. The algorithm obtains the position of a detected target by measuring the proportion of the LiDAR point cloud cluster's projected area in the pixel plane that lies within the image detection frame. Then, based on the horizontal and vertical velocities of the detected target's contour point cloud in the pixel coordinate system, the center coordinates of the image detection frame are fused to improve tracking accuracy. Experimental results show that the proposed localization algorithm achieves an accuracy of 88.5417% with an average processing time of only 0.03 s per frame, meeting real-time requirements. The average error of the detection-frame center is 4.49 pixels along the horizontal axis and 1.80 pixels along the vertical axis, and the average area overlap rate is 87.42%.
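The core localization idea described in the abstract, projecting a LiDAR cluster into the pixel plane and scoring it by the fraction that falls inside the image detection frame, can be sketched as below. This is a minimal illustration under assumed conventions (a 3×4 camera projection matrix `K`, clusters as N×3 arrays, boxes as `(x1, y1, x2, y2)` tuples); the function names and the 0.5 acceptance ratio are illustrative, not taken from the paper.

```python
import numpy as np

def cluster_box_overlap(points_xyz, K, box):
    """Project a LiDAR cluster into the image and return the fraction of
    its points that fall inside a 2D detection box (x1, y1, x2, y2).
    `K` is a 3x4 projection matrix (intrinsics times extrinsics);
    the names here are illustrative, not the paper's notation."""
    pts_h = np.hstack([points_xyz, np.ones((len(points_xyz), 1))])  # homogeneous coords
    uvw = pts_h @ K.T
    in_front = uvw[:, 2] > 0                      # keep only points in front of the camera
    if not in_front.any():
        return 0.0
    uv = uvw[in_front, :2] / uvw[in_front, 2:3]   # perspective divide -> pixel coords
    x1, y1, x2, y2 = box
    inside = (uv[:, 0] >= x1) & (uv[:, 0] <= x2) & (uv[:, 1] >= y1) & (uv[:, 1] <= y2)
    return float(inside.mean())

def locate_target(clusters, K, box, min_ratio=0.5):
    """Pick the cluster whose projection best fills the detection box and
    return its 3D centroid as the target position (None if no good match)."""
    best = max(clusters, key=lambda c: cluster_box_overlap(c, K, box))
    if cluster_box_overlap(best, K, box) >= min_ratio:
        return best.mean(axis=0)
    return None
```

The centroid of the winning cluster then supplies the depth that the camera alone cannot provide; the paper's tracking step additionally fuses the detection-frame center with the contour point cloud's pixel-plane velocities, which is not shown here.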
Pu Zhang, Jinqing Liu, Jinchao Xiao, Junfeng Xiong, Tianwei Feng, Zhongze Wang. Target Localization and Tracking Method Based on Camera and LiDAR Fusion[J]. Laser & Optoelectronics Progress, 2024, 61(8): 0828004
Category: Remote Sensing and Sensors
Received: Jun. 15, 2023
Accepted: Aug. 1, 2023
Published Online: Mar. 15, 2024
The Author Email: Liu Jinqing (jqliu8208@fjnu.edu.cn)