Laser Journal, Vol. 45, Issue 10, 101 (2024)

Fast tracking method for point targets in sequential moving images based on 3D laser point cloud

XU Shuxian, ZHAO Zhimei, and WU Hongnan
Author Affiliations
  • Guilin University of Technology, Guilin, Guangxi 541004, China

    Existing methods give insufficient consideration to small and occluded targets, so their tracking performance is poor. To address the difficulty of point-target tracking, a fast tracking method for point targets in sequential moving images based on 3D laser point clouds is proposed. First, the 3D laser point cloud of the sequence of moving images is downsampled and the ground data are removed. The Euclidean clustering method is then improved with a dynamic threshold to segment the image into targets and background. Next, the SECOND algorithm is improved by incorporating an adaptive spatial feature fusion module and a 2D convolutional neural network into the target detection stage; at the same time, 3D DIoU replaces Smooth L1 as the loss function to improve the detection performance of the SECOND algorithm. Finally, the three-dimensional center of each point target is constructed in the LiDAR coordinate system, and a 3D Kalman filter continuously tracks each target. Using intersection-over-union (IoU) and Euclidean distance as metrics, a greedy algorithm matches each detection to its nearest-neighbor target, achieving fast tracking of point targets in sequential moving images. Experimental results show that the proposed method achieves a higher IoU, reaching 0.971, and tracks targets more reliably.
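    The preprocessing step — downsampling the point cloud and removing ground data — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the voxel size and height threshold are assumed values, and the paper may well use plane fitting (e.g. RANSAC) rather than a simple height cut for ground removal.

```python
import numpy as np

def voxel_downsample(points, voxel=0.2):
    """Voxel-grid downsampling: keep the centroid of each occupied cell."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    inv = inv.ravel()  # guard against inverse-shape changes across NumPy versions
    sums = np.zeros((inv.max() + 1, 3))
    counts = np.zeros(inv.max() + 1)
    np.add.at(sums, inv, points)   # accumulate point coordinates per voxel
    np.add.at(counts, inv, 1)      # count points per voxel
    return sums / counts[:, None]

def remove_ground(points, z_thresh=-1.5):
    """Crude ground removal by height threshold (assumed value);
    plane fitting would be more robust on sloped terrain."""
    return points[points[:, 2] > z_thresh]
```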
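    The dynamic-threshold improvement to Euclidean clustering can be understood as letting the merge distance grow with range, since LiDAR returns get sparser farther from the sensor. The sketch below assumes a linear threshold model (`base_thresh + scale * range`) — the paper's exact threshold function and parameters are not given in the abstract, so these are illustrative choices.

```python
import numpy as np

def dynamic_threshold_euclidean_cluster(points, base_thresh=0.3, scale=0.01, min_pts=5):
    """Region-growing Euclidean clustering with a range-dependent threshold.

    Each point's merge radius is base_thresh + scale * (distance from sensor),
    so distant, sparse clusters are not fragmented. Returns a label per point
    (-1 for noise clusters smaller than min_pts).
    """
    n = len(points)
    ranges = np.linalg.norm(points, axis=1)
    thresh = base_thresh + scale * ranges       # per-point dynamic threshold
    labels = np.full(n, -1, dtype=int)          # -1 = unvisited / noise
    cluster_id = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue
        queue, members = [seed], [seed]
        labels[seed] = cluster_id
        while queue:
            i = queue.pop()
            d = np.linalg.norm(points - points[i], axis=1)
            neigh = np.where((d < thresh[i]) & (labels == -1))[0]
            labels[neigh] = cluster_id
            queue.extend(neigh.tolist())
            members.extend(neigh.tolist())
        if len(members) < min_pts:
            labels[np.array(members)] = -2      # mark small clusters as noise
        else:
            cluster_id += 1
    labels[labels == -2] = -1
    return labels
```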
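    The 3D DIoU loss that replaces Smooth L1 combines volumetric IoU with a center-distance penalty: loss = 1 - (IoU - d²/c²), where d is the distance between box centers and c is the diagonal of the smallest enclosing box. The sketch below handles only axis-aligned boxes (cx, cy, cz, l, w, h); the paper's SECOND variant also regresses yaw, which this simplification omits.

```python
import numpy as np

def diou_loss_3d(box_p, box_g):
    """1 - DIoU for two axis-aligned 3D boxes given as (cx, cy, cz, l, w, h)."""
    def bounds(b):
        c, dims = np.asarray(b[:3], float), np.asarray(b[3:6], float)
        return c - dims / 2, c + dims / 2

    pmin, pmax = bounds(box_p)
    gmin, gmax = bounds(box_g)
    # overlap volume (clipped at zero for disjoint boxes)
    inter = np.prod(np.clip(np.minimum(pmax, gmax) - np.maximum(pmin, gmin), 0, None))
    vol_p = np.prod(np.asarray(box_p[3:6], float))
    vol_g = np.prod(np.asarray(box_g[3:6], float))
    iou = inter / (vol_p + vol_g - inter + 1e-9)
    # squared distance between box centers
    d2 = np.sum((np.asarray(box_p[:3], float) - np.asarray(box_g[:3], float)) ** 2)
    # squared diagonal of the smallest enclosing box
    c2 = np.sum((np.maximum(pmax, gmax) - np.minimum(pmin, gmin)) ** 2) + 1e-9
    return 1.0 - (iou - d2 / c2)
```

    Unlike Smooth L1 on box offsets, this loss still produces a useful gradient when predicted and ground-truth boxes do not overlap, because the center-distance term keeps pulling the boxes together.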
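    The 3D Kalman filter over each target's center can be sketched with a constant-velocity motion model (state = [x, y, z, vx, vy, vz]). This is a minimal assumed model — the paper does not state its state vector, time step, or noise covariances, so all of those below are illustrative.

```python
import numpy as np

class CenterKalman3D:
    """Constant-velocity Kalman filter over a target's 3D center."""

    def __init__(self, center, dt=0.1):
        self.x = np.r_[np.asarray(center, float), np.zeros(3)]  # [pos, vel]
        self.P = np.eye(6)                        # state covariance
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)           # position += velocity * dt
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # observe position only
        self.Q = 0.01 * np.eye(6)                 # process noise (assumed)
        self.R = 0.1 * np.eye(3)                  # measurement noise (assumed)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:3]

    def update(self, z):
        y = np.asarray(z, float) - self.H @ self.x          # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)            # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]
```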
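    The greedy nearest-neighbor association between predicted tracks and new detections can be sketched as sorting all track–detection pairs by cost and committing each cheapest unused pair. For brevity this sketch uses Euclidean center distance alone as the cost; the paper combines it with 3D IoU, and the gating distance below is an assumed value.

```python
import numpy as np

def greedy_match(track_centers, det_centers, max_dist=2.0):
    """Greedily pair tracks with detections by smallest center distance.

    Pairs are taken in ascending-cost order; a pair is rejected if either
    side is already matched or the distance exceeds the gate max_dist.
    Returns a list of (track_index, detection_index) matches.
    """
    track_centers = np.asarray(track_centers, float)
    det_centers = np.asarray(det_centers, float)
    cost = np.linalg.norm(track_centers[:, None, :] - det_centers[None, :, :], axis=-1)
    matches, used_t, used_d = [], set(), set()
    for idx in np.argsort(cost, axis=None):          # cheapest pairs first
        t, d = np.unravel_index(idx, cost.shape)
        if t in used_t or d in used_d or cost[t, d] > max_dist:
            continue
        matches.append((int(t), int(d)))
        used_t.add(t)
        used_d.add(d)
    return matches
```

    Greedy matching is O(TD log TD) and suits the "fast tracking" goal; the Hungarian algorithm would give globally optimal assignments at higher cost.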


    XU Shuxian, ZHAO Zhimei, WU Hongnan. Fast tracking method for point targets in sequential moving images based on 3D laser point cloud[J]. Laser Journal, 2024, 45(10): 101

    Paper Information

    Received: Feb. 12, 2024

    Accepted: Jan. 2, 2025

    Published Online: Jan. 2, 2025

    DOI:10.14016/j.cnki.jgzz.2024.10.101
