Laser & Infrared, Volume 55, Issue 7, 1012 (2025)
LiDAR SLAM method based on BA/NDT and LOAM fusion
[1] Grisetti G, Stachniss C, Burgard W, et al. Improved techniques for grid mapping with Rao-Blackwellized particle filters[J]. IEEE Transactions on Robotics, 2007, 23(1): 34-46.
[2] Kohlbrecher S, von Stryk O, Meyer J, et al. A flexible and scalable SLAM system with full 3D motion estimation[C]//2011 IEEE International Symposium on Safety, Security, and Rescue Robotics. Kyoto: IEEE, 2011: 155-160.
[3] Hess W, Kohler D, Rapp H, et al. Real-time loop closure in 2D LIDAR SLAM[C]//2016 IEEE International Conference on Robotics and Automation (ICRA). Stockholm: IEEE, 2016: 1271-1278.
[4] Zhang J, Singh S. Lidar odometry and mapping in real-time[J]. Robotics: Science and Systems, 2014, 2(9): 1-9.
[5] Shan T X, Englot B. LeGO-LOAM: lightweight and ground-optimized lidar odometry and mapping on variable terrain[C]//2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Madrid: IEEE, 2018: 4758-4765.
[6] Shan T X, Englot B, Ratti C. LVI-SAM: tightly-coupled Lidar-visual-inertial odometry via smoothing and mapping[C]//2021 IEEE International Conference on Robotics and Automation (ICRA). Xi′an: IEEE, 2021: 5692-5698.
[7] Cattaneo D, Vaghi M, Valada A. LCDNet: deep loop closure detection and point cloud registration for LiDAR SLAM[J]. IEEE Transactions on Robotics, 2022, 38(4): 2074-2093.
[8] Tsintotas K A, Bampis L, Gasteratos A. The revisiting problem in simultaneous localization and mapping: a survey on visual loop closure detection[J]. IEEE Transactions on Intelligent Transportation Systems, 2022, 23(11): 19929-19953.
[9] Biber P, Strasser W. The normal distributions transform: a new approach to laser scan matching[C]//Proceedings 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003). Las Vegas: IEEE, 2003: 2743-2748.
[10] Magnusson M, Andreasson H, Nüchter A, et al. Appearance-based loop detection from 3D laser data using the normal distributions transform[C]//2009 IEEE International Conference on Robotics and Automation. Kobe: IEEE, 2009: 23-28.
[11] Kim G, Kim A. Scan Context: egocentric spatial descriptor for place recognition within 3D point cloud map[C]//2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Madrid: IEEE, 2018: 4802-4809.
[12] Behley J, Stachniss C. Efficient surfel-based SLAM using 3D laser range data in urban environments[EB/OL]. https://www.roboticsproceedings.org/rss14/p16.pdf.
[13] Chen X, Läbe T, Milioto A, et al. OverlapNet: loop closing for LiDAR-based SLAM[DB/OL]. (2021-05-24). https://arxiv.org/abs/2105.11344.
[14] Ma J Y, Zhang J, Xu J T, et al. OverlapTransformer: an efficient and yaw-angle-invariant transformer network for LiDAR-based place recognition[J]. IEEE Robotics and Automation Letters, 2022, 7(3): 6958-6965.
[15] Guadagnino T, Chen X, Sodano M, et al. Fast sparse LiDAR odometry using self-supervised feature selection on intensity images[J]. IEEE Robotics and Automation Letters, 2022, 7(3): 7597-7604.
[16] DeTone D, Malisiewicz T, Rabinovich A. SuperPoint: self-supervised interest point detection and description[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops. Salt Lake City: CVF, 2018: 224-236.
[17] Liu Z, Liu X Y, Zhang F. Efficient and consistent bundle adjustment on lidar point clouds[J]. IEEE Transactions on Robotics, 2023, 39(6): 4366-4386.
[18] Chen Y K, Liu J H, Zhang X Y, et al. VoxelNeXt: fully sparse VoxelNet for 3D object detection and tracking[C]//2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). Vancouver: IEEE, 2023: 21674-21683.
[21] Anon. KITTI datasets[DB/OL]. http://www.cvlibs.net/datasets/kitti.
Citation: YANG Kui, LIANG Dong-tai, HU Sheng-hui. LiDAR SLAM method based on BA/NDT and LOAM fusion[J]. Laser & Infrared, 2025, 55(7): 1012.
Received: Oct. 8, 2024
Accepted: Sep. 12, 2025
Published Online: Sep. 12, 2025
The Author Email: LIANG Dong-tai (liangdongtai@nbu.edu.cn)