APPLIED LASER, Volume 44, Issue 4, 113 (2024)

Research on 3D Multimodal Mapping Based on Lidar-Vision Fusion

Yang Xudong, Lai Huige*, Kang Wen, Wang Peng, Tao Han, and Li Shaodong
Author Affiliations
  • School of Mechanical Engineering, Ningxia University, Yinchuan 750021, Ningxia, China
References (26)

    [1] CADENA C, CARLONE L, CARRILLO H, et al. Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age[J]. IEEE Transactions on Robotics, 2016, 32(6): 1309-1332.

    [2] BOSSE M, ZLOT R, FLICK P. Zebedee: Design of a spring-mounted 3-D range sensor with application to mobile mapping[J]. IEEE Transactions on Robotics, 2012, 28(5): 1104-1119.

    [3] BOSSE M, ZLOT R. Continuous 3D scan-matching with a spinning 2D laser[C]//2009 IEEE International Conference on Robotics and Automation. Kobe, Japan. IEEE, 2009: 4312-4319.

    [4] PALIERI M, MORRELL B, THAKUR A, et al. LOCUS: A multi-sensor lidar-centric solution for high-precision odometry and 3D mapping in real-time[J]. IEEE Robotics and Automation Letters, 2021, 6(2): 421-428.

    [6] ZHOU Y, YAN F H, ZHOU Z. Handling pure camera rotation in semi-dense monocular SLAM[J]. The Visual Computer, 2019, 35(1): 123-132.

    [7] LI X, HE Y J, LIN J L, et al. Leveraging planar regularities for point line visual-inertial odometry[C]//2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Las Vegas, NV, USA. IEEE, 2020: 5120-5127.

    [8] ZHANG X Y, WANG W, QI X Y, et al. Stereo plane SLAM based on intersecting lines[C]//2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Prague, Czech Republic. IEEE, 2021: 6566-6572.

    [10] WEI X Y, HUANG J, MA X Y. Real-time monocular visual SLAM by combining points and lines[C]//2019 IEEE International Conference on Multimedia and Expo (ICME). Shanghai, China. IEEE, 2019: 103-108.

    [11] MICUSIK B, WILDENAUER H. Structure from motion with line segments under relaxed endpoint constraints[J]. International Journal of Computer Vision, 2017, 124(1): 65-79.

    [16] DAVISON A J, REID I D, MOLTON N D, et al. MonoSLAM: Real-time single camera SLAM[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2007, 29(6): 1052-1067.

    [17] KLEIN G, MURRAY D. Parallel tracking and mapping for small AR workspaces[C]//2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality. Nara, Japan. IEEE, 2007: 225-234.

    [18] KLEIN G, MURRAY D. Parallel tracking and mapping on a camera phone[C]//2009 8th IEEE International Symposium on Mixed and Augmented Reality. Orlando, FL, USA. IEEE, 2009: 83-86.

    [19] MUR-ARTAL R, MONTIEL J M M, TARDÓS J D. ORB-SLAM: A versatile and accurate monocular SLAM system[J]. IEEE Transactions on Robotics, 2015, 31(5): 1147-1163.

    [20] YANG Z F, GAO F, SHEN S J. Real-time monocular dense mapping on aerial robots using visual-inertial fusion[C]//2017 IEEE International Conference on Robotics and Automation (ICRA). Singapore. IEEE, 2017: 4552-4559.

    [21] LI Q Q, YU X J, QUERALTA J, et al. Robust multi-modal multi-LiDAR-inertial odometry and mapping for indoor environments[J]. arXiv preprint arXiv:2303.02684, 2023.

    [22] TIAN Y Z, LIU X N, LI L, et al. Intensity-assisted ICP for fast registration of 2D-LIDAR[J]. Sensors, 2019, 19(9): 2124.

    [23] ZHANG Z. A flexible new technique for camera calibration[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(11): 1330-1334.

    [24] JURIĆ A, KENDEŠ F, MARKOVIĆ I, et al. A comparison of graph optimization approaches for pose estimation in SLAM[C]//2021 44th International Convention on Information, Communication and Electronic Technology (MIPRO). Opatija, Croatia. IEEE, 2021: 1113-1118.

    [25] SHAN T X, ENGLOT B, RATTI C, et al. LVI-SAM: Tightly-coupled lidar-visual-inertial odometry via smoothing and mapping[C]//2021 IEEE International Conference on Robotics and Automation (ICRA). Xi'an, China. IEEE, 2021: 5692-5698.

    [26] BESL P J, MCKAY N D. Method for registration of 3-D shapes[C]//Proc SPIE 1611, Sensor Fusion IV: Control Paradigms and Data Structures. [s. l.]: SPIE, 1992, 1611: 586-606.

    [27] YE H Y, CHEN Y Y, LIU M. Tightly coupled 3D lidar inertial odometry and mapping[C]//2019 International Conference on Robotics and Automation (ICRA). Montreal, QC, Canada. IEEE, 2019: 3144-3150.

    [28] XU W, ZHANG F. FAST-LIO: A fast, robust LiDAR-inertial odometry package by tightly-coupled iterated Kalman filter[J]. IEEE Robotics and Automation Letters, 2021, 6(2): 3317-3324.

    [29] SHAN T X, ENGLOT B, MEYERS D, et al. LIO-SAM: Tightly-coupled lidar inertial odometry via smoothing and mapping[C]//2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). [s. l.]: IEEE, 2020: 5135-5142.

    [30] FORSTER C, CARLONE L, DELLAERT F, et al. On-manifold preintegration for real-time visual-inertial odometry[J]. IEEE Transactions on Robotics, 2017, 33(1): 1-21.

    [31] SOLÀ J. Quaternion kinematics for the error-state Kalman filter[J]. arXiv preprint arXiv:1711.02508, 2017.

    [32] BAO S, SHI W Z, FAN W Z, et al. A tight coupling mapping method to integrate the ESKF, g2o, and point cloud alignment[J]. The Journal of Supercomputing, 2022, 78(2): 1903-1922.

    Paper Information

Received: May 17, 2023

    Accepted: Dec. 13, 2024

    Published Online: Dec. 13, 2024

Corresponding author email: Lai Huige (1491081634@qq.com)

DOI: 10.14128/j.cnki.al.20244404.113
