Infrared and Laser Engineering, Volume 51, Issue 8, 20210651 (2022)

Research on a real-time odometry system integrating vision, LiDAR and IMU for autonomous driving

Yaozhong Zhao1, Jinlong Xian1, and Wei Gao2,*
Author Affiliations
  • 1Yimin Open Mine, China Huaneng Group Yimin Coal and Power Co., Hulunbeier 021134, China
  • 2Beijing Baidu Netcom Science Technology Co., Ltd., Beijing 100089, China
    References (14)

    [1] Zuo X, Geneva P, Lee W, et al. LIC-Fusion: LiDAR-inertial-camera odometry[C]//2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2019: 5848-5854.

    [2] Shan T, Englot B, Ratti C, et al. LVI-SAM: Tightly-coupled lidar-visual-inertial odometry via smoothing and mapping[EB/OL]. (2021-04-22)[2021-09-09]. http://arxiv.org/abs/2104.10831.

    [3] Ji Z, Singh S. Laser-visual-inertial odometry and mapping with high robustness and low drift[J]. Journal of Field Robotics, 35, 1242-1264 (2018).

    [4] Lin J, Zheng C, Xu W, et al. R2LIVE: A robust, real-time, LiDAR-inertial-visual tightly-coupled state estimator and mapping[EB/OL]. (2021-09-10)[2021-09-09]. http://arxiv.org/abs/2109.07982.

    [5] Whelan T. ICPCUDA[EB/OL]. (2019-05-01)[2021-09-09]. https://github.com/mp3guy/ICPCUDA.

    [6] Shi J, Tomasi C. Good features to track[C]//1994 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 1994: 593-600.

    [7] Detone D, Malisiewicz T, Rabinovich A. SuperPoint: Self-supervised interest point detection and description[C]//2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW). IEEE, 2018.

    [8] Sarlin P E, Detone D, Malisiewicz T, et al. SuperGlue: Learning feature matching with graph neural networks[C]//2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2020: 4937-4946.

    [9] Graeter J, Wilczynski A, Lauer M. LIMO: Lidar-monocular visual odometry[C]//2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018.

    [10] Ji Z, Kaess M, Singh S. On degeneracy of optimization-based state estimation problems[C]//2016 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2016.

    [11] Wang Shuai, Sun Huayan, Guo Huichao. Overlapping region extraction method for laser point clouds registration[J]. Infrared and Laser Engineering, 46, S126002 (2017).

    [12] Qin Tong, Li Peiliang, Shen Shaojie. VINS-Mono: A robust and versatile monocular visual-inertial state estimator[J]. IEEE Transactions on Robotics, 34, 1004-1020 (2018).

    [13] Yu Jiayong, Cheng Lang, Tian Maoyi, et al. Boresight parameters calibration method of VMLS system based on reference planar features constraint[J]. Infrared and Laser Engineering, 49, 20190524 (2020).

    [14] Geiger A, Lenz P, Urtasun R. Are we ready for autonomous driving? The KITTI vision benchmark suite[C]//2012 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2012: 3354-3361.

    Paper Information

    Category: Lasers & Laser optics

    Received: Sep. 9, 2021

    Accepted: Nov. 2, 2021

    Published Online: Jan. 9, 2023

    The Author Email: Wei Gao (xwgaowei@163.com)

    DOI: 10.3788/IRLA20210651
