Acta Photonica Sinica, Volume 53, Issue 4, 0415001 (2024)

Global Low Bias Visual/inertial/weak-positional-aided Fusion Navigation System

Yufeng XU, Yuanzhi LIU, Minghui QIN, Hui ZHAO, and Wei TAO*
Author Affiliations
  • School of Sensing Science and Engineering, School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
    References (25)

    [1] WANG Z, WU Y, NIU Q. Multi-sensor fusion in automated driving: a survey[J]. IEEE Access, 8, 2847-2868(2020).

    [2] CHEN C, ZHU H, LI M et al. A review of visual-inertial simultaneous localization and mapping from filtering-based and optimization-based perspectives[J]. Robotics, 7, 45(2018).

    [3] ZHANG Yu, XU Xiping, ZHANG Ning et al. Research on visual odometry based on catadioptric panoramic camera[J]. Acta Photonica Sinica, 50, 0415002(2021).

    [4] LI M, MOURIKIS A I. Improving the accuracy of EKF-based visual-inertial odometry[C], 828-835(2012).

    [5] GENEVA P, ECKENHOFF K, LEE W et al. OpenVINS: a research platform for visual-inertial estimation[C], 4666-4672(2020).

    [6] QIN T, LI P, SHEN S. VINS-Mono: a robust and versatile monocular visual-inertial state estimator[J]. IEEE Transactions on Robotics, 34, 1004-1020(2018).

    [7] CAMPOS C, ELVIRA R, RODRÍGUEZ J J G et al. ORB-SLAM3: an accurate open-source library for visual, visual-inertial, and multimap SLAM[J]. IEEE Transactions on Robotics, 37, 1874-1890(2021).

    [8] EBADI K, CHANG Y, PALIERI M et al. LAMP: large-scale autonomous mapping and positioning for exploration of perceptually-degraded subterranean environments[C], 80-86(2020).

    [9] LI T, PEI L, XIANG Y et al. P3-VINS: tightly-coupled PPP/INS/visual SLAM based on optimization approach[J]. IEEE Robotics and Automation Letters, 7, 7021-7027(2022).

    [10] CHU Jinkui, CHEN Jianhua, LI Jinshan et al. Polarized light/binocular vision bionic integrated navigation method[J]. Acta Photonica Sinica, 50, 0528001(2021).

    [11] GONG Z, LIU P, WEN F et al. Graph-based adaptive fusion of GNSS and VIO under intermittent GNSS-degraded environment[J]. IEEE Transactions on Instrumentation and Measurement, 70, 1-16(2021).

    [12] QIN T, PAN J, CAO S et al. A general optimization-based framework for local odometry estimation with multiple sensors[J]. arXiv Preprint(2019).

    [13] CAO S, LU X, SHEN S. GVINS: tightly coupled GNSS-visual-inertial fusion for smooth and consistent state estimation[J]. IEEE Transactions on Robotics, 38, 2004-2021(2022).

    [14] ALARIFI A, AL-SALMAN A, ALSALEH M et al. Ultra wideband indoor positioning technologies: analysis and recent advances[J]. Sensors, 16, 707(2016).

    [15] LUTZ P, SCHUSTER M J, STEIDLE F. Visual-inertial SLAM aided estimation of anchor poses and sensor error model parameters of UWB radio modules[C], 739-746(2019).

    [16] ZIZZO G, REN L. Position tracking during human walking using an integrated wearable sensing system[J]. Sensors, 17, 2866(2017).

    [17] OŠČÁDAL P, HECZKO D, VYSOCKÝ A et al. Improved pose estimation of Aruco tags using a novel 3D placement strategy[J]. Sensors, 20, 4825(2020).

    [18] ROMERO-RAMIREZ F J, MUÑOZ-SALINAS R, MEDINA-CARNICER R. Speeded up detection of squared fiducial markers[J]. Image and Vision Computing, 76, 38-47(2018).

    [19] PFROMMER B, SANKET N, DANIILIDIS K et al. PennCOSYVIO: a challenging visual inertial odometry benchmark[C], 3847-3854(2017).

    [20] HU Yue, LI Xu, XU Qimin et al. Reliable positioning method of intelligent vehicles based on factor graph in GNSS-denied environment[J]. Chinese Journal of Scientific Instrument, 42, 79-86(2021).

    [21] KBAYER N, SAHMOUDI M. Performances analysis of GNSS NLOS bias correction in urban environment using a three-dimensional city model and GNSS simulator[J]. IEEE Transactions on Aerospace and Electronic Systems, 54, 1799-1814(2018).

    [22] WANG S, DONG X, LIU G et al. GNSS RTK/UWB/DBA fusion positioning method and its performance evaluation[J]. Remote Sensing, 14, 5928(2022).

    [23] QIN T, SHEN S. Online temporal calibration for monocular visual-inertial systems[C], 3662-3669(2018).

    [24] LIU Y, FU Y, QIN M et al. BotanicGarden: a high-quality and large-scale robot navigation dataset in challenging natural environments[J]. arXiv Preprint(2023).

    [25] STURM J, ENGELHARD N, ENDRES F et al. A benchmark for the evaluation of RGB-D SLAM systems[C], 573-580(2012).

    Citation

    Yufeng XU, Yuanzhi LIU, Minghui QIN, Hui ZHAO, Wei TAO. Global Low Bias Visual/inertial/weak-positional-aided Fusion Navigation System[J]. Acta Photonica Sinica, 2024, 53(4): 0415001

    Paper Information

    Category: Machine Vision

    Received: Oct. 26, 2023

    Accepted: Dec. 13, 2023

    Published Online: May 15, 2024

    Corresponding Author Email: Wei TAO (taowei@sjtu.edu.cn)

    DOI: 10.3788/gzxb20245304.0415001
