Acta Photonica Sinica, Volume 53, Issue 4, 0415001 (2024)

Global Low Bias Visual/inertial/weak-positional-aided Fusion Navigation System

Yufeng XU, Yuanzhi LIU, Minghui QIN, Hui ZHAO, and Wei TAO*
Author Affiliations
  • School of Sensing Science and Engineering, School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
    Figures & Tables (14)
    • Figure 1. Overall framework of our method
    • Figure 2. Factor graph model
    • Figure 3. ArUco target coordinate system definition and coordinate transformation between camera and ArUco target
    • Figure 4. Diagram of the experimental site
    • Figure 5. Wheeled robot platform and its sensor distribution
    • Figure 6. Laser point cloud map generation and ground-truth trajectory generation
    • Figure 7. Navigation result trajectories of each method under three scenarios
    • Figure 8. Outdoor scene feature point tracking at night
    • Figure 9. Examples of ArUco target distribution along the path, and ArUco target detection
    • Figure 10. Navigation trajectory comparison with or without ArUco assistance
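    Figure 3 and Figure 9 above concern recovering the camera pose from ArUco targets. As a rough illustration of the camera-to-target coordinate transformation they depict, the sketch below uses OpenCV's ArUco module (exact API details vary across OpenCV versions); the marker side length, dictionary choice, and the intrinsics K/dist are illustrative assumptions, not values from the paper.

      # Hypothetical sketch of the camera <-> ArUco transformation in Figure 3.
      # Marker size, dictionary, and intrinsics are assumed, not from the paper.
      import cv2
      import numpy as np

      MARKER_LEN = 0.15  # marker side length in metres (assumed)

      # Corner coordinates in the marker (target) frame: z = 0 plane,
      # origin at the marker centre, following OpenCV's ArUco convention.
      OBJ_PTS = np.array([
          [-MARKER_LEN / 2,  MARKER_LEN / 2, 0],
          [ MARKER_LEN / 2,  MARKER_LEN / 2, 0],
          [ MARKER_LEN / 2, -MARKER_LEN / 2, 0],
          [-MARKER_LEN / 2, -MARKER_LEN / 2, 0],
      ], dtype=np.float32)

      def camera_pose_in_marker_frame(gray, K, dist):
          """Return the 4x4 pose of the camera in the marker frame, or None."""
          dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
          corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
          if ids is None:
              return None
          # PnP on the 4 corners gives the marker pose in the camera frame.
          ok, rvec, tvec = cv2.solvePnP(OBJ_PTS, corners[0].reshape(4, 2), K, dist)
          if not ok:
              return None
          R, _ = cv2.Rodrigues(rvec)            # rotation: marker -> camera
          T_cam_marker = np.eye(4)
          T_cam_marker[:3, :3] = R
          T_cam_marker[:3, 3] = tvec.ravel()
          # Inverting yields the camera pose expressed in the marker frame.
          return np.linalg.inv(T_cam_marker)

    Given a surveyed global pose of the target, composing it with this camera-in-marker pose would yield a global camera position of the kind that can enter the factor graph (Figure 2) as a weak positional constraint.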
    • Table 1. System hardware parameters

      Type of sensor   Model                    Frequency/Hz  Specification
      Gray camera      DALSA M1930              20            Resolution: 960×600
      IMU              Xsens Mti-680G           400           Noise density: 0.002 (°/s)/√Hz (gyro), 0.01 (m/s²)/√Hz (accelerometer)
      GNSS             u-blox ZED-F9P-01B       10            Horizontal position accuracy (PVT): 1.5 m CEP
      Ultrasonic tag   Marvelmind Super-Beacon  0.1~7         Accuracy: 1%~3% of base-station distance
      LiDAR            Velodyne VLP16           10            FoV: 360°×30°; accuracy: ±3 cm @ 100 m
    • Table 2. Error test results under different experimental conditions; each cell gives mean/median/RMSE

      Sequence              Evaluation index  Ours               VINS-Mono          ORB-SLAM3
      Indoor                RPE/%↓            0.837/0.669/1.044  1.401/1.317/1.573  4.193/4.496/4.543
                            ATE/m↓            0.619/0.556/0.740  3.359/3.587/3.523  3.041/2.759/3.653
      Daytime in-outdoor    RPE/%↓            2.622/2.177/3.234  3.039/2.382/3.786  7.091/7.148/7.426
                            ATE/m↓            2.813/2.627/3.325  5.017/4.867/5.435  8.861/8.262/10.32
      Nighttime in-outdoor  RPE/%↓            2.028/1.743/2.278  5.451/3.323/7.387  10.32/9.899/11.61
                            ATE/m↓            2.988/3.108/3.495  9.313/8.983/10.77  12.55/10.62/15.02
    • Table 3. Error test results with or without ArUco assistance; each cell gives mean/median/RMSE

      Evaluation index  VIO+ArUco (ours)   VIO
      RPE/%↓            3.518/3.182/4.023  3.645/3.394/4.106
      ATE/m↓            2.917/2.935/3.045  6.412/7.132/6.683
    • Table 4. Single-run processing time of the proposed method in different scenarios

      Scenario              Average time spent/ms
      Indoor                29.04
      Daytime outdoor       23.51
      Nighttime outdoor     27.39
      Daytime in-outdoor    23.54
      Nighttime in-outdoor  27.95
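
    Tables 2 and 3 report RPE and ATE as mean/median/RMSE triplets. The paper's exact evaluation tooling and alignment settings are not stated in this listing, but as a point of reference, a minimal sketch of the usual ATE computation (rigid Kabsch/Umeyama alignment of timestamp-matched positions, then per-pose translation errors) might look like this:

      # Hypothetical sketch of the ATE statistics (mean/median/RMSE) in
      # Tables 2-3; the paper's actual evaluation protocol is not given here.
      import numpy as np

      def ate_stats(est, gt):
          """est, gt: (N, 3) matched positions. Returns (mean, median, RMSE)."""
          # Rigid (rotation + translation, no scale) alignment of est onto gt.
          mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
          H = (est - mu_e).T @ (gt - mu_g)            # 3x3 cross-covariance
          U, _, Vt = np.linalg.svd(H)
          d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflection
          R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
          t = mu_g - R @ mu_e
          err = np.linalg.norm((R @ est.T).T + t - gt, axis=1)
          return err.mean(), np.median(err), np.sqrt((err ** 2).mean())

    RPE is computed analogously, but on relative motions over a fixed interval rather than on globally aligned absolute poses.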
    Citation

    Yufeng XU, Yuanzhi LIU, Minghui QIN, Hui ZHAO, Wei TAO. Global Low Bias Visual/inertial/weak-positional-aided Fusion Navigation System[J]. Acta Photonica Sinica, 2024, 53(4): 0415001

    Paper Information

    Category: Machine Vision

    Received: Oct. 26, 2023

    Accepted: Dec. 13, 2023

    Published Online: May 15, 2024

    The Author Email: Wei TAO (taowei@sjtu.edu.cn)

    DOI: 10.3788/gzxb20245304.0415001
