Laser & Optoelectronics Progress, Volume 57, Issue 10, 101509 (2020)

Inertial Navigation Aided Image Feature Matching Method

Bin Wu and Xuri Wang*
Author Affiliations
  • State Key Laboratory of Precision Measuring Technology and Instruments, Tianjin University, Tianjin 300072, China
    Figures & Tables (15)
    Process of three-dimensional rotation. (a) Process of rotation; (b) three-dimensional rotation
    Process of coordinate transformation
    Calculation process of displacement variation in navigation coordinate system
    Epipolar geometric constraint
    Process of inertial navigation aided image feature matching
    Experimental equipment
    Source images. (a) First image of experimental table; (b) second image of experimental table; (c) first image of laboratory; (d) second image of laboratory; (e) first image of office; (f) second image of office
    Brute-force match results of three sets of images
    Possible areas. (a) Four feature points selected in the first image of experimental table; (b) possible areas corresponding to the feature points in the first image of experimental table; (c) four feature points selected in the first image of laboratory; (d) possible areas corresponding to the feature points in the first image of laboratory; (e) four feature points selected in the first image of office; (f) possible areas corresponding to the feature points in the first image of office
    Match results of three sets of images after adding constraints
    Match results of three sets of images after removing outliers by RANSAC
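
    Read in sequence, the figure captions above outline the method: brute-force feature matching, a constraint ("possible areas") derived from the inertial navigation data through the epipolar geometry, and a final RANSAC outlier-removal step. The sketch below is a minimal, hypothetical reconstruction of that pipeline in Python/OpenCV; the ORB detector, the Hamming brute-force matcher, and both thresholds are assumptions (this page does not state which detector or thresholds the authors used), and F_imu stands for the inertial-navigation-derived fundamental matrix of Table 3.

    ```python
    import cv2
    import numpy as np

    def inertial_aided_match(img1, img2, F_imu, dist_thresh=3.0):
        """Sketch: brute-force matching, epipolar ("possible area") filtering
        with the IMU-derived fundamental matrix, then RANSAC."""
        orb = cv2.ORB_create(2000)                       # detector choice is an assumption
        kp1, des1 = orb.detectAndCompute(img1, None)
        kp2, des2 = orb.detectAndCompute(img2, None)

        matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
        pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
        pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

        # Keep a match only if the point in image 2 lies close to the epipolar
        # line predicted by the inertial-navigation fundamental matrix.
        lines = cv2.computeCorrespondEpilines(pts1.reshape(-1, 1, 2), 1, F_imu).reshape(-1, 3)
        d = np.abs(np.sum(lines * np.hstack([pts2, np.ones((len(pts2), 1))]), axis=1))
        d /= np.linalg.norm(lines[:, :2], axis=1)
        keep = d < dist_thresh
        pts1, pts2 = pts1[keep], pts2[keep]

        # Final outlier rejection with RANSAC.
        F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
        inliers = mask.ravel() == 1
        return pts1[inliers], pts2[inliers]
    ```

    Applying the inertial constraint before RANSAC lowers the outlier ratio that RANSAC has to cope with, which is the motivation suggested by the brute-force versus constrained match figures above.
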
    • Table 1. Rotation matrix

      Scene                 Rotation matrix
      Experimental table    [ 0.948665  -0.005961   0.316228
                              0.008123   0.999952  -0.005519
                             -0.316180   0.007804   0.948667 ]
      Laboratory            [ 0.998534   0.004594  -0.053926
                              0.002084   0.992389   0.123124
                              0.054082  -0.123056   0.990925 ]
      Office                [ 0.996481  -0.048260  -0.068532
                              0.061619   0.976050   0.208639
                              0.056821  -0.212128   0.975589 ]
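
    Each entry in Table 1 is the relative camera rotation recovered from the inertial attitude, so it should be an orthonormal matrix with determinant +1. A minimal numpy sanity check on the experimental-table entry (the 1e-4 tolerance is an arbitrary choice, not from the paper):

    ```python
    import numpy as np

    # Rotation matrix for the "Experimental table" scene, from Table 1.
    R = np.array([[ 0.948665, -0.005961,  0.316228],
                  [ 0.008123,  0.999952, -0.005519],
                  [-0.316180,  0.007804,  0.948667]])

    print(np.allclose(R.T @ R, np.eye(3), atol=1e-4))  # ~identity: orthonormal
    print(np.linalg.det(R))                            # ~1: a proper rotation
    ```
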
    • Table 2. Translation vector

      Scene                 Translation vector /mm
      Experimental table    [13.473121   0.004500   2.186387]^T
      Laboratory            [ 9.200683  -4.771147   0.862319]^T
      Office                [ 4.823290   1.268367  -0.356531]^T
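
    Tables 1 and 2 together give the relative pose (R, t) between the two views. In standard epipolar geometry these combine into the essential matrix E = [t]_x R, where [t]_x is the skew-symmetric matrix of the translation; the sketch below uses the experimental-table values (the sign and unit conventions here are the textbook ones and are not confirmed by this page):

    ```python
    import numpy as np

    def skew(t):
        """Skew-symmetric matrix [t]_x, so that skew(t) @ v == np.cross(t, v)."""
        return np.array([[  0.0, -t[2],  t[1]],
                         [ t[2],   0.0, -t[0]],
                         [-t[1],  t[0],   0.0]])

    # Relative pose for the "Experimental table" scene (Tables 1 and 2).
    R = np.array([[ 0.948665, -0.005961,  0.316228],
                  [ 0.008123,  0.999952, -0.005519],
                  [-0.316180,  0.007804,  0.948667]])
    t = np.array([13.473121, 0.004500, 2.186387])   # mm

    E = skew(t) @ R   # essential matrix in normalized image coordinates
    ```
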
    • Table 3. Fundamental matrix

      Scene                 Fundamental matrix
      Experimental table    [-1.565037×10^-9   -1.791871×10^-7    1.964372×10^-4
                              5.191469×10^-7   -9.682601×10^-9   -0.004100
                             -5.179737×10^-4    0.004090          -0.100741      ]
      Laboratory            [-2.130416×10^-8   -2.201791×10^-8   -1.334185×10^-3
                              2.979019×10^-8    9.308407×10^-8   -2.758886×10^-3
                              1.364791×10^-3    2.548693×10^-3    9.785125×10^-1 ]
      Office                [ 7.710623×10^-9    6.469725×10^-9    3.591123×10^-4
                             -5.158159×10^-8    8.523534×10^-8   -1.365610×10^-3
                             -2.318040×10^-4    1.266611×10^-3    9.907891×10^-1 ]
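
    Table 3 lists the fundamental matrix predicted from the inertial pose for each scene (conceptually F = K^-T [t]_x R K^-1 once the camera intrinsics K are folded in, though K is not reported on this page). A minimal sketch of how such an F constrains matching: a candidate point in the second image is accepted only if it lies near the epipolar line of its counterpart, which is one way the "possible areas" in the figure list can be realized. The pixel coordinates below are made up purely to show the call:

    ```python
    import numpy as np

    # Fundamental matrix for the "Experimental table" scene, from Table 3.
    F = np.array([[-1.565037e-9, -1.791871e-7,  1.964372e-4],
                  [ 5.191469e-7, -9.682601e-9, -4.100000e-3],
                  [-5.179737e-4,  4.090000e-3, -1.007410e-1]])

    def epipolar_distance(F, p1, p2):
        """Distance (pixels) from p2 to the epipolar line F @ [p1, 1] in image 2.
        Small values indicate a geometrically consistent match."""
        a, b, c = F @ np.array([p1[0], p1[1], 1.0])
        return abs(a * p2[0] + b * p2[1] + c) / np.hypot(a, b)

    # Hypothetical pixel coordinates; a threshold on this distance defines the
    # search band ("possible area") in the second image.
    print(epipolar_distance(F, (320.0, 240.0), (335.0, 250.0)))
    ```
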
    • Table 4. Matching results of three sets of images

      Scene                 Matching points of inertial navigation    Matching points    Accuracy
                            aided image feature matching method       of RANSAC          rate /%
      Experimental table    145                                       134                92.4
      Laboratory            293                                       283                96.6
      Office                171                                       161                94.2
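
    The accuracy rate in Table 4 is the ratio of matches surviving RANSAC to the matches retained by the inertial-navigation-aided method, expressed as a percentage; the tabulated values are consistent with this:

    ```python
    # Accuracy rate = RANSAC matching points / method matching points × 100 (Table 4).
    for scene, matched, inliers in [("Experimental table", 145, 134),
                                    ("Laboratory", 293, 283),
                                    ("Office", 171, 161)]:
        print(f"{scene}: {100 * inliers / matched:.1f}%")   # 92.4, 96.6, 94.2
    ```
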
    Citation

    Bin Wu, Xuri Wang. Inertial Navigation Aided Image Feature Matching Method[J]. Laser & Optoelectronics Progress, 2020, 57(10): 101509

    Paper Information

    Category: Machine Vision

    Received: Nov. 9, 2019

    Accepted: Dec. 6, 2019

    Published Online: May 8, 2020

    The Author Email: Xuri Wang (wangxuri@tju.edu.cn)

    DOI: 10.3788/LOP57.101509
