Chinese Journal of Lasers, Vol. 49, Issue 6, 0610001 (2022)

Robot Pose Estimation Method Based on Image and Point Cloud Fusion with Dynamic Feature Elimination

Lei Zhang1,2, Xiaobin Xu1,2,3,4,*, Chenfei Cao1,2, Jia He1,2, Yingying Ran1,2, Zhiying Tan1,2, and Minzhou Luo1,2
Author Affiliations
  • 1College of Mechanical & Electrical Engineering, Hohai University, Changzhou, Jiangsu 213022, China
  • 2Jiangsu Key Laboratory of Special Robot Technology, Hohai University, Changzhou, Jiangsu 213022, China
  • 3College of Mechanical and Electrical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing, Jiangsu 210016, China
  • 4Changzhou Changgong Electronic Technology Co., Ltd., Changzhou, Jiangsu 213001, China
    Figures & Tables (13)
    • Overall framework of pose estimation
    • Dynamic experimental scenes. (a) 05 sequence experiment scene; (b) 08 sequence experiment scene
    • Elimination of dynamic features. (a) Elimination of dynamic point clouds; (b) elimination of dynamic feature points
    • Box plots of pose error. (a) Angular error box plot; (b) displacement error box plot
    • Experimental platform and experimental scene. (a) Experimental platform; (b) experimental scene
    • Candidate frame extraction by deep learning. (a) Image candidate frame; (b) point cloud candidate frame
    • Running time box plot
    • Table 1. Nonlinear optimization algorithm based on RANSAC

      Input: 3-D points of the previous frame P_last; 2-D feature points of the current frame p_current; camera intrinsic matrix K; Jacobian matrix J of the error function with respect to pose
      Output: camera pose T of the current frame
      1: use EPnP to obtain an initial camera pose T_0
      2: T ← T_0
      3: sum of reprojection errors e_total = 0
      4: for i = 1 → maximum number of iterations n do
      5:   for P_j ∈ P_last, p_j ∈ p_current do
      6:     reprojection error e_j = |p_j − K T P_j / s_j|
      7:     if e_j > error threshold t_0 then
      8:       record P_j and p_j
      9:       continue
      10:    end if
      11:    e_total ← e_total + e_j
      12:    compute the Hessian matrix H ← J J^T and g = −J e_j
      13:    compute the iteration step ΔT ← H^(−1) g
      14:    T = ΔT · T
      15:  end for
      16:  eliminate the recorded P_j and p_j
      17: end for
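The loop in Table 1 can be sketched in code. The following is a minimal illustration, not the paper's implementation: it assumes a pinhole camera model, refines only the translation part of the pose by Gauss-Newton (the paper updates the full pose ΔT), and the function names (`project`, `refine_translation`) and the threshold values are hypothetical.

```python
import numpy as np

def project(K, R, t, P):
    """Project a world point P into the image with rotation R, translation t, intrinsics K."""
    Pc = R @ P + t                       # camera-frame point; Pc[2] is the depth s_j
    return (K @ Pc)[:2] / Pc[2]          # perspective division

def refine_translation(K, R, t0, P_last, p_current, n_iters=10, thresh=5.0):
    """Gauss-Newton refinement with outlier elimination, in the spirit of Table 1.

    Correspondences whose reprojection error e_j exceeds `thresh` are recorded
    and eliminated before the next iteration; only the translation is updated
    here, as a simplification of the full SE(3) update in the paper."""
    t = np.asarray(t0, float).copy()
    pts3d = [np.asarray(P, float) for P in P_last]
    pts2d = [np.asarray(p, float) for p in p_current]
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    for _ in range(n_iters):
        H = np.zeros((3, 3))             # Hessian approximation J^T J
        g = np.zeros(3)                  # gradient-side term J^T e
        outliers = set()
        for j, (Pj, pj) in enumerate(zip(pts3d, pts2d)):
            X, Y, Z = R @ Pj + t
            e = pj - np.array([fx * X / Z + cx, fy * Y / Z + cy])  # reprojection error e_j
            if np.linalg.norm(e) > thresh:
                outliers.add(j)          # record P_j, p_j for elimination
                continue
            J = np.array([[fx / Z, 0.0, -fx * X / Z**2],           # d(projection)/dt
                          [0.0, fy / Z, -fy * Y / Z**2]])
            H += J.T @ J
            g += J.T @ e
        if np.linalg.det(H) > 1e-12:
            t += np.linalg.solve(H, g)   # iteration step: solve H * delta = g
        # eliminate the recorded correspondences before the next iteration
        pts3d = [P for j, P in enumerate(pts3d) if j not in outliers]
        pts2d = [p for j, p in enumerate(pts2d) if j not in outliers]
    return t
```

Seeding the loop with an EPnP solution, as in step 1 of Table 1, would replace the hand-picked initial guess `t0` here; eliminating the recorded outliers each pass is what gives the scheme its RANSAC-like robustness to dynamic features.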
    • Table 2. Mean error of comparison algorithms in different scenarios

      | Sequence | Algorithm | Pitch angle /(°) | Yaw angle /(°) | Roll angle /(°) | x /m | y /m | z /m |
      |---|---|---|---|---|---|---|---|
      | 05 | BA | 0.0669 | 0.0391 | 0.0363 | 0.0197 | 0.0104 | 0.0136 |
      | 05 | Visual | 0.0302 | 0.0273 | 0.0380 | 0.0126 | 0.0087 | 0.0192 |
      | 05 | LOAM | 0.0376 | 0.0138 | 0.0377 | 0.0160 | 0.0091 | 0.0093 |
      | 05 | LIDAR | 0.0374 | 0.0148 | 0.0383 | 0.0160 | 0.0094 | 0.0107 |
      | 08 | BA | 0.0451 | 0.0359 | 0.0478 | 0.0318 | 0.0196 | 0.0225 |
      | 08 | Visual | 0.0106 | 0.0248 | 0.0135 | 0.0096 | 0.0031 | 0.0059 |
      | 08 | LOAM | 0.0052 | 0.0097 | 0.0049 | 0.0055 | 0.0016 | 0.0060 |
      | 08 | LIDAR | 0.0049 | 0.0080 | 0.0046 | 0.0052 | 0.0015 | 0.0073 |
    • Table 3. Error standard deviation of comparison algorithms in different scenarios

      | Sequence | Algorithm | Pitch angle /(°) | Yaw angle /(°) | Roll angle /(°) | x /m | y /m | z /m |
      |---|---|---|---|---|---|---|---|
      | 05 | BA | 0.0930 | 0.0263 | 0.0321 | 0.0099 | 0.0080 | 0.0103 |
      | 05 | Visual | 0.0392 | 0.0216 | 0.0295 | 0.0112 | 0.0069 | 0.0131 |
      | 05 | LOAM | 0.0322 | 0.0075 | 0.0221 | 0.0043 | 0.0074 | 0.0054 |
      | 05 | LIDAR | 0.0330 | 0.0083 | 0.0215 | 0.0044 | 0.0074 | 0.0057 |
      | 08 | BA | 0.0431 | 0.0305 | 0.0446 | 0.0350 | 0.0166 | 0.0306 |
      | 08 | Visual | 0.0129 | 0.0154 | 0.0115 | 0.0095 | 0.0014 | 0.0053 |
      | 08 | LOAM | 0.0046 | 0.0054 | 0.0029 | 0.0038 | 0.0010 | 0.0044 |
      | 08 | LIDAR | 0.0047 | 0.0071 | 0.0027 | 0.0036 | 0.0009 | 0.0041 |
    • Table 4. Mean error of different algorithms

      | Sequence | Algorithm | Pitch angle /(°) | Yaw angle /(°) | Roll angle /(°) | x /m | y /m | z /m |
      |---|---|---|---|---|---|---|---|
      | 05 | BA[29] | 0.0669 | 0.0391 | 0.0363 | 0.0197 | 0.0104 | 0.0136 |
      | 05 | ORBSLAM2[19] | 0.0307 | 0.0114 | 0.0283 | 0.0099 | 0.0223 | 0.0243 |
      | 05 | LOAM[23] | 0.0376 | 0.0138 | 0.0377 | 0.0160 | 0.0091 | 0.0093 |
      | 05 | Fusion | 0.0332 | 0.0081 | 0.0221 | 0.0046 | 0.0070 | 0.0057 |
      | 08 | BA[29] | 0.0451 | 0.0359 | 0.0478 | 0.0318 | 0.0196 | 0.0225 |
      | 08 | ORBSLAM2[19] | 0.0067 | 0.0177 | 0.0076 | 0.0046 | 0.0023 | 0.0068 |
      | 08 | LOAM[23] | 0.0052 | 0.0097 | 0.0049 | 0.0055 | 0.0016 | 0.0060 |
      | 08 | Fusion | 0.0060 | 0.0086 | 0.0064 | 0.0050 | 0.0018 | 0.0050 |
    • Table 5. Comparison of experimental results of relative pose errors in indoor dynamic scenes

      | Algorithm | Pitch angle /(°) | Yaw angle /(°) | Roll angle /(°) | x /m | y /m | z /m |
      |---|---|---|---|---|---|---|
      | BA | 0.0255 | 1.1481 | 0.0134 | 0.0123 | 0.0016 | 0.1018 |
      | ORBSLAM2 | 0.0323 | 0.0901 | 0.0416 | 0.0058 | 0.0038 | 0.0033 |
      | Visual | 0.0084 | 0.3568 | 0.0106 | 0.0086 | 0.0026 | 0.0134 |
      | LOAM | 0.0779 | 0.0953 | 0.2583 | 0.0071 | 0.0098 | 0.0019 |
      | LIDAR | 0.0048 | 0.0198 | 0.1041 | 0.0058 | 0.0024 | 0.0052 |
      | Fusion | 0.0035 | 0.0178 | 0.0926 | 0.0044 | 0.0024 | 0.0060 |
    • Table 6. Average running time of different algorithms

      | Algorithm | Average running time /s |
      |---|---|
      | BA | 0.0317 |
      | ORBSLAM2 | 0.1462 |
      | LOAM | 0.0242 |
      | Fusion | 0.2974 |
    Lei Zhang, Xiaobin Xu, Chenfei Cao, Jia He, Yingying Ran, Zhiying Tan, Minzhou Luo. Robot Pose Estimation Method Based on Image and Point Cloud Fusion with Dynamic Feature Elimination[J]. Chinese Journal of Lasers, 2022, 49(6): 0610001

    Paper Information

    Received: Jun. 9, 2021

    Accepted: Aug. 10, 2021

    Published Online: Mar. 2, 2022

    Corresponding author: Xiaobin Xu (xxbtc@hhu.edu.cn)

    DOI: 10.3788/CJL202249.0610001
