Laser & Optoelectronics Progress, Volume 59, Issue 14, 1415002 (2022)

Research Progresses of Pose Estimation Based on Virtual Cameras

Anhu Li†,*, Zhaojun Deng†, Xingsheng Liu, and Hao Chen
Author Affiliations
  • School of Mechanical Engineering, Tongji University, Shanghai 201804, China
    Figures & Tables (13)
    Schematic of a laser tracker[20]
    Basic schematic diagrams of mainstream inertial-unit-based pose estimation systems[32-33]. (a) Pose estimation system based on laser inertial units; (b) pose estimation system based on fiber-optic inertial units; (c) pose estimation system based on MEMS inertial units
    Virtual camera imaging system based on a biprism. (a) Virtual viewpoint imaging system[55]; (b) virtual camera imaging system[56]
    Virtual camera imaging system based on a micro-prism array[64-65]. (a) System schematic; (b) reconstruction principle; (c) beam propagation principle; (d) system composition; (e) traditional endoscopic imaging; (f) virtual camera imaging
    Virtual camera imaging system based on grating diffraction[68]
    Mirror-based adjustable virtual camera imaging system[70-72]. (a) Single mirror; (b) double mirrors; (c) triple mirrors; (d) triple mirrors with a beam splitter; (e) four mirrors; (f) double mirrors with a prismatic mirror
    Schematic illustration of 3D imaging using FMCW LiDAR[81]. (a) System layout; (b) multi-beam scanning mechanism using Risley prisms
    Pose estimation system based on dynamic virtual cameras[86]
    Basic principle of target pose estimation based on point feature[90]
    Pose estimation based on CAD template matching[104]
    Basic principle of pose estimation method based on graph neural network[133]
    • Table 1. Performance comparison of mainstream pose estimation systems

      | Name | Accuracy | Efficiency | Cooperative target | Range | Contact | Adaptability | Dynamic measurement | Cost |
      | Coordinate measuring arm | Angle: 1″–2″; Position: 5–10 μm + 3.3 μm/m | + | N | ≤5 m | Y | +++ | N | ++ |
      | Laser tracker | Angle: 1″–2″; Position: 10–20 μm + 6 μm/m | +++ | Y | ≤80 m | N | ++ | Y | +++ |
      | Total station | Angle: 0.5″–1″; Position: 0.6 mm ± 1 mm/m | ++ | Y | ≥100 m | N | +++ | N | ++ |
      | IMU | Angle: 0.3°; Position: 2 cm | ++ | N | — | N | ++ | Y | ++ |
      | Monocular + rangefinder | Angle: 1°; Position: 3 mm | ++ | N | ≤30 m | N | ++ | Y | + |
      | Binocular | Angle: 0.2°; Position: 0.1 mm + 1.2 mm/m | ++ | N/Y | ≤10 m | N | + | Y | + |
      | Biprism-based system | Position: ≤2.36% | +++ | N/Y | ≤50 mm | N | ++ | Y | + |
      | Mirror-based system | Angle: ≤1°; Position: ≤12 mm | ++ | N/Y | ≥60 mm | N | ++ | Y | + |
      | Rotating-prism-based system | Angle: ≤0.4°; Position: ≤0.4 mm | ++ | N/Y | ≥80 mm | N | +++ | Y | + |
    • Table 2. Comparison of mainstream pose estimation methods

      | Category | Method | Accuracy, Time consumption, Robustness, Online performance, Scope of application |
      | Pose estimation based on 2D information | Target feature | ++++++++++ |
      | Pose estimation based on 2D information | Template matching | ++++++++++ |
      | Pose estimation based on 3D information | Feature matching | +++++++++ |
      | Pose estimation based on 3D information | Absolute orientation | ++++/++++++ |
      | Pose estimation based on deep learning | Direct regression | ++++++++++ |
      | Pose estimation based on deep learning | Indirect regression | +++++++++++ |
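    As context for the "target feature" row of Table 2, the sketch below shows one common way 2D-information-based pose estimation is implemented in practice: matching known 3D feature points on the target to their detected 2D image projections and solving the Perspective-n-Point (PnP) problem. This is an illustrative example, not code from the paper; the feature coordinates, camera intrinsics, and use of OpenCV's solvePnP are assumptions.

    ```python
    # Minimal PnP sketch (illustrative only; all coordinates and intrinsics are hypothetical).
    import numpy as np
    import cv2

    # Known 3D feature points on the target, expressed in the target's own frame (mm).
    object_points = np.array([
        [0.0,  0.0,  0.0],
        [60.0, 0.0,  0.0],
        [60.0, 40.0, 0.0],
        [0.0,  40.0, 0.0],
        [30.0, 20.0, 15.0],
    ], dtype=np.float64)

    # Corresponding 2D detections in the image (pixels), assumed already matched one-to-one.
    image_points = np.array([
        [320.5, 240.8],
        [410.2, 238.9],
        [412.7, 301.4],
        [318.9, 303.2],
        [366.1, 262.0],
    ], dtype=np.float64)

    # Pinhole camera intrinsics: focal lengths and principal point (hypothetical values).
    camera_matrix = np.array([
        [800.0,   0.0, 320.0],
        [  0.0, 800.0, 240.0],
        [  0.0,   0.0,   1.0],
    ], dtype=np.float64)
    dist_coeffs = np.zeros(5)  # assume lens distortion has already been corrected

    # Solve PnP: rvec/tvec transform target-frame points into the camera frame.
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    if ok:
        R, _ = cv2.Rodrigues(rvec)  # convert rotation vector to a 3x3 rotation matrix
        print("Rotation matrix:\n", R)
        print("Translation (mm):", tvec.ravel())
    ```
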
    Citation: Anhu Li, Zhaojun Deng, Xingsheng Liu, Hao Chen. Research Progresses of Pose Estimation Based on Virtual Cameras[J]. Laser & Optoelectronics Progress, 2022, 59(14): 1415002

    Paper Information

    Category: Machine Vision

    Received: Mar. 30, 2022

    Accepted: May 11, 2022

    Published Online: Jul. 1, 2022

    Corresponding Author Email: Anhu Li (lah@tongji.edu.cn)

    DOI: 10.3788/LOP202259.1415002
