Optical Technique, Volume 51, Issue 2, 233 (2025)

Robot movement tracking measurement system based on photogrammetry

XIA Changqing1, WU Long2,*, PENG Wenjun2, and LU Guanhan1
Author Affiliations
  • 1CRRC Changchun Railway Vehicles Co., Ltd., Changchun 130062, China
  • 2Wuxi CRRC Times Intelligent Equipment Co., Ltd., Wuxi 214174, China

    Extending the measurement range and accurately fusing multi-station measurement data are the keys to realizing high-precision measurement of large components. A mobile robot tracking measurement system based on photogrammetry is designed to fuse multi-station measurement data with high precision and to measure a complete high-speed train body. Specifically, because the train body measures up to 25 m×4 m×4 m and the range of any single measuring device is limited, the system achieves wide-range measurement by having a robot carry a structured-light scanner and move along a guide rail. To fuse the data acquired at multiple poses with high precision, the system uses multiple monocular cameras to track the motion of the measurement camera, enabling rapid measurement at multiple poses and accurate alignment of all measurement data into a common coordinate system. Both the proposed system and a laser tracker T-Scan measuring system were used to measure the dimensions of a train door, and the experiment verifies that the measuring accuracy of the designed system is equivalent to that of the laser tracker T-Scan system.
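
    The multi-station fusion step described in the abstract reduces, at its core, to a rigid-body transformation of each station's scan into the common tracking frame. The following Python/NumPy sketch illustrates that step only; the function names and example poses are hypothetical, and in the actual system each pose would be estimated photogrammetrically by the monocular tracking cameras rather than assumed.

        import numpy as np

        def to_global(points_local, R, t):
            # Map an N x 3 scan from the scanner frame into the global
            # (tracking) frame using the tracked camera pose:
            #   p_global = R @ p_local + t
            return points_local @ R.T + t

        def fuse_stations(stations):
            # Concatenate scans from several robot stations, each given as
            # (points_local, R, t), into one cloud in the common frame.
            return np.vstack([to_global(p, R, t) for p, R, t in stations])

        # Two stations with illustrative (hypothetical) tracked poses.
        rng = np.random.default_rng(0)
        scan_a = rng.random((100, 3))          # scan taken at station A
        scan_b = rng.random((100, 3))          # scan taken at station B

        # Hypothetical pose of the scanner at station B in the global frame:
        # a 90-degree yaw plus a 2 m translation along x.
        yaw = np.pi / 2
        R_b = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                        [np.sin(yaw),  np.cos(yaw), 0.0],
                        [0.0,          0.0,         1.0]])
        t_b = np.array([2.0, 0.0, 0.0])

        fused = fuse_stations([
            (scan_a, np.eye(3), np.zeros(3)),  # station A defines the global frame
            (scan_b, R_b, t_b),
        ])
        print(fused.shape)                     # (200, 3): one common coordinate system

    In practice each per-station pose carries photogrammetric uncertainty, so the overall accuracy of such a system depends directly on how well the monocular cameras track the measurement camera.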

    Paper Information

    Received: Oct. 25, 2024

    Accepted: Apr. 22, 2025

    Published Online: Apr. 22, 2025

    Corresponding author: WU Long (wuloong@hust.edu.cn)
