Journal of Applied Optics, Volume 45, Issue 2, 307 (2024)

High-precision occlusion-resistant quad-vision camera 3D reconstruction system

Peng CHEN1, Huiting LIU1, Lei ZHANG1, Keyi WANG1,*, and Bolin CAI2
Author Affiliations
  • 1Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China, Hefei 230026, China
  • 2School of Internet, Anhui University, Hefei 230039, China

    Multi-camera 3D reconstruction improves accuracy and overcomes occlusion by acquiring the 3D positions of targets from multiple viewpoints. To recover the spatial distribution of targets more accurately, a convergent quad-vision camera 3D reconstruction system is introduced. A reconstruction platform was designed and built with four cameras evenly distributed around the target scene. After the relative poses of adjacent cameras in the system were calibrated, the position and pose of each camera in a unified coordinate system were obtained by chaining the coordinate transformations. The pose of the camera requiring the most transformations was verified independently, and the measured result was consistent with the one derived by transformation. A chessboard target array of size 66×65 was reconstructed with a maximum relative error of 0.061% over a range of 45 mm; compared with the fitted result, the root-mean-square (RMS) error was 0.3193 μm. In reconstruction experiments on a metal block, its shape was recovered from its vertices. The experimental results show that the device is suitable for high-precision, occlusion-resistant 3D reconstruction.
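    The unification step described above, expressing every camera's pose in a single reference frame by chaining the calibrated relative poses of adjacent cameras, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the relative poses here are hypothetical placeholder values, whereas in the paper they come from pairwise calibration.

```python
import numpy as np

def make_pose(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(deg):
    """Rotation matrix about the z-axis by `deg` degrees."""
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical relative poses between adjacent cameras (frame i+1 -> frame i).
# Four cameras evenly spaced around the scene suggest ~90 deg between neighbors.
T_12 = make_pose(rot_z(90), [1.0, 0.0, 0.0])
T_23 = make_pose(rot_z(90), [1.0, 0.0, 0.0])
T_34 = make_pose(rot_z(90), [1.0, 0.0, 0.0])

# Chain the relative transforms so every camera is expressed in camera 1's frame.
T_1 = np.eye(4)          # camera 1 defines the unified coordinate system
T_2 = T_12               # one transformation
T_3 = T_12 @ T_23        # two transformations
T_4 = T_12 @ T_23 @ T_34 # camera with the most transformations

print(np.round(T_4, 6))
```

    A verification in the spirit of the paper would calibrate camera 4 against camera 1 directly and check that the directly measured pose agrees with the chained product `T_12 @ T_23 @ T_34` to within the calibration uncertainty.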


    Peng CHEN, Huiting LIU, Lei ZHANG, Keyi WANG, Bolin CAI. High-precision occlusion-resistant quad-vision camera 3D reconstruction system[J]. Journal of Applied Optics, 2024, 45(2): 307

    Paper Information

    Category: Research Articles

    Received: Mar. 30, 2023

    Accepted: --

    Published Online: May 28, 2024

    Corresponding Author: Keyi WANG (王克逸, b. 1962)

    DOI: 10.5768/JAO202445.0201005
