Infrared and Laser Engineering, Volume 50, Issue 11, 20210071 (2021)

Camera calibration method based on double neural network

Wenyi Chen1,2, Jie Xu1,*, and Hui Yang1
Author Affiliations
  • 1Industry School of Modern Post, Xi’an University of Posts and Telecommunications, Xi’an 710061, China
  • 2Collaborative Innovation Center for Modern Post, Xi’an University of Posts and Telecommunications, Xi’an 710121, China

    In computer vision, camera calibration is an essential prerequisite for camera-based measurement. To address the limited training accuracy of existing neural-network-based camera calibration methods, a calibration method based on a double neural network was proposed. Starting from the imaging model, the camera coordinate ${Z_{\text{c}}}$ was derived as a function of the world coordinates and the pixel coordinates. By taking ${Z_{\text{c}}}$ into account, the imaging model was decomposed into two functional relations, each calibrated by its own neural network; this not only divides the workload that a single network would otherwise carry, but also fully conforms to the imaging model. Experimental results show that, compared with other neural-network-based calibration methods, the proposed method improves calibration accuracy, achieving an average calibration error of 0.1786 ${\rm{mm}}$ over a calibration range of $400\;{\rm{mm}} \times 300\;{\rm{mm}}$, which verifies the feasibility and effectiveness of the proposed method.
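    The two-network decomposition described in the abstract can be sketched as follows. This is a hedged illustration, not the paper's implementation: the exact form of the two functional relations is not given in the abstract, so this sketch assumes one plausible reading of the pinhole model $Z_{\text{c}}[u,v,1]^{\rm T} = M[X_{\rm w},Y_{\rm w},Z_{\rm w},1]^{\rm T}$, in which a first network regresses ${Z_{\text{c}}}$ from pixel coordinates $(u,v)$, and a second network maps $(uZ_{\text{c}}, vZ_{\text{c}}, Z_{\text{c}})$ back to planar world coordinates. All names (`make_mlp`, `train_step`, the synthetic target geometry) are hypothetical.

```python
# Hedged sketch of a "double neural network" calibration split (assumption,
# not the paper's code): net1 regresses Zc from pixel coords (u, v);
# net2 maps (u*Zc, v*Zc, Zc) to planar world coords (Xw, Yw).
import numpy as np

rng = np.random.default_rng(0)

def make_mlp(n_in, n_hidden, n_out):
    """One-hidden-layer tanh MLP with small random init."""
    return {"W1": rng.normal(0, 0.3, (n_in, n_hidden)), "b1": np.zeros(n_hidden),
            "W2": rng.normal(0, 0.3, (n_hidden, n_out)), "b2": np.zeros(n_out)}

def forward(p, x):
    h = np.tanh(x @ p["W1"] + p["b1"])
    return h, h @ p["W2"] + p["b2"]

def train_step(p, x, y, lr=0.05):
    """One full-batch gradient-descent step on mean-squared error."""
    h, yhat = forward(p, x)
    err = yhat - y
    n = len(x)
    gW2, gb2 = h.T @ err / n, err.mean(0)
    dh = (err @ p["W2"].T) * (1 - h**2)          # backprop through tanh
    gW1, gb1 = x.T @ dh / n, dh.mean(0)
    for k, g in zip(("W1", "b1", "W2", "b2"), (gW1, gb1, gW2, gb2)):
        p[k] -= lr * g
    return float((err**2).mean())

# Synthetic pinhole data on a tilted plane (stand-in for a calibration target).
N = 512
Xw = rng.uniform(-1, 1, (N, 2))                  # world coords on target, Zw = 0
R = np.array([[0.98, -0.10, 0.15],
              [0.10,  0.98, -0.10],
              [-0.15, 0.12,  0.98]])             # arbitrary rotation-like matrix
t = np.array([0.10, -0.05, 3.0])                 # translation: camera ~3 units away
Pc = np.c_[Xw, np.zeros(N)] @ R.T + t            # camera-frame points
Zc = Pc[:, 2:3]
uv = Pc[:, :2] / Zc                              # normalized pixel coordinates

# Network 1: (u, v) -> Zc.   Network 2: standardized (u*Zc, v*Zc, Zc) -> (Xw, Yw).
net1 = make_mlp(2, 16, 1)
net2 = make_mlp(3, 16, 2)
x2 = np.c_[uv * Zc, Zc]
mu, sd = x2.mean(0), x2.std(0)                   # standardize net2 inputs
loss1 = [train_step(net1, uv, Zc) for _ in range(2000)]
loss2 = [train_step(net2, (x2 - mu) / sd, Xw) for _ in range(2000)]

# Inference: predict Zc from pixels, then recover world coordinates.
_, Zc_hat = forward(net1, uv)
_, Xw_hat = forward(net2, (np.c_[uv * Zc_hat, Zc_hat] - mu) / sd)
```

    Splitting the regression this way mirrors the abstract's rationale: each network learns one simpler functional relation instead of a single network absorbing the entire imaging model.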


    Wenyi Chen, Jie Xu, Hui Yang. Camera calibration method based on double neural network[J]. Infrared and Laser Engineering, 2021, 50(11): 20210071

    Paper Information

    Category: Photoelectric measurement

    Received: Jan. 27, 2021

    Accepted: --

    Published Online: Dec. 7, 2021

    The Author Email: Xu Jie (1141849828@qq.com)

    DOI:10.3788/IRLA20210071
