APPLIED LASER, Vol. 44, Issue 5, 169 (2024)

Outdoor Laser Robot Vision Defogging Method Under Deep Network

Zhan Fei1, Fu Hui2, and Yan Shengli1
Author Affiliations
  • 1Guang'an Vocational and Technical College, Guang'an 638000, Sichuan, China
  • 2College of Electrical and Information Engineering, Lanzhou University of Technology, Lanzhou 730050, Gansu, China

    This paper proposes a deep network-based defogging method to enhance the visual perception capability of outdoor laser robots, addressing the color bias, distortion, and noise introduced into images by the low visibility caused by suspended atmospheric particles. First, flat and connected regions in the gradient and gray domains are identified and labeled as the sky-light region, and the atmospheric light value is estimated within it using the quadtree method. Then, nonlinear mapping and transmission map reconstruction are performed from the input RGB feature map together with texture features extracted by an auto-encoder network. Finally, the estimated atmospheric light value and transmission rate are combined to restore a clear image through the atmospheric scattering model. Objective and subjective comparative experiments were conducted against the Retinex, dual-threshold segmentation, auto-encoder, and AFF-Net methods. The results show that the proposed method achieves superior full-reference and no-reference objective metrics, effectively preserves the original color tone of the real scene, and produces clear, detailed images.
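
    To make the pipeline concrete, the following Python sketch illustrates only the two model-based steps named in the abstract: the quadtree estimate of the atmospheric light value and the inversion of the atmospheric scattering model I = J*t + A*(1 - t). It is not the authors' code; the function names, the minimum block size, and the lower bound t0 on the transmission are assumptions made for illustration, and the transmission map is a constant placeholder standing in for the output of the auto-encoder network.

        import numpy as np

        def estimate_atmospheric_light(gray, min_size=32):
            # Quadtree search: repeatedly keep the brightest quadrant of the
            # grayscale image until it is small, then take its mean as the
            # atmospheric light value A.
            region = gray
            while min(region.shape) > min_size:
                h, w = region.shape
                quads = [region[:h // 2, :w // 2], region[:h // 2, w // 2:],
                         region[h // 2:, :w // 2], region[h // 2:, w // 2:]]
                region = max(quads, key=lambda q: q.mean())
            return float(region.mean())

        def restore(hazy, transmission, atmospheric_light, t0=0.1):
            # Invert the atmospheric scattering model I = J*t + A*(1 - t):
            # J = (I - A) / max(t, t0) + A, clipped to the valid range.
            t = np.clip(transmission, t0, 1.0)[..., None]  # broadcast over RGB channels
            restored = (hazy - atmospheric_light) / t + atmospheric_light
            return np.clip(restored, 0.0, 1.0)

        # Placeholder inputs: in the paper, the transmission map would come from
        # the auto-encoder / nonlinear-mapping network rather than a constant.
        hazy = np.random.rand(256, 256, 3)       # hazy RGB image in [0, 1]
        gray = hazy.mean(axis=2)                 # grayscale image for the quadtree search
        transmission = np.full((256, 256), 0.6)  # stand-in for the network's estimate
        A = estimate_atmospheric_light(gray)
        clear = restore(hazy, transmission, A)

    The lower bound t0 on the transmission is a standard safeguard in scattering-model restoration: it prevents division by near-zero transmission in dense-fog regions, which would otherwise amplify noise in the recovered image.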

    Paper Information

    Received: Feb. 21, 2023

    Accepted: Dec. 13, 2024

    Published Online: Dec. 13, 2024

    DOI: 10.14128/j.cnki.al.20244405.169
