Infrared Technology, Volume 45, Issue 9, 897 (2023)

Infrared and Visible Image Fusion Based on N-RGAN Model

Yu SHEN, Li LIANG, Hailong WANG, Yuan YAN, Guanghui LIU, and Jing SONG

    Current infrared and visible image fusion algorithms still suffer from limited applicability to complex scenes, heavy loss of detail and texture information, and low contrast and sharpness in the fused images. To address these problems, this study proposes an N-RGAN model that combines the non-subsampled shearlet transform (NSST) with a residual network (ResNet). The infrared and visible images are first decomposed into high- and low-frequency sub-bands by NSST. The high-frequency sub-bands are concatenated and fed into a generator improved with residual modules, with the source infrared image used as the decision criterion, which strengthens the network's fusion performance, its description of image detail, and its ability to highlight targets. Salient features of the infrared and visible images are extracted, and the low-frequency sub-bands are fused by adaptive weighting to improve contrast and sharpness. The final fused image is obtained by applying the inverse NSST to the fused high- and low-frequency sub-bands. Compared with several existing fusion algorithms, the proposed method improves peak signal-to-noise ratio (PSNR), average gradient (AVG), image entropy (IE), spatial frequency (SF), edge strength (ES), and image clarity (IC), thereby improving infrared and visible image fusion in complex scenes, alleviating the loss of detail and texture information, and enhancing image contrast and resolution.
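    The abstract describes a three-stage pipeline: NSST decomposition, residual-GAN fusion of the high-frequency sub-bands, and saliency-driven adaptive weighting of the low-frequency sub-bands, followed by inverse NSST reconstruction. The sketch below shows how such a pipeline might be wired up in PyTorch; the nsst_decompose/nsst_reconstruct helpers, the local-energy saliency weighting, and the generator layout are illustrative assumptions rather than the authors' implementation, and the paper's discriminator and loss terms are omitted.

```python
# Minimal sketch of the fusion pipeline described in the abstract.
# The NSST helpers, the saliency weighting, and the generator layout
# are assumptions made for illustration, not the authors' released code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResidualBlock(nn.Module):
    """Plain residual block used to build the improved generator."""
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x):
        return F.relu(x + self.body(x))


class HighFreqGenerator(nn.Module):
    """Generator that fuses the concatenated high-frequency sub-bands."""
    def __init__(self, channels: int = 64, num_blocks: int = 4):
        super().__init__()
        self.head = nn.Conv2d(2, channels, 3, padding=1)   # IR + VIS sub-band
        self.blocks = nn.Sequential(*[ResidualBlock(channels) for _ in range(num_blocks)])
        self.tail = nn.Conv2d(channels, 1, 3, padding=1)

    def forward(self, hf_ir, hf_vis):
        x = torch.cat([hf_ir, hf_vis], dim=1)              # splice the sub-bands
        return self.tail(self.blocks(self.head(x)))


def saliency_weight(low):
    """Local-energy saliency map used for adaptive weighting (assumed form)."""
    return F.avg_pool2d(low ** 2, kernel_size=7, stride=1, padding=3)


def fuse_low_freq(low_ir, low_vis):
    """Adaptive weighted fusion of the low-frequency sub-bands."""
    w_ir, w_vis = saliency_weight(low_ir), saliency_weight(low_vis)
    w = w_ir / (w_ir + w_vis + 1e-8)
    return w * low_ir + (1.0 - w) * low_vis


def fuse(ir, vis, generator, nsst_decompose, nsst_reconstruct):
    """End-to-end fusion: NSST analysis, per-band fusion, NSST synthesis.

    nsst_decompose is assumed to return (low_band, [high_bands]);
    nsst_reconstruct is assumed to invert that decomposition.
    """
    low_ir, highs_ir = nsst_decompose(ir)
    low_vis, highs_vis = nsst_decompose(vis)
    fused_low = fuse_low_freq(low_ir, low_vis)
    fused_highs = [generator(h_ir, h_vis) for h_ir, h_vis in zip(highs_ir, highs_vis)]
    return nsst_reconstruct(fused_low, fused_highs)
```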

    SHEN Yu, LIANG Li, WANG Hailong, YAN Yuan, LIU Guanghui, SONG Jing. Infrared and Visible Image Fusion Based on N-RGAN Model[J]. Infrared Technology, 2023, 45(9): 897

    Paper Information

    Received: Jun. 3, 2022

    Published Online: Dec. 15, 2023
