Infrared Technology, Vol. 45, Issue 9, 915 (2023)

Infrared and Visible Image Fusion Based on Guided Filter and Sparse Representation in NSST Domain

Lingxiao WU*, Jiayin KANG, and Yunxiang JI
Author Affiliations: [in Chinese]

    Image fusion aims to overcome the insufficient and incomplete information provided by a single-modality image. This paper proposes a novel method for fusing infrared and visible images based on the guided filter (GF) and sparse representation (SR) in the non-subsampled shearlet transform (NSST) domain. Specifically, ① the infrared and visible images are each decomposed using NSST to obtain the corresponding high-frequency and low-frequency sub-band images; ② a GF-weighted fusion strategy is used to fuse the high-frequency sub-band images; ③ a rolling guidance filter (RGF) further decomposes the low-frequency sub-band images into base and detail layers, where the base layers are fused via SR and the detail layers are fused using a local-maximum strategy with consistency verification; ④ an inverse NSST is applied to the fused high-frequency and low-frequency sub-band images to obtain the final fusion result. Experimental results on public datasets show that, compared with other methods, the fusion results of the proposed method have richer texture details and better subjective visual quality. In addition, the proposed method achieves overall better performance on objective metrics commonly used to evaluate fusion results.
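    Two of the building blocks named above can be sketched compactly. The snippet below is a minimal illustration, not the authors' implementation: it implements the standard guided filter (He et al.) using `scipy.ndimage.uniform_filter` as the box filter, and a simplified version of the local-maximum fusion rule with consistency verification, in which each detail-layer coefficient is taken from the source with larger local activity and the resulting decision map is cleaned up by a neighborhood majority vote. All function names, window sizes, and the choice of absolute-value energy as the activity measure are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def guided_filter(I, p, r=4, eps=1e-3):
    """Guided filter: edge-preserving smoothing of image p guided by I.

    I, p : 2-D float arrays of the same shape; r : window radius;
    eps : regularization controlling edge preservation.
    """
    size = 2 * r + 1
    mean_I = uniform_filter(I, size=size)
    mean_p = uniform_filter(p, size=size)
    corr_Ip = uniform_filter(I * p, size=size)
    corr_II = uniform_filter(I * I, size=size)

    var_I = corr_II - mean_I * mean_I          # local variance of the guide
    cov_Ip = corr_Ip - mean_I * mean_p         # local covariance of I and p

    a = cov_Ip / (var_I + eps)                 # local linear coefficients
    b = mean_p - a * mean_I

    mean_a = uniform_filter(a, size=size)      # average coefficients over windows
    mean_b = uniform_filter(b, size=size)
    return mean_a * I + mean_b


def fuse_local_max(d1, d2, win=3):
    """Fuse two detail layers by local-maximum activity with a
    majority-vote consistency check on the decision map (simplified)."""
    a1 = uniform_filter(np.abs(d1), size=win)  # local activity of source 1
    a2 = uniform_filter(np.abs(d2), size=win)  # local activity of source 2
    decision = (a1 >= a2).astype(float)
    # Consistency verification: a pixel keeps its choice only if the
    # majority of its neighbors made the same choice.
    decision = uniform_filter(decision, size=win) >= 0.5
    return np.where(decision, d1, d2)
```

An RGF-style base/detail split could then be approximated by iterating `guided_filter` with the previous output as the guide, which is how rolling guidance filtering is commonly realized on top of a joint filter.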

    WU Lingxiao, KANG Jiayin, JI Yunxiang. Infrared and Visible Image Fusion Based on Guided Filter and Sparse Representation in NSST Domain[J]. Infrared Technology, 2023, 45(9): 915

    Paper Information

    Received: Aug. 2, 2022

    Accepted: --

    Published Online: Dec. 15, 2023

    The Author Email: Lingxiao WU (wlx970831@163.com)
