Journal of Terahertz Science and Electronic Information Technology, Vol. 19, Issue 1, 125 (2021)

Visible and infrared image fusion algorithm based on saliency guidance

TANG Zhongjian1,* and MAO Chun2

    Current image fusion methods mainly rely on image energy features to fuse the decomposed layers and ignore the salient information in the image, which yields fused images with low contrast. This paper proposes a method that fuses visible and infrared images under the guidance of image saliency. First, a smoothing transform based on the L0 and L1 norms is designed to decompose the visible and infrared images into base-layer and detail-layer images with well-preserved edges. Then, the salient information in the infrared image, extracted with the frequency-tuned method, is used to build the fusion model of the base layer and obtain the fused base-layer image. A fusion model for the detail layer is constructed from the information entropy of the images, and the fused detail-layer image is obtained from the information relevance between the different detail-layer images. The final fused image is the sum of the fused base-layer and detail-layer images. Experimental results show that the proposed algorithm fuses visible and infrared images better than current algorithms; its results both highlight target information and offer better contrast.
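    The pipeline the abstract outlines can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: a plain Gaussian blur stands in for the paper's L0/L1 smoothing transform, and an entropy-weighted average stands in for the paper's information-relevance model of the detail layer; only the frequency-tuned saliency step (global mean minus blurred image) follows the method named in the abstract.

    ```python
    import numpy as np

    def gaussian_kernel(size=9, sigma=2.0):
        x = np.arange(size) - size // 2
        k = np.exp(-x**2 / (2 * sigma**2))
        return k / k.sum()

    def blur(img, size=9, sigma=2.0):
        # Separable Gaussian blur with edge padding (stand-in for L0/L1 smoothing).
        k, pad = gaussian_kernel(size, sigma), size // 2
        p = np.pad(img, pad, mode="edge")
        tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, p)
        return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, tmp)

    def frequency_tuned_saliency(img):
        # Frequency-tuned saliency: |global mean - blurred image|, scaled to [0, 1].
        s = np.abs(img.mean() - blur(img))
        return (s - s.min()) / (s.max() - s.min() + 1e-12)

    def entropy(img, bins=32):
        # Shannon entropy of the intensity histogram.
        h, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
        p = h / h.sum()
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    def fuse(vis, ir):
        # 1) Decompose each image into a base layer and a detail layer.
        base_v, base_i = blur(vis), blur(ir)
        det_v, det_i = vis - base_v, ir - base_i
        # 2) Base layer: weight by the saliency of the infrared image.
        w = frequency_tuned_saliency(ir)
        fused_base = w * base_i + (1 - w) * base_v
        # 3) Detail layer: entropy-weighted average (hypothetical stand-in
        #    for the paper's information-relevance fusion model).
        ev, ei = entropy(det_v + 0.5), entropy(det_i + 0.5)
        a = ev / (ev + ei + 1e-12)
        fused_detail = a * det_v + (1 - a) * det_i
        # 4) Sum the fused base and detail layers.
        return fused_base + fused_detail
    ```

    Inputs are assumed to be single-channel float arrays in [0, 1]; the fused result has the same shape as the inputs.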

    TANG Zhongjian, MAO Chun. Visible and infrared image fusion algorithm based on saliency guidance[J]. Journal of Terahertz Science and Electronic Information Technology, 2021, 19(1): 125

    Paper Information

    Received: Mar. 13, 2020

    Accepted: --

    Published Online: Apr. 21, 2021

    The Author Email: Zhongjian TANG (TangJz1972cq@aliyun.com)

    DOI:10.11805/tkyda2020103
