Laser & Infrared, Volume 54, Issue 3, 457 (2024)

Infrared and visible image fusion based on transformer and spatial attention model

GENG Jun, WU Zi-hao*, LI Wen-hai, and LI Xiao-yu
Author Affiliations
  • College of Software, Xinjiang University, Urumqi 830091, China

References (20)

    [1] Ma J, Ma Y, Li C. Infrared and visible image fusion methods and applications: A survey[J]. Information Fusion, 2019, 45: 153-178.

    [2] Li H, Wu X J. DenseFuse: a fusion approach to infrared and visible images[J]. IEEE Transactions on Image Processing, 2018, 28(5): 2614-2623.

    [3] Li H, Wu X J, Durrani T. NestFuse: an infrared and visible image fusion architecture based on nest connection and spatial/channel attention models[J]. IEEE Transactions on Instrumentation and Measurement, 2020, 69(12): 9645-9656.

    [4] Ma J, Yu W, Liang P, et al. FusionGAN: a generative adversarial network for infrared and visible image fusion[J]. Information Fusion, 2019, 48: 11-26.

    [5] Vaswani A, Shazeer N, Parmar N, et al. Attention is all you need[J]. Advances in Neural Information Processing Systems, 2017, 30.

    [6] Dosovitskiy A, Beyer L, Kolesnikov A, et al. An image is worth 16×16 words: Transformers for image recognition at scale[J]. arXiv preprint arXiv:2010.11929, 2020.

    [7] Xydeas C S, Petrovic V. Objective image fusion performance measure[J]. Electronics Letters, 2000, 36(4): 308-309.

    [8] Eskicioglu A M, Fisher P S. Image quality measures and their performance[J]. IEEE Transactions on Communications, 1995, 43(12): 2959-2965.

    [9] Roberts J W, Van Aardt J A, Ahmed F B. Assessment of image fusion procedures using entropy, image quality, and multispectral classification[J]. Journal of Applied Remote Sensing, 2008, 2(1): 023522.

    [10] Haghighat M, Razian M A. Fast-FMI: non-reference image fusion metric[C]//2014 IEEE 8th International Conference on Application of Information and Communication Technologies (AICT). IEEE, 2014: 1-3.

    [11] Wang Z, Bovik A C, Sheikh H R, et al. Image quality assessment: from error visibility to structural similarity[J]. IEEE Transactions on Image Processing, 2004, 13(4): 600-612.

    [12] Qu G, Zhang D, Yan P. Information measure for performance of image fusion[J]. Electronics Letters, 2002, 38(7): 1.

    [13] Sheikh H R, Bovik A C. Image information and visual quality[J]. IEEE Transactions on Image Processing, 2006, 15(2): 430-444.

    [14] Wang Q, Shen Y, Jin J. Performance evaluation of image fusion techniques[J]. Image Fusion: Algorithms and Applications, 2008, 19: 469-492.

    [15] Shreyamsha Kumar B K. Multifocus and multispectral image fusion based on pixel significance using discrete cosine harmonic wavelet transform[J]. Signal, Image and Video Processing, 2013, 7: 1125-1143.

    [16] Ma J, Chen C, Li C, et al. Infrared and visible image fusion via gradient transfer and total variation minimization[J]. Information Fusion, 2016, 31: 100-109.

    [17] Zhang Y, Liu Y, Sun P, et al. IFCNN: a general image fusion framework based on convolutional neural network[J]. Information Fusion, 2020, 54: 99-118.

    [18] Zhang H, Xu H, Xiao Y, et al. Rethinking the image fusion: a fast unified image fusion network based on proportional maintenance of gradient and intensity[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2020, 34(7): 12797-12804.

    [19] Xu H, Ma J, Jiang J, et al. U2Fusion: a unified unsupervised image fusion network[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020, 44(1): 502-518.

    [20] Wang Z, Chen Y, Shao W, et al. SwinFuse: a residual swin transformer fusion network for infrared and visible images[J]. IEEE Transactions on Instrumentation and Measurement, 2022, 71: 1-12.

    Get Citation

    GENG Jun, WU Zi-hao, LI Wen-hai, LI Xiao-yu. Infrared and visible image fusion based on transformer and spatial attention model[J]. Laser & Infrared, 2024, 54(3): 457

    Paper Information

    Received: Mar. 20, 2023

    Accepted: Jun. 4, 2025

    Published Online: Jun. 4, 2025

    The Author Email: WU Zi-hao (761545864@qq.com)

    DOI:10.3969/j.issn.1001-5078.2024.03.018
