Laser Journal, Vol. 46, Issue 3, 133 (2025)

The fusion method of low-light visible light and infrared images based on parallel networks

ZHOU Ye, DU Xiaoyu, TAN Yajun, and ZHANG Jing*
Author Affiliations
  • North University of China, Taiyuan 030051, China
    References (19)

    [3] Wang Z B, Ma Y K, Zhang Y N. Review of pixel-level remote sensing image fusion based on deep learning[J]. Information Fusion, 2023, 90: 36-58.

    [5] Zuo F Y, Huang Y D, Li Q F, et al. Infrared and visible image fusion using multi-scale pyramid network[J]. International Journal of Wavelets, Multiresolution and Information Processing, 2022, 20: 02196913.

    [7] Singh S, Singh H, Gehlot A, et al. IR and visible image fusion using DWT and bilateral filter[J]. Microsystem Technologies, 2023, 29: 457-467.

    [8] Zhang Y, Liu Y, Sun P, et al. IFCNN: A General Image Fusion Framework Based on Convolutional Neural Network[J]. Information Fusion, 2020, 54: 99-118.

    [9] Li H, Wu X-J. DenseFuse: A Fusion Approach to Infrared and Visible Images[J]. IEEE Transactions on Image Processing, 2019, 28: 2614-2623.

    [10] Li H, Wu X-J, Durrani T. NestFuse: An Infrared and Visible Image Fusion Architecture Based on Nest Connection and Spatial/Channel Attention Models[J]. IEEE Transactions on Instrumentation and Measurement, 2020, 69: 9645-9656.

    [11] Li H, Wu X-J, Kittler J. RFN-Nest: An End-to-End Residual Fusion Network for Infrared and Visible Images[J]. Information Fusion, 2021, 73: 72-86.

    [12] Wang Z S, Wu Y Y, Wang J Y, et al. Res2Fusion: Infrared and Visible Image Fusion Based on Dense Res2net and Double Nonlocal Attention Models[J]. Information Fusion, 2022, 71: 1-12.

    [13] Ma J Y, Wei Y, Peng W L, et al. FusionGAN: A Generative Adversarial Network for Infrared and Visible Image Fusion[J]. Information Fusion, 2019, 48: 11-26.

    [14] Ma J Y, Xu H, Jiang J J, et al. DDcGAN: A Dual-Discriminator Conditional Generative Adversarial Network for Multi-Resolution Image Fusion[J]. IEEE Transactions on Image Processing, 2020, 29: 4980-4995.

    [15] Rao D Y, Xu T Y, Wu X-J. TGFuse: An Infrared and Visible Image Fusion Approach Based on Transformer and Generative Adversarial Network[J]. Information Fusion, 2023, 75: 1-1.

    [16] Tang L F, Yuan J T, Ma J Y. Image fusion in the loop of high-level vision tasks: A semantic-aware real-time infrared and visible image fusion network[J]. Information Fusion, 2021, 82: 28-42.

    [20] Tang L F, Yuan J T, Zhang H, et al. PIAFusion: A progressive infrared and visible image fusion network based on illumination aware[J]. Information Fusion, 2022, 83: 79-92.

    [21] Tang L F, Xiang X Y, Zhang H, et al. DIVFusion: Darkness-free infrared and visible image fusion[J]. Information Fusion, 2023, 91: 477-493.

    [22] Yin R Y, Yang B, Huang Z Y, et al. DSA-Net: Infrared and Visible Image Fusion via Dual-Stream Asymmetric Network[J]. Sensors, 2023, 23: 14248220.

    [23] Wei C, Wang W J, Yang W H, et al. Deep Retinex decomposition for low-light enhancement[C]//British Machine Vision Conference (BMVC 2018), Newcastle, United Kingdom, 2018.

    [24] Zhang Y H, Zhang J W, Guo X J. Kindling the darkness: A practical low-light image enhancer[C]//MM 2019 - Proceedings of the 27th ACM International Conference on Multimedia, 2019: 1632-1640.

    [25] Jia X Y, Zhu C, Li M Z, et al. LLVIP: A Visible-infrared Paired Dataset for Low-light Vision[C]//Proceedings of the IEEE International Conference on Computer Vision, 2021: 3489-3497.

    [26] Toet A. The TNO Multiband Image Data Collection[J]. Data in Brief, 2017, 15: 249-251.

    Paper Information

    Received: Nov. 9, 2024

    Accepted: Jun. 12, 2025

    Published Online: Jun. 12, 2025

    The Author Email: ZHANG Jing (252448121@qq.com)

    DOI: 10.14016/j.cnki.jgzz.2025.03.133
