Laser & Infrared, Vol. 54, Issue 7, 1141 (2024)
Infrared and visible image fusion based on AGF and CNN
[1] Ma Jiayi, Ma Yong, Li Chang. Infrared and visible image fusion methods and applications: a survey[J]. Information Fusion, 2019, 45: 153-178.
[2] Piella Gemma. A general framework for multiresolution image fusion: from pixels to regions[J]. Information Fusion, 2003, 4(4): 259-280.
[3] Han Ju, Bhanu Bir. Fusion of color and infrared video for moving human detection[J]. Pattern Recognition, 2007, 40(6): 1771-1784.
[5] Liu Yu, Chen Xun, Wang Zengfu, et al. Deep learning for pixel-level image fusion: recent advances and future prospects[J]. Information Fusion, 2018, 42: 158-173.
[8] Li Hui, Wu Xiaojun. DenseFuse: a fusion approach to infrared and visible images[J]. IEEE Transactions on Image Processing, 2019, 28(5): 2614-2623.
[9] Ma Jiayi, Yu Wei, Liang Pengwei, et al. FusionGAN: a generative adversarial network for infrared and visible image fusion[J]. Information Fusion, 2019, 48: 11-26.
[10] Ma Jiayi, Liang Pengwei, Yu Wei, et al. Infrared and visible image fusion via detail preserving adversarial learning[J]. Information Fusion, 2020, 54: 85-98.
[11] Xu Han, Ma Jiayi, Jiang Junjun, et al. U2Fusion: a unified unsupervised image fusion network[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, 44(1): 502-518.
[12] Zhao Zixiang, Bai Haowen, Zhang Jiangshe, et al. CDDFuse: correlation-driven dual-branch feature decomposition for multi-modality image fusion[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023: 5906-5916.
[14] He Kaiming, Sun Jian, Tang Xiaoou. Guided image filtering[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013, 35(6): 1397-1409.
[15] Zhang Qi, Shen Xiaoyong, Xu Li, et al. Rolling guidance filter[C]//Computer Vision-ECCV 2014: 13th European Conference, Zurich, Switzerland, September 6-12, 2014.
[16] Kniefacz Philipp, Kropatsch Walter. Smooth and iteratively restore: a simple and fast edge-preserving smoothing model[J/OL]. https://arxiv.org/pdf/1505.06702.
[17] Toet A. Alternating guided image filtering[J]. PeerJ Computer Science, 2016, 2: e72.
[18] Huang Gao, Liu Zhuang, Van Der Maaten Laurens, et al. Densely connected convolutional networks[C]//2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 2017: 2261-2269.
[19] He Kaiming, Zhang Xiangyu, Ren Shaoqing, et al. Deep residual learning for image recognition[C]//2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 2016: 770-778.
[20] Toet A. TNO image fusion dataset[J]. Data in Brief, 2014, 15: 248-251.
[21] Ma Jiayi, Zhou Yi. Infrared and visible image fusion via gradientlet filter[J]. Computer Vision and Image Understanding, 2020, 197-198: 103016.
[22] Zhang Yongxin, Wei Wei, Yuan Yating. Multi-focus image fusion with alternating guided filtering[J]. Signal, Image and Video Processing, 2018, 13: 737-735.
[23] Liu Yu, Chen Xun, Cheng Juan, et al. Infrared and visible image fusion with convolutional neural networks[J]. International Journal of Wavelets, Multiresolution and Information Processing, 2018, 16(3).
[24] Tang Linfeng, Yuan Jiteng, Zhang Hao, et al. PIAFusion: a progressive infrared and visible image fusion network based on illumination aware[J]. Information Fusion, 2022, 83-84: 79-92.
[25] Kumar B K S. Multifocus and multispectral image fusion based on pixel significance using discrete cosine harmonic wavelet transform[J]. Signal, Image and Video Processing, 2013, 7(6).
[26] Haghighat Mohammad, Razian Masoud Amirkabiri. Fast-FMI: non-reference image fusion metric[C]//2014 IEEE 8th International Conference on Application of Information and Communication Technologies (AICT), Astana, Kazakhstan, 2014: 1-3.
YANG Yan-chun, YANG Wan-xuan, LEI Hui-yun. Infrared and visible image fusion based on AGF and CNN[J]. Laser & Infrared, 2024, 54(7): 1141
Received: Sep. 26, 2023
Accepted: Apr. 30, 2025
Published Online: Apr. 30, 2025