Optics and Precision Engineering, Volume 31, Issue 10, 1548 (2023)
Infrared and visible image fusion based on fast alternating guided filtering and CNN
To address the loss of detail information, blurred edges, and artifacts in infrared and visible image fusion, this paper proposes a fast alternating guided filter that significantly improves computational efficiency while preserving the quality of the fused image, and combines it with a convolutional neural network (CNN) and infrared feature extraction for effective fusion. First, quadtree decomposition and Bessel interpolation are used to extract the infrared brightness features of the source images, and an initial fusion image is obtained by combining these features with the visible image. Second, the base-layer and detail-layer information of the source images is obtained through fast alternating guided filtering. The base layers are fused using the CNN and the Laplace transform, and the detail layers are fused using a saliency measurement method. Finally, the initial fusion map, base fusion map, and detail fusion map are summed to obtain the final fusion result. Owing to the fast alternating guided filtering and the feature extraction performance of the algorithm, the final fusion result contains rich texture details and clear edges. Experimental results indicate that the fusion results obtained by the algorithm have good visual fidelity; compared with other methods, its information entropy, standard deviation, spatial frequency, wavelet feature mutual information, visual information fidelity, and average gradient improve by 9.9%, 6.8%, 43.6%, 11.3%, 32.3%, and 47.1% on average, respectively.
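To make the two-scale decomposition and three-map summation concrete, the following is a minimal Python sketch, not the authors' implementation. Assumptions: the standard box-window guided filter (He et al.) stands in for the paper's fast alternating guided filter, a plain average replaces the CNN/Laplace base-layer fusion, a local-energy weight replaces the saliency measure, and the quadtree/Bessel-interpolation initial fusion map is omitted. All function names (guided_filter, alternating_guided_filter, saliency_weight, fuse) are illustrative, not from the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=8, eps=1e-3):
    """Standard single-pass guided filter with box windows (He et al.)."""
    size = 2 * radius + 1
    mean_g = uniform_filter(guide, size)
    mean_s = uniform_filter(src, size)
    cov_gs = uniform_filter(guide * src, size) - mean_g * mean_s
    var_g = uniform_filter(guide * guide, size) - mean_g * mean_g
    a = cov_gs / (var_g + eps)          # local linear coefficients
    b = mean_s - a * mean_g
    return uniform_filter(a, size) * guide + uniform_filter(b, size)

def alternating_guided_filter(img_a, img_b, iters=2):
    """Alternate the guide/input roles between the two source images;
    a simple stand-in for the paper's fast alternating guided filtering.
    Returns the smoothed base layers of both images."""
    base_a, base_b = img_a, img_b
    for _ in range(iters):
        base_a = guided_filter(base_b, base_a)
        base_b = guided_filter(base_a, base_b)
    return base_a, base_b

def saliency_weight(detail):
    """Placeholder saliency measure: local energy of the detail layer."""
    return uniform_filter(detail * detail, 9)

def fuse(ir, vis):
    ir, vis = ir.astype(np.float64), vis.astype(np.float64)
    # Two-scale decomposition: base = filtered image, detail = residual.
    base_ir, base_vis = alternating_guided_filter(ir, vis)
    det_ir, det_vis = ir - base_ir, vis - base_vis
    # Base fusion: plain average stands in for the CNN + Laplace weights.
    fused_base = 0.5 * (base_ir + base_vis)
    # Detail fusion: saliency-weighted combination of the detail layers.
    s_ir, s_vis = saliency_weight(det_ir), saliency_weight(det_vis)
    w = s_ir / (s_ir + s_vis + 1e-12)
    fused_detail = w * det_ir + (1 - w) * det_vis
    # The paper additionally adds an initial fusion map built from
    # quadtree/Bessel infrared feature extraction, omitted here.
    return fused_base + fused_detail

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ir = rng.random((128, 128))   # synthetic stand-ins for real images
    vis = rng.random((128, 128))
    print(fuse(ir, vis).shape)    # (128, 128)
```

In this sketch the saliency weight w is computed per pixel, so detail from whichever source image is locally more energetic dominates the fused detail layer, which mirrors the intent (if not the exact mechanics) of the saliency measurement described in the abstract.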
Yanchun YANG, Yongping LI, Jianwu DANG, Yangping WANG. Infrared and visible image fusion based on fast alternating guided filtering and CNN[J]. Optics and Precision Engineering, 2023, 31(10): 1548
Category: Information Sciences
Received: Aug. 17, 2022
Published Online: Jul. 4, 2023
The Author Email: YANG Yanchun (yangyanchun102@sina.com)