Laser Journal, Volume 45, Issue 12, 106 (2024)
Infrared and visible image fusion based on shuffle attention mechanism and residual dense network
To address the problem that detailed features are easily lost during the fusion of infrared and visible images, this paper proposes an infrared and visible image fusion algorithm based on a shuffle attention mechanism and a residual dense network. First, the encoding network downsamples the source images at multiple scales to obtain feature maps with rich semantic information. Then, a shuffle attention residual fusion network fuses the feature maps extracted by the encoding network: the shuffle attention mechanism aggregates the feature maps by shuffling channel attention and spatial attention, and residual dense connections maximize the retention of effective image information in the aggregated feature maps. Finally, the decoding network reconstructs the fused image through upsampling. In subjective evaluations, the fused images produced by the proposed algorithm are noticeably clearer than those of other fusion algorithms, particularly in complex conditions such as blur, occlusion, and smoke. In the objective comparison, the fused images of the proposed algorithm show improvements to varying degrees and achieve the optimal values on the entropy, mutual information, and peak signal-to-noise ratio criteria, which are 6.930, 13.860, 17.144, and 0.574, respectively.
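The shuffle attention block referred to in the abstract is a published attention design in which each channel group is split into a channel-attention branch and a spatial-attention branch, and the results are recombined with a channel shuffle. The following is a minimal sketch of such a block, assuming a PyTorch implementation; the module name, group count, and parameter shapes are illustrative assumptions and are not taken from the paper's actual network.

```python
import torch
import torch.nn as nn


class ShuffleAttention(nn.Module):
    """Sketch of a shuffle-attention block: grouped channel + spatial attention,
    followed by a channel shuffle to mix information across groups."""

    def __init__(self, channels: int, groups: int = 8):
        super().__init__()
        assert channels % (2 * groups) == 0, "channels must divide evenly into branches"
        self.groups = groups
        c = channels // (2 * groups)  # channels per branch within each group
        # learnable scale/shift for the channel-attention branch
        self.cw = nn.Parameter(torch.zeros(1, c, 1, 1))
        self.cb = nn.Parameter(torch.ones(1, c, 1, 1))
        # learnable scale/shift for the spatial-attention branch
        self.sw = nn.Parameter(torch.zeros(1, c, 1, 1))
        self.sb = nn.Parameter(torch.ones(1, c, 1, 1))
        self.gn = nn.GroupNorm(c, c)          # per-channel normalization
        self.pool = nn.AdaptiveAvgPool2d(1)   # global average pooling

    @staticmethod
    def channel_shuffle(x: torch.Tensor, groups: int) -> torch.Tensor:
        b, c, h, w = x.shape
        x = x.view(b, groups, c // groups, h, w)
        x = x.transpose(1, 2).contiguous()
        return x.view(b, c, h, w)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # split the feature map into groups, then each group into two branches
        x = x.view(b * self.groups, c // self.groups, h, w)
        x_ch, x_sp = x.chunk(2, dim=1)
        # channel attention: global pooling, scale/shift, sigmoid gating
        ch = x_ch * torch.sigmoid(self.pool(x_ch) * self.cw + self.cb)
        # spatial attention: group norm, scale/shift, sigmoid gating
        sp = x_sp * torch.sigmoid(self.gn(x_sp) * self.sw + self.sb)
        out = torch.cat([ch, sp], dim=1).view(b, c, h, w)
        # shuffle channels so the two branches exchange information across groups
        return self.channel_shuffle(out, 2)
```

A fusion network along the lines described in the abstract could apply such a block to the feature maps produced by the encoder before the residual dense connections; that wiring is likewise an assumption here, not the authors' published implementation.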
LIU Peipei, ZHANG Yuxiao, YUAN Shuozhi, WANG Shuo, XU Huyang. Infrared and visible image fusion based on shuffle attention mechanism and residual dense network[J]. Laser Journal, 2024, 45(12): 106
Received: Mar. 21, 2024
Accepted: Mar. 10, 2025
Published Online: Mar. 10, 2025
Author Email: ZHANG Yuxiao (1269109080@qq.com)