Optics and Precision Engineering, Volume 31, Issue 15, 2273 (2023)
Image super-resolution reconstruction based on attention and wide-activated dense residual network
To address the problem of blurred texture details in reconstructed images caused by insufficient utilization of global and local high- and low-frequency spatial information, this paper proposes an image super-resolution reconstruction model based on attention and a wide-activated dense residual network. First, four parallel convolution kernels with different scales are used to fully extract the low-frequency features of the image as prior information for spatial feature transformation. Second, a wide-activated residual block fused with attention is constructed in the deep feature mapping module, and the low-frequency prior information is used to guide the extraction of high-frequency features. In addition, the wide-activated residual block extracts deeper feature maps by expanding the number of feature channels before the activation function. As a result, the constructed global and local residual connections not only strengthen the forward propagation of the residual blocks and network features, but also enrich the diversity of the extracted features without increasing the number of parameters. Finally, the feature map is upsampled and reconstructed to obtain a clear high-resolution image. Experimental results show that, compared with the LatticeNet model, the proposed algorithm improves the peak signal-to-noise ratio by 0.14 dB and the structural similarity by 0.001 at 4× super-resolution on the BSD100 dataset. The local texture details of the reconstructed images are also clearer in subjective visual comparisons.
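The abstract's main building blocks (multi-scale low-frequency prior extraction, a wide-activated residual block, channel attention, and spatial feature transformation guided by the prior) can be illustrated with a minimal PyTorch sketch. This is an assumption-laden illustration of the general techniques named in the abstract, not the authors' implementation: the module names, channel widths, expansion factor, and choice of squeeze-and-excitation attention are all hypothetical.

```python
# Minimal sketch (assumptions): channel widths, expansion factor, and the
# squeeze-and-excitation attention are illustrative, not the paper's exact design.
import torch
import torch.nn as nn

class MultiScalePrior(nn.Module):
    """Four parallel convolutions with different kernel sizes extract
    low-frequency features used as the prior for spatial feature transformation."""
    def __init__(self, in_ch=3, out_ch=64):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(in_ch, out_ch // 4, k, padding=k // 2)
            for k in (1, 3, 5, 7)
        ])

    def forward(self, x):
        # Concatenate the four scale-specific feature maps along channels.
        return torch.cat([b(x) for b in self.branches], dim=1)

class WideActivatedAttentionBlock(nn.Module):
    """Wide activation: expand channels before the activation, then reduce.
    Channel attention re-weights features, and an SFT step modulates them
    with the low-frequency prior. The block output uses a local residual."""
    def __init__(self, ch=64, expansion=4):
        super().__init__()
        self.expand = nn.Conv2d(ch, ch * expansion, 3, padding=1)
        self.act = nn.ReLU(inplace=True)
        self.reduce = nn.Conv2d(ch * expansion, ch, 3, padding=1)
        # Squeeze-and-excitation style channel attention (assumed variant).
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(ch, ch // 8, 1), nn.ReLU(inplace=True),
            nn.Conv2d(ch // 8, ch, 1), nn.Sigmoid(),
        )
        # SFT: predict a per-pixel scale and shift from the prior features.
        self.sft_scale = nn.Conv2d(ch, ch, 3, padding=1)
        self.sft_shift = nn.Conv2d(ch, ch, 3, padding=1)

    def forward(self, x, prior):
        y = self.reduce(self.act(self.expand(x)))   # wide activation
        y = y * self.attn(y)                        # channel attention
        y = y * self.sft_scale(prior) + self.sft_shift(prior)  # SFT modulation
        return x + y                                # local residual connection

# Example usage on a low-resolution input (shapes are illustrative):
lr = torch.randn(1, 3, 48, 48)
prior = MultiScalePrior()(lr)                       # low-frequency prior, 64 ch
feat = nn.Conv2d(3, 64, 3, padding=1)(lr)           # shallow feature extraction
out = WideActivatedAttentionBlock()(feat, prior)    # one deep mapping block
print(out.shape)  # torch.Size([1, 64, 48, 48])
```

In a full model, several such blocks would be densely connected with a global residual and followed by an upsampling (e.g., pixel-shuffle) reconstruction stage, as described in the abstract.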
Qiqi KOU, Chao LI, Deqiang CHENG, Liangliang CHEN, Haohui MA, Jianying ZHANG. Image super-resolution reconstruction based on attention and wide-activated dense residual network[J]. Optics and Precision Engineering, 2023, 31(15): 2273
Category: Information Sciences
Received: Nov. 1, 2022
Accepted: --
Published Online: Sep. 5, 2023
The Author Email: KOU Qiqi (kouqiqi@cumt.edu.cn)