Laser & Optoelectronics Progress, Volume 57, Issue 16, 161012 (2020)

Mural Image Super Resolution Reconstruction Based on Multi-Scale Residual Attention Network

Zhigang Xu, Juanjuan Yan, and Honglei Zhu*
Author Affiliations
  • School of Computer and Communication, Lanzhou University of Technology, Lanzhou, Gansu 730050, China

    Mural images are characterized by rich structural detail, complex textures, and variable colors, while mural images reconstructed by super-resolution algorithms based on convolutional neural networks suffer from texture blur and edge staircase effects. Therefore, we propose a super-resolution reconstruction algorithm based on a multi-scale residual attention network. First, a multi-scale mapping unit extracts features directly from the low-resolution mural image with convolution kernels of different scales. Then, the fused feature maps are fed into a residual channel attention block, so that the weight of each feature map is optimized from global information and the deep mapping ability of the network model is enhanced. Finally, a sub-pixel convolution layer is introduced at the end of the network to rearrange pixels and obtain the reconstructed high-resolution mural image. Experimental results show that the proposed algorithm reduces reconstruction error, enhances the edge and structure information of the reconstructed mural image, and enriches its texture details.
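    The abstract describes three stages: multi-scale feature extraction, residual channel attention, and sub-pixel upsampling. The PyTorch sketch below illustrates one plausible arrangement of these stages; the kernel sizes (3/5/7), channel count (64), number of attention blocks, reduction ratio, and all class names are illustrative assumptions rather than values taken from the paper.

    # A minimal sketch of the pipeline named in the abstract; hyperparameters are assumed.
    import torch
    import torch.nn as nn

    class MultiScaleMapping(nn.Module):
        """Extracts features with convolution kernels of different scales and fuses them."""
        def __init__(self, in_ch=3, feat_ch=64):
            super().__init__()
            # Parallel branches with 3x3, 5x5, and 7x7 kernels (assumed scales).
            self.branch3 = nn.Conv2d(in_ch, feat_ch, 3, padding=1)
            self.branch5 = nn.Conv2d(in_ch, feat_ch, 5, padding=2)
            self.branch7 = nn.Conv2d(in_ch, feat_ch, 7, padding=3)
            self.fuse = nn.Conv2d(3 * feat_ch, feat_ch, 1)  # 1x1 fusion of concatenated maps
            self.act = nn.ReLU(inplace=True)

        def forward(self, x):
            feats = torch.cat([self.branch3(x), self.branch5(x), self.branch7(x)], dim=1)
            return self.act(self.fuse(feats))

    class ResidualChannelAttention(nn.Module):
        """Residual block whose output channels are reweighted from global information."""
        def __init__(self, ch=64, reduction=16):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(ch, ch, 3, padding=1),
            )
            # Channel attention: global average pooling -> bottleneck -> per-channel weights.
            self.attention = nn.Sequential(
                nn.AdaptiveAvgPool2d(1),
                nn.Conv2d(ch, ch // reduction, 1), nn.ReLU(inplace=True),
                nn.Conv2d(ch // reduction, ch, 1), nn.Sigmoid(),
            )

        def forward(self, x):
            res = self.body(x)
            return x + res * self.attention(res)  # residual connection with attention weighting

    class MuralSRNet(nn.Module):
        """Multi-scale residual attention network with a sub-pixel upsampling tail."""
        def __init__(self, scale=4, in_ch=3, feat_ch=64, num_blocks=8):
            super().__init__()
            self.head = MultiScaleMapping(in_ch, feat_ch)
            self.blocks = nn.Sequential(*[ResidualChannelAttention(feat_ch) for _ in range(num_blocks)])
            # Sub-pixel convolution: expand channels, then rearrange pixels with PixelShuffle.
            self.tail = nn.Sequential(
                nn.Conv2d(feat_ch, in_ch * scale ** 2, 3, padding=1),
                nn.PixelShuffle(scale),
            )

        def forward(self, lr):
            return self.tail(self.blocks(self.head(lr)))

    if __name__ == "__main__":
        net = MuralSRNet(scale=4)
        lr = torch.randn(1, 3, 48, 48)  # dummy low-resolution patch
        print(net(lr).shape)            # torch.Size([1, 3, 192, 192])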

    Zhigang Xu, Juanjuan Yan, Honglei Zhu. Mural Image Super Resolution Reconstruction Based on Multi-Scale Residual Attention Network[J]. Laser & Optoelectronics Progress, 2020, 57(16): 161012

    Paper Information

    Category: Image Processing

    Received: Dec. 10, 2019

    Accepted: Jan. 14, 2020

    Published Online: Aug. 5, 2020

    Corresponding Author Email: Honglei Zhu (xzg_cn@163.com)

    DOI: 10.3788/LOP57.161012
