Opto-Electronic Engineering, Volume 49, Issue 5, 210382 (2022)

Self-similarity enhancement network for image super-resolution

Ronggui Wang, Hui Lei, Juan Yang*, Lixia Xue
Author Affiliations
  • School of Computer and Information, Hefei University of Technology, Hefei, Anhui 230601, China

    Deep convolutional neural networks (DCNNs) have recently demonstrated high-quality restoration in single image super-resolution (SISR). However, most existing super-resolution methods only exploit the static characteristics inherent in the training set, ignoring the internal self-similarity of the low-resolution image itself. In this paper, a self-similarity enhancement network (SSEN) is proposed to address this problem. Specifically, we embed deformable convolution into a pyramid structure and combine it with cross-level co-attention to design a module that fully mines multi-level self-similarity, namely the cross-level feature enhancement module. In addition, we introduce a pooling attention mechanism into the stacked residual dense blocks; it uses strip pooling to enlarge the receptive field of the convolutional network and establish long-range dependencies within the deep features, so that highly similar patches in those features can complement one another. Extensive experiments on five benchmark datasets show that SSEN achieves a significant improvement in reconstruction quality over existing methods.
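    The strip pooling mentioned in the abstract (a published technique, Hou et al., CVPR 2020) averages a feature map along one spatial axis at a time, so every output location can see statistics from its entire row and column rather than a small square window. The sketch below is a minimal pure-Python illustration of that idea on a single H×W map, not the authors' implementation; the names `strip_pool` and `strip_pool_fuse` are hypothetical, and a real module would apply 1D convolutions to the strips before fusing.

    ```python
    # Minimal sketch of strip pooling, assuming a 2D feature map given as
    # an H x W list of lists of floats. Not the SSEN authors' code.

    def strip_pool(feat):
        """Average-pool along each spatial axis.

        Returns (row_pool, col_pool): H row means (horizontal strips)
        and W column means (vertical strips).
        """
        h, w = len(feat), len(feat[0])
        row_pool = [sum(row) / w for row in feat]          # one mean per row
        col_pool = [sum(feat[i][j] for i in range(h)) / h  # one mean per column
                    for j in range(w)]
        return row_pool, col_pool

    def strip_pool_fuse(feat):
        """Expand the strips back to H x W and fuse them additively.

        out[i][j] = row_pool[i] + col_pool[j], so each location mixes
        statistics from its whole row and whole column -- a band-shaped,
        effectively global receptive field along both axes.
        """
        row_pool, col_pool = strip_pool(feat)
        return [[row_pool[i] + col_pool[j] for j in range(len(feat[0]))]
                for i in range(len(feat))]
    ```

    For example, on the 2×2 map `[[1, 2], [3, 4]]` the row means are `[1.5, 3.5]`, the column means are `[2.0, 3.0]`, and the fused map is `[[3.5, 4.5], [5.5, 6.5]]` — every entry already depends on values from both its full row and full column.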

    Citation

    Ronggui Wang, Hui Lei, Juan Yang, Lixia Xue. Self-similarity enhancement network for image super-resolution[J]. Opto-Electronic Engineering, 2022, 49(5): 210382

    Paper Information

    Received: Nov. 26, 2021

    Accepted: --

    Published Online: Jun. 10, 2022

    The Author Email: Yang Juan (yangjuan6985@163.com)

    DOI: 10.12086/oee.2022.210382
