Opto-Electronic Engineering, Volume 51, Issue 7, 240126 (2024)

Global pooling residual classification network guided by local attention

Wentao Jiang1, Rui Dong1,*, and Shengchong Zhang2
Author Affiliations
  • 1College of Software, Liaoning Technical University, Huludao, Liaoning 125105, China
  • 2Science and Technology on Electro-Optical Information Security Control Laboratory, Tianjin 300308, China

    Most attention mechanisms, while enhancing image features, do not consider the impact of local feature interaction on the overall feature representation. To address this issue, this paper proposes a global pooling residual classification network guided by local attention (MSLENet), with ResNet34 as the baseline network. First, the initial layer structure is modified to retain important image information. Second, a multiple segmentation local enhancement attention (MSLE) module is introduced: it segments the image into multiple sub-images, enhances the local features of each sub-image, and then integrates these important local features into the global features through feature group interaction. Finally, a pooling residual (PR) module is proposed to address the information loss in the ResNet residual structure and to improve information utilization between layers. Experimental results show that, by enhancing the interaction of local features, MSLENet achieves good performance on multiple datasets and effectively improves the expressive ability of the network.
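
    For illustration, the following is a minimal sketch of the local-enhancement idea described in the abstract, written in PyTorch-style Python. The class name MSLESketch, the number of splits, and the use of per-patch channel attention are assumptions made for exposition, not the authors' exact design.

```python
# Illustrative sketch (not the published implementation): split a feature map
# into patches, enhance each patch with a channel gate, and fuse the enhanced
# patches back into the global features via a residual add.
import torch
import torch.nn as nn


class MSLESketch(nn.Module):
    def __init__(self, channels: int, splits: int = 2, reduction: int = 8):
        super().__init__()
        self.splits = splits  # the map is divided into splits x splits patches
        # Assumed form of "local enhancement": a small squeeze-and-excitation
        # style channel gate applied independently to every patch.
        self.local_gate = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape             # h and w must be divisible by splits
        s = self.splits
        ph, pw = h // s, w // s
        out = x.clone()
        for i in range(s):
            for j in range(s):
                patch = x[:, :, i * ph:(i + 1) * ph, j * pw:(j + 1) * pw]
                desc = patch.mean(dim=(2, 3))                  # pooled per-patch descriptor
                gate = self.local_gate(desc).view(b, c, 1, 1)  # per-patch channel weights
                out[:, :, i * ph:(i + 1) * ph, j * pw:(j + 1) * pw] = patch * gate
        # Interaction between local and global features is modelled here
        # simply as a residual fusion of the enhanced map with the input.
        return x + out


if __name__ == "__main__":
    feat = torch.randn(2, 64, 32, 32)
    print(MSLESketch(64)(feat).shape)  # torch.Size([2, 64, 32, 32])
```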

    Citation

    Wentao Jiang, Rui Dong, Shengchong Zhang. Global pooling residual classification network guided by local attention[J]. Opto-Electronic Engineering, 2024, 51(7): 240126

    Paper Information

    Category: Article

    Received: May 28, 2024

    Accepted: Aug. 5, 2024

    Published Online: Nov. 12, 2024

    Author Email: Rui Dong (董睿)

    DOI: 10.12086/oee.2024.240126
