Opto-Electronic Engineering, Volume 51, Issue 5, 240050 (2024)
Sparse feature image classification network with spatial position correction
To sparsify semantic features, sharpen attention on key features, strengthen the correlation between spatial and local features, and constrain the spatial positions of features, this paper proposes SSCNet, a sparse feature image classification network with spatial position correction. The network is built on the ResNet-34 residual backbone. First, a sparse semantic enhanced feature (SSEF) module is proposed, which combines depthwise separable convolution (DSC) with squeeze-and-excitation (SE) attention to enhance feature extraction while preserving the integrity of spatial information. Then, a spatial position correction symmetric attention mechanism (SPCS) is proposed; SPCS inserts a symmetric global coordinate attention mechanism at specific positions in the network, strengthening the spatial relationships between features, constraining and correcting their spatial positions, and enhancing the network's perception of global detail. Finally, an average pooling module (APM) is proposed and applied to each residual branch, enabling the network to capture global feature information more effectively, improve feature translation invariance, delay overfitting, and strengthen generalization. On the CIFAR-10, CIFAR-100, SVHN, Imagenette, and Imagewoof datasets, SSCNet improves classification accuracy to varying degrees over other high-performance networks, showing that it extracts local detail information while balancing global information, with high classification accuracy and strong generalization performance.
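The SSEF building blocks named above are standard components. As a minimal NumPy sketch of the two ingredients (depthwise separable convolution and an SE channel gate) under common textbook definitions — the function names, weight shapes, and reduction ratio here are illustrative assumptions, not the paper's exact SSEF wiring:

```python
import numpy as np

def depthwise_separable_conv(x, dw_kernels, pw_weights):
    """Depthwise separable convolution: a per-channel 3x3 depthwise
    convolution followed by a 1x1 pointwise convolution.
    x: (C, H, W); dw_kernels: (C, 3, 3); pw_weights: (C_out, C)."""
    C, H, W = x.shape
    pad = np.pad(x, ((0, 0), (1, 1), (1, 1)))   # zero-pad so output keeps H x W
    dw = np.zeros_like(x)
    for c in range(C):                           # depthwise: one filter per channel
        for i in range(H):
            for j in range(W):
                dw[c, i, j] = np.sum(pad[c, i:i+3, j:j+3] * dw_kernels[c])
    # pointwise 1x1 conv mixes channels: (C_out, C) @ (C, H*W)
    out = pw_weights @ dw.reshape(C, -1)
    return out.reshape(pw_weights.shape[0], H, W)

def se_block(x, w1, w2):
    """Squeeze-and-excitation gate: global average pool -> FC + ReLU ->
    FC + sigmoid -> per-channel rescaling of the input.
    x: (C, H, W); w1: (C//r, C); w2: (C, C//r) for reduction ratio r."""
    squeeze = x.mean(axis=(1, 2))                  # (C,) channel descriptor
    excite = np.maximum(w1 @ squeeze, 0.0)         # ReLU
    scale = 1.0 / (1.0 + np.exp(-(w2 @ excite)))   # sigmoid gate in (0, 1)
    return x * scale[:, None, None]
```

An SSEF-style stage would chain these, e.g. `se_block(depthwise_separable_conv(x, dw, pw), w1, w2)`, letting the SE gate reweight the channels produced by the cheap DSC; the paper's module may add further residual or normalization structure not shown here.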
Wentao Jiang, Chen Chen, Shengchong Zhang. Sparse feature image classification network with spatial position correction[J]. Opto-Electronic Engineering, 2024, 51(5): 240050
Category: Article
Received: Mar. 6, 2024
Accepted: Apr. 24, 2024
Published Online: Jul. 31, 2024
The Author Email: Chen Chen (陈晨)