Optics and Precision Engineering, Vol. 31, Issue 2, 234 (2023)
Parallel path and strong attention mechanism for building segmentation in remote sensing images
Building segmentation in remote sensing images is widely used in urban planning and military applications and is a current research focus in the remote sensing field. Large scale variations between buildings, occlusion of buildings, and shadows that resemble building edges all reduce segmentation accuracy. To address these problems, a convolutional neural network with parallel paths and a strong attention mechanism was developed. The model builds on the residual-connection idea of ResNet, using ResNet as the backbone to increase network depth, and applies convolutional downsampling to form parallel paths that extract multi-scale building features, reducing the influence of scale variation between buildings. A strong attention mechanism was then added to enhance the fusion of the multi-scale information and the discrimination of different features, suppressing the influence of building occlusion and shadows. Finally, a spatial pyramid pooling module was appended after the fused multi-scale features to suppress holes inside buildings in the segmentation result and improve segmentation accuracy. Experiments were conducted on the public WHU and Massachusetts Buildings datasets, and the segmentation results were compared quantitatively using four metrics: MIoU, recall, precision, and F1-score. On the Massachusetts Buildings dataset, the model reaches an MIoU of 72.84%, which is 1.46% higher than the MIoU obtained with ResUNet-a. The model therefore effectively improves the segmentation accuracy of buildings in remote sensing images.
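The abstract describes the architecture only at a high level. The following is a minimal PyTorch sketch of how such a network could be assembled, assuming a ResNet-34 backbone; the module names (StrongAttention, SPPHead, ParallelPathNet), the SE-style form of the attention gate, the pyramid bin sizes, and all channel widths are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet34


class StrongAttention(nn.Module):
    # SE-style channel gate; the paper's "strong attention mechanism" is not
    # specified in the abstract, so this particular form is an assumption.
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        # Reweight channels to sharpen discrimination between building
        # features and confounders such as shadows and occlusion.
        return x * self.gate(x)


class SPPHead(nn.Module):
    # Spatial pyramid pooling over the fused features; bin sizes are illustrative.
    def __init__(self, in_ch, out_ch, bins=(1, 2, 4, 8)):
        super().__init__()
        self.stages = nn.ModuleList(
            nn.Sequential(nn.AdaptiveAvgPool2d(b), nn.Conv2d(in_ch, out_ch, 1))
            for b in bins
        )
        self.project = nn.Conv2d(in_ch + len(bins) * out_ch, out_ch, 3, padding=1)

    def forward(self, x):
        h, w = x.shape[-2:]
        pooled = [
            F.interpolate(s(x), (h, w), mode="bilinear", align_corners=False)
            for s in self.stages
        ]
        # Global context from the pyramid helps suppress holes inside buildings.
        return self.project(torch.cat([x, *pooled], dim=1))


class ParallelPathNet(nn.Module):
    # ResNet backbone whose four stages act as parallel multi-scale paths.
    def __init__(self, num_classes=1):
        super().__init__()
        r = resnet34(weights=None)
        self.stem = nn.Sequential(r.conv1, r.bn1, r.relu, r.maxpool)
        self.paths = nn.ModuleList([r.layer1, r.layer2, r.layer3, r.layer4])
        # 1x1 convs bring every path to a common width before fusion.
        self.reduce = nn.ModuleList(nn.Conv2d(c, 64, 1) for c in (64, 128, 256, 512))
        self.attn = StrongAttention(4 * 64)
        self.spp = SPPHead(4 * 64, 64)
        self.classifier = nn.Conv2d(64, num_classes, 1)

    def forward(self, x):
        size = x.shape[-2:]
        y = self.stem(x)
        feats = []
        for path, reduce in zip(self.paths, self.reduce):
            y = path(y)               # each stage yields one scale of features
            feats.append(reduce(y))
        target = feats[0].shape[-2:]  # upsample all scales to the finest one
        feats = [F.interpolate(f, target, mode="bilinear", align_corners=False)
                 for f in feats]
        fused = self.attn(torch.cat(feats, dim=1))  # attention-gated fusion
        logits = self.classifier(self.spp(fused))
        return F.interpolate(logits, size, mode="bilinear", align_corners=False)


if __name__ == "__main__":
    net = ParallelPathNet()
    out = net(torch.randn(1, 3, 512, 512))  # per-pixel building logits
    print(out.shape)  # torch.Size([1, 1, 512, 512])
```

Fusing all four stages at the finest resolution before applying attention is one plausible reading of "parallel paths"; the paper's actual fusion order and attention design may differ.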
Citation: Jianhua YANG, Hao ZHANG, Haiyang HUA. Parallel path and strong attention mechanism for building segmentation in remote sensing images[J]. Optics and Precision Engineering, 2023, 31(2): 234
Category: Information Sciences
Received: Mar. 1, 2022
Accepted: --
Published Online: Feb. 9, 2023
The Author Email: HUA Haiyang (c3i11@sia.cn)