Laser & Optoelectronics Progress, Volume 62, Issue 8, 0828003 (2025)
Improved ResUNet Method for Extracting Buildings from Remote Sensing Images
To address the limitations of existing semantic segmentation networks for remote sensing images, namely the loss of fine details, insufficient attention to local information, and a limited ability to capture multi-scale contextual information, this study proposes SGMFResUNet, a deep residual network enhanced with a spatial-information-enhanced global attention mechanism and multi-module scale fusion for automatic building extraction from remote sensing images. Built upon the ResUNet architecture, the proposed model deepens the network to capture richer multi-level features and employs asymmetric convolutional blocks to strengthen feature representation and extraction at each level. A dual-pooling dense pyramid module is designed to densely capture multi-scale contextual information, supplemented by global features. In addition, a hierarchical detail enhancement module progressively integrates shallow features to reduce the loss of detail. The improved global attention mechanism adaptively adjusts features and enhances cross-dimensional interaction, yielding a more robust feature representation. Experiments on the WHU and Massachusetts building datasets show that SGMFResUNet achieves intersection-over-union scores of 90.74% and 77.00%, improvements of 1.93 and 3.29 percentage points over ResUNet, respectively. Compared with ResUNet, HRNetV2, MSFCN, BuildFormer, DC-Swin, and SDSC-UNet, SGMFResUNet consistently delivers superior building extraction accuracy.
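The paper's exact module designs are not reproduced on this page; the sketch below is only a minimal PyTorch illustration of the general idea behind an asymmetric convolutional block of the kind the abstract mentions, in which a square 3×3 branch is supplemented by 1×3 and 3×1 branches whose outputs are summed to strengthen horizontal and vertical responses. The class name, channel count, and normalization/activation choices are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class AsymmetricConvBlock(nn.Module):
    """Illustrative asymmetric convolution block (assumed design, not the
    paper's code): a 3x3 branch plus 1x3 and 3x1 branches are summed before
    batch normalization and ReLU."""

    def __init__(self, channels: int):
        super().__init__()
        self.square = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.horizontal = nn.Conv2d(channels, channels, kernel_size=(1, 3), padding=(0, 1))
        self.vertical = nn.Conv2d(channels, channels, kernel_size=(3, 1), padding=(1, 0))
        self.bn = nn.BatchNorm2d(channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Sum the three directional branches, then normalize and activate.
        out = self.square(x) + self.horizontal(x) + self.vertical(x)
        return self.act(self.bn(out))


if __name__ == "__main__":
    # Quick shape check on a dummy 64-channel feature map.
    block = AsymmetricConvBlock(64)
    y = block(torch.randn(1, 64, 128, 128))
    print(y.shape)  # torch.Size([1, 64, 128, 128])
```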
Qianrong Sun, Xiaopeng Wang. Improved ResUNet Method for Extracting Buildings from Remote Sensing Images[J]. Laser & Optoelectronics Progress, 2025, 62(8): 0828003
Category: Remote Sensing and Sensors
Received: Sep. 4, 2024
Accepted: Oct. 28, 2024
Published Online: Apr. 8, 2025
CSTR:32186.14.LOP241955