Spacecraft Recovery & Remote Sensing, Volume 45, Issue 1, p. 136 (2024)
Building Extraction Model Based on ASPP and Dual Attention Mechanism
Accurate and efficient extraction of building information from high-resolution remote sensing images is of great significance for land planning and mapping. In recent years, convolutional neural networks have made great progress in building information extraction. However, problems remain when processing high-resolution remote sensing images: high-level semantic features are not sufficiently exploited, and it is difficult to obtain detailed, high-precision segmentation results. To address these problems, a deep learning architecture, the Atrous Space and Channel Perception Network (ASCP-Net), is proposed for automatic building extraction. Atrous Spatial Pyramid Pooling (ASPP) and Spatial and Channel-wise Attention (SCA) modules are integrated into an encoder-decoder structure. Multi-scale context information is captured and aggregated by the ASPP module, while the SCA module selectively enhances the most useful information at specific locations and in specific channels; both high- and low-level feature maps are fed into the decoder to achieve efficient building extraction. Experiments on the WHU Building Dataset show that the proposed method reaches an overall accuracy of 97.4% and an F1 score of 94.6%, and obtains clearer building boundaries than competing models, especially for incomplete buildings at image edges, effectively improving the accuracy and integrity of building extraction.
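The ASPP module named in the abstract is built on dilated (atrous) convolution, which enlarges the receptive field without adding parameters by sampling the input with gaps between kernel taps, then runs several dilation rates in parallel and aggregates the results. A minimal single-channel NumPy sketch of this idea follows; the dilation rates (1, 6, 12, 18) are the common DeepLab-style convention and an assumption here, not values reported by this paper:

```python
import numpy as np

def dilated_conv2d(x, kernel, rate):
    """'Same'-padded 2D cross-correlation with the given dilation rate.

    Dilation inserts (rate - 1) zeros between kernel taps, so a 3x3 kernel
    at rate r covers a (2r + 1) x (2r + 1) window with only 9 weights.
    """
    kh, kw = kernel.shape
    eff_h = kh + (kh - 1) * (rate - 1)  # effective (dilated) kernel height
    eff_w = kw + (kw - 1) * (rate - 1)
    ph, pw = eff_h // 2, eff_w // 2     # 'same' padding for odd kernels
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros(x.shape, dtype=float)
    for i in range(kh):
        for j in range(kw):
            # Each kernel tap reads the input shifted by its dilated offset.
            out += kernel[i, j] * xp[i * rate : i * rate + x.shape[0],
                                     j * rate : j * rate + x.shape[1]]
    return out

def aspp(x, kernel, rates=(1, 6, 12, 18)):
    """Toy single-channel ASPP head: parallel dilated convolutions over the
    same input, stacked along a new 'branch' axis for later aggregation."""
    return np.stack([dilated_conv2d(x, kernel, r) for r in rates])
```

In the full network each branch has its own learned kernels and the stacked responses are fused (e.g. by a 1x1 convolution); the sketch only shows how the parallel dilation rates capture context at multiple scales from one feature map.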
Mingyang YU, Haiqing XU, Wenzhuo ZHANG, Shuai XU, Fangliang ZHOU. Building Extraction Model Based on ASPP and Dual Attention Mechanism[J]. Spacecraft Recovery & Remote Sensing, 2024, 45(1): 136
Received: Mar. 27, 2023
Accepted: --
Published Online: Apr. 22, 2024
The Author Email: XU Haiqing (2021160105@stu.sdjzu.edu.cn)