Opto-Electronic Engineering, Volume 52, Issue 1, 240236 (2025)
Remote sensing image road extraction by integrating ResNeSt and multi-scale feature fusion
To address discontinuous road-edge segmentation, low accuracy on small-scale roads, and misclassification of target roads in high-resolution remote sensing imagery, this paper proposes ResT-UNet, a road extraction method that integrates ResNeSt with multi-scale feature fusion. Drawing on the ResNeSt network module, a U-shaped network encoder is constructed so that the encoder extracts information more completely and segments target edges more continuously. First, Triplet Attention is introduced into the encoder to suppress useless feature information. Second, convolutional blocks replace max pooling operations, increasing feature dimensionality and network depth while reducing the loss of road information. Finally, a multi-scale feature fusion (MSFF) module at the bridge connection between the encoder and decoder captures long-range dependencies between regions and improves road segmentation performance. Experiments were conducted on the Massachusetts Roads and DeepGlobe datasets, where the proposed method achieved Intersection over Union scores of 65.39% and 65.45%, respectively, improvements of 1.42% and 1.74% over the original MINet model. These results indicate that the ResT-UNet network effectively improves the accuracy of road extraction from remote sensing imagery and provides a new approach to interpreting semantic information in remote sensing images.
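Of the changes the abstract describes, replacing max pooling with a strided convolutional block is straightforward to illustrate. The NumPy sketch below is an illustrative reimplementation, not the authors' code; the function names, shapes, and channel counts are assumptions. It contrasts 2×2 max pooling, which keeps the channel count and discards three of every four activations, with a stride-2 3×3 convolution, which downsamples to the same spatial size while letting every input activation contribute and the channel count grow:

```python
import numpy as np

def maxpool2d(x, k=2):
    # 2x2 max pooling with stride 2: halves H and W, keeps the channel
    # count, and discards 3 of every 4 activations (information loss).
    n, c, h, w = x.shape
    return x.reshape(n, c, h // k, k, w // k, k).max(axis=(3, 5))

def strided_conv2d(x, weight, stride=2, pad=1):
    # 3x3 convolution with stride 2: halves H and W like pooling, but
    # every input activation is used and the channel count can increase.
    n, c_in, h, w = x.shape
    c_out, _, kh, kw = weight.shape
    xp = np.pad(x, ((0, 0), (0, 0), (pad, pad), (pad, pad)))
    h_out = (h + 2 * pad - kh) // stride + 1
    w_out = (w + 2 * pad - kw) // stride + 1
    out = np.zeros((n, c_out, h_out, w_out))
    for i in range(h_out):
        for j in range(w_out):
            patch = xp[:, :, i*stride:i*stride+kh, j*stride:j*stride+kw]
            # Contract over (c_in, kh, kw) against each output filter.
            out[:, :, i, j] = np.tensordot(patch, weight,
                                           axes=([1, 2, 3], [1, 2, 3]))
    return out

x = np.random.randn(1, 64, 32, 32)       # a hypothetical feature map
w = np.random.randn(128, 64, 3, 3)       # 128 learned 3x3 filters
print(maxpool2d(x).shape)                # (1, 64, 16, 16)
print(strided_conv2d(x, w).shape)        # (1, 128, 16, 16)
```

Both paths halve the spatial resolution, but only the convolutional block simultaneously deepens the feature dimensionality (64 → 128 channels here), which matches the abstract's motivation for the replacement.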
Ming Hao, He Bai, Tingting Xu. Remote sensing image road extraction by integrating ResNeSt and multi-scale feature fusion[J]. Opto-Electronic Engineering, 2025, 52(1): 240236
Category: Article
Received: Oct. 9, 2024
Accepted: Dec. 16, 2024
Published Online: Feb. 21, 2025
Author email: Hao Ming (郝明)