Optics and Precision Engineering, Volume 33, Issue 8, 1289 (2025)
Remote sensing object detection algorithm based on ultra fusion residual marching geometric perception
This paper proposed an ultra-fusion residual marching geometric perception algorithm to address the challenges of multi-scale objects, dense overlap, and uneven data distribution in remote sensing image object detection. The ultra-fusion residual marching module optimized the network structure: its multi-level convolution operations used receptive fields of different scales to capture object details at each scale, enhancing the model's perception of object features and enabling both feature extraction for small-scale objects and accurate localization of large-scale objects. Detection quality was evaluated by computing the geometric similarity between the detection results and the ground-truth results, and the degree of fit was carefully considered in densely overlapping scenes to screen the final results, reducing missed and false detections and improving the algorithm's mAP. A multi-path feature fusion module was designed to fuse feature information from different levels, extract richer object features, enhance the network's representation and discrimination capabilities, and improve detection mAP and stability. Experiments on the NWPU-VHR-10 dataset showed that mPrecision, mRecall, mAP, and mF1 Score increased by 0.0419, 0.1040, 0.0455, and 0.0850, respectively; on the RSOD dataset, they increased by 0.0221, 0.1034, 0.0619, and 0.0875, respectively. These results fully demonstrate the effectiveness and superiority of the proposed ultra-fusion residual marching geometric perception algorithm for remote sensing image object detection.
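The abstract does not specify the exact geometric similarity measure or screening rule used in densely overlapping scenes, so the following is only a minimal illustrative sketch of that idea, assuming a plain IoU stand-in for the paper's geometric similarity and a greedy, NMS-style screening step; all function and variable names here are hypothetical and not taken from the paper.

```python
# Hypothetical sketch: geometric-similarity-based screening of detections.
# IoU is used as a stand-in for the paper's (unspecified) geometric similarity.
from typing import List, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2)


def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def screen_detections(boxes: List[Box], scores: List[float],
                      sim_thresh: float = 0.5) -> List[int]:
    """Greedy screening: keep the highest-scoring box, then drop any box whose
    geometric similarity (here IoU) with an already-kept box exceeds sim_thresh."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    kept: List[int] = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) <= sim_thresh for j in kept):
            kept.append(i)
    return kept


if __name__ == "__main__":
    boxes = [(10, 10, 60, 60), (12, 12, 58, 62), (100, 100, 140, 150)]
    scores = [0.9, 0.8, 0.7]
    print(screen_detections(boxes, scores))  # -> [0, 2]: the near-duplicate box is screened out
```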
Chenshuai BAI, Xiaofeng BAI, Kaijun WU, Haowen WANG. Remote sensing object detection algorithm based on ultra fusion residual marching geometric perception[J]. Optics and Precision Engineering, 2025, 33(8): 1289
Received: Aug. 30, 2024
Accepted: --
Published Online: Jul. 1, 2025
Author Email: Kaijun WU (wkj@mail.lzjtu.cn)