Laser & Optoelectronics Progress, Vol. 61, Issue 22, 2212007 (2024)
Small-Target Detection Method for Transmission Lines Based on Motion Blurred Image Restoration
Small-target images of transmission lines are prone to motion blur owing to factors such as air disturbance, rotor vibration, and relative motion during unmanned-aerial-vehicle inspections. This blurring causes the loss of texture details, rendering small-target detection difficult. To address this problem, this study proposes a small-target detection method for transmission lines based on motion-blurred-image restoration. The method uses a conditional vision-Transformer-based generative adversarial network (ViT-GAN) to restore motion-blurred small-target images, strengthening the feature-extraction backbone's ability to perceive both global and regional information in images and improving restoration quality for subsequent object detection. The YOLOv8 detection network is then enhanced by introducing a multi-head self-attention mechanism, adding a small-target detection layer, and optimizing the bounding-box loss function, enabling reliable small-target detection in transmission-line environments with complex backgrounds and large target-scale variations. Experimental results demonstrate that the proposed method achieves accurate small-target detection for transmission lines: the average recognition accuracy over six categories of small targets is 92.77%, the average recall is 94.19%, and the average F1-score is 94.94%. Overall, the proposed method effectively mitigates missed and false detections, demonstrating high accuracy and robustness.
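To illustrate the kind of component the abstract describes, the sketch below shows a generic pre-norm Transformer encoder block (multi-head self-attention plus MLP) wrapped so it can be applied to a CNN feature map, as one might do when adding a self-attention stage to a detection backbone or building a ViT-based restorer. This is only a minimal, hedged sketch in PyTorch; the class names (`TransformerBlock`, `ConvAttentionBlock`), dimensions, and placement are assumptions for illustration and are not taken from the paper's actual implementation.

```python
import torch
import torch.nn as nn


class TransformerBlock(nn.Module):
    """Pre-norm Transformer encoder block: multi-head self-attention + MLP."""

    def __init__(self, dim: int, num_heads: int = 8, mlp_ratio: float = 4.0):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, int(dim * mlp_ratio)),
            nn.GELU(),
            nn.Linear(int(dim * mlp_ratio), dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim)
        h = self.norm1(x)
        attn_out, _ = self.attn(h, h, h, need_weights=False)
        x = x + attn_out                      # residual over attention
        x = x + self.mlp(self.norm2(x))       # residual over MLP
        return x


class ConvAttentionBlock(nn.Module):
    """Applies the Transformer block to a CNN feature map by flattening the
    H*W positions into tokens and reshaping back (hypothetical wrapper)."""

    def __init__(self, channels: int, num_heads: int = 8):
        super().__init__()
        self.block = TransformerBlock(channels, num_heads)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)         # (b, h*w, c)
        tokens = self.block(tokens)
        return tokens.transpose(1, 2).reshape(b, c, h, w)


if __name__ == "__main__":
    feat = torch.randn(2, 256, 20, 20)                # example backbone feature map
    out = ConvAttentionBlock(256)(feat)
    print(out.shape)                                  # torch.Size([2, 256, 20, 20])
```

Such a block preserves the spatial resolution of the feature map, so in principle it can be dropped into an existing backbone stage without changing downstream shapes; the paper's specific restoration network and YOLOv8 modifications should be consulted for the actual architecture.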
Xunhao Tang, Shaosheng Fan. Small-Target Detection Method for Transmission Lines Based on Motion Blurred Image Restoration[J]. Laser & Optoelectronics Progress, 2024, 61(22): 2212007
Category: Instrumentation, Measurement and Metrology
Received: Mar. 4, 2024
Accepted: Apr. 11, 2024
Published Online: Nov. 19, 2024
The Author Email: Shaosheng Fan (fss508@163.com)
CSTR:32186.14.LOP240806