Laser & Infrared, Volume 55, Issue 7, 1128 (2025)
Lightweight infrared target detection algorithm based on improved YOLOv8
To address the poor resolution, low contrast, and low signal-to-noise ratio encountered in infrared image object detection, a lightweight infrared object detection algorithm based on YOLOv8 is proposed in this paper. First, a Faster Block module is constructed to replace the Bottleneck module in the Neck, which effectively reduces the number of model parameters and makes the model more lightweight. Then, the SE attention mechanism is added to strengthen the network's focus on important features and improve the effectiveness of feature extraction, thereby enhancing the robustness and stability of the model. Meanwhile, a bi-level routing attention mechanism is introduced, which exploits the large amount of redundant information in the feature map and saves computation and memory through sparse connections. Finally, the loss function is improved by adopting the Efficient IoU (EIoU) as the regression loss, which improves the regression accuracy of the target bounding boxes. Experimental results show that, compared with mainstream algorithms such as YOLOv5 and YOLOv8, the improved algorithm achieves a recall of 81%, and its model size of only 4.6 MB is 7.2% and 21.6% smaller, respectively, while the parameter count and computational cost are also significantly reduced. The improved algorithm therefore offers clear advantages in detection accuracy, model size, and computational complexity, and meets the requirements of infrared target detection.
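For concreteness, the sketch below (not taken from the paper) illustrates the first two modifications in PyTorch: a FasterNet-style partial-convolution block used as a Bottleneck replacement, and a standard SE channel-attention module. The class names, channel split ratio, and expansion factor are illustrative assumptions; the bi-level routing attention mechanism is omitted here for brevity.

```python
# Illustrative PyTorch sketch of a FasterNet-style partial-convolution block
# and an SE channel-attention module. Class names, the channel split ratio,
# and the expansion factor are assumptions, not the authors' exact code.
import torch
import torch.nn as nn


class PartialConv(nn.Module):
    """Apply a 3x3 convolution to only a fraction of the channels; the
    remaining channels pass through unchanged, which cuts FLOPs and params."""
    def __init__(self, channels, partial_ratio=0.25):
        super().__init__()
        self.conv_channels = int(channels * partial_ratio)
        self.conv = nn.Conv2d(self.conv_channels, self.conv_channels,
                              kernel_size=3, padding=1, bias=False)

    def forward(self, x):
        x1, x2 = torch.split(
            x, [self.conv_channels, x.size(1) - self.conv_channels], dim=1)
        return torch.cat((self.conv(x1), x2), dim=1)


class SEBlock(nn.Module):
    """Squeeze-and-Excitation: reweight channels by globally pooled statistics."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w


class FasterBlock(nn.Module):
    """Drop-in replacement for the Bottleneck: partial conv followed by a
    1x1 expand/compress pair, with a residual connection."""
    def __init__(self, channels, expansion=2):
        super().__init__()
        hidden = channels * expansion
        self.pconv = PartialConv(channels)
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, hidden, kernel_size=1, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, channels, kernel_size=1, bias=False),
        )

    def forward(self, x):
        return x + self.mlp(self.pconv(x))
```

The EIoU regression loss mentioned above keeps the IoU term and adds penalties for the center distance and for the width and height gaps, each normalized by the smallest enclosing box. A minimal sketch, assuming boxes are given in (x1, y1, x2, y2) format:

```python
# Minimal EIoU loss sketch, assuming pred and target are (N, 4) tensors of
# (x1, y1, x2, y2) boxes; variable names are illustrative.
import torch


def eiou_loss(pred, target, eps=1e-7):
    # Overlap and union
    inter_w = (torch.min(pred[:, 2], target[:, 2])
               - torch.max(pred[:, 0], target[:, 0])).clamp(min=0)
    inter_h = (torch.min(pred[:, 3], target[:, 3])
               - torch.max(pred[:, 1], target[:, 1])).clamp(min=0)
    inter = inter_w * inter_h
    area_p = (pred[:, 2] - pred[:, 0]) * (pred[:, 3] - pred[:, 1])
    area_t = (target[:, 2] - target[:, 0]) * (target[:, 3] - target[:, 1])
    iou = inter / (area_p + area_t - inter + eps)

    # Smallest enclosing box (width, height, squared diagonal)
    cw = torch.max(pred[:, 2], target[:, 2]) - torch.min(pred[:, 0], target[:, 0])
    ch = torch.max(pred[:, 3], target[:, 3]) - torch.min(pred[:, 1], target[:, 1])
    c2 = cw ** 2 + ch ** 2 + eps

    # Squared distance between box centers
    rho2 = ((pred[:, 0] + pred[:, 2] - target[:, 0] - target[:, 2]) ** 2
            + (pred[:, 1] + pred[:, 3] - target[:, 1] - target[:, 3]) ** 2) / 4

    # Width and height difference penalties
    dw = ((pred[:, 2] - pred[:, 0]) - (target[:, 2] - target[:, 0])) ** 2
    dh = ((pred[:, 3] - pred[:, 1]) - (target[:, 3] - target[:, 1])) ** 2

    # L_EIoU = 1 - IoU + rho^2 / c^2 + (dw / Cw^2) + (dh / Ch^2)
    return 1 - iou + rho2 / c2 + dw / (cw ** 2 + eps) + dh / (ch ** 2 + eps)
```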
SONG Cheng-liang, ZHANG Qi-zhi, LIU Wei, LIU Qiong. Lightweight infrared target detection algorithm based on improved YOLOv8[J]. Laser & Infrared, 2025, 55(7): 1128
Received: Sep. 10, 2024
Accepted: Sep. 12, 2025
Published Online: Sep. 12, 2025
Author Email: ZHANG Qi-zhi (zqzbim@163.com)