Laser & Optoelectronics Progress, Volume 57, Issue 22, 221010 (2020)

Air-to-Ground Target Detection Algorithm Based on DenseNet and Channel Attention Mechanism

Wenqing Wang1, Lin Feng1, Yang Liu1,*, Dongfang Yang2, and Meng Zhang2
Author Affiliations
  • 1College of Automation, Xi'an University of Posts & Telecommunications, Xi'an, Shaanxi 710121, China;
  • 2College of Missile Engineering, Rocket Force University of Engineering, Xi'an, Shaanxi 710025, China

    In air-to-ground scenarios, the imaging perspective is limited to a single viewpoint, so detection must rely on a deep network to provide stronger feature representation. However, deeper networks bring a heavy computational burden and slow convergence. To address these problems, a target detection network model with channel-differentiated feature representation is proposed within the densely connected network (DenseNet) framework. First, DenseNet is used as the feature extraction network, deepening the network with fewer parameters to improve its ability to extract object features. Second, a channel attention mechanism is introduced so that the network attends to the effective feature channels in each feature layer and reweights the feature maps. Finally, comparative experiments are conducted on air-to-ground object detection data. The results show that the mean average precision of the improved model is 3.44 percentage points higher than that of the single shot multibox detector (SSD) based on the visual geometry group network (VGG16).
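    The abstract does not specify the exact form of the channel attention module; a common choice for "reweighting effective feature channels" is a squeeze-and-excitation style gate applied to a backbone feature map before the detection heads. The PyTorch sketch below illustrates that idea only; the module name, reduction ratio, and feature-map size are assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn

    class ChannelAttention(nn.Module):
        """Squeeze-and-excitation style channel attention: global average pooling,
        a small bottleneck MLP, and a sigmoid gate that rescales each channel."""
        def __init__(self, channels: int, reduction: int = 16):
            super().__init__()
            self.pool = nn.AdaptiveAvgPool2d(1)           # squeeze: B x C x 1 x 1
            self.fc = nn.Sequential(
                nn.Linear(channels, channels // reduction),
                nn.ReLU(inplace=True),
                nn.Linear(channels // reduction, channels),
                nn.Sigmoid(),                             # per-channel weights in (0, 1)
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            b, c, _, _ = x.shape
            w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
            return x * w                                  # reweight (readjust) the feature map

    # Usage sketch: rescale a DenseNet feature map before it feeds the SSD detection heads.
    feat = torch.randn(2, 256, 38, 38)                    # hypothetical SSD-sized feature layer
    att = ChannelAttention(256)
    out = att(feat)                                       # same shape, channel-reweighted
    print(out.shape)                                      # torch.Size([2, 256, 38, 38])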

    Citation: Wenqing Wang, Lin Feng, Yang Liu, Dongfang Yang, Meng Zhang. Air-to-Ground Target Detection Algorithm Based on DenseNet and Channel Attention Mechanism[J]. Laser & Optoelectronics Progress, 2020, 57(22): 221010

    Paper Information

    Category: Image Processing

    Received: Feb. 7, 2020

    Accepted: Apr. 13, 2020

    Published Online: Nov. 11, 2020

    Author Email: Yang Liu (yyangbrand@163.com)

    DOI: 10.3788/LOP57.221010
