Optics and Precision Engineering, Volume 31, Issue 6, 860 (2023)

Multi-object pedestrian tracking method based on improved high resolution neural network

Hongying ZHANG*, Pengyi HE, and Xiaowen PENG
Author Affiliations
  • College of Electronic Information and Automation, Civil Aviation University of China, Tianjin 300300, China

    This study proposes an improved high-resolution neural network to address detection and tracking failures caused by target occlusion in multi-object pedestrian tracking. First, to enhance the network's initial feature extraction for pedestrian targets, a second-generation bottleneck residual block structure was introduced into the backbone of the high-resolution network, enlarging the receptive field and strengthening feature representation. Second, a new residual detection block containing a two-layer efficient channel attention module was designed to replace the block used at the multi-scale information exchange stage of the original network, improving the detection performance of the entire network. Finally, the network was fully trained with appropriately selected parameters, and the algorithm was evaluated on multiple test sets. The results show that the tracking accuracy of the proposed algorithm is 0.1%, 1.6%, and 0.8% higher than that of FairMOT on the 2DMOT15, MOT17, and MOT20 datasets, respectively, and that its tracking stability on longer video sequences is greatly improved. The algorithm is therefore applicable to scenarios with more targets and larger occlusion areas.
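    The two architectural changes summarized above can be pictured with a short PyTorch sketch. This is a minimal illustration, not the authors' implementation: a Res2Net-style second-generation bottleneck stands in for the backbone modification, and an efficient channel attention (ECA) layer wrapped into a basic residual block stands in for the exchange-stage modification. Channel counts, the number of scales, and the attention kernel size are illustrative assumptions.

```python
import torch
import torch.nn as nn


class ECA(nn.Module):
    """Efficient channel attention: a 1-D convolution over pooled channel
    descriptors yields a per-channel weight (kernel size is an assumption)."""
    def __init__(self, k_size: int = 3):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size=k_size,
                              padding=k_size // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        y = self.pool(x)                                   # (B, C, 1, 1)
        y = self.conv(y.squeeze(-1).transpose(1, 2))       # (B, 1, C)
        y = self.sigmoid(y.transpose(1, 2).unsqueeze(-1))  # (B, C, 1, 1)
        return x * y.expand_as(x)


class Bottle2neck(nn.Module):
    """Res2Net-style 'second-generation' bottleneck: the 3x3 stage is split
    into `scales` groups whose outputs cascade, enlarging the receptive field."""
    def __init__(self, channels: int = 64, scales: int = 4):
        super().__init__()
        assert channels % scales == 0
        width = channels // scales
        self.scales = scales
        self.conv1 = nn.Conv2d(channels, channels, 1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.convs = nn.ModuleList(
            [nn.Conv2d(width, width, 3, padding=1, bias=False)
             for _ in range(scales - 1)])
        self.bns = nn.ModuleList(
            [nn.BatchNorm2d(width) for _ in range(scales - 1)])
        self.conv3 = nn.Conv2d(channels, channels, 1, bias=False)
        self.bn3 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        splits = torch.chunk(out, self.scales, dim=1)
        outs, prev = [splits[0]], splits[0]
        for i in range(1, self.scales):
            # each group sees the previous group's output before its 3x3 conv
            prev = self.relu(self.bns[i - 1](self.convs[i - 1](splits[i] + prev)))
            outs.append(prev)
        out = self.bn3(self.conv3(torch.cat(outs, dim=1)))
        return self.relu(out + x)                          # residual connection


class ECABasicBlock(nn.Module):
    """Basic 3x3 residual block with channel attention, standing in for the
    ECA-augmented block at the multi-scale information exchange stage."""
    def __init__(self, channels: int = 64):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.eca = ECA()
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.eca(self.bn2(self.conv2(out)))          # reweight channels
        return self.relu(out + x)


if __name__ == "__main__":
    x = torch.randn(1, 64, 56, 56)
    print(Bottle2neck()(x).shape, ECABasicBlock()(x).shape)
```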

    Citation

    Hongying ZHANG, Pengyi HE, Xiaowen PENG. Multi-object pedestrian tracking method based on improved high resolution neural network[J]. Optics and Precision Engineering, 2023, 31(6): 860

    Paper Information

    Category: Information Sciences

    Received: May 26, 2022

    Accepted: --

    Published Online: Apr. 4, 2023

    Author Email: Hongying ZHANG (carole_zhang0716@163.com)

    DOI: 10.37188/OPE.20233106.0860
