Opto-Electronic Engineering, Volume 51, Issue 9, 240119-1 (2024)

Quadruple-stream input-guided feature complementary visible-infrared person re-identification

Bin Ge1,2,*, Nuo Xu1, Chenxing Xia1, and Haijun Zheng1
Author Affiliations
  • 1School of Computer Science and Engineering, Anhui University of Science and Technology, Huainan, Anhui 232001, China
  • 2Institute of Energy, Hefei Comprehensive National Science Center, Hefei, Anhui 230031, China

    Current visible-infrared person re-identification research focuses on extracting modality-shared salient features through attention mechanisms to minimize the modality discrepancy. However, these methods attend only to the most salient pedestrian features and cannot fully exploit the information in each modality. To address this problem, a quadruple-stream input-guided feature complementary network (QFCNet) is proposed in this paper. First, a quadruple-stream feature extraction and fusion module is designed for the modality-specific feature extraction stage: by adding two data-augmented inputs, it alleviates the color discrepancy between modalities, enriches the semantic information of each modality, and further promotes multi-dimensional feature fusion. Second, a sub-salient feature complementation module supplements the global feature, via an inversion operation, with the pedestrian details that the attention mechanism ignores, thereby strengthening discriminative pedestrian features. Experimental results on two public datasets, SYSU-MM01 and RegDB, demonstrate the superiority of the method: in the all-search mode of SYSU-MM01, rank-1 accuracy and mAP reach 76.12% and 71.51%, respectively.
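    The abstract does not give implementation details, so the following is a minimal PyTorch sketch of the two ideas it names. The choice of the two augmented streams (grayscale conversion and random channel exchange) and the squeeze-and-excitation-style channel attention are assumptions for illustration only; the paper states merely that two augmented inputs are added and that an attention map is inverted to recover sub-salient details. All module and function names here are hypothetical.

```python
import torch
import torch.nn as nn


def make_quadruple_inputs(rgb: torch.Tensor, ir: torch.Tensor):
    """Build four input streams from a visible/infrared image pair.

    The two extra streams are assumptions: a grayscale copy of the
    visible image and a random channel-exchange copy. The paper only
    says two augmented inputs are added to the two modality inputs.
    """
    # Grayscale via ITU-R BT.601 luma weights, replicated to 3 channels
    # so all streams share one backbone stem.
    w = torch.tensor([0.299, 0.587, 0.114], device=rgb.device).view(1, 3, 1, 1)
    gray = (rgb * w).sum(dim=1, keepdim=True).repeat(1, 3, 1, 1)
    # Random channel exchange: shuffle the RGB channel order.
    perm = torch.randperm(3, device=rgb.device)
    exchanged = rgb[:, perm, :, :]
    return rgb, gray, exchanged, ir


class SubSalientComplement(nn.Module):
    """Recover details suppressed by an attention map via inversion.

    Sketch only: the salient branch keeps A * F, the complementary
    branch keeps (1 - A) * F and is refined by a small conv block
    before fusion, so the sum does not collapse back to F.
    """

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Squeeze-and-excitation-style channel attention as a stand-in
        # for whatever attention mechanism the paper actually uses.
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Refinement of the inverted (sub-salient) branch.
        self.refine = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        a = self.attn(feat)               # attention weights in (0, 1)
        salient = a * feat                # most discriminative features
        sub_salient = (1.0 - a) * feat    # details the attention ignores
        return salient + self.refine(sub_salient)


# Example: fuse sub-salient detail back into a backbone feature map.
feat = torch.randn(8, 256, 24, 12)        # (batch, channels, H, W)
module = SubSalientComplement(channels=256)
out = module(feat)                        # same shape as the input
```

    Refining only the inverted branch is one way to keep the fusion from degenerating into the identity (since A·F + (1−A)·F = F); the paper's actual fusion strategy may differ.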


    Get Citation


    Bin Ge, Nuo Xu, Chenxing Xia, Haijun Zheng. Quadruple-stream input-guided feature complementary visible-infrared person re-identification[J]. Opto-Electronic Engineering, 2024, 51(9): 240119-1

    Paper Information

    Category: Article

    Received: May 23, 2024

    Accepted: Aug. 18, 2024

    Published Online: Dec. 12, 2024


    DOI: 10.12086/oee.2024.240119
