Journal of Qingdao University (Engineering & Technology Edition), Volume 40, Issue 2, 24 (2025)

Cloth-Changing Person Re-identification Model Based on Positional Mask Guidance

GE Jiashang, SONG Shimiao, GU Feifan, and YANG Jie*
Author Affiliations
  • College of Mechanical and Electrical Engineering, Qingdao University, Qingdao 266071, China

    In cloth-changing person re-identification tasks, clothing variation is a critical factor degrading recognition accuracy. To extract clothing-invariant features, a Positional Mask-Guided Model (PMGM) for cloth-changing person re-identification is proposed. PMGM leverages four positional masks (head, upper body, lower body, and arms) to guide the network in capturing local fine-grained features, which are fused with global features to precisely extract clothing-invariant representations. During inference, combining head-feature matching with identity-feature matching further enhances the model's discriminative capability. Experimental results show that, compared with the baseline model, PMGM improves mAP by 5.7% and Rank-1 by 6.1% on the PRCC dataset.
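    The sketch below illustrates the general idea summarized in the abstract: pooling backbone features under four body-part masks, fusing them with a global feature, and combining identity-feature and head-feature similarities at inference. It is a minimal, hypothetical illustration only; the module names, feature dimensions, masked average pooling, and the weighting factor `alpha` are assumptions for illustration and are not taken from the paper.

```python
# Hypothetical sketch of positional-mask-guided feature fusion.
# Backbone, dimensions, and the alpha weighting are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MaskGuidedReID(nn.Module):
    """Fuses a global feature with four mask-pooled local features
    (head, upper body, lower body, arms)."""

    def __init__(self, backbone: nn.Module, feat_dim: int = 2048, num_parts: int = 4):
        super().__init__()
        self.backbone = backbone                     # any CNN returning a (B, C, h, w) feature map
        self.global_pool = nn.AdaptiveAvgPool2d(1)
        self.fuse = nn.Linear(feat_dim * (num_parts + 1), feat_dim)

    def forward(self, images: torch.Tensor, part_masks: torch.Tensor):
        # images:     (B, 3, H, W)
        # part_masks: (B, 4, H, W) binary masks for head / upper body / lower body / arms
        fmap = self.backbone(images)                                 # (B, C, h, w)
        masks = F.interpolate(part_masks, size=fmap.shape[-2:], mode="nearest")

        g = self.global_pool(fmap).flatten(1)                        # global feature (B, C)
        locals_ = []
        for k in range(masks.shape[1]):
            m = masks[:, k:k + 1]                                    # (B, 1, h, w)
            # masked average pooling over one body part
            part = (fmap * m).sum(dim=(2, 3)) / m.sum(dim=(2, 3)).clamp(min=1e-6)
            locals_.append(part)

        fused = self.fuse(torch.cat([g] + locals_, dim=1))           # fused identity feature
        head_feat = locals_[0]                                       # head feature, reused at inference
        return fused, head_feat


def match_score(q_id, g_id, q_head, g_head, alpha: float = 0.7):
    """Inference-time matching: combine identity-feature and head-feature
    cosine similarities; alpha is an assumed weighting, not from the paper."""
    sim_id = F.cosine_similarity(q_id, g_id, dim=-1)
    sim_head = F.cosine_similarity(q_head, g_head, dim=-1)
    return alpha * sim_id + (1 - alpha) * sim_head
```

    In such a scheme, the head feature acts as a clothing-insensitive cue that supplements the fused identity feature when gallery and query images differ in clothing.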

    Paper Information

    Received: Mar. 1, 2025

    Accepted: Aug. 22, 2025

    Published Online: Aug. 22, 2025

    The Author Email: YANG Jie (yangjie@qdu.edu.cn)

    DOI: 10.13306/j.1006-9798.2025.02.004
