Optics and Precision Engineering, Volume 32, Issue 3, 435 (2024)

Attention interaction based RGB-T tracking method

Wei WANG, Feiya FU, Hao LEI and Zili TANG*
Author Affiliations
  • The 63870 Unit of PLA, Weinan 714299, China

    In visible and thermal infrared (RGB-T) tracking, the two modalities must be fused effectively on top of a conventional tracking framework. This study introduces an attention-interaction-based RGB-T tracking method that uses the attention mechanism to enhance and fuse features from visible and infrared images. The method comprises a self-feature enhancement encoder, which strengthens the features of each single modality, and a cross-feature interaction decoder, which fuses the enhanced features of the two modalities; both the encoder and the decoder stack two layers of attention modules. To lighten the network, the conventional attention module is simplified by replacing its fully connected layers with 1×1 convolutions, and features from different convolutional layers are merged to fully exploit detail and semantic information. Comparative experiments on three datasets—GTOT, RGBT234, and LasHeR—show that the proposed method achieves superior tracking performance, confirming the effectiveness of the attention mechanism in RGB-T tracking.
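    The architecture described in the abstract lends itself to a compact illustration. Below is a minimal PyTorch sketch written under stated assumptions: the class names (ConvAttention, SelfEnhanceEncoder, CrossInteractionDecoder), channel counts, and exact fusion arithmetic are illustrative choices, not the authors' released code. The sketch shows the two stated ideas: an attention block whose fully connected layers are replaced by 1×1 convolutions, and a cross-modal interaction step in which the enhanced visible and thermal features re-weight each other before fusion.

# Minimal sketch (assumption: PyTorch; class names and fusion details are
# illustrative, not the authors' implementation).
import torch
import torch.nn as nn


class ConvAttention(nn.Module):
    """Channel attention with 1x1 convolutions in place of fully connected layers."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze spatial dims to 1x1
        self.fc = nn.Sequential(                     # 1x1 convs play the role of FC layers
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(self.pool(x))             # re-weight channels of the input


class SelfEnhanceEncoder(nn.Module):
    """Two stacked attention blocks that strengthen a single modality's features."""

    def __init__(self, channels: int):
        super().__init__()
        self.attn1 = ConvAttention(channels)
        self.attn2 = ConvAttention(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.attn2(self.attn1(x))


class CrossInteractionDecoder(nn.Module):
    """Fuse enhanced RGB and thermal features by letting each modality modulate the other."""

    def __init__(self, channels: int):
        super().__init__()
        self.rgb_gate = ConvAttention(channels)
        self.tir_gate = ConvAttention(channels)
        self.mix = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, rgb: torch.Tensor, tir: torch.Tensor) -> torch.Tensor:
        rgb_out = rgb + self.tir_gate(tir)           # thermal cues enrich the RGB branch
        tir_out = tir + self.rgb_gate(rgb)           # RGB cues enrich the thermal branch
        return self.mix(torch.cat([rgb_out, tir_out], dim=1))


if __name__ == "__main__":
    rgb = torch.randn(1, 256, 22, 22)                # example backbone feature maps
    tir = torch.randn(1, 256, 22, 22)
    enc = SelfEnhanceEncoder(256)
    dec = CrossInteractionDecoder(256)
    fused = dec(enc(rgb), enc(tir))
    print(fused.shape)                               # torch.Size([1, 256, 22, 22])

    In this sketch the 1×1 convolutions operate on pooled 1×1 feature maps, so they are functionally equivalent to fully connected layers while keeping the module fully convolutional and lightweight, which is the simplification the abstract describes.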


    Wei WANG, Feiya FU, Hao LEI, Zili TANG. Attention interaction based RGB-T tracking method[J]. Optics and Precision Engineering, 2024, 32(3): 435

    Paper Information

    Category:

    Received: Jul. 19, 2023

    Accepted: --

    Published Online: Apr. 2, 2024

    The Author Email: TANG Zili (tang_zili@qq.com)

    DOI: 10.37188/OPE.20243203.0435
