Optics and Precision Engineering, Volume 33, Issue 2, 324(2025)

Special attribute-based cross-modal interactive fusion network for RGBT tracking

Xiaoqiang SHAO, Hao LI*, Zhiyue LÜ, Bo MA, Mingqian LIU, and Zehui HAN
Author Affiliations
  • College of Electrical and Control Engineering, Xi’an University of Science and Technology, Xi'an 710054, China
    References (42)

    [1] LAURENSE V A, GOH J Y, GERDES J C. Path-tracking for autonomous vehicles at the limit of friction[C], 5586-5591(2017).

    [2] PAJARES G, MANUEL DE LA CRUZ J. A wavelet-based image fusion tutorial[J]. Pattern Recognition, 37, 1855-1872(2004).

    [3] AL-JARRAH M A, YASEEN M A, AL-DWEIK A et al. Decision fusion for IoT-based wireless sensor networks[J]. IEEE Internet of Things Journal, 7, 1313-1326(2020).

    [4] ILHAN H O, SERBES G, AYDIN N. Decision and feature level fusion of deep features extracted from public COVID-19 data-sets[J]. Applied Intelligence, 52, 8551-8571(2022).

    [5] ZHANG P Y, WANG D, LU H C et al. Learning adaptive attribute-driven representation for real-time RGB-T tracking[J]. International Journal of Computer Vision, 129, 2714-2729(2021).

    [6] XIAO Y, YANG M M, LI C L et al. Attribute-based progressive fusion network for RGBT tracking[J]. Proceedings of the AAAI Conference on Artificial Intelligence, 36, 2831-2838(2022).

    [7] LI H, WU X J. DenseFuse: a fusion approach to infrared and visible images[J]. IEEE Transactions on Image Processing, 28, 2614-2623(2019).

    [8] ZHANG L C, DANELLJAN M, GONZALEZ-GARCIA A et al. Multi-modal fusion for end-to-end RGB-T tracking[C], 2252-2261(2019).

    [9] NAM H, HAN B. Learning multi-domain convolutional neural networks for visual tracking[C], 4293-4302(2016).

    [10] LIU Q, LU X H, HE Z Y et al. Deep convolutional neural networks for thermal infrared object tracking[J]. Knowledge-Based Systems, 134, 189-198(2017).

    [11] LIU Q, LI X, HE Z Y et al. Learning deep multi-level similarity for thermal infrared object tracking[J]. IEEE Transactions on Multimedia, 23, 2114-2126(2020).

    [12] LIU Q, YUAN D, FAN N N et al. Learning dual-level deep representation for thermal infrared tracking[J]. IEEE Transactions on Multimedia, 25, 1269-1281(2022).

    [13] LIU Z, LIN Y T, CAO Y et al. Swin transformer: hierarchical vision transformer using shifted windows[C], 9992-10002(2021).

    [14] KELLY K[M]. The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future(2016).

    [15] ZHOU K L, CHEN L S, CAO X[M]. Improving Multispectral Pedestrian Detection by Addressing Modality Imbalance Problems, 787-803(2020).

    [16] PENG J C, ZHAO H T, HU Z W. Dynamic fusion network for RGBT tracking[J]. IEEE Transactions on Intelligent Transportation Systems, 24, 3822-3832(2023).

    [17] ZHU Y B, LI C L, TANG J et al. RGBT tracking by trident fusion network[J]. IEEE Transactions on Circuits and Systems for Video Technology, 32, 579-592(2021).

    [18] ZHANG Q, LIU X R, ZHANG T L. RGB-T tracking by modality difference reduction and feature re-selection[J]. Image and Vision Computing, 127, 104547(2022).

    [19] ZHU J W, LAI S M, CHEN X et al. Visual prompt multi-modal tracking[C], 9516-9526(2023).

    [20] LI M Y, ZHANG P, YAN M et al. Dynamic feature-memory transformer network for RGBT tracking[J]. IEEE Sensors Journal, 23, 19692-19703(2023).

    [21] LI C L, LIU L, LU A D et al[M]. Challenge-aware RGBT Tracking, 222-237(2020).

    [23] LI X, WANG W, HU X et al. Selective kernel networks[C], 510-519(2019).

    [24] WOO S, PARK J, LEE J Y et al. CBAM: convolutional block attention module[C], 3-19(2018).

    [25] DENG J, DONG W, SOCHER R et al. ImageNet: a large-scale hierarchical image database[C], 248-255(2009).

    [26] LI C L, CHENG H, HU S Y et al. Learning collaborative sparse representation for grayscale-thermal tracking[J]. IEEE Transactions on Image Processing, 25, 5743-5756(2016).

    [27] LI C L, LIANG X Y, LU Y J et al. RGB-T object tracking: Benchmark and baseline[J]. Pattern Recognition, 96, 106977(2019).

    [28] LI C L, XUE W L, JIA Y Q et al. LasHeR: a large-scale high-diversity benchmark for RGBT tracking[J]. IEEE Transactions on Image Processing, 31, 392-404(2021).

    [29] JUNG I, SON J, BAEK M et al. Real-time MDNet[C], 83-98(2018).

    [30] LI C L, LU A D, ZHENG A H et al. Multi-adapter RGBT tracking[C](2019).

    [31] DANELLJAN M, BHAT G, KHAN F S et al. ECO: efficient convolution operators for tracking[C], 6931-6939(2017).

    [32] YUN S, CHOI J, YOO Y et al. Action-decision networks for visual tracking with deep reinforcement learning[C], 1349-1358(2017).

    [33] ZHANG J M, MA S G, SCLAROFF S[M]. MEEM: robust tracking via multiple experts using entropy minimization, 188-203(2014).

    [34] ZHANG Z P, PENG H W. Deeper and wider Siamese networks for real-time visual tracking[C], 4586-4595(2019).

    [35] ZHU Y B, LI C L, LUO B et al. Dense feature aggregation and pruning for RGBT tracking[C], 465-472(2019).

    [36] PU S, SONG Y B, MA C et al. Deep attentive tracking via reciprocative learning[C], 1935-1945(2018).

    [37] HARE S, GOLODETZ S, SAFFARI A et al. Struck: structured output tracking with kernels[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 38, 2096-2109(2016).

    [38] LI C L, ZHAO N, LU Y J et al. Weighted sparse representation regularized graph learning for RGB-T object tracking[C], 1856-1864(2017).

    [39] LUKEŽIC A, VOJÍR T, ZAJC L C et al. Discriminative correlation filter with channel and spatial reliability[C], 4847-4856(2017).

    [40] DANELLJAN M, HÄGER G, KHAN F S et al. Discriminative scale space tracking[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 39, 1561-1575(2016).

    [42] HENRIQUES J F, CASEIRO R, MARTINS P et al. High-speed tracking with kernelized correlation filters[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 37, 583-596(2015).

    Paper Information

    Received: Jul. 9, 2024

    Accepted: --

    Published Online: Apr. 30, 2025

    The Author Email: Hao LI (2670815399@qq.com)

    DOI: 10.37188/OPE.20253302.0324
