Infrared and Laser Engineering, Volume 54, Issue 2, 20240373 (2025)
Sparse multiple hypothesis matching and model lightweighting for infrared multi-object tracking
[1] LUO H B, XU L Y, HUI B, et al. Status and prospect of target tracking based on deep learning [J]. Infrared and Laser Engineering, 2017, 46: 0502002.
[2] CHEN F L, DING Q H, LUO H B, et al. Anti-occlusion real-time target tracking algorithm employing spatio-temporal context [J]. Infrared and Laser Engineering, 2021, 50(1): 20200105. (in Chinese)
[3] CARION N, MASSA F, SYNNAEVE G, et al. End-to-end object detection with transformers [C]//European Conference on Computer Vision, 2020: 213-229.
[4] BEWLEY A, GE Z, OTT L, et al. Simple online and realtime tracking [C]//IEEE International Conference on Image Processing, 2016: 3464-3468.
[5] WOJKE N, BEWLEY A, PAULUS D. Simple online and realtime tracking with a deep association metric [C]//IEEE International Conference on Image Processing, 2017: 3645-3649.
[6] ZHANG Y, SUN P, JIANG Y, et al. ByteTrack: Multi-object tracking by associating every detection box [C]//European Conference on Computer Vision, 2022: 1-21.
[7] YANG F, ODASHIMA S, MASUI S, et al. Hard to track objects with irregular motions and similar appearances? Make it easier by buffering the matching space [C]//IEEE/CVF Winter Conference on Applications of Computer Vision, 2023: 4799-4808.
[8] LIU Z, WANG X, WANG C, et al. SparseTrack: Multi-object tracking by performing scene decomposition based on pseudo-depth [J/OL]. (2023-11-20) [2024-5-19]. https://arxiv.org/abs/2306.05238.
[9] SANDLER M, HOWARD A, ZHU M, et al. MobileNetV2: Inverted residuals and linear bottlenecks [C]//IEEE Conference on Computer Vision and Pattern Recognition, 2018: 4510-4520.
[10] HAN K, WANG Y, TIAN Q, et al. GhostNet: More features from cheap operations [C]//IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020: 1580-1589.
[11] WU T, TANG S, ZHANG R, et al. CGNet: A light-weight context guided network for semantic segmentation [J]. IEEE Transactions on Image Processing, 2020, 30: 1169-1179.
[12] CHEN J, KAO S, HE H, et al. Run, don't walk: Chasing higher FLOPS for faster neural networks [C]//IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2023: 12021-12031.
[13] KIM C, LI F, CIPTADI A, et al. Multiple hypothesis tracking revisited [C]//IEEE International Conference on Computer Vision, 2015: 4696-4704.
[15] LEE J, PARK S, MO S, et al. Layer-adaptive sparsity for the magnitude-based pruning [C]//International Conference on Learning Representations, 2020: 1-19.
[16] XIAO X, ZHOU Y, GONG Y J. RGB-'D' saliency detection with pseudo depth [J]. IEEE Transactions on Image Processing, 2018, 28: 2126-2139.
[17] QIU K, LAI Y, LIU S, et al. Self-supervised multi-view stereo via inter and intra network pseudo depth [C]//ACM International Conference on Multimedia, 2022: 2305-2313.
[18] SCHÖN R, LUDWIG K, LIENHART R. Impact of pseudo depth on open world object segmentation with minimal user guidance [C]//IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2023: 4809-4819.
[19] MALACH E, YEHUDAI G, SHALEV-SHWARTZ S, et al. Proving the lottery ticket hypothesis: Pruning is all you need [C]//International Conference on Machine Learning, 2020: 6682-6691.
[20] PRASAD D K, RAJAN D, RACHMAWATI L, et al. Video processing from electro-optical sensors for object detection and tracking in a maritime environment: A survey [J]. IEEE Transactions on Intelligent Transportation Systems, 2017, 18: 1993-2016.
[21] STADLER D, BEYERER J. Modelling ambiguous assignments for multi-person tracking in crowds [C]//IEEE/CVF Winter Conference on Applications of Computer Vision, 2022: 133-142.
[23] AHARON N, ORFAIG R, BOBROVSKY B Z. BoT-SORT: Robust associations multi-pedestrian tracking [J/OL]. (2022-07-07) [2024-5-19]. https://arxiv.org/abs/2206.14651.
Changqi XU, Haoxian WANG, Jun WANG, Zhiquan ZHOU. Sparse multiple hypothesis matching and model lightweighting for infrared multi-object tracking[J]. Infrared and Laser Engineering, 2025, 54(2): 20240373
Received: Nov. 13, 2024
Published Online: Mar. 14, 2025