Laser & Optoelectronics Progress, Volume 56, Issue 22, 221503 (2019)

Foreground-Aware Based Spatiotemporal Correlation Filter Tracking Algorithm

Yueyang Yu1,2,3,4,5,*, Zelin Shi1,2,3,4,5, and Yunpeng Liu2,3,4,5
Author Affiliations
  • 1School of Information Science and Technology, University of Science and Technology of China, Hefei, Anhui 230026, China
  • 2Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, Liaoning 110016, China
  • 3Institutes for Robotics and Intelligent Manufacturing, Chinese Academy of Sciences, Shenyang, Liaoning 110016, China
  • 4Key Laboratory of Opto-Electronic Information Processing, Chinese Academy of Sciences, Shenyang, Liaoning 110016, China
  • 5Key Laboratory of Image Understanding and Computer Vision, Shenyang, Liaoning 110016, China
    Figures & Tables (11)
    Temporal consistency constraint with object-area selection function, illustrated on the Tiger sequence
    Illustration with a one-dimensional vector as an example, assuming the target length is D=3. The left side is a one-dimensional signal xi of length L=5; the xi[Δτj] image is the result of all its cyclic shifts. Five one-dimensional vectors of length 3 are obtained by multiplying this image by the mask matrix P, where the first 3 rows are real positive samples of the same size as the object (a minimal code sketch of this construction follows the tables below)
    Comparison of training samples between traditional correlation filters and proposed method. (a) Cyclic-shift training samples of traditional correlation filter; (b) training samples of foreground-aware correlation filter
    Relationship between IoU value and tracking confidence score for carRace and ball sequences without re-detector. (a) Relationship between IoU value of carRace and tracking confidence score; (b) 502nd-frame tracking result of carRace; (c) 510th-frame tracking result of carRace; (d) relationship between IoU value of ball and tracking confidence score; (e) 209th-frame tracking result of ball; (f) 211th-frame tracking result of ball
    OPE precision and success-rate plots of trackers with traditional features on OTB-2013 dataset. (a) Precision plots; (b) success-rate plots
    OPE precision and success-rate plots of trackers with convolutional features on OTB-2013 dataset. (a) Precision plots; (b) success-rate plots
    Comparison of tracking results of SiamFC, CCOT, DSST, KCF, ECO, CF2, and proposed algorithm on 8 challenging sequences from OTB-2015 dataset. From top to bottom: singer2, girl2, tiger, bird1, dragonbaby, motorrolling, skiing, and soccer
    • Table 1. Success rate, precision, and tracking speed of tracking algorithm based on traditional features on OTB-2013 dataset

      Parameter                    Ours   ECO-HC   LCT    SRDCF   Staple-CA   Staple   BACF   DSST   KCF
      Mean OP /%                   85.5   81.0     81.3   78.1    77.6        75.4     85.4   67.0   62.3
      Mean DP /%                   89.2   87.4     84.8   83.8    83.3        79.3     78.5   74.0   74.0
      Tracking speed /(frame·s⁻¹)  25.3   42       18.5   5.8     35.3        76.6     23.2   20.4   171.8
    • Table 2. Performance evaluation of each tracker on OTB-2013 dataset

      Algorithm   SV      OV      OR      OCC     DEF     MB      FM      IR      BC      LR      IV
      ECO-HC      0.627   0.694   0.668   0.67    0.645   0.610   0.607   0.589   0.606   0.672   0.612
      Ours        0.654   0.667   0.632   0.669   0.664   0.605   0.612   0.637   0.625   0.544   0.626
      LCT         0.553   0.594   0.624   0.627   0.668   0.524   0.534   0.592   0.587   0.541   0.588
      SRDCF       0.587   0.555   0.599   0.627   0.635   0.601   0.569   0.566   0.587   0.541   0.576
      SAMF        0.507   0.555   0.559   0.612   0.625   0.461   0.483   0.525   0.520   0.526   0.513
      Staple-CA   0.574   0.562   0.594   0.600   0.632   0.569   0.566   0.601   0.587   0.497   0.596
      Staple      0.551   0.547   0.575   0.593   0.618   0.541   0.508   0.580   0.576   0.496   0.568
      KCF         0.427   0.550   0.495   0.514   0.534   0.497   0.459   0.497   0.535   0.537   0.493
      DSST        0.546   0.462   0.536   0.532   0.506   0.455   0.428   0.563   0.517   0.345   0.561
    • Table 3. Success rate, precision, and tracking speed of tracking algorithm based on convolutional features on OTB-2013 dataset

      Parameter                    Ours   ECO    MDNet   CCOT   DeepSRDCF   SiamFC   CFNet   CF2
      Mean OP /%                   89.4   88.7   91.1    83.2   79.5        79.1     76.9    74.0
      Mean DP /%                   90.0   93.0   94.8    89.9   84.9        81.5     80.7    89.1
      Tracking speed /(frame·s⁻¹)  10.6   9.8    0.8     0.8    0.2         83.7     78.4    10.2
    • Table 4. Evaluations of EAO, precision, and robustness of algorithms on VOT2016 dataset

      Algorithm    EAO     Accuracy   Robustness
      DSST         0.181   0.500      2.720
      ECO          0.375   0.530      0.730
      Staple       0.295   0.540      1.350
      MDNet        0.257   0.530      1.200
      BACF         0.223   0.560      1.880
      SRDCF        0.247   0.520      1.500
      ECO-HC       0.322   0.510      1.080
      DeepSRDCF    0.276   0.510      1.170
      CCOT         0.331   0.530      0.238
      SiamFC       0.277   0.549      0.382
      Ours         0.320   0.535      0.926
      Ours-deep    0.285   0.555      1.330
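
    As a minimal sketch of the mask-matrix construction described in the second figure caption above (a toy illustration under the stated sizes L = 5 and D = 3, with made-up signal values; not the paper's implementation), the NumPy snippet below forms all cyclic shifts of a one-dimensional signal and multiplies each shift by a binary cropping matrix P, so that only vectors of the object's size remain and the first rows are the real positive samples.

```python
import numpy as np

# Toy 1-D example mirroring the figure: signal length L = 5, target (object) length D = 3.
L, D = 5, 3
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # one-dimensional signal x_i (values are illustrative)

# All cyclic shifts of x: row j is the signal cyclically shifted by j samples (an L x L matrix).
shifts = np.stack([np.roll(x, -j) for j in range(L)])

# Mask (cropping) matrix P: a D x L binary matrix that keeps only the first D entries
# of each shifted vector, so every cropped sample has the same size as the object.
P = np.eye(D, L)

# Apply P to every cyclic shift: five vectors of length D = 3.
cropped = shifts @ P.T                     # shape (L, D)

# The first L - D + 1 = 3 rows are real positive samples (true sub-windows of the signal);
# the remaining rows contain wrap-around content introduced by the cyclic shift.
real_samples = cropped[:L - D + 1]
print(cropped)
print(real_samples)
```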
    Citation

    Yueyang Yu, Zelin Shi, Yunpeng Liu. Foreground-Aware Based Spatiotemporal Correlation Filter Tracking Algorithm[J]. Laser & Optoelectronics Progress, 2019, 56(22): 221503

    Paper Information

    Category: Machine Vision

    Received: Apr. 1, 2019

    Accepted: May 17, 2019

    Published Online: Nov. 2, 2019

    Author Email: Yu Yueyang (yuyueyang@sia.cn)

    DOI: 10.3788/LOP56.221503
