Acta Optica Sinica, Volume 43, Issue 20, 2012002 (2023)

Hyperspectral Target Tracking Based on Spectral Matching Dimensionality Reduction and Feature Fusion

Yecai Guo1,2, Jialu Cao1,2, Yingying Han3, Tianmeng Zhang4, Dong Zhao1,2,*, and Xu Tao1,2
Author Affiliations
  • 1School of Electronics & Information Engineering, Nanjing University of Information Science & Technology, Nanjing 210044, Jiangsu, China
  • 2School of Electronics and Information Engineering, Wuxi University, Wuxi 214105, Jiangsu, China
  • 3No. 703 Research Institute of China State Shipbuilding Corporation Limited, Harbin 150000, Heilongjiang, China
  • 4College of Aerospace and Civil Engineering, Harbin Engineering University, Harbin 150000, Heilongjiang, China

    Objective

    Spectral features in hyperspectral video (HSV) enhance the ability to distinguish similar targets. However, HSV is high-dimensional and contains a large amount of data, which makes feature extraction difficult and computationally expensive and thus hinders the application of target tracking technology to HSV. In recent years, the development of snapshot hyperspectral imaging has made it possible to acquire HSV, and many researchers have turned their attention to HSV target tracking. In many tracking scenarios, the target scale often changes, which causes tracking failure. How to track targets robustly under scale variations is therefore an urgent problem to be solved.

    Methods

    The algorithm is built on the correlation filtering framework and the scale-adaptive kernel correlation filter tracker. We compute the difference between the spectral curve of each pixel and the local spectral curve of the target and use the resulting error values to separate target pixels from background pixels. The target spectral curve is obtained by averaging the target pixels, and dimensionality reduction is realized by correlating the target spectral curve with the image. The dimensionally reduced image is then fed into MobileNet V2 to extract deep features. Meanwhile, the target area is identified by the local variance, and the three-dimensional histogram of oriented gradient (HOG) features of the target are enhanced. To preserve both the unique spectral information of hyperspectral images and the semantic information of deep features, we adopt channel convolution fusion to obtain more discriminative deep convolution HOG (DC-HOG) features, which are fed into the filter and adapted to scale variations through the scale pooling idea. Minimal sketches of the spectral matching step and the fusion step are given below.
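
    The spectral matching step can be made concrete with a short sketch. The Python/NumPy code below is only a minimal illustration under stated assumptions: the function name, the Euclidean spectral error with a quantile threshold for separating target and background pixels, the keep_ratio value, and the normalized correlation used for the reduction are introduced here for clarity and are not taken from the paper.

    import numpy as np

    def spectral_matching_reduction(hsi, ref_curve, keep_ratio=0.5):
        # hsi: (H, W, B) hyperspectral frame; ref_curve: (B,) local reference spectral curve of the target
        # keep_ratio: assumed fraction of lowest-error pixels treated as target pixels
        H, W, B = hsi.shape
        pixels = hsi.reshape(-1, B).astype(np.float64)

        # Spectral error between every pixel and the local reference curve
        err = np.linalg.norm(pixels - ref_curve[None, :], axis=1)

        # Segment target/background pixels by keeping the lowest-error fraction
        target_mask = err <= np.quantile(err, keep_ratio)

        # Target spectral curve: mean spectrum of the segmented target pixels
        target_curve = pixels[target_mask].mean(axis=0)

        # Dimensionality reduction: correlate every pixel spectrum with the target curve
        tc = (target_curve - target_curve.mean()) / (target_curve.std() + 1e-12)
        px = (pixels - pixels.mean(axis=1, keepdims=True)) / (pixels.std(axis=1, keepdims=True) + 1e-12)
        corr = (px @ tc) / B

        # Low-dimensional map used as input for deep feature and HOG extraction
        return corr.reshape(H, W)

    Likewise, one plausible reading of the channel convolution fusion that yields the DC-HOG features is a 1x1 convolution mixing the concatenated deep and HOG feature channels. The PyTorch module below is a hypothetical sketch of that reading rather than the paper's implementation; the class name and channel arguments are assumptions.

    import torch
    import torch.nn as nn

    class ChannelConvFusion(nn.Module):
        # Fuses deep features and HOG features along the channel axis (assumed 1x1 convolution)
        def __init__(self, deep_channels, hog_channels, out_channels):
            super().__init__()
            self.mix = nn.Conv2d(deep_channels + hog_channels, out_channels, kernel_size=1)

        def forward(self, deep_feat, hog_feat):
            # deep_feat: (N, C_d, H, W); hog_feat: (N, C_h, H, W); matching spatial size assumed
            fused = torch.cat([deep_feat, hog_feat], dim=1)
            # Fused DC-HOG-like features passed on to the correlation filter
            return self.mix(fused)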

    Results and Discussions

    Three hyperspectral target tracking algorithms are selected for comparison to verify the effectiveness of the proposed algorithm, and the results on the experimental sequences are visualized to illustrate its performance. Fig. 7 presents qualitative results on selected experimental sequences. In the book sequence, since the proposed algorithm adopts the scale pooling idea, it can estimate the target scale and track the target stably. In the excavator sequence, the proposed algorithm is more robust owing to the multi-feature fusion. The algorithm also shows better adaptability when the car and face sequences are challenged by scale variations. We quantitatively evaluate the algorithm from two aspects, precision and success rate. Tables 1 and 2 list the precision and success rate values of the four algorithms, respectively. Fig. 8 shows the precision and success rate curves of each algorithm on the selected test sequences, and Figs. 9 and 10 display the precision and success rate curves for the scale variation challenge and the out-of-plane rotation challenge, respectively. The precision and success rate results indicate that the proposed algorithm performs well on the test sequences. As shown in Fig. 10, the proposed algorithm ranks first in precision and success rate over the total test sequences; specifically, the precision is improved by 3.3% and the success rate by 2.2% compared with material-based object tracking in hyperspectral video (MHT). Owing to the fused features, the proposed algorithm is more robust. The precision and success rate of the proposed algorithm under scale variations are 19.8% and 14.0% higher than those of the second-place algorithm, showing excellent adaptability (Fig. 11). The other three algorithms perform slightly worse under the scale variation challenge because they lack corresponding scale estimation modules. Additionally, as shown in Fig. 12, because the proposed algorithm lacks a corresponding coordinate affine transformation strategy, its precision under the out-of-plane rotation challenge is 1.31% lower than that of MHT, but it still ranks first in the success rate. Table 3 reports the precision and success rates of the ablation experiments, which show that the proposed methods improve the target tracking robustness.

    Conclusions

    To solve the tracking failure caused by scale variations in hyperspectral target tracking tasks, we propose a hyperspectral target tracking algorithm based on spectral matching dimensionality reduction and feature fusion. Spectral matching dimensionality reduction provides low-dimensional feature input for the network and reduces computational complexity, and the DC-HOG features improve target discriminability. Experimental results demonstrate that the proposed algorithm outperforms the compared algorithms and handles scale variations well. Future research will explore HSV target tracking algorithms with better performance for challenges such as background clutter and out-of-plane rotation.

    Paper Information

    Category: Instrumentation, Measurement and Metrology

    Received: Apr. 4, 2023

    Accepted: May 19, 2023

    Published Online: Oct. 23, 2023

    Author Email: Dong Zhao (dzhao@cwxu.edu.cn)

    DOI: 10.3788/AOS230776
