Optics and Precision Engineering, Volume 28, Issue 4, 954 (2020)
Improved SIFT feature extraction and matching technology based on hyperspectral image
To address the small number of feature points and high error rate of the traditional Scale Invariant Feature Transform (SIFT) algorithm, an improved SIFT algorithm based on hyperspectral images is proposed. First, exploiting the fact that the different wavebands of a hyperspectral image share the same macroscopic characteristics, the waveband images were used in place of the Gaussian-blurred images in the Gaussian pyramid of the traditional SIFT algorithm. This considerably increased the number of real, significant feature points detected. Second, the traditional SIFT algorithm and several of its improved variants construct the feature descriptor only from the pixel information in the neighborhood of the target pixel and ignore the position of the pixel. In this study, the position of the target pixel was included in the feature descriptor: the neighborhood pixel information was first used for coarse matching, and the position information in the descriptor was then used for fine matching. Simulation results showed that, with the ratio of the suboptimal value limited, constructing the Gaussian pyramid from hyperspectral images significantly increased the number of extracted feature points, allowing more extreme points in the image to be detected. Furthermore, with the position of the target pixel added to the feature descriptor as the criterion for the second matching stage, the number of correct matches was at least 59 times that of the original method, greatly improving the matching performance of the algorithm.
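The two-stage matching idea from the abstract — a coarse pass that compares neighborhood descriptors with a ratio test on the suboptimal (second-best) candidate, followed by a fine pass that uses the keypoint positions stored in the descriptor — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `two_stage_match` and the specific fine-matching rule (rejecting matches whose displacement deviates from the median displacement) are assumptions for demonstration.

```python
import numpy as np

def two_stage_match(desc1, pos1, desc2, pos2, ratio=0.8, pos_tol=5.0):
    """Hypothetical sketch of two-stage SIFT matching.

    Coarse stage: nearest-neighbor descriptor matching with a
    Lowe-style ratio test against the suboptimal (second-best) match.
    Fine stage: keep only matches whose position displacement agrees
    with the dominant (median) displacement -- an assumed rule for
    how position information might refine the coarse matches.
    """
    coarse = []
    for i, d in enumerate(desc1):
        # Euclidean distances from descriptor i to all candidates.
        dists = np.linalg.norm(desc2 - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        # Accept only if the best match is clearly better than the
        # second-best one (ratio of the suboptimal value limited).
        if dists[best] < ratio * dists[second]:
            coarse.append((i, best))
    if not coarse:
        return []
    # Fine stage: displacement of each matched pair of positions.
    disp = np.array([pos2[j] - pos1[i] for i, j in coarse])
    med = np.median(disp, axis=0)
    # Reject matches that deviate from the dominant displacement.
    return [m for m, d in zip(coarse, disp)
            if np.linalg.norm(d - med) <= pos_tol]
```

With identical descriptors and a uniform 10-pixel horizontal shift, a single match whose position is inconsistent with the others survives the coarse ratio test but is removed by the position-based fine stage.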
DING Guo-shen, QIAO Yan-li, YI Wei-ning, DU Li-li, FANG Wei. Improved SIFT feature extraction and matching technology based on hyperspectral image[J]. Optics and Precision Engineering, 2020, 28(4): 954
Received: Oct. 31, 2019
Accepted: --
Published Online: Jul. 2, 2020
The Author Email: Guo-shen DING (guoshenahu@163.com)