Optics and Precision Engineering, Volume 26, Issue 7, 1774 (2018)
A terracotta image partition matching method based on learned invariant feature transform
A novel feature partition matching scheme for two-view Terracotta Warrior images was presented in this paper to address the high false-matching rate and low feature-matching efficiency encountered during 3D reconstruction. The new scheme was as follows: First, features were extracted from the complete Terracotta Warrior image using the learned invariant feature transform (LIFT) method. Second, the position of the dividing line at the head of the warrior image was determined by applying the proposed prior-knowledge-based feature point distribution curve, and the extracted features were then divided into head and torso features along this line. Third, regional feature matching was performed using the Euclidean distance, and the random sample consensus (RANSAC) algorithm was subsequently applied to filter mismatched points out of the matching results. Experimental results show that in Terracotta Warrior image feature extraction and matching, the correct matching rate of the new scheme can reach 98%, an improvement of approximately 20% over the SIFT and SURF methods; the repeatability rate of the feature points is increased by 10%, while the iteration time of RANSAC is decreased by 50%. The new scheme is also more robust to changes in scale, illumination, and viewing angle. Therefore, the proposed scheme achieves correct matching of feature points with sufficient accuracy and supports robust 3D reconstruction of Terracotta Warrior images.
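The partitioned matching and RANSAC filtering stages of this pipeline can be sketched in a few lines of OpenCV. The following is a minimal illustration under stated assumptions, not the authors' implementation: SIFT stands in for LIFT (whose descriptors are likewise 128-dimensional and compared by Euclidean distance), the dividing lines `split_y1` and `split_y2` are hypothetical inputs assumed to come from the prior-knowledge feature point distribution curve, and a standard ratio test is added to discard ambiguous matches.

```python
import cv2
import numpy as np

def match_region(kp1, des1, kp2, des2):
    """Euclidean-distance (L2) matching with a ratio test."""
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = []
    for pair in matcher.knnMatch(des1, des2, k=2):
        # Keep a match only if it is clearly better than the runner-up.
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            good.append(pair[0])
    return good

def split_by_line(kps, des, split_y):
    """Partition keypoints/descriptors at the horizontal line y = split_y."""
    head = [i for i, k in enumerate(kps) if k.pt[1] < split_y]
    torso = [i for i, k in enumerate(kps) if k.pt[1] >= split_y]
    take = lambda idx: ([kps[i] for i in idx], des[idx])
    return take(head), take(torso)

def partition_match(img1, img2, split_y1, split_y2):
    """Region-wise matching of two views, filtered with RANSAC."""
    detector = cv2.SIFT_create()  # placeholder for a LIFT extractor
    kp1, des1 = detector.detectAndCompute(img1, None)
    kp2, des2 = detector.detectAndCompute(img2, None)

    # Match head features against head features, torso against torso.
    points1, points2 = [], []
    for (k1, d1), (k2, d2) in zip(split_by_line(kp1, des1, split_y1),
                                  split_by_line(kp2, des2, split_y2)):
        for m in match_region(k1, d1, k2, d2):
            points1.append(k1[m.queryIdx].pt)
            points2.append(k2[m.trainIdx].pt)

    # RANSAC rejects matches inconsistent with a single homography
    # (needs at least 4 tentative matches).
    pts1, pts2 = np.float32(points1), np.float32(points2)
    H, mask = cv2.findHomography(pts1, pts2, cv2.RANSAC, 5.0)
    inliers = mask.ravel().astype(bool)
    return pts1[inliers], pts2[inliers]
```

Matching within corresponding regions shrinks each candidate pool, which is consistent with the reported drop in RANSAC iteration time: fewer cross-region false candidates survive to the filtering stage.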
FENG Jun, YAN Yu-yu, ZHAO Yan, XIAO Fang, LIU Xiao-ning. A terracotta image partition matching method based on learned invariant feature transform[J]. Optics and Precision Engineering, 2018, 26(7): 1774
Received: Nov. 27, 2017
Published Online: Oct. 2, 2018