Optics and Precision Engineering, Volume 31, Issue 5, 621 (2023)
A visual SLAM algorithm based on adaptive inertial navigation assistant feature matching
This paper proposes a feature-matching algorithm based on an adaptive search radius to improve the accuracy of SLAM localization and mapping. The method overcomes the problem that the feature-matching search radius is fixed in traditional algorithms, which leads to a high mismatch rate in visual odometry under highly dynamic motion. The algorithm first extracts and matches features from the left and right images of the binocular camera to obtain the three-dimensional coordinates of the map points. Second, the camera pose is predicted from the pre-integrated measurements of the inertial measurement unit. Then, the covariance of the predicted pose is calculated according to the law of error propagation. Finally, the predicted pose is used to project the map points onto the image to obtain the corresponding pixel coordinates, and the most likely search radius for each map point is determined from the error in these pixel coordinates. Experimental results show that this method effectively reduces the feature-matching search radius and significantly improves the accuracy of image feature matching. The pose accuracy of the tracking thread in the ORB-SLAM3 system is improved by approximately 38.09%, and the overall pose accuracy of the system is improved by approximately 16.38%. The method provides an adaptive region constraint for each feature point, improves the accuracy of feature-point matching and of pose estimation for the whole SLAM system, and enables the construction of a more accurate dense map.
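To make the projection-and-radius step concrete, the following is a minimal sketch, not the authors' code: it assumes a pinhole camera model and a 6×6 predicted-pose covariance (3 rotation, 3 translation) from IMU pre-integration, propagates that covariance through the projection Jacobian into pixel space, and takes an n-sigma bound as the per-point search radius. All function and variable names are illustrative.

```python
import numpy as np

def project(K, R, t, Pw):
    """Pinhole projection of a world point Pw under pose (R, t) with intrinsics K."""
    P_rot = R @ Pw            # rotated point (camera frame, before translation)
    Pc = P_rot + t            # camera-frame coordinates
    u = K[0, 0] * Pc[0] / Pc[2] + K[0, 2]
    v = K[1, 1] * Pc[1] / Pc[2] + K[1, 2]
    return np.array([u, v]), P_rot, Pc

def adaptive_search_radius(K, R, t, Pw, pose_cov, n_sigma=3.0):
    """Per-point feature-matching search radius (pixels) derived from the
    predicted-pose covariance (6x6: rotation then translation), e.g. as
    obtained by error propagation of the IMU pre-integration."""
    _, P_rot, Pc = project(K, R, t, Pw)
    X, Y, Z = Pc
    fx, fy = K[0, 0], K[1, 1]
    # Jacobian of the pixel coordinates w.r.t. the camera-frame point
    J_uv_pc = np.array([[fx / Z, 0.0, -fx * X / Z**2],
                        [0.0, fy / Z, -fy * Y / Z**2]])
    # Jacobian of the camera-frame point w.r.t. a small pose perturbation
    # (R <- Exp(dtheta) R, t <- t + dt  =>  dPc = -[R Pw]_x dtheta + dt)
    rx, ry, rz = P_rot
    skew = np.array([[0.0, -rz, ry],
                     [rz, 0.0, -rx],
                     [-ry, rx, 0.0]])
    J_pc_pose = np.hstack([-skew, np.eye(3)])   # 3x6
    J = J_uv_pc @ J_pc_pose                     # 2x6
    pix_cov = J @ pose_cov @ J.T                # 2x2 pixel-coordinate covariance
    # Worst-case standard deviation scaled to an n-sigma confidence radius
    return n_sigma * float(np.sqrt(np.max(np.linalg.eigvalsh(pix_cov))))
```

In a tracking loop, each map point visible in the predicted frame would receive its own radius, so points whose projections are more uncertain are searched over a wider window, while well-constrained points keep a tight window and a low mismatch rate.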
Xiaoxue JIA, Dongqing ZHAO, Letian ZHANG, Guorui XIAO, Qing XU. A visual SLAM algorithm based on adaptive inertial navigation assistant feature matching[J]. Optics and Precision Engineering, 2023, 31(5): 621
Category: Three-dimensional topographic mapping
Received: May 27, 2022
Accepted: --
Published Online: Apr. 4, 2023
Author Email: ZHAO Dongqing (dongqingtree@qq.com)