Journal of Optoelectronics · Laser, Vol. 33, Issue 1, 14 (2022)
A SLAM method for feature point matching network based on attention mechanism
Feature point matching in current simultaneous localization and mapping (SLAM) systems is generally sensitive to viewpoint changes, which makes feature points difficult to match, degrades matching accuracy, and ultimately harms both the construction of three-dimensional (3D) point cloud maps and the estimation of camera motion pose. To address this, this paper presents a SLAM method based on an attention-mechanism feature point matching network. Compared with existing SLAM systems, the innovation of this work is that the feature point matching method of the visual odometry module is replaced with the attention-based matching network, which is then combined with the traditional feature point extraction method to form a new visual odometry front end and, in turn, a new SLAM system. First, the extracted feature points and descriptor vectors are encoded and refined by a graph attention neural network to obtain matching descriptors. Then, a score matrix is built from the matching descriptors, and an optimal transport algorithm is used to solve for the optimal score matrix. Finally, the optimal matching point pairs are computed, and camera localization, mapping, and loop closure detection are completed based on these pairs. Experimental results show that, under unstable viewing angles, the proposed method significantly improves the accuracy of the camera trajectory and of the estimated camera motion pose.
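The pipeline described in the abstract (encode keypoints and descriptors, refine them with graph attention, score candidate correspondences, and resolve the assignment with optimal transport) can be illustrated with the minimal PyTorch sketch below. This is not the authors' implementation: the layer sizes, number of attention layers, Sinkhorn iteration count, absence of a dustbin row/column, and the mutual-best-match threshold are all illustrative assumptions.

```python
# Minimal sketch (not the paper's code) of an attention-based feature
# point matcher: positional encoding + self/cross graph attention to get
# matching descriptors, a score matrix, and log-space Sinkhorn iterations
# (entropic optimal transport) to obtain a soft assignment.
import torch
import torch.nn as nn


class AttentionalMatcher(nn.Module):
    def __init__(self, desc_dim=256, num_heads=4, num_layers=3):
        super().__init__()
        # Encode 2D keypoint positions and fuse them with the descriptors.
        self.pos_encoder = nn.Sequential(
            nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, desc_dim))
        # Alternating self-/cross-attention layers over the two keypoint sets.
        self.layers = nn.ModuleList(
            nn.MultiheadAttention(desc_dim, num_heads, batch_first=True)
            for _ in range(2 * num_layers))

    def forward(self, kpts0, desc0, kpts1, desc1, sinkhorn_iters=20):
        # kpts*: (B, N, 2) pixel coordinates, desc*: (B, N, D) descriptors.
        x0 = desc0 + self.pos_encoder(kpts0)
        x1 = desc1 + self.pos_encoder(kpts1)
        for i, attn in enumerate(self.layers):
            if i % 2 == 0:                      # self-attention within each image
                x0 = x0 + attn(x0, x0, x0)[0]
                x1 = x1 + attn(x1, x1, x1)[0]
            else:                               # cross-attention between images
                x0 = x0 + attn(x0, x1, x1)[0]
                x1 = x1 + attn(x1, x0, x0)[0]
        # Score matrix from the matching descriptors, normalized in log
        # space with Sinkhorn iterations to approximate optimal transport.
        scores = torch.einsum('bnd,bmd->bnm', x0, x1) / x0.shape[-1] ** 0.5
        log_p = scores
        for _ in range(sinkhorn_iters):
            log_p = log_p - torch.logsumexp(log_p, dim=2, keepdim=True)
            log_p = log_p - torch.logsumexp(log_p, dim=1, keepdim=True)
        return log_p                            # (B, N, M) log assignment matrix


def mutual_matches(log_p, threshold=-2.0):
    # Keep pairs that are mutual best matches and above a score threshold.
    best1 = log_p.argmax(dim=2)                 # best column for each row
    best0 = log_p.argmax(dim=1)                 # best row for each column
    rows = torch.arange(log_p.shape[1])
    mutual = best0[0, best1[0]] == rows
    keep = mutual & (log_p[0, rows, best1[0]] > threshold)
    return torch.stack([rows[keep], best1[0][keep]], dim=1)


if __name__ == "__main__":
    matcher = AttentionalMatcher()
    k0, d0 = torch.rand(1, 100, 2), torch.rand(1, 100, 256)
    k1, d1 = torch.rand(1, 120, 2), torch.rand(1, 120, 256)
    pairs = mutual_matches(matcher(k0, d0, k1, d1))
    print(pairs.shape)  # (num_matches, 2) index pairs into the two keypoint sets
```

In a full visual odometry front end, the matched point pairs returned by such a matcher would feed the usual pose estimation, mapping, and loop closure detection stages; the sketch above only covers the matching step.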
ZU Chenyang, LIU Fenglian, WANG Riwei. A SLAM method for feature point matching network based on attention mechanism[J]. Journal of Optoelectronics · Laser, 2022, 33(1): 14
Received: Aug. 10, 2021
Accepted: --
Published Online: Oct. 9, 2024
The Author Email: LIU Fenglian (lflian@tjut.edu.cn)