Optics and Precision Engineering, Volume 33, Issue 8, 1259 (2025)
A visual inertial SLAM system based on key planes with heterogeneous feature fusion
Planar features are widely used in structured environments as high-level geometric features and are a good complement to most Simultaneous Localization and Mapping (SLAM) systems. To address the new errors introduced when fusing point features with planar features and the possibility of plane degradation, this paper proposed a monocular visual-inertial SLAM system that fuses heterogeneous features. Firstly, feature points were extracted from grayscale images; secondly, the set of feature points was triangulated and the triangulation results were transformed to the world coordinate system. Next, the initialization process was modeled as a constrained optimization problem and solved in a distributed fashion with the alternating direction method of multipliers (ADMM). Then, similar planes were clustered, and the planes were fitted with the proposed planar collision probability model to obtain the corresponding bounded-plane parameters. Finally, geometric constraints on the plane features were introduced into the factor graph, and the camera motion and the plane parameters were optimized simultaneously through the error model. Compared with the typical visual-inertial SLAM system VINS, the mean absolute trajectory error of the proposed system was reduced by 50% on the EuRoC dataset and by 40% on the TUM-VI dataset. The method works stably and continuously in structured scenes and improves localization accuracy and robustness in weakly textured regions.
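The abstract does not give implementation details, so the following is only a minimal Python sketch of the plane-fitting step it describes: fitting a plane in Hesse normal form to triangulated world-frame points and evaluating signed point-to-plane distances of the kind that could serve as geometric error terms in a factor graph. The function names fit_plane and point_to_plane_residuals are hypothetical, and the paper's bounded-plane and planar collision probability model is not reproduced here.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit to an (N, 3) array of world-frame points.

    Returns a unit normal n and offset d such that n . x + d ~= 0
    for points x on the plane (Hesse normal form).
    """
    centroid = points.mean(axis=0)
    # The smallest right-singular vector of the centered points is the normal.
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]
    d = -n @ centroid
    return n, d

def point_to_plane_residuals(points, n, d):
    """Signed point-to-plane distances, usable as plane-constraint errors."""
    return points @ n + d

if __name__ == "__main__":
    # Synthetic example: noisy samples from the plane z = 0.1x + 0.2y + 1.
    rng = np.random.default_rng(0)
    xy = rng.uniform(-1.0, 1.0, size=(200, 2))
    z = 0.1 * xy[:, 0] + 0.2 * xy[:, 1] + 1.0 + 0.005 * rng.standard_normal(200)
    pts = np.column_stack([xy, z])

    n, d = fit_plane(pts)
    res = point_to_plane_residuals(pts, n, d)
    print("normal:", n, "offset:", d, "rms residual:", np.sqrt(np.mean(res**2)))
```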
Yehu SHEN, Yifan HE, Jikun WEI, Daqing ZHANG. A visual inertial SLAM system based on key planes with heterogeneous feature fusion[J]. Optics and Precision Engineering, 2025, 33(8): 1259
Received: Dec. 9, 2024
Published Online: Jul. 1, 2025
The Author Email: Yifan HE (heyifan@reconova.com)