Optics and Precision Engineering, Volume 33, Issue 6, 993 (2025)
Pose decoupled RGBD-SLAM based on point-line-plane features
[1] GONG K, XU X, CHEN X Q. Binocular vision SLAM with fused point and line features in weak texture environment[J]. Opt. Precision Eng., 32, 752-763(2024).
[2] ZHUANG L C, ZHONG X R, XU L J et al. Visual SLAM for unmanned aerial vehicles: localization and perception[J]. Sensors, 24, 2980(2024).
[3] KLEIN G, MURRAY D. Parallel tracking and mapping for small AR workspaces[C], 225-234(2007).
[4] ENGEL J, KOLTUN V, CREMERS D. Direct sparse odometry[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 40, 611-625(2018).
[5] MUR-ARTAL R, TARDÓS J D. ORB-SLAM2: an open-source SLAM system for monocular, stereo, and RGB-D cameras[J]. IEEE Transactions on Robotics, 33, 1255-1262(2017).
[6] SHU F W, WANG J X, PAGANI A et al. Structure PLP-SLAM: efficient sparse mapping and localization using point, line and plane for monocular, RGB-D and stereo cameras[C], 2105-2112(2023).
[7] GROMPONE VON GIOI R, JAKUBOWICZ J, MOREL J M et al. LSD: a fast line segment detector with a false detection control[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32, 722-732(2010).
[8] PUMAROLA A, VAKHITOV A, AGUDO A et al. PL-SLAM: real-time monocular visual SLAM with points and lines[C], 4503-4508(2017).
[9] QIN T, LI P L, SHEN S J. VINS-mono: a robust and versatile monocular visual-inertial state estimator[J]. IEEE Transactions on Robotics, 34, 1004-1020(2018).
[10] HE Y J, ZHAO J, GUO Y et al. PL-VIO: tightly-coupled monocular visual-inertial odometry using point and line features[J]. Sensors, 18, 1159(2018).
[12] AKINLAR C, TOPAL C. EDLines: a real-time line segment detector with a false detection control[J]. Pattern Recognition Letters, 32, 1633-1642(2011).
[13] YANG G, MENG W D, HOU G D et al. Real-time visual-inertial odometry based on point-line feature fusion[J]. Gyroscopy and Navigation, 14, 339-352(2023).
[14] ZHANG X Y, WANG W, QI X Y et al. Point-plane SLAM using supposed planes for indoor environments[J]. Sensors, 19, 3795(2019).
[15] KIM P, COLTIN B, KIM H J. Low-drift visual odometry in structured environments by decoupling rotational and translational motion[C], 7247-7253(2018).
[16] RUBLEE E, RABAUD V, KONOLIGE K et al. ORB: an efficient alternative to SIFT or SURF[C], 2564-2571(2011).
[17] WANG L L, ZHU X Y, MA D. Robot SLAM algorithm based on visual-inertial fusion of point-line features[J]. Journal of Chinese Inertial Technology, 30, 730-737(2022).
[18] FENG C, TAGUCHI Y, KAMAT V R. Fast plane extraction in organized point clouds using agglomerative hierarchical clustering[C], 6218-6225(2014).
[19] ZHOU Y, KNEIP L, RODRIGUEZ C et al. Divide and conquer: efficient density-based tracking of 3d sensors in manhattan worlds[C], 3-19(2017).
[20] HANDA A, WHELAN T, MCDONALD J et al. A benchmark for RGB-D visual odometry, 3D reconstruction and SLAM[C], 1524-1531(2014).
[21] STURM J, ENGELHARD N, ENDRES F et al. A benchmark for the evaluation of RGB-D SLAM systems[C], 573-580(2012).
Gang YANG, Wengang ZHANG, Tianle CAO. Pose decoupled RGBD-SLAM based on point-line-plane features[J]. Optics and Precision Engineering, 2025, 33(6): 993
Received: Aug. 6, 2024
Published Online: Jun. 16, 2025
The Author Email: Wengang ZHANG (wengz0208@163.com)