Optics and Precision Engineering, Volume 32, Issue 6, 857 (2024)

RGB-D SLAM method of dynamic scene based on instance segmentation and optical flow

Chenggen WANG1, Jinlong SHI1,*, Haowei ZHU1, Suqin BAI1, Yunhan SUN2, Jiawen LU1, and Shucheng HUANG1
Author Affiliations
  • 1School of Computer Science and Engineering, Jiangsu University of Science and Technology, Zhenjiang 22000, China
  • 2State Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 10046, China

    A new method for improving the accuracy of camera pose estimation in RGB-D SLAM of dynamic scenes was proposed, based on instance segmentation and optical flow. The first step was to detect objects in the scene using instance segmentation, eliminate non-rigid objects, and construct a semantic map. The second step involved calculating motion residuals from optical flow information, detecting dynamic rigid objects, and tracking them in the semantic map. Next, dynamic feature points on non-rigid objects and dynamic rigid objects were removed in each frame, and the camera pose was optimized using the remaining stable feature points. Finally, the static background was reconstructed using the TSDF model, and the dynamic rigid objects were displayed as point clouds. Tests conducted on the TUM and Bonn datasets demonstrate that, compared with the state-of-the-art work ACEFusion, the proposed method improves camera pose accuracy by approximately 43%. The results show that retaining feature points of dynamic rigid objects while they are in a static state can significantly improve camera pose estimation. The dense mapping experiments show that our method performs better in dynamic 3D reconstruction, with an average reconstruction error of 0.042 m. Our code is available at https://github.com/wawcg/dy_wcg.
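    The second step above rests on a standard idea: optical flow caused purely by camera motion can be predicted from depth and the estimated pose, so points whose observed flow deviates from that prediction are likely moving independently. Below is a minimal numpy sketch of that motion-residual test on synthetic data; it is not the authors' implementation, and the intrinsics, pose, and 3-pixel threshold are illustrative assumptions.

```python
import numpy as np

def project(K, P):
    """Project 3D camera-frame points (N,3) to pixel coordinates (N,2)."""
    p = (K @ P.T).T
    return p[:, :2] / p[:, 2:3]

def motion_residuals(K, T, pts_cam, observed_flow):
    """Per-point residual: observed flow minus the flow predicted by the
    camera motion T alone. Large residuals indicate dynamic points."""
    p1 = project(K, pts_cam)
    pts2 = (T[:3, :3] @ pts_cam.T).T + T[:3, 3]   # points in frame-2 camera coords
    predicted_flow = project(K, pts2) - p1
    return np.linalg.norm(observed_flow - predicted_flow, axis=1)

# Synthetic scene: 4 static points and 1 independently moving point.
K = np.array([[525.0, 0.0, 319.5],
              [0.0, 525.0, 239.5],
              [0.0,   0.0,   1.0]])                # assumed pinhole intrinsics
T = np.eye(4)
T[0, 3] = 0.05                                     # camera moves 5 cm in x
pts = np.array([[0.2, 0.1, 2.0], [-0.3, 0.2, 3.0],
                [0.1, -0.2, 2.5], [0.4, 0.3, 4.0],
                [0.0, 0.0, 2.0]])

# Observed flow: static points follow the camera motion exactly;
# the last point additionally moves 8 px on its own.
p1 = project(K, pts)
pts2 = (T[:3, :3] @ pts.T).T + T[:3, 3]
obs = project(K, pts2) - p1
obs[4] += np.array([8.0, 0.0])

res = motion_residuals(K, T, pts, obs)
dynamic = res > 3.0                                # pixel threshold (assumed)
print(dynamic)                                     # only the last point is flagged
```

    In the full system these residuals are aggregated per instance mask, so an object is declared dynamic or static as a whole rather than point by point.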


    Chenggen WANG, Jinlong SHI, Haowei ZHU, Suqin BAI, Yunhan SUN, Jiawen LU, Shucheng HUANG. RGB-D SLAM method of dynamic scene based on instance segmentation and optical flow[J]. Optics and Precision Engineering, 2024, 32(6): 857

    Paper Information

    Received: Sep. 7, 2023

    Accepted: --

    Published Online: Apr. 19, 2024

    The Author Email: SHI Jinlong (shi_jinlong@163.com)

    DOI: 10.37188/OPE.20243206.0857
