Laser & Optoelectronics Progress, Volume 56, Issue 5, 052804 (2019)

Terrain Classification of LiDAR Point Cloud Based on Multi-Scale Features and PointNet

Zhongyang Zhao1, Yinglei Cheng1,*, Xiaosong Shi1, Xianxiang Qin1, and Xin Li2
Author Affiliations
  • 1 Information and Navigation College, Air Force Engineering University, Xi'an, Shaanxi 710077, China
  • 2 School of Science, Northeast Electric Power University, Jilin, Jilin 132000, China

    To address the terrain classification problem for light detection and ranging (LiDAR) point cloud data in complex scenes, a deep neural network model based on multi-scale features and PointNet is proposed. The method improves the ability of PointNet to extract local features and realizes the automatic classification of LiDAR point clouds in complex scenes. A multi-scale network is added to the PointNet architecture to extract local features of each point; the local features at different scales are fused into a multi-dimensional feature through a fully connected layer and combined with the global features extracted by PointNet, and the class score of each point is regressed to complete the point cloud classification. The proposed deep neural network model is verified using the Semantic3D dataset and the Vaihingen dataset provided by ISPRS. The results show that the proposed algorithm achieves higher classification accuracy than other neural networks for point cloud classification.
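    The pipeline described in the abstract — per-point local features extracted at several neighborhood scales, fused with a PointNet-style global feature, then mapped to per-class scores — can be illustrated with a minimal NumPy sketch. All function names here are hypothetical, and the learned components (the shared per-point MLP and the fully connected scoring layer) are replaced by a simple max-pooling aggregation and a random weight matrix, purely to show the data flow and tensor shapes:

```python
import numpy as np

rng = np.random.default_rng(0)

def local_feature(points, center, radius):
    """Stand-in for learned local feature extraction at one scale:
    max-pool the coordinates of all neighbors within `radius` of `center`.
    Max pooling is a symmetric function, so the result is invariant to
    the ordering of points, as in PointNet."""
    dists = np.linalg.norm(points - center, axis=1)
    neighbors = points[dists <= radius]  # always non-empty: includes center
    return neighbors.max(axis=0)

def classify(points, radii, n_classes=4):
    """Per-point class scores from multi-scale local features fused
    with a single global feature (PointNet-style global max pooling)."""
    global_feat = points.max(axis=0)  # global feature over the whole cloud
    # Stand-in for the fully connected layer: one fixed random weight matrix
    # mapping the concatenated feature to n_classes scores.
    feat_dim = 3 * (len(radii) + 1)
    W = rng.normal(size=(feat_dim, n_classes))
    scores = np.empty((len(points), n_classes))
    for i, p in enumerate(points):
        locals_ = [local_feature(points, p, r) for r in radii]  # one per scale
        feat = np.concatenate(locals_ + [global_feat])  # fuse local + global
        scores[i] = feat @ W  # regress class scores
    return scores

points = rng.uniform(size=(100, 3))          # toy point cloud, 100 points in 3-D
scores = classify(points, radii=[0.1, 0.3])  # two neighborhood scales
labels = scores.argmax(axis=1)               # predicted class per point
print(scores.shape, labels.shape)            # (100, 4) (100,)
```

In the actual model, the max-pooled coordinates would be replaced by features from a shared MLP trained end to end, but the shape of the computation — multi-scale neighborhoods, symmetric pooling, concatenation with a global descriptor, and a per-point scoring head — follows the structure the abstract describes.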


    Zhongyang Zhao, Yinglei Cheng, Xiaosong Shi, Xianxiang Qin, Xin Li. Terrain Classification of LiDAR Point Cloud Based on Multi-Scale Features and PointNet[J]. Laser & Optoelectronics Progress, 2019, 56(5): 052804

    Paper Information

    Category: Remote Sensing and Sensors

    Received: Sep. 4, 2018

    Accepted: Sep. 21, 2018

    Published Online: Jul. 31, 2019

    The Author Email: Cheng Yinglei (ylcheng718@163.com)

    DOI:10.3788/LOP56.052804
