Laser Technology, Vol. 47, Issue 5, 700 (2023)
Multi-resolution point cloud completion fusing graph attention
To address the difficulty of extracting local feature information from point clouds in 3-D point cloud completion, a multi-resolution point cloud completion network fusing graph attention was proposed. The data were processed within a generative adversarial network framework. The generator constructed the graph structure of the point cloud through graph attention layers, fused feature information at different resolutions with grid data, and combined a folding operation to reconstruct the missing structure, outputting progressively completed point cloud data. The discriminator judged whether the point cloud was real or generated; its feedback improved accuracy and optimized the generator, so that the generated data possessed a fine geometric structure close to the real point cloud. The proposed method was verified experimentally and analyzed theoretically against four related methods on the shape dataset, and obtained the best results. The results show that the proposed method can effectively complete the missing part of a point cloud and obtain a complete, uniform shape; network performance is improved by about 1.79% compared with the point fractal network; and the method also achieves the expected effect on the completion of measured data. The proposed network not only extracts the global shape features of the point cloud but also better extracts its local geometric features, making the completed shape more refined. This study provides a reference for 3-D modeling of smart cities.
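The abstract names two building blocks: graph-attention aggregation over a point-neighborhood graph, and a folding operation that deforms a 2-D grid into the missing surface. The paper's layer definitions are not reproduced here, so the following is a minimal NumPy sketch under assumed shapes; `knn_graph`, `graph_attention_layer`, `folding_step`, and all weight matrices are illustrative stand-ins, not the authors' implementation.

```python
import numpy as np

def knn_graph(points, k):
    """Indices of the k nearest neighbors of each point (self excluded)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                    # a point is not its own neighbor
    return np.argsort(d, axis=1)[:, :k]            # (N, k)

def graph_attention_layer(feats, neighbors, W, a):
    """One graph-attention aggregation step (GAT-style, illustrative):
    e_ij = LeakyReLU(a . [W f_i || W f_j]), alpha = softmax over neighbors,
    out_i = sum_j alpha_ij * W f_j."""
    h = feats @ W                                  # (N, F') projected features
    N, k = neighbors.shape
    hi = np.repeat(h[:, None, :], k, axis=1)       # (N, k, F') center copies
    hj = h[neighbors]                              # (N, k, F') neighbor features
    e = np.concatenate([hi, hj], axis=-1) @ a      # (N, k) raw attention scores
    e = np.where(e > 0, e, 0.2 * e)                # LeakyReLU, slope 0.2
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)      # softmax over the k neighbors
    return (alpha[..., None] * hj).sum(axis=1)     # (N, F') aggregated features

def folding_step(code, grid, W1, b1, W2, b2):
    """FoldingNet-style decoding: concatenate a global code to each 2-D grid
    point and map through a small MLP to 3-D coordinates."""
    x = np.concatenate([np.tile(code, (grid.shape[0], 1)), grid], axis=1)
    h = np.maximum(x @ W1 + b1, 0.0)               # ReLU hidden layer
    return h @ W2 + b2                             # (M, 3) folded points
```

In a generator like the one described, several such attention layers would extract multi-resolution features, and the folding step would turn the fused code plus a grid into the completed patch; here the weights are random placeholders rather than trained parameters.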
PAN Lilin, SHAO Jianfei. Multi-resolution point cloud completion fusing graph attention[J]. Laser Technology, 2023, 47(5): 700
Received: Jul. 25, 2022
Published Online: Dec. 11, 2023
The Author Email: SHAO Jianfei (1156468319@qq.com)