Laser Journal, Vol. 45, Issue 9, 177 (2024)
Defect segmentation model for LiDAR remote sensing images based on visual communication technology
With the development of remote sensing technology, LiDAR remote sensing images are being applied ever more widely across many fields. However, these images often contain defects such as noise, distortion, and occlusion, which adversely affect image analysis and application. It is therefore necessary to perform defect segmentation on LiDAR remote sensing images. The aim of this study is to develop a defect segmentation model for LiDAR remote sensing images based on visual communication technology, in order to improve the accuracy and reliability of remote sensing image processing. A bilateral filtering function is applied to denoise the discrete point cloud of the LiDAR remote sensing image, preserving the surface geometric features of the point cloud while removing noise. Image enhancement technology from visual communication, specifically a local contrast enhancement variational model, is then used to enhance the LiDAR remote sensing image and improve the understanding and readability of its visual information. Finally, a semantic segmentation network that integrates an attention mechanism is designed as the remote sensing image defect segmentation model to segment defects in LiDAR remote sensing images. Experimental results show that the mean pixel accuracy (mPA) of the designed model is consistently above 95% and does not decrease significantly as the amount of test data increases. The designed model also achieves a high mean intersection-over-union (mIoU) across different terrain scenes, maintaining high mIoU even in complex terrain, demonstrating strong robustness.
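The reported results are expressed in terms of mPA and mIoU. As a minimal sketch of how these standard segmentation metrics are computed from predicted and ground-truth label masks (the function names and the toy two-class example below are illustrative, not taken from the paper):

```python
import numpy as np

def confusion_matrix(pred, target, num_classes):
    """Accumulate a num_classes x num_classes confusion matrix
    from integer-labelled prediction and ground-truth masks."""
    valid = (target >= 0) & (target < num_classes)
    idx = num_classes * target[valid].astype(int) + pred[valid].astype(int)
    return np.bincount(idx, minlength=num_classes ** 2).reshape(num_classes, num_classes)

def mpa_and_miou(conf):
    """Mean pixel accuracy (mPA) and mean intersection-over-union (mIoU)
    derived from an accumulated confusion matrix."""
    eps = 1e-12
    tp = np.diag(conf).astype(float)
    per_class_acc = tp / (conf.sum(axis=1) + eps)                     # per-class pixel accuracy
    per_class_iou = tp / (conf.sum(axis=1) + conf.sum(axis=0) - tp + eps)  # per-class IoU
    return per_class_acc.mean(), per_class_iou.mean()

# Toy example: 0 = background, 1 = defect.
target = np.array([[0, 0, 1], [1, 1, 0]])
pred   = np.array([[0, 1, 1], [1, 1, 0]])
conf = confusion_matrix(pred, target, num_classes=2)
mpa, miou = mpa_and_miou(conf)
print(f"mPA={mpa:.3f}, mIoU={miou:.3f}")
```

In practice the confusion matrix would be accumulated over the whole test set before averaging, so that mPA and mIoU reflect all pixels rather than a per-image mean.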
NONG Linlin. Defect segmentation model for LiDAR remote sensing images based on visual communication technology[J]. Laser Journal, 2024, 45(9): 177
Received: Dec. 7, 2023
Accepted: Dec. 20, 2024
Published Online: Dec. 20, 2024