Laser & Optoelectronics Progress, Volume 61, Issue 4, 0437005 (2024)

Coal and Gangue Recognition Method Based on Dual-Channel Pseudocolor Image by Lidar

Yan Wang, Jichuan Xing*, and Yaozhi Wang
Author Affiliations
  • School of Optoelectronics, Beijing Institute of Technology, Beijing 100081, China

    The recognition accuracy and efficiency of coal and gangue have a great impact on coal-production capacity, but existing recognition and separation methods for these minerals still have deficiencies in terms of separation equipment, accuracy, and efficiency. Herein, a coal and gangue recognition method based on dual-channel pseudocolor lidar images and deep learning is presented. First, a height threshold is set to remove the interference information from the target ore based on the lidar distance-channel information. Concurrently, the original point-cloud data are projected in a reduced dimension to quickly obtain the reflection-intensity information and surface texture features of coal and gangue. The dimension-reduced intensity and distance channels are then fused to construct a dual-channel pseudocolor image dataset for coal and gangue. On this basis, DenseNet-121 is optimized for the pseudocolor dataset, and the resulting DenseNet-40 network is used for model training and testing. The results show that the recognition accuracy for coal and gangue reaches 94.56%, which demonstrates that dual-channel pseudocolor images acquired by lidar have scientific and engineering value in the field of ore recognition.
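
    As a rough illustration of the pipeline summarized above, the following Python sketch shows one way the height thresholding, top-down projection, and two-channel fusion could be implemented with NumPy. The function name, grid parameters, and coordinate conventions are assumptions made for illustration, not the authors' implementation.

        import numpy as np

        def pointcloud_to_pseudocolor(points, intensity, height_thresh=0.05,
                                      grid_res=0.002, grid_shape=(224, 224)):
            """points: (N, 3) lidar x, y, z coordinates (z = height above the belt);
            intensity: (N,) reflection intensity per point.
            Returns a 2-channel image: channel 0 = distance/height, channel 1 = intensity."""
            h, w = grid_shape

            # 1. Height threshold: keep only returns above the belt surface,
            #    discarding background interference around the target ore.
            keep = points[:, 2] > height_thresh
            pts, inten = points[keep], intensity[keep]
            if pts.size == 0:
                return np.zeros((2, h, w), dtype=np.float32)

            # 2. Dimension reduction: project (x, y) onto a regular top-down grid.
            #    Where several points fall in one cell, the last written value wins;
            #    a max or mean reduction could also be used.
            col = np.clip(((pts[:, 0] - pts[:, 0].min()) / grid_res).astype(int), 0, w - 1)
            row = np.clip(((pts[:, 1] - pts[:, 1].min()) / grid_res).astype(int), 0, h - 1)
            dist_ch = np.zeros(grid_shape, dtype=np.float32)
            inten_ch = np.zeros(grid_shape, dtype=np.float32)
            dist_ch[row, col] = pts[:, 2]   # distance (height) channel
            inten_ch[row, col] = inten      # reflection-intensity channel

            # 3. Normalize each channel to [0, 1] and stack into a 2-channel image.
            def norm(c):
                rng = c.max() - c.min()
                return (c - c.min()) / rng if rng > 0 else c
            return np.stack([norm(dist_ch), norm(inten_ch)], axis=0)

    The resulting two-channel array can then be fed to a DenseNet-style classifier such as the DenseNet-40 derived from DenseNet-121 in the paper; adapting the network to two input channels essentially amounts to changing its first convolution layer.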

    Citation: Yan Wang, Jichuan Xing, Yaozhi Wang. Coal and Gangue Recognition Method Based on Dual-Channel Pseudocolor Image by Lidar[J]. Laser & Optoelectronics Progress, 2024, 61(4): 0437005

    Paper Information

    Category: Digital Image Processing

    Received: Dec. 1, 2022

    Accepted: Feb. 6, 2023

    Published Online: Feb. 26, 2024

    Corresponding Author Email: Jichuan Xing (michaelhsing@bit.edu.cn)

    DOI: 10.3788/LOP223222
