Chinese Journal of Lasers, Volume. 52, Issue 10, 1010001(2025)

Aerosol Identification Based on Attention-Unet Neural Network

Changqing Fu1, Zhipeng Yang1,*, Chengli Ji2,**, Tao Fu1, Fa Tao2, and Jianhui Zheng1
Author Affiliations
  • 1School of Electronic Engineering, Chengdu University of Information Technology, Chengdu 610255, Sichuan, China
  • 2Meteorological Observation Center, China Meteorological Administration, Beijing 100081, China

    Objective

    Accurately identifying the vertical structure of aerosols is essential for a comprehensive understanding of atmospheric processes and climate change. Conventional aerosol-identification methods primarily include the empirical threshold method and the selective iterative boundary location (SIBYL) method, although both have certain limitations. Therefore, this paper presents an automatic aerosol-classification algorithm based on the Attention-Unet. This method is suitable for lidar-based atmospheric observations and enables the automatic identification of aerosols.

    Methods

    To address these limitations, this paper proposes an improved method for cloud-aerosol identification. First, the proposed method directly uses raw lidar data as input to train an intelligent cloud-aerosol identification model, thus avoiding the information loss that occurs when raw data are converted into images. Second, to achieve more accurate identification results, the U-Net architecture was enhanced by introducing attention mechanisms and a pyramid pooling module. The pyramid pooling module integrates multiscale feature information, thereby improving the model's robustness, while the attention mechanism assigns different weights to features, enhancing the model's ability to recognize details such as edges.
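The attention mechanism described above can be illustrated with a minimal NumPy sketch of the standard additive attention gate commonly used in Attention-Unet variants: decoder gating features reweight the skip-connection features through a learned, per-pixel coefficient in (0, 1). The weight names and shapes (`Wg`, `Wx`, `psi`, with 1×1 convolutions reduced to channel-mixing matrices) are illustrative assumptions, not the authors' exact architecture.

```python
import numpy as np

def _sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def attention_gate(x, g, Wx, Wg, psi):
    """Additive attention gate over a skip connection.

    x      : skip-connection features, shape (C, H, W)
    g      : gating features from the decoder, shape (C, H, W)
    Wx, Wg : (F, C) channel-mixing weights (stand-ins for 1x1 convolutions)
    psi    : (1, F) projection producing one attention coefficient per pixel
    """
    C, H, W = x.shape
    xf = x.reshape(C, H * W)                  # flatten spatial dimensions
    gf = g.reshape(C, H * W)
    q = np.maximum(Wg @ gf + Wx @ xf, 0.0)    # ReLU(Wg*g + Wx*x), shape (F, H*W)
    alpha = _sigmoid(psi @ q)                 # attention map in (0, 1), shape (1, H*W)
    return (xf * alpha).reshape(C, H, W)      # reweighted skip features

# toy forward pass with random weights
rng = np.random.default_rng(0)
C, F, H, W = 8, 4, 16, 16
x = rng.standard_normal((C, H, W))
g = rng.standard_normal((C, H, W))
Wx = rng.standard_normal((F, C)) * 0.1
Wg = rng.standard_normal((F, C)) * 0.1
psi = rng.standard_normal((1, F))
out = attention_gate(x, g, Wx, Wg, psi)
print(out.shape)  # (8, 16, 16)
```

Because the attention coefficient lies in (0, 1), the gate can only suppress skip features, never amplify them, which is how low-information regions are down-weighted relative to edges.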

    Results and Discussions

    This study compared the performances of four models—the FCN, SegNet, U-Net, and Attention-Unet—on the same dataset, with the results shown in Table 4. The Attention-Unet model performs exceptionally well across all evaluation metrics, achieving an overall accuracy of 96.5%, an average precision of 91.5%, and an average recall of 89.9%. By contrast, the performances of the FCN, SegNet, and U-Net are slightly inferior, thus indicating that the Attention-Unet model is more adept at managing complex information. Figure 4 presents the confusion matrix for the cloud-aerosol classification model based on the Attention-Unet. The figure shows that the classification accuracy for all categories exceeds 84%, thus indicating favorable overall classification performance.
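The overall accuracy and macro-averaged precision and recall reported above can be derived from a confusion matrix as follows. This is a generic sketch, not the paper's evaluation code, and the toy 2×2 matrix is made up for illustration:

```python
import numpy as np

def summarize(cm):
    """Overall accuracy and macro-averaged precision/recall from a
    square confusion matrix (rows = true class, cols = predicted class)."""
    cm = np.asarray(cm, dtype=float)
    diag = np.diag(cm)
    accuracy = diag.sum() / cm.sum()
    precision = diag / cm.sum(axis=0)   # per predicted class (column sums)
    recall = diag / cm.sum(axis=1)      # per true class (row sums)
    return accuracy, precision.mean(), recall.mean()

# hypothetical 2-class confusion matrix (not from the paper)
acc, p, r = summarize([[5, 1],
                       [2, 4]])
print(round(acc, 3), round(p, 3), round(r, 3))  # 0.75 0.757 0.75
```

The per-class diagonal fractions correspond to the >84% per-category accuracies visible in a confusion matrix such as the one in Figure 4.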

    Conclusions

    Using lidar and millimeter-wave cloud radar observation data obtained from the southern suburb station of Beijing between October 2022 and May 2023, this study proposes a cloud-aerosol identification algorithm based on the Attention-Unet. The proposed algorithm was tested against the FCN, SegNet, and U-Net models on the same dataset. The results show that the proposed method achieves excellent performance in terms of both quantitative metrics and visual evaluation, demonstrating its high application value in aerosol research and atmospheric operational observations. Although the Attention-Unet model performs well in the current tests, the radar parameters used in this study are limited, and factors such as humidity were not considered. Therefore, the dataset requires further optimization.



    Changqing Fu, Zhipeng Yang, Chengli Ji, Tao Fu, Fa Tao, Jianhui Zheng. Aerosol Identification Based on Attention-Unet Neural Network[J]. Chinese Journal of Lasers, 2025, 52(10): 1010001

    Paper Information

    Category: remote sensing and sensor

    Received: Sep. 12, 2024

    Accepted: Jan. 14, 2025

    Published Online: Apr. 23, 2025

    The Author Email: Zhipeng Yang (yangzp@cuit.edu.cn), Chengli Ji (jcl0606@163.com)

    DOI:10.3788/CJL241202

    CSTR:32183.14.CJL241202
