Acta Optica Sinica, Volume. 45, Issue 12, 1228001(2025)

Polarization Remote Sensing Cloud Detection Using Multi-Dimensional Information Fusion Deep Learning Network

Shu Li1,3, Xingyuan Ji1,3, Xiaoxue Chu2,3,*, Song Ye1,3, Ziyang Zhang1,3, Yongying Gan1,3, Xinqiang Wang1,3, and Fangyuan Wang1,3
Author Affiliations
  • 1School of Optoelectronic Engineering, Guilin University of Electronic Technology, Shanghai Institute of Optics and Fine Mechanics, Guilin 541004, Guangxi, China
  • 2School of Life and Environmental Sciences, Guilin University of Electronic Technology, Guilin 541004, Guangxi, China
  • 3Guangxi Key Laboratory of Optoelectronic Information Processing, Guilin 541004, Guangxi, China

    Objective

    Cloud-related uncertainties in global climate models (GCMs) significantly affect the accuracy of global climate simulations. The presence of clouds poses challenges to remote sensing applications, such as atmospheric and surface parameter retrieval, limiting the usability of remote sensing images and reducing data utilization rates. Accurate cloud information extraction from remote sensing images helps mitigate the negative effects of cloud cover on applications like aerosol parameter retrieval, atmospheric correction, and land cover change detection, while also enhancing the accuracy of cloud parameter inversion. Therefore, precise cloud detection is crucial for optimizing the application of remote sensing data.

    Methods

    In recent years, deep learning has been extensively applied to cloud detection, with convolutional neural networks (CNNs) playing a pivotal role. However, traditional CNNs typically focus on either spectral or spatial image information, neglecting the multi-angle, multi-dimensional space that can provide richer image context and improve detection outcomes. In addition, deep learning approaches often require large training datasets, while cloud samples are typically fragmented, sparse, and scattered, necessitating significant labor and time for manual annotation. To address these challenges, we propose a multi-information fusion cloud detection network (MIFCD-Net) that integrates spectral, polarization, and multi-angle information. The MIFCD-Net framework (Fig. 2) leverages training data sourced from the ICARE website, eliminating the need for manual cloud label annotation. MIFCD-Net comprises three primary modules: the data preprocessing module, the spectral-polarization-spatial global multi-angle information perception module, and the spectral-polarization-spatial global multi-angle information fusion module. The data preprocessing module performs multi-dimensional feature selection (FE) and processing (FP) to eliminate scale differences and enhance category separability. The spectral-polarization-spatial global multi-angle information perception module employs a multi-path structure, including a multi-angle adaptive attention module, a spectral-polarization attention module, and a spatial-global attention module. This module captures local features in detail, comprehensively considers global multi-angle information, dynamically adjusts focus on different features, and extracts multi-angle, multi-spectral, polarization, and spatial texture information.
Finally, the spectral-polarization-spatial global multi-angle information fusion module effectively integrates multi-dimensional feature information while minimizing redundancy and noise, enabling robust cloud detection and classification through a fully connected layer (Fig. 3).
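The multi-path perception and fusion design described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the feature dimensions, the random matrices standing in for learned weights, and the per-path softmax attention are all illustrative assumptions; the real network uses trained attention modules and a fully connected classifier head.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_path(x, w):
    """One attention path: re-weight each feature by a softmax score."""
    scores = softmax(w @ x)  # attention weights over the feature vector
    return scores * x        # feature vector re-weighted by attention

# Toy per-pixel input split into three feature groups (dimensions are
# illustrative, not POLDER3's actual band/angle counts).
spectral = rng.standard_normal(6)   # spectral-band features
polar    = rng.standard_normal(3)   # polarization-derived features
angles   = rng.standard_normal(12)  # multi-angle observations

# Random matrices stand in for the learned attention weights.
w_spec, w_pol, w_ang = (rng.standard_normal((d, d)) for d in (6, 3, 12))

# Fusion: concatenate the attention-weighted paths into one vector.
fused = np.concatenate([
    attention_path(spectral, w_spec),
    attention_path(polar, w_pol),
    attention_path(angles, w_ang),
])

# A single linear unit + sigmoid stands in for the fully connected
# classification layer producing a cloud probability.
w_fc = rng.standard_normal(fused.size)
cloud_prob = 1.0 / (1.0 + np.exp(-(w_fc @ fused)))
print(round(float(cloud_prob), 3))
```

The key design point this sketch reflects is that each information source gets its own attention path before fusion, so the network can weight spectral, polarization, and angular evidence independently per pixel.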

    Results and Discussions

    Extensive qualitative and quantitative experiments are conducted to evaluate MIFCD-Net’s performance across diverse global surface types, supplemented by ablation studies for in-depth analysis. To validate its effectiveness, MIFCD-Net is compared with the ResNet model, a benchmark in tabular tasks. The experimental results (Table 3) show that MIFCD-Net achieves a consistency rate of 95.53% for oceanic cloud detection, 81.49% for mountainous regions, and 75.98% for agricultural areas—outperforming the official POLDER3 cloud labeling algorithm. Furthermore, MIFCD-Net demonstrates superior performance in capturing cloud boundary contours (Figs. 3–5). While challenges such as surface type influence on detection accuracy and cloud boundary misclassification remain, MIFCD-Net exhibits strong overall detection performance. Ablation experiments (Fig. 6) confirm that the spectral-polarization-spatial global multi-angle perception module significantly enhances feature extraction and boundary detection, with each component contributing to the model’s accuracy and robustness. Comparisons further underscore MIFCD-Net’s superiority in detection precision and adaptability.
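The consistency rates quoted above measure per-pixel agreement between the predicted cloud mask and reference cloud labels (here, MODIS-derived labels). A minimal sketch of that metric, with hypothetical masks for a tiny scene:

```python
import numpy as np

def consistency_rate(pred, ref):
    """Percentage of pixels where the predicted cloud mask (1 = cloud,
    0 = clear) agrees with the reference cloud labels."""
    pred, ref = np.asarray(pred), np.asarray(ref)
    return float((pred == ref).mean()) * 100.0

# Hypothetical masks for an 8-pixel scene (illustrative values only).
pred = [1, 1, 0, 0, 1, 0, 1, 1]
ref  = [1, 1, 0, 1, 1, 0, 1, 0]
print(f"{consistency_rate(pred, ref):.2f}%")  # 6/8 agree -> 75.00%
```

In the paper's evaluation this agreement is computed over full scenes per surface type, which is why the rates differ between ocean, mountain, and agricultural regions.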

    Conclusions

    In this paper, we propose MIFCD-Net, a novel multi-information fusion cloud detection network that integrates multi-spectral, polarization, and multi-angle information. MIFCD-Net is constructed using a spectral-polarization-spatial global multi-angle information perception module and a spectral-polarization-spatial global multi-angle information fusion module, designed to fully capture the multi-angle and multi-band information of clouds while supplementing contextual details. We utilize POLDER3 multi-band and multi-angle data as research samples to validate the cloud detection capabilities of the proposed model by comparing its consistency with MODIS cloud label data. The results demonstrate that this method outperforms the official POLDER3 cloud detection algorithm in identifying various surface types and cloud shapes, particularly excelling in capturing cloud texture details. Moreover, this approach provides innovative insights for advancing cloud detection within China’s “polarization exchange” detection framework.


    Shu Li, Xingyuan Ji, Xiaoxue Chu, Song Ye, Ziyang Zhang, Yongying Gan, Xinqiang Wang, Fangyuan Wang. Polarization Remote Sensing Cloud Detection Using Multi-Dimensional Information Fusion Deep Learning Network[J]. Acta Optica Sinica, 2025, 45(12): 1228001

    Paper Information

    Category: Remote Sensing and Sensors

    Received: Oct. 20, 2024

    Accepted: Dec. 10, 2024

    Published Online: Jun. 23, 2025

    The Author Email: Xiaoxue Chu (chuxiaoxue@guet.edu.cn)

    DOI:10.3788/AOS241653

    CSTR:32393.14.AOS241653
