Acta Optica Sinica, Volume. 44, Issue 24, 2428003(2024)

A Method for Cloud Removal Using Optical and Synthetic Aperture Radar Image Fusion

Xunqiang Gong1,2,3, Qirui Fang1, Zhaoyang Hou4, Zhihua Zhang5, and Yuanping Xia1,*
Author Affiliations
  • 1Key Laboratory of Regional Mining Environmental Monitoring and Control of Poyang Lake, Ministry of Natural Resources, East China University of Technology, Nanchang 330013, Jiangxi, China
  • 2State Key Laboratory of Information Engineering in Surveying, Mapping, and Remote Sensing, Wuhan University, Wuhan 430079, Hubei, China
  • 3Jiangxi Academy of Eco-Environmental Sciences and Planning, Nanchang 330039, Jiangxi, China
  • 4Faculty of Geomatics, Lanzhou Jiaotong University, Lanzhou 730070, Gansu, China
  • 5The Sixth Geological Brigade, Jiangxi Bureau of Geology, Yingtan 335000, Jiangxi, China

    Objective

    Synthetic aperture radar (SAR) data can penetrate clouds and fog under all weather conditions, which makes SAR images a valuable auxiliary source for recovering ground information obscured by thick clouds. SAR-assisted cloud removal techniques make it possible to generate cloud-free reference images even when optical acquisitions are contaminated by clouds. However, two main challenges remain in using SAR data for cloud removal. First, the difference in imaging mechanisms between optical and SAR systems makes it difficult for SAR data to directly substitute for the ground information blocked by clouds. Second, image quality after SAR speckle-noise suppression and fusion remains a concern.

    Methods

    To effectively reconstruct cloud-contaminated ground information using SAR data, we propose a new method for cloud removal through optical and SAR image fusion. First, the cloud regions are detected and extracted using the fractal net evolution approach (FNEA), which separates the image into cloudy and cloud-free regions, and corresponding fusion rules are set for each. Next, the images are decomposed into low-frequency and high-frequency components using the non-subsampled shearlet transform (NSST). In the low-frequency component, the window-center distance weighted regional energy (DWRE) is used to preserve texture details in the final fused image. In the high-frequency component, the dual-channel unit-linking pulse coupled neural network (DCULPCNN) and the rolling guidance filter (RGF) are applied to the cloud-free and cloudy regions, respectively. In this way, the linear correlation between the SAR and optical images is enhanced while the introduction of SAR speckle noise is minimized. Finally, the fused image is obtained through the inverse NSST.
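
    To make the low-frequency fusion rule concrete, the sketch below shows one plausible Python reading of a window-center distance weighted regional energy (DWRE) rule. The Gaussian window, its size, and the soft energy-ratio weighting are illustrative assumptions; the exact DWRE formulation and the surrounding FNEA/NSST steps of the proposed method are not reproduced here.

    # Minimal sketch (assumed form) of a DWRE-style low-frequency fusion rule.
    # Coefficients nearer the window centre are assumed to receive larger weights.
    import numpy as np
    from scipy.ndimage import convolve

    def distance_weighted_energy(band: np.ndarray, size: int = 5, sigma: float = 1.0) -> np.ndarray:
        """Regional energy of a low-frequency sub-band, weighted by a Gaussian
        window so that coefficients near the window centre contribute more."""
        ax = np.arange(size) - size // 2
        xx, yy = np.meshgrid(ax, ax)
        w = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
        w /= w.sum()
        return convolve(band**2, w, mode="reflect")

    def fuse_low_frequency(low_opt: np.ndarray, low_sar: np.ndarray) -> np.ndarray:
        """Combine the optical and SAR low-frequency sub-bands according to
        their relative distance-weighted regional energies."""
        e_opt = distance_weighted_energy(low_opt)
        e_sar = distance_weighted_energy(low_sar)
        w_opt = e_opt / (e_opt + e_sar + 1e-12)
        return w_opt * low_opt + (1.0 - w_opt) * low_sar

    In the full pipeline, such a rule would be applied to the low-frequency NSST sub-bands of the optical and SAR images before the inverse transform reconstructs the fused result.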

    Results and Discussions

    The experimental results demonstrate that the proposed method achieves superior performance in both qualitative and quantitative evaluations compared with nine other methods. Qualitatively, as shown in Figs. 2–7, our approach effectively suppresses SAR noise while preserving details in the originally cloud-free regions, yielding images with less distortion and better visual quality than the other methods. Quantitatively, our method outperforms the others across six evaluation metrics: information entropy (EN), average gradient (AG), spatial frequency (SF), structural similarity index measure (SSIM), peak signal-to-noise ratio (PSNR), and root mean square error (RMSE). Compared with the second-best method, the improvements achieved by our method in these metrics are 0.054, 0.450, 0.910, 0.029, 0.215, and 0.290, respectively. These gains indicate that the method retains the texture and detail information of ground objects, removes cloud contamination, and improves overall image quality.
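
    For reference, the snippet below sketches common textbook definitions of several of these metrics in Python (EN, AG, SF, and RMSE/PSNR against a cloud-free reference image); the exact implementations and parameter choices used in the paper's evaluation may differ.

    # Common definitions of several image-fusion quality metrics (assumes 8-bit images).
    import numpy as np

    def entropy(img: np.ndarray) -> float:
        """Information entropy (EN) of the grey-level histogram."""
        hist, _ = np.histogram(img, bins=256, range=(0, 256))
        p = hist / hist.sum()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    def average_gradient(img: np.ndarray) -> float:
        """Average gradient (AG): mean magnitude of local intensity differences."""
        img = img.astype(float)
        gx = np.diff(img, axis=1)[:-1, :]
        gy = np.diff(img, axis=0)[:, :-1]
        return float(np.mean(np.sqrt((gx**2 + gy**2) / 2)))

    def spatial_frequency(img: np.ndarray) -> float:
        """Spatial frequency (SF): root of the row- and column-frequency energies."""
        img = img.astype(float)
        rf = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))
        cf = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))
        return float(np.sqrt(rf**2 + cf**2))

    def rmse_psnr(fused: np.ndarray, reference: np.ndarray, peak: float = 255.0):
        """RMSE and PSNR of the fused image against a cloud-free reference."""
        err = fused.astype(float) - reference.astype(float)
        rmse = float(np.sqrt(np.mean(err**2)))
        psnr = float(20 * np.log10(peak / rmse)) if rmse > 0 else float("inf")
        return rmse, psnr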

    Conclusions

    Most current SAR image fusion cloud removal methods fail to effectively address the substantial structural differences between optical and SAR images and still retain SAR speckle noise after fusion. We therefore propose a new method for cloud removal using optical and SAR image fusion. In the fusion rules, DWRE is employed to retain energy from both images and extract detailed information in the low-frequency component. In the high-frequency component, RGF and DCULPCNN are used to suppress SAR speckle noise and enhance texture information while reducing the spatial structural differences between the two images. Comparative analysis against nine other methods shows that the proposed fusion cloud removal method excels in quantitative evaluation, achieving superior performance on EN, AG, SF, SSIM, PSNR, and RMSE. However, the proposed method is currently limited to cloud removal in panchromatic images. Future research will focus on adapting and improving this method for multispectral data.

    Citation

    Xunqiang Gong, Qirui Fang, Zhaoyang Hou, Zhihua Zhang, Yuanping Xia. A Method for Cloud Removal Using Optical and Synthetic Aperture Radar Image Fusion[J]. Acta Optica Sinica, 2024, 44(24): 2428003

    Paper Information

    Category: Remote Sensing and Sensors

    Received: Jan. 23, 2024

    Accepted: Apr. 30, 2024

    Published Online: Dec. 17, 2024

    Author Email: Yuanping Xia (ypxia@ecut.edu.cn)

    DOI: 10.3788/AOS240550
