Laser & Optoelectronics Progress, Volume 62, Issue 7, 0722005 (2025)
Efficient Metasurface Design Method Based on Super-Resolution Reconstruction Technology
Metasurfaces offer unique advantages for controlling incident electromagnetic waves. However, designing a complete metalens requires modeling and analyzing a large number of unit structures, a process that is time-consuming and challenging, particularly when matching the phase distribution to the desired profile, which severely limits simulation efficiency. In this study, we propose a deep-learning super-resolution reconstruction model based on a generative adversarial network for metasurface design. The model rapidly and accurately upscales phase-geometric arrays, producing unit-structure phase data at higher density. Trained on 9000 sample sets, it achieves a prediction accuracy of 95.17% when upscaling a 60 pixel × 60 pixel phase-geometric array to 120 pixel × 120 pixel. To validate its feasibility, we design a visible-light achromatic metalens from the predicted phase-geometric array of typical achromatic metalens unit structures and verify the design by simulation. The results demonstrate that the proposed model significantly reduces simulation time and improves accuracy compared with traditional simulation and interpolation methods. It also offers an efficient approach for the initial design and selection of metasurface units, providing a scalable, generative-adversarial-network-based solution for accelerated computation and data expansion in electromagnetic wave control.
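The 2× upscaling step described in the abstract can be illustrated with a minimal sketch. The following is a hypothetical PyTorch example, not the authors' released code: an SRGAN-style generator that maps a single-channel 60 pixel × 60 pixel phase-geometric array to 120 pixel × 120 pixel. The network depth, channel counts, and single-channel input are assumptions made purely for illustration.

```python
# Hypothetical sketch of a GAN generator for 2x super-resolution of a
# phase-geometric array (60x60 -> 120x120). Layer sizes and channel counts
# are illustrative assumptions, not the authors' architecture.
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Residual block commonly used in SRGAN-style generators."""
    def __init__(self, channels: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.PReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.body(x)


class PhaseSRGenerator(nn.Module):
    """Upscales a 1-channel 60x60 phase map to 120x120 (2x super-resolution)."""
    def __init__(self, channels: int = 64, num_blocks: int = 4):
        super().__init__()
        self.head = nn.Sequential(
            nn.Conv2d(1, channels, kernel_size=9, padding=4), nn.PReLU()
        )
        self.blocks = nn.Sequential(
            *[ResidualBlock(channels) for _ in range(num_blocks)]
        )
        # PixelShuffle(2) doubles the spatial resolution: 60x60 -> 120x120.
        self.upsample = nn.Sequential(
            nn.Conv2d(channels, channels * 4, kernel_size=3, padding=1),
            nn.PixelShuffle(2),
            nn.PReLU(),
        )
        self.tail = nn.Conv2d(channels, 1, kernel_size=9, padding=4)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.head(x)
        feats = feats + self.blocks(feats)
        return self.tail(self.upsample(feats))


if __name__ == "__main__":
    low_res = torch.rand(1, 1, 60, 60)   # dummy 60x60 phase-geometric array
    generator = PhaseSRGenerator()
    high_res = generator(low_res)
    print(high_res.shape)                # torch.Size([1, 1, 120, 120])
```

In a GAN setting such a generator would be trained against a discriminator on low-/high-resolution array pairs; the 9000-sample training set and the 95.17% accuracy reported above refer to the authors' model, not to this sketch.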
Citation: Kaiwei Zhang, Hongyun Gao, Haifei Lü, Yuanjia Xia, Guobing Chen. Efficient Metasurface Design Method Based on Super-Resolution Reconstruction Technology[J]. Laser & Optoelectronics Progress, 2025, 62(7): 0722005
Category: Optical Design and Fabrication
Received: Nov. 5, 2024
Accepted: Dec. 25, 2024
Published Online: Mar. 20, 2025
Author Email: Hongyun Gao (ccyun@126.com)
CSTR: 32186.14.LOP242223