Advanced Photonics Nexus, Volume 3, Issue 5, 056003 (2024)

Hybrid deep-learning and physics-based neural network for programmable illumination computational microscopy

Ruiqing Sun1, Delong Yang1, Shaohui Zhang1,*, and Qun Hao1,2,*
Author Affiliations
  • 1Beijing Institute of Technology, School of Optics and Photonics, Beijing, China
  • 2Changchun University of Science and Technology, Changchun, China
    Figures & Tables (8)
    The imaging system we used and how our framework works. (a) The system we used in experiments to verify the effectiveness of our framework. (b) The overview of the framework we proposed, where PL refers to the physical layer in the physical model. (c) Reconstruction results of 10 LR target images captured using a 0.13NA objective.
    The details of our framework. We describe the first step of the framework in (a), the second step in (c), and the last step in (d). We show the noise introduced by the physical model (PM) and an example output of the DL model in (b).
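    The three steps sketched in this caption (data-driven initialization by the DL model, followed by physics-constrained refinement) can be illustrated with a toy example. The snippet below is a minimal sketch, not the authors' implementation: it stands in for the physical layer with a simple 2x2 average-pooling forward model and refines a high-resolution estimate by gradient descent so that its simulated low-resolution image matches the measurement. The names `pool2`, `pool2_adjoint`, and `hybrid_reconstruct` are hypothetical.

    ```python
    import numpy as np

    def pool2(x):
        # Toy "physical layer": 2x2 average pooling as a stand-in for the
        # low-resolution image-formation process (not the paper's actual model).
        return 0.25 * (x[0::2, 0::2] + x[1::2, 0::2] + x[0::2, 1::2] + x[1::2, 1::2])

    def pool2_adjoint(r):
        # Adjoint of pool2: spread each LR residual back over its 2x2 block.
        return 0.25 * np.repeat(np.repeat(r, 2, axis=0), 2, axis=1)

    def hybrid_reconstruct(lr_image, dl_init, steps=200, step_size=1.0):
        # Step 1: data-driven initialization (here: any HR guess, e.g. a DL output).
        x = dl_init.copy()
        # Steps 2-3: physics-based refinement by gradient descent on
        # 0.5 * ||pool2(x) - lr_image||^2.
        for _ in range(steps):
            residual = pool2(x) - lr_image
            x -= step_size * pool2_adjoint(residual)
        return x
    ```

    In the paper's setting the forward model is far richer (coherent imaging under programmable LED illumination) and the refinement is done with a trainable physical neural network, but the division of labor is the same: the DL output supplies the initialization, and the physics model enforces data consistency.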
    The overview of our proposed data augmentation methods. (a1) The ground truth of the resolution target. (a2) The background we extract. (a3) The ROI of the resolution target. (b) An example of simple samples. (c) An example of complex samples.
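    The augmentation idea in this caption (extract a background and a resolution-target ROI, then compose "simple" samples with one pasted target and "complex" samples with several) can be sketched as follows. This is a hypothetical illustration of the compositing step only; the function names and the absence of blending/scaling are assumptions, not the authors' procedure.

    ```python
    import numpy as np

    def simple_sample(background, roi, top, left):
        # "Simple" sample: a single copy of the target ROI pasted onto the
        # extracted background at a given position.
        out = background.copy()
        h, w = roi.shape
        out[top:top + h, left:left + w] = roi
        return out

    def complex_sample(background, roi, n_copies, rng):
        # "Complex" sample: several randomly placed copies of the ROI,
        # which may overlap.
        out = background.copy()
        h, w = roi.shape
        H, W = background.shape
        for _ in range(n_copies):
            top = rng.integers(0, H - h + 1)
            left = rng.integers(0, W - w + 1)
            out[top:top + h, left:left + w] = roi
        return out
    ```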
    The illumination model we generated. The gray circles represent the corresponding sample spectrum range when different LED lights are on.
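    The geometry behind this figure is standard in programmable-LED (Fourier ptychographic) illumination: each LED approximates a plane wave, so in the Fourier domain the captured image samples a pupil-sized circle centered at the illumination spatial frequency (sin θx/λ, sin θy/λ). The sketch below computes those circle centers for an 11×11 array; the pitch, height, and wavelength values are illustrative assumptions, not the paper's setup.

    ```python
    import numpy as np

    def led_spectrum_centers(n=11, pitch_mm=4.0, height_mm=90.0, wavelength_um=0.532):
        # LED grid coordinates in the array plane, centered on the optical axis.
        idx = (np.arange(n) - n // 2) * pitch_mm
        xx, yy = np.meshgrid(idx, idx)
        # Distance from each LED to the sample; gives the illumination angles.
        r = np.sqrt(xx**2 + yy**2 + height_mm**2)
        sin_tx, sin_ty = xx / r, yy / r
        wl_mm = wavelength_um * 1e-3
        # Centers of the sampled spectrum circles, in cycles/mm.
        return sin_tx / wl_mm, sin_ty / wl_mm
    ```

    With n = 11 this yields 121 illumination angles, consistent with the 121 sequentially captured LR images used for the ground-truth reconstruction in the USAF-chart figure.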
    Experimental results on the USAF chart. (a) The captured image illuminated by the middle LED. (b) The low-resolution area without reconstruction. (c) The reconstruction result of the DL model trained on the simple data set. (d) The reconstruction result of the DL model trained on the complex data set. (e) The reconstruction result of the physical model (PM) initialized by the image illuminated by the middle LED. (f) The reconstruction result of the PM initialized by the output of the DL model trained on the complex data set. (g) The final reconstruction result of our framework. (h) The ground truth (GT) reconstructed from 121 LR images captured sequentially. (i)–(m) Detailed outputs from different methods.
    (a) The captured image illuminated by the middle LED. (b) The LR image without reconstruction. (c) The reconstruction result of the first DL model. (d) The reconstruction result of the physical model (PM). (e) The final reconstruction result of our framework.
    The comparison of the outputs of the DL model and the physical model (PM). (a) The LR image without reconstruction. (b) Reconstruction results of blank sample areas using the physical model. (c) Reconstruction results of blank sample areas using the data-driven DL model. (d) The final reconstruction result of our framework.
    • Table 1. Comparison of results from different methods in ablation experiments.


      All metrics are evaluated on amplitude; ↑ means higher is better, ↓ means lower is better.

      Method                                        SSIM↑    PSNR↑    NIQE↓    LPIPS↓
      Reconstruction with DL model                  0.532    17.4     50.4     0.195
      Reconstruction with PM                        0.594    22.8     62.2     0.129
      Reconstruction with our framework (DL, DL)    0.587    22.3     70.0     0.124
      Reconstruction with our framework (PM, PM)    0.728    26.2     49.9     0.108
      Reconstruction with our framework (DL, PM)    0.740    26.8     48.8     0.108
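    Of the four metrics in Table 1, SSIM and PSNR are full-reference (computed against the GT reconstruction), while NIQE and LPIPS assess perceptual quality. As a reference point, PSNR is the simplest to state exactly; the sketch below is a generic definition, since the paper's exact evaluation settings (data range, crops) are not given here.

    ```python
    import numpy as np

    def psnr(ref, img, data_range=1.0):
        # Peak signal-to-noise ratio in dB: 10*log10(peak^2 / MSE).
        # Higher is better, matching the PSNR↑ column of Table 1.
        mse = np.mean((ref - img) ** 2)
        return 10.0 * np.log10(data_range**2 / mse)
    ```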
    Ruiqing Sun, Delong Yang, Shaohui Zhang, Qun Hao, "Hybrid deep-learning and physics-based neural network for programmable illumination computational microscopy," Adv. Photon. Nexus 3, 056003 (2024)

    Paper Information

    Category: Research Articles

    Received: Jan. 22, 2024

    Accepted: Jun. 20, 2024

    Published Online: Aug. 19, 2024

    Author emails: Shaohui Zhang (zhangshaohui@bit.edu.cn), Qun Hao (qhao@bit.edu.cn)

    DOI: 10.1117/1.APN.3.5.056003

    CSTR: 32397.14.1.APN.3.5.056003
