Advanced Photonics Nexus, Volume 3, Issue 5, 056005 (2024)

NeuPh: scalable and generalizable neural phase retrieval with local conditional neural fields

Hao Wang1, Jiabei Zhu1, Yunzhe Li1,†, Qianwan Yang1, and Lei Tian1,2,*
Author Affiliations
  • 1Boston University, Department of Electrical and Computer Engineering, Boston, Massachusetts, United States
  • 2Boston University, Department of Biomedical Engineering, Boston, Massachusetts, United States
    Figures & Tables
    Conceptual illustration of the NeuPh framework. (a) NeuPh employs a CNN-based encoder to learn measurement-specific information and encode it into a latent-space representation. The MLP decoder reconstructs the phase values at specific locations with an increased spatial resolution by synthesizing local conditional information from the corresponding latent vectors. (b) FPM experimental setup and illumination patterns for acquiring multiplexed BF and DF measurements. (c) Example low-resolution BF measurement and high-resolution phase reconstructions from the model-based FPM algorithm and from NeuPh. NeuPh learns a continuous-domain representation and can infer phase maps on an arbitrary pixel grid (illustrated at 6×, 21×, and 49.8× the pixel density of the raw measurement).
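The encoder-decoder pipeline described in panel (a) can be made concrete with a short sketch. The PyTorch code below is a minimal illustration of a local conditional neural field in the spirit of NeuPh, not the authors' implementation: the module names (ConvEncoder, MLPDecoder, query_phase), layer widths, and the bilinear latent lookup via grid_sample are all assumptions.

```python
# Minimal sketch of a local conditional neural field for phase retrieval,
# in the spirit of NeuPh. All sizes and names are illustrative assumptions,
# not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvEncoder(nn.Module):
    """CNN encoder: multiplexed BF/DF measurements -> latent feature grid."""
    def __init__(self, in_channels=2, latent_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, latent_dim, 3, padding=1),
        )

    def forward(self, x):          # x: (B, C, H, W)
        return self.net(x)         # (B, latent_dim, H, W)

class MLPDecoder(nn.Module):
    """MLP decoder: (local latent vector, query coordinate) -> phase value."""
    def __init__(self, latent_dim=64, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + 2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, z, coords):  # z: (B, N, D), coords: (B, N, 2)
        return self.net(torch.cat([z, coords], dim=-1))  # (B, N, 1)

def query_phase(encoder, decoder, measurements, coords):
    """Evaluate the phase at continuous query coordinates in [-1, 1]^2.

    The decoder is conditioned on bilinearly interpolated local latent
    vectors, so the query grid can be arbitrarily denser than the raw
    measurement grid."""
    feat = encoder(measurements)                       # (B, D, H, W)
    grid = coords.unsqueeze(1)                         # (B, 1, N, 2)
    z = F.grid_sample(feat, grid, align_corners=True)  # (B, D, 1, N)
    z = z.squeeze(2).permute(0, 2, 1)                  # (B, N, D)
    return decoder(z, coords)                          # (B, N, 1)
```

Because the decoder is queried at continuous coordinates rather than on a fixed output grid, the same trained network can be evaluated at any pixel density, which is what enables the 6×, 21×, and 49.8× densification illustrated in panel (c).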
    Reconstruction results using NeuPh trained with the experimental dataset. Examples of the low-resolution BF intensity image, DPC estimation, model-based FPM reconstruction, and NeuPh reconstruction for (a) HeLa(E) and (b) HeLa(F) cells. Subareas (1)–(6) highlight specific regions of interest, demonstrating NeuPh's capacity to accurately reconstruct subcellular high-resolution features without artifacts.
    (a) NeuPh's robustness to phase artifacts. NeuPh eliminates discontinuous phase-unwrapping errors (marked by red arrows) and background rippling artifacts (marked by the black box). The phase histogram of the background areas, measuring the residual background fluctuations, is shown in the rightmost column. The standard deviations (σ) are shown at the bottom of the reconstructions. (b) NeuPh outperforms the CNN-based reconstruction method and existing neural networks. Comparison between the reconstructions by NeuPh (NeuPhE), CNN-based (CNNE) networks, GAN (GANE), and traditional NF networks (NFE), benchmarked against the ground-truth model-based reconstruction. Zoomed-in regions showcase intricate subcellular features that are reconstructed at higher resolution by NeuPh than by the other networks, as highlighted by the red circles and arrows. The reconstructed spectra are shown at the bottom left of each image, with blue, red, and brown circles indicating the bandwidth of the objective (0.1 NA), BF measurements (0.2 NA), and theoretically achievable reconstruction bandwidth (0.51 NA), respectively.
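The three bandwidth circles in these spectra admit a quick sanity check. In Fourier ptychography the synthetic aperture is the objective NA plus the maximum illumination NA; the snippet below reproduces the quoted 0.51 NA under the assumption of a 0.41 maximum darkfield illumination NA, and the 532 nm wavelength is likewise an assumption for illustration, not a value taken from the paper.

```python
# Consistency check of the bandwidth circles quoted in the caption.
# Only the 0.1, 0.2, and 0.51 NA values come from the caption; the maximum
# illumination NA and the wavelength below are illustrative assumptions.
na_obj = 0.10          # objective NA (blue circle)
na_bf = 0.20           # brightfield measurement bandwidth (red circle)
na_illum_max = 0.41    # assumed maximum darkfield illumination NA
wavelength_um = 0.532  # assumed illumination wavelength, micrometers

# In FPM the synthetic aperture is the objective NA plus the illumination NA.
na_syn = na_obj + na_illum_max
assert abs(na_syn - 0.51) < 1e-9  # matches the brown circle in the caption

def cutoff_cycles_per_um(na):
    """Coherent cutoff spatial frequency for a given numerical aperture."""
    return na / wavelength_um

for label, na in [("objective", na_obj), ("BF", na_bf), ("synthetic", na_syn)]:
    print(f"{label:>9}: NA = {na:.2f}, "
          f"cutoff = {cutoff_cycles_per_um(na):.3f} cycles/um")
print(f"half-pitch resolution ~ {wavelength_um / (2 * na_syn) * 1e3:.0f} nm")
```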
    Strong generalization capability of NeuPh. Reconstructions of ethanol-fixed HeLa cells with networks trained on different datasets.
    Wide-FOV high-resolution phase reconstruction by NeuPh. (a) BF image captured over a 2160-pixel diameter (3.51 mm) FOV. Wide-FOV reconstruction by training NeuPh with the (b) experimental dataset (NeuPhE(18)) and (c) simulated dataset (NeuPhSim). (d)–(g) Selected subareas, identified as (i)–(iv) and enclosed within different colored boxes, extracted from the center to the edge of the FOV. (d) BF image. (e) Model-based reconstruction. (f) NeuPhE(18) reconstruction. The experimental dataset used for training NeuPhE(18) is obtained from the central region, indicated by the dashed black square. (g) NeuPhSim reconstruction.
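As a small aside on the geometry in panel (a): the quoted 2160-pixel diameter and 3.51 mm physical FOV together fix the effective object-side pixel size, which the sketch below computes. No values beyond those two numbers from the caption are used.

```python
import math

# Effective pixel size implied by panel (a): a 2160-pixel diameter
# spanning 3.51 mm. Both numbers are taken directly from the caption.
fov_diameter_mm = 3.51
fov_diameter_px = 2160

pixel_um = fov_diameter_mm * 1000 / fov_diameter_px  # ~1.625 um per raw pixel
fov_area_mm2 = math.pi * (fov_diameter_mm / 2) ** 2  # circular FOV, ~9.68 mm^2

print(f"effective pixel size: {pixel_um:.3f} um")
print(f"circular FOV area:   {fov_area_mm2:.2f} mm^2")
```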
    Citation

    Hao Wang, Jiabei Zhu, Yunzhe Li, Qianwan Yang, Lei Tian, "NeuPh: scalable and generalizable neural phase retrieval with local conditional neural fields," Adv. Photon. Nexus 3, 056005 (2024)

    Paper Information

    Category: Research Articles

    Received: Mar. 28, 2024

    Accepted: Jul. 30, 2024

    Published Online: Aug. 29, 2024

    Author email: Lei Tian (leitian@bu.edu)

    DOI: 10.1117/1.APN.3.5.056005

    CSTR: 32397.14.1.APN.3.5.056005
