Advanced Imaging, Volume 1, Issue 1, 012001 (2024)
Future-proof imaging: computational imaging
Fig. 4. Computational light source. (a) Light vector modulation: ptychographic iterative engine[59] and Fourier ptychographic microscopy[73]. (b) Phase modulation: structured-light 3D imaging[110] and structured illumination microscopy[111,112]. (c) Coherent imaging: optical coherence tomography[103,329] and holography[109]. (d) Time modulation: coded exposure[330] and time of flight[331]. (e) Wavelength modulation: stochastic optical reconstruction microscopy[48] and synthetic wavelength holography[57].
Fig. 5. Coding methods and experimental results of structured-illumination 3D imaging. (a) Example of a pattern sequence that combines gray-code and phase-shift projection[11]. (b) Novel phase-coding method for absolute phase retrieval[12]. (b1) The sinusoidal fringe pattern and the wrapped phase obtained from it. (b2) Phase-coding fringe and the codewords extracted from it. (c) Comparison of projection results between the phase-coding-based method and the traditional phase-shifting method[12]. (c1)–(c3) Three sinusoidal phase-shifted fringe images. (c4) Wrapped phase map. (c5)–(c7) Three phase-encoded fringe patterns. (c8) Wrapped stair phase map. (d) Phase-measuring profilometry based on the composite color-coding method[15]. (d1) Schematic of the feature-point-mapping principle. (d2) 3D shape of a stair model. (d3) Experimental result.
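The wrapped phase maps in Fig. 5(c4) and (c8) are obtained by standard N-step phase shifting. As a minimal illustration, and not the specific gray-code or phase-coding schemes of Refs. [11,12], the sketch below recovers the wrapped phase from three sinusoidal fringe images with phase shifts of -2π/3, 0, and +2π/3; the function and variable names are illustrative only.

```python
import numpy as np

def wrapped_phase_three_step(i1, i2, i3):
    """Wrapped phase from three sinusoidal fringe images.

    i1, i2, i3: intensity images captured with phase shifts of
    -2*pi/3, 0, and +2*pi/3. Returns the wrapped phase in (-pi, pi];
    recovering the absolute phase still requires an unwrapping code,
    e.g., gray-code or phase-coding fringes.
    """
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)
```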
Fig. 6. Common SIM scheme and experimental results. (a) Schematic of the four-beam experimental setup[34]. (b) Simulated imaging performance on a fibrous ground truth test image, shown as an
Fig. 7. Scheme of STEDD microscopy[43]. (a) Sketch of the STEDD, including the sequence of excitation and depletion pulses. (b) Detailed temporal sequence of fluorescence excitation. Shortly after the excitation pulse, the first STED1 pulse (intensity profile visualized in the
Fig. 8. PSF-based multicolor STORM and deep learning-based STORM. (a)–(c) Multicolor STORM[47]. (a) Raw data from the recorded super-resolution imaging movie. Insets: two enlarged example PSFs of a green label (horizontally elongated, top) and a red label (vertically elongated, bottom) with arrows indicating the elongation direction. (b) Super-resolution image obtained by localizing each emitter in the movie and assigning its color (red, microtubules; green, mitochondria). Inset: diffraction-limited data. (c) Histogram of all of the localizations within the dotted white box surrounding an
Fig. 9. Schematics and experimental results of synthetic wavelength holography (SWH) for NLOS imaging through scattering media[57]. (a) SWH image formation and reconstruction. The synthetic wavelength
Fig. 10. Multi-angle illumination lensless imaging and mask-modulated lensless imaging. (a)–(d) Multi-angle illumination lensless imaging[59]. (a) The optical setup of the multi-angle illumination lensless imaging system. (b) The corresponding forward model expression. (c) The corresponding single-shot measurement. (d) Recovered results of a USAF-1951 resolution chart. (e), (f) Mask-modulated lensless imaging[60]. (e) Forward imaging model of mask-modulated lensless imaging. (f) Comparison of the recovered images using the USAF-1951 resolution target.
Fig. 11. FPM and corresponding illumination improvement strategies. (a) Iterative recovery procedure of FPM (five steps)[62]. (b) Multiplexed coded illumination for FP with an LED array microscope[66]. (Top) Four randomly chosen LEDs are turned on for each measurement. (Middle) The captured images corresponding to each LED pattern. (Bottom) Fourier coverage of the sample’s Fourier space for each of the LED patterns (drawn to scale). (c) Experimental setup of FP based on the laser illumination source[72].
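For readers unfamiliar with the iterative recovery loop summarized in Fig. 11(a), the following is a minimal alternating-projection sketch of Fourier ptychographic reconstruction. It assumes a binary pupil mask and known sub-aperture positions from the LED geometry, and it omits pupil-function refinement and the multiplexed-illumination variant of Ref. [66]; all names are illustrative.

```python
import numpy as np

def fpm_recover(measurements, offsets, pupil, hr_shape, n_iters=10):
    """Alternating-projection Fourier ptychography sketch.

    measurements: list of low-resolution intensity images, one per LED.
    offsets:      (row, col) of each sub-aperture's top-left corner in the
                  high-resolution spectrum, assumed known from the LED geometry.
    pupil:        binary low-resolution pupil mask (coherent transfer function).
    hr_shape:     shape of the high-resolution spectrum to reconstruct.
    """
    spectrum = np.zeros(hr_shape, dtype=complex)   # crude initialization
    m, n = pupil.shape
    for _ in range(n_iters):
        for img, (r0, c0) in zip(measurements, offsets):
            # 1) Crop the sub-spectrum selected by this illumination angle.
            sub = spectrum[r0:r0 + m, c0:c0 + n] * pupil
            # 2) Propagate to the image plane (low-resolution complex field).
            lowres = np.fft.ifft2(np.fft.ifftshift(sub))
            # 3) Enforce the measured amplitude, keep the estimated phase.
            lowres = np.sqrt(img) * np.exp(1j * np.angle(lowres))
            # 4) Write the updated sub-spectrum back inside the pupil support.
            updated = np.fft.fftshift(np.fft.fft2(lowres))
            old = spectrum[r0:r0 + m, c0:c0 + n]
            spectrum[r0:r0 + m, c0:c0 + n] = old * (1 - pupil) + updated * pupil
    # The high-resolution complex object is the inverse transform of the spectrum.
    return np.fft.ifft2(np.fft.ifftshift(spectrum))
```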
Fig. 12. 3D imaging and scattering imaging based on ToF. (a) Experimental results of range-gated laser imaging based on the time slice[75]. Terrain vehicle imaged from ranges of 1.9 km (left) and 7.2 km (right). (b) The imaging results of range-gated laser 3D imaging based on intensity correlation at different distances[76]. (c) 3D structure of the towers derived from the polarization-modulated 3D imaging lidar[78]. (d) Principle and results of imaging through realistic fog with a SPAD camera[82].
Fig. 13. Experimental results of different deblurring methods. (a) Coded exposure that depends on the speed of an object’s motion[86]. Column 1: input images. Column 2: matching metric versus velocity. Column 3: deblurred results using the estimated velocity. (b) Comparison of the deblurring performance with different sequence lengths under the same exposure[88]. (b1) Sequence length = 40, single-chop duration = 3 ms. (b2) Sequence length = 120, single-chop duration = 1 ms.
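Coded (fluttered) exposure keeps the motion-blur kernel broadband, so the subsequent deconvolution stays well conditioned. The sketch below is a simple Wiener-style inversion of one image row under the assumption of horizontal, constant-velocity motion; it is not the velocity-estimation pipeline of Ref. [86], and the parameter names are hypothetical.

```python
import numpy as np

def coded_exposure_deblur(blurred_row, shutter_code, blur_len_px, noise_eps=1e-3):
    """Wiener-style deblurring of one image row for a flutter-shutter camera.

    Assumes horizontal motion at constant speed, so the blur kernel is the
    binary open/closed shutter code resampled to blur_len_px pixels.
    blurred_row is one row of the coded-exposure image.
    """
    code = np.asarray(shutter_code, dtype=float)
    # Motion-blur PSF: the shutter code stretched to the blur length.
    psf = np.interp(np.linspace(0.0, 1.0, blur_len_px),
                    np.linspace(0.0, 1.0, len(code)), code)
    psf /= psf.sum()

    n = len(blurred_row)
    psf_pad = np.zeros(n)
    psf_pad[:blur_len_px] = psf
    H = np.fft.fft(psf_pad)
    # A well-chosen code keeps |H| away from zero, so this inverse is stable.
    G = np.conj(H) / (np.abs(H) ** 2 + noise_eps)
    return np.real(np.fft.ifft(np.fft.fft(blurred_row) * G))
```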
Fig. 14. TDOCT structures and metasurface-based bijective illumination collection imaging (BICI). (a) Simplified block diagram of the TDOCT method[103]. (b) Incorporation of BICI through one arm of an interferometer (orange lines represent a single-mode fiber)[102]. (c) Tissue imaging comparison of BICI and a conventional approach[102]. Imaging swine tracheobronchial tissue specimens using a plano-convex lens with common illumination and collection paths (c1, c2, c5, and c6) and BICI (c3, c4, c7, and c8). (c9) Corresponding histology image of the tissue imaged using the conventional approach.
Fig. 17. (a) Aerial view of the remote active imaging experiment. (b) Results obtained based on different imaging algorithms. (c) Long-range 3D imaging over 45 km[119].
Fig. 19. (a) Principle of polarization difference imaging[127]. (b) Differences between conventional imaging and polarization differential imaging[128]. (c) A is the imaging result of the traditional TYO model, and B is the imaging result under active linearly polarized illumination[129]. In (d)[130], (d1) is a polarization image, (d2) is a polarization-angle image, (d3) is the result of traditional polarization differential imaging, and (d4) is the result of adaptive polarization differential imaging.
Fig. 20. (a) Schematic diagram of the atmospheric scattering model[132,133]. In (b), A and B are the best and worst polarization images, respectively, and C is the dehazing result of Schechner's method[134,135]. In (c), A is the original intensity image under dense haze, and B is the result of the multi-scale polarization dehazing algorithm[136]. (d) Comparison before and after underwater descattering[137].
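The dehazing results in Fig. 20(b) and 20(c) rest on the atmospheric scattering model of Fig. 20(a): the polarized part of the airlight is estimated from the best- and worst-polarization images and then removed. A minimal sketch of this Schechner-style recovery follows, assuming the airlight degree of polarization and its value at infinity are already known; the variable names are illustrative.

```python
import numpy as np

def polarization_dehaze(i_best, i_worst, p_air, airlight_inf):
    """Schechner-style polarization dehazing sketch.

    i_best, i_worst: images taken at the polarizer orientations that pass the
                     least and most airlight, respectively.
    p_air:           degree of polarization of the airlight (e.g., estimated
                     from a sky region).
    airlight_inf:    airlight radiance at infinite distance.
    """
    i_total = i_best + i_worst
    airlight = (i_worst - i_best) / p_air            # estimated airlight map
    transmission = 1.0 - airlight / airlight_inf     # atmospheric transmittance
    transmission = np.clip(transmission, 0.05, 1.0)  # avoid division blow-up
    return (i_total - airlight) / transmission       # recovered scene radiance
```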
Fig. 21. (a) Principle of polarization imaging based on the Stokes vector[138]. In (b)[139], (b1) is the imaging result based on Stokes-vector interpolation, and (b2) is the result of traditional differential imaging. In (c)[140], (c1) is the original polarization image, and (c2) is the result of the polarization dehazing method based on polarization-angle distribution analysis. In (d)[141], (d1) is the original intensity image, (d2) is the reconstructed target image, and (d3) is the estimate of the backscattered light.
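For linear polarization, the Stokes-based processing behind Fig. 21(a) amounts to combining four polarizer-angle images into S0, S1, and S2 and then computing the degree and angle of linear polarization. A minimal sketch is given below; it is illustrative only and is not the interpolation scheme of Ref. [139].

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Linear Stokes parameters from four polarizer-angle intensity images.

    Returns S0 (total intensity), S1, S2, the degree of linear polarization
    (DoLP), and the angle of polarization (AoP).
    """
    s0 = 0.5 * (i0 + i45 + i90 + i135)
    s1 = i0 - i90
    s2 = i45 - i135
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-12)
    aop = 0.5 * np.arctan2(s2, s1)
    return s0, s1, s2, dolp, aop
```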
Fig. 22. (a) Principle of polarization difference imaging based on the Mueller matrix[143]. In (b)[143], (b1), (b2), and (b3) are the intensity images of three targets in a highly concentrated scattering medium, (b4), (b5), and (b6) are the descattered images of the three targets under the worst linearly polarized illumination, and (b7), (b8), and (b9) are the descattered images of the three targets under the optimal linearly polarized illumination. In (c)[144], (c1) is the intensity image, (c2) is the image recovered with the proposed descattering method, and (c3) and (c4) are magnified views of the region of interest marked with a red rectangle in (c1) and (c2).
Fig. 24. (a) Super-resolution imaging through scattering media with SOSLI in comparison to other imaging techniques. (b) Principle and simulation results of SOSLI. (c) Experimental results of imaging through a ground glass diffuser with different techniques. (d) Experimental demonstration of three techniques for imaging several complex objects hidden behind a ground glass diffuser[147].
Fig. 26. (a) Wavefront shaping technology[160]. (a1) Experimental setup. (a2) System with a layer of airbrush paint present and unmodified incident wavefront. (a3) The wave was shaped to achieve constructive interference in the target. (b) Spatiotemporal focusing by optimizing a two-photon fluorescence (2PF) signal[156]. (b1) Experimental setup. (b2) 2PF images before optimization at the optimized plane (
Fig. 27. (a) Measuring the transmission matrix in the spatial domain[161]. (a1) Experimental setup. (a2) Initial grayscale image. (a3) Reconstructed image using scattered input. (b) Measuring the transmission matrix in the spatial domain[162]. (b1) Experimental setup. (b2) Pattern before inserting the scattering medium. (b3) Reconstructed image using scattered input.
Fig. 28. (a) NLOS imaging based on a streak camera[163]. (a1) The process of capturing photons. (a2) An example of streak images sequentially collected. (a3) The 2D projected view of the hidden object. (b) NLOS imaging based on SPAD[164]. (b1) Experimental setup. (b2) Objects in the scene to be reconstructed. (b3) Reconstruction of the letter T. (c) NLOS imaging based on ToF[165]. (c1) Experimental setup. (c2) Unknown object. (c3) Reconstructed depth (volume as probability). (c4) Reconstructed depth (strongest peak).
Fig. 29. (a) Shape recovery from coherence measurements[166]. (a1) Experimental setup. (a2), (a3) Plots of real and imaginary components of SCF measured for the square and equilateral triangle objects, respectively. (b) NLOS imaging based on multimodal data fusion[167]. (b1) Experimental scene. (b2) The intensity sample. (b3) The reconstruction using this intensity sample alone. (b4) The additional measurement of scattered coherence. (b5) The reconstruction when both the intensity and coherence measurements are used.
Fig. 30. In (a)[169], (
Fig. 31. In (a)[172], (
Fig. 32. (a) Corner setup. (b) Comparing HOG features in the raw frames and the denoised frames. (c) Reconstruction algorithm for 2D shape recovery and 3D localization. (d) A general diagram of the experiments. (e) A schematic diagram of the speckle correlation imaging setup with a monochromatic, pseudothermal source object in an around-the-corner geometry. (f) Image recovery under the pseudothermal setup. (g) A comparison of results under line-of-sight and NLOS conditions using the setup[174,175].
Fig. 33. General framework of computational optical systems. (a) Metalens. Metalenses can meet the needs of miniaturization and integration of optical systems[332]. (b) Simplified optical system. The simplified optical system seeks to achieve optimal performance of the system as a whole[214]. (c) Adaptive optical system. The adaptive optical imaging system is designed to eliminate the interference of complex environments on the amplitude and phase of the imaging light field[334]. (d) Coded aperture. Introducing a coded aperture increases the dimensionality of the collected information, enabling super-resolution and high-speed imaging[210]. (e) Single-pixel imaging. Only a single-pixel detector is used for spatial imaging, offering high SNR and low cost[248]. (f) Wide-area optical system. Wide-area optical systems can achieve both a large FOV and high resolution[189].
Fig. 35. Operating scheme of the hyperspectral imaging device[186]. (a) Schematic diagram of the structure of the device. (b) Schematic diagram of the basic modulation unit, comprising, from top to bottom, the metasurface, a microlens (used to increase quantum efficiency), and the CMOS image sensor. (c) Snapshot spectral imaging: the light from the object to be imaged is incident on the metasurface superunit. (d) Hyperspectral imaging chip with reconfigurable metasurface superunits placed on top of the camera.
Fig. 36. Mueller matrix imaging reflection results[187]. (a) Imaging of the Mueller matrix placed in the “Fourier plane” using a 4
Fig. 38. Wide-field optical imaging methods. (a) Multi-detector splicing systems[191,192,194]. (a1) UltraCam-D (UCD) camera detector splicing scheme. (a2) Complete focal plane array assembly of the Kepler telescope. (a3) ARGUS-IS imaging system. (a4) Full-FOV image. (b) Multi-aperture imaging system prototype and imaging results[198].
Fig. 39. (a) Coded aperture mask used in gamma-ray imaging. (b) Comparison of traditional sampling and coded exposure sampling[204]. (c) Wavefront coding imaging system[210]. (d) Schematic diagram of the coded aperture snapshot spectral imaging (CASSI) physical system[205]. (e) Schematic diagram of the CASSI imaging process[205].
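To make the CASSI imaging process of Fig. 39(d) and 39(e) concrete, the sketch below implements a simplified single-disperser forward model: every spectral band is modulated by the same coded aperture, sheared by one pixel per band by the disperser, and summed on the detector. The names are illustrative and the model omits the calibration details of Ref. [205].

```python
import numpy as np

def cassi_forward(cube, mask):
    """Single-disperser CASSI forward model sketch.

    cube: hyperspectral datacube of shape (L, H, W), with L spectral bands.
    mask: binary coded aperture of shape (H, W).
    """
    L, H, W = cube.shape
    detector = np.zeros((H, W + L - 1))
    for band in range(L):
        # Code each band with the aperture, shear it by one pixel per band,
        # and integrate all bands on the 2D detector.
        detector[:, band:band + W] += cube[band] * mask
    return detector
```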
Fig. 40. (a) Single-lens camera input image and deblurring results[211]. (b) Framework for joint end-to-end optimization of the optical design[214]. (c) Schematic of the diffractive telescope imaging experimental platform and comparison of point-target results with and without image recovery[220].
Fig. 41. Imaging results and image quality evaluation of the Cooke triplet, the doublet lens, and the optical system based on deep learning combined with wavefront coding[223]. (a) Optical structure models of the three optical systems. (b)–(f) The imaging results of the different systems at defocus distances of
Fig. 42. Adaptive optics using direct wavefront sensing[334]. (a) The distortion of the wavefront (blue lines) is directly measured with a wavefront sensor and minimized by a wavefront modulator (e.g., a deformable mirror) to improve the image quality of a telescope. Sgr
Fig. 45. (a), (b) Schematic diagrams of the two experimental setups. (c) Schematic diagram of generating a composite light pattern (
Fig. 46. Experimental setup and results of photoacoustic imaging[255]. (a) Experimental setup diagram. (b) Experimental phantom for photoacoustic imaging—distorted black polymer ribbon. (c) The
Fig. 47. (a) Single-pixel 3D imaging system. (b) Illumination laser pulses backscattered from the scene are measured as (c) broadened signals. (d) Image cubes containing images of different depths are obtained using measurement signals. (e) Each lateral position has an intensity distribution along the vertical axis, indicating depth information. (f) Reflectance and (g) depth maps can be estimated from the image cube and then used to reconstruct a 3D image of the (h) scene[256].
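Steps (d)-(g) of Fig. 47 amount to reading, at each lateral position, the temporal (depth) profile of the reconstructed image cube: the position of the profile's peak gives the depth, and its energy gives the reflectance. A minimal sketch of that read-out follows, with illustrative names and without the pulse-broadening deconvolution of Ref. [256].

```python
import numpy as np

def reflectance_and_depth(image_cube, depth_axis):
    """Reflectance and depth maps from a time-sliced image cube.

    image_cube: array of shape (n_depths, H, W); each slice is the image
                reconstructed for one time (depth) bin.
    depth_axis: 1D array of the depths associated with the slices.
    """
    peak_idx = np.argmax(image_cube, axis=0)   # index of the strongest return per pixel
    depth_map = depth_axis[peak_idx]           # depth of the strongest peak
    reflectance = image_cube.sum(axis=0)       # total backscattered energy per pixel
    return reflectance, depth_map
```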
Fig. 49. (a) Empirical phase transition graph of non-uniform wavelet bandpass sampling (NUWBS) for multi-band signal acquisition compared to the theoretical
Fig. 50. (a) Schematic of the human visual system. (b) The human eye and (c) the retina. (d) Schematic of the eye's imaging system. (e) The working mechanism of the eye. (f) Perovskite nanowires and their crystal structures[265]. (g) Schematic of the test bench used for characterization of the curved digital X-ray detector, showing the X-ray source, bone phantom, and curved digital X-ray detector[264]. (h) Imaging results acquired by the adaptive imager for objects at different distances[266].
Fig. 51. (a) The schematics of metasurface-enabled quantum edge detection. (b) The ON/OFF switch state of the heralding arm. When the idler photon of the heralding arm projects onto the surface
Fig. 52. (a) Detection results in the multi-object detection experiment. (b) Object numbers in the multi-object detection experiment[277]. (c) Video reconstructions of high-speed physical phenomena[278]. (d) Data processing flow. (e) The event denoising results of the dataset overlaid on the corresponding image[280].
Fig. 54. MFF-GAN[283]. (a) Overall fusion framework. (b) Illustration of the decision block. (c) Network architecture of the discriminator. (d) Network architecture of the generator.
Fig. 55. PIAFusion network[286]. (a) The framework of the PIAFusion network. (b) Visualized results of images and feature maps in the nighttime scenario. The first column shows the infrared image, visible image, and fused image. The following three columns present the feature maps corresponding to the infrared, visible, and fused images in various channel dimensions.
Fig. 56. MHF-net[288]. (a), (b) Observation models for the HrMS and LrHS images, respectively. (c) Construction of the training data when HrHS images are unavailable. (d) The blind MH/HS fusion network. (e) Experimental results.
Fig. 58. IE-CGAN[293]. (a) An overview of IE-CGAN. (b) Results of two methods.
Fig. 60. PWGCM[300]. (a) Overview of PWGCM. (b) Visualization of gamma correction map and the results in each iteration. (c) Results of several methods.
Fig. 61. Rivenson’s method[302]. (a) The schematic outlines the steps in the standard (top) and virtual (bottom) staining techniques. (b) Virtual staining GAN architecture. (c), (d) Virtual staining results match Masson’s trichrome stain for lung tissue sections.
Fig. 62. DPE-MEF[307]. (a) The architecture of the detail enhancement module. The numbers indicate the channel amounts. (b) The architecture of the color enhancement module. The numbers indicate the channel amounts. (c) Imaging results of DPE-MEF.
Fig. 63. HDR-GAN[309]. (a) Illustration of the proposed framework. (b) Imaging results of HDR-GAN.
Fig. 64. Experimental results of Wu’s method[313]. (a) Sample HR images, including a wall image and a grape image, which are downsampled by a factor of 4 to obtain the corresponding LR images for testing. (b)–(d) Experimental results on the low-resolution image.
Fig. 65. Experimental result of Yang’s method[317]. (a) Low-resolution image. (b) The result of bicubic interpolation. (c) Results of the proposed method.
Fig. 66. DRLN[322]. (a) The detailed network architecture of DRLN. (b) Results of different methods. The key comparison regions in the red rectangle are magnified and displayed on the right. The LR image used for reconstruction is obtained by downsampling the HR image by a factor of 4.
Fig. 67. EMASRN[324]. (a) An overview of the EMASRN network. (b) Results of different methods. The key comparison regions in the red rectangle are magnified and displayed on the right. The LR image used for reconstruction is obtained by downsampling the HR image by a factor of 4.
Fig. 68. Experimental comparisons of differential ghost imaging (DGI), GISC (GI using sparsity constraint), and GIDC in terms of both sampling ratio and reconstruction SNR[327]. (a) Schematic diagram of the experimental setup. (b) Experimental results for binary objects. (c) Experimental results for a grayscale object. (d) Experimental results on a flying drone.
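As a reference point for the DGI results in Fig. 68, the sketch below implements the basic differential ghost imaging correlation, in which each bucket value is corrected by the total intensity of its illumination pattern before correlating with the patterns. It is the textbook DGI estimator with illustrative names, not the GISC or GIDC reconstructions compared in the figure.

```python
import numpy as np

def differential_ghost_imaging(patterns, bucket):
    """Basic differential ghost imaging (DGI) correlation.

    patterns: array (K, H, W) of illumination patterns.
    bucket:   array (K,) of single-pixel (bucket) measurements.
    """
    ref = patterns.sum(axis=(1, 2))                       # total intensity of each pattern
    weight = bucket - (bucket.mean() / ref.mean()) * ref  # differential bucket signal
    # Correlate the corrected bucket signal with the patterns to form the image.
    return np.tensordot(weight, patterns, axes=(0, 0)) / len(bucket)
```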
Jinpeng Liu, Yi Feng, Yuzhi Wang, Juncheng Liu, Feiyan Zhou, Wenguang Xiang, Yuhan Zhang, Haodong Yang, Chang Cai, Fei Liu, Xiaopeng Shao, "Future-proof imaging: computational imaging," Adv. Imaging 1, 012001 (2024)
Category: Review Article
Received: May 19, 2024
Accepted: Jun. 20, 2024
Published Online: Jul. 17, 2024
The Author Email: Fei Liu (feiliu@xidian.edu.cn), Xiaopeng Shao (xpshao@opt.ac.cn)