Chinese Optics, Volume 15, Issue 6, 1105 (2022)

Resolution, super-resolution and spatial bandwidth product expansion——some thoughts from the perspective of computational optical imaging

Chao ZUO1,2,3,* and Qian CHEN2,*
Author Affiliations
  • 1Smart Computational Imaging Laboratory (SCILab), School of Electronic and Optical Engineering, Nanjing University of Science and Technology, Nanjing 210094, China
  • 2Jiangsu Key Laboratory of Spectral Imaging and Intelligent Sense, Nanjing University of Science and Technology, Nanjing 210094, China
  • 3Smart Computational Imaging Research Institute (SCIRI) of Nanjing University of Science and Technology, Nanjing 210019, China
    Figures & Tables (55)
    The typical Airy pattern consists of a bright central spot surrounded by concentric diffraction rings
    Visual representation of the "Rayleigh criterion". (a) The minimum resolvable distance (optical angular resolution) of the imaging system is inversely proportional to the aperture of the imaging system. (b-d) Airy spot images of two non-coherent point targets at different spacings
    The Abbe imaging principle explicitly divides the imaging process into two steps: frequency decomposition and synthesis
    Airy spot (a) and four widely used criteria (i.e., Rayleigh (b), Sparrow (c), Abbe (d), and FWHM (e)) for resolution computation. The gray and blue curves represent the individual intensity variations at different points in a specimen, where the vertical (y-) axis is the intensity and the horizontal (x-) axis is the lateral separation between the points. The bottom plots describe the individual contributions to the intensity distribution, while the top plots illustrate the superimposed intensity profile formed by the individual components in the respective bottom plots
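    As a numerical illustration (a minimal sketch written for this discussion, not taken from the paper), the two-point resolution predicted by each criterion can be computed from the wavelength and numerical aperture using the commonly quoted prefactors of λ/NA:

    # Minimal sketch: two-point lateral resolution under the four criteria,
    # using the commonly quoted prefactors for an incoherent, diffraction-limited system.
    def resolution_limits(wavelength_um: float, na: float) -> dict:
        """Return the resolvable two-point separation (micrometers) per criterion."""
        prefactors = {
            "Rayleigh": 0.61,  # first zero of the Airy pattern
            "Sparrow":  0.47,  # central dip of the combined pattern just vanishes
            "Abbe":     0.50,  # diffraction limit for periodic structures
            "FWHM":     0.51,  # full width at half maximum of the Airy disk
        }
        return {name: k * wavelength_um / na for name, k in prefactors.items()}

    if __name__ == "__main__":
        for name, d in resolution_limits(wavelength_um=0.55, na=1.4).items():
            print(f"{name:8s}: {d * 1000:.0f} nm")  # e.g. Rayleigh ~ 240 nm at 550 nm, NA 1.4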
    The magnitude and phase of the OTF. The former expresses the effect on intensity modulation, i.e., contrast (a), and the latter expresses the effect on the spatial position of the pattern (b). The OTF magnitude depends solely on the ratio of the minimum intensity (IMIN) of the sinusoidal pattern to its maximum (IMAX). To incorporate the effect of a possible phase shift, the OTF is represented within a unit circle in the complex plane: its real and imaginary parts reflect the phase shift, while the OTF magnitude, given by the square root of the sum of their squares, remains at unit value independent of the phase shift (c)
    Computational models and distribution profiles of the coherent transfer function and the optical transfer function under incoherent imaging conditions
    Geometric schematic of the optical transfer function under different illumination conditions. (a) Coherent and incoherent imaging cases (source aperture is infinitely small, or is greater than or equal to the objective aperture); (b) partially coherent imaging case (source aperture is smaller than the objective aperture)
    The relationship between the intensity of two point sources under coherent imaging and their phase difference. The vertical lines indicate the positions of the two point sources, where ϕ is the relative phase between the two point sources[24]
    Simulated example of a resolution report with the Siemens star for a coherent imaging system (λ = 0.40 μm, 100×, 0.8 numerical aperture objective, pixel size = 1.3 μm, with Poisson noise)[50]. (a) Ideal target image. (b) Image of the region in the white box of (a). (c) Re-imaged target center after moving it to the edge of the sensor, where aberrations further limit the effective resolution. (d) Plot of amplitude values along a segment of the blue circle in (c) at 533 nm spoke periodicity. As noisy values within ‘dark’ spokes (circled) exceed values within ‘bright’ spokes, it is not possible to unambiguously claim a resolution of 533 nm. (e) Similar plot along the red circle in (c), showing that spokes at a periodicity of 550 nm are unambiguously resolved (verified for all spokes)
    Abbe diffraction limit—optical imaging systems have difficulty seeing details of objects smaller than half the wavelength
    The process of discretization and digital recording of optical images. (a) Original ideal optical image; (b) local area discrete sampling image; (c) enlarged view of the area in the red box in (a); (d) pixel gray scale of the corresponding area
    Shannon-Nyquist sampling theorem. (a) The correct periodic variation of the signal can be captured when the sampling interval exactly satisfies the Nyquist sampling criterion; (b) the correct periodic signal cannot be captured when the sampling rate falls below the Nyquist frequency; (c) when the Nyquist criterion is satisfied, the original signal spectrum is replicated along the frequency axis without aliasing; (d) when the Nyquist criterion is not satisfied, the replicated spectra overlap and aliasing occurs
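    The aliasing behavior described in panels (b) and (d) can be reproduced with a few lines (an illustrative sketch, not from the paper): sampling a sinusoid below the Nyquist rate folds its frequency back into the measurable band.

    # Illustrative sketch: sample cos(2*pi*f*t) at rate fs and locate the dominant FFT peak.
    import numpy as np

    def apparent_frequency(f_signal: float, f_sample: float, n: int = 256) -> float:
        t = np.arange(n) / f_sample
        x = np.cos(2 * np.pi * f_signal * t)
        spectrum = np.abs(np.fft.rfft(x))
        freqs = np.fft.rfftfreq(n, d=1 / f_sample)
        return freqs[np.argmax(spectrum)]

    print(apparent_frequency(f_signal=10, f_sample=64))  # fs above the Nyquist rate (20 Hz): peak at the true 10 Hz
    print(apparent_frequency(f_signal=10, f_sample=16))  # fs below the Nyquist rate: aliased to 16 - 10 = 6 Hz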
    Nyquist sampling limit (mosaic effect) imposed by the detector pixel size. (a) Information aliasing caused by pixel under-sampling (pixel size too large); (b) the case where the Nyquist sampling limit is exactly satisfied; (c) imaging results of a typical thermal camera for a human target at different distances (pixel size of 38 μm, 320 × 240 pixels, and a lens of 50 mm focal length)
    Most thermal imaging cameras, especially cooled mid-wave and uncooled long-wave cameras, have large pixels. When such a camera is paired with a large-aperture, wide-field optical imaging system, the pixel size becomes the fundamental factor limiting imaging resolution (sampling ratio Fλ/d less than 2)[56]
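    A quick check of the sampling ratio mentioned above (with assumed example values, not figures from the paper) shows why such systems are detector-limited rather than optics-limited:

    # Sketch: the sampling ratio F*lambda/d compares the optical cutoff with the detector
    # Nyquist frequency; >= 2 means diffraction-limited, < 2 means pixel-limited.
    def sampling_ratio(f_number: float, wavelength_um: float, pixel_pitch_um: float) -> float:
        return f_number * wavelength_um / pixel_pitch_um

    # Hypothetical long-wave infrared camera: F/1 optics, 10 um wavelength, 38 um pixels.
    q = sampling_ratio(f_number=1.0, wavelength_um=10.0, pixel_pitch_um=38.0)
    print(f"F*lambda/d = {q:.2f} -> " + ("optics-limited" if q >= 2 else "pixel-limited"))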
    The characterization of common signal transformations in phase space. (a) Fresnel propagation; (b) Chirp modulation (lens); (c) Fourier transform; (d) Fractional Fourier transform; (e) beam magnifier
    For conventional optical systems, field of view and resolution are conflicting parameters that cannot be maximized simultaneously. (a) Field of view of 35 mm SLR cameras at different focal lengths; (b) typical images captured by 35 mm SLR cameras at different focal lengths
    There is a tradeoff between resolution and FOV in traditional microscopes: a low-magnification objective provides a large FOV but low resolution, while a high-magnification objective improves the resolution at the cost of a dramatically reduced FOV
    Lukosz-type superresolution system. The signal in the object plane (OP) is propagated to the first grating (G1). The encoded signal is then imaged to the conjugate plane located at the second grating (G2) by the 4f imaging system consisting of two Fourier lenses, L1 and L2. The system aperture of size A resides in the Fraunhofer plane (FP). The decoded signal is observed in the image plane (IP) of the system
    Phase-space diagram of the superresolution system. (a) Signal passing the 4f system without encoding; (b) signal with a bandwidth exceeding the pass band of the 4f system by a factor of two; (c) before the first grating (G1); (d) after G1; (e) encoded signal after passing the 4f system and before the second grating (G2); (f) after G2; (g) signal back-propagated to the image plane IP; and (h) after removing artifacts outside the signal area
    Synthetic Aperture Radar (SAR), the earliest computational imaging technique
    Synthetic Aperture Ladar (SAL)[102]. (a) Principle diagram of fiber-based synthetic aperture ladar imaging developed by the Aerospace Corporation of the United States; (b) comparison of imaging results (right: diffraction-limited result; left: synthetic aperture result)
    Schematic diagram of Fourier ptychographic microscopy
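    To make the synthetic aperture idea behind Fourier ptychography concrete, the sketch below implements the standard thin-sample forward model (a simplified illustration, not the authors' code): each oblique LED illumination shifts the object spectrum, the objective pupil low-pass filters it, and the camera records only the intensity; reconstruction then stitches the shifted pupil passbands together in the Fourier domain.

    # Sketch of the standard Fourier ptychography forward model (simplified; no camera downsampling).
    import numpy as np

    def fpm_low_res_image(obj: np.ndarray, pupil: np.ndarray, shift: tuple) -> np.ndarray:
        """obj: complex object transmission; pupil: pupil function (same shape);
        shift: spectrum shift in pixels (dy, dx) set by the LED illumination angle."""
        spectrum = np.fft.fftshift(np.fft.fft2(obj))
        shifted = np.roll(spectrum, shift, axis=(0, 1))   # oblique illumination = spectrum shift
        field = np.fft.ifft2(np.fft.ifftshift(shifted * pupil))
        return np.abs(field) ** 2                         # the camera records intensity only

    # Toy usage: 256x256 weak phase object, circular pupil of radius 30 pixels, one oblique LED.
    n = 256
    yy, xx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
    pupil = (xx ** 2 + yy ** 2 <= 30 ** 2).astype(float)
    obj = np.exp(1j * 0.5 * np.sin(2 * np.pi * xx / 16))
    img = fpm_low_res_image(obj, pupil, shift=(0, 40))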
    Reflective Fourier ptychographic imaging system and its schematic diagram[111]
    Principle of incoherent synthetic aperture technology[116]. (a) Procedure of synthetic aperture super-resolution imaging based on time- and aperture-division phase retrieval; (b) point spread function optimization based on time- and aperture-division synthetic aperture with phase retrieval; (c) comparison of images before and after super-resolution
    Structured illumination microscopy. (a) Optical train and spectral modulation process of conventional (linear) structured illumination microscopy; (b) spectral modulation process of saturated structured illumination microscopy; (c) SIM super-resolution images of f-actin in COS-7 cells and the comparison results with different methods (upper left: wide field, upper right: deconvolution, lower left: SIM, lower right: SSIM); (d) super-resolution images of caveolae in COS-7 cells with different methods (upper left: wide field, upper right: deconvolution, lower left: SIM, lower right: SSIM); (e) SSIM super-resolution results of caveolae in living COS-7 cells[117,120-121]
    The schematic diagram and results of super-resolution STED. (a) A typical STED setup; (b) the principle of STED; (c) the wide-field image of microtubules; (d) the super-resolution image of microtubules [128]
    Schematic diagram and result diagram of PALM super-resolution imaging. (a) Raw image of a single detected molecule; (b) Gaussian fitting of (a); (c) localized center of (a); (d) wide-field image of plain polystyrene beads; (e) the plain polystyrene bead image obtained by superimposing the single molecule images in the entire PALM data stack; (f) PALM super-resolution image of plain polystyrene beads
    Principles of confocal microscopy with superoscillatory illumination. The inset in the upper right corner shows the intensity distribution of a superoscillatory hotspot[141]
    Bandwidth compression via linear optical transformations. (a) Phase-space diagram of band-limited function; (b) PSD after chirping; (c) PSD after fractional Fourier transformation to recover band-limited function
    Generalized sampling of Fresnel holograms. (a) Phase-space diagram of a signal compact in space; (b) signal in the domain of sampling; (c) signal in (b) after dechirping
    Pixel-level light intensity changes caused by controllable sub-pixel movement
    Image subpixel super-resolution. (a) Image downsampling forward model; (b) spectral aliasing effect due to insufficient sampling frequency; (c) schematic diagram of subpixel shift super-resolution reconstruction
    Single-frame super-resolution image reconstruction algorithm based on deep learning[184]. (a) Block diagram of a single-frame image super-resolution neural network based on image feature extraction; (b) results of single-frame neural network super-resolution reconstruction. Although the image details become clearer, most of them do not actually match the real image
    The basic principle of multi-frame image super-resolution. Different modulations of the point spread function produce pixel-level light intensity variations (the sampling matrix)
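    A minimal shift-and-add sketch of this principle (a generic classical reconstruction, not the specific algorithm of the works cited here), assuming the sub-pixel shifts of the low-resolution frames are already known:

    # Shift-and-add: accumulate each low-res frame onto a finer grid at its known
    # sub-pixel position, then normalize by the number of samples per high-res bin.
    import numpy as np

    def shift_and_add(frames, shifts, scale: int) -> np.ndarray:
        """frames: list of (h, w) low-res images; shifts: list of (dy, dx) sub-pixel shifts
        in low-res pixel units; scale: integer upsampling factor of the output grid."""
        h, w = frames[0].shape
        acc = np.zeros((h * scale, w * scale))
        weight = np.zeros_like(acc)
        ys, xs = np.mgrid[0:h, 0:w]
        for frame, (dy, dx) in zip(frames, shifts):
            hy = np.clip(np.round((ys + dy) * scale).astype(int), 0, h * scale - 1)
            hx = np.clip(np.round((xs + dx) * scale).astype(int), 0, w * scale - 1)
            np.add.at(acc, (hy, hx), frame)
            np.add.at(weight, (hy, hx), 1.0)
        return np.divide(acc, weight, out=np.zeros_like(acc), where=weight > 0)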
    Micro-scanning device. (a) Optical refraction method; (b) flat plate rotation method; (c) piezoelectric ceramics
    The principle of coded aperture pixel super-resolution imaging[116]. (a) Schematic diagram of the optical path of the imaging system; (b) the point spread function modulated by the coded aperture compared with that of traditional fixed-aperture imaging; (c) distributions of the optical transfer function and point spread function under different patterns; (d) frequency-domain aliasing caused by insufficient spatial sampling of the detector, and the demodulated image after coded-aperture computational reconstruction
    Typical experimental results of coded aperture-based pixel super-resolution imaging technique[116]. (a) Long-wave infrared imaging system for standard resolution target imaging test; (b)−(d) comparison of imaging resolution before and after applying pixel super-resolution algorithm on USAF target and vehicle
    Gigapan panoramic imaging system and the gigapixel panorama image obtained by stitching
    Integration and stitching of multiple detectors. (a) MOA-cam3 is composed of 10 CCD4482 chips; (b) the Gaia astronomical telescope's focal plane array consists of 106 CCDs stitched together; (c) the focal plane array of the Large Synoptic Survey Telescope (LSST) is composed of 21 modules, each consisting of 9 CCD detectors; (d) the corrector mirror of the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) is composed of 24 hexagonal sub-mirrors
    ARGUS-IS system and its imaging results. (a) ARGUS-IS system appearance; (b) the system adopts 368 image sensors and four main lenses, of which 92 sensors are in one group and share one main lens. By skillfully setting the installation position of the sensors, the images obtained by each group of sensors are misaligned to complement each other, and then by image stitching, a better overall imaging result can be obtained. (c) This imaging system effectively covers a ground area of 7.2 km×7.2 km at 6000 m altitude
    Multi-camera stitching system. (a) Immerge, a light field acquisition system developed by Lytro; (b) Stanford semi-ring camera array system; (c) Stanford planar camera array system; (d) CAMatrix ring camera array system; (e) Tsinghua University birdcage camera array system
    Bionic compound eye imaging systems. (a) Panoptic, a bionic compound eye imaging device designed and developed by a research team at the Swiss Federal Institute of Technology Lausanne (EPFL); (b) the large field-of-view, high-resolution OMNI-R system; (c) the Evryscope ground-based telescope developed by Nicholas Law
    Multiscale imaging system. (a) AWARE-2 architecture; (b) AWARE-10 architecture; (c) AWARE-40 architecture
    SBP representation for different Fresnel-type holograms in Wigner space. (a) Original SBP of the object; (b) SBP for in-line geometry; and (c) SBP for off-axis geometry matching on the different elementary apertures
    Synthetic aperture-based digital holographic microscopy[220]. (a) Spectrum after synthetic aperture; (b) conventional single-aperture low-resolution reconstruction; (c) high-resolution reconstruction after synthetic aperture
    Fourier ptychographic microscopy uses a 2×, 0.08 NA objective and, through synthetic aperture, achieves an imaging resolution approximately equivalent to that of a 20× objective over its large field of view[91]
    High throughput quantitative microscopic imaging based on annular illumination Fourier ptychographic microscopy[36]
    Schematic diagrams of the sub-pixel super-resolution based lensfree on-chip imaging setup. (a) Sub-pixel shifting of illumination source[57]; (b) 2D horizontal sub-pixel sensor motion[264]; (c) fiber-optic array based source scanning[268]; (d) illumination wavelength scanning[266]; (e) axial scanning with multiple sample-to-sensor distances[265]; (f) active source micro-scanning using parallel plates[267]
    Lens-free microscopic imaging results. (a) Full-field results of embryonic stem cells from a projection-based imaging system[255]; (b1−b9) time-series results of subpixel shift super-resolution in the red-boxed region in (a); (c) full-field recovery results of Hela cells based on a lens-free on-chip microscopy system with multi-wavelength scanning[269]; (d) time-series results of pixel super-resolution of Area1 in (c)
    Extrapolation of the entire signal from spectrum information over a finite interval
    Illustration of the SIM and PSIM principles[298]. (a) SIM; (b) PSIM; (c) schematic of the dispersion curves of the propagating photon in dielectric media
    • Table 1. Properties of WDF

      Properties | Representation | Explanation
      Realness | $W(\boldsymbol{x},\boldsymbol{u}) \in \mathbb{R}$ | W is always a real function
      Spatial marginal property | $I(\boldsymbol{x}) = \int W(\boldsymbol{x},\boldsymbol{u})\,\mathrm{d}\boldsymbol{u}$ | $I(\boldsymbol{x})$ is the intensity
      Spatial frequency marginal property | $S(\boldsymbol{u}) = \int W(\boldsymbol{x},\boldsymbol{u})\,\mathrm{d}\boldsymbol{x}$ | $S(\boldsymbol{u})$ is the power spectrum
      Convolution property | $U(\boldsymbol{x}) = U_1(\boldsymbol{x})\,U_2(\boldsymbol{x}) \;\Rightarrow\; W(\boldsymbol{x},\boldsymbol{u}) = W_1(\boldsymbol{x},\boldsymbol{u}) \mathop{\otimes}\limits_{\boldsymbol{u}} W_2(\boldsymbol{x},\boldsymbol{u})$; $U(\boldsymbol{x}) = U_1(\boldsymbol{x}) \mathop{\otimes}\limits_{\boldsymbol{x}} U_2(\boldsymbol{x}) \;\Rightarrow\; W(\boldsymbol{x},\boldsymbol{u}) = W_1(\boldsymbol{x},\boldsymbol{u}) \mathop{\otimes}\limits_{\boldsymbol{x}} W_2(\boldsymbol{x},\boldsymbol{u})$ | $\mathop{\otimes}\limits_{\boldsymbol{u}}$ is the convolution over u; $\mathop{\otimes}\limits_{\boldsymbol{x}}$ is the convolution over x
      Instantaneous frequency | $\dfrac{\int \boldsymbol{u}\,W(\boldsymbol{x},\boldsymbol{u})\,\mathrm{d}\boldsymbol{u}}{\int W(\boldsymbol{x},\boldsymbol{u})\,\mathrm{d}\boldsymbol{u}} = \dfrac{1}{2\pi}\nabla\phi(\boldsymbol{x})$ | $\phi(\boldsymbol{x})$ is the phase component; $\nabla\phi(\boldsymbol{x})$ is the instantaneous frequency
    • Table 2. Common optical transformations of WDF

      Optical transformations | Representation | Explanation
      Fresnel diffraction | $W_z(\boldsymbol{x},\boldsymbol{u}) = W_0(\boldsymbol{x} - \lambda z\boldsymbol{u},\, \boldsymbol{u})$ | λ is the wavelength; z is the diffraction distance
      Chirp modulation (lens) | $W(\boldsymbol{x},\boldsymbol{u}) = W_0\!\left(\boldsymbol{x},\, \boldsymbol{u} + \dfrac{\boldsymbol{x}}{\lambda f}\right)$ | λ is the wavelength; f is the focal length of the lens
      Fourier transform (Fraunhofer diffraction) | $W_{\hat U}(\boldsymbol{x},\boldsymbol{u}) = W_U(-\boldsymbol{u},\, \boldsymbol{x})$ | $\hat U$ is the Fourier transform of the signal
      Fractional Fourier transform | $W_{\hat U_\theta}(\boldsymbol{x},\boldsymbol{u}) = W_U(\boldsymbol{x}\cos\theta - \boldsymbol{u}\sin\theta,\; \boldsymbol{u}\cos\theta + \boldsymbol{x}\sin\theta)$ | $\hat U_\theta$ is the fractional Fourier transform; θ is the rotation angle
      Beam magnifier (compressor) | $W(\boldsymbol{x},\boldsymbol{u}) = W_0(\boldsymbol{x}/M,\, M\boldsymbol{u})$ | M is the magnification
      First-order optical system | $\begin{bmatrix} \boldsymbol{x}' \\ \boldsymbol{u}' \end{bmatrix} = \begin{bmatrix} A & B \\ C & D \end{bmatrix}\begin{bmatrix} \boldsymbol{x} \\ \boldsymbol{u} \end{bmatrix}$ | A, B, C, D are the ray-transfer (ABCD) matrix elements of the first-order optical system
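    To make the phase-space description concrete, the following sketch (a discrete, periodic-boundary approximation written for this note, not code from the paper) computes the 1D Wigner distribution and checks the spatial marginal property listed in Table 1:

    # Discrete 1D Wigner distribution W(x, u) and a check of the spatial marginal property.
    import numpy as np

    def wigner_1d(u_field: np.ndarray) -> np.ndarray:
        """u_field: complex 1D field on n samples; returns an (n, n) real-valued WDF."""
        n = u_field.size
        shifts = np.arange(n) - n // 2                          # symmetric offsets x'
        corr = np.zeros((n, n), dtype=complex)
        for j, s in enumerate(shifts):
            # correlation kernel U(x + x') U*(x - x') with periodic boundaries
            corr[:, j] = np.roll(u_field, -s) * np.conj(np.roll(u_field, s))
        w = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(corr, axes=1), axis=1), axes=1)
        return np.real(w)

    field = np.exp(-np.linspace(-4, 4, 128) ** 2).astype(complex)   # Gaussian test beam
    W = wigner_1d(field)
    marginal = W.sum(axis=1) / field.size                           # discrete integral over u
    print(np.max(np.abs(marginal - np.abs(field) ** 2)))            # ~0: marginal equals intensity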
    • Table 3. Spatial bandwidth product of typical 35 mm SLR lenses

      Focal length/mm | Field of view (diagonal)/(°) | Typical F# | Equivalent NA | Resolution/μm (λ = 550 nm) | SBP/MP | Angular resolution/mrad
      8 | 180 | 3.5 | 0.14 | 2.396 | 3.35 | 0.29
      20 | 94.5 | 1.8 | 0.27 | 1.242 | 12.4 | 0.06
      50 | 46.8 | 1.2 | 0.41 | 0.818 | 28.7 | 0.016
      85 | 28.6 | 1.4 | 0.35 | 0.959 | 20.9 | 0.011
      100 | 24.4 | 2.8 | 0.17 | 1.973 | 4.9 | 0.018
      200 | 12.3 | 4 | 0.12 | 2.795 | 2.4 | 0.013
      400 | 6.2 | 5.6 | 0.08 | 4.193 | 1.1 | 0.009
      1000 | 2.5 | 8 | 0.06 | 5.591 | 0.61 | 0.005
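    The resolution and angular-resolution columns above follow from the F-number and focal length alone; the minimal sketch below uses the assumed relations NA ≈ 1/(2F#) and the Rayleigh limit 0.61λ/NA, so its values match the table entries only up to rounding:

    # Sketch: approximate NA, Rayleigh resolution and angular resolution of a camera lens.
    def lens_metrics(focal_mm: float, f_number: float, wavelength_um: float = 0.55):
        na = 1.0 / (2.0 * f_number)                  # paraxial approximation NA ~ 1/(2 F#)
        resolution_um = 0.61 * wavelength_um / na    # Rayleigh two-point resolution
        angular_res_mrad = resolution_um / focal_mm  # image-side resolution / focal length
        return na, resolution_um, angular_res_mrad

    for focal, fno in [(8, 3.5), (50, 1.2), (1000, 8)]:
        na, res, ang = lens_metrics(focal, fno)
        print(f"f = {focal} mm, F/{fno}: NA = {na:.2f}, resolution = {res:.2f} um, {ang:.3f} mrad")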
    • Table 4. Spatial bandwidth product of typical microscope objectives

      Objective (Magnification/Numerical aperture/Field number) | Resolution/nm (λ = 532 nm) | SBP/MP
      1.25×/0.04/26.5 | 8113 | 21.5
      2×/0.08/26.5 | 4057 | 33.5
      4×/0.16/26.5 | 2028 | 33.5
      10×/0.3/26.5 | 1082 | 18.9
      20×/0.5/26.5 | 649 | 13.1
      40×/0.75/26.5 | 433 | 7.4
      60×/0.9/26.5 | 361 | 4.7
      100×/1.3/26.5 | 250 | 3.5
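    The entries above can be reproduced (to within rounding) from a few assumed relations: Rayleigh resolution 0.61λ/NA, a field-of-view diameter equal to the field number divided by the magnification, and the SBP counted as Nyquist-sampled pixels (two per resolution element) over the circular field of view:

    # Sketch: resolution and spatial bandwidth product (SBP) of a microscope objective.
    import math

    def objective_sbp(mag: float, na: float, field_number_mm: float, wavelength_nm: float = 532):
        resolution_nm = 0.61 * wavelength_nm / na              # Rayleigh resolution
        fov_diameter_um = field_number_mm * 1e3 / mag          # field of view at the sample plane
        fov_area_um2 = math.pi * (fov_diameter_um / 2) ** 2
        pixel_um = resolution_nm / 2 / 1e3                     # Nyquist sampling: 2 pixels per resolution element
        sbp_mp = fov_area_um2 / pixel_um ** 2 / 1e6
        return resolution_nm, sbp_mp

    for mag, na in [(1.25, 0.04), (10, 0.3), (100, 1.3)]:
        res, sbp = objective_sbp(mag, na, field_number_mm=26.5)
        print(f"{mag}x / NA {na}: resolution = {res:.0f} nm, SBP = {sbp:.1f} MP")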
    Chao ZUO, Qian CHEN. Resolution, super-resolution and spatial bandwidth product expansion——some thoughts from the perspective of computational optical imaging[J]. Chinese Optics, 2022, 15(6): 1105

    Paper Information

    Category: Review

    Received: Jun. 2, 2022

    Accepted: --

    Published Online: Feb. 9, 2023

    The Author Email:

    DOI: 10.37188/CO.2022-0105
