Acta Optica Sinica (Online), Vol. 1, Issue 1, 0111001 (2024)
Computational Imaging Based on Focal Plane-Coded Modulation: A Review (Invited)
Fig. 1. Diagram of the radiant pyramid of light rays drawn by da Vinci in the 16th century[1]
Fig. 2. Schematic of the plenoptic function, denoted as $P(x, y, z, \theta, \varphi, \lambda, t)$
Fig. 3. Processing flow of computational imaging based on focal plane-coded modulation
Fig. 4. Classification and representative methods of computational imaging technology based on focal plane-coded modulation
Fig. 5. Correspondence between image pixel resolution and pixel size (Nyquist sampling). (a) The detector's pixel pitch is directly related to its minimum resolvable linewidth; (b) relationship between pixel size (Nyquist sampling) and sampled information: as the sampling frequency increases, more image detail can be resolved; (c) detection results of a long-wave infrared system imaging building targets at different distances (pixel size of 38 μm, pixel resolution of 320×240, lens focal length of 50 mm)
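As a worked example of this sampling arithmetic (a minimal Python sketch using the caption's 38 μm pixel pitch and 50 mm focal length; the small-angle approximation and the chosen imaging distances are illustrative assumptions):

```python
# Sampling arithmetic behind Fig. 5 (small-angle approximation assumed).
pixel_pitch = 38e-6    # detector pixel pitch [m], from the caption
focal_length = 50e-3   # lens focal length [m], from the caption

# Instantaneous field of view of a single pixel
ifov = pixel_pitch / focal_length   # = 0.76 mrad

# By Nyquist sampling, the smallest resolvable period spans two pixels,
# so the minimum resolvable linewidth equals one pixel footprint.
for distance in (100, 500, 1000):   # illustrative imaging distances [m]
    footprint = ifov * distance     # scene extent covered by one pixel [m]
    print(f"{distance:5d} m -> {footprint * 100:.0f} cm per pixel")
```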
Fig. 6. Single-frame image super-resolution reconstruction algorithm based on image feature extraction[36-37]. (a) Structure of the single-frame image super-resolution neural network based on image feature extraction; (b) comparison of super-resolution reconstruction results of the single-frame image neural network
Fig. 7. Basic principle of passive micro-scanning sub-pixel displacement super-resolution imaging[38]
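The pixel-rearrangement core of such micro-scanning schemes can be sketched in a few lines (a minimal illustration assuming an ideal 2×2 half-pixel displacement sequence; the function name is hypothetical, and practical systems additionally require registration and deconvolution):

```python
import numpy as np

def interlace_2x2(frames):
    """Minimal shift-and-add sketch of 2x2 micro-scanning super-resolution.

    `frames` is a list of four low-resolution images acquired with
    half-pixel displacements (0,0), (0,1/2), (1/2,0), (1/2,1/2) in pixel
    units; interleaving them on a 2x-denser grid recovers the higher
    sampling density.
    """
    h, w = frames[0].shape
    hr = np.zeros((2 * h, 2 * w), dtype=frames[0].dtype)
    hr[0::2, 0::2] = frames[0]   # (0, 0) shift
    hr[0::2, 1::2] = frames[1]   # (0, 1/2) shift
    hr[1::2, 0::2] = frames[2]   # (1/2, 0) shift
    hr[1::2, 1::2] = frames[3]   # (1/2, 1/2) shift
    return hr
```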
Fig. 8. Principle of Google handheld super-resolution imaging technology[40]
Fig. 9. Micro-scanning devices. (a) MEMS micro-scanning device; (b) galvanometer; (c) piezoelectric ceramic micro-scanning device
Fig. 10. Schematic of infrared micro-scanning imaging system[41]. (a) Physical system diagram; (b) imaging principle diagram
Fig. 11. Sony sub-pixel super-resolution imaging technology[42]. (a) Principle of Sony pixel shift multi-shooting technology; (b1) super-resolution imaging results of texture detail scenes; (b2) super-resolution imaging results of architectural detail scenes
Fig. 12. Jitter camera system[43]. (a) Physical setup and imaging principle of the jitter camera; (b1) low-resolution video image captured by the jitter camera system; (b2) video super-resolution reconstruction results based on active micro-scanning
Fig. 14. Imaging systems based on binary optical switching devices and corresponding reconstruction results. (a) Large-aperture infrared monitoring system based on aperture coding developed by QinetiQ (the role of the swing mirror is to automatically switch between the blackbody radiation source and the target scene)[47]; (b1) binary optical switch component based on MOEMS and (b2) physical image of the device developed by QinetiQ[47]; (c1) low-resolution infrared image and (c2) super-resolution reconstruction result achieved by the system[48]
Fig. 15. Near-infrared super-resolution imaging system based on compressed sensing (CS) theory, developed by Rice University[49]
Fig. 16. Compressed coded-aperture super-resolution imaging results[36]. (a) Ground-truth image; (b) uncoded low-resolution image, with a root-mean-square error (RMSE) of 0.1011; (c) modulated low-resolution image; (d) reconstruction result of the modulated low-resolution image, with an RMSE of 0.0867; (e)(f) reconstructed images after adjusting the parameters h and p, with RMSEs of 0.0897 and 0.0924, respectively
Fig. 17. Coded-aperture super-resolution imaging scheme based on CS theory[51]. (a) Principle of coded-aperture super-resolution imaging based on CS theory; (b) forward model of coded-aperture super-resolution imaging based on CS theory; (c1) original infrared image with 240 pixel×240 pixel; (c2) coded low-resolution infrared image; (c3) reconstructed super-resolution image; (c4) error between the ground-truth image and the super-resolution image; (c5) PSNR of the reconstructed image at different reconstruction magnifications
Fig. 18. Super-resolution imaging scheme based on time delay integration (TDI) and coded exposure[52]. (a) Architecture of the TDI CECI system; (b) experimental setup of the TDI CECI system; (c) timing synchronization scheme between coded exposure and TDI line transfer; (d) super-resolution comparison results
Fig. 19. Camera array imaging schemes[53]. (a1) Synthetic image captured by four cameras at different exposure times; (a2) local detail of the synthetic image captured by four cameras; (a3) image captured by a Canon 20D; (a4) enlarged local detail of the image captured by the Canon 20D; (b1) a camera array equipped with telephoto lenses for high-resolution imaging; (b2) a camera array equipped with wide-angle lenses for high-speed video imaging; (b3) a compound-eye array system equipped with an independent processor
Fig. 20. Sub-pixel super-resolution imaging method based on a camera array[55]. (a) Four-camera-array super-resolution system; (b) comparison between super-resolution reconstruction results and low-resolution images; (c) schematic of the forward model and sub-pixel super-resolution principle for camera array imaging
Fig. 21. Super-resolution reconstruction results based on a microlens array[56]. (a) Schematic of the relative displacement of sub-images generated by each lens in the lens array; (b1)(b4)(b7) low-resolution images of different scenes; (b2)(b5)(b8) super-resolution reconstruction results using the proposed method; (b3)(b6)(b9) super-resolution results using pixel rearrangement; (c) sub-pixel super-resolution imaging flowchart for a microlens-array compound-eye camera
Fig. 22. Super-resolution imaging scheme based on a microlens array[57]. (a) Principle of the microlens array system; (b) low-resolution images captured by the microlens array; (c) physical image of the system; (d)(e) simulation reconstruction results of different super-resolution algorithms
Fig. 23. Light field sensing based on focal plane coding. (a) Four-dimensional (4D) parameterization of light field; (b) light field coded image stack obtained by encoding at the focal plane; (c) application of light field sensing in all-focus imaging and aberration correction
Fig. 24. Shack-Hartmann wavefront sensor[59]. The wavefront over each sub-aperture segmented by the microlens array is focused into a spot with a distinct intensity distribution on the image plane. The local wavefront slope of each sub-aperture can be calculated by accurately detecting the centroid position of the spot in that sub-aperture, and a wavefront reconstruction algorithm is then used to recover the overall wavefront shape
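The slope measurement described here can be sketched as follows (a simplified illustration; the helper name and arguments are hypothetical, and the final zonal or modal wavefront reconstruction from the slope map is omitted):

```python
import numpy as np

def local_slopes(subimage, pitch, lenslet_f, ref=None):
    """Estimate the local wavefront slope behind one Shack-Hartmann lenslet.

    The spot centroid displacement from the reference (on-axis) position,
    divided by the lenslet focal length, gives the average wavefront
    slope over that sub-aperture. `pitch` is the sensor pixel pitch [m],
    `lenslet_f` the lenslet focal length [m].
    """
    ys, xs = np.indices(subimage.shape)
    total = subimage.sum()
    cy = (ys * subimage).sum() / total * pitch   # spot centroid y [m]
    cx = (xs * subimage).sum() / total * pitch   # spot centroid x [m]
    ref_y, ref_x = ref if ref is not None else (
        (subimage.shape[0] - 1) / 2 * pitch,     # default: sub-aperture center
        (subimage.shape[1] - 1) / 2 * pitch,
    )
    return (cx - ref_x) / lenslet_f, (cy - ref_y) / lenslet_f
```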
Fig. 26. Plenoptic cameras for macroscopic imaging[64]. (a) Structural comparison between conventional cameras and plenoptic cameras 1.0 and 2.0; (b) comparison of recorded information between conventional cameras and plenoptic cameras 1.0 and 2.0; the diagram shows the sampling of an object at the focal plane, and light from outside the focal plane is clipped; (c) different hardware implementations of plenoptic cameras
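A standard capability enabled by such plenoptic sampling is shift-and-add digital refocusing, sketched below (a minimal illustration with integer-pixel shifts; the function name, the parallel-list interface, and the refocus parameter alpha are assumptions, not the papers' implementation):

```python
import numpy as np

def refocus(subaperture_imgs, uv_coords, alpha):
    """Shift-and-add digital refocusing sketch for a plenoptic camera.

    `subaperture_imgs` and `uv_coords` are parallel lists: one 2D
    sub-aperture image per angular sample (u, v). Shifting each image by
    alpha*(u, v) pixels and averaging synthesizes a photograph focused at
    a new depth; alpha parameterizes the chosen refocus plane. Integer
    shifts via np.roll stand in for proper sub-pixel interpolation.
    """
    acc = np.zeros_like(subaperture_imgs[0], dtype=float)
    for img, (u, v) in zip(subaperture_imgs, uv_coords):
        acc += np.roll(img, (round(alpha * v), round(alpha * u)), axis=(0, 1))
    return acc / len(subaperture_imgs)
```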
Fig. 27. Meta-imaging sensor. (a) Meta-imaging sensor, composed of a circular intensity mask with a periodic pattern, a microlens array, a piezo stage, and a conventional CMOS sensor[71]; (b) digital adaptive-optics reconstruction of images through dynamic turbulence, carried out with a ground-based telescope[72]
Fig. 30. Digital adaptive optics scanning light-field mutual iterative tomography (DAOSLIMIT) imaging system[77]. (a) Principle of scanning light field microscopy in DAOSLIMIT and the process of digital adaptive optics; (b) application of DAOSLIMIT to long-term, high-speed subcellular imaging in mice
Fig. 31. Dappled photography light field imaging[79]. (a) Light field camera structure: a narrowband two-dimensional cosine mask is placed near the sensor (as shown in the bottom-left corner); (b) light field camera principle: in ray space, a cosine mask at distance d casts a shadow on the sensor; in the Fourier domain, the scene spectrum (left, green) is convolved with the mask spectrum (middle) to produce offset spectral tiles (right); (c) spectrum of the acquired image and reconstruction results, with 81 spectral tiles corresponding to an angular resolution of 9×9
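The Fourier-domain relation described in (b) and (c) can be summarized schematically as follows, where $f_0$ is the fundamental frequency of the cosine mask, $c_{k_1 k_2}$ are modulation coefficients, and $\hat{L}(\cdot\,;\,k_1, k_2)$ denotes the light-field spectrum at the $(k_1, k_2)$-th angular frequency sample (a simplified sketch, not the paper's exact notation):

$$\hat{y}(f_x, f_y) \;=\; \sum_{k_1=-4}^{4}\ \sum_{k_2=-4}^{4} c_{k_1 k_2}\, \hat{L}\big(f_x - k_1 f_0,\; f_y - k_2 f_0\,;\; k_1, k_2\big)$$

Demultiplexing the 9×9 = 81 spectral tiles and applying an inverse 4D Fourier transform then recovers the light field at 9×9 angular resolution.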
Fig. 32. Light field imaging based on a diffuser[80]. (a) Procedure for recording and reconstructing the light field using a diffuser: object light propagates through the imaging lens and the diffuser to the sensor, caustics encode spatial and angular information at the focal plane, and the light field containing 3D information is reconstructed by solving a linear inverse problem, enabling digital refocusing and other functions; (b) light field reconstruction of two playing cards
Fig. 33. Light field microscopic imaging based on diffusers. (a) Fourier DiffuserScope, where the diffuser is placed in the Fourier plane of the objective (relayed by a 4f system) and the sensor is placed one microlens focal length behind the diffuser; from a single 2D sensor measurement, combined with a stack of previously calibrated point spread functions, the 3D object can be reconstructed by solving a sparsity-constrained inverse problem[81]; (b) Miniscope3D, where a 55 μm thick optimized phase mask is placed at the aperture stop (Fourier plane) of the objective; a sparse set of calibrated point spread functions (64 per depth) is collected by scanning 2.5 μm green fluorescent beads throughout the volume, and this dataset is used to pre-compute an accurate forward model of aberrations varying with the field of view, enabling the reconstruction of 3D volumes from a single 2D measurement[82]
Fig. 34. Lensless imaging systems[90-91]. (a) Forward models of lensless imaging with different modulation modes; (b) characteristics of the point spread function (PSF) in lensless imaging, including depth dependence and lateral dependence, which provide depth encoding and 2D intensity encoding for lensless imaging systems, respectively; (c) 3D reconstruction by solving the inverse problem
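The inverse problem in (c) is typically posed as regularized least squares; a minimal proximal-gradient sketch follows (the forward/adjoint operators `A`/`At`, step size, and regularization weight are caller-supplied assumptions; published pipelines typically use ADMM or FISTA with total-variation or native sparsity priors):

```python
import numpy as np

def reconstruct(y, A, At, n_iter=200, step=1e-3, tau=1e-2):
    """Proximal-gradient sketch for the lensless imaging inverse problem.

    Solves min_x 0.5*||A(x) - y||^2 + tau*||x||_1 with x >= 0, where `A`
    and `At` are forward and adjoint operators (for a shift-invariant PSF
    they reduce to convolution and correlation with the PSF).
    """
    x = np.zeros_like(At(y))
    for _ in range(n_iter):
        grad = At(A(x) - y)                                      # data-fidelity gradient
        x = x - step * grad                                      # gradient step
        x = np.maximum(np.abs(x) - step * tau, 0) * np.sign(x)   # L1 soft-threshold
        x = np.maximum(x, 0)                                     # nonnegativity (intensity)
    return x
```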
Fig. 36. FlatScope[96]. (a) Prototype and its mask; (b) 3D volumetric reconstruction of a moving 10 μm fluorescent bead
Fig. 39. DiffuserCam[90]. (a) System setup and reconstruction processes; (b) 3D reconstruction result of slanted resolution target, cropped to 640 voxel×640 voxel×50 voxel; (c) 3D reconstruction of plant leaves, cropped to 480 voxel×320 voxel×128 voxel
Fig. 40. On-chip wide-field fluorescence microscopy based on random microlens scattering[106]. (a) System architecture and sparse PSF calibration method; (b) fluorescent beads flow in microfluidic channels; (c) NeuroD:GCaMP6f larval zebrafish
Fig. 41. PhlatCam[107]. (a) Comparison of PhlatCam with conventional imaging system; (b) PSF design; (c) refocusing experimental results; (d) 3D reconstruction experimental results
Fig. 42. Lensless cameras based on programmable masks, which can be divided into "expanding imaging function" (blue background) and "improving imaging quality" (orange background). (a) Structure of a camera with a multi-layer programmable LCD mask[109]; (b) camera structure based on CS[112]; (c) SweepCam[114]; (d) structure of a coded-aperture camera whose mask pattern is changed by rotation[115]; (e) NoRDS-CAIC[116]
Fig. 44. Low-cost integrated hyperspectral imaging sensor[130]. (a) Composition and structure of the integrated hyperspectral imaging sensor; (b) transmission responses of 16 materials, average light throughput of each material, and the correlation coefficient matrix between the spectral responses; (c) experimental results of this hyperspectral imaging sensor
Fig. 45. Scheme of real-time hyperspectral imaging[131]. (a) Conceptual scheme of compressive hyperspectral sensing; (b) transmittance patterns of a coded mask at four different wavelengths; (c) structure of the coded-mask hyperspectral sensor
Fig. 46. Scheme of a metasurface-based narrowband filter[133]. (a) Optical image of the fabricated 100-pixel metasurface; (b) linear relationship between scaling factor and ellipse feature size, confirmed by SEM images; (c) scheme of the imaging system; (d)(e) reflectance images of the pixelated metasurface recorded at four specific wavenumbers
Fig. 47. Spectral sensor based on photonic crystal filter[142]. (a) Sensor principle and device image; (b) experimental results of spectral imaging
Fig. 48. Spectral sensor chip based on quantum dot filter and its spectral reconstruction algorithm[145]
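The reconstruction step shared by such filter-array spectral sensors reduces to inverting a linear system; a minimal sketch follows (ridge regression is used here as the simplest placeholder for the papers' sparse-recovery algorithms; the function name and regularization weight are illustrative):

```python
import numpy as np

def recover_spectrum(y, A, lam=1e-3):
    """Ridge-regression sketch of filter-array spectral reconstruction.

    Each detector reading y[i] is modeled as the inner product of the
    unknown scene spectrum (N bins) with the i-th filter transmission
    curve (row i of A, shape M x N). With M << N the system is
    underdetermined, so Tikhonov regularization stabilizes the solve.
    """
    N = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(N), A.T @ y)
```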
Fig. 49. DiffuserSpec[146]. (a) Schematic of a conventional spectrometer and DiffuserSpec; (b) reconstruction algorithm and comparison of coding patterns at 818 nm and 828 nm; (c) reconstruction results of narrowband spectra; (d) analysis of spectral resolution
Fig. 50. Spectral DiffuserCam[147]. (a) Overview of Spectral DiffuserCam imaging; (b) PSF calibration at different wavelengths; (c) reconstruction results of hyperspectral images
Fig. 51. Ultra-simplified computational spectrometer[148]. (a) Principle and structure of the system; (b) reconstruction results of spectral images
Fig. 52. THETA multi-spectral camera[150]. (a) Scheme of the system; (b) multi-spectral imaging of the diffusion process of sewage in a miniature river landscape
Fig. 53. Global shutter periodic scene imaging[165]. (a) System principle; (b) reconstruction results of tool head at different rotation speeds
Fig. 54. Programmable pixel compressive camera (P2C2)[166]. (a) Schematic of P2C2 principle; (b) optical system diagram and actual setup; (c) comparison of reconstructed results from a video sequence showing a marble dropping into water
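The forward model common to P2C2-style per-pixel coded exposure can be sketched compactly (the function, array shapes, and random codes are illustrative assumptions; the optical-flow-regularized sparse reconstruction that recovers the sub-frames is omitted):

```python
import numpy as np

def coded_exposure_measurement(video, masks):
    """Forward-model sketch of per-pixel coded temporal compression.

    `video` has shape (T, H, W) and `masks` is a (T, H, W) binary array
    of per-pixel shutter codes; the sensor integrates the modulated
    sub-frames into a single coded image, from which the T sub-frames
    are later recovered by sparse reconstruction (not shown here).
    """
    return (video * masks).sum(axis=0)

# Toy usage with random data and random codes (illustrative only)
rng = np.random.default_rng(0)
video = rng.random((8, 64, 64))               # 8 high-speed sub-frames
masks = rng.integers(0, 2, size=(8, 64, 64))  # binary per-pixel codes
coded = coded_exposure_measurement(video, masks)
```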
Fig. 55. Video extraction from a single coded-exposure photograph using a learned over-complete dictionary[169]. (a) Prototype of the system, simulating per-pixel exposure on the sensor surface via an LCoS; (b) the proposed method consists of three parts, i.e., coded exposure sampling that projects the spatiotemporal volume onto a single image, learning an over-complete dictionary from training video data, and sparse reconstruction of the spatiotemporal information from a single coded image; (c) experimental reconstruction results
Fig. 56. Translation-mask temporal multiplexing coded imaging[171]. (a) Forward sensing model for translation-mask temporal multiplexing coded imaging; (b) schematic of the system design; (c) reconstruction of a card in a scene with arbitrary motion, using a constrained least-squares method with a high-pass filter
Fig. 57. Coded rolling shutter[167]. (a) The address generator in the CMOS image sensor is used to implement a coded rolling shutter with the desired row-reset and row-select patterns for flexible spatiotemporal sampling; (b) interlaced readout for high-speed photography
Fig. 58. High-speed video from a single rolling-shutter image captured by a lensless computational camera[172]. (a) Algorithm principle; (b) image formed from a temporally varying scene, where two point sources (one yellow and one blue) flash at distinct y positions, at times t0 and t1; (c) experimental video reconstructed from a single captured image with an exposure time of 660 μs
Fig. 59. Compressed ultrafast photography[164]. (a) Schematic of the system; (b) representative time frames showing a laser pulse reflected by a mirror in air, a pulse refracted at an air-resin interface, and a race between two laser pulses
Fig. 60. Multi-encoding CUP and corresponding imaging results[175]. (a) Schematic of data acquisition, where t represents time, Ck denotes spatial encoding operators, S is the temporal shearing operator, and T is the spatiotemporal integration operator; (b) schematic of the system; (c) experimental results of spatially modulated laser pulses under different encoding numbers
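In the operator notation of this caption, the multi-encoding measurement and its reconstruction can be summarized as follows (a schematic statement; the regularizer $\Phi$ and weight $\beta$ are generic placeholders rather than the paper's exact formulation):

$$E_k \;=\; \mathbf{T}\,\mathbf{S}\,\mathbf{C}_k\, I(x, y, t), \qquad \hat{I} \;=\; \arg\min_{I}\ \sum_{k=1}^{K} \tfrac{1}{2}\,\big\lVert E_k - \mathbf{T}\mathbf{S}\mathbf{C}_k I \big\rVert_2^2 \;+\; \beta\,\Phi(I)$$

That is, each encoded view is the spatiotemporal integral of the sheared, spatially encoded dynamic scene, and the scene is recovered jointly from all $K$ encodings by regularized least squares.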
Fig. 61. Compressed ultrafast photography at a frame rate of 10¹³ frame/s[177]. (a) Schematic of the system; (b) T-CUP performs real-time imaging of fundamental optical phenomena: laser pulse scanning, spatial focusing, reflection, and splitting
Fig. 62. Compressed ultrafast spectral photography[179]. (a) Schematic of active CUSP system for 70 Tframe/s imaging; (b) CUSP imaging of ultrafast linear optical phenomena
Bowen Wang, Xu Zhang, Haitao Guan, Kunyao Liang, Sheng Li, Runnan Zhang, Min Zeng, Zihao Pei, Qian Chen, Chao Zuo. Computational Imaging Based on Focal Plane-Coded Modulation: A Review (Invited)[J]. Acta Optica Sinica (Online), 2024, 1(1): 0111001
Category: Research Articles
Received: Sep. 13, 2024
Accepted: Sep. 26, 2024
Published Online: Oct. 13, 2024
The Author Email: Chen Qian (chenqian@njust.edu.cn), Zuo Chao (zuochao@njust.edu.cn)
CSTR:32394.14.AOSOL240452