Infrared and Laser Engineering, Volume 53, Issue 9, 20240347 (2024)
Light field representation and its resolution improvement techniques: an overview (invited)
Fig. 1. Schematic diagram of the plenoptic function and its simplified light field definition. (a) The 7D plenoptic function; (b) Radiance
Fig. 2. Light field, WDF and ALF. (a) The phase of a wavefront is related to the angle of corresponding rays. The WDF of the spherical wave at a given
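The Wigner distribution function (WDF) referenced in Fig. 2 links a complex field to a joint space-angle representation, which is how the phase of a wavefront maps to local ray angles. For reference, its standard one-dimensional definition is given below (conventional notation, not necessarily the paper's own):

```latex
% Wigner distribution function of a complex field E(x):
% a joint space-frequency density whose frequency coordinate u
% corresponds to the local ray angle of the wavefront.
W(x, u) = \int E\!\left(x + \frac{x'}{2}\right)
          E^{*}\!\left(x - \frac{x'}{2}\right) e^{-i 2\pi u x'}\, dx'
```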
Fig. 3. Alternative parameterizations of the 4D light field, which represents the flow of light through an empty region of 3D space. (a) Points on a plane or curved surface and directions leaving each point; (b) Pairs of points on the surface of a sphere; (c) Pairs of points on two planes in general (meaning any) position
Fig. 4. Sampling models of a traditional camera and a light field camera. (a) The cone of rays summed to produce one pixel in a photograph; (b) Sampling of a photograph’s light field provided by a plenoptic camera
Fig. 5. Three light field visualization methods. (a) The raw image read by the sensor behind the microlens array; (b) The sub-aperture image; (c) The EPI image
Fig. 6. Light field rendering[1]. (a) The object-movie function of QuickTime VR enables users to virtually navigate around an object (represented by a blue shape) by swiftly flipping through closely spaced photographs of it (indicated by red dots); (b) If the photographs are taken at intervals close enough, users can reorder the pixels to generate novel perspective views without the need to physically occupy those positions (denoted by a yellow dot); this process is known as light field rendering; (c) A light field can be conceptualized as a two-dimensional assembly of two-dimensional images, each captured from a distinct vantage point
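The pixel-reordering idea behind light field rendering (Fig. 6) can be sketched with the two-plane parameterization: a 4D array L[u, v, s, t] where (u, v) indexes the viewpoint plane and (s, t) the image plane. The array shapes and function names below are illustrative assumptions, not the paper's notation; novel views between recorded viewpoints are approximated here by simple bilinear interpolation:

```python
import numpy as np

# Hypothetical 4D light field L[u, v, s, t]: 8x8 viewpoints,
# each a 64x64 grayscale image (random data as a stand-in).
rng = np.random.default_rng(0)
L = rng.random((8, 8, 64, 64))

def sub_aperture_view(L, u, v):
    """A sub-aperture image is simply a 2D slice at a fixed viewpoint (u, v)."""
    return L[u, v]

def interpolated_view(L, uf, vf):
    """Synthesize a novel viewpoint at fractional (uf, vf) by bilinear
    interpolation between the four nearest recorded views."""
    u0, v0 = int(np.floor(uf)), int(np.floor(vf))
    u1, v1 = min(u0 + 1, L.shape[0] - 1), min(v0 + 1, L.shape[1] - 1)
    a, b = uf - u0, vf - v0
    return ((1 - a) * (1 - b) * L[u0, v0] + a * (1 - b) * L[u1, v0]
            + (1 - a) * b * L[u0, v1] + a * b * L[u1, v1])

view = interpolated_view(L, 3.5, 2.25)
print(view.shape)  # (64, 64)
```

Real renderers interpolate in all four dimensions and account for the depth of the scene; this slice-and-blend version only illustrates why closely spaced input views make view synthesis a pure resampling problem.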
Fig. 7. Light field acquisition and rendering systems. (a) The light field gantry built by Stanford University in 1996 enables acquisition of static light fields; high-quality static light field data can be collected through full-freedom control of objects, cameras, and lighting; (b) The multi-camera and multi-lighting dome from Tsinghua University[59]; (c) 360° light field rendering device from the University of Southern California[60]; (d) Google records immersive light field video using 46 action sports cameras mounted on an acrylic dome[61]
Fig. 8. Fourier slice theorem[67]. (a) Classical Fourier slice theorem; (b) Generalized Fourier slice theorem
Fig. 9. Fourier slice photograph theorem[67]. Transform relationships between the 4D light field
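The content of Figs. 8 and 9 can be summarized in one pair of formulas. Shown below is the two-dimensional reduction (one spatial coordinate x, one angular coordinate u) of the refocusing integral and its Fourier-domain counterpart; the notation is a common convention and may differ from the paper's, and constant prefactors are stated only up to the scaling that falls out of the change of variables:

```latex
% Refocusing as shear-and-integrate: a photograph focused at a
% relative depth \alpha sums the light field along sheared lines.
E_\alpha(x) = \int L\!\left(u,\; u + \frac{x - u}{\alpha}\right) du

% Fourier slice photograph theorem: the 1D Fourier transform of the
% refocused image is a slice of the 2D light-field spectrum along a
% line through the origin whose orientation depends on \alpha.
\hat{E}_\alpha(f_x) = \alpha\, \hat{L}\big((1-\alpha) f_x,\; \alpha f_x\big)
```

The full 4D case pairs (u, v) with (s, t) in the same way, so a photograph at any focal depth is a 2D planar slice of the 4D light-field spectrum.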
Fig. 10. Filtered light field photography theorem. Transform relationships between a 4D light field
Fig. 12. Video-rate light-field microscopy imaging technology based on a microlens array. (a) Conventional microscopy and light field microscopy structure[72]; (b) Functional imaging of neuronal activity in the entire Caenorhabditis elegans and zebrafish larval brain[74]; (c) Functional imaging of neural activity during visually evoked and predatory behaviors in larval zebrafish[76]; (d) Video-rate volumetric Ca2+ imaging to 380-μm depth in mouse cortex[77]; (e) Characterization of light-field flow cytometry using fluorescent microspheres[78]
Fig. 13. Light field imaging with a phase diffuser. (a) DiffuserCam: Pipeline for recording and reconstructing light fields with phase plates (a diffuser). The object light passes through an imaging lens and the phase plate, then propagates to the sensor, where caustics encode spatial and angular information. A linear inverse problem is solved to reconstruct the light field, which contains 3D information, enabling digital refocus, among other benefits[80]; (b) Fourier DiffuserScope: A diffuser or microlens array is placed in the Fourier plane of the objective (relayed by a 4
Fig. 14. The modulation and demodulation process of the light field in the Fourier domain for a heterodyne light field camera[84]
Fig. 15. Different configurations of camera arrays. (a) Light field video camera[87]; (b) Camera array structure designed by YANG[88]; (c) The large-scale camera array designed by WILBURN and JOSHI[90]; (d) Rendering high-quality dynamic scenes with eight cameras[58]; (e) Lytro’s latest VR light-field camera Immerge 2.0
Fig. 16. High-speed video sequence capture using camera arrays. (a) Slicing the spatiotemporal volume to correct rolling shutter distortion and alignment of rolling shutter images in the spatiotemporal volume; (b) Overlapped exposures with temporal superresolution; (c)
Fig. 18. Light field superresolution. (a) Schematic of a 2D section of a light field camera; (b) Top row: One view from our LF image, detail of corresponding LF image and detail of central view (one pixel per microlens, as in a traditional rendering). Bottom row: Estimated depth map (scale in m), above LF image rearranged as views, superresolved central view
Fig. 19. Light field deconvolution imaging based on wave optics theory. (a) Light field deconvolution based on the wave optics model and experimental results of volumetric imaging of pollen grains[73]; (b) In vivo volumetric calcium imaging of a larval zebrafish with the addition of a cubic phase mask[97]; (c) Dynamic volumetric imaging results of COS-7 cells by altering the optical path structure between the microlens array and the sensor[98]; (d) Optical path structure of Fourier light field microscopy and the imaging results[99]; (e) Principle of phase-space deconvolution and three-dimensional volumetric imaging results of Caenorhabditis elegans[100]
Fig. 20. Light field imaging based on prior knowledge constraint. (a) The structure of a compressive light field camera and the use of light field atoms as the fundamental building blocks of natural light fields to sparsely reconstruct a 4D light field from optimized 2D projection[108]; (b) The principle of compressive light-field microscopy and extracting light-field signatures and 3D positions of individual neural structures[109]; (c) The principle of sparse decomposition light field microscopy and whole-brain imaging of larval zebrafish[110]
Fig. 21. Programmable aperture light field imaging. (a) Programmable aperture photography[113]; (b) Programmable aperture microscopy and multi-modal imaging[115]; (c) Multiplexed phase-space imaging for 3D fluorescence microscopy[116]; (d) 3D OTFs under different programmable apertures[117]; (e) Programmable aperture light field microscopy[118]
Fig. 22. Scanning light field superresolution imaging. (a) Principle of DAOSLIMIT and the migration process observed during neutrophil migration in the liver of mice[128]; (b) Principle of the integrated meta-imaging sensor and multisite DAO against dynamic turbulence for ground-based telescopes[52]
Fig. 24. Confocal light field microscopy[140-141]. (a) Design and characterization of confocal LFM; (b) Tracking and imaging whole-brain neural activity during larval zebrafish’s prey capture behavior; (c) Diagram of csLFM system; (d) The raw measurements on a thick brain slice after pixel realignment for the comparison among sLFM, cLFM and csLFM
Fig. 25. Principle of light field reconstruction with back projection. (a) Relationship between the captured image and the light ray field, and light ray field reconstruction from captured images using back projection[142]; (b) An example of the reconstructed EPIs of a real 3D scene[143]; (c) Iterative light field reconstruction based on the SART method[144]
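The back-projection idea in Fig. 25 is, at its simplest, shift-and-sum: each sub-aperture view is shifted in proportion to its angular offset and the shifted views are averaged, so that one depth plane adds coherently while others blur out. The sketch below is an illustrative minimal version (array layout, the `slope` parameter, and integer-pixel shifting are all assumptions for brevity):

```python
import numpy as np

def refocus(views, slope):
    """views: array of shape (U, V, H, W) holding sub-aperture images;
    slope: pixel shift per unit angular offset, which selects the
    depth plane brought into focus."""
    U, V, H, W = views.shape
    uc, vc = (U - 1) / 2, (V - 1) / 2
    acc = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            dy = int(round(slope * (u - uc)))
            dx = int(round(slope * (v - vc)))
            # Integer cyclic shift via np.roll; practical systems use
            # sub-pixel (e.g. bilinear) interpolation instead.
            acc += np.roll(np.roll(views[u, v], dy, axis=0), dx, axis=1)
    return acc / (U * V)

views = np.zeros((5, 5, 32, 32))
img = refocus(views, slope=1.5)
print(img.shape)  # (32, 32)
```

Iterative schemes such as SART refine this single back-projection step by repeatedly comparing re-projected estimates against the captured images and correcting the volume, rather than averaging once.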
Fig. 26. Light field representation of a slowly varying object under spatially stationary illumination[148]
Fig. 27. Angular superresolution diagram. (a) Disparity refinement[153]. After angular superresolution, one can observe the high quality and accurate occlusion boundaries of the resulting view interpolation; (b) Iteratively performing disparity estimation and view synthesis in the phase domain, we reconstructed a densely sampled four-dimensional light field from a micro-baseline stereo pair[154]
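The disparity-based view synthesis underlying Fig. 27 amounts to warping: each pixel of a reference view is displaced by its disparity scaled by the baseline to the target viewpoint. The sketch below uses forward warping with nearest-pixel rounding purely for illustration (function name and conventions are assumptions); the cited methods add careful interpolation, occlusion reasoning, and iterative disparity refinement:

```python
import numpy as np

def warp_view(ref, disparity, du):
    """ref: (H, W) reference view; disparity: (H, W) per-pixel disparity
    in pixels per unit baseline; du: angular offset of the target view.
    Returns a forward-warped prediction of the target view."""
    H, W = ref.shape
    out = np.zeros_like(ref)
    xs = np.arange(W)
    for y in range(H):
        # Each source pixel lands at x + d(x) * du in the target view.
        xt = np.clip(np.round(xs + disparity[y] * du).astype(int), 0, W - 1)
        out[y, xt] = ref[y]  # later writes overwrite earlier ones (no z-buffer)
    return out
```

With zero disparity the warp is the identity, and errors at occlusion boundaries are exactly where the disparity-refinement stage of such methods concentrates its effort.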
Fig. 28. Sparsity in the discrete vs. continuous Fourier domain, and our reconstruction results[156]. (a) The discrete Fourier transform (top) of a particular 2D angular slice of the crystal ball’s light field, and its reconstructed continuous version (bottom); (b) A grid showing the original images from the Stanford light field archive. The used images are highlighted; (c) and (d) two examples of reconstructed viewpoints showing successful reconstruction of this highly non-Lambertian scene. The
Fig. 29. Epipolar-plane image formation and its frequency domain properties[157]. (a) Frequency domain structure of an EPI being insufficiently sampled over
Fig. 30. Compact light-field imaging devices. (a) Head-mounted miniature light field microscope (MiniLFM). Exploded view (left) and cross-section (right) diagrams of the MiniLFM; some parts are rendered transparently for visual clarity; (b) Photo of an adult mouse with a head-mounted MiniLFM[167]; (c) The CM2 combines MLA optics and light-emitting diode (LED) array excitation in a compact and lightweight platform[168]
Fig. 31. Metalens array for light field imaging. (a) Schematic diagram of light-field imaging with metalens array and rendered images[171]; (b) Schematic of integral imaging based on achromatic metalenses[172]; (c) Schematic of the transversely dispersive metalens. Image of a letter “4” by the metalens with a white light illumination with a transmission window of 450–650 nm[173]
Fig. 32. Light field imaging in computational photography. (a1)-(a2) Light field refocusing and (a3) extended depth-of-field technique[39]; (b) High dynamic range panoramic videography: (b1) all cameras are set to the same exposure level, which allows for the observation of saturated areas in sunlight and dark regions in the shade, (b2) individual exposure settings for each camera produce a high dynamic range image[90]; (c) Synthetic aperture imaging: (c1) a sample image from a single camera, (c2) synthetic aperture focusing on the plane where the people are located, computed by aligning and averaging images from all cameras as described in the text, (c3) suppressing contributions from static pixels in each camera yields a clearer view of the scene behind the occluder[90]
Runnan ZHANG, Ning ZHOU, Zihao ZHOU, Heheng DU, Qian CHEN, Chao ZUO. Light field representation and its resolution improvement techniques: an overview (invited)[J]. Infrared and Laser Engineering, 2024, 53(9): 20240347
Category: Special issue—Computational optical imaging and application Ⅱ
Received: Jun. 4, 2024
Accepted: --
Published Online: Oct. 22, 2024