Photonics Research, Vol. 12, Issue 10, 2311 (2024)

Single-pixel super-resolution with a space–time modulated computational metasurface imager

Wenzhi Li1, Jiaran Qi1,*, and Andrea Alù2,3,4
Author Affiliations
  • 1Department of Microwave Engineering, School of Electronics and Information Engineering, Harbin Institute of Technology, Harbin 150001, China
  • 2Photonics Initiative, Advanced Science Research Center, City University of New York, New York, New York 10031, USA
  • 3Physics Program, Graduate Center of the City University of New York, New York, New York 10016, USA
  • 4e-mail: aalu@gc.cuny.edu

    Single-pixel imaging is a burgeoning computational imaging technique that utilizes a single detector devoid of spatial resolution to capture an image, offering great potential for creating cost-effective and simplified imaging systems. Nevertheless, achieving super-resolution with a single pixel remains a formidable challenge. Here, we introduce a single-pixel super-resolution imaging technique based on space–time modulation. The modulation parametrically mixes the incoming signals, enabling the space–time scattered signals of the object carrying finer details to be captured by the single-pixel imaging system. To validate our proposed technique, we designed and fabricated a computational metasurface imager that needs only a single transmitting port and a single receiving port. The achieved resolution surpasses the Abbe resolution limit. The principle of our proposed technique is well-suited for low-cost and compact imaging systems.

    1. INTRODUCTION

    Imaging is one of the most important functionalities in many technologies. The semiconductor revolution has enabled low-cost fabrication of high-density detectors within a given area for imaging at visible frequencies. However, at longer wavelengths, including microwave and terahertz, detector arrays are usually high-cost or unavailable due to the lack of suitable materials for the construction of high-density detectors and considerations of system size, cost, and deployment scenarios [1,2]. Imaging at these longer wavelengths is of great significance for practical applications [3,4], for example, identifying objects hidden in optically opaque materials, all-weather detection, material spectroscopy, and more. In the quest for low-cost and easy-to-deploy solutions at longer wavelengths, single-pixel imaging has emerged as a vital area of exploration [5–7]. This technique requires only a single-pixel detector rather than an array to capture spatial information. A major advantage is that it does not need pixelated detectors, or detector arrays, offering a path to low-cost imaging at longer wavelengths.

    Despite the rapid development of single-pixel imaging, exploiting super-resolution information with a single pixel is still an intractable problem. Super-resolution imaging is a sought-after goal for many applications, since it reveals finer details of objects beyond the diffraction limit [8,9]. Many super-resolution imaging methods have been proposed, which can be classified into two major categories: near-field and far-field schemes. In near-field schemes, evanescent waves that carry sub-diffraction-limited information are captured by a sensitive probe or amplified by a superlens placed within a subwavelength distance of the object surface [10,11]. In far-field schemes, since the evanescent waves are not directly available, researchers are motivated to search for alternative methods. According to angular spectrum theory, evanescent wavenumbers can be converted to propagating wavenumbers. Pioneering works based on far-field superlenses were proposed as promising solutions for this purpose [12–14]. Alternatively, structured illumination imaging is also based on this theory and uses a series of structured illumination patterns to resolve fine details [9]. Structured illumination patterns include sinusoidal patterns with shifting phases and orientations [15–17], illumination-angle-dependent patterns [18–20], cylindrical-vector-beam patterns [21], vortex-beam patterns with different orbital angular momenta [22,23], and other related techniques. However, transferring these successfully demonstrated super-resolution methods to single-pixel imaging is non-trivial. On one hand, single-pixel imaging usually requires projecting random illumination patterns or using a series of random amplitude/phase masks for the received signals. For instance, in ghost imaging, temporal or spatial radiation patterns are randomly generated to illuminate objects, and a statistical model is then used to reconstruct them [24].
On the other hand, in order to properly shift finer details to propagating modes, the aforementioned super-resolution methods require deterministic structured illuminations, which are difficult to generate at longer wavelengths due to the lack of a suitable projector. This mismatch between the illumination patterns or phase masks required by single-pixel imaging and those required by super-resolution imaging makes combining the two challenging. In addition, at longer wavelengths, pixelated arrays for generating illumination patterns or receiving signals can be relatively costly. Single-pixel super-resolution imaging therefore remains a formidable challenge.

    Metasurfaces have been paving a new route for imaging ranging from microwave to optics [25]. Owing to their remarkable properties, successful demonstrations of extreme wavefront control based on metasurfaces have been carried out, including meta-lenses [26], cloaks [27,28], and holography [29,30]. Reconfigurable metasurfaces, capable of manipulating the electromagnetic response of each unit cell in real time, have also seen growing interest [31–34]. Another significant area is space–time modulated metasurfaces, which introduce the time dimension as a degree of freedom [35–37]. A space–time modulated metasurface enables many interesting operations, such as multifrequency control [38], power combining [39], and non-reciprocity [40]. Combined with computational imaging methods, single-pixel imaging using reconfigurable metasurfaces has been envisioned as an important tool for longer wavelengths [41,42].

    Here we propose a super-resolution imaging technique that combines the advantages of space–time modulation, single-pixel imaging, and structured illumination. In this technique, by spatially and temporally modulating two metasurfaces, a set of space–time modulated waves scattered by one metasurface shifts the high-spatial-frequency information contained in the evanescent spectrum to low-spatial-frequency waves, which can be captured by the other metasurface with a single receiving antenna. Our experimental results demonstrate a resolution improvement of 42.3% compared to the Abbe limit. Moreover, our proposed technique employs single-pixel intensity measurements without requiring phase measurements, reducing system complexity and providing significant convenience for imaging at longer wavelengths.

    2. IMAGING SYSTEM

    The schematic setup of the proposed technique is shown in Fig. 1. This setup contains two antennas: one serves as the transmitter, while the other serves as the receiver. Two reconfigurable metasurfaces, MetaTX and MetaRX, are placed in front of the transmitter and receiver with fixed distances, respectively. MetaTX is spatially and temporally modulated. As the radiated waves from the transmitter pass through MetaTX, a series of space–time structured illuminations is generated and illuminates the object. MetaRX captures the space–time-modulated scattered fields of the object according to a matched-filter process, which is aimed at maximizing the signal-to-noise ratio of the received intensities by the receiver. Without the need for postprocessing, the measured intensities are a collection of harmonic images of the object. These harmonic images are synthesized by an iterative recovery procedure, which is based on Fourier ptychography [43,44]. As a result, a super-resolution image is reconstructed with finer details beyond the imaging system limit in the absence of space–time modulation. The working principle of the proposed technique is described as follows.

    Schematic of the proposed single-pixel super-resolution technique.

    Figure 1.Schematic of the proposed single-pixel super-resolution technique.

    First, MetaTX is spatially and temporally modulated to generate space–time modulated signals, which form a set of illumination patterns with different wavevectors. MetaTX is a reconfigurable metasurface consisting of many independently controlled active unit cells. Each unit cell can manipulate the amplitude and phase of the transmitted wave. For a given unit cell of MetaTX, the transmission coefficient $T(t)$ is a time-periodic function with period $T_0$. Within one period, the time-modulated sequence is $\{T_1, T_2, \ldots, T_N\}$, where each element has a duration $\tau = T_0/N$. Here, $N$ is a positive integer indicating the number of elements of the time-modulated sequence, and $T_n$ is the complex transmission coefficient of the unit cell in the $n$th time slot. Considering an incident wave $E_i(t) = E_0 e^{j\omega_c t}$, the transmitted wave of the unit cell is $E_o(t) = E_i(t)T(t)$. Applying a Fourier transform to the transmitted wave yields [35]

    $$E_o(\omega) = 2\pi E_0 \sum_{m} a_m \,\delta(\omega - m\omega_0 - \omega_c), \quad (1)$$

    with

    $$a_m = \sum_{n=1}^{N} \frac{T_n}{N} \frac{\sin(\pi m/N)}{\pi m/N} \exp\!\left[-\frac{j\pi m(2n-1)}{N}\right], \quad (2)$$

    where $a_m$ is the complex amplitude of the $m$th harmonic, $\omega_c$ is the circular frequency of the incident wave, and $\omega_0 = 2\pi/T_0$.

    Equation (1) implies that the transmitted wave can be written as a superposition of harmonic waves with frequencies $\omega_c + m\omega_0$ ($m = 0, \pm 1, \pm 2, \ldots$). The amplitude and phase of each harmonic depend on the time-modulated sequence, as shown in Eq. (2). Therefore, we can set the desired amplitudes and phases at the desired harmonics by properly choosing the elements of the time sequence. Note that the above time modulation refers to a single unit cell. If the same time sequence is applied to all units of a metasurface, we can only manage the frequency of the transmitted wave. To produce a set of plane-wave illumination patterns with different wavevectors, different time sequences are applied to different unit cells, introducing also space modulation. By combining time and space modulation (space–time modulation), we can hence synthesize the transmitted wave as

    $$E_t(x, y, t) = \sum_{m=1}^{M} A_m e^{j(k_{xm}x + k_{ym}y)} e^{j\omega_m t}, \quad (3)$$

    where $m$ denotes the $m$th harmonic with the corresponding frequency $\omega_m = \omega_c + m\omega_0$, and $(k_{xm}, k_{ym})$ is the wavevector of the $m$th harmonic. The illumination patterns are a linear combination of $M$ plane waves, each with its own frequency $\omega_m$ and wavevector $(k_{xm}, k_{ym})$.
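    The harmonic amplitudes of Eq. (2) are easy to check numerically. The sketch below (a minimal NumPy illustration, not code from the paper; the function name and the example sequence are our own) evaluates $a_m$ for a 2-bit phase-ramp sequence, which routes most of the power into the first harmonic.

```python
import numpy as np

def harmonic_amplitudes(T_seq, m):
    """Complex amplitude a_m of the m-th harmonic of a periodic
    time-modulated transmission sequence {T_1..T_N}, following Eq. (2)."""
    N = len(T_seq)
    n = np.arange(1, N + 1)
    sinc = np.sinc(m / N)  # np.sinc(x) = sin(pi*x)/(pi*x)
    return np.sum(T_seq / N * sinc * np.exp(-1j * np.pi * m * (2 * n - 1) / N))

# Example: a 2-bit staircase phase ramp cycling 0, 90, 180, 270 degrees.
T_seq = np.exp(1j * np.deg2rad([0, 90, 180, 270]))
a0 = harmonic_amplitudes(T_seq, 0)   # carrier is suppressed
a1 = harmonic_amplitudes(T_seq, 1)   # |a_1| = sinc(1/4) ~ 0.90
```

    For this ramp, the carrier (m = 0) vanishes and the first harmonic carries about 81% of the power, the familiar conversion efficiency of a 4-step serrodyne sequence.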

    Second, a single-pixel imaging scheme based on matched-filter theory is enabled by MetaRX and the receiver, aimed at collecting a series of space–time scattered signals to form a set of harmonic images (denoted as an image sequence). Here we assume that the object has a transmittance distribution $o(x, y)$. According to matched-filter theory, the phase pattern of MetaRX has the form

    $$M(x_{rx}, y_{rx}; X_o, Y_o) = \exp\!\left\{jk\left[\sqrt{(x_{rx} - X_o)^2 + (y_{rx} - Y_o)^2 + z_o^2} + \sqrt{x_{rx}^2 + y_{rx}^2 + z_r^2}\right]\right\}, \quad (4)$$

    where $(x_{rx}, y_{rx})$ is the coordinate on the MetaRX plane, $(X_o, Y_o)$ is the pixel coordinate of the image, $k$ is the wavenumber of the incident wave, $z_o$ is the distance between the object plane and the $X_{rx}Y_{rx}$ plane, and $z_r$ is the distance between the $X_{rx}Y_{rx}$ plane and the $X_rY_r$ plane, as shown in Fig. 1. As the phase pattern of Eq. (4) is applied to MetaRX, the received intensity at the receiver can be expressed as

    $$i(X_o, Y_o) = A\, o(X_o, Y_o), \quad (5)$$

    where $A$ is a constant. The derivation of Eqs. (4) and (5) is detailed in Appendix A.1. As can be seen from Eq. (5), if $(X_o, Y_o) = (x, y)$, the received intensity $i(x, y)$ is proportional to $o(x, y)$, that is, the object's transmittance at $(x, y)$. Therefore, by scanning $(X_o, Y_o)$ over the object plane, the single-pixel measured intensities map out the transmittance distribution of the object.

    In our case, we assume the time-modulation frequency $\omega_0$ and the bandwidth $M\omega_0$ of the space–time modulated waves to be much smaller than the frequency $\omega_c$ of the incident wave, i.e., $\omega_0 \ll \omega_c$ and $M\omega_0 \ll \omega_c$, when $M$ harmonics are considered. Under this assumption, the wavenumbers of the other harmonics are approximately equal to that of the zeroth harmonic, i.e., $k_m \approx k_c$, where $k_c$ is the wavenumber of the wave radiated by the transmitter. The wavenumber in Eq. (4) can therefore be replaced with $k_c$ for all harmonic waves. Consequently, we can apply the same phase patterns to receive all harmonics and thus capture all corresponding harmonic images. Since the harmonics are generated simultaneously by the space–time modulation and illuminate the object together, we can capture an image sequence, i.e., all harmonic images, and index its elements by the harmonic indices $\{m\}$.
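    As a concrete illustration of the matched-filter pattern of Eq. (4) and the 2-bit quantization imposed by the experimental metasurfaces, the following sketch (our own, with geometry values loosely following Section 3; variable names are hypothetical) computes the ideal phase for one image pixel and snaps it to the four available unit-cell states.

```python
import numpy as np

lam = 51.7                      # wavelength at 5.8 GHz, mm
k = 2 * np.pi / lam
z_o, z_r = 200.0, 200.0         # object->MetaRX and MetaRX->receiver distances, mm

# 13x13 unit cells with 23.5 mm pitch, centered on the optical axis
pitch = 23.5
c = (np.arange(13) - 6) * pitch
x_rx, y_rx = np.meshgrid(c, c)

def matched_filter_phase(X_o, Y_o):
    """Continuous matched-filter phase of Eq. (4) for image pixel (X_o, Y_o)."""
    r_obj = np.sqrt((x_rx - X_o) ** 2 + (y_rx - Y_o) ** 2 + z_o ** 2)
    r_rec = np.sqrt(x_rx ** 2 + y_rx ** 2 + z_r ** 2)
    return np.angle(np.exp(1j * k * (r_obj + r_rec)))

def quantize_2bit(phase):
    """Snap the ideal phase to the nearest of the four 2-bit states."""
    states = np.deg2rad([0, 90, 180, 270])
    d = np.angle(np.exp(1j * (phase[..., None] - states)))
    return states[np.abs(d).argmin(axis=-1)]

pattern = quantize_2bit(matched_filter_phase(0.0, 0.0))
```

    Scanning (X_o, Y_o) over all pixels and recording one intensity per pattern yields the raw single-pixel image.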

    We are now ready to perform a phase retrieval procedure on the image sequence to reconstruct a super-resolution image. Consider the illumination pattern of the $m$th harmonic, i.e., $A_m \exp[j(k_{xm}x + k_{ym}y)]$. The corresponding transmitted wave is $o_m(x, y) = o(x, y) A_m \exp[j(k_{xm}x + k_{ym}y)]$ [or $O_m(k_x, k_y) = A_m O(k_x - k_{xm}, k_y - k_{ym})$ in the Fourier domain]. When the transmitted wave is captured by the matched-filter procedure, the intensity image of the $m$th harmonic can be modeled as

    $$i_m(x, y) = \left|\mathcal{F}^{-1}[O_m(k_x, k_y) P(k_x, k_y)]\right| = \left|\mathcal{F}^{-1}[A_m O(k_x - k_{xm}, k_y - k_{ym}) P(k_x, k_y)]\right|, \quad (6)$$

    where $\mathcal{F}^{-1}$ denotes the inverse Fourier transform, and $P(k_x, k_y)$ is the pupil function of the imaging system. The support region of $P(k_x, k_y)$ is $|k| = (k_x^2 + k_y^2)^{1/2} \le 2\pi\mathrm{NA}/\lambda$, where NA is the numerical aperture of MetaRX. As can be seen from Eq. (6), limited by the support of $P(k_x, k_y)$, the single-pixel imaging system composed of MetaRX and the receiver captures merely a fraction of the angular spectrum of the object transmittance's Fourier transform rather than all angular spectrum components, resulting in a diffraction-limited image. In other words, each harmonic (corresponding to an individual image within the image sequence) retrieves a portion of the object transmittance in the Fourier domain, centered at $(k_{xm}, k_{ym})$ and truncated by the support of the pupil function. Although the images in the image sequence are inherently diffraction-limited, they contain different angular spectrum components of the object transmittance. For example, if $(k_{xm}, k_{ym}) = (0, 0)$, the maximum angular component is $k_{\max} = 2\pi\mathrm{NA}/\lambda$; if $(k_{xm}, k_{ym}) = (2\pi\mathrm{NA}/\lambda, 0)$, the maximum angular component is $k_{\max} = 4\pi\mathrm{NA}/\lambda$, twice that of the previous case. Therefore, once these angular spectrum components are fused, an extended angular spectrum is obtained, yielding a super-resolution image.
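    The forward model of Eq. (6) is straightforward to prototype: shift the object's spectrum by the illumination wavevector, truncate it with the pupil, and inverse-transform. The toy sketch below (our own illustration, with wavevectors in pixel units and a hypothetical 64×64 two-hole object) shows how two harmonics sample different spectral regions.

```python
import numpy as np

def harmonic_image(obj, pupil, shift):
    """Low-resolution intensity image of one harmonic, per Eq. (6):
    the object spectrum is shifted by the illumination wavevector
    (in pixel units) and truncated by the pupil before inverse FFT."""
    O = np.fft.fftshift(np.fft.fft2(obj))
    O_shifted = np.roll(O, shift, axis=(0, 1))   # O(kx - kxm, ky - kym)
    return np.abs(np.fft.ifft2(np.fft.ifftshift(O_shifted * pupil)))

# Toy scene: two point-like "holes" and a circular low-pass pupil
n = 64
ky, kx = np.meshgrid(np.arange(n) - n // 2, np.arange(n) - n // 2, indexing="ij")
pupil = (kx ** 2 + ky ** 2 <= 8 ** 2).astype(float)
obj = np.zeros((n, n)); obj[20, 20] = obj[20, 26] = 1.0

img0 = harmonic_image(obj, pupil, (0, 0))    # zeroth harmonic (on-axis)
img1 = harmonic_image(obj, pupil, (0, 8))    # shifted harmonic
```

    Each such image is diffraction-limited on its own; only their fusion recovers the extended spectrum.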

    To synthesize all these angular spectrum components, a postprocessing step, namely the phase retrieval procedure, is employed to acquire a broader angular spectrum. The phase retrieval procedure is adopted from Fourier ptychography and is carried out by alternating projections between the image sequence and the estimated image. Specifically, in our case, the image sequence serves as the image constraint for the estimated image, i.e., the estimated image's amplitude is replaced with the corresponding image in the image sequence. Consequently, the estimated image includes all the different angular components in the image sequence, that is, a broader angular spectrum coverage in the Fourier domain than any single image in the sequence, leading to super-resolution. Appendix A.2 gives a more detailed description of the phase retrieval procedure.
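    The alternating projections can be sketched along the lines of Fourier ptychography. The following is a simplified stand-in for the procedure of Appendix A.2 (our own minimal variant, not the authors' exact algorithm): each measured harmonic amplitude is enforced in the spatial domain, and the updated spectral patch is written back into a shared high-resolution spectrum.

```python
import numpy as np

def fp_recover(images, pupil, shifts, n_iter=20):
    """Minimal Fourier-ptychography-style recovery by alternating
    projections: amplitude constraint per harmonic image, additive
    spectral update inside the (shifted) pupil support."""
    spectrum = np.fft.fftshift(np.fft.fft2(images[0].astype(complex)))
    for _ in range(n_iter):
        for img, s in zip(images, shifts):
            patch = np.roll(spectrum, s, axis=(0, 1)) * pupil
            field = np.fft.ifft2(np.fft.ifftshift(patch))
            field = img * np.exp(1j * np.angle(field))  # replace amplitude
            new_patch = np.fft.fftshift(np.fft.fft2(field)) * pupil
            spectrum += np.roll(new_patch - patch, (-s[0], -s[1]), axis=(0, 1))
    return np.abs(np.fft.ifft2(np.fft.ifftshift(spectrum)))

# Toy demo: five low-pass images of a point pair with different shifts
n = 64
ky, kx = np.meshgrid(np.arange(n) - n // 2, np.arange(n) - n // 2, indexing="ij")
pupil = (kx ** 2 + ky ** 2 <= 6 ** 2).astype(float)
obj = np.zeros((n, n)); obj[30, 28] = obj[30, 36] = 1.0

def lowres(s):
    O = np.fft.fftshift(np.fft.fft2(obj))
    return np.abs(np.fft.ifft2(np.fft.ifftshift(np.roll(O, s, axis=(0, 1)) * pupil)))

shifts = [(0, 0), (0, 6), (0, -6), (6, 0), (-6, 0)]
images = [lowres(s) for s in shifts]
recon = fp_recover(images, pupil, shifts)
```

    The shifted measurements together cover a larger spectral disk than any single pupil, which is exactly the aperture-synthesis mechanism described above.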

    3. EXPERIMENTS

    A. Experimental Setup

    The experimental setup is shown in Figs. 2(a) and 2(b). We designed and fabricated two 2-bit reconfigurable transmissive metasurfaces, each consisting of 13×13 unit cells and operating at 5.8 GHz (λ = 51.7 mm), one used as MetaTX and the other as MetaRX, as shown in Fig. 2(d). Each unit cell has a period of 23.5 mm and is capable of switching among four states, i.e., changing the phase of the transmitted wave among 0°, 90°, 180°, and 270° (see Appendix A.3). MetaTX and MetaRX are controlled by two field-programmable gate arrays (FPGAs), as shown in Fig. 2(e). A probe antenna placed on the right-hand side serves as the receiver and is connected to a spectrum analyzer. Another probe antenna placed on the left-hand side serves as the transmitter and is connected to a signal generator. The two probe antennas are identical, as shown in Fig. 2(c). The target object is placed between MetaTX and MetaRX.

    Experimental setup. (a) The schematic of the experimental setup. (b) A zoom view of the experimental setup. (c) The probe antenna. (d) The 2-bit reconfigurable transmissive metasurface. (e) The FPGA controller.

    Figure 2.Experimental setup. (a) The schematic of the experimental setup. (b) A zoom view of the experimental setup. (c) The probe antenna. (d) The 2-bit reconfigurable transmissive metasurface. (e) The FPGA controller.

    B. Image Collection and Reconstruction

    The distance between the object and MetaRX is 200 mm. The object is located in a 200 mm × 200 mm region, which we divided into 41×41 pixels. The matched-filter processing was applied to this region, with the phase patterns calculated by Eq. (4). A spectrum analyzer was used to measure the received intensities with respect to the phase patterns, which form the raw images. The acquisition time was approximately 200 s. We used two FPGAs (Altera Cyclone IV) to provide the space–time modulation on the metasurfaces (MetaTX and MetaRX). A laptop was used to synthesize the raw images into the super-resolution image. The raw images (41×41 pixels) were first upsampled to 201×201-pixel images, which took about 0.03 s for all raw images. Further postprocessing was then performed to reconstruct the super-resolution image, which took about 5.2 s. The space–time modulation makes the transmitter and MetaTX equivalent to a virtual aperture with a numerical aperture of sin 45°. The numerical aperture of MetaRX is sin 37°.

    The experimental steps are described as follows. First, an image is acquired without space–time modulation. MetaTX converts the spherical wavefront of the transmitter to a plane wavefront to illuminate the object. MetaRX performs the single-pixel imaging based on the matched filter. The receiver samples the intensity values at the central frequency ωc. These measurement values, denoted as i(ωc), constitute the original-resolution image. The corresponding imaging resolution is determined by the numerical aperture of MetaRX, given by Δ0=0.61λ/NAr, where NAr is the numerical aperture with respect to MetaRX. Second, the space–time modulated phase patterns are applied to MetaTX, which generates a set of space–time modulated beams to illuminate the object. MetaRX performs the same matched-filter processing as the first step. Different from the first step, the receiver samples all intensity values at all designed harmonics {ωc+mω0} with m representing the mth harmonic. The resultant images are referred to as harmonic images {i(ωc+mω0)}. Third, a phase retrieval iteration is carried out by a laptop. A super-resolution image is obtained by synthesizing all images {i(ωc+mω0)}.

    C. Optimization of Space–Time Sequence

    The metasurfaces used in the experiments are 13×13 arrays, with each unit cell controlled independently. Since a unit cell provides only 2-bit phase manipulation, an optimization is necessary to implement the transmitted wave in the form of Eq. (3). Each unit cell has an optimized time sequence that provides the desired phases at the selected harmonics. In our experiments, the time sequence has a length of 21, that is, 21 intervals within one period. Each unit cell switches its phase periodically according to its time sequence. This sequence length was selected because it provides a relatively large degree of freedom for the optimization while avoiding memory overflow in the FPGA. To make each harmonic easily separable in the spectrum while keeping the $k_m \approx k_c$ approximation valid, the time-modulation frequency of the FPGA is 12.5 kHz. We used the particle swarm optimization (PSO) algorithm to optimize the time sequences. We optimized two time sequences, each of which generates nine harmonics with nine wavevectors. The total bandwidth of the nine harmonics is 100 kHz, which is much smaller than the 5.8 GHz carrier. By rotating the time sequences about the metasurface's center, another 18 wavevectors were obtained; this rotation avoids repetitive optimizations. A detailed description of the time-sequence optimization is given in Appendix A.4.
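    The paper optimizes the time sequences with PSO; to illustrate the objective alone, the sketch below uses plain random search over 2-bit sequences of length 21 to maximize the power delivered to a chosen harmonic. This is a simplified stand-in for the authors' optimizer, and the cost function is our own single-harmonic example (the real objective targets prescribed phases at several harmonics simultaneously).

```python
import numpy as np

rng = np.random.default_rng(0)
N = 21                                                # sequence length, as in the experiment
states = np.exp(1j * np.deg2rad([0, 90, 180, 270]))   # 2-bit phase states

def a_m(T_seq, m):
    """Harmonic amplitude of Eq. (2) for a length-N sequence."""
    n = np.arange(1, N + 1)
    return np.sum(T_seq / N * np.sinc(m / N)
                  * np.exp(-1j * np.pi * m * (2 * n - 1) / N))

def cost(T_seq, target_m=1):
    # negative power in the target harmonic (to be minimized)
    return -np.abs(a_m(T_seq, target_m)) ** 2

# Random search as a stand-in for PSO over the 4^21 discrete sequences
best_seq, best_cost = None, np.inf
for _ in range(2000):
    seq = states[rng.integers(0, 4, N)]
    c = cost(seq)
    if c < best_cost:
        best_seq, best_cost = seq, c
```

    Rotating an optimized sequence map about the array center, as described above, reuses one optimization result for several illumination directions.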

    D. Experimental Results

    We first experimentally estimated the resolution of the proposed technique. In this experiment, the test objects are a set of metal planes with two holes, as shown in Figs. 3(a)–3(d). The diameter of the holes is 25 mm, which is selected to simulate a point source. The minimum distance between the two holes ranges from 20 to 50 mm. The distance between MetaRX and the object (as well as between MetaTX and the object) is 200 mm. The corresponding Abbe-limit resolution is calculated as 52 mm.

    Experimental estimation of the achieved resolution. (a)–(d) The optical images of the two-hole objects with the minimum distances of (a) 50 mm, (b) 40 mm, (c) 30 mm, and (d) 20 mm. (e)–(h) The simulated results without the space–time modulation (OR) and with the space–time modulation (SR). (i)–(l) The corresponding measured results without/with the space–time modulation. The blue circles indicate the ground-truth positions of the holes. (m)–(p) The intensities along the dotted lines of the third row. GT, ground truth; OR, original resolution; SR, super-resolution.

    Figure 3.Experimental estimation of the achieved resolution. (a)–(d) The optical images of the two-hole objects with the minimum distances of (a) 50 mm, (b) 40 mm, (c) 30 mm, and (d) 20 mm. (e)–(h) The simulated results without the space–time modulation (OR) and with the space–time modulation (SR). (i)–(l) The corresponding measured results without/with the space–time modulation. The blue circles indicate the ground-truth positions of the holes. (m)–(p) The intensities along the dotted lines of the third row. GT, ground truth; OR, original resolution; SR, super-resolution.

    Figures 3(e)–3(h) show the simulated results without and with the space–time modulation. In the simulation, the configurations mirror those employed in the experimental setup. Both metasurfaces, MetaTX and MetaRX, comprise 13×13 unit cells, each of which has a size of 23.5 mm × 23.5 mm and is capable of switching among four states, as in the experiment. The distances among the probe antennas, MetaTX, MetaRX, and the object plane are set the same as in the experiment. In addition, the time sequences applied to the metasurfaces in the simulation are identical to those utilized in the experiment, ensuring a high degree of consistency between the simulated and experimental scenarios. As can be seen from the left panel of Fig. 3(e), the imaging system without space–time modulation can just resolve the two holes with a minimum distance of 50 mm, which is close to the Abbe theoretical limit. However, when the minimum distance between the two holes falls below 50 mm, the imaging system cannot resolve the holes; a single spot is observed, as shown in the left panels of Figs. 3(f)–3(h). In contrast, when space–time modulation is applied to MetaTX, the imaging system can resolve holes with minimum distances smaller than 50 mm. From the right panels of Figs. 3(f) and 3(g), we can observe that the two holes are clearly resolved: two spots with distinct separations appear at their expected positions. As the minimum distance decreases to 20 mm, the two spots tend to coalesce, as shown in Fig. 3(h). The corresponding measured results for the set of two-hole metal planes are shown in Figs. 3(i)–3(l). From Figs. 3(k) and 3(l), we can estimate that the experimentally achieved resolution of the proposed technique is at least 30 mm, a 42.3% improvement over the original resolution of 52 mm.

    To further demonstrate the proposed imaging technique, we performed experiments on another group of objects, shown in Figs. 4(a1)–4(a4). The minimum distance between two adjacent holes in Figs. 4(a1)–4(a4) ranges from around 30 to 40 mm, smaller than the original resolution of 52 mm. From Fig. 4(b1), we observe that the imaging system cannot distinguish the four holes without space–time modulation. In contrast, the four holes are clearly resolved in Fig. 4(c1). To demonstrate the achieved resolution in both the x and y directions, seven holes were arranged along the x and y directions with an adjacent distance of 35 mm, as shown in Fig. 4(a2): three holes along the x direction and four along the y direction. Without the space–time modulation, we only see a T-like shape, as shown in Fig. 4(b2). When MetaTX was spatiotemporally modulated, seven spots corresponding to the seven holes were observed, as shown in Fig. 4(c2). Similarly, an eight-hole circle was used to demonstrate the achieved resolution in other directions, as shown in Figs. 4(a3), 4(b3), and 4(c3); the corresponding measured results are shown in Figs. 4(d3), 4(e3), and 4(f3). Moreover, a three-strip object was also imaged, as shown in Fig. 4(a4). The minimum distance between two adjacent strips is 30 mm. As can be observed from Figs. 4(b4) and 4(d4), the three strips were squeezed together since the minimum distance is below the original resolution. With space–time modulation, the imaging system was able to resolve the three strips, as shown in Figs. 4(c4) and 4(e4).

    Experimental results of multiple objects. (a1)–(a4) Optical images of more complex objects with (a1) four holes, (a2) seven holes, (a3) eight holes, and (a4) three strips. (b1)–(b4) The simulated results without the space–time modulation. (c1)–(c4) The simulated results with the space–time modulation. (d1)–(d4) The measured results without the space–time modulation. (e1)–(e4) The measured results with the space–time modulation. The circles in (b1)–(e4) indicate the ground-truth object positions. (f1)–(f4) The intensities along the dotted lines in (e1)–(e4).

    Figure 4.Experimental results of multiple objects. (a1)–(a4) Optical images of more complex objects with (a1) four holes, (a2) seven holes, (a3) eight holes, and (a4) three strips. (b1)–(b4) The simulated results without the space–time modulation. (c1)–(c4) The simulated results with the space–time modulation. (d1)–(d4) The measured results without the space–time modulation. (e1)–(e4) The measured results with the space–time modulation. The circles in (b1)–(e4) indicate the ground-truth object positions. (f1)–(f4) The intensities along the dotted lines in (e1)–(e4).

    The experimental results validate the super-resolution capability of the proposed technique when the space–time modulation is employed. Through the space–time modulation, the transmitter and MetaTX function as a virtual aperture with a numerical aperture of sin 45°. The numerical aperture of MetaRX is sin 37°. The resultant synthesized numerical aperture is the sum of the numerical apertures of the virtual aperture and MetaRX. Consequently, the theoretical resolution limit of the proposed method is 0.61 × 51.7/(sin 45° + sin 37°) ≈ 24.1 mm. From Fig. 3(k), we can observe that the experimentally achieved resolution is close to this theoretical limit. The remarkable resolution improvement, quantified at 42.3%, underscores the potential of integrating space–time modulation into imaging systems. This enhancement is particularly evident in the capability to resolve fine details in objects with features smaller than the Abbe limit, as demonstrated through the super-resolution images in Figs. 3 and 4. A series of experiments conducted on a variety of object geometries, from holes to strips, demonstrates the robustness and broad applicability of our technique. From Fig. 4(e4), we can observe that the resultant image of the strips has non-uniform intensities. This is attributed to the measurement noise and the imperfect phase and amplitude distributions generated by the metasurfaces, as shown in Figs. 10 and 11 (Appendix A). The latter may be mitigated by using metasurfaces with a higher phase resolution to acquire more uniform phase and amplitude distributions. Future refinements aimed at improving the uniformity of intensity distributions and further optimizing the phase and amplitude control mechanisms can bolster the technique's effectiveness, paving the way for its widespread adoption in advanced imaging systems.
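    The quoted resolution figures can be reproduced directly from the stated numerical apertures (a quick arithmetic check, using the 30 mm measured resolution reported above):

```python
import numpy as np

lam = 51.7                        # wavelength at 5.8 GHz, mm
NA_tx = np.sin(np.deg2rad(45))    # virtual aperture from space-time modulation
NA_rx = np.sin(np.deg2rad(37))    # MetaRX numerical aperture

d_original = 0.61 * lam / NA_rx              # ~52 mm without modulation
d_limit = 0.61 * lam / (NA_tx + NA_rx)       # ~24.1 mm synthesized limit
improvement = (d_original - 30) / d_original  # vs. the measured 30 mm
```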

    4. CONCLUSION

    In this paper, we have introduced and experimentally demonstrated a single-pixel super-resolution imaging technique based on space–time modulated metasurfaces. This computation-enabled technique realizes super-resolution imaging without detecting evanescent waves in near-field regions. The space–time modulation converts the high-spatial-frequency components, which contain finer details, to the low-spatial-frequency components, which can be captured by the imaging system. The space–time modulated metasurfaces are capable of simultaneously generating multiple illuminations with different wavevectors and capturing all the corresponding harmonic images. This technique uses only a single transmitting antenna and a single receiving antenna to emit and detect signals, rather than transmitting or receiving antenna arrays. Different portions of the object information in the Fourier domain are captured by the two-antenna imaging system. A broader spectrum is in turn captured, enabling the super-resolution characteristic. Moreover, the intensity-only measurement can further reduce the imaging system’s complexity. Therefore, this proposed technique is cost-effective, compact, and provides large flexibility for imaging applications at longer wavelengths.

    Our work enables placing the imaging device in the far-field region, which provides greater flexibility for practical applications compared with evanescent-wave-based techniques that require imaging devices in the near-field region. Contrasted with conventional structured illumination techniques, our work requires not a detector array but a single detector, rendering it more cost-efficient. Despite the existence of various metasurface-based space–time modulation techniques [35–40], including frequency conversion, beam scanning, and non-reciprocity, their potential in the domain of super-resolution imaging has yet to be tapped. Our work pioneers the application of space–time modulation specifically tailored for super-resolution imaging, underscoring its potential to advance the field with its unique advantages.

    We believe that the prospects of our single-pixel super-resolution imaging technique, anchored in space–time modulated metasurfaces, are promising and multifaceted. This technique not only reduces hardware complexity and cost but also paves the way for a more agile and versatile imaging solution, particularly suitable for applications in longer-wavelength regimes where conventional approaches struggle. Future developments may include the integration of machine learning algorithms for real-time optimization of imaging parameters, the exploration of adaptive space–time modulation patterns for dynamic scene analysis, and the miniaturization of metasurface elements for integrated devices. In addition, integrating dynamic mechanisms of tunable optical properties on metasurfaces can be a promising avenue for extending the working wavelength to optics [45]. With ongoing advancements in metasurface engineering and computational imaging techniques, we envision a future where super-resolution imaging is not only ubiquitous but also tailored to meet the specific demands of various scientific, medical, and industrial applications.

    Acknowledgment

    J. Qi thanks the National Natural Science Foundation of China for supporting this work.

    APPENDIX A

    Single-Pixel Imaging Model

    The single-pixel imaging model is illustrated in Fig. 5. We first investigate the field distributions on three planes: the object plane ($X_oY_o$), the MetaRX plane ($X_{rx}Y_{rx}$), and the receiver plane ($X_rY_r$). The transmittance of the object can be modeled as $o(x, y)$. For normal incidence, the transmitted wave through the object is $E(x_o, y_o) = E\, o(x_o, y_o)$, where $E$ is a constant. The incident field on the $X_{rx}Y_{rx}$ plane can be written as

    $$E_i(x_{rx}, y_{rx}) = \frac{1}{j\lambda} \iint_{S_o} E(x_o, y_o)\, \frac{z_o\, e^{jk\sqrt{(x_{rx}-x_o)^2 + (y_{rx}-y_o)^2 + z_o^2}}}{(x_{rx}-x_o)^2 + (y_{rx}-y_o)^2 + z_o^2}\, dx_o\, dy_o, \quad (A1)$$

    where $\lambda$ is the incident wavelength, $k = 2\pi/\lambda$ is the incident wavenumber, $z_o$ is the distance between the $X_oY_o$ and $X_{rx}Y_{rx}$ planes, and $S_o$ is the object plane. The incident field is manipulated by the metasurface $S_{rx}$ with a transmission coefficient distribution $M(x_{rx}, y_{rx})$, which is a complex function. The outgoing field is $E_o(x_{rx}, y_{rx}) = E_i(x_{rx}, y_{rx}) M(x_{rx}, y_{rx})$. Similar to Eq. (A1), the field distribution on the $X_rY_r$ plane can be expressed as

    $$E_r(x_r, y_r) = \frac{1}{j\lambda} \iint_{S_{rx}} E_o(x_{rx}, y_{rx})\, \frac{z_r\, e^{jk\sqrt{(x_r-x_{rx})^2 + (y_r-y_{rx})^2 + z_r^2}}}{(x_r-x_{rx})^2 + (y_r-y_{rx})^2 + z_r^2}\, dx_{rx}\, dy_{rx}. \quad (A2)$$
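    The diffraction integral of Eq. (A1) can be evaluated by brute-force quadrature over the source grid. The sketch below is our own minimal illustration (hypothetical grid sizes and a point-like source), useful for sanity-checking the propagation model at these microwave scales.

```python
import numpy as np

def propagate(E_src, xs, ys, z, lam, xt, yt):
    """Direct quadrature of the diffraction integral of Eq. (A1):
    field at target points (xt, yt) a distance z from the source plane,
    sampled on a uniform square grid (xs, ys)."""
    k = 2 * np.pi / lam
    dA = (xs[1] - xs[0]) ** 2            # area element of the source grid
    Xs, Ys = np.meshgrid(xs, ys)
    E = np.zeros((len(yt), len(xt)), complex)
    for i, y in enumerate(yt):
        for j, x in enumerate(xt):
            R2 = (x - Xs) ** 2 + (y - Ys) ** 2 + z ** 2
            E[i, j] = np.sum(E_src * z * np.exp(1j * k * np.sqrt(R2)) / R2) \
                      * dA / (1j * lam)
    return E

# Point-like source at the center of a 21x21 grid (units: mm)
lam = 51.7
xs = ys = np.linspace(-50, 50, 21)
E_src = np.zeros((21, 21)); E_src[10, 10] = 1.0
xt = yt = np.linspace(-100, 100, 11)
E_field = propagate(E_src, xs, ys, 200.0, lam, xt, yt)
```

    For a centered point source, the field magnitude peaks on axis, as expected from the 1/R² amplitude decay in the integrand.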

    Schematic of the single-pixel imaging model.

    Figure 5.Schematic of the single-pixel imaging model.

    Flowchart of the phase retrieval procedure.

    Figure 6.Flowchart of the phase retrieval procedure.

    (a) A radiation-source array. Each source has a signal channel. (b) The space–time modulated metasurface radiation source.

    Figure 7.(a) A radiation-source array. Each source has a signal channel. (b) The space–time modulated metasurface radiation source.

    Phase distributions of the VRS illumination patterns. The amplitude distributions are assumed to be unitary.

    Figure 8.Phase distributions of the VRS illumination patterns. The amplitude distributions are assumed to be unitary.

    Simulated phase distributions of MetaTX. MetaTX consists of 13×13 unit cells. Ti denote the ith time sequence. The set {−4,−3,−2,−1,0,1,2,3,4} denotes the harmonic indices, which correspond to the radiation sources in Fig. 7(b) from left to right, respectively.

    Figure 9. Simulated phase distributions of MetaTX. MetaTX consists of 13×13 unit cells. Ti denotes the ith time sequence. The set {−4, −3, −2, −1, 0, 1, 2, 3, 4} denotes the harmonic indices, which correspond to the radiation sources in Fig. 7(b) from left to right, respectively.

    Simulated phase distributions of MetaTX after eliminating the spherical wave.

    Figure 10.Simulated phase distributions of MetaTX after eliminating the spherical wave.

    Simulated amplitude distributions of MetaTX.

    Figure 11.Simulated amplitude distributions of MetaTX.


    Paper Information

    Category: Surface Optics and Plasmonics

    Received: Jun. 7, 2024

    Accepted: Jul. 12, 2024

    Published Online: Oct. 8, 2024

    The Author Email: Jiaran Qi (qi.jiaran@hit.edu.cn)

    DOI:10.1364/PRJ.532222
