Chinese Optics Letters, Volume 22, Issue 8, 080501 (2024)

Autofocus by Lissajous scanning in time reversal optical scanning holography

Jie Liu1, Haiyan Ou1,2,*, Hua Wang1, Lin Peng3, and Wei Shao2
Author Affiliations
  • 1Shenzhen Institute for Advanced Study, University of Electronic Science and Technology of China, Shenzhen 518000, China
  • 2School of Physics, University of Electronic Science and Technology of China, Chengdu 611731, China
  • 3Key Laboratory of Cognitive Radio and Information Processing, Guilin University of Electronic Technology, Guilin 541004, China

    In this Letter, an autofocusing method for optical scanning holography (OSH) systems is proposed. By introducing Lissajous scanning into the multiple signal classification (MUSIC) method in time-reversal (TR) OSH, the axial locations of the targets can be retrieved with better resolution, and the peak prominence increases from 0.21 to 0.34. The feasibility of this method is confirmed by both simulation and experiment.


    1. Introduction

    Optical scanning holography (OSH) is an incoherent digital holography (DH) technology[1] that has found extensive application in domains including encryption[2,3], three-dimensional (3D) display[4], remote sensing[5], and microscopy[6]. In OSH, the amplitude and phase information of the 3D objects are recorded in two-dimensional (2D) holograms by raster scanning.

    Reconstruction is an important process in OSH; it retrieves distinct sections of the 3D object from the hologram. An autofocusing algorithm is generally necessary to extract the true axial position of the object from the hologram so that a focused, sharp image can be reconstructed[7].

    There are a number of pioneering autofocusing works in OSH[7-11]. Kim and Poon utilized the Wigner distribution to retrieve the depth parameter[12]. Ren et al. put forward an entropy minimization method to achieve autofocusing[13]. Meanwhile, edge sparsity has also demonstrated its capability to deal with autofocusing[14].

    With the ongoing development of deep learning, neural networks have also been applied to autofocusing in recent years[15-19]. Pitkäaho et al. employed the AlexNet architecture to estimate the focal position, which necessitates hologram preprocessing[20]. Autofocusing has also been treated as a classification problem and solved with deep learning[21]. Despite its high efficacy, this method demands a predefined set of discrete distances, making it inflexible. Madali and co-authors proposed two depth-information extraction methods based on the U-Net architecture[22].

    The spatiotemporal focusing characteristics of time reversal (TR) allow objects to be located accurately, making it useful for addressing autofocusing in OSH[23]. Previously, we presented an autofocusing method using multiple signal classification (MUSIC) based on the TR technique[24]. However, the MUSIC method typically requires point-by-point scanning to achieve high resolution, which is time-consuming. The diagonal scanning method proposed there improves calculation speed but sacrifices resolution. The balance between resolution and autofocusing speed therefore requires further careful consideration. In this paper, by incorporating Lissajous scanning into TR-MUSIC, higher resolution is achieved with a reasonable time investment.

    This article is organized as follows. Section 2 presents the OSH system principle, followed by an explanation of axial localization, i.e., autofocusing based on Lissajous scanning theory. In order to highlight the effectiveness of the proposed method, results from both simulation and experimentation are presented in Section 3 and Section 4, respectively. The concluding remarks are provided in Section 5.

    2. Principle

    2.1. OSH

    Figure 1 illustrates an OSH system[1]. The laser emits a light beam that splits into two at the beam splitter BS1. One beam passes via mirror M1 through the pupil p1(x,y) and the lens L1. The other beam first undergoes frequency modulation at an acousto-optic frequency shifter (AOFS) before passing via mirror M2 through the pupil p2(x,y) and the lens L2. The X-Y scanning mirror reflects the two beams, which converge at BS2, enabling them to scan the object point by point. After the light passes through lens L3, the optical signal is transformed into an electrical signal by a photodetector (PD). After demodulation, the electrical signal is stored as a hologram on the computer.


    Figure 1.OSH system. BS, beam splitter; AOFS, acousto-optic frequency shifter; M, mirror; p(x,y), pupil; L, lens; PD, photodetector; BPF, bandpass filter; LPF, low-pass filter.

    If we discretize the 3D object into N sections along the z axis, the complex amplitude of the object can be expressed as O(x,y;z_i), where x and y are spatial coordinates and z_i denotes the distance between the ith section and the scanning mirrors. In OSH, we choose the pupils specifically as p_1(x,y)=1 (omitting finite-size effects) and p_2(x,y)=\delta(x,y), i.e., a Dirac delta function. The spatial impulse response h(x,y;z) can be expressed as[8]
    $$h(x,y;z)=\frac{j}{\lambda z}\exp\left[-\left(\frac{\pi}{\mathrm{NA}^{2}z^{2}}+\frac{j\pi}{\lambda z}\right)\left(x^{2}+y^{2}\right)\right],\tag{1}$$
    where j is the imaginary unit, λ is the wavelength of the light, and NA stands for the numerical aperture of the Gaussian function. Therefore, the hologram can be expressed as
    $$g(x,y)=\int |O(x,y;z)|^{2}\otimes h(x,y;z)\,\mathrm{d}z\approx\sum_{i=1}^{N}|O(x,y;z_{i})|^{2}\otimes h(x,y;z_{i}),\tag{2}$$
    where ⊗ represents the convolution operation.

    The reconstructed image of the lth layer can be expressed as
    $$I_{\mathrm{out}}(x,y;z_{l})=\sum_{i=1}^{N}\left(|O(x,y;z_{i})|^{2}\otimes h(x,y;z_{i})\right)\otimes h^{*}(x,y;z_{l})=|O(x,y;z_{l})|^{2}\otimes\gamma+\sum_{i\neq l}^{N}|O(x,y;z_{i})|^{2}\otimes h(x,y;z_{i})\otimes h^{*}(x,y;z_{l}),\tag{3}$$
    where \(\gamma=h(x,y;z_{l})\otimes h^{*}(x,y;z_{l})\) and * represents the conjugation operation. The equation consists of two terms: the image information of the focused section and the defocus noise from the other sections.
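    To make the recording and reconstruction steps concrete, the sketch below simulates Eq. (2)-style hologram formation and single-layer reconstruction by FFT-based convolution. It is a minimal illustration under assumed parameters; the grid size, pixel pitch, and depth used here are placeholders, not the paper's settings.

    ```python
    import numpy as np

    def impulse_response(n, dx, z, wavelength=632.8e-9, na=0.025):
        """Spatial impulse response h(x,y;z) on an n x n grid of pitch dx."""
        x = (np.arange(n) - n // 2) * dx
        xx, yy = np.meshgrid(x, x)
        r2 = xx**2 + yy**2
        return (1j / (wavelength * z)) * np.exp(
            -(np.pi / (na**2 * z**2) + 1j * np.pi / (wavelength * z)) * r2)

    def conv2(a, b):
        """Circular 2-D convolution via FFT (b is assumed centered on the grid)."""
        return np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(np.fft.ifftshift(b)))

    def hologram(sections, depths, dx):
        """Eq. (2): sum of |O_i|^2 convolved with h(.;z_i) over all sections."""
        g = np.zeros(sections[0].shape, dtype=complex)
        for O, z in zip(sections, depths):
            g += conv2(np.abs(O)**2, impulse_response(O.shape[0], dx, z))
        return g

    def reconstruct(g, z_l, dx):
        """Eq. (3): convolve the hologram with h*(x,y;z_l) to focus layer l."""
        return conv2(g, np.conj(impulse_response(g.shape[0], dx, z_l)))
    ```

    For a single point object, reconstructing at the recording depth cancels the quadratic phase and collapses the recorded chirp back to a sharp peak at the point's location.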

    2.2. TR OSH

    TR is a technique that focuses wave energy on a selected point in space and time; its inherent synchronized temporal and spatial focusing allows targets to be located in turbid media[23,25]. The MUSIC algorithm, based on eigenvalue decomposition and subspace theory, provides high resolution and stability for detecting target directions[26,27]. TR-MUSIC is a combination algorithm that merges the adaptive focusing of the TR technique with the high resolution of the MUSIC algorithm. It has been widely used with electromagnetic waves, optical waves, and in other fields[28-32].

    In our previous work, it was shown that TR-MUSIC can be adapted to digital holography[24]. Owing to the reciprocity of light propagation, the TR matrices T_{DS-SD} and T_{SD-DS} in digital holography are formulated, where T_{DS-SD} represents light propagation from the detector to the source and back, while T_{SD-DS} refers to the reverse propagation, originating from the source. The matrix T_{DS-SD} can be expressed by means of singular value decomposition (SVD) as[33]
    $$T_{DS\text{-}SD}=\mathcal{F}^{-1}\{GG^{H}\}=\sum_{m=1}^{M}v_{x}(m)\,|O(x_{m},y_{m},z_{m})|^{2}\,\|v_{y}(m)\|^{2}\,v_{x}^{H}(m),\tag{4}$$
    where G is the Fourier transform of g(x,y), the superscript H indicates the conjugate transpose operation, and m = 1, 2, …, M. |·| and ‖·‖ represent the absolute value and the modulo (norm) operations, respectively. The column vectors in the x and y directions are denoted as v_x and v_y. T_{SD-DS} is constructed as \(T_{SD\text{-}DS}=\mathcal{F}^{-1}\{G^{H}G\}\).

    If the object has M target points, the matrix exhibits M positive eigenvalues, with the other eigenvalues near zero. The first M vectors of the TR matrix span the signal subspace, while the remaining vectors correspond to the noise subspace. The connection between them can be expressed as
    $$T_{DS\text{-}SD}\,v_{x}(m)=|O(x_{m},y_{m},z_{m})|^{2}\,\|v_{y}(m)\|^{2}\,\|v_{x}(m)\|^{2}\,v_{x}(m),\qquad T_{SD\text{-}DS}\,v_{y}^{*}(m)=|O(x_{m},y_{m},z_{m})|^{2}\,\|v_{y}(m)\|^{2}\,\|v_{x}(m)\|^{2}\,v_{y}^{*}(m),\tag{5}$$
    where v_x(m) and v_y(m) are the eigenvectors of the matrices T_{DS-SD} and T_{SD-DS}, respectively. As previously mentioned, the 3D object is discretized into N layers along the z axis, with each layer divided into U×U points. The locations of the targets can be determined through the computation
    $$K_{x}(X_{p},z_{i})=\sum_{m=M+1}^{U}\left|v_{x}(m)^{T}v_{1}^{*}(X_{p},z_{i})\right|^{2},\qquad K_{y}(X_{p},z_{i})=\sum_{m=M+1}^{U}\left|v_{y}^{*}(m)^{T}v_{2}(X_{p},z_{i})\right|^{2},\tag{6}$$
    where X_p represents the position of the test point in the xy plane, and v_1(X_p,z_i) and v_2(X_p,z_i) are eigenvectors of the matrices G(X_p)G(X_p)^H and G(X_p)^H G(X_p), respectively.

    When the test object is in the same position as the actual target, the minimum value of Eq. (6) is achieved due to the orthogonality between the signal and noise subspace.
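    The noise-subspace projection underlying Eqs. (6) and (7) can be sketched generically as follows. This is a minimal MUSIC illustration, not the paper's full x/y pipeline: the TR matrix T, the target count M, and the test (steering) vectors are all assumed to be supplied by the caller.

    ```python
    import numpy as np

    def music_pseudospectrum(T, M, steering):
        """Project test vectors onto the noise subspace of a TR matrix and
        invert the projection energy (generic MUSIC sketch).

        T        : (U, U) TR matrix
        M        : number of targets (size of the signal subspace)
        steering : (num_tests, U) array of test vectors v(X_p) (assumed given)
        """
        w, V = np.linalg.eig(T)
        order = np.argsort(-np.abs(w))      # sort eigenpairs by |eigenvalue|
        noise = V[:, order][:, M:]          # noise subspace: vectors m = M+1..U
        proj = np.abs(steering.conj() @ noise) ** 2
        K = proj.sum(axis=1) + 1e-15        # Eq. (6)-style projection energy
        return np.linalg.norm(steering, axis=1) ** 2 / K  # Eq. (7)-style spectrum
    ```

    A test vector lying in the signal subspace has (near-)zero projection onto the noise subspace, so the pseudo-spectrum peaks there; the small floor only guards against division by exactly zero.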

    The pseudo-spectra P_x(X_p,z_i) and P_y(X_p,z_i) in the x and y directions can be expressed as
    $$P_{x}(X_{p},z_{i})=\frac{\|v_{1}(X_{p},z_{i})\|^{2}}{K_{x}(X_{p},z_{i})},\qquad P_{y}(X_{p},z_{i})=\frac{\|v_{2}(X_{p},z_{i})\|^{2}}{K_{y}(X_{p},z_{i})}.\tag{7}$$

    Then the combined pseudo-spectrum becomes
    $$P(X_{p},z_{i})=P_{x}(X_{p},z_{i})\cdot P_{y}(X_{p},z_{i}).\tag{8}$$

    We then traverse the test positions in the section at z = z_i. The parameter of each layer, P(z_i), can be calculated, and its local maxima are taken as the axial locations,
    $$P(z_{i})=\sum_{p=1}^{U^{2}}P(X_{p},z_{i}).\tag{9}$$

    Typically, all U×U points of each layer are selected as test positions, which yields high resolution in the MUSIC algorithm. However, this approach can be very time-consuming. As indicated by Eq. (7), one can instead label the diagonal elements as test points[24]. The target point always responds at the diagonal position, regardless of its location. For instance, if the object is located at (x_1,y_1), it generates responses at x = x_1 and y = y_1, which intersect the diagonal and produce two response points at (x_1,x_1) and (y_1,y_1). Therefore, we need only calculate the pseudo-spectrum at the diagonal positions, reducing the computational effort. The diagonal scanning method can be expressed as
    $$P(z_{i})\big|_{\mathrm{diagonal}}=\sum_{d=1}^{U}P(x_{d},y_{d},z_{i}),\tag{10}$$
    where x_d = y_d, and (x_d,y_d) represents the diagonal position of the test point X_p in the xy plane. It can be deduced from Eq. (10) that this method significantly reduces computation time; however, it comes at the cost of resolution.

    2.3. Lissajous scanning

    Considering the limitations of both the full pseudo-spectrum and the diagonal pseudo-spectrum, we introduce here Lissajous scanning as a method for balancing resolution against autofocusing speed. The two axes in a Lissajous scan are driven at comparable single-tone frequencies. This technology allows the scanning system to operate at resonance, enabling fast, large-amplitude scanning with low power consumption when the quality factor is high[34]. The Lissajous trajectory technique is utilized in various fields, including optical coherence tomography (OCT)[35,36], frequency-modulated gyroscopes[37], and microscopy[38]. The trajectory can be formulated as
    $$\begin{cases}x=A_{x}\sin(pt),\\ y=A_{y}\sin(qt+\phi),\end{cases}\tag{11}$$
    where A represents the amplitude and p and q represent the frequencies; the subscripts x and y indicate the x and y directions. The term ϕ denotes the phase difference.

    As the conventional TR-MUSIC method fails to balance time against resolution, we introduce Lissajous scanning to improve on this. By selecting test points located on the Lissajous scan curve, better resolution can be achieved within a reasonable amount of time. The process is as follows: (1) adjust the amplitude ratio of the Lissajous curve to match the size of the section; (2) calculate the period T of the Lissajous curve from the values of p and q as T = 2π/gcd(p,q), where gcd(p,q) denotes the greatest common divisor of p and q; (3) select V evenly spaced points within the time interval from 0 to T to discretize the curve; (4) follow the nearest-neighbor rule to select V units as test positions from the U×U points of each layer; (5) use these selected positions as the test points X_p to estimate the depth locations of the targets.
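    The steps (1)-(5) above can be sketched as follows; integer frequencies p and q are assumed, and the amplitude is simply matched to the grid so the curve spans the section.

    ```python
    import numpy as np
    from math import gcd

    def lissajous_test_points(U, p, q, V, phi=0.0):
        """Sample V points on one period of a Lissajous curve and snap them
        to the U x U grid by nearest neighbor (sketch; integer p, q assumed)."""
        T = 2 * np.pi / gcd(p, q)                  # step (2): period of the curve
        t = np.linspace(0.0, T, V, endpoint=False) # step (3): V evenly spaced samples
        A = (U - 1) / 2.0                          # step (1): amplitude matched to section
        x = A * np.sin(p * t) + A
        y = A * np.sin(q * t + phi) + A
        # step (4): nearest grid unit; keep unique cells as test positions X_p
        pts = np.stack([np.rint(x), np.rint(y)], axis=1).astype(int)
        return np.unique(pts, axis=0)
    ```

    The returned cells are then used as the test points X_p in step (5); duplicates are dropped, so the number of distinct test positions is at most V.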

    The Lissajous scanning-based approach delivers greater precision than diagonal element testing and requires fewer computations than point-by-point testing. Figure 2 shows the Lissajous curve (red line) with an amplitude ratio of 10:10, a frequency ratio of 3:2, and a phase difference of 0. By following the aforementioned steps (1)–(5), the discrete points are generated with V=30 in step (3) as an example. The results are shown by the blue points in Fig. 2.


    Figure 2.The Lissajous curve and the selected test positions.

    3. Simulation

    In this section, we demonstrate the viability of the proposed method by applying it to single-point targets, multipoint targets, and complex objects. The computer configuration is an AMD Ryzen 5 5600 6-core processor @ 3.50 GHz with 32 GB RAM. Results and discussions can be found in the subsections below.

    3.1. Single-point target

    We first test the proposed method on a simple single-point object. The laser wavelength is set to 632.8 nm in the simulation. Each section is 1 mm × 1 mm in size and divided into 100 × 100 pixels. The parameters of the Lissajous trajectory are an amplitude ratio of 100:100, a frequency ratio of 13:21, and a phase difference of 0. Balancing calculation time against resolution, Lissajous scanning is used to choose 1000 test locations per layer, i.e., 10 times the dimension of the matrix.

    Because the Lissajous trajectory does not pass through every unit in the plane, there are two possible scenarios: the test points selected along the Lissajous curve either include the single-point target or they do not. In the first case, the target point is located at (10,6); in the second case, the target is at (30,10), with no intersection with the Lissajous trajectory.

    The generated hologram and the first 50 eigenvalues of the TR matrix are shown in Figs. 3(a) and 3(b), respectively. It is apparent that there is only one significant eigenvalue, denoting a single target in the signal subspace.


    Figure 3.(a) The generated hologram; (b) the first 50 eigenvalues.

    Figures 4(a) and 4(b) show the peak values, indicating that the axial position of the detected object is at 30 mm for both cases. The axial location of the target can be detected regardless of whether the chosen test points of the Lissajous trajectory curve pass through the target point.


    Figure 4.The sum of the pseudo-spectrum along the z axis, with target at (a) (10,6) and (b) (30,10).

    When analyzing the proposed method, it is important to consider not only the positions of the test points but also the amplitude ratio, frequency ratio, and phase difference of the Lissajous curve. The amplitude ratio is an adjustable factor that depends on the size of the section. According to Eq. (11), modifications to the frequency ratio and phase difference also change the trajectory. In the simulation, the target point is located at (20,5) in the xy plane and at 30 mm along the z axis. The sums of the pseudo-spectrum along the z axis with different frequency ratios and phase differences are shown in Figs. 5(a) and 5(b). One can observe that the calculated axial location is 30 mm in all scenarios, consistent with the simulation setting.


    Figure 5.The sum of the pseudo-spectrum along the z axis. (a) Frequency ratio p∶q = 13∶21 or p∶q = 23∶11; (b) phase difference ϕ = 0 or ϕ = π.

    Tables 1 and 2 present the calculation time and full width at half-maximum (FWHM) results with different frequency ratios and phase differences, respectively. FWHM is defined as the distance between two points on the sum of the pseudo-spectrum curve where the function reaches half of its maximum value. As the FWHM value decreases, the peak becomes sharper, and the local maximum that needs to be found becomes more prominent. It can be seen from the results that variations in frequency ratios or phase differences do not noticeably affect the outcome of axial localization, with only a marginal impact on calculation time.
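    The FWHM metric described above can be computed with a simple half-maximum threshold, as in the sketch below (no sub-sample interpolation; z is the axial sampling grid and is an assumed input, and a single-peak curve is assumed).

    ```python
    import numpy as np

    def fwhm(z, curve):
        """Distance between the outermost samples where the curve reaches
        half of its maximum (threshold sketch, single-peak curve assumed)."""
        curve = np.asarray(curve, dtype=float)
        above = np.where(curve >= 0.5 * curve.max())[0]
        return z[above[-1]] - z[above[0]]
    ```

    A sharper peak (smaller FWHM) makes the local maximum easier to pick out against the defocus background.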

    • Table 1. Calculation Time and FWHM with Different Frequency Ratios in Lissajous Curve

      p∶q       3∶2     4∶5     9∶8     13∶21   19∶23   23∶11
      Time (s)  33.69   35.24   35.87   31.09   34.91   31.13
      FWHM      0.01    0.01    0.01    0.01    0.01    0.01
    • Table 2. Calculation Time and FWHM with Different Phase Differences in Lissajous Curve

      ϕ         0       π/4     π/2     3π/4    π       3π/2
      Time (s)  31.25   41.59   41.69   42.23   31.22   41.24
      FWHM      0.01    0.01    0.01    0.01    0.01    0.01

    3.2. Multipoint target

    Multipoint targets include two situations: multiple points on the same plane and multiple points on different planes. This subsection provides simulations for both situations.

    First, we simulate the situation where multiple points are in different layers, with each layer containing only one point. In the simulation, one of the points is placed at (10,6) in the xy plane with z1=30mm in the z axis, while the other point is at (10,6) with z2=30.5mm.

    Figure 6(a) shows the distribution of the first 50 eigenvalues. It can be observed that there are two prominent eigenvalues, indicating that there are two targets. The pseudo-spectrum accumulated along the z axis is displayed in Fig. 6(b). The two local maxima represent the calculated positions of the two targets: z1=30mm and z2=30.5mm. These values are consistent with the actual positions.


    Figure 6.(a) The first 50 eigenvalues of TR matrix; (b) the sum of the pseudo-spectrum along the z axis.

    Then, we simulate the situation where multiple points are situated on the same layer. First, we analyze two points: one located at (10,6) and the other at (13,12). Both points are at the same axial position of 30 mm. The distribution of the first 50 eigenvalues is shown in Fig. 7(a), from which one can observe that there are two target points. A local maximum can be observed at z = 30 mm in Fig. 7(b), indicating the axial location of the targets.


    Figure 7.(a) The first 50 eigenvalues of TR matrix; (b) the sum of pseudo-spectrum of each layer along the z axis.

    We then consider a more complex situation involving multipoint targets: with one target at z1=29.5mm, two targets at z2=30mm, and three targets at z3=30.5mm. The result is shown in Fig. 8. It can be observed that there are three local maxima with axial locations at 29.5 mm, 30 mm, and 30.5 mm, which match perfectly with the actual case. This demonstrates the feasibility of the proposed approach.


    Figure 8.The sum of pseudo-spectrum of each layer along the z axis.

    3.3. Resolution analysis

    In this section, the resolution analysis of the proposed method is presented. As the TR matrix is built from the hologram, the axial resolution of the proposed method depends strongly on several factors that are crucial to holograms, including the distance of the target and the wavelength of the light.

    We first analyze the relationship between resolution and target distance. In the simulation, two point targets at (10,6) in the xy plane with different axial locations are considered; the two targets are kept 1.5 mm apart along the z axis. The results are shown in Figs. 9(a)-9(c), with z1 at 30 mm, 50 mm, and 70 mm, respectively. It can be deduced from Fig. 9 that the resolution degrades as the targets gradually move away from the scanning mirror. When the object is too far away, the method fails to distinguish the two points, as shown in Fig. 9(c).


    Figure 9.The sum of pseudo-spectrum along the z axis. (a) z1 = 30 mm, z2 = 31.5 mm, (b) z1 = 50 mm, z2 = 51.5 mm, (c) z1 = 70 mm, z2 = 71.5 mm.

    The impact of the laser wavelength on axial resolution is also considered. In the simulation, two point targets are used as in the former case, with z1 = 30 mm and z2 = 30.5 mm. The calculated sums of the pseudo-spectrum along the z axis with wavelengths of 405 nm, 780 nm, and 980 nm are shown in Figs. 10(a)-10(c), respectively. The two local maxima become less distinct as longer wavelengths are used, indicating that the axial resolution degrades as the wavelength increases.


    Figure 10.The sum of pseudo-spectrum along the z axis. (a) λ = 405 nm, (b) λ = 780 nm, (c) λ = 980 nm.

    3.4. Noise analysis

    In this subsection, the impact of noise on the proposed method is evaluated via simulation with different signal-to-noise ratios (SNRs). Two point targets are both located at (10,6) in the xy plane but at different axial positions; one of them is fixed at the axial position of 30 mm. Gaussian white noise is added to the generated hologram in the simulation. The SNR is calculated by
    $$\mathrm{SNR}=10\log_{10}\!\left[\frac{\left(\frac{1}{RC}\sum_{r=1}^{R}\sum_{c=1}^{C}F(r,c)\right)^{2}}{\frac{1}{RC}\sum_{r=1}^{R}\sum_{c=1}^{C}\left(F(r,c)-F_{n}(r,c)\right)^{2}}\right],\tag{12}$$
    where R and C stand for the numbers of rows and columns of the matrix, respectively. F represents the original image, while F_n represents the image with noise.
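    The SNR definition above translates directly into code: the squared mean of the original image over the mean squared error of the noisy one, in decibels.

    ```python
    import numpy as np

    def snr_db(F, Fn):
        """SNR of a noisy image per the definition above, in dB."""
        F, Fn = np.asarray(F, dtype=float), np.asarray(Fn, dtype=float)
        signal = np.mean(F) ** 2
        mse = np.mean((F - Fn) ** 2)
        return 10.0 * np.log10(signal / mse)
    ```

    For instance, a uniform unit image with a constant additive error of 0.1 gives a 20 dB SNR under this definition.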

    The relationship between SNR and resolution is shown in Fig. 11. As expected, the resolution improves with higher SNR: when the SNR increases from 10 dB to 100 dB, the resolution improves from 5.45 mm to 0.31 mm.


    Figure 11.The relationship between SNR and resolution.

    3.5. Complex objects

    In this section, the proposed method is verified through simulation using complex objects. The laser wavelength is set to 632.8 nm. Each section of the complex graphics has a size of 1 mm × 1 mm and is standardized to 128 × 128 pixels, as shown in Figs. 12(a) and 12(b). The axial positions of the two sections are z1 = 30 mm and z2 = 35 mm, respectively. The frequency ratio and phase difference of the Lissajous trajectory are 13:21 and 0, respectively. As before, the number of test positions per layer is set to 10 times the dimension of the matrix, i.e., 1280.


    Figure 12.Complex objects (a) at z1 = 30 mm and (b) at z2 = 35 mm.

    The hologram in Fig. 13(a) was generated based on Eq. (2). The distribution of eigenvalues is shown in Fig. 13(b). One can observe that complex objects produce many eigenvalues in the signal subspace. Here, we use the L-curve method to partition the signal subspace[39].


    Figure 13.(a) The generated hologram; (b) the eigenvalues of TR matrix.

    Figure 14(a) shows the comparison of the TR-MUSIC method using Lissajous scanning (blue line) and the entropy minimization method (red line)[13]. The two local maxima of the proposed method are much clearer compared to the entropy minimization method.


    Figure 14.The proposed method is compared with various positioning methods. (a) Entropy minimization; (b) TR-MUSIC based on diagonal elements.

    Figure 14(b) shows the axial positioning results of the Lissajous and diagonal scanning methods. Note that the minimum value of the normalized curve obtained with the diagonal scanning method is greater than 0.5, so the FWHM cannot be calculated. We therefore use the peak prominence (PP) to measure how prominent the local peaks are[40]. The PP is defined as
    $$\mathrm{PP}=\frac{\left[L_{\max}(S_{1})-L_{\min}(S_{1})\right]+\left[L_{\max}(S_{2})-L_{\min}(S_{2})\right]}{\mathrm{Max}-\mathrm{Min}},\tag{13}$$
    where L_max(S_1) and L_max(S_2) represent the local maximum values at the first section (S_1) and the second section (S_2) of the object, respectively; L_min(S_1) and L_min(S_2) represent the nearest local minimum values corresponding to L_max(S_1) and L_max(S_2); and Max and Min represent the maximum and minimum values of the curve.
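    The PP definition above can be sketched as follows; the indices of the two section peaks and their nearest local minima are assumed to be supplied rather than detected automatically.

    ```python
    import numpy as np

    def peak_prominence(curve, peak_idx, trough_idx):
        """PP per the definition above: summed heights of the section peaks
        over their nearest local minima, normalized by the curve's range."""
        curve = np.asarray(curve, dtype=float)
        rise = sum(curve[p] - curve[t] for p, t in zip(peak_idx, trough_idx))
        return rise / (curve.max() - curve.min())
    ```

    A higher PP means the two section peaks stand out more sharply from the surrounding curve, which is what Table 4 quantifies.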

    Figure 14 and Table 3 indicate that the proposed method outperforms the other two algorithms in dealing with complex objects.

    • Table 3. Performance Comparison in Simulation

      Method    Entropy   Diagonal scan   Lissajous scan
      Time (s)  0.3       3.8             35.5
      PP        Failed    1.47            1.96

    4. Experiments

    In this section, we verify the proposed axial localization method based on Lissajous scanning using a real hologram, retrieved experimentally by Kim et al.[8]. The laser wavelength used in the experiment is 632.8 nm. The diameter of the collimated beam is D = 25 mm, and the focal length of the lens is f = 500 mm; thus, NA ≈ D/(2f) = 0.025. The two sections of the object, sampled at 500 × 500 pixels, are situated at distances of z1 = 87 cm and z2 = 107 cm.

    The proposed method utilizes a Lissajous trajectory with a frequency ratio of 13:21 and a phase difference of 0. The number of test locations in each layer is set to 10 times the matrix dimension, which is 5000. Figures 15(a) and 15(b)[8] are the real part and imaginary part of the hologram, respectively.


    Figure 15.The recorded hologram. (a) Real part; (b) imaginary part.

    The eigenvalues of the TR matrix are shown in Fig. 16(a). The calculated results along the z axis are presented in Fig. 16(b), with the red curve representing the entropy minimization method, and the black and blue curves the diagonal and the proposed methods, respectively. In the experiment, the entropy minimization method took 1.1 s but failed to locate the sections. Table 4 presents the calculation time and PP for both the diagonal scanning method and the Lissajous scanning method. The Lissajous scanning method evaluates 10 times as many test positions per layer as the diagonal method, and the measured calculation times are consistent with this ratio.

    • Table 4. Performance Comparison in Experiment

      Method    Entropy   Diagonal scan   Lissajous scan
      Time (s)  1.1       218.52          281.7
      PP        Failed    0.21            0.34


    Figure 16.(a) Eigenvalues of TR matrix; (b) parameters calculated along the z axis obtained by different methods.

    One can observe that the entropy minimization method fails to retrieve the axial locations of the two sections, while the other two methods give consistent localization results; the peaks of the proposed method, at z1 = 86 cm and z2 = 109 cm, are more prominent.

    The reconstruction results based on the calculated axial locations of the two sections are shown in Figs. 17(a) and 17(b), obtained with the inverse imaging method[41]. The experimental results demonstrate that the axial localization method based on Lissajous scanning in TR OSH achieves accurate positioning of objects.


    Figure 17.Reconstructed image at (a) z1 and (b) z2.

    5. Conclusions

    Accurate axial localization is of utmost importance when reconstructing holograms in OSH. Based on the TR-MUSIC algorithm, the Lissajous trajectory is introduced to select effective test points, through which a better autofocusing resolution is achieved. The feasibility of this method was verified through simulations as well as experiments. Compared to both the entropy minimization algorithm and the diagonal algorithm, the proposed method demonstrates higher resolution and a better balance between resolution and autofocusing time.

    [1] T.-C. Poon. Optical Scanning Holography with MATLAB (2007).

    [13] Z. Ren, N. Chen, A. Chan et al. Autofocusing of optical scanning holography based on entropy minimization. Digital Holography and Three-Dimensional Imaging, DT4A-4 (2015).

    [16] T. Shimobaba, T. Kakue, T. Ito. Convolutional neural network-based regression for depth prediction in digital holography. IEEE 27th International Symposium on Industrial Electronics (ISIE), 1323 (2018).

    [20] T. Pitkäaho, A. Manninen, T. J. Naughton. Focus classification in digital holographic microscopy using deep convolutional neural networks. Advances in Microscopic Imaging, 104140K (2017).

    [23] M. Fink. Time reversal acoustics. Rep. Prog. Phys., 50, 34 (1997).

    [39] B. Wu, W. Cai, M. Alrubaiee et al. Three dimensional time reversal optical tomography. Phys. Rev. Lett., 92, 033902 (2011).


    Jie Liu, Haiyan Ou, Hua Wang, Lin Peng, Wei Shao, "Autofocus by Lissajous scanning in time reversal optical scanning holography," Chin. Opt. Lett. 22, 080501 (2024)

    Paper Information

    Category: Diffraction, Gratings, and Holography

    Received: Jan. 20, 2024

    Accepted: Apr. 17, 2024

    Published Online: Aug. 21, 2024

    The Author Email: Haiyan Ou (ouhaiyan@uestc.edu.cn)

    DOI:10.3788/COL202422.080501

    CSTR:32184.14.COL202422.080501
