With the development of high harmonic generation (HHG), lensless extreme-ultraviolet (XUV) imaging at nanoscale resolution has become possible with table-top systems. Specifically, ptychographic phase retrieval using monochromatic XUV illumination exhibits extraordinary robustness and accuracy in computationally reconstructing the object and the illumination beam profile. In ptychography, using structured illumination has been shown to improve reconstruction robustness and image resolution by enhancing high spatial-frequency diffraction. However, broadband imaging has remained challenging, as the required multiwavelength algorithms become increasingly demanding. One major aspect is the ability to separate the available information into different physically meaningful states, such as different spectral components. Here, we show that introducing spatial diversity between spectral components of an HHG beam can significantly improve the reconstruction quality in multiwavelength XUV ptychography. We quantify the diversity in the polychromatic illumination by analyzing the diffraction patterns using established geometry- and information-theory-based dissimilarity metrics. We experimentally verify the major influence of diversity by comparing ptychography measurements using HHG beams with Gaussian and binary structured profiles as well as with beams carrying wavelength-dependent orbital angular momentum. Our results demonstrate how structured illumination acts twofold: it separates the spectral information within a single diffraction pattern while providing maximized added information with every new scan position. We anticipate our work to be a starting point for high-fidelity polychromatic imaging of next-generation nanostructured devices at XUV and soft-X-ray wavelengths.
1. INTRODUCTION
Advances in diffraction-based imaging technology [1,2] have pushed the achievable resolution well beyond the capabilities of conventional microscopes. In particular, lensless coherent diffractive imaging (CDI) in the extreme ultraviolet (XUV) regime has become an active research area. The short wavelengths in the XUV region give better diffraction-limited resolution [3] compared with visible or infrared light, while it is still possible to generate coherent light at these wavelengths with tabletop sources via high harmonic generation (HHG) from a near-infrared (NIR) driving laser [4–9]. One specific technique for CDI is ptychography, in which the object is translated laterally with respect to the illumination beam, and a series of correlated diffraction patterns is captured [10,11]. The reconstruction algorithm can then computationally retrieve the missing phase of the measured diffraction patterns and reconstruct complex-valued expressions for the object and the illumination source. Ptychography has been investigated extensively in the XUV [12–22] and X-ray ranges [23–26] and proven to be a robust method to image the object and the illumination, called “probe” hereafter, in principle without the necessity for support constraints or additional prior knowledge.
In HHG, a number of high harmonics of the driving laser field are generated. This large bandwidth in principle allows for broadband imaging, which can reveal element-specific information of a sample due to the material-specific transmission windows in the XUV spectral range [27]. However, the polychromatic beam lacks the necessary longitudinal coherence for diffraction-based methods such as ptychography. In many recent works, coherence is achieved by spectrally filtering the HHG beam and selecting a single harmonic [15–17,19], which is effective for single-wavelength object reconstruction but removes the ability for spectroscopic imaging. Retrieving full spectral information from broadband diffraction can be achieved through two-pulse Fourier-transform methods [28,29], but the need for two coherent sources and the required temporal scanning make this concept challenging to combine with ptychography. A more efficient and flexible approach is multiwavelength ptychography [13,14,20,26,30–32]. In multiwavelength ptychography, the probe and object are typically modeled as a set of incoherent modes, similar to the mixed-states approach for partially coherent beams [33], with each mode corresponding to a different wavelength. However, due to the presence and necessity to reconstruct complex-valued expressions for all probe and object modes, the demands on the reconstruction algorithm become increasingly challenging. Experience from earlier works has shown that structured illumination improves the reconstruction quality and algorithm convergence [15,34–40]. A structured beam provides a higher illumination NA and reduces the dynamic range of the diffraction pattern, which leads to more efficient use of the full chip of the camera. For ptychography with HHG beams, a structure can be accomplished either by directly structuring the XUV beam with the use of a mask [15,24,37] or a phase-shifting diffuser in the beamline [41] or indirectly by structuring the driving laser beam, which transfers amplitude and phase properties to the high harmonics [42,43]. A specific example of such phase transfer is the upconversion of beams carrying orbital angular momentum (OAM) [44–53], for which it was shown that the $q$th harmonic of a driving beam with OAM $\ell$ will have an OAM of $q\ell$, with the exact OAM depending on the fundamental beam properties [51,52]. Wang et al. [54] showed that ptychography on periodic structures can be improved significantly by using an XUV beam with nonzero OAM, as its large intrinsic divergence leads to overlapping diffraction orders in the far field.
In addition to the amount of structure in individual diffraction patterns, a key aspect of ptychography is the diversity between scan positions. Similarly, multimode ptychography can be expected to benefit in situations where the diffraction resulting from different modes is clearly distinct. Notably, when such modes are well-defined physical states, such as different wavelengths, it should be possible to engineer the illumination such that the resulting diffraction data can be more accurately processed by multimode ptychography algorithms.
In this paper, we investigate and experimentally demonstrate the improvement in ptychographic multiwavelength reconstructions with directly and indirectly structured HHG probe beams compared with smooth Gaussian beams. To systematically explore the suitability of a probe for a given object and experimental ptychographic setup, we introduce the concept of diversity in the diffraction patterns and use dissimilarity metrics [55] to characterize our probes. We observe a strong correlation between these diversity metrics and the achieved image reconstruction quality. Based on these observations, we conclude that analyzing and optimizing diversity between wavelengths (or other modes) is an important aspect in the design of any multimode ptychography experiment.
2. DIVERSITY CONSIDERATIONS
In multiwavelength ptychography, the reconstruction algorithms are usually based on ptychographical information multiplexing (PIM) [30], which models measured polychromatic diffraction data as an incoherent sum of individual monochromatic diffraction patterns. For objects with grating-like structures, probe modes corresponding to different wavelengths have distinct diffraction angles and illuminate different areas of the detector [20,56], which helps the reconstruction algorithm identify the monochromatic components of the polychromatic diffraction pattern. However, for general imaging purposes, objects with arbitrary features do not guarantee spectral diversity. This creates a challenge for the reconstruction algorithm to reliably converge and accurately reconstruct all probe and object modes. Here, we explore the influence of illumination diversity on successful ptychographic reconstructions.
The concept of diversity enhancement is illustrated in Fig. 1. We consider a binary object illuminated by an XUV beam consisting of the 27th and the 29th harmonics of a 1030 nm wavelength drive laser, with either low or high diversity. The two harmonic beams are assumed to have equal photon flux for both diversity cases. For a low-diversity beam, we assume a flat wavefront with top-hat intensity distribution for both wavelengths [Fig. 1(a)]; for high diversity, we consider a beam with similar intensity profiles but having an OAM phase proportional to the harmonic order [Fig. 1(b)]. We model the propagation of the two-color beams from the sample plane to a camera plane that is placed in the far field. Figures 1(c) and 1(d) show images of the difference between the two wavelength components in the far-field diffraction patterns for the flat and OAM beam, respectively, in a saturated dynamic range in order to highlight the differences. It is clear that the diversity introduced by the wavelength-dependent OAM phase leads to strongly enhanced differences in diffraction between the modes.
Figure 1.Spectral diversity in diffraction. (a), (b) A binary object is illuminated by a beam containing the 27th and 29th harmonics (at 38.4 and 35.7 nm wavelength), either (a) with a flat intensity and phase or (b) with order-dependent OAM. (c), (d) Difference of the monochromatic diffraction patterns between the two wavelengths () for (c) flat and (d) OAM beam illumination. The dynamic range of the camera is set to and 15 bits for the flat beam and OAM beam, respectively, such that the number of photons in the incoherent sum of the monochromatic diffraction patterns is equal to in both cases.
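The effect illustrated in Fig. 1 can be reproduced with a short numerical model: a binary object is multiplied by a two-color beam, with or without order-dependent OAM, and each harmonic is propagated to a common detector grid with a direct Fraunhofer transform. The Python sketch below is a minimal illustration of this idea; the grid, object, beam radius, and detector geometry are illustrative placeholders rather than the exact parameters used for Fig. 1.

```python
import numpy as np

# Sample-plane grid (illustrative values)
N, dx = 256, 200e-9
x = (np.arange(N) - N // 2) * dx
X, Y = np.meshgrid(x, x)
R, PHI = np.hypot(X, Y), np.arctan2(Y, X)

# Simple binary test object (bars plus a disk)
obj = ((np.cos(2 * np.pi * X / 4e-6) > 0) | (R < 3e-6)).astype(float)

def far_field(order, oam_per_order, z=0.105, det_pix=13.5e-6, n_det=256):
    """Far-field intensity of one harmonic on a common detector grid (direct Fraunhofer DFT)."""
    lam = 1030e-9 / order                                  # harmonic of a 1030 nm driver
    beam = (R < 15e-6) * np.exp(1j * oam_per_order * order * PHI)
    u = (np.arange(n_det) - n_det // 2) * det_pix          # detector coordinates
    kernel = np.exp(-2j * np.pi * np.outer(u, x) / (lam * z))  # separable Fraunhofer kernel
    return np.abs(kernel @ (beam * obj) @ kernel.T) ** 2

for oam in (0, 1):  # flat beam vs. OAM charge proportional to the harmonic order
    i27, i29 = far_field(27, oam), far_field(29, oam)
    print(f"OAM per order = {oam}: relative difference between harmonics = "
          f"{np.abs(i27 - i29).sum() / (i27 + i29).sum():.3f}")
```

In this toy model the flat beams differ only through the wavelength-dependent diffraction scaling, whereas the order-dependent OAM phase (charge 27 versus 29) produces a much larger difference between the two monochromatic patterns, mirroring Figs. 1(c) and 1(d).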
The diversity in diffraction patterns can be quantified using various dissimilarity metrics [55] such as the L1 norm, the L2 norm, the cosine metric, and the Jensen-Shannon divergence (JSD). The former three metrics treat the $N\times N$-sized diffraction patterns as 1D vectors with size equal to $N^2$ and calculate the distance or angle between the vectors. JSD is a metric borrowed from information theory that compares two or more probability density functions (PDFs). To evaluate the JSD, we treat each diffraction pattern as the PDF of the diffracted beam over all detector pixels. The mathematical expressions for these metrics are
$$D_{L_p}=\frac{1}{J}\sum_{j=1}^{J}\Big(\sum_{q}\big|I_j^{(1)}(q)-I_j^{(2)}(q)\big|^{p}\Big)^{1/p},\qquad(1)$$
$$D_{\cos}=\frac{1}{J}\sum_{j=1}^{J}\left(1-\frac{\sum_{q}I_j^{(1)}(q)\,I_j^{(2)}(q)}{\big\|I_j^{(1)}\big\|_2\,\big\|I_j^{(2)}\big\|_2}\right),\qquad(2)$$
$$D_{\mathrm{JSD}}=\frac{1}{J}\sum_{j=1}^{J}\left[S\!\left(\frac{I_j^{(1)}+I_j^{(2)}}{2}\right)-\frac{S\big(I_j^{(1)}\big)+S\big(I_j^{(2)}\big)}{2}\right],\qquad(3)$$
where $p=1$ or $p=2$ for calculating the diversity according to L1 or L2 norm, respectively, $J$ is the number of scan positions, $I_j^{(1)}(q)$ and $I_j^{(2)}(q)$ are the two diffraction patterns being compared at scan position $j$ and detector pixel $q$, and $S(I)=-\sum_q I(q)\ln I(q)$, with $\ln$ the natural logarithm, is the spatial entropy functional as defined in Ref. [13]. The cosine metric and JSD are bounded metrics with $0\le D_{\cos}\le 1$ and $0\le D_{\mathrm{JSD}}\le\ln 2$. The L1 norm and L2 norm are instead unbounded metrics, and the absolute values of distances give practically no information about the similarity of two diffraction patterns, as they depend on the dynamic range that is assumed for the calculation of the norms. For this reason, we use relative L1 norm and L2 norm metrics, which are normalized with respect to the largest magnitude in the diffraction pattern series. The relative norms, albeit still unbounded, give more insight about the similarity of two diffraction patterns. Using these metrics, we can compare pairs of monochromatic diffraction patterns at a single scan position and calculate the spectral diversity. Similarly, we can compare pairs of polychromatic diffraction patterns that correspond to two adjacent scan positions in the ptychographic measurement and compute their scanning diversity.
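As a concrete counterpart to Eqs. (1)–(3), the following Python sketch evaluates the dissimilarity metrics for one pair of diffraction patterns; averaging over scan positions or wavelength pairs then proceeds as described above. The function names are ours, and the relative-norm scaling here uses the maximum of the pair, whereas in the full analysis the maximum over the entire diffraction pattern series is used.

```python
import numpy as np

def relative_lp_distance(i1, i2, p=1):
    """Relative Lp distance (p = 1 or 2) between two diffraction patterns."""
    scale = max(i1.max(), i2.max())            # use the data-set maximum in the full analysis
    return (np.abs(i1 / scale - i2 / scale) ** p).sum() ** (1.0 / p)

def cosine_distance(i1, i2):
    """One minus the cosine similarity of the patterns treated as 1D vectors (bounded in [0, 1])."""
    v1, v2 = i1.ravel(), i2.ravel()
    return 1.0 - np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))

def jensen_shannon_divergence(i1, i2, eps=1e-12):
    """JSD of the patterns treated as probability density functions (bounded by ln 2)."""
    p, q = i1 / i1.sum(), i2 / i2.sum()
    m = 0.5 * (p + q)
    entropy = lambda d: -np.sum(d * np.log(d + eps))   # spatial entropy with natural logarithm
    return entropy(m) - 0.5 * (entropy(p) + entropy(q))

# Example with two noisy synthetic "diffraction patterns"
rng = np.random.default_rng(0)
a = rng.poisson(50, (256, 256)).astype(float)
b = rng.poisson(50, (256, 256)).astype(float)
print(relative_lp_distance(a, b, 1), relative_lp_distance(a, b, 2),
      cosine_distance(a, b), jensen_shannon_divergence(a, b))
```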
Scanning diversity, where each additional scan position contributes new information to the ptychographic reconstruction algorithm, is beneficial. This is because the robustness of ptychography relies on the aggregated information from diffraction patterns recorded at partially overlapping areas of the object. With HHG beams, it is challenging to isolate the effects of spectral and scanning diversity in an experiment and to tune the amount of diversity in a continuous way. To gain more insight into the diversity provided by structuring the HHG beam and the subsequent improvement in the ptychographic reconstructions, we performed a series of numerical simulations. In these simulations, we consider an object that is illuminated by a monochromatic Gaussian-shaped probe beam with increasing divergence, keeping all other relevant parameters constant [beam size, 30.5 μm; scanning pattern, concentric, 200 scan positions; overlap, 87%; photon flux of probe, photons; distance between object and detector, 105 mm; probe wavelength, 38.25 nm; noise statistics, mixture of Poisson and Gaussian ]. Probe divergence can be considered as a simple, continuously tunable version of spatial beam structure, resulting in similar diversity variation as observed for the binary masks and OAM beams that we study experimentally (see below). Since the actual object is known, the quality of the ptychographic reconstruction can be calculated with the Fourier ring correlation (FRC) for the object [57]. The object used in the simulation is shown in Fig. 1. Examples of probes with increasing quadratic phase and the corresponding diffraction patterns are given in Figs. 2(b) and 2(a), respectively.
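As an illustration of how such a divergent probe can be parameterized, the sketch below generates a Gaussian amplitude with a quadratic (spherical) phase set by a radius of curvature $R$; the grid and the example radii are placeholders, not the values underlying Fig. 2.

```python
import numpy as np

def gaussian_probe_with_curvature(n, dx, waist, radius_of_curvature, wavelength=38.25e-9):
    """Gaussian probe with a quadratic phase corresponding to a spherical wavefront.

    n, dx               : grid size and pixel pitch at the object plane
    waist               : 1/e amplitude radius of the Gaussian envelope
    radius_of_curvature : wavefront radius of curvature R (smaller R = more divergent beam)
    """
    x = (np.arange(n) - n // 2) * dx
    X, Y = np.meshgrid(x, x)
    r2 = X**2 + Y**2
    envelope = np.exp(-r2 / waist**2)
    quadratic_phase = np.pi * r2 / (wavelength * radius_of_curvature)  # k r^2 / (2R)
    return envelope * np.exp(1j * quadratic_phase)

# Example: a 30.5 um diameter beam at 38.25 nm with increasing divergence (decreasing R)
probes = [gaussian_probe_with_curvature(512, 200e-9, 15.25e-6, R) for R in (1.0, 0.05, 0.01)]
```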
Figure 2.Scanning diversity and reconstruction quality of simulated data sets. (a) Logarithmic scale diffraction patterns are shown for beams with radii of curvature , , and , all illuminating the center of the object at scan position 0. (b) Three monochromatic probes at 38.25 nm wavelength with increasing quadratic phase and identical Gaussian intensity profile (30.5 μm diameter). (c) Diversity metrics L1, L2, cosine, and JSD obtained by comparing diffraction patterns between adjacent scan positions for a scan grid with the first 20 scan points, as a function of the quadratic phase of the probe. (d) Example FRC curves obtained by comparing independent reconstructions within the data sets of the three monochromatic probes. The intersection of the FRC curves with the 1-bit threshold line determines the object resolution. (e) Reconstruction quality calculated from the FRC as a function of the quadratic phase of the probe. The colored dots are extracted from (d).
The magnitudes of the various diversity metrics as a function of beam divergence are shown in Fig. 2(c). The diversity is calculated by comparing diffraction patterns from adjacent scan positions. We observe that all diversity metrics exhibit a similar trend with slight variations: the lowest diversity occurs when the object is illuminated by a flat wavefront, while the diversity stabilizes after a certain degree of curvature is added to the beam phase. For each probe curvature, we performed independent ptychography reconstructions, with the achieved object resolution determined using FRC. As examples, Fig. 2(d) shows the FRC results of three beams with increasing quadratic phase, demonstrating that the achieved resolution improves as beam divergence increases. Figure 2(e) summarizes the achieved object resolution. As beam divergence increases, the diversity between the diffraction patterns also improves. Consequently, the aggregated information from the diffraction patterns recorded at each scan position increases. Therefore, we attribute the better reconstructions obtained with more divergent beams to the higher scanning diversity they provide.
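The resolution values quoted throughout this work follow from the Fourier ring correlation and its 1-bit threshold [57]. A minimal implementation is sketched below, assuming two independent reconstructions of the same square field of view; the function name and return convention are ours.

```python
import numpy as np

def fourier_ring_correlation(img1, img2, pixel_size):
    """FRC between two independent reconstructions plus the 1-bit threshold curve [57].

    Assumes square images of identical shape; pixel_size in meters.
    Returns spatial frequencies [1/m], FRC values, and the 1-bit threshold per ring.
    """
    n = img1.shape[0]
    f1 = np.fft.fftshift(np.fft.fft2(img1))
    f2 = np.fft.fftshift(np.fft.fft2(img2))
    y, x = np.indices(img1.shape) - n // 2
    rings = np.hypot(x, y).astype(int).ravel()
    nbins = n // 2
    numerator = np.bincount(rings, (f1 * np.conj(f2)).real.ravel())[:nbins]
    power1 = np.bincount(rings, (np.abs(f1) ** 2).ravel())[:nbins]
    power2 = np.bincount(rings, (np.abs(f2) ** 2).ravel())[:nbins]
    pixels_per_ring = np.bincount(rings)[:nbins]
    frc = numerator / np.sqrt(power1 * power2)
    one_bit = (0.5 + 2.4142 / np.sqrt(pixels_per_ring)) / (1.5 + 1.4142 / np.sqrt(pixels_per_ring))
    freq = np.arange(nbins) / (n * pixel_size)
    return freq, frc, one_bit

# The resolution is 1 / f_c, where f_c is the lowest frequency at which frc drops below one_bit.
```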
3. RESULTS
A. Experiment Design
To test the use of diversity metrics in ptychography, we designed a series of experiments in which we perform multiwavelength ptychography in the extreme-ultraviolet wavelength range, while introducing varying degrees of diversity between the different wavelength components and scan positions. We use HHG as the illumination source, as it naturally provides coherent XUV radiation at multiple wavelengths in parallel. The concept extends to other broadband XUV sources such as pink beam synchrotrons and free-electron lasers. To control the amount of diversity, we use two ways to structure the illumination beams, as schematically indicated in Fig. 3. The first approach is the introduction of a binary mask in the HHG beam just before the imaging target. This mask leads to a finely structured beam at the object location, thus increasing the scanning diversity. In addition, the diffraction from the mask leads to increased spectral diversity at the object location as well. The second approach is to structure the HHG radiation by shaping the fundamental laser beam. Here, we use the property of the HHG process that OAM is upconverted in an order-dependent way, which naturally leads to a large spectral diversity at the object location while maintaining efficient HHG. The rapid angular phase profiles of these OAM beams also result in a high scanning diversity. With these different HHG beams, we perform ptychography scans on a resolution test chart. The resulting data are reconstructed using our PIE-based algorithm [58] and analyzed to determine the link between diversity and image reconstruction quality.
Figure 3.Experimental setup. (a) The driving NIR laser is focused by an focal length lens into an argon gas jet. An Al filter blocks the fundamental, and the high harmonics are refocused by a pair of broadband multilayer mirrors onto the sample. A CCD camera is placed approximately 10 cm from the focal plane. (b), (c) Intensity profile of the driving laser at the gas jet plane for generating (b) Gaussian and (c) OAM XUV beams. (d)–(f) Polychromatic beam intensities for (d) Gaussian, (e) vortex, and (f) structured beam, computed upstream from the sample plane at distances 8.1, 6, and 1.625 mm (mask plane), respectively. (g) Measured and reconstructed spectrum of the XUV radiation after the mirrors, plotted along with the reflectivity curve of the XUV mirrors. (h) Scanning electron microscope image of the imaging target. (i)–(k) Polychromatic diffraction patterns from illumination of the central part of the object with (i) Gaussian, (j) vortex, and (k) structured beam.
The experimental setup is shown in Fig. 3(a). An NIR laser is focused in an argon gas jet to generate high harmonics. Detailed information about the HHG source is given in Appendix A.1. Before the focusing lens, we can insert a spiral phase plate (Vortex Photonics V-1064-20-1 [59]) in the beamline to generate a vortex fundamental beam with OAM equal to 1.
Behind the gas jet, a 200 nm aluminum membrane filters out the fundamental beam and the high harmonics are directed and focused by a pair of plane and curved () multilayer mirrors onto the sample. A secondary sample stage is placed at 1.62 mm distance in front of the sample plane, allowing us to place a binary beam-structuring mask or a circular aperture with 50 μm diameter. The binary mask consists of a set of 2 μm holes oriented in a slightly distorted periodic grid separated by an average distance of 5 μm. The 50 μm aperture acts as a spatial filter for the XUV Gaussian beam and minimizes the leakage of the fundamental beam on the camera, while no mask or aperture is used for the vortex beam measurements. As the Gaussian beam at the focus is smaller compared with the structured beams, the measurement with the Gaussian probe is performed with the sample placed 4 mm behind the focus, where the beam has expanded to a comparable size with respect to the focused vortex or mask-structured beam.
The imaging object is a home-built USAF-1951 resolution target with printed logos of VU and ARCNL at the center of the target [Fig. 3(h)]. It is oriented in a 45-deg configuration so that the diffraction of the bars is along the diagonal of the camera, where the detection NA is maximized. Typical multispectral HHG diffraction patterns corresponding to illumination of the central area of the sample are shown in Figs. 3(i)–3(k) for the three considered beams (Gaussian, vortex, structured), assuming equal photon budgets for the three beams. From the individual polychromatic diffraction patterns, it can be seen that the vortex beam leads to the highest NA data, as the diffraction pattern has spread to higher angles on the detector. However, we expect that the overall reconstruction quality is not merely a function of the effective NA, but it will be described more completely by the spectral and scanning diversity metrics.
Figure 3(g) shows the XUV spectrum, measured from the diffraction of the HHG beam through a transmission grating with 500 nm pitch (solid line) and compared with the spectral weights of the reconstructed probes shown in the next section. The Gaussian-shaped envelope of the spectrum has been formed by the efficiency of the XUV mirrors [dashed line in Fig. 3(g)] that favor the reflection of the 27th harmonic (38.3 nm), while harmonics 25 and 29 have about three times lower first-order diffraction signal on the camera, and harmonics 23 and 31 have about 10% of the signal strength of the brightest harmonic.
C. Ptychographic Imaging with Different Probes
In order to have a fair comparison of the reconstruction quality of ptychographic imaging experiments with different beam types, it is important that other experimental settings, such as sample-to-camera distance, illumination overlap between adjacent scan positions, and probe energy, are identical. However, in practice, the distance between object and camera is slightly different for measurements with different beam types and varies between 104.8 and 108 mm to achieve a desired beam size at the sample plane. The variation in the distance leads to a 3% variation of the achievable diffraction-limited resolution. Moreover, in order to maximize the captured information from each ptychography scan, we aimed to utilize the full dynamic range of the camera by adjusting the exposure time and preamplification gain of the camera. This strategy inevitably leads to unequal photon budgets of the diffraction patterns for different beam types, as smooth beams more readily saturate the zeroth-order diffraction at the center of the camera. A possible solution to this issue would be high dynamic range exposures [14,18,54,60] during the measurement of the Gaussian beam, which, however, significantly increases the measurement time and tightens the long-term drift and stability requirements.
The ptychographic data sets for the Gaussian and vortex beams consist of 218 scan positions in a concentric scan grid with 6 μm step size and 104 μm field of view. For the structured beam, we used a scan grid with smaller step size (2.45 μm) and field of view (44 μm) due to an underestimation of the probe size. Due to the irregular intensity profile of the nonsmooth beam types, characterization of the overlap with a linear overlap factor [61] is not accurate. Therefore, we have defined the overlap as the 2D average cross-correlation of a binarized version of the polychromatic beam with a translated version of itself to an adjacent scan position. According to this definition, the overlap is equal to 74% with a standard deviation of 5.8% for the Gaussian beam, 68% with a standard deviation of 8.3% for the vortex beam, and 88.6% with a standard deviation of 10% for the structured beam.
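The overlap definition used here (2D cross-correlation of a binarized beam with a copy shifted to an adjacent scan position) can be written compactly. The sketch below assumes the polychromatic beam intensity is available on the object-plane pixel grid; the 50% binarization threshold and the normalization by the footprint area are our assumptions, not values specified above.

```python
import numpy as np

def scan_overlap(beam_intensity, shift_pix, threshold=0.5):
    """Overlap between a binarized beam footprint and a copy shifted to an adjacent scan position.

    beam_intensity : 2D array, polychromatic beam intensity at the object plane
    shift_pix      : (dy, dx) scan step expressed in object-plane pixels
    threshold      : binarization level relative to the beam maximum (assumed here)
    """
    footprint = beam_intensity > threshold * beam_intensity.max()
    shifted = np.roll(footprint, shift_pix, axis=(0, 1))  # wraps around; fine for a beam well inside the FOV
    # Cross-correlation of the binary masks at this shift, normalized by the footprint area
    return np.logical_and(footprint, shifted).sum() / footprint.sum()

# Averaging scan_overlap() over all pairs of adjacent scan positions gives the mean
# overlap and standard deviation quoted for the Gaussian, vortex, and structured beams.
```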
Figure 4 shows the reconstruction results from the ptychography measurements with the three different beam types. These results were obtained using two incoherent probe modes for each wavelength, similar to Ref. [22], in order to account for decoherence and other sources of noise in the forward model [15,22]. To reduce the complexity of the problem, we constrained the object to look identical for all wavelengths, given that we use a binary, nondispersive object. More details on the PIE-based reconstruction algorithm and the reconstruction results are given in Appendices B and C, respectively. The reconstruction quality of the object upon vortex and structured beam illumination is clearly better than for Gaussian beam illumination, with fewer artifacts and sharper edges. Since the reconstructions give complex-valued expressions for the object and the probe, the object has been numerically propagated to remove a defocus term that is caused by calibration errors of the wavelengths or the sample to camera distance.
Figure 4.Reconstruction results from ptychographic measurements for (a) Gaussian, (b) vortex, and (c) structured probes. Right: amplitude of reconstructed objects. Top left: zoomed-in areas of the object. Bottom left: dominant modes of the reconstructed probes of the five brightest harmonics. Scale bars in all figures correspond to 20 μm.
In Fig. 4, we show reconstructions for the dominant mode of the five brightest probes that range from 33.4 to 45 nm. The vortex probes are elongated due to the presence of astigmatism, as explained in more detail in Appendix A.3. For all beam shapes, the spectral weights reconstructed from ptychography are consistent with the grating measurement [Fig. 3(g)]. Small variations are apparent, as the ptychography scan effectively measures the average diffracted radiation flux from the object across the scanned area for all wavelength components. The resulting spectrum may differ from the grating measurement that was acquired by only sampling part of the Gaussian beam. For the smooth beam, the wavefronts of the weaker harmonics have more artifacts and are less trustworthy compared with the weak harmonics of the structured and vortex beam.
The significant difference in the quality of the reconstruction for the smooth beam can be attributed to the smaller photon budget on the camera, to the lower illumination diversity, or, most likely, to a combination of the two factors.
D. Simulations with Reconstructed Probes under Comparable Experimental Conditions
To determine the cause of the difference in imaging performance and exclude the potential influence of experimental conditions, we set up simulations using the actual reconstructed probes and the SEM image of the object. In this simulation, we ensured identical parameters such as object-to-camera distance , spectral weights, overlap , and probe energy. Specifically, the probe energy was normalized such that the data set with the vortex beams would have a 15-bit dynamic range. A combination of Poisson noise and Gaussian noise with standard deviation counts was added to the ptychograms.
The object and probe reconstructions of the synthetic ptychographic data are shown in Fig. 5. The improved imaging results with nonsmooth beams [Figs. 5(e), (f), (i), and (j)] compared to results with smooth beams [Figs. 5(a) and (b)] remain clear, with similar trends to those shown in Fig. 4. Note that there are slight discrepancies between the probes in Figs. 4 and 5, because the synthetic data were generated based on single-mode reconstructed beams, since no instability or decoherence effects were considered during the simulation. The assumption of perfectly coherent harmonic probes does not affect the hypothesis examined in this simulation, namely that the diversity introduced by a structured beam can enhance the reconstructed image quality.
Figure 5.Reconstruction results from synthetic ptychographic datasets with (a)–(d) Gaussian, (e)–(h) vortex, and (i)–(l) structured probes. (a), (e), (i) Amplitude of the reconstructed object. (b), (f), (j) Zoomed-in area of the object group 9/elements 5 and 6. (c), (g), (k) Probe reconstructions of the five brightest harmonics. (d), (h), (l) FRC computed by comparing object reconstructions with true object. Scale bars in all figures correspond to 20 μm.
The achieved object resolution is determined using FRC between the true simulated object and one object reconstruction per probe beam and is shown in Figs. 5(d), 5(h), and 5(l). The resolution using the 1-bit criterion is equal to 400 nm for the vortex beam, 1251 nm for the Gaussian beam, and 414 nm for the structured beam. For the current experimental parameters, the highest achievable, diffraction-limited, resolution is 129 nm, assuming the shortest contributing wavelength component is 33.4 nm.
E. Characterization of Probe Diversity
To compare the spectral and scanning diversity for the different beam types, we calculate spectrally resolved diffraction patterns at the detector plane. The probes, object, and scan grid are identical to what we used in the simulations presented in Section 3.D, so that we can correlate the diversity metrics to the reconstruction results in Fig. 5. Moreover, the diversity metrics are calculated based on stable and coherent harmonic beams, without any long-term drifts.
The diversity is calculated according to the dissimilarity metrics presented in Section 2, and the results are shown in Fig. 6. The diffraction patterns used as input to Eqs. (1) and (2) for the calculation of diversity according to L1 norm, L2 norm, and the cosine metric were normalized with respect to the maximum pixel value over the full data set [$\max_{j,q} I_j(q) = 1$]. We selected this normalization approach, as it conserves relative intensity differences among different patterns, which contain relevant information that influences ptychography algorithm performance. On the other hand, for the diversity metric according to JSD [Eq. (3)], each diffraction pattern was normalized independently such that $\sum_q I_j(q) = 1$, in accordance with the original definition of entropy from information theory. Different normalization strategies can be chosen: diversity results for different cases are given in Appendix D (Figs. 9 and 10), which show different absolute values but similar trends.
Figure 6.Diversity metrics for different probe beam structures. (a) Scanning diversity of polychromatic diffraction patterns. (b) Spectral diversity between diffraction patterns at wavelengths of 35.6 and 38.3 nm. (c) Spectral diversity between diffraction patterns at wavelengths of 38.3 and 41.4 nm. The solid lines indicate the mean values of comparing adjacent scan positions (for scanning diversity) or wavelengths (for spectral diversity) over the whole diffraction patterns series, while the shaded areas have a width of one standard deviation.
As shown by the calculated diversity metrics in Fig. 6, the vortex beam leads to the largest scanning diversity and spectral diversity, while the structured beam follows closely with high diversity values, especially according to cosine metric and JSD. These results are in close agreement with the reconstruction results of Fig. 5, where the vortex and structured beams were shown to lead to better object reconstructions than the Gaussian beam. Out of the four different metrics, the JSD and cosine metrics reflect this difference in ptychography performance, showing significantly higher diversity values for the vortex and structured beams in a way that correlates with the image reconstruction quality. In comparison, the L1- and L2-norms are less clear, showing larger variance and smaller differences between the beams.
F. Fisher Information Analysis
To further analyze the influence of structured illumination on the ptychographic reconstruction quality, we compare the Fisher information for the three previously described and experimentally reconstructed probes: Gaussian, vortex, and structured beam. The Fisher information quantifies the amount of information a measured diffraction pattern contains about an unknown parameter $\theta$, thereby setting a lower bound on the achievable precision in estimating that parameter (the Cramér–Rao lower bound). Given the observed improvement in ptychographic reconstruction quality with increased illumination diversity, it is worth investigating whether this improvement is accompanied by an increase in Fisher information. Such a finding would bolster our claim that the increased diversity in diffraction patterns achieved through structured illumination leads to more informative measurements, which subsequently enable better object reconstructions.
In general, the Fisher information is defined as $\mathcal{I}(\theta)=\mathbb{E}\big[\big(\partial_\theta \ln p(X;\theta)\big)^2\big]$, where $\mathbb{E}$ denotes the expectation operator with respect to noise fluctuations, and $p(X;\theta)$ denotes a probability density function of a random variable $X$ representing the observed data [62,63]. The term $\partial_\theta \ln p(X;\theta)$ represents the partial derivative of the natural logarithm of the probability density function with respect to the parameter $\theta$. In other words, the Fisher information describes how sensitive the measurement is to changes in $\theta$. This sensitivity is directly related to the concept of diversity in diffraction patterns, as a more diverse set of patterns can be expected to contain more information about the object and its parameters.
In the case of ptychography, where the measurement noise is assumed to follow a Poisson distribution, the Fisher information associated with the $j$th diffraction pattern for a single parameter $\theta$ can be expressed as [64]
$$\mathcal{I}_j(\theta)=\sum_{q}\frac{1}{\mu_j(q;\theta)}\left(\frac{\partial \mu_j(q;\theta)}{\partial\theta}\right)^{2},$$
where $\mu_j(q;\theta)$ is the expected photon count at detector pixel $q$, and the sum runs over all pixels of the detector.
To explore the relationship between structured illumination and Fisher information, we consider two types of object parameters: a dimensionless scaling factor $s$, representing overall changes in the object’s size; and a phase shift $\phi$ in radians, representing changes in the object’s optical thickness. We study the same object shape as previously used in Section 2 but as a transparent phase object instead of a binary amplitude object. Assuming a sample-to-camera distance of and the same experimental design as used in our experimental setup featuring an HHG source, we can numerically calculate the diffraction patterns for each parameter using the multiwavelength XUV probes that were reconstructed and presented in Section 3.C. The probe intensities are all normalized to a total photon count of photons, and the object is scanned through the beams in a concentric pattern comprising 219 positions. Using a centered finite difference scheme, we can then estimate the Fisher information as
$$\mathcal{I}_j(\theta)\approx\sum_{q}\frac{\big[\mu_j(q;\theta+\Delta\theta)-\mu_j(q;\theta-\Delta\theta)\big]^{2}}{4\,\Delta\theta^{2}\,\big[\mu_j(q;\theta)+\epsilon\big]},$$
where $\epsilon$ denotes a regularization parameter with the physical interpretation of the expected value of additive Poissonian noise, and the step sizes $\Delta\theta$ are chosen as for the scaling factor, and for the phase shift, respectively.
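The numerical estimate above amounts to a finite-difference derivative of the expected counts combined with the Poisson Fisher information. The Python sketch below illustrates this computation for a generic forward model; the function names and the default regularization value are ours, and the forward model itself (probes, object, propagator) is left as a user-supplied callable.

```python
import numpy as np

def poisson_fisher_information(mu_plus, mu_minus, mu_0, dtheta, eps=1.0):
    """Fisher information of one diffraction pattern for a single parameter theta,
    assuming Poisson noise and a centered finite-difference derivative.

    mu_plus, mu_minus, mu_0 : expected counts at theta + dtheta, theta - dtheta, and theta
    dtheta                  : finite-difference step size
    eps                     : regularization, interpreted as additive Poissonian background
    """
    dmu_dtheta = (mu_plus - mu_minus) / (2.0 * dtheta)
    return np.sum(dmu_dtheta ** 2 / (mu_0 + eps))

def total_fisher_information(forward_model, theta, dtheta, scan_positions, eps=1.0):
    """Sum the per-pattern Fisher information over all scan positions.

    forward_model(theta, pos) must return the noise-free diffraction pattern
    (expected photon counts) for parameter value theta at scan position pos.
    """
    return sum(
        poisson_fisher_information(forward_model(theta + dtheta, pos),
                                   forward_model(theta - dtheta, pos),
                                   forward_model(theta, pos),
                                   dtheta, eps)
        for pos in scan_positions)
```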
The Fisher information for the two object parameters, evaluated independently of each other, is shown in Fig. 7. To assess the distribution of information across the whole ptychography scan, we visualize the Fisher information per diffraction pattern as violin plots and normalize them by the average information that is achieved by Gaussian beam illumination. For the object scaling parameter [Fig. 7(a)], we observe a significant increase of information using structured illumination. Specifically, the vortex probe exhibits a total Fisher information about 3.4 times higher than the Gaussian probe. This translates into a reduction in the standard deviation of the estimate for this parameter by about 1.8 times, calculated as the square root of the inverse of the Fisher information (from the definition of the Cramér–Rao lower bound).
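Written out via the Cramér–Rao lower bound (with $\mathcal{I}$ the summed Fisher information of the full scan), this factor for the scaling parameter $s$ follows directly as
$$\sigma_{s}\;\ge\;\frac{1}{\sqrt{\mathcal{I}(s)}}\qquad\Longrightarrow\qquad \frac{\sigma_{s}^{\,\mathrm{Gaussian}}}{\sigma_{s}^{\,\mathrm{vortex}}}\;=\;\sqrt{\frac{\mathcal{I}^{\,\mathrm{vortex}}(s)}{\mathcal{I}^{\,\mathrm{Gaussian}}(s)}}\;\approx\;\sqrt{3.4}\;\approx\;1.8.$$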
Figure 7.Fisher information for different probe beam structures. The shape of the violin plots describes the distribution of Fisher information per diffraction pattern (for a total of 219 scanning positions). The Fisher information is normalized by the average information achievable with a Gaussian probe. (a) Fisher information associated with a parameter that determines the overall size and scale of the phase object shown as an inset plot. (b) Fisher information associated with the phase of the phase object shown as an inset plot.
This finding supports our expectation that the increased diversity in the diffraction patterns leads to a higher sensitivity to changes of a parameter that is closely linked to the faithful reconstruction of the object’s size and shape. The scaling factor directly affects the spatial frequencies in the diffraction patterns, which are crucial for achieving high-resolution reconstructions. Moreover, the scaling factor is intrinsically related to experimental parameters such as the wavelength and object-to-detector distance [65]. The enhanced sensitivity to the scaling factor through structured illumination can therefore lead to more accurate and precise object reconstructions as well as improved retrieval of the experimental geometry from the ptychographic data set.
Interestingly, we do not observe a significant difference in the Fisher information between structured and unstructured illumination for the phase parameter [Fig. 7(b)]. This suggests that phase sensitivity in ptychography may depend on additional factors beyond beam structure, such as phase-matching conditions, as discussed in Ref. [66]. It is important to note that estimating the single parameter of the object’s phase assumes prior knowledge of its shape. This assumption reduces the complexity of the ptychographic algorithm by alleviating the need to simultaneously retrieve spatial features, which may explain the relative invariance of the achievable estimation precision to the illumination diversity used.
4. DISCUSSION
The concept of diversity metrics using diffraction patterns as a means to assess the expected image quality in ptychography is found to work well. The diversity among scan positions and the spectral diversity in a multiwavelength ptychography experiment can be characterized with such metrics. From the Fisher information analysis, it follows that beams with high spatial and spectral diversity lead to increased sensitivity to spatial properties in ptychography data sets. As a result, ptychography with such high-diversity beams can be expected to lead to better image reconstructions at similar photon numbers and scan times. Although diversity metrics provide less quantitative insight than Fisher information, they consider the overall information content between measurements rather than the sensitivity to single parameters, which makes them well-suited in the assessment of imaging performance. Diversity metrics based on estimated diffraction patterns can therefore provide a way to optimize illumination beam profiles and secondary experimental parameters. From our experiments, we find that the JSD and cosine metric show clear correlation with image quality. These metrics have the additional advantage that they are bounded, meaning that the calculated diversity can be compared with the maximum possible diversity of a given data set. While this correlation is not a direct predictor of image quality, it does allow a comparison of the expected imaging performance with different beams and objects, giving valuable insight already in the design phase of a ptychography experiment.
In contrast, the L1- and L2-norms show larger variations and less sensitivity to image reconstruction quality for different beams. This is likely due to the structure of these metrics [Eq. (1)], which are more sensitive to absolute differences in intensity. Therefore, they are less reliable in quantifying structural changes between different diffraction patterns, in which the information for ptychography reconstructions is mainly contained.
Our experiments were designed in such a way that we could compare the spatial and spectral diversity when using beams without structure, with mainly spatial diversity induced by a binary mask, and with mainly spectral diversity through wavelength-dependent OAM. We find that both of these structured beam approaches provide enough spectral and scanning diversity to enable object reconstructions with improved resolution and fewer artifacts. Further, individual probe modes that correspond to different wavelengths of the polychromatic beam are reconstructed in a more robust and reproducible way.
While the structured beams indeed lead to improved image reconstructions, an interesting finding is that both types of structuring result in comparable image quality and resolution. A possible explanation is the strong wavelength dependence of far-field diffraction, as a larger separation between diffraction orders results in better wavelength-resolved patterns as well. Strikingly, the spectral diversity between some harmonic orders is actually higher for the binary mask structuring than for the OAM beams [Fig. 6(b)].
In the present work, the available HHG flux was the limiting factor in the experiments, as the measurements required exposure times that made the ptychography scans susceptible to drifts in beam pointing and ambient changes. In particular, for the vortex XUV beam, the required exposure time was almost twice as long as for the diffraction-based structured beam to reach similar flux. This additional measurement time may also have led to a reduced reconstruction quality, which could offset the present conclusions, given that the OAM beam shows the highest diversity among all the beams. Improving the HHG flux and using additional long-term stabilization systems would remove these uncertainties. Nevertheless, even though the beam structuring methods reduce the available HHG flux, the increased diversity remains a driver for reconstruction improvements. First demonstrations of nanoscale-resolution ptychography on dispersive samples used monochromatic XUV light [15,16]. The spectrally resolved probe reconstruction with multiwavelength structured illumination in this work paves the way for broadband imaging of dispersive samples, aided by diversity metrics to design the required illumination profiles. This approach can unlock the full potential of broadband XUV imaging systems for semiconductor wafer metrology and biological materials.
5. CONCLUSION
In conclusion, in this study, we introduce diversity metrics in order to quantify the suitability of a coherent illumination type for ptychography experiments. We performed comparative measurements with various multiwavelength XUV beam types, namely, Gaussian, OAM, and structured by diffraction from a binary mask. Simulation and experimental results verify that increased scanning and spectral diversity of diffraction patterns leads to improved imaging results at a given photon flux and measurement time. These diversity metrics therefore provide an intuitive design guideline for ptychography experiments, enabling a comparison of expected image reconstruction quality for different beam profiles and objects.
Acknowledgment
A. P., J. S., and S. W. acknowledge funding from the Dutch Research Council (Perspectief program LINX). F. Z., M. G., and S. W. acknowledge support from the ERC (ERC-CoG project 3D-VIEW). This work was carried out at ARCNL, a public–private partnership among the University of Amsterdam (UvA), Vrije Universiteit Amsterdam (VU), Rijksuniversiteit Groningen (RUG), the Dutch Research Council (NWO), and the semiconductor equipment manufacturer ASML.
APPENDIX A: MATERIALS AND METHODS
Drive Laser for High Harmonic Generation
Our table-top HHG source is driven by an ultrafast NIR laser system. With an ytterbium-based laser system (Pharos from Light Conversion) delivering 170 fs pulses at a center wavelength of 1030 nm, 2 mJ pulses are obtained at a repetition rate of 1 kHz. For efficient high-harmonic generation, the pulses are compressed by a home-built post-compression system to a pulse duration of with an average power of 1.5 W [67]. The NIR beam is subsequently focused by an lens into an argon gas jet confined in a 1 mm diameter metal tube, at a backing pressure of 5 bar. Moreover, an iris clips the beam before the focusing lens in order to improve the phase-matching conditions for HHG.
Sample Preparation
The binary USAF 1951 resolution target as used in the experiments is fabricated on a 120 nm thick gold layer sputter-coated onto a 50 nm silicon nitride freestanding membrane (Ted Pella Inc.). Patterning was performed with a 30 keV focused gallium ion beam (FEI Helios Nanolab 600) with a current of 0.28 nA and a dwell time of 1000 ms. An SEM image of the USAF target is shown in Fig. 3(h). In our case, the sample thickness and the smallest structure in the sample meet the condition , which is referred to here as the projection approximation [68]. Therefore, the sample is mathematically represented by a 2D transmission function, which is obtained by a projection of the refractive index along one spatial dimension.
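For reference, in the projection approximation the 2D transmission function takes the standard textbook form (not specific to this work)
$$T(x,y)=\exp\!\left(\mathrm{i}\,\frac{2\pi}{\lambda}\int\big[n(x,y,z)-1\big]\,\mathrm{d}z\right),$$
where $n(x,y,z)$ is the complex refractive index of the sample and the integration runs along the propagation direction $z$.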
Extreme Ultraviolet Optics
The XUV mirrors that have been used for focusing the HHG beams to the sample are molybdenum/silicon multilayer mirrors fabricated by optiXfab GmbH [69]. We use one plane and one curved mirror to steer and refocus the beam, respectively. The bandwidth coverage of the mirrors is broadband (20–55 nm) for the plane mirror and narrowband centered at 39 nm for the curved mirror, at an average reflectivity of 20% per mirror.
The indicated angle of incidence for maximized reflectivity is 5 deg. However, oblique incidence on the curved mirror leads to astigmatism, as can be clearly observed in the ptychographic reconstruction of the vortex probes in Fig. 4(b). A small amount of astigmatism is also noticeable in the smooth beam reconstruction [Fig. 4(a)]. The induced astigmatism to the wavefront is equal to , with the incidence angle and the radius of curvature of the mirror [70]. OAM beams have larger divergence compared with Gaussian beams, so the beam has a larger size on the curved mirror, and the effect of astigmatism becomes stronger.
Data Acquisition
The XUV camera in our experimental setup (Andor Ikon-L 936SO, pixels, pixel size 13.5 μm, 15-bit dynamic range) has a constant background offset of approximately 320 counts for unbinned data when cooled to . The CCD pixels were read out at a rate of 1 MHz with preamplifier gain for the vortex and structured beam measurements, where the signal was lower than in the smooth beam measurement.
To fully utilize the dynamic range of the camera, we set different exposure times for each measurement, equal to 10 s, 350 ms, and 6 s for the vortex, Gaussian, and structured beams, respectively. This difference in exposure times per diffraction pattern creates different sensitivities to possible beam drifts and spectral jitter. To monitor slow drifts, we recorded a diffraction pattern at one specific scan position several times throughout the ptychographic measurement as well as the polychromatic bare beam position before and after the measurement, and we did not observe any significant drift for any of the measurements. Spectral and pointing jitter can lead to blurring of the diffraction patterns, especially for long exposure times. The jitter can be modeled as a degree of incoherence in the beam, which we treated algorithmically by decomposing each high harmonic wavefront into incoherent modes. Given the limited amount of jitter, we found that using two modes is sufficient for ptychographic reconstruction.
The sample is mounted on a 3D translation stage (Smaract SLC-1730). The translation lateral to the beam is required to perform ptychography scans and the longitudinal translation allows us to select a desired beam size and divergence at the sample plane.
The ptychographic reconstructions were performed with PtyLab.py [58]. In PtyLab, the PIM [30] algorithm has been implemented to describe the measured far-field diffraction pattern at position $\mathbf{r}_j$ as the incoherent sum of monochromatic diffraction patterns. For the specific experimental settings shown in Fig. 3, we have modified the general forward model expression to the following:
$$I_j(\mathbf{q})=\sum_{\lambda}\sum_{k}\Big|\mathcal{D}_{\lambda}\big[P_{\lambda,k}(\mathbf{r})\,O(\mathbf{r}-\mathbf{r}_j)\big]\Big|^{2}+B,$$
where $\lambda$ denotes the wavelength, $k$ refers to the incoherent mixed states $P_{\lambda,k}$ of each probe that account for sources of decoherence [15], the binary object $O$ is identical for all wavelengths, and $\mathcal{D}_{\lambda}$ is the scaled angular spectrum propagator [58,71] that permits the propagation of an electromagnetic wave under the Fresnel approximation [72] with wavelength-independent pixel size at the object plane. The mixed states of the probes are orthogonalized during the reconstruction via singular value decomposition. In Fig. 4, we show the probes with the highest singular values that correspond more to the physical representation of the beams, while for all simulations we do not use mixed states. Further, the general forward model expression allows the object to be different for every wavelength; in our demonstration, however, we have simplified the formula by using a single object that looks identical for all wavelengths. The phase plots of the object shown in Fig. 8 verify that the object is binary, and there is no requirement for considering a different object representation for each wavelength. $B$ is a constant background, iteratively updated according to Ref. [73], which is added to the forward model in order to take into account leakage of the fundamental beam to the detector. The update rules in the $(n+1)$th iteration for probe and object are, then, according to [30,58]
$$P^{(n+1)}_{\lambda,k}(\mathbf{r})=P^{(n)}_{\lambda,k}(\mathbf{r})+\beta\,\frac{O^{(n)*}(\mathbf{r}-\mathbf{r}_j)\,\Delta\psi_{\lambda,k,j}(\mathbf{r})}{(1-\alpha_P)\,\big|O^{(n)}(\mathbf{r}-\mathbf{r}_j)\big|^{2}+\alpha_P\max_{\mathbf{r}}\big|O^{(n)}(\mathbf{r}-\mathbf{r}_j)\big|^{2}},$$
$$O^{(n+1)}(\mathbf{r}-\mathbf{r}_j)=O^{(n)}(\mathbf{r}-\mathbf{r}_j)+\beta\,\frac{\sum_{\lambda,k}P^{(n)*}_{\lambda,k}(\mathbf{r})\,\Delta\psi_{\lambda,k,j}(\mathbf{r})}{(1-\alpha_O)\sum_{\lambda,k}\big|P^{(n)}_{\lambda,k}(\mathbf{r})\big|^{2}+\alpha_O\max_{\mathbf{r}}\sum_{\lambda,k}\big|P^{(n)}_{\lambda,k}(\mathbf{r})\big|^{2}},$$
where $\Delta\psi_{\lambda,k,j}$ is the difference between the exit wave after applying the measured-intensity constraint and the current exit wave $P^{(n)}_{\lambda,k}(\mathbf{r})\,O^{(n)}(\mathbf{r}-\mathbf{r}_j)$. Since we use a single object representation for all wavelengths, we have modified the update rule given in Ref. [58] accordingly. Specifically, the update for the object is derived via the accumulated gradients from all spectral and decoherence modes of the probe. Moreover, for the reconstructions shown in Fig. 4, the regularization parameters $\alpha_O$ and $\alpha_P$, which were first introduced in Ref. [74], were chosen equal to 0.99 in order to penalize updates of pixels with low signals, and the update step size $\beta$ was adjusted manually during the reconstruction from 0 to 0.3. Since the polychromatic beam on the camera plane was recorded before each ptychographic measurement, we implemented the modulus-enforced probe technique [75] within the reconstruction process. A good initial guess based on earlier reconstructed results was used for the object in the results shown in Fig. 4; however, the algorithms also converged with slightly worse performance if prior knowledge was assumed only for the probes and not for the object.
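A minimal numerical sketch of this incoherent multiwavelength forward model is given below for one scan position. It is not PtyLab code: a plain centered FFT stands in for the wavelength-scaled propagator, the scan translation is applied as a periodic shift, and the argument names are ours.

```python
import numpy as np

def polychromatic_pattern(probes, obj, shift, background=0.0):
    """Incoherent multiwavelength forward model for a single scan position.

    probes     : dict mapping wavelength -> list of mixed-state probe arrays (complex, same shape as obj)
    obj        : complex object transmission, shared by all wavelengths (binary-sample assumption)
    shift      : (dy, dx) scan translation in object-plane pixels (applied periodically for simplicity)
    background : constant count level modeling leakage of the fundamental beam onto the detector
    """
    obj_shifted = np.roll(obj, (-shift[0], -shift[1]), axis=(0, 1))
    intensity = np.zeros(obj.shape, dtype=float)
    for states in probes.values():          # sum over wavelengths ...
        for probe in states:                # ... and over incoherent mixed states per wavelength
            exit_wave = probe * obj_shifted
            # A plain centered FFT replaces the wavelength-scaled angular spectrum propagator here
            far_field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(exit_wave)))
            intensity += np.abs(far_field) ** 2
    return intensity + background
```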
Figure 8.Complementary object and probe reconstructions for experimental data. (a)–(c) Complex-valued representations of the reconstructed object for (a) Gaussian, (b) vortex, and (c) structured beams. (d)–(f) Amplitude of the partially coherent 27th harmonic (38.3 nm) beams at the object plane. (g)–(i) Amplitude and complex-valued plots of the incoherent modes of the 27th harmonic. In all complex-valued plots, brightness corresponds to amplitude and hue to phase. Scale bars in all figures are equal to 20 μm.
APPENDIX C: ADDITIONAL INFORMATION ABOUT THE RECONSTRUCTION OF EXPERIMENTAL DATA
Object Reconstruction
In ptychography, the imaging results are typically complex-valued expressions for the probe and object that correspond to the laser beam amplitude and phase and to the transmission (or reflection) function of the sample under examination. In this work, we demonstrated our concept on a binary sample that is either fully opaque or fully transparent to all wavelengths. However, the reconstruction algorithm was not restricted to converge to a real-valued object. The complete object reconstruction results, after numerical propagation that removes any defocusing effects, are shown in Figs. 8(a)–8(c) for the three tested beam cases. We observe that the algorithm has indeed converged to a flat-phased object reconstruction for all beam cases, with only a minor residual phase variation of at the edge of the object when illuminated by a vortex beam.
Probe Reconstruction
As mentioned in Section 3.C and Appendix B, during the reconstruction, we use two incoherent probe modes, also called “mixed states,” per wavelength. Figures 8(d)–8(f) show the amplitudes of the incoherent sums of the modes of the 27th harmonic (38.3 nm) for the three different beam types, which correspond to a physical representation of the beam amplitude at this wavelength. We observe that both proposed methods to structure the HHG beam (diffraction mask-based and introducing OAM) lead to highly structured beam profiles. Finally, Figs. 8(g)–8(i) show amplitude and complex-valued plots of the incoherent probe modes of the 27th harmonic, with the percentage of the total energy that is included in each mode.
APPENDIX D: DIVERSITY METRICS USING DIFFERENT NORMALIZATION STRATEGIES
The diversity metrics that have been used throughout this work, namely, the L1-norm and L2-norm, the cosine metric, and Jensen–Shannon divergence (JSD), were proposed by Iwasaki et al. [55] as appropriate to describe similarity between diffraction patterns but were originally defined and used in other disciplines. The L1-norm and L2-norm measure the distance between vectors, making the results strongly dependent on the magnitude of the vectors, which in this application translates to the absolute intensity of the diffraction patterns. The cosine metric only measures the angle between two vectors, giving a result that is independent of any arbitrary scaling of the vector magnitudes. The JSD, on the other hand, has been defined as a similarity metric between probability density functions (PDFs), so the two diffraction patterns that are inputs to the JSD equation need to be normalized accordingly, such that the integrated intensity over the whole detector area is equal to 1. If we abide by this normalization, JSD is a bounded metric, with the supremum ($\ln 2$ for the natural logarithm) indicating maximum diversity.
For ptychography, the absolute value of the signal is an important parameter for successful reconstructions, as high pixel values imply a better signal-to-noise ratio (SNR), although this aspect is not specifically relevant for diversity. However, the reconstruction quality is certainly affected if the diffraction pattern series contains many low-signal diffraction patterns that mathematically give high diversity but in practice do not contain any significant information due to their low SNR. Therefore, it is relevant to assess the effect of including diffraction signal strengths in the diversity metrics on the achieved reconstruction quality.
Figure 6 in the main text shows L1-norm, L2-norm, and cosine results for normalization of the diffraction patterns such that the maximum pixel value over the whole series of diffraction patterns is equal to 1. This approach ensures that relative intensity variations among the diffraction patterns are included in the diversity metrics. For JSD, a different choice was made, and each diffraction pattern has been normalized individually such that $\sum_q I_j(q) = 1$, in order to be consistent with the definition of entropy. In addition to this choice of normalization, we can consider alternative normalization methods to calculate scanning and spectral diversity and investigate their effect on the different metrics and their correlation to the ptychographic image reconstruction results. The normalization procedures that we considered can be listed as follows (see also the sketch below):
1. Global normalization: all diffraction patterns in the series are divided by the maximum pixel value over the full data set.
2. Local normalization: each diffraction pattern is divided by its own maximum pixel value.
3. Local normalization including spectral weights: local normalization combined with the relative spectral weight of each wavelength (relevant for spectral diversity only).
4. Local normalization on total flux: each diffraction pattern is divided by its total photon count, such that its intensity sums to 1.
In all cases, the comparison is performed between diffraction patterns of size $N\times N$ pixels recorded at $J$ scan positions.
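A compact sketch of these four options is given below; the mode names are ours, and the treatment of the spectral weights (a per-pattern multiplicative weight after local normalization) is our reading of strategy 3.

```python
import numpy as np

def normalize_patterns(patterns, mode="global", spectral_weights=None):
    """Normalize a stack of diffraction patterns of shape (J, N, N) before computing diversity.

    mode:
      "global"         : divide all patterns by the maximum pixel value of the whole data set
      "local"          : divide each pattern by its own maximum pixel value
      "local_spectral" : local normalization, rescaled by a spectral weight per pattern
      "local_flux"     : divide each pattern by its total photon count (patterns sum to 1, for JSD)
    """
    p = patterns.astype(float)
    if mode == "global":
        return p / p.max()
    if mode == "local":
        return p / p.max(axis=(1, 2), keepdims=True)
    if mode == "local_spectral":
        w = np.asarray(spectral_weights, dtype=float).reshape(-1, 1, 1)
        return w * p / p.max(axis=(1, 2), keepdims=True)
    if mode == "local_flux":
        return p / p.sum(axis=(1, 2), keepdims=True)
    raise ValueError(f"unknown normalization mode: {mode}")
```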
In Figs. 9 and 10, we show results of scanning and spectral diversity, respectively, for these different normalization strategies as applied to all four metrics. The results indicate that the L1-norm and L2-norm are sensitive to the normalization strategy, even to the point where their difference becomes insignificant when using local normalization [Fig. 9(b)]. We attribute this behavior to the situation as described above, where the local normalization results in an increased weight of low-intensity diffraction patterns on the diversity metrics. As such low-intensity patterns contain significant noise, overestimating their weights will lead to higher diversity estimates, as white noise in principle has high diversity between separate measurements. Therefore, when using the L1- and L2-norms as diversity metrics, global normalization is required to properly account for true signal variations across the ptychography scan. In contrast, the cosine metric is independent of the normalization, which is to be expected, as the angle between vectors is not affected by relative amplitude differences. This feature makes the metric more robust against the choice of normalization and therefore more flexible. However, its independence of magnitude may be a limitation, as it may become less clear how to identify the influence of low SNR in a ptychography data set. As long as data of sufficiently high SNR can be guaranteed, the cosine metric is a suitable way to assess diversity in a data set.
Figure 9.Scanning diversity metrics for different normalization strategies. (a) Global normalization, (b) local normalization, and (c) local normalization on total flux. The solid lines indicate the mean values of comparing adjacent scan positions for scanning diversity over the whole diffraction patterns series, while the shaded areas have a width of one standard deviation. Note the different horizontal and vertical scales.
Figure 10.Spectral diversity metrics for monochromatic diffraction patterns at 38.3 and 41.4 nm for different normalization strategies. (a) Global normalization, (b) local normalization, (c) local normalization including spectral weights, and (d) local normalization on total flux. The solid lines indicate the mean values of comparing wavelengths over the whole diffraction patterns series, while the shaded areas have a width of one standard deviation. Note the different horizontal and vertical scales.
The JSD is quite different from the other considered metrics, as it does not consider measurements as vectors but as probability distributions. Diversity is then quantified as the difference in information content instead of norms or projections of vectors. This concept seems naturally suited to assess diversity in measured diffraction “information” but does require a different treatment to allow such an interpretation. Interpreting a diffraction measurement as a probability distribution requires the total probability of all registered events to add up to 1. This corresponds to the approach of local normalization on flux, as used in Figs. 9(c) and 10(d). With this approach, the JSD is a bounded metric with a clear interpretation of diversity in terms of new information added by each next diffraction pattern, which is clearly attractive for experiment design and analysis. However, such a local normalization approach does have the risk of becoming too sensitive to noise when there are many low-SNR diffraction patterns in a data set, as was discussed above for the L1- and L2-norms. Therefore, one could argue that JSD with a global normalization approach has advantages, as it significantly reduces this noise sensitivity. Comparing the JSD results in Figs. 9 and 10, we find that the trends in JSD for our data sets are largely independent of the chosen normalization, although the variance is significantly reduced for global normalization. This does make it easier to assess trends in JSD, but global normalization removes the absolute upper bound and reduces the JSD to a relative metric.
[67] F. Zhang, A. Pelekanidis, and M. Du, "Nonlinear compression of mJ-level pulses via double-pass loose focusing in air," in European Conference on Lasers and Electro-Optics, cf_p_15 (2023).
Antonios Pelekanidis, Fengling Zhang, Matthias Gouder, Jacob Seifert, Mengqi Du, Kjeld S. E. Eikema, Stefan Witte, "Illumination diversity in multiwavelength extreme ultraviolet ptychography," Photonics Res. 12, 2757 (2024)