Spatial, Spectral, and Polarization Multiplexed Ptychography

We introduce a novel coherent diffraction imaging technique based on ptychography that enables simultaneous full-field imaging of multiple, spatially separated sample locations. This technique only requires that diffracted light from spatially separated sample sites be mutually incoherent at the detector, which can be achieved using multiple probes that are separated either by wavelength or by orthogonal polarization states. This approach enables spatially resolved polarization spectroscopy from a single ptychography scan, as well as allowing a larger field of view to be imaged without loss in spatial resolution. Further, we compare the numerical efficiency of the multi-mode ptychography algorithm with a single-mode algorithm.


Introduction
Coherent diffraction imaging (CDI) [1][2][3], or lensless imaging, is a microscopy technique that essentially replaces image-forming optics with an iterative phase-retrieval algorithm. In CDI, light diffracts through, or from the surface of, a sample and is subsequently detected in intensity, usually far from the scattering specimen. A computational routine then iteratively solves for the complex-valued wave exiting the sample (the exit surface wave) by satisfying two or more constraints that include the measured, far-field scatter pattern. Of the various forms of CDI [4][5][6], ptychography CDI [7,8] has proven to be particularly robust in most circumstances [9]. Ptychography gains its advantage by making use of redundant information from multiple diffraction patterns [10]. Because ptychography is essentially a blind deconvolution, it not only reconstructs the exit surface wave, but also the constituent input illumination and object transmission functions. Surprisingly, the redundant information in ptychography allows for more than just the sample and illuminating probe to be retrieved [11]. Ptychography is also capable of solving for errors in stage positions [12][13][14], missing diffraction data [15], and incoherence in the diffraction [16] due, for instance, to probes of different wavelengths [17], and it is especially suited to dealing with noisy data [18]. Recent work [17] used ptychography to decouple spectral responses from a single beam containing multiple wavelengths.
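To make the iterative reconstruction concrete, the sketch below shows a single update in the style of the widely used ePIE algorithm. This is a generic single-probe illustration, not the specific multi-mode algorithm of [17]; the array names and the step sizes alpha and beta are ours.

```python
import numpy as np

def epie_update(obj, probe, pattern, pos, alpha=1.0, beta=1.0):
    """One ePIE-style update at a single scan position.

    obj     : complex object estimate (2-D array)
    probe   : complex probe estimate (smaller 2-D array)
    pattern : measured far-field intensity at this scan position
    pos     : (row, col) of the probe's top-left corner within the object
    """
    r, c = pos
    h, w = probe.shape
    obj_patch = obj[r:r+h, c:c+w]

    # Exit surface wave and its far-field (Fraunhofer) propagation
    exit_wave = obj_patch * probe
    far_field = np.fft.fft2(exit_wave)

    # Fourier constraint: keep the phase, replace the modulus with the measurement
    far_field = np.sqrt(pattern) * np.exp(1j * np.angle(far_field))
    new_exit = np.fft.ifft2(far_field)

    # Update object and probe from the change in the exit wave
    diff = new_exit - exit_wave
    obj[r:r+h, c:c+w] += alpha * np.conj(probe) / (np.abs(probe).max()**2) * diff
    probe += beta * np.conj(obj_patch) / (np.abs(obj_patch).max()**2) * diff
    return obj, probe
```

Iterating this update over many overlapping scan positions is what allows both the object and the probe to be recovered from intensity-only data.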
In this work, we demonstrate a new capability that expands the scope of ptychography CDI by simultaneously imaging spatially separated locations on a sample with multiple illuminating beams. This allows us to image a larger field of view without a loss in spatial resolution. Our technique simply requires that the spatially separated beams be mutually incoherent, or non-interfering, at the detector plane, which can be achieved using multiple probes that are separated either by wavelength or by orthogonal polarization states. Furthermore, since two orthogonal polarization states of the same wavelength can be isolated and imaged simultaneously, this enables simultaneous polarization-specific imaging even for a sample with weak or no polarization response. These advances have the potential to increase the speed of ptychographic imaging, particularly for magnetic systems in a static or dynamic mode. Finally, we note that the use of spatially separated probes improves and speeds up the convergence of ptychographic CDI.
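The mutual-incoherence requirement enters the reconstruction through the far-field modulus constraint: the detector records the sum of the mode intensities, so every mode is rescaled by a shared factor that makes the summed modeled intensity match the measurement. A minimal sketch of this shared constraint (our own illustrative formulation, not the exact update of [17]):

```python
import numpy as np

def incoherent_modulus_constraint(exit_waves, measured_intensity, eps=1e-12):
    """Apply the far-field modulus constraint for mutually incoherent modes.

    exit_waves         : list of complex exit waves, one per probe mode
    measured_intensity : one measured pattern = sum of the mode intensities
    """
    fields = [np.fft.fft2(w) for w in exit_waves]
    # Modeled intensity is the *incoherent* sum over modes (no cross terms)
    total = sum(np.abs(f)**2 for f in fields)
    # Rescale every mode with a shared factor so the sum matches the data
    scale = np.sqrt(measured_intensity / (total + eps))
    return [np.fft.ifft2(scale * f) for f in fields]
```

Because the modes add in intensity rather than in field, no interference terms appear, which is exactly why non-interfering probes (different wavelengths or orthogonal polarizations) can be deconvolved from a single pattern.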

Spectral separation
In this initial proof-of-principle demonstration, we built a transmission-mode, visible-light ptychography microscope operating with two spatially separated beams, as shown in Fig. 1. We combined two single-frequency, fiber-coupled laser diodes [Blue Sky Research, FTEC2 440-20, λ = 450 nm and FTEC2 658-60, λ = 656 nm] in a beam-splitting cube [Thorlabs, BS013], after which both beams propagated collinearly to a spatial filter [Newport, 900: with 5 μm pinhole 900PH-5 and M-10X objective]. The output of the spatial filter was collimated by an f = 5 cm lens and then cropped heavily by a 300 μm pinhole, which was imaged through a pair of diffraction gratings [Rainbow Symphony, 01604, 500 lines/mm] to the sample with a demagnification of M = 0.1. Using the measured separation and quoted groove density of the diffraction gratings, the beam separation was calculated to be approximately 100 μm, without altering the initial propagation direction of the beams. The additional diffraction orders were blocked by a pinhole placed after the diffraction gratings. The two beams were then incident on a negative 1951 USAF test pattern [Thorlabs, R3L3S1N], where they diffracted and subsequently propagated to the far field after passing through an f = 2 cm lens. Scatter patterns were collected at 100 positions, in a semi-random rectilinear ptychographic scan grid, using a Mightex SCN-B013-U CMOS detector. Diffraction patterns were measured with exposure times ranging from 0.05 ms to 750 ms and then combined to artificially extend the dynamic range of the detector from 48 dB to 100 dB. The composite diffraction pattern was shifted to the center of the numeric grid, halfway between the DC peaks of the two constituent diffraction patterns, which were separated due to chromatic aberration in the transform lens. Using the algorithm described in [17], an object (corresponding to a different region of the sample) was reconstructed for each probe wavelength, as shown in Fig. 2.
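The multi-exposure compositing step can be sketched as follows. The paper does not specify the exact merging procedure, so this shows one common high-dynamic-range approach; the saturation threshold and the equal weighting of valid exposures are assumptions.

```python
import numpy as np

def merge_exposures(frames, exposure_times, saturation=4095):
    """Merge diffraction patterns taken at different exposure times into one
    high-dynamic-range pattern.

    frames         : list of 2-D count arrays, one per exposure
    exposure_times : matching list of exposure times
    saturation     : count level above which a pixel is considered clipped
    """
    num = np.zeros_like(frames[0], dtype=float)
    den = np.zeros_like(frames[0], dtype=float)
    for frame, t in zip(frames, exposure_times):
        valid = frame < saturation               # ignore clipped pixels
        num += np.where(valid, frame / t, 0.0)   # convert to counts per unit time
        den += valid
    # Average the per-time count rates over all exposures where the pixel was valid
    return np.where(den > 0, num / np.maximum(den, 1), 0.0)
```

Long exposures supply the weak high-angle scatter while short exposures keep the bright DC region unclipped, which is what extends the effective dynamic range beyond that of a single frame.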
To verify the accuracy of our reconstruction, the two illuminating beams were directly imaged by translating the detector to the sample plane. The reconstructed objects correspond to areas on the sample that are separated by 175 ± 5 μm, which agrees with the separation of the two directly imaged beams (179 ± 5 μm). For further comparison, the reconstructed probes were also computationally propagated to the imaging plane of the lens in our setup, with the resulting beams shown in Fig. 3. The blue probe was propagated further in order to reach the imaging plane. The difference between the propagation distances for the two probes is 550 ± 100 μm, which agrees with the calculated difference based on chromatic aberration from the imaging lens (420 ± 4 μm).
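Computational propagation of a reconstructed probe between planes is commonly done with the angular spectrum method; a minimal sketch is below (the grid parameters in the usage are illustrative, not our experimental values).

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_size, distance):
    """Propagate a complex field by `distance` using the angular spectrum method.

    field      : complex 2-D field in the starting plane
    wavelength : same length units as pixel_size and distance
    """
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    # Longitudinal wavenumber for each spatial frequency; evanescent
    # components (arg <= 0) are suppressed rather than amplified
    arg = 1.0 - (wavelength * FX)**2 - (wavelength * FY)**2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    transfer = np.exp(1j * kz * distance) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * transfer)
```

Propagating each reconstructed probe until it comes to its sharpest focus gives the per-wavelength imaging distance, and the difference between those distances is what we compare against the lens's chromatic aberration.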

Polarization separation
Using the same setup described above, a single red beam was spatially split into two orthogonal polarization states using a beta barium borate (BBO) crystal [Eksma Optics, 9830; 2.7 mm thickness, θ = 23.4°, φ = 90°] before illuminating a negative USAF test sample, as shown in Fig. 4. By directly imaging the two beams at the sample plane, the separation between the two beams was measured to be 216 ± 10 μm (see Fig. 5). Without this spatial separation, this sample would produce identical diffraction patterns for both polarizations. The spatial separation allows the two polarization modes to be deconvolved without ambiguity.
A 64-position ptychography scan was taken with both beams incident on the sample, using an approach identical to that described in the previous section. The data were processed in the same manner as for the multi-wavelength case. The reconstructed object is shown in Fig. 6. The two reconstructed areas are groups 6 and 7 on the USAF test pattern. The separation between these two patterns is 170 ± 20 μm, which is consistent with the separation between the two beams (216 ± 10 μm).
In Fig. 6b, the smallest features are 2.46 μm in width. Our system has a theoretical, Abbe diffraction-limited resolution of 2.46 μm, which indicates that using multiple beams for ptychography has no negative impact on the achievable spatial resolution.
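For reference, the Abbe limit is d = λ/(2 NA). Working backwards from the quoted 2.46 μm limit at the red wavelength (656 nm) implies an effective NA of roughly 0.13; the NA value is our inference, since the paper quotes only the resolution limit.

```python
# Abbe diffraction limit: d = wavelength / (2 * NA).
# The NA below is inferred from the quoted 2.46 um limit at 656 nm,
# not a value stated in the paper.
def abbe_limit(wavelength_um, na):
    """Return the Abbe diffraction-limited resolution in micrometres."""
    return wavelength_um / (2.0 * na)

na_red = 0.656 / (2.0 * 2.46)               # NA that reproduces the quoted limit
print(round(na_red, 3))                      # effective numerical aperture
print(round(abbe_limit(0.656, na_red), 2))   # recovers the 2.46 um limit
```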

Numerical efficiency
It is also possible to image two areas by taking a separate, single-beam ptychography scan of each area. However, this approach has the disadvantage that the amount of data collected grows linearly with the number of areas to be scanned. In contrast, using M beams allows M areas to be scanned simultaneously, which reduces the amount of recorded data by a factor of M.
The duration of a single iteration of the multi-mode reconstruction algorithm scales linearly with the number of modes. Reconstructing multiple ptychography scans also scales linearly with the number of reconstructions that need to be performed.
To compare the convergence of multi-mode ptychography with repeated single-mode ptychography, we simulated ptychography data and reconstructed objects for multiple wavelengths using both algorithms. We allowed the multi-mode algorithm to run for N iterations, and the single-mode algorithm to run for N iterations for each wavelength. This compensates for the linear scaling of the iteration time with the number of modes.
The reconstructed objects were then compared with the object used to generate the data. Both the original object and the reconstructed object were masked so that we only compared the areas of the objects at which five or more scan positions overlapped. To compare the two objects, we used a root-mean-squared error metric, E = sqrt[(1/N) Σ_{x∈Ω} |O_r(x) − O_0(x)|²], where E is the RMS error, N is the number of pixels within the mask Ω, O_r is the reconstructed object, and O_0 is the original object. Calculating this error at each iteration, for both the multi-mode and single-mode algorithms, we see that the multi-mode algorithm requires more iterations to converge than the single-mode algorithm (see Fig. 7).
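The masked error metric can be computed directly; this sketch omits any global phase or scale normalization that a full complex-valued comparison may also require (a ptychographic reconstruction is only defined up to a global phase).

```python
import numpy as np

def masked_rms_error(recon, truth, mask):
    """RMS difference between the reconstructed and original objects,
    restricted to the mask of well-sampled pixels (e.g. pixels covered
    by five or more overlapping scan positions).

    recon, truth : complex 2-D arrays of the same shape
    mask         : boolean 2-D array selecting the pixels to compare
    """
    diff = recon[mask] - truth[mask]
    return np.sqrt(np.mean(np.abs(diff)**2))
```

Restricting the comparison to well-sampled pixels avoids penalizing the reconstruction in regions the scan never constrained.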
We then fit the error to a power law of the form E = a·N_i^b + c, where N_i is the number of iterations, and a, b, and c are fit parameters. For two modes, in this simulation, the multi-mode algorithm error scales as the number of iterations to the power b = −0.332 ± 0.006 (R² = 0.9766), while the single-mode ptychography error scales as the number of iterations to the power b = −0.572 ± 0.007 (R² = 0.9853).
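A power-law fit of this form can be reproduced with a simple least-squares procedure; the scheme below is our own (not necessarily the authors' fitting routine) and linearizes the problem in log space for each candidate offset c.

```python
import numpy as np

def fit_power_law(iterations, errors):
    """Fit E = a * N_i**b + c.

    For each candidate offset c on a grid, (a, b) follow from a linear fit
    of log(E - c) against log(N_i); the c with the smallest linear-space
    residual wins.
    """
    best = None
    for c in np.linspace(0.0, 0.9 * errors.min(), 50):
        y = np.log(errors - c)              # c < min(E) keeps the log defined
        x = np.log(iterations)
        b, log_a = np.polyfit(x, y, 1)      # slope = b, intercept = log(a)
        resid = np.sum((np.exp(log_a) * iterations**b + c - errors)**2)
        if best is None or resid < best[0]:
            best = (resid, np.exp(log_a), b, c)
    _, a, b, c = best
    return a, b, c
```

A more negative exponent b means faster convergence per iteration, which is how the single-mode and multi-mode algorithms are compared above.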
We have performed separate simulations comparing the convergence of spatially overlapping probes with that of spatially separated probes, using the multi-mode algorithm for both. In our simulations, the reconstruction converged faster for the case of spatially separated beams. We attribute this to the fact that distinct objects make the diffraction patterns of the different probes more dissimilar, allowing the probes to be separated more readily.
Comparing the two approaches, the total reconstruction time is longer for the multi-mode ptychography algorithm. However, the total data acquisition time required to image the same area is longer for the single-mode ptychography algorithm. Thus, multi-mode ptychography can take less time for the combined data acquisition and reconstruction. Additionally, the multi-mode ptychography algorithm allows multiple areas to be imaged simultaneously, which is simply not possible in the single-mode case. This unique capability could prove useful in systems with low stability or for non-repeatable dynamic measurements.

Conclusion
We have demonstrated that by spatially separating probes in a multiplexed ptychography scan, multiple areas of a sample can be imaged simultaneously. This method does not require a reduction in the numerical aperture of the system for either probe, and thus preserves the spatial resolution of all images. Further, this technique can be generalized to any non-interfering set of probe beams. We have explicitly demonstrated two cases: two probe beams of different wavelengths, and two orthogonally polarized beams of the same wavelength. The latter yields spatially resolved polarization spectroscopy from a single ptychography scan.
We expect this technique to find immediate application in the EUV and X-ray spectral ranges. It is especially well suited to high harmonic generation (HHG) sources [19][20][21][22][23][24][25][26], because the HHG process produces a discrete comb of harmonics that can be spatially separated with few additional optical components [27]. Additionally, recent advances in tabletop circularly polarized HHG sources [28][29][30] will allow for imaging of magnetic materials.
The limit on how many mutually incoherent probes can be reconstructed simultaneously requires further investigation, but it may be possible to use this technique to perform hyperspectral ptychography with a broadband source by spatially separating different components of the broadband radiation.

Fig. 1.
Fig. 1. Experimental schematic (not to scale). Spatially filtered blue and red beams are combined using a beamsplitter and then spatially separated using a pair of diffraction gratings. Other diffraction orders from the gratings are blocked by a pinhole. The two unblocked beams are then focused onto a 1951 USAF test pattern. The diffracted light is collected using a Mightex SCN-B013-U CMOS detector after passing through a 2 cm lens.

Fig. 2.
Fig. 2. Reconstructed objects for a) the red and b) the blue probe beams (scale bar is 20 μm wide). c) Imaged areas shown on the USAF test sample. The two areas are separated by 175 μm, which is the same as the separation between the probes. (b) uses the same scale as (a).

Figs. 3 and 4.
Fig. 3. Amplitude of the blue beam (a) and red beam (b) extracted from the ptychographic reconstruction. The sizes of the beams agree with independent measurements performed near the imaging plane. c) and d) Propagation of the blue and red beams along the propagation direction. The white dotted lines indicate the location of the reconstructed probes, while the white dashed lines indicate the imaging plane for the probes. e) and f) Probe beams at the imaging plane, propagated from the reconstructed probes. The distance between the imaging planes for the blue probe and the red probe is found to be consistent with the chromatic aberration of the imaging lens. The scale bar in (a) is 20 μm wide and is shared by (b), (e), and (f). The scale bar in (c) is 1 mm wide and is shared by (d).

Fig. 5.
Fig. 5. a) Direct CCD image of the probe beams in the plane of the sample. The beams are separated into two orthogonal polarization states. The separation between the beams is 233 μm. (b-c) Reconstructed amplitude and phase of the parallel polarization (ê∥). (d-e) Reconstructed amplitude and phase of the perpendicular polarization (ê⊥). The maximum value of the phase is 1.5 rad in (c) and 2.5 rad in (e). The scale bar is 20 μm wide, and (b-e) are on the same scale as (a).

Figs. 6 and 7.
Fig. 6. Reconstructed object for a) the parallel (ê∥) polarization and b) the perpendicular (ê⊥) polarization. These areas are groups 6 and 7 on the USAF test sample. c) Location of these areas on the USAF test sample. These two areas are separated by 170 μm, which corresponds to the shift between the two probes. The scale bar is 20 μm wide. (a) and (b) share the same scale bar.