
Creating analogs of thermal distributions from diabatic excitations in ion-trap-based quantum simulation


Published 18 April 2016 © 2016 IOP Publishing Ltd and Deutsche Physikalische Gesellschaft
Citation: M H Lim et al 2016 New J. Phys. 18 043026, DOI: 10.1088/1367-2630/18/4/043026


Abstract

One broad goal of quantum simulation is to start a simple quantum system in its ground state and slowly evolve the Hamiltonian to a complex one, maintaining the ground state throughout the evolution (called adiabatic state preparation). This provides a natural setting to create a highly entangled and correlated quantum state if the final Hamiltonian supports such a ground state. In ion-trap-based quantum simulations, coherence times are too short to allow for such ground-state evolution for large chains, because the rapid evolution of the system creates excitations to higher energy states. Because the probability for this excitation depends exponentially on the excitation energy, and because the thermal distribution also depends exponentially on the excitation energy, we investigate whether this so-called diabatic excitation can create the analog of a thermal distribution, as this could serve as an alternative for creating thermal states of complex quantum systems without requiring contact with a heat bath. In this work, we explore this relationship and determine situations where diabatic excitation can approximately create thermal states.


1. Introduction

Quantum simulation in ion traps has made significant progress over the past five years. Initial work examined the transverse-field Ising model with adiabatic state preparation [1–3] and more recently with excited state spectroscopy (since the experiments created significant diabatic excitations) [4–6]. The latest work has examined Lieb–Robinson bounds in Ising and XY systems [7, 8] and higher-spin variants [9]. In general, the coherence time of the experiments is too short to allow for adiabatic state preparation (which becomes worse for frustrated spin systems, and for larger numbers of spins, because the small excitation gaps imply more excitation for the same experimental run time), so the systems generically have significant diabatic excitation. In this work, we examine how close the eigenstate populations from a diabatic excitation are to those of a thermal distribution and we compare common experimental measurements (like the spin structure factor or the Binder cumulant) between these two cases to determine how close diabatic expectation values are to thermal expectation values for typical operators.

In the controlled quantum environment of an isolated quantum system, one would like to be able to examine situations that represent equilibrium physical behavior. Namely, we would like to be able to create thermal distributions of a complex quantum system from some simple initial states of well-understood quantum systems. This approach goes a step beyond conventional adiabatic state preparation because its goal is not just to create the ground state, but to create the appropriate mixture of quantum states that corresponds to a thermal distribution without having to place the system into contact with a thermal reservoir. If this goal can be achieved, then one would have a highly tunable quantum simulator that can solve many equilibrium quantum problems and can determine critical phenomena like scaling exponents and other universal properties of phase transitions in the presence of large quantum effects. At the moment, we do not have such flexibility in conventional simulations, but because the energy gap determines the excited state populations both in the diabatic case and in the thermal case, it seems possible that these two cases may have situations where the diabatic excitations do appear to be thermal. We investigate this possibility here.

Note that one might have thought that creating a thermal state is easy—just attach the system to a thermal reservoir at a given temperature and you will create a thermal state. But in the area of quantum computing this is not a simple thing to do. Most realizations of quantum simulators are isolated from external perturbations. It is this isolation that allows them to have long coherence times, which are required to simulate quantum systems, but at the same time, it prevents them from being easily attached to a thermal reservoir that can create a thermal distribution at a given temperature. This is generally viewed as one of the benefits of quantum simulators, in that one can use techniques like adiabatic state preparation to create complex quantum ground states, but the adiabatic criterion—where the system is changed slowly enough that it remains in the ground state—often is not feasible before the system loses its quantum coherence. Hence, we pose a related question here: if ground-state preparation is not possible, can we create analogs of thermal states when we evolve the system so rapidly that it creates excitations to higher energy states? We are careful to say this is an analog of a thermal state, because the system remains in a pure quantum state, and so, when we evaluate expectation values of different observables, the system will include contributions from off-diagonal matrix elements (in the eigenstate basis), which never enter for a thermal distribution. But if the contribution from those off-diagonal elements is small, then it is possible that the system approximates a thermal state well. This is the fundamental question we examine in this work.

This idea of creating thermal states in quantum computers has been around for some time. Initial proposals created coherent states with the correct populations (called coherent encoding of a thermal state or CETS) and then decohered the system employing ancilla qubits to create the thermal state, which has no off-diagonal elements when matrix expectation values are taken [10]. Recently, NMR-based quantum simulators implemented a simpler version of this technique with three spins [11], but it is clear that this approach is quite complex to carry out for large systems, and the alternative we propose here will be much simpler if it is successful.

The system that we work with is a transverse-field Ising model with static Ising exchange parameters and a time-dependent transverse field. The Ising exchange parameters are long-range and approximately decay with a power law in the inter-ion distance that is tunable between the uniform case ($\alpha =0$) and the dipole–dipole case ($\alpha =3$). They are calculated exactly following the results in [12], and they approximately satisfy

$$J_{ij} \approx \frac{J_0}{\left(|R_i - R_j|/a\right)^{\alpha}} \qquad (1)$$

with J0 an overall scale for the exchange interactions, Ri the position of the $i{\rm{th}}$ ion and a the average nearest-neighbor distance between ions in the chain (hence the denominator is nearly $| i-j{| }^{\alpha }$). The equilibrium positions of the ions in a harmonic trap are not uniformly spaced, but they are nearly so for the chains we consider in this work. All Jij are positive for the ferromagnetic case and negative for the antiferromagnetic case. The Hamiltonian becomes

$$\mathcal{H}(t) = -\sum_{i<j} J_{ij}\,\sigma_z^{(i)}\sigma_z^{(j)} - B_x(t)\sum_{i=1}^{N}\sigma_x^{(i)} \qquad (2)$$

Here, ${\sigma }_{\gamma }^{(i)}$ is the Pauli spin matrix (with eigenvalues ±1 and with $\gamma =x$, y, or z denoting the spatial direction of the Pauli matrix) at lattice site i, Bx(t) is the time-dependent transverse field, and N is the number of spins in the lattice; we work in units where the spins are dimensionless and we simulate the transverse-field Ising model in a linear Paul trap. This model is generated in the ion trap by applying an optical spin-dependent force (which employs two optical beams with slightly different frequencies), and integrating out the effects of the phonons (assuming they are only virtually occupied) [12]. In general, the spin exchange parameters are time-dependent, but in experiments with the detuning of the difference of the two optical beams to the blue of the transverse center-of-mass mode—where all of the exchange coefficients have the same sign—the system is approximated well by the static spin exchange parameters [13], even if phonons are excited during the drive from the laser.
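To make the model concrete, the following is a minimal Python sketch (our own construction, not code from the original experiments) that builds the dense matrix of the Hamiltonian in equation (2) for a small chain, using the approximate power-law couplings of equation (1) with uniform ion spacing assumed; the exact couplings of equation (3) below can be substituted for the power-law form.

```python
import numpy as np

def ising_hamiltonian(N, J0, Bx, alpha):
    """Dense matrix of H = -sum_{i<j} J_ij s_z^i s_z^j - Bx sum_i s_x^i,
    with the approximate couplings J_ij = J0 / |i - j|^alpha of equation (1)
    (uniform spacing assumed; J0 > 0 ferromagnetic, J0 < 0 antiferromagnetic)."""
    sx = np.array([[0.0, 1.0], [1.0, 0.0]])
    sz = np.array([[1.0, 0.0], [0.0, -1.0]])
    iden = np.eye(2)

    def site_op(op, i):
        # place `op` at site i and the identity at every other site
        mats = [op if k == i else iden for k in range(N)]
        full = mats[0]
        for m in mats[1:]:
            full = np.kron(full, m)
        return full

    H = np.zeros((2**N, 2**N))
    for i in range(N):
        H -= Bx * site_op(sx, i)                  # transverse-field term
        for j in range(i + 1, N):
            H -= (J0 / abs(i - j)**alpha) * site_op(sz, i) @ site_op(sz, j)
    return H
```

For the chain sizes considered here (N up to 12), the matrix is at most 4096 by 4096, so the final Hamiltonian can be fully diagonalized when extracting eigenstate populations.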

2. Methods

The explicit formula for the static spin exchange coefficients is [12]

$$J_{ij} = \Omega^2\,\nu_{\rm R}\sum_{m=1}^{N}\frac{b_{im}\,b_{jm}}{\mu^2 - \omega_m^2} \qquad (3)$$

and we use the experimental parameters from [3] where ${\rm{\Omega }}=600\;{\rm{kHz}}$ is the Rabi frequency, ${\nu }_{{\rm{R}}}=h/(M{\lambda }^{2})=18.5\;{\rm{kHz}}$ is the recoil energy of a ${}^{171}{{\rm{Yb}}}^{+}$ ion (with h Planck's constant, M the mass of the ion, and $\lambda =355\;{\rm{nm}}$ the wavelength of the laser light), $b_{im}$ is the value of the orthonormal eigenvector at the $i{\rm{th}}$ ion site of the $m{\rm{th}}$ transverse normal mode for the N-ion chain, ${\omega }_{m}$ is the corresponding normal mode frequency, and $\mu ={\omega }_{{\rm{COM}}}+3\eta {\rm{\Omega }}=1.0233{\omega }_{{\rm{COM}}}$ is the detuning from the transverse center-of-mass mode (with $\eta =\sqrt{{\nu }_{{\rm{R}}}/{\omega }_{{\rm{COM}}}}=0.0621$ the Lamb–Dicke parameter). We work in conventional (cyclic) frequency units throughout. The range of the spin exchange coefficients is adjusted by changing the anisotropy between the transverse and the axial traps, keeping the transverse trap fixed. The axial center-of-mass mode then has a frequency running from 620 kHz to 950 kHz, corresponding to a nearest-neighbor exchange interaction near 1 kHz (${J}_{0}\approx 1$ kHz) and a power-law exponent in the range $0.7\lt \alpha \lt 1.2$.
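Given the transverse normal-mode eigenvectors $b_{im}$ and frequencies ${\omega }_{m}$ (obtained from the ion equilibrium positions, which we do not reproduce here), equation (3) reduces to a short numerical sum. The sketch below is ours and assumes all frequencies are supplied in the same cyclic units (e.g. kHz); the default detuning follows the $\mu =1.0233{\omega }_{{\rm{COM}}}$ quoted above.

```python
import numpy as np

def spin_exchange(b, omega, Omega=600.0, nu_R=18.5, mu=None):
    """Static spin-exchange matrix of equation (3),
        J_ij = Omega^2 nu_R sum_m b_im b_jm / (mu^2 - omega_m^2),
    with b[:, m] the orthonormal eigenvector of transverse mode m and omega[m]
    its frequency (all frequencies in the same cyclic units, e.g. kHz)."""
    b = np.asarray(b, dtype=float)
    omega = np.asarray(omega, dtype=float)
    omega_com = omega.max()               # transverse centre-of-mass mode
    if mu is None:
        mu = 1.0233 * omega_com           # detuning to the blue of the COM mode
    J = Omega**2 * nu_R * np.einsum('im,jm,m->ij', b, b, 1.0 / (mu**2 - omega**2))
    np.fill_diagonal(J, 0.0)              # diagonal terms are irrelevant constants
    return J
```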

The protocol we follow for the adiabatic state preparation (with diabatic excitations) is as follows: (i) initialize the system with all spins in the x-direction and with the field large and positive (${B}_{x}\gg {J}_{0}$), so that it starts in the ground state, and (ii) reduce the magnetic field exponentially, ${B}_{x}(t)={B}_{0}\mathrm{exp}(-t/\tau )$, with a time constant τ satisfying ${J}_{0}\tau \approx 1/2$ and a total evolution time on the order of $6\tau $; the initial magnetic field satisfies ${B}_{0}=5{J}_{0}$. With ${J}_{0}\approx 1\;{\rm{kHz}}$, we have $\tau \approx 0.5$ ms and a total running time of 3 ms, similar to recent experimental run times. We also choose the trap asymmetry so that the power law for the decay of the spin exchange satisfies $0.5\leqslant \alpha \leqslant 2$. We work with chains ranging in size from N = 6 to N = 12.

The time evolution of the system is calculated by evolving the wavefunction forward in time. We employ the Crank–Nicolson [14] algorithm to do this, with a small enough step size in the time discretization that unitarity of the time evolution is preserved throughout the simulation. This is checked explicitly by reducing the step size until results do not change within the precision of the calculation.
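A minimal sketch of one possible implementation of this propagation (variable names and the midpoint evaluation of the field are our choices) combines the exponential ramp of the protocol above with the Crank–Nicolson update; `build_h` stands for any routine that returns the Hamiltonian matrix for a given field value, such as the sketch after equation (2).

```python
import numpy as np

def crank_nicolson_evolve(psi0, build_h, B0, tau, t_final, dt):
    """Propagate psi with the unitary Crank-Nicolson step
        (1 + i dt H/2) psi(t + dt) = (1 - i dt H/2) psi(t),
    with H evaluated at the midpoint field B_x = B0 exp(-(t + dt/2)/tau).
    Time and 1/frequency units must match (a 2*pi factor is needed if the
    couplings are quoted as cyclic frequencies)."""
    psi = np.asarray(psi0, dtype=complex).copy()
    iden = np.eye(psi.size)
    n_steps = int(round(t_final / dt))
    for n in range(n_steps):
        Bx = B0 * np.exp(-((n + 0.5) * dt) / tau)
        H = build_h(Bx)
        psi = np.linalg.solve(iden + 0.5j * dt * H, (iden - 0.5j * dt * H) @ psi)
    # the norm should stay at 1 to the working precision; if it drifts, reduce dt
    return psi
```

Halving dt until the observables stop changing reproduces the convergence check described above.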

We will be comparing the diabatic evolution of the ground-state wavefunction to the mixed state of a thermal distribution. In order to do this, we need to identify a strategy for determining the effective temperature of the thermal distribution for this comparison. Of course, if the system evolved fully into a thermal distribution, then all of the different techniques we use to identify the effective temperature would agree. But because the evolution is not exact, these different strategies can yield different results. We summarize these strategies next.

The first thing we have to realize is that one difference between the diabatic evolution and a thermal distribution is that the diabatic evolution can only populate quantum states that have the same symmetry as the initial ground state. For the transverse-field Ising model with long-range couplings, there are two symmetries that arise. The first is a spatial reflection symmetry, which can be expressed as ${J}_{ij}={J}_{N+1-i,N+1-j}$ if we number the lattice sites in the chain from left to right in increasing order. This symmetry arises from the fact that the ion positions in the trap have a reflection symmetry about the origin in the axial direction, and so do the normal modes (due to the even symmetry of the trapping potentials). The second symmetry is a spin-reflection parity, where we perform the unitary transformation ${\sigma }_{x}\to {\tilde{\sigma }}_{x}$, ${\sigma }_{y}\to -{\tilde{\sigma }}_{y}$ and ${\sigma }_{z}\to -{\tilde{\sigma }}_{z}$, which leaves the Hamiltonian and the spin–spin commutation relations invariant. Both of these reflection symmetries produce eigenvalues with respect to the parity of the reflection, which can be even or odd for the spatial and for the spin symmetry separately. Hence, the diabatic evolution will only populate states in the initial symmetry sector of the ground state; the other symmetry sectors are unaffected by the diabatic evolution and remain unpopulated. Note that this last statement can be relaxed if phonons are actually created during the time evolution, rather than just being virtually created. This is because the spin-reflection parity is not a symmetry of the laser–ion interaction Hamiltonian, but only of the effective spin Hamiltonian.
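For completeness, here is one way (our own construction) to build the two parity operators in the z product basis and use them to label eigenstates; the spatial reflection is a bit-reversal permutation of the product states, and the spin reflection is the product of ${\sigma }_{x}$ over all sites, which sends ${\sigma }_{y}$ and ${\sigma }_{z}$ to their negatives while leaving ${\sigma }_{x}$ unchanged.

```python
import numpy as np
from functools import reduce

def spatial_parity(N):
    """Permutation matrix reflecting site i -> N-1-i (bit reversal) in the
    2**N-dimensional z product basis."""
    dim = 2**N
    P = np.zeros((dim, dim))
    for s in range(dim):
        bits = [(s >> k) & 1 for k in range(N)]
        reflected = sum(bit << (N - 1 - k) for k, bit in enumerate(bits))
        P[reflected, s] = 1.0
    return P

def spin_parity(N):
    """Product of sigma_x over all sites; commutes with the spin Hamiltonian."""
    sx = np.array([[0.0, 1.0], [1.0, 0.0]])
    return reduce(np.kron, [sx] * N)

# Both operators square to the identity, so eigenstates carry parity labels +/-1,
# e.g. parity = psi @ spin_parity(N) @ psi for a real, normalized eigenvector psi.
```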

We employ three different strategies to extract an effective temperature for the thermal distribution that we will use to compare to the diabatic distribution. The first one is to find the effective temperature of the thermal distribution that has the same average energy as that of the time-evolved state. If we define the (orthonormal) eigenstates of the system to satisfy ${\boldsymbol{ \mathcal H }}({t}_{{\rm{f}}})| n\rangle ={E}_{n}| n\rangle $ at the end of the experiment, then the partition function becomes ${\boldsymbol{ \mathcal Z }}={\sum }_{n}\mathrm{exp}(-\beta {E}_{n})$, with $\beta =1/T$ the inverse temperature (setting kB = 1) and tf the final time for the evolution of the system. Then, if we denote the final time-evolved wavefunction by $| \psi ({t}_{{\rm{f}}})\rangle $, the effective temperature for the average energy fit solves

$$\langle\psi(t_{\rm f})|\,\mathcal{H}(t_{\rm f})\,|\psi(t_{\rm f})\rangle = \frac{1}{\mathcal{Z}}\sum_n E_n\,\mathrm{e}^{-\beta E_n} \qquad (4)$$

where the sum goes over all of the eigenstates of the final Hamiltonian (including all symmetry sectors). The only adjustable parameter is β, which is adjusted to solve the above equation. This method is called the thermal average fit for the effective temperature. The second method we use is to find the effective temperature that has the same thermal fluctuations of the energy about the mean. This relation becomes

$$\langle\psi(t_{\rm f})|\,\mathcal{H}^2(t_{\rm f})\,|\psi(t_{\rm f})\rangle - \langle\psi(t_{\rm f})|\,\mathcal{H}(t_{\rm f})\,|\psi(t_{\rm f})\rangle^2 = \frac{1}{\mathcal{Z}}\sum_n E_n^2\,\mathrm{e}^{-\beta E_n} - \left[\frac{1}{\mathcal{Z}}\sum_n E_n\,\mathrm{e}^{-\beta E_n}\right]^2 \qquad (5)$$

which also has just one parameter to adjust for the fit—β. This method is called the thermal fluctuation fit for the effective temperature. The third strategy is to relate the ratio of the probability to be in the first excited state over the probability to be in the ground state to the Boltzmann formula for that ratio in terms of the excitation energy. Namely we use

$$\frac{P_1}{P_0} = \exp\left[-\beta\,(E_1 - E_0)\right] \qquad (6)$$

or

$$T_{\rm eff} = \frac{E_1 - E_0}{\ln(P_0/P_1)} \qquad (7)$$

where ${P}_{n}=| \langle n| \psi ({t}_{{\rm{f}}})\rangle {| }^{2}$. This last result is only meaningful when the probability to be in the ground state is larger than the probability to be in the first excited state; otherwise it produces a negative effective temperature, which does not correspond to a thermally stable state. This method is called the thermal ratio fit for the effective temperature.
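All three fits reduce to simple one-dimensional problems once the final eigenenergies E_n and the populations P_n are in hand. The sketch below is ours; the root-finding brackets for β are assumptions that work for spectra on the kHz scale and may need adjusting.

```python
import numpy as np
from scipy.optimize import brentq

def boltzmann_moments(E, beta):
    """<E> and <E^2> of the Boltzmann distribution at inverse temperature beta
    (E is a NumPy array of eigenenergies, k_B = 1)."""
    w = np.exp(-beta * (E - E.min()))     # shift for numerical stability
    w /= w.sum()
    return np.dot(w, E), np.dot(w, E**2)

def temperature_average_fit(E, P, b_lo=1e-8, b_hi=1e3):
    """Thermal average fit, equation (4): match <H>_thermal to sum_n P_n E_n."""
    target = np.dot(P, E)
    return 1.0 / brentq(lambda b: boltzmann_moments(E, b)[0] - target, b_lo, b_hi)

def temperature_fluctuation_fit(E, P, b_lo=1e-8, b_hi=1e3):
    """Thermal fluctuation fit, equation (5): match the variance of the energy."""
    target = np.dot(P, E**2) - np.dot(P, E)**2
    def diff(b):
        m1, m2 = boltzmann_moments(E, b)
        return (m2 - m1**2) - target
    return 1.0 / brentq(diff, b_lo, b_hi)

def temperature_ratio_fit(E, P):
    """Thermal ratio fit, equations (6) and (7); requires P[0] > P[1]."""
    return (E[1] - E[0]) / np.log(P[0] / P[1])
```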

3. Results

We examined the diabatic evolution of the transverse-field Ising model, employing parameters similar to those used in experiment, for a range of different systems including N = 6, 8, 10, and 12, α ranging from 0.5 to 2 in steps of 0.25, and for the ferromagnetic ${J}_{0}\gt 0$ and antiferromagnetic ${J}_{0}\lt 0$ cases. In general, we found that the effective temperature was determined best by the thermal average fit. The fit from the thermal ratio often would yield negative effective temperatures for the antiferromagnetic case, while the fit for the thermal fluctuations tended to produce too high an effective temperature to properly fit the lower excited states. In cases where all three fits are close to one another (which occurred more often for the ferromagnetic case), we can often infer that the system is nearly in the analog of a thermal state.

Figure 1 shows the comparison of the probability distribution for the energy eigenstates at time tf for the diabatically evolved state to those of a thermal distribution with an effective temperature fit; we have examined dozens of cases and illustrate typical results for N = 10 and $\alpha =1$ in the figure. The ferromagnetic case matches the thermal distribution well, while the antiferromagnetic case does not. Note that the antiferromagnetic case usually has a ground-state probability lower than the first excited-state probability, except in cases of large α, when the Ising coupling becomes closer to a nearest-neighbor interaction.


Figure 1. Probability distribution of energy eigenstates in the same symmetry sector as the ground state for $N=10,\alpha =1$, when the system is diabatically evolved for $6\tau ={t}_{{\rm{f}}}=3.0$ ms. Panel (a) on the left is for the ferromagnetic case, while panel (b) on the right is for the antiferromagnetic case. The effective temperature fit using the average energy usually yields the best fit compared to the other choices. These fits are displayed by the lines which are evaluated at the corresponding eigenenergy values. The symbols plot the probabilities of those eigenenergies in the diabatic wavefunction, with the blue lines as a guide to the eye.


One interesting feature arises in the antiferromagnetic case. The data for the probabilities appear to fit distributions with two distinct effective temperatures, one linked to the ground-state probability and one to the first excited-state probability. This behavior can be clearly seen in figure 1(b). While we do not have any conclusive explanation for why this behavior occurs, we did notice that the states in each of the distributions tend to be characterized by an expansion in terms of either two or four product states in the position-space z basis when one simultaneously diagonalizes the energy, spatial parity, and spin parity at ${B}_{x}\to {0}^{+}$. If we apply the parity symmetry operations (spatial and spin) to a given product state, it either maps back onto itself or onto another product state, so if we group the z-basis product states according to their spatial and spin parity eigenvalues, the resulting symmetrized states contain two or four terms in the expansion; they also become eigenstates when Bx = 0.

One must note, however, that having the correct distributions for the populations of the eigenstates does not mean that the time-evolved diabatic state is a thermal state. It is not. But if it has the correct distributions of the populations for the low-lying eigenstates, and if it gives similar results for other expectation values of interest for the system, then it is an excellent analog of a thermal state. Hence, we must also examine some common expectation values. This allows additional perspective for comparing the expectation values of a pure time-evolved quantum state to the corresponding thermal expectation value (which is evaluated via a trace over all states weighted by the density matrix). One common operator is the Binder cumulant [15], which is defined by

$$g_{\rm s} = \frac{\langle m_{\rm s}^4\rangle}{\langle m_{\rm s}^2\rangle^2} \qquad (8)$$

Here, the operator ${m}_{{\rm{s}}}$ stands for the uniform magnetization operator for the ferromagnetic case and the staggered magnetization operator for the antiferromagnetic case: ${m}_{{\rm{s}}}=\tfrac{1}{N}{\sum }_{i=1}^{N}{(\pm 1)}^{i}{\sigma }_{z}^{(i)}$. We further scale the calculated Binder cumulants to $\bar{{g}_{{\rm{s}}}}=({g}_{{\rm{s}}}^{0}-{g}_{{\rm{s}}})/({g}_{{\rm{s}}}^{0}-1)$ with ${g}_{{\rm{s}}}^{0}=3-2/N$ to remove finite-size effects and to have the Binder cumulant vary from 0 in the least-ordered state to 1 in the most-ordered state. Figure 2 shows these results.
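Because ${m}_{{\rm{s}}}$ is diagonal in the z product basis, both the pure-state and the thermal expectation values of its powers involve only the diagonal weights of the corresponding density matrix. A short sketch (ours; the array conventions are assumptions):

```python
import numpy as np

def binder_cumulant(N, weights, states, staggered=False):
    """Scaled Binder cumulant g_bar = (g0 - g)/(g0 - 1), with
    g = <m_s^4>/<m_s^2>^2 (equation (8)) and g0 = 3 - 2/N.
    `states` is a (2**N, K) array of state vectors in the z product basis and
    `weights` (length K) their probabilities: one column with weight 1 for the
    diabatic state, or all eigenvectors with Boltzmann weights for the thermal
    case. Because m_s is diagonal in this basis, only |amplitude|^2 enters."""
    dim = 2**N
    bits = (np.arange(dim)[:, None] >> np.arange(N)) & 1        # bit i = site i
    signs = np.where(np.arange(N) % 2 == 1, -1, 1) if staggered else np.ones(N)
    m = ((1 - 2 * bits) * signs).sum(axis=1) / N                # m_s eigenvalues
    prob = (np.abs(states)**2 * np.asarray(weights)).sum(axis=1)
    g = np.dot(prob, m**4) / np.dot(prob, m**2)**2
    g0 = 3.0 - 2.0 / N
    return (g0 - g) / (g0 - 1.0)
```

For the diabatic case, `states` is the single column `psi.reshape(-1, 1)` with weight 1; for the thermal comparison it is the full eigenvector matrix with the Boltzmann weights at the fitted effective temperature.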


Figure 2. Binder cumulant for the diabatic evolution (solid line) compared to a thermal distribution with an effective temperature given by the thermal average fit (dashed line) for N = 10 and $\alpha =1$ as the simulation time tf is increased from 1 ms to 5 ms (in all cases $\tau ={t}_{{\rm{f}}}/6$). The diabatic results are given by solid lines and the thermal fits by dashed lines. The FM case (red) uses the scale on the left, while the AFM case (blue) uses the scale on the right.


The ferromagnet rapidly orders as we decrease the ramping speed and lengthen the simulation time. The ferromagnetic thermal fit produces a similar Binder cumulant, indicating that the diabatic Binder cumulant can be described with a corresponding thermal Binder cumulant. On the other hand, the antiferromagnet does not become strongly ordered as we lengthen the simulation time. Moreover, the corresponding thermal fit for the antiferromagnet has a Binder cumulant that is less than zero for small values of $6\tau $ and is larger than the diabatic result when the simulation is done for longer times. As we increase the number of ions N and decrease the power law exponent α, the Binder cumulant becomes smaller and takes a longer time to achieve order. Nevertheless, the trends remain the same. Once again, we find that the antiferromagnetic case does not approximate a thermal distribution well.

Another operator that we calculated is the magnetic structure factor. The magnetic structure factor is the Fourier transform of the static spin–spin correlation function ${C}_{i,j}=\langle {\sigma }_{z}^{(i)}{\sigma }_{z}^{(j)}\rangle -\langle {\sigma }_{z}^{(i)}\rangle \langle {\sigma }_{z}^{(j)}\rangle $, which measures the correlation between two spins at sites i and j. The formula for the structure factor becomes

$$S(k) = \sum_{r=1}^{N-1} C(r)\,\mathrm{e}^{\mathrm{i}kr} \qquad (9)$$

Here, $C(r)=\tfrac{1}{N-r}{\sum }_{m=1}^{N-r}{C}_{m,m+r}$ is the average correlation between spins separated by r sites, and k is the wave number ($-\pi \leqslant k\leqslant \pi $). The larger the structure factor is, the more ordered the spin system is for a spin distortion that is modulated by the wavevector k. Figure 3 plots the structure factor for two different exponents $\alpha =0.76$ and 1.
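A sketch of this calculation (ours; we take the real part of the Fourier sum, and the overall normalization convention of S(k) should be treated as an assumption):

```python
import numpy as np

def structure_factor(corr, k_values):
    """S(k) = Re sum_{r=1}^{N-1} C(r) exp(i k r), with
    C(r) = (1/(N-r)) sum_m corr[m, m+r] the distance-averaged connected
    correlation built from corr[i, j] = <s_z^i s_z^j> - <s_z^i><s_z^j>."""
    N = corr.shape[0]
    C = np.array([np.mean(np.diagonal(corr, offset=r)) for r in range(1, N)])
    r = np.arange(1, N)
    return np.array([np.sum(C * np.cos(k * r)) for k in np.atleast_1d(k_values)])
```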


Figure 3. Structure factor for the diabatically evolved state (solid line) compared to the thermal distribution (dashed line) for N = 10 and $6\tau =3$ ms.


For the ferromagnetic case, the structure factor shows a reasonable agreement between the diabatically evolved state and the thermal distribution. As α is increased—corresponding to shorter-ranged spin–spin couplings—the structure factor for the time-evolved system becomes closer to the thermal distribution result. Of course, these results are peaked near k = 0, since that is where the ferromagnetic order is the strongest. The opposite holds for the antiferromagnetic case. Increasing α (reducing the range of the interaction) produced worse agreement between the time-evolved and the thermal states. Overall, the ordering is reduced (and is now peaked around the $k=\pi $ value).

The examination of these two typical experimental observables indicates that the extra coherence effects in the diabatically evolved wavefunction (or equivalently, the off-diagonal matrix elements in the expectation values when expanded in an eigenbasis) do not strongly affect the expectation values of typical observables for the ferromagnetic case, but do for the antiferromagnetic case. While not an exhaustive proof, this result is critical to have if one wants to employ the diabatically evolved state as an analog of a thermal one. Note, however, that if we were interested in expectation values of operators that strongly couple states with different symmetries, or that have very different results for one symmetry sector versus another, then the diabatic expectation value would deviate significantly from the thermal expectation value; such operators do not seem to be common ones that are used to characterize the system, though.

Finally, we examine an analog of the specific heat. The traditional definition of the specific heat holds only in equilibrium, because the definition involves the temperature of the system and is given by

$$C_v = \frac{\langle E^2\rangle - \langle E\rangle^2}{T^2} = \frac{\langle(\Delta E)^2\rangle}{T^2} \qquad (10)$$

Here, we generalize the definition for the diabatic case, by employing the effective temperature that is fit with the thermal average fit via ${C}_{v}^{{\rm{dia}}}=\langle {({\rm{\Delta }}E)}^{2}{\rangle }_{{\rm{dia}}}/{T}_{{\rm{eff}}}^{2}$. These results are plotted in figure 4 for N = 6, 8, 10, and 12.
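The corresponding calculation (a sketch under our conventions, with k_B = 1) uses the energy variance of the time-evolved state together with the fitted effective temperature, and the equilibrium expression of equation (10) for comparison:

```python
import numpy as np

def specific_heat_diabatic(E, P, T_eff):
    """C_v^dia = <(Delta E)^2>_dia / T_eff^2, with P_n = |<n|psi(t_f)>|^2 and
    T_eff taken from the thermal average fit."""
    variance = np.dot(P, E**2) - np.dot(P, E)**2
    return variance / T_eff**2

def specific_heat_thermal(E, T):
    """Equilibrium C_v = (<E^2> - <E>^2) / T^2, equation (10)."""
    w = np.exp(-(E - E.min()) / T)
    w /= w.sum()
    return (np.dot(w, E**2) - np.dot(w, E)**2) / T**2
```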


Figure 4. Generalized specific heat for the diabatic state (solid line) compared to the equilibrium specific heat fit with the thermal average fit (dotted line). The parameters are $\alpha =1$, $6\tau ={t}_{{\rm{f}}}$, N = 6, 8, 10, and 12, and tf ranging from 1 ms to 5 ms.


The ferromagnetic specific heat has an interesting peak that develops near tf = 2 ms. It is unclear what the origin of this peak is. We do see, however, that the time-evolved diabatic state and the thermal distribution results agree well for long enough experimental times, but the agreement gets worse as N increases. The antiferromagnetic case has a fairly flat specific heat and worse agreement between the diabatic and the equilibrium results, but the shape is preserved well for different lengths of the chains and when we compare the diabatic result with the equilibrium result. It looks like the specific heat is not a good indicator of the differences between the thermal and the diabatic cases.

4. Discussion

The transverse-field Ising model is a workhorse model for testing many different quantum theories given its position as one of the simplest models with a nontrivial phase transition. It is employed in a wide range of analog quantum simulators, particularly those that use ultracold ions in traps. One of the proposed modes of operation for a quantum simulator is adiabatic state preparation, where the quantum system is prepared in a simple state and the Hamiltonian is modified, adiabatically, to evolve the system into the ground state of a nontrivial Hamiltonian. It turns out this goal is difficult to achieve in most quantum simulators, especially as the size of the system grows and as the gap to excited states gets smaller. So, an interesting corollary would be to create the analog of thermal states in the system without requiring the system to be attached to a bath (which could lead to decoherence and other problems). We have performed a series of calculations to test this idea, examining both ferromagnetic and antiferromagnetic cases. We find that the idea seems to work quite well for ferromagnetic cases. Three different ways that one can extract an effective temperature from the data tend to agree, and the value of the effective temperature is governed by the rate at which the Hamiltonian is changed, which determines the extent of the diabatic excitations created in the system. We compared a number of different observables as well, and found that generically, they also seem to be reproduced accurately by this approach. So, while it is clear that there are some differences, the idea does seem to work well under one caveat. Namely, the diabatic excitation only excites states that are directly coupled to the original ground state. If the system has quantum numbers that are preserved during the time-dependent evolution of the system, then they will not allow for diabatic excitation into any other symmetry sector. Perhaps the equilibrium results are still accurately produced because of the eigenstate thermalization hypothesis [16], which says that the expectation value of most experimentally measured quantities has approximately the same value for all states that are within a narrow energy window—hence, if we average over only those states in a given energy window that share the same symmetry as the ground state, they will still approximate the results that average over all states, as long as all energy windows are well-represented in the subspace with the fixed quantum numbers.

The antiferromagnetic case presents a different story. Because the system has much smaller gaps, includes a large number of low-lying states, and has a good deal of frustration, this diabatic evolution does not represent an equilibrium thermal state well. We found it difficult to fit the results to a unique effective temperature, and the different experimental quantities were not approximated so well by a thermal distribution. This leads us to conclude that while it is possible to use generalizations of adiabatic state preparation to create effective thermal distributions, such an approach will not always work. Instead, it shows that there are some systems, often the more interesting ones, where this approach is likely to fail and the diabatic evolution is not going to create the analog of a thermal distribution. We think one of the reasons why the antiferromagnetic case is different from the ferromagnetic one is that the initial excitation of states is governed by an exponential dependence on the gap and how rapidly it is traversed. But as the system excites significant probability into excited states, they can have secondary excitations (or de-excitations) to other states, which are governed by different excitation gaps. The net result is that if one has too much excitation, then the system creates states that no longer look thermal, and this is more likely to occur for antiferromagnets with frustration, due to their significantly smaller energy gaps.

5. Conclusion

In this work, we examined the difference between the diabatic evolution of a quantum system and states represented by thermal distributions. The goal was to determine whether one could use a modification of the adiabatic state preparation approach to create analogs of thermal states in quantum simulators without attaching them to external baths at fixed temperatures. For concreteness, we chose the transverse-field Ising model as the quantum system, with parameters that are similar to those used in recent experiments. We find that the ferromagnetic case does appear to be able to create near thermal distributions of the Ising model (when the total excitation is small). This was verified by examining different ways to extract the effective temperature, and common observables like the Binder cumulant and the spin structure factor. We also found that this approach does not work as well for the antiferromagnetic case, most likely because of the small energy gaps and the large density of states at low energy which emerge due to frustration in the model.

If one could easily create thermal distributions in quantum emulators, it would open the door to a new class of experiments, where one could engineer the temperature by controlling the speed of the diabatic evolution, and then use these quantum emulators to directly test quantum phenomena in a controlled environment but with equilibrium thermal states. Such studies could provide interesting insight into critical phenomena, especially critical exponents, perhaps allowing them to be directly measured in systems where they are difficult to calculate. Further work could investigate whether there are alternative methods that would improve these results, such as varying the shape of the ramp function for the magnetic field, examining the effect of real phonon creation, or adding other terms like those used in shortcuts to adiabaticity [17] that might improve the similarity of the diabatic state with the thermal one.

Our results indicate that this approach might be feasible in ferromagnetic systems. While these systems might not be the most interesting because they do not have frustration, they could serve as a useful paradigm for this type of study and could allow for a number of interesting benchmarks to be measured which mix in both the quantum and the thermal aspects in a controlled environment. We hope that experimental colleagues will investigate these ideas in the near future.

Acknowledgments

ML acknowledges support from the National Science Foundation under grant number DMR-1004268. JF and BY acknowledge support from the National Science Foundation under grant number PHY-1314295. JF also acknowledges support from the McDevitt bequest at Georgetown University. BY acknowledges support from the Achievement Rewards for College Students Foundation.
