Abstract
The successful implementation of algorithms on quantum processors relies on the accurate control of quantum bits (qubits) to perform logic gate operations. In this era of noisy intermediate-scale quantum (NISQ) computing, systematic miscalibrations, drift, and crosstalk in the control of qubits can lead to a coherent form of error that has no classical analog. Coherent errors severely limit the performance of quantum algorithms in an unpredictable manner, and mitigating their impact is necessary for realizing reliable quantum computations. Moreover, the average error rates measured by randomized benchmarking and related protocols are not sensitive to the full impact of coherent errors and therefore do not reliably predict the global performance of quantum algorithms, leaving us unprepared to validate the accuracy of future large-scale quantum computations. Randomized compiling is a protocol designed to overcome these performance limitations by converting coherent errors into stochastic noise, dramatically reducing unpredictable errors in quantum algorithms and enabling accurate predictions of algorithmic performance from error rates measured via cycle benchmarking. In this work, we demonstrate significant performance gains under randomized compiling for the four-qubit quantum Fourier transform algorithm and for random circuits of variable depth on a superconducting quantum processor. Additionally, we accurately predict algorithm performance using experimentally measured error rates. Our results demonstrate that randomized compiling can be utilized to leverage and predict the capabilities of modern-day noisy quantum processors, paving the way forward for scalable quantum computing.
- Received 13 May 2021
- Accepted 3 September 2021
DOI:https://doi.org/10.1103/PhysRevX.11.041039
Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article’s title, journal citation, and DOI.
Popular Summary
The successful implementation of algorithms on quantum processors relies on the accurate control of quantum bits (qubits) to perform logic gate operations. While classical bits can be 0 or 1, qubits lie in a continuous state space, making them susceptible to imperfections in their analog control signals. This leads to a coherent form of error that has no classical analog, placing limits on the scale of reliable quantum computations. Here, we show that by converting these coherent errors into stochastic noise, one can improve the performance of quantum algorithms.
The method we use is known as randomized compiling: By creating a family of quantum circuits built from different gates that all perform the same overall operation and combining their results, coherent errors in the original circuit are averaged into stochastic noise. The difference between coherent errors and stochastic noise can be understood through a rough analogy with signal averaging in classical measurements, such as in astronomy: A coherent signal accumulates linearly with the number of measurements, whereas random, uncorrelated noise grows only as the square root of that number.
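The accumulation argument above can be illustrated with a minimal numerical sketch (not taken from the paper; the rotation angle, over-rotation size, and circuit depth are hypothetical). A fixed over-rotation applied at every gate is a coherent error whose angle adds up linearly with depth, while randomizing the sign of the over-rotation at each gate, in the spirit of randomized compiling, turns it into unbiased stochastic noise whose accumulated angle grows only as the square root of the depth:

```python
import numpy as np

def rx(angle):
    # 2x2 unitary for a rotation about the X axis of the Bloch sphere.
    return np.array([[np.cos(angle / 2), -1j * np.sin(angle / 2)],
                     [-1j * np.sin(angle / 2), np.cos(angle / 2)]])

# Hypothetical parameters: ideal gate RX(theta), faulty gate RX(theta + eps).
theta, eps, depth = np.pi / 2, 0.02, 50
psi0 = np.array([1.0, 0.0], dtype=complex)

ideal = np.linalg.matrix_power(rx(theta), depth) @ psi0

# Coherent error: the same over-rotation every gate, so the error angle
# accumulates linearly as depth * eps.
coherent = np.linalg.matrix_power(rx(theta + eps), depth) @ psi0
coh_fid = abs(np.vdot(ideal, coherent)) ** 2

# Randomized circuits: the sign of the over-rotation is randomized at each
# gate, and results are averaged over many random instances, so the error
# behaves like an unbiased random walk of width sqrt(depth) * eps.
rng = np.random.default_rng(0)
shots = 200
avg_fid = 0.0
for _ in range(shots):
    psi = psi0
    for _ in range(depth):
        psi = rx(theta + rng.choice([-eps, eps])) @ psi
    avg_fid += abs(np.vdot(ideal, psi)) ** 2
avg_fid /= shots

print(f"coherent-error fidelity:   {coh_fid:.4f}")
print(f"randomized-error fidelity: {avg_fid:.4f}")
```

With these numbers the coherent circuit's fidelity drops substantially (the accumulated error angle is depth × eps = 1 rad), while the sign-randomized version stays close to 1, matching the linear-versus-square-root scaling described above.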
In our experiments, we show marked improvement of algorithms such as the quantum Fourier transform by reducing the error rate per gate with randomized compiling. Further, we show that randomized compiling enhances the predictability of quantum circuit performance, finding that the measured accuracy is strongly correlated with predictions based on benchmarked gate error rates. This is not the case when performance is dominated by coherent errors, which interfere unpredictably from gate to gate.
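When noise is purely stochastic, a common first-order estimate of circuit fidelity is the product of per-gate process fidelities built from benchmarked error rates. The following sketch uses hypothetical gate counts and error rates (not values from the paper) to show the arithmetic:

```python
# Hypothetical per-gate error rates, e.g. as measured by cycle benchmarking.
gate_error = {"single_qubit": 1e-3, "two_qubit": 1e-2}

# Hypothetical gate counts for some circuit of interest.
gate_count = {"single_qubit": 40, "two_qubit": 12}

# Under stochastic noise, errors compound multiplicatively, so the predicted
# circuit fidelity is the product of (1 - error) over every gate.
predicted_fidelity = 1.0
for gate, n in gate_count.items():
    predicted_fidelity *= (1 - gate_error[gate]) ** n

print(f"predicted circuit fidelity: {predicted_fidelity:.3f}")
```

Under coherent errors this multiplicative estimate breaks down, because error amplitudes (not probabilities) can add constructively or destructively from gate to gate; randomized compiling restores the stochastic regime in which the product formula is a reliable predictor.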
We perform our experiments on a superconducting quantum processor, a popular platform for scalable quantum computing. However, the general principle underlying randomized compiling, and the classes of quantum algorithms amenable to this type of noise tailoring, are applicable to all qubit platforms.