Work with what you’ve got

Quantum computing combines great promise with daunting challenges — the road to devices that solve real-world problems is still long. Now, an implementation of a quantum algorithm maps the problems we want to solve to the devices we already have.

Boaz Barak
The history of computing is a story of ever larger and more general devices. Abaci were used in ancient times, but the first vision of general-purpose computers was glimpsed by Charles Babbage in the 1830s. Alan Turing gave us a mathematical model for such computers in the 1930s, and they were first constructed in the 1940s. The rapid improvement in both speed and memory since then has given us modern general-purpose computers. In the quantum case, we have a vision, and mathematical models, for large-scale computers, but as yet no blueprint for their construction (Fig. 1). Now, writing in Nature Physics, Harrigan et al. report that they have performed experiments on a candidate quantum algorithm for intermediate-term devices 1, clarifying the challenges in taking this technology from the blackboard to the field.

Current quantum computers have been described 2 as noisy intermediate-scale quantum (NISQ) devices. NISQ devices have a memory on the order of tens or hundreds of bits, eight to nine orders of magnitude smaller than that of standard laptops, and, more importantly, they suffer from noise. Every time we operate on these bits, there is a small but non-negligible probability that the result is corrupted. And although theoretically this noise can be mitigated by quantum error-correcting codes, we have not yet crossed the threshold of building devices sufficiently large and robust to implement these codes.
NISQ devices pose a conundrum. On the one hand, these are physical systems that are too complex to simulate using classical computing: the time taken to simulate a NISQ device grows exponentially with the number of qubits. On the other hand, because of their limitations, it is not clear how to do useful computation with them. In particular, we do not know how to apply NISQ devices to practical problems currently not efficiently solvable with classical computers. Now, Harrigan et al. 1 have begun to address this issue by implementing the quantum approximate optimization algorithm 3 (QAOA) on their 53-qubit Sycamore NISQ device. The QAOA is an approach that uses NISQ devices to improve on classical algorithms for canonical combinatorial optimization problems such as maximum cut and others that arise in many engineering and scientific applications. However, realizing this approach requires tackling several theoretical and experimental difficulties. Although the results are yet to rival classical computing, the collaboration verified several theoretical predictions, which puts the study of the QAOA on a stronger footing.
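To make the target problem concrete: maximum cut asks for a partition of a graph's vertices into two sets that maximizes the number of edges crossing between them. A minimal brute-force sketch in Python (illustrative only; exhaustive search is feasible only for tiny graphs, which is exactly why heuristics and quantum algorithms are of interest):

```python
def max_cut(n, edges):
    """Exhaustively find the maximum cut of an n-vertex graph.

    Each integer z in [0, 2**n) encodes a two-colouring of the
    vertices; the cut value is the number of bichromatic edges.
    """
    best = 0
    for z in range(2 ** n):
        side = [(z >> v) & 1 for v in range(n)]
        best = max(best, sum(side[u] != side[v] for u, v in edges))
    return best

print(max_cut(3, [(0, 1), (1, 2), (0, 2)]))          # triangle → 2
print(max_cut(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))  # 4-cycle → 4
```

The exponential loop over all 2^n colourings mirrors the scaling obstacle mentioned above: classical exact search, like classical simulation of a NISQ device, grows exponentially with the problem size.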
The QAOA can be thought of as a quantum analogue of algorithms such as simulated annealing and gradient descent, which start with a random solution to the given optimization problem and gradually improve it. Specifically, the QAOA uses a sequence of quantum operators that could potentially achieve better-quality solutions than those achieved by classical algorithms. The near-term benefit of the QAOA is that, unlike other quantum algorithms such as Shor's factoring algorithm 4 , it can be implemented on an intermediate-scale noisy device.
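The classical local-search flavour that the QAOA echoes can be sketched with simulated annealing on maximum cut. This is a toy illustration, not anything from the paper; the cooling schedule and parameters are arbitrary choices:

```python
import math
import random

def anneal_max_cut(n, edges, steps=2000, t0=2.0, seed=0):
    """Toy simulated annealing for maximum cut: start from a random
    two-colouring and noisily improve it by single-vertex flips."""
    rng = random.Random(seed)
    side = [rng.randint(0, 1) for _ in range(n)]
    cut = sum(side[u] != side[v] for u, v in edges)
    best = cut
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9          # linear cooling
        j = rng.randrange(n)
        touched = [(u, v) for u, v in edges if j in (u, v)]
        cut_now = sum(side[u] != side[v] for u, v in touched)
        # flipping vertex j toggles every touched edge
        delta = (len(touched) - cut_now) - cut_now
        if delta >= 0 or rng.random() < math.exp(delta / t):
            side[j] ^= 1
            cut += delta
            best = max(best, cut)
    return best
```

Like the QAOA, this starts from an unstructured initial state and applies a sequence of improvement steps; the quantum algorithm replaces random flips with unitary operators that could, in principle, reach better solutions.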
However, there are several main challenges in making this a reality. First, although repeating many steps of the QAOA should, in theory, improve the outcome, each iteration also increases the effect of noise, and so it is unclear whether more steps would lead to better results. Second, the QAOA depends on certain 'hyperparameters', which need to be found using a classical optimizer. For the algorithm to be useful, the classical predictions for these parameters should be borne out and stable across many different problem instances. Third, executing the QAOA on a NISQ device in practice requires mapping the problem we want to solve onto the device's architecture. Such mappings can introduce their own overhead and inefficiencies.
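The hyperparameter point can be made concrete with a toy depth-one QAOA for maximum cut, simulated as a plain statevector in NumPy, with a classical grid search over the two angles (the phase angle gamma and the mixing angle beta). This is an illustrative sketch under simplifying assumptions, not the paper's implementation; the graph and the grid resolution are arbitrary choices:

```python
import numpy as np

def cut_values(n, edges):
    """Cut value C(z) for every bitstring z (integer encoding)."""
    vals = np.zeros(2 ** n)
    for z in range(2 ** n):
        bits = [(z >> v) & 1 for v in range(n)]
        vals[z] = sum(bits[u] != bits[v] for u, v in edges)
    return vals

def qaoa_expectation(gamma, beta, n, edges):
    """Expected cut value after one QAOA layer on n qubits."""
    C = cut_values(n, edges)
    state = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)  # |+...+>
    state *= np.exp(-1j * gamma * C)                # phase separator exp(-i*gamma*C)
    for q in range(n):                              # mixer exp(-i*beta*X_q) per qubit
        mask = 1 << q
        for z in range(2 ** n):
            if z & mask == 0:
                a, b = state[z], state[z | mask]
                state[z] = np.cos(beta) * a - 1j * np.sin(beta) * b
                state[z | mask] = np.cos(beta) * b - 1j * np.sin(beta) * a
    return float(np.real(np.sum(np.abs(state) ** 2 * C)))

# Classical outer loop: grid search over the two hyperparameters.
edges = [(0, 1), (1, 2), (0, 2)]  # triangle, maximum cut = 2
best = max(qaoa_expectation(g, b, 3, edges)
           for g in np.linspace(0, np.pi, 25)
           for b in np.linspace(0, np.pi, 25))
```

At gamma = beta = 0 the circuit leaves the uniform superposition untouched, so the expected cut equals the random-guessing baseline (1.5 for the triangle); well-chosen angles push the expectation above it. In the experiment, the analogous classical search has to contend with a landscape distorted by hardware noise.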
Harrigan et al. have made progress on all three fronts. Their results give the first experimental demonstration that more steps of the QAOA can deliver better results. They also showed evidence that the hyperparameter search via classical optimization corresponds to the actual experimental landscape. Finally, they have provided new studies of the complexity of mapping problem instances onto the hardware. They demonstrated that the effect of noise becomes more pronounced the further the instance is from the hardware architecture. This suggests that near-term applications of the QAOA will be most successful on instances that correspond to the grid-like architectures of current NISQ devices.
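The mapping overhead can be illustrated with a toy routing estimate: place the problem's vertices on a one-dimensional chain of qubits and, for each interaction, count how far apart its endpoints sit, with each unit of extra distance roughly costing a round of SWAP gates. Everything here (the chain layout, the cost model, the function name) is an illustrative assumption, not the device's actual compiler:

```python
def line_routing_overhead(edges, placement):
    """Toy estimate of routing cost on a 1D chain of qubits.

    placement maps each problem vertex to a position on the chain;
    an edge between positions p and q needs roughly |p - q| - 1
    SWAPs to bring its endpoints next to each other.
    """
    return sum(abs(placement[u] - placement[v]) - 1 for u, v in edges)

# A 4-cycle laid out in order only pays for the wrap-around edge.
ring = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(line_routing_overhead(ring, {0: 0, 1: 1, 2: 2, 3: 3}))  # → 2
```

Instances whose interaction graphs already match the hardware layout incur zero overhead in this toy model, which is the intuition behind grid-like instances performing best on grid-like devices.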
We have not yet achieved so-called quantum advantage for the QAOA. The instances involved are still too small, and the level of noise too large, for the QAOA to beat its classical competitors. The big question of whether the QAOA or any other quantum algorithm can achieve an exponential speedup on combinatorial optimization with NISQ devices remains open. But we are now in a much better position to study it. ❐