Thoughts on complex systems: an interview with Giorgio Parisi

The Nobel Laureate Giorgio Parisi is interviewed by JPhys Complexity Editor-in-Chief, Ginestra Bianconi, on themes related to the 2021 Nobel Prize in Physics awarded to him for research on complex systems.

I think that research on complex systems is quite interesting because, first of all, it deals with systems that were not previously considered part of physics. Examples of complex systems are very diverse and include ecosystems, the dynamical behaviour of networks, and optimization problems. The case of optimization problems is especially interesting because they were never traditionally considered part of physics at all.
Another thing that is rather interesting is that a lot of the problems that are fundamental in complex systems are related to non-equilibrium physics. This aspect is quite important because, once upon a time, most of physics was concerned with equilibrium statistical mechanics and very little was known about non-equilibrium statistical mechanics. Now the urgency to study complex systems pushes us to also study off-equilibrium statistical mechanics, very far from equilibrium, close to equilibrium and so on, and I think that this research has led to a much bigger and better understanding of off-equilibrium behaviour in physical systems.

You have made pivotal contributions in complex systems ranging from spin glass theory to surface growth, multifractality, stochastic resonance and bird flocking. Do you think that the study of complex systems provides a particular source of inspiration for you? Or is it a field objectively more diverse than other, more traditional fields of science?
Well, the point is that in reality I had no intention at the beginning of studying complex systems. I did some work on second-order phase transitions in 1972 or 1974, but after that I was mostly interested in quantum field theory. What happened was that by chance I read a paper, maybe by D K Lubensky, on polymers. Why was I interested in polymers? I was interested in polymers because I was interested in lattice gauge theories, and lattice gauge theories in high dimensions have a behaviour that looks like that of polymers. So, I was trying to understand lattice gauge theory, and by reading this paper I discovered that there was a paper by (David) Sherrington and (Scott) Kirkpatrick on spin glass systems that apparently gave results that were wrong (at that point I had no idea whatsoever what spin glasses were). Sherrington and Kirkpatrick were using replica theory, and it was not clear why replica theory in that particular case gave the wrong results. So I was really interested in trying to figure out what was wrong. My idea was that this would be good for the community: there is a problem, for which there is a theory that does not work, and you do not know why. So I thought that it was a duty to try to fix the problem. I think that at the beginning I was too optimistic that it would be easy to fix. In the end it was not so easy, but anyhow in a few months I realized how to fix at least part of the problem, and finally the whole problem, and after I had written a few papers on this I continued to work on quantum field theory.
The point is that at a certain moment I tried to understand the solution that I gave to the Sherrington-Kirkpatrick model, and I also started to work with (Marc) Mezard, (Nicolas) Sourlas, (Gérard) Toulouse and (Miguel Angel) Virasoro. What I realized in the end is that in this particular model there were lots of equilibrium states, i.e. the number of equilibrium states was infinitely large, which is something that I did not suspect at the beginning. Moreover, these states could be organized in a taxonomic classification, which people also call 'ultrametricity'. This was something quite surprising; to have in certain equilibrium models a taxonomic classification of states with infinite branches is something really unexpected. So, from these results it became clear that the system was complex, and the complexity of the system was really discovered three to four years after I started to work on it. So I found myself in the mainstream of complexity without asking for it, also because the replica method was powerful for understanding other complex and disordered systems. So, it is something that I started to work with. Moreover, I went on because I wanted to really understand many of the details of the solution of the Sherrington-Kirkpatrick model and the derivations that I did. It is something that, by chance, I would not have done had I not been interested in that particular model of lattice gauge theory in very large dimensions. There was a paper by (Nicolas) Sourlas and (Jean-Michel) Drouffe that addressed that problem; otherwise I could have gone on with quantum Monte Carlo for lattice gauge theories and things like that without becoming interested in a general problem in complex systems.

In your popular book you have several interesting passages discussing how a scientific discovery is made.
Yes, in the book [1] I discuss how you get ideas for a result that you are trying to prove, and not so much how you come to work on a problem by chance.

Your contribution to physics is really exceptional; in particular your work on spin glasses and replica symmetry breaking has been pivotal for the development of neural networks, machine learning, and to solve hard combinatorial problems. Can you tell us more about replica symmetry breaking and its role in understanding complex systems 'beyond' traditional physical applications?
The relation between spin glass theory and replica symmetry breaking, on the one hand, and other systems, on the other, is the following. If you take out the technicalities (which are clearly extremely important in a problem of this type, because the technical aspects are what transform generic speech into real, scientific, precise and mathematical speech), the essential idea is that in these kinds of systems you have multiple equilibria. Now, multiple equilibria have different meanings and implications depending on the context. For instance, in geologic eras you have some very long periods of a similar situation from a geological point of view, and you have other very fast transitions from one era to another. Coming to neural networks: neural networks are a paradigmatic example of systems with memory. You can think of the Hopfield model, or you can think of the memory of a primate: a human, an ape or a monkey. You have a situation in which you can store a lot of information in that memory and you can do retrieval. When you do retrieval the system stays in the same state for a long time, a long time compared to the typical switching on and off of the neurons. Therefore, it is an equilibrium state. So, the incredibly large number of things that we memorize, and that we recall, corresponds to the infinitely large number of equilibrium states of the memory. Also, when you think of, for example, an optimization problem, there are some problems for which the optimization is quite straightforward, or not so straightforward but can be done in polynomial time. There are other problems which are much more difficult, because the algorithm might get stuck in something that you can call a 'local minimum' of the system, and the fact that you have a very large, maybe exponentially large, number of local minima corresponds to the fact that the algorithm takes an exponentially long time to find the solution.
So you have a connection between the equilibrium complexity of the system, which you can compute with statistical mechanics, and the slowness of an algorithm that tries to find the best configuration for problems like the travelling salesman problem, and so on.
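The trapping of a purely downhill algorithm in a local minimum, and the role of fluctuations in escaping it, can be seen on a toy one-dimensional energy landscape. Everything below (the landscape, the starting point, the annealing schedule) is made up for the illustration and is not drawn from any specific problem in the interview.

```python
import math
import random

random.seed(1)

# Toy rugged energy landscape: many local minima, global minimum at x = 0.
def energy(x):
    return 0.1 * x * x + (1.0 - math.cos(3.0 * x))

def greedy(x, step=0.05, iters=2000):
    # Accept only downhill moves: the walker gets stuck in the nearest
    # local minimum and can never cross a barrier.
    for _ in range(iters):
        x_new = x + random.uniform(-step, step)
        if energy(x_new) < energy(x):
            x = x_new
    return x

def anneal(x, iters=20000):
    # Metropolis dynamics with a slowly decreasing temperature: thermal
    # fluctuations let the walker hop over barriers between minima.
    for t in range(iters):
        T = max(2.0 * (1.0 - t / iters), 1e-3)
        x_new = x + random.uniform(-0.5, 0.5)
        dE = energy(x_new) - energy(x)
        if dE < 0 or random.random() < math.exp(-dE / T):
            x = x_new
    return x

x0 = 4.0          # start inside a local basin away from the origin
xg = greedy(x0)   # remains trapped near x ~ 4
xa = anneal(x0)   # fluctuations typically carry it to a lower minimum
print(round(xg, 2), round(xa, 2))
```

The contrast between the two runs is the point of the passage above: without fluctuations the dynamics is stuck in the first minimum it finds; with fluctuations (and a sensible cooling schedule) it can reach much better configurations.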

This Nobel Prize 2021 has been awarded to you for 'the discovery of the interplay of disorder and fluctuations in physical systems from atomic to planetary scales', for works that 'revolutionized the theory of disordered materials and random processes'. From your theoretical perspective what is the correct picture we should have for understanding in general the role of fluctuations in complex systems?
Fluctuations are very important in many, many situations. For example, by studying the fluctuations you can discover whether the system under consideration is a system with a single equilibrium state or a complex system with multiple equilibria. The other thing that is extremely important is that fluctuations probe off-equilibrium behaviour. In fact, according to the ideas of (Leticia) Cugliandolo and (Jorge) Kurchan it is possible to relate the static properties that can be measured in equilibrium for spin glasses to the off-equilibrium behaviour. Therefore, from these arguments you can derive new fluctuation-dissipation relations in off-equilibrium dynamics.
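For orientation, the generalized fluctuation-dissipation relation of Cugliandolo and Kurchan alluded to here is usually written as follows (this standard form is added editorially; it does not appear in the interview itself):

```latex
% Equilibrium fluctuation--dissipation theorem (FDT), for times t > t':
R(t,t') \;=\; \frac{1}{T}\,\frac{\partial C(t,t')}{\partial t'} .
% Off-equilibrium (aging) generalization of Cugliandolo and Kurchan:
R(t,t') \;=\; \frac{X\!\left(C(t,t')\right)}{T}\,\frac{\partial C(t,t')}{\partial t'} ,
% where R is the response function, C the correlation function, X = 1 at
% short time separations (quasi-equilibrium) and X < 1 in the aging regime,
% which defines an effective temperature T_{\mathrm{eff}} = T / X .
```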
Moreover, fluctuations are also quite relevant at the planetary scale, as is revealed by the stochastic resonance framework. In a formal way the stochastic resonance phenomenon can be summarized as follows: we typically have half-kelvin temperature fluctuations from one year to the next; however, in the stochastic resonance approach these half-kelvin fluctuations are critical in triggering glaciations that correspond to a variation of 10 degrees. So, it is clear that in this type of system you sometimes have an amplification of fluctuations. Coming back to the idea of multiple equilibria, this phenomenon is clear because a system can stay in an equilibrium state for a long time, while fluctuations drive transitions from one equilibrium state to another. This is what we call a tunnelling event, or activated event, and in equilibrium statistical mechanics it follows the Arrhenius law. All these terms indicate the effect of fluctuations: in some cases fluctuations are not very important, whereas in other cases they are extremely important in producing tunnelling from one state to another.
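The amplification mechanism described here can be reproduced with the textbook stochastic-resonance setup: an overdamped particle in a double well, driven by a weak periodic forcing that by itself can never tip the system over the barrier, plus small noise that triggers the transitions. All parameter values below are illustrative choices for the demonstration, not climate data.

```python
import math
import random

random.seed(2)

def simulate(noise, steps=60000, dt=0.01, A=0.2, omega=0.01):
    """Overdamped Langevin dynamics in the double well V(x) = x^4/4 - x^2/2,
    with a weak drive A*cos(omega*t). A = 0.2 is below the static tipping
    threshold (~0.385), so the forcing alone cannot switch wells.
    Returns the number of crossings between the two wells."""
    x, sign, crossings = 1.0, 1, 0
    for n in range(steps):
        t = n * dt
        drift = x - x**3 + A * math.cos(omega * t)   # -V'(x) + forcing
        x += drift * dt + math.sqrt(2.0 * noise * dt) * random.gauss(0.0, 1.0)
        if x * sign < 0:          # trajectory changed well
            crossings += 1
            sign = -sign
    return crossings

quiet = simulate(noise=0.0)   # subthreshold forcing alone: no transitions
noisy = simulate(noise=0.25)  # small fluctuations trigger well-to-well jumps
print(quiet, noisy)
```

With zero noise the trajectory stays in its initial well forever; with modest noise the same subthreshold forcing produces repeated jumps between the two states, which is exactly the amplification of small fluctuations into large transitions.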
These things were really understood by (Niles) Eldredge and (Stephen Jay) Gould in their theory of punctuated equilibria. They were not using the word 'tunnelling', but they were looking at the evolution of species from the viewpoint of the phenotype, or the morphology, or what you can see in the geological record. They observed that in evolution you have long periods of stasis, and these long periods of stasis are punctuated by fast bursts of evolutionary change. What we can say is that the fast periods of evolutionary change are periods in which you go from one minimum to another minimum, crossing some kind of barrier in the fitness landscape. What is clear from all the analyses that they present in their book and in their theory is that the intermediate species, i.e. the intermediate steps between one species and another, are in general not as fit as the initial one or the final one. Maybe they are as fit in some small region, and so on, but generally speaking it is some kind of tunnelling event. Therefore, when tunnelling events dominate the dynamics, the role of fluctuations is extremely important.
We can really appreciate the role of fluctuations because many molecular motors, many proteins and so on, have dynamics controlled by the Arrhenius law and therefore by activated events. For example, in our (human) body the internal temperature can go from 100 Fahrenheit (37.8 °C) to 106 Fahrenheit (41.1 °C); this is only about a 1% change in absolute temperature, and yet it is large enough to completely upset the body. So the tiny fluctuations in energy that are connected to temperature provide incredible leverage for controlling the behaviour of a complex system like the body.
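The leverage of this 1% change is easy to quantify with the Arrhenius law, rate proportional to exp(-ΔE/kT). The barrier of 1 eV used below is an illustrative order of magnitude for an activated biochemical process, chosen for the example and not a figure from the interview.

```python
import math

# Convert the two body temperatures from Fahrenheit to kelvin.
def fahrenheit_to_kelvin(f):
    return (f - 32.0) * 5.0 / 9.0 + 273.15

T1 = fahrenheit_to_kelvin(100.0)   # ~310.9 K (37.8 C)
T2 = fahrenheit_to_kelvin(106.0)   # ~314.3 K (41.1 C)
frac = (T2 - T1) / T1              # fractional change in absolute temperature
print(f"absolute-temperature change: {100 * frac:.1f}%")   # about 1.1%

# Arrhenius law: rate = nu * exp(-dE / (k * T)). For an assumed activation
# barrier of 1 eV, compare the rates at the two temperatures.
k_eV = 8.617e-5                    # Boltzmann constant in eV/K
dE = 1.0                           # assumed barrier height (eV)
speedup = math.exp(dE / k_eV * (1.0 / T1 - 1.0 / T2))
print(f"rate ratio: {speedup:.2f}")   # roughly 1.5x faster
```

So a mere 1% shift in absolute temperature changes the rate of a 1 eV activated process by tens of percent, which is the leverage the passage above describes.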

Complex systems have been studied from different points of view, and complex systems science is now a thriving interdisciplinary field encompassing a series of methodologies (spin glasses, networks, non-linear dynamics, stochastic and random processes) with applications to climate, biology, neuroscience, epidemiology and economics. From your perspective, do you see unity in these different aspects of complexity?
Well, I think that the existence of multiple equilibria is the unifying element. Now, unity is more difficult to claim, because you could say that all simple systems look similar to one another while each complex system is complex in its own way. I mean, it is like the opening of Anna Karenina.
Therefore, it is difficult to impose unity. There are some features that are common, slow dynamics and so on, but in reality, when you look in detail, complex systems are quite different from one another.
There is an old idea that goes back to (Karl) Popper. Popper says that a scientific revolution, a change of paradigm, is a change of the spectacles that you use to look at the world, to interpret what happens. And now complexity, and the whole theory of complex systems, is in some sense a different way to look at the world. It contains a unity in the kind of perspective that you have on reality, but after that, at the technical level, it is quite different from one case to another.

In your opinion, what are the topics in complex systems research that will see the most important developments in the next two to three decades?
Well, I think that a very important development has been on real glasses, because many of the ideas that were developed for spin glasses have been applied to glasses. Also, I was particularly surprised by the recent developments on the problem of the jamming of hard spheres, for which one can do lots of things. Now all the physics of glasses is moving very fast, because people are really starting to study the off-equilibrium behaviour, like fracturing and so on, in a systematic way. Also because, in one way or another, people have been able to find a good way to thermalize glasses: ultrastable glasses have been produced experimentally but now also theoretically, and therefore we can understand the behaviour of glasses across very large time scales.
Also, the whole of network theory has flourished in an incredible way. We have many, many applications of network theory that are very far from one another, and in many, many different disciplines and applications you have a deep understanding of networks.

You are famous for being much loved by your students and your young collaborators. There was real acclamation at the University of Rome 'La Sapienza' when the 2021 Nobel Prize was announced. What is your advice to young researchers in complex systems?
Well, I think that the advice would be very similar to what I would give to those who do not work on complex systems. I think the main advice is to try to understand, first of all, what your abilities are, your best abilities, and to try to profit from your capabilities as best you can. This is something that is universally valid. And also, I think it is important to be confident in yourself: sometimes you do not study systems or problems that you think are too difficult because you are not confident enough. But that is something that is true everywhere.
I think that one should try to look at some interesting problems and try to take some risks. Of course, take risks with moderation, because one should not take too big a risk; I mean, trying to solve Fermat's theorem for twenty years without succeeding (although now it has been solved) would, in the past, have been quite frustrating. So, it is good to understand which things you can reach and which things you cannot. But it is clear that having a positive attitude is extremely important. I remember very well that once upon a time there was some discussion about a certain conjecture, and I said during a seminar that I was hopeful that this conjecture could be solved soon. I said this because I had a private communication that it had been solved, but I did not want to tell people that it was solved before the paper was published. After my talk Marc Mezard asked me: 'but why do you say this? You know that we are not able to solve this conjecture', and then I said, 'no, look, it was solved by Silvio Franz and Luca Peliti'. He reflected on it for 30 seconds, and then told me 'yes, I see', and he told me the proof of the conjecture. So just knowing that the conjecture could be proven was enough for him to find the proof in 30 seconds. So it is clear that sometimes one must be confident in what one can do.