Large cycles in random permutations related to the Heisenberg model

We study the weighted version of the interchange process where a permutation receives weight $\theta^{\#\mathrm{cycles}}$. For $\theta=2$ this is Tóth's representation of the quantum Heisenberg ferromagnet on the complete graph. We prove, for $\theta>1$, that large cycles appear at 'low temperature'.


Introduction
The interchange process and related models of random permutations are interesting both for their beautiful mathematics and for their relevance to quantum-theoretical models of magnetization. The interchange process may be described as follows. Fix an integer n and put n labelled balls into n labelled boxes, ball i in box i. At each time t = 1, 2, . . ., select uniformly (and independently) a pair of distinct boxes i, j and transpose the balls inside them. At a given time t, box i contains some ball π_t(i), where π_t is a permutation of 1, 2, . . . , n. Said otherwise, π_t is the composition of t independent, uniformly chosen transpositions.
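The dynamics just described is easy to simulate. The following sketch (function and variable names are ours, not from the paper) composes t uniform transpositions and extracts the cycle sizes of π_t; for c > 1/2 one typically sees a cycle of macroscopic size.

```python
import random

def interchange(n, t, seed=None):
    """Compose t uniform random transpositions of {0, ..., n-1}."""
    rng = random.Random(seed)
    perm = list(range(n))  # perm[i] = ball currently in box i
    for _ in range(t):
        i, j = rng.sample(range(n), 2)  # two distinct boxes
        perm[i], perm[j] = perm[j], perm[i]
    return perm

def cycle_sizes(perm):
    """Sizes of the disjoint cycles of a permutation given as a list."""
    seen = [False] * len(perm)
    sizes = []
    for start in range(len(perm)):
        if not seen[start]:
            size, x = 0, start
            while not seen[x]:
                seen[x] = True
                x = perm[x]
                size += 1
            sizes.append(size)
    return sorted(sizes, reverse=True)

n = 2000
c = 1.0  # c > 1/2: the largest cycle should be of order n
largest = cycle_sizes(interchange(n, int(c * n), seed=1))[0]
```

Repeating the experiment with c < 1/2 should instead give a largest cycle of logarithmic size.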
Being a permutation, π_t can be written as a product of disjoint cycles. Schramm [11] showed, proving a conjecture of Aldous [4], that if t is of the form ⌊cn⌋ with c > 1/2, then with probability approaching 1 as n → ∞ the largest cycle has size of order n (for c < 1/2 it is of order log n). He also described the scaling limit of the cycle sizes in terms of the Poisson–Dirichlet distribution.
In this paper we study random permutations which are biased towards having many small cycles. A precise definition is given in the next subsection, but roughly speaking we consider the weighted version of the interchange process where each permutation π receives a weight θ^{ℓ(π)}. Here θ ≥ 1 is fixed, and ℓ(π) is the total number of cycles in π. For θ = 1 one recovers the interchange process. Our main result (Theorem 1.1) is that large cycles appear for c > θ/2.
The model is motivated by considerations in statistical physics, where it provides a probabilistic representation of the (ferromagnetic) quantum Heisenberg model on the complete graph K_n. It is a notorious open problem to prove that the quantum Heisenberg ferromagnet on the lattice Z^d, d ≥ 3, can exhibit a nonzero magnetization.
Date: July 7, 2015. Research supported by the Knut and Alice Wallenberg Foundation.

1.1. Model and main result. We will in fact work in continuous time, and with a different time-scaling than described above. Let G = K_n = (V, E) be the complete graph on the vertex set V = {1, . . . , n}, with edge set E consisting of all (unordered) pairs xy of distinct vertices. Let β > 0 and let P_1(·) denote a probability measure governing a collection ω = (ω_xy : xy ∈ E) of independent rate-1 Poisson processes on [0, β], indexed by the edges. If ω_xy has an event at time t ∈ [0, β] we write (xy, t) ∈ ω. We think of such an event as a transposition of the vertices x, y at time t. The time-ordered product of these transpositions gives a permutation π = π(ω) of V. More precisely, if we write (x_i y_i, t_i), i = 1, . . . , N, for the points of ω, indexed so that 0 < t_1 < · · · < t_N < β, and write τ_i = (x_i, y_i) for the transposition of x_i and y_i, then we have that π = τ_N · · · τ_2 τ_1.
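A sketch of this construction (all names ours): sample the Poisson points on each edge of K_n over [0, β] via exponential inter-arrival times, sort them, and compose the transpositions in time order, earliest applied first, so the result represents π = τ_N · · · τ_1.

```python
import random

def sample_omega(n, beta, rng):
    """Rate-1 Poisson events on [0, beta] for each edge xy of K_n,
    returned as a single time-sorted list of points (t, x, y)."""
    points = []
    for x in range(n):
        for y in range(x + 1, n):
            t = rng.expovariate(1.0)  # exponential inter-arrival times
            while t < beta:
                points.append((t, x, y))
                t += rng.expovariate(1.0)
    points.sort()
    return points

def pi_of_omega(n, points):
    """Time-ordered product pi = tau_N ... tau_2 tau_1 (earliest innermost)."""
    pi = list(range(n))    # pi[i] = current image of i
    inv = list(range(n))   # inv[v] = the i with pi[i] = v
    for _, x, y in points:  # compose the transposition (x y) on the left
        i, j = inv[x], inv[y]
        pi[i], pi[j] = y, x
        inv[x], inv[y] = j, i
    return pi

rng = random.Random(0)
n, lam = 300, 2.0
pi = pi_of_omega(n, sample_omega(n, lam / n, rng))
```

Keeping the inverse array makes each left-composition an O(1) update.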
Let ℓ = ℓ(ω) denote the number of cycles in a disjoint-cycle decomposition of π(ω), including singletons. Let C_1(π), . . . , C_ℓ(π) denote the cycles, ordered by decreasing size (breaking ties by any rule). For θ ≥ 1 we will consider the distribution of ω and π(ω) under the probability measure P_θ(·) given by

P_θ(dω) = (θ^{ℓ(ω)} / Z) P_1(dω).   (1)

Here Z = Z(θ, β) is the appropriate normalization. (The same definition makes sense on a general finite graph G.) For θ = 1 our model is the continuous-time version of the interchange process, sped up by a factor n(n − 1)/2 compared to the introduction and viewed at time β. We take β of the form β = λ/n, where λ > 0 is constant.

Theorem 1.1. Let G = K_n and β = λ/n with λ > θ > 1. For each ε > 0 there is δ > 0 such that, for n large enough,

P_θ(|C_1(π)| ≥ δn) ≥ 1 − ε.

Our proof of Theorem 1.1 relies on a colouring-lemma inspired by the approach of Bollobás, Grimmett and Janson [6] to the random-cluster model. Roughly speaking, if we sample ω and then colour each cycle red or white independently, with probability 1/θ for red, then the conditional distribution of the red cycles is determined by an interchange process. See Lemma 2.2 for a precise statement. When λ > θ, we can use the results of Schramm [11] to show that there are red cycles of order n.
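For small systems, expectations under P_θ can be estimated from the unweighted process via the identity E_θ[f] = E_1[θ^ℓ f] / E_1[θ^ℓ]. The sketch below (a naive importance-sampling illustration, not a method used in the paper; all names ours) does this with the discrete-time dynamics. The weights θ^ℓ degenerate quickly as n grows, which is one reason a structural tool such as the colouring-lemma is valuable.

```python
import random

def sample_cycle_sizes(n, n_trans, rng):
    """One sample of the discrete interchange process: compose n_trans uniform
    transpositions of {0,...,n-1} and return the cycle sizes of the result."""
    perm = list(range(n))
    for _ in range(n_trans):
        i, j = rng.sample(range(n), 2)
        perm[i], perm[j] = perm[j], perm[i]
    seen, sizes = [False] * n, []
    for s in range(n):
        if not seen[s]:
            c, x = 0, s
            while not seen[x]:
                seen[x] = True
                x = perm[x]
                c += 1
            sizes.append(c)
    return sizes

def estimate_theta(n, n_trans, theta, f, n_samples, seed=0):
    """Importance-sampling estimate of E_theta[f] = E_1[theta^ell f] / E_1[theta^ell]."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n_samples):
        sizes = sample_cycle_sizes(n, n_trans, rng)
        w = theta ** len(sizes)  # theta^(number of cycles)
        num += w * f(sizes)
        den += w
    return num / den

# Expected fraction of vertices in the largest cycle, under the theta = 2 bias
est = estimate_theta(n=100, n_trans=200, theta=2.0,
                     f=lambda sizes: max(sizes) / 100, n_samples=200)
```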
With high probability there are no cycles of size of order n when λ is small enough, e.g. when λ < e^{−1}. This follows from [7, Theorem 6.1]. One would expect that there is a critical value λ_c(θ) such that there are cycles of order n for λ > λ_c but not for λ < λ_c. Schramm's result shows that λ_c(1) = 1. It follows from the work of Tóth [12] that λ_c(2) = 2. For other values of θ the existence of λ_c is not known, and Theorem 1.1 is the first result on the occurrence of large cycles in this generality.
Regarding other choices of G, the interchange process (θ = 1) has been investigated on general graphs by Alon and Kozma [1], and on infinite trees by Angel [2] and by Hammond [8, 9]. In ongoing work, Kotecký, Miłoś and Ueltschi are investigating cases with θ = 1 on the hypercube.

1.2. Relation to the Heisenberg model. For θ = 2 the cycles in our model represent correlations in the (ferromagnetic, quantum) Heisenberg model, as shown by Tóth [13]. Here is a brief account; see the review [7] for more details.

The Heisenberg model on G is given by the Hamiltonian

H = −(1/2) Σ_{xy∈E} (σ_x^{(1)} σ_y^{(1)} + σ_x^{(2)} σ_y^{(2)} + σ_x^{(3)} σ_y^{(3)} − 1).

Here σ^{(1)}, σ^{(2)}, σ^{(3)} denote the Pauli matrices, and σ_x^{(j)} acts on the Hilbert space H_V = ⊗_{x∈V} C^2 as σ^{(j)} ⊗ Id_{V∖{x}}. Magnetic correlations between vertices x, y ∈ G are given by the correlation functions

⟨σ_x^{(3)} σ_y^{(3)}⟩_β = tr(σ_x^{(3)} σ_y^{(3)} e^{−βH}) / tr(e^{−βH}),   (2)

where tr(·) denotes the trace of a matrix. In this formulation the parameter β > 0 is usually called the inverse temperature. (It is the same β as in Section 1.1.) Tóth's representation expresses the correlations (2) probabilistically. Write {x ↔ y} for the event that x and y belong to the same cycle in the permutation π(ω). Then we have, with θ = 2:

⟨σ_x^{(3)} σ_y^{(3)}⟩_β = P_2(x ↔ y).

Thus the occurrence of large cycles in π(ω) corresponds, in physical terms, to magnetic ordering.
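The weight 2^{ℓ(π)} (hence θ = 2) arises from trace identities of the form tr(T_π) = 2^{ℓ(π)} and tr(σ_x^{(3)} σ_y^{(3)} T_π) = 2^{ℓ(π)} 1I{x ↔ y}, where T_π permutes the tensor factors of ⊗_{x∈V} C^2. These can be checked directly for the two permutations of two sites; the sketch below (plain-Python matrices, all names ours) does so.

```python
def kron(A, B):
    """Kronecker product of two square matrices given as lists of lists."""
    na, nb = len(A), len(B)
    return [[A[i // nb][j // nb] * B[i % nb][j % nb]
             for j in range(na * nb)] for i in range(na * nb)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

sz = [[1, 0], [0, -1]]  # Pauli matrix sigma^(3)
I2 = [[1, 0], [0, 1]]
# Transposition operator T on C^2 (x) C^2: T |a,b> = |b,a>
T = [[1, 0, 0, 0],
     [0, 0, 1, 0],
     [0, 1, 0, 0],
     [0, 0, 0, 1]]

szsz = kron(sz, sz)  # sigma_x^(3) sigma_y^(3) on two sites
t1 = trace(matmul(szsz, T))             # 2 = 2^1: x <-> y under the transposition
t2 = trace(matmul(szsz, kron(I2, I2)))  # 0: x, y in different cycles of id
```

Here trace(T) = 2 = 2^1 (one cycle) and trace(Id) = 4 = 2^2 (two cycles), matching tr(T_π) = 2^{ℓ(π)}.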
The quantum model also possesses other probabilistic representations. In the paper [12] Tóth studied a lattice gas representation and explicitly computed the free energy. (The same result was independently obtained by Penrose [10].) By standard arguments one may deduce from these results that the quantum Heisenberg model on G = K n undergoes a phase transition at β = 2/n, as mentioned above.
1.3. Outline and notation. We describe a graphical representation, and present the key colouring-lemma, in Section 2, followed by the proof of Theorem 1.1 in Section 3.
The abbreviation i.i.d. means independent and identically distributed. The identity permutation is denoted id. Unspecified limits are as n → ∞. If a_n/b_n → 0 we write a_n = o(b_n); if a_n/b_n is bounded above we write a_n = O(b_n). The indicator of an event A is written 1I_A, and takes the value 1 if A happens, otherwise 0. Expectation with respect to P_θ will be written E_θ.

Colouring-lemma
The following graphical representation of a sample ω will be useful. We picture ω in G × [0, β], representing a point (xy, t) ∈ ω by a 'cross' as in Figure 1. Starting from a point (x, 0), trace a trajectory by moving vertically upwards in time along the lines {z} × [0, β], jumping to the other endpoint whenever a cross is encountered; on reaching time β, identify (z, β) with (z, 0) and continue. This trajectory eventually closes up into a loop γ, and we write L(ω) for the collection of loops obtained in this way. Recall the permutation π = π(ω) defined in Section 1.1. The points z such that γ visits (z, 0) are precisely x, π(x), π²(x), . . . (in the same order). Thus the loops γ are in one-to-one correspondence with the cycles C of π, and the total vertical length of a loop equals β times the size of the corresponding cycle.
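The loop/cycle correspondence can be checked on a toy configuration. The following sketch (all names ours) follows a trajectory upward through the crosses, wrapping from time β back to time 0, and records the vertices visited at time 0; these come out as x, π(x), π²(x), . . . .

```python
def apply_crosses(x, crosses):
    """Follow the vertical line from (x, 0) up to time beta, switching to the
    other endpoint of each cross (t, a, b) encountered (crosses time-sorted)."""
    for _, a, b in crosses:
        if x == a:
            x = b
        elif x == b:
            x = a
    return x

def follow_loop(x0, crosses):
    """Vertices z at which the loop through (x0, 0) visits (z, 0)."""
    visits, x = [], x0
    while True:
        visits.append(x)
        x = apply_crosses(x, crosses)  # (z, beta) is identified with (z, 0)
        if x == x0:
            return visits

# Toy configuration: crosses (01) at time 0.1 and (12) at time 0.2, so
# pi = (12)(01) has cycles (0 2 1) and (3).
orbit = follow_loop(0, [(0.1, 0, 1), (0.2, 1, 2)])
```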
Let θ > 1 and r = 1/θ. Given ω, colour each loop γ ∈ L(ω) red or white, independently of each other, with probability r for red. Write R and W for the unions of the red and white loops, respectively; they are subsets of V × [0, β]. See Figure 1. The points of ω (i.e. the crosses) now fall into three categories: red, white and mixed, according to whether both endpoints lie in R, both lie in W, or one lies in each. Write ω_r, ω_w and ω_m for the red, white and mixed crosses, respectively. Thus ω_r = ω ∩ E(R) and ω_w = ω ∩ E(W), where for S ⊆ V × [0, β] we set

E(S) = {(xy, t) ∈ E × [0, β] : (x, t) ∈ S and (y, t) ∈ S},

that is, the set of points in E × [0, β] 'between' points of S. The following is the key colouring-lemma.
Lemma 2.1. Given R, the conditional distribution of ω_r is P_1^R(·), the law of a rate-1 Poisson process on E(R), and ω_r is conditionally independent of ω_w.
In words, conditional on the red set, the red crosses simply form a Poisson process. This means that (given R) the red cycles are obtained from a sample of the interchange process, in a way which will be made precise in Lemma 2.2. Lemma 2.1 holds for general graphs G, with the same proof.
One way to check Lemma 2.1 is to finely discretize the Poisson processes. We present instead a 'clean' proof, whose basic approach is as follows. We write a coloured loop-configuration as a pair (q, ω), where q ∈ {r, w}^V is a colouring of the vertices. We interpret q_x as the colour of the loop containing (x, 0); thus we require the pair (q, ω) to be consistent in that q_x = q_y whenever x, y belong to the same cycle. Write C(q) for the set of ω that are consistent with q. Note that, for ω ∈ C(q), the red and white sets R, W are determined by the pair (q, ω_m), where ω_m are the mixed crosses as before. Indeed, deleting red or white crosses does not change R or W: compare Figure 2 with Figure 1. Thus we may write R = R(q, ω_m) and W = W(q, ω_m), and moreover there is some freedom in the choice of ω_r ∪ ω_w: the only restriction is that it is a subset of E(R) ∪ E(W). We will proceed by conditioning on the mixed crosses ω_m. When integrating over the allowed choices for ω_r, a cancellation occurs in the factor θ^{ℓ(ω)} which removes the dependencies in R.
Proof of Lemma 2.1. Let X = X(R) be a bounded R-measurable random variable, and consider events A = A(ω_r) and B = B(ω_w) depending on ω_r and ω_w, respectively. We will give an expression for E_θ[X 1I_A 1I_B] which will allow us to deduce the result.
As noted above, R is the same if we remove all red and all white crosses, that is, if we replace ω = ω_r ∪ ω_w ∪ ω_m with ω̃ = ω_m. We now introduce some notation; see Figure 2 again. Let R_0 = {x ∈ V : (x, 0) ∈ R} be the set of red vertices at time t = 0. These are the elements of the red cycles of π(ω). For each x ∈ R_0, the trajectory of x formed by following the vertical lines and crosses in ω̃ in the time interval [0, β] resembles a 'crooked line'. We write h̃_t(x) ∈ V for the location of this line at time t. We write h_t(x) for the corresponding location obtained using ω. We take both these functions to be right-continuous in t.
Set φ = h_β and φ̃ = h̃_β. These are permutations of R_0, and φ is precisely the restriction of π(ω) to R_0. Hence the cycles of φ are the red cycles of π. Clearly φ̃ is R-measurable. Let ξ = (ξ_xy : x, y ∈ R_0, x ≠ y) be a collection of independent rate-1 Poisson processes on [0, β], independent of everything else. We interpret the points (xy, t) ∈ ξ as transpositions of the vertices x, y as before, and let σ_t be the time-ordered product of these transpositions up to time t. Thus σ_t is a sample of the interchange process on the set R_0. We will use Lemma 2.1 to prove the following.
Lemma 2.2. Given R, the conditional distribution of (h_t(·) : t ∈ [0, β]) coincides with that of (h̃_t ∘ σ_t : t ∈ [0, β]). In particular, the conditional distribution of φ coincides with that of φ̃ ∘ σ_β. In words: the conditional distribution of the red cycles of π(ω), given the union R of the red loops, is given by the interchange process σ_β on R_0, composed with a 'twist' φ̃ (which is a function of R).

Proof of Lemma 2.2.
The key observation is that, thanks to Lemma 2.1 and the symmetry of the complete graph, ω_r has the same (conditional) distribution as the collection of points of the form

(h̃_t(x) h̃_t(y), t) for (xy, t) ∈ ξ.   (6)
For simpler notation we identify ω_r with the collection of points in (6). With this identification, we can prove the statement of the lemma with equality (not just in distribution). Some further notation is required. Let R_t = h_t(R_0) = h̃_t(R_0) be the set of red vertices at time t. Let t_1 < t_2 < · · · denote the sequence of times at which there are mixed crosses (elements of ω_m = ω̃). Also set t_0 = 0. Then R_t is constant for t_{k−1} ≤ t < t_k, for k ≥ 1. Moreover, there are a unique a_k ∈ R_{t_{k−1}} and a unique b_k ∉ R_{t_{k−1}} such that R_{t_k} = ψ_k(R_{t_{k−1}}), where ψ_k denotes the transposition of a_k and b_k. Then for t_k ≤ t < t_{k+1} we have that

h̃_t = ψ_k ∘ ψ_{k−1} ∘ · · · ∘ ψ_1.   (7)

Now, for all k ≥ 0 let t_k < s_1^{(k)} < s_2^{(k)} < · · · < s_{m_k}^{(k)} < t_{k+1} denote the times of events (transpositions) in ξ in the interval (t_k, t_{k+1}), and for 1 ≤ q ≤ m_k let τ_q^{(k)} denote the corresponding transposition. With this notation in hand, we can turn to the proof. For t = 0 we have h̃_0 = id, σ_0 = id and h_0 = id, so clearly the claim holds then. In fact, for t < t_1 we have that h̃_t = id and that h_t = σ_t, so the claim holds for such t (by Lemma 2.1). Since the functions involved only change at times t_k and s_q^{(k)}, we can proceed by induction.
Assume that the claim holds for some t > 0, i.e. assume that h_t = h̃_t ∘ σ_t. Let t′ be the time of the next 'event'. That is, either t′ = t_k for some k, or t′ = s_q^{(k)} for some k, q. It suffices to show that h_{t′} = h̃_{t′} ∘ σ_{t′} holds in both cases.
First case: t′ = t_k. Then h_{t_k} is obtained by applying ψ_k, thus

h_{t′} = ψ_k ∘ h_t = ψ_k ∘ h̃_t ∘ σ_t = h̃_{t_k} ∘ σ_{t_k}.

Here we used (7) and the fact that σ_{t_k} = σ_t. Thus h_{t′} = h̃_{t′} ∘ σ_{t′} holds in this case. Second case: t′ = s_q^{(k)}. Write (x_q^{(k)} y_q^{(k)}, s_q^{(k)}) for the corresponding point of ξ, so that τ_q^{(k)} transposes x_q^{(k)} and y_q^{(k)}. Now we have that h̃_{t′} = h̃_t, and that σ_{t′} = τ_q^{(k)} ∘ σ_t. By the identification of ω_r with (6), we obtain h_{t′} by transposing h̃_t(x_q^{(k)}) and h̃_t(y_q^{(k)}); that is, h_{t′}(z) = h̃_t(y_q^{(k)}) if h_t(z) = h̃_t(x_q^{(k)}), h_{t′}(z) = h̃_t(x_q^{(k)}) if h_t(z) = h̃_t(y_q^{(k)}), and h_{t′}(z) = h_t(z) otherwise. Using that h_t = h̃_t ∘ σ_t, we can rewrite this as h_{t′}(z) = h̃_t(τ_q^{(k)}(σ_t(z))), as required.

Large cycles
In order to use Lemma 2.2 to analyze the cycle structure of π, we will first need results on the cycle structure of φ̃ ∘ σ_t, where σ_t is given by the interchange process and φ̃ is a non-random permutation.
3.1. Random and non-random transpositions. The following result will be obtained using small modifications of Lemmas 2.1–2.3 of [11]. (A similar result can be obtained by a small modification of Theorem 1 of [3].) Here σ_t denotes a sample of the interchange process (θ = 1 in (1)) on {1, . . . , n}, viewed at time t, and φ̃ is a deterministic permutation of {1, . . . , n}. In this subsection we write P for P_1.
Proposition 3.1. Let λ > 1 and t = λ/n. For each ε > 0 there are δ > 0 and n_0(λ, ε, δ) such that for n ≥ n_0 we have

P(|C_1(φ̃ ∘ σ_t)| ≥ δn) ≥ 1 − ε.

Proof. First note that, since φ̃ ∘ σ_t and σ_t ∘ φ̃ are conjugate, it is equivalent to consider the largest cycle in the process (σ_t ∘ φ̃)_{t≥0}, which starts with the permutation φ̃ at time t = 0. We associate with φ̃ a graph G̃ whose connected components coincide (as sets) with the cycles of φ̃. One way to do this is to decompose each cycle of φ̃ as (x_1 x_2 · · · x_k) and let the edges of G̃ be the pairs {x_i, x_{i+1}} (indices taken modulo k) obtained in this way. For t ≥ 0 we let G_t be the (multi-)graph obtained by representing each new transposition that appears in the process (σ_t)_{t≥0} by an edge, and we let G̃_t be the (multi-)graph obtained by superimposing G_t on G̃. Note that G_t has the distribution of an Erdős–Rényi graph G(n, p) with p = 1 − e^{−t}. Also note that each cycle of σ_t ∘ φ̃ is contained in some connected component of G̃_t.
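The construction of G̃ can be sketched as follows (all names ours): each cycle (x_1 x_2 · · · x_k) contributes exactly the edges {x, φ̃(x)} over non-fixed points x, and a union–find check confirms that the components of G̃ are the cycles of φ̃.

```python
def cycles(perm):
    """Disjoint cycles of a permutation given as a list perm[x] = image of x."""
    seen, out = [False] * len(perm), []
    for s in range(len(perm)):
        if not seen[s]:
            cyc, x = [], s
            while not seen[x]:
                seen[x] = True
                cyc.append(x)
                x = perm[x]
            out.append(cyc)
    return out

def edges_of_G_tilde(perm):
    """Each cycle (x1 x2 ... xk) contributes the pairs {x_i, x_{i+1}},
    i.e. the pairs {x, perm[x]} over all non-fixed points x."""
    return {frozenset((x, perm[x])) for x in range(len(perm)) if perm[x] != x}

def components(n, edges):
    """Connected components of the graph on {0,...,n-1}, via union-find."""
    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a
    for e in edges:
        a, b = tuple(e)
        parent[find(a)] = find(b)
    comps = {}
    for v in range(n):
        comps.setdefault(find(v), set()).add(v)
    return sorted(map(frozenset, comps.values()), key=min)

perm = [1, 2, 0, 4, 3, 5]  # cycles (0 1 2), (3 4), (5)
ok = components(6, edges_of_G_tilde(perm)) == [frozenset(c) for c in cycles(perm)]
```

Superimposing an Erdős–Rényi sample on these edges would give a realization of G̃_t.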
We have that Lemmas 2.1-2.3 of [11] hold in this situation, with the graph G t replaced byG t . Indeed, one need only check the proof of Lemma 2.2, since Lemmas 2.1 and 2.3 do not refer to the graph. We provide an outline of the arguments.
As in Lemma 2.1 of [11], each time we apply a new transposition in σ_t, the probability that it splits an existing cycle so that at least one of the resulting cycles has size ≤ k is at most 2k/(n − 1). Let V_G^t(k) ⊆ V be the union of all components of G̃_t of size at least k, and let V_X^t(k) ⊆ V be the union of all the cycles of σ_t ∘ φ̃ of size at least k. As in Lemma 2.2 of [11] we have that

E|V_G^t(k) ∖ V_X^t(k)| ≤ 2k²nt.

(Recall that our process is sped up by a factor n(n − 1)/2 compared to [11], so the expected number of transpositions applied by time t is n(n − 1)t/2.) This is because each cycle of size < k which lies in a component of size ≥ k can be associated with a transposition which split an existing cycle so that at least one resulting cycle had size ≤ k (and this can be done so that at most two cycles get mapped to the same transposition). Here we use the fact that at time t = 0 the components are equal to the cycles.