A note on Ising random currents, Ising-FK, loop-soups and the Gaussian free field



TITUS LUPU AND WENDELIN WERNER
Abstract. We make a few elementary observations that relate directly the items mentioned in the title. In particular, we note that when one superimposes the random current model related to the Ising model with an independent Bernoulli percolation model with well-chosen weights, one obtains exactly the FK-percolation (or random cluster model) associated with the Ising model. We also point out that this relation can be interpreted via loop-soups, combining the description of the sign of a Gaussian free field on a discrete graph knowing its square (and the relation of this question with the FK-Ising model) with the loop-soup interpretation of the random current model.

A simple direct Ising-random-current/Ising-FK coupling
Let us first briefly review the definitions of the basic models (Ising, random current and FK-Ising) that we will discuss. Throughout this section, we will consider a finite connected graph G consisting of a set of vertices X and a set of non-oriented edges E. We will also use a function β = (β_e) from the set of edges into R_+.
The Ising model. The Ising model on G with edge-weights (β_e) is the probability measure on Σ := {−1, +1}^X defined by

P_β((σ_x)) = Z_β^{−1} ∏_{e∈E} exp(β_e I_σ(e)),

where I_σ(e) denotes the product σ_x σ_y where x and y are the two extremities of the edge e, and Z_β is the renormalization constant (sometimes called the partition function) chosen so that this is a probability measure.
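For such a finite graph, the measure can be computed exactly by brute-force enumeration. The following sketch does this for a triangle with uniform weight 0.5 (the graph and the weight are illustrative choices, not taken from the note):

```python
from itertools import product
from math import exp

def ising_distribution(vertices, edges, beta):
    """Return (Z_beta, {sigma: P_beta(sigma)}) by brute-force enumeration."""
    weights = {}
    for sigma in product([-1, 1], repeat=len(vertices)):
        s = dict(zip(vertices, sigma))
        w = 1.0
        for x, y in edges:
            w *= exp(beta[(x, y)] * s[x] * s[y])  # factor exp(beta_e I_sigma(e))
        weights[sigma] = w
    Z = sum(weights.values())
    return Z, {sig: w / Z for sig, w in weights.items()}

# triangle with uniform weight 0.5 (illustrative choices)
V, E = [1, 2, 3], [(1, 2), (2, 3), (1, 3)]
Z, P = ising_distribution(V, E, {e: 0.5 for e in E})
print(Z)
```

On the triangle one can check the output by hand: two configurations have all spins equal (energy 3β) and six have one spin flipped (energy −β), so Z = 2e^{3β} + 6e^{−β}.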
The random current model. The random current model is closely related to the Ising model, and has been instrumental in proving some of its important properties (see [1], [2] and the references therein). Here, one assigns to each edge e of the graph a random non-negative integer N_e. In fact, our set N of admissible configurations imposes the additional constraint that for each site x, the sum of the N_e's over all edges e adjacent to x is even (when e is an edge from x to x, then N_e is counted twice). The random current model is the probability measure on N defined by

P̂_β((n_e)) = Ẑ_β^{−1} × 2^{|X|} × ∏_{e∈E} β_e^{n_e} / (n_e)!   (1)

By expanding each exp(β_e I_σ(e)) in the definition of the Ising model into ∑_{n_e ≥ 0} (σ_x σ_y β_e)^{n_e} / (n_e)!, and resumming over all σ's (noting that all terms with odd degree in some σ_x do sum up to 0 by symmetry), one easily sees that the partition function Ẑ_β for the random current model is indeed the same as the partition function Z_β of the Ising model.
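The identity between the two partition functions can be checked numerically. The sketch below sums ∏_e β^{n_e}/n_e! over admissible currents on a triangle, truncating the infinite sum at n_e ≤ NMAX (the truncation level, the graph and the weight β = 0.7 are assumptions of this sketch), and compares 2^{|X|} times this sum with the brute-force Ising partition function:

```python
from itertools import product
from math import exp, factorial, prod

V = [1, 2, 3]
E = [(1, 2), (2, 3), (1, 3)]
beta = 0.7
NMAX = 12  # truncation of the infinite sum over currents (assumption of this sketch)

def admissible(n):
    """Every site must have an even total incident current."""
    deg = {x: 0 for x in V}
    for (x, y), ne in zip(E, n):
        deg[x] += ne
        deg[y] += ne
    return all(d % 2 == 0 for d in deg.values())

current_sum = sum(
    prod(beta ** ne / factorial(ne) for ne in n)   # prod_e beta^{n_e}/n_e!
    for n in product(range(NMAX + 1), repeat=len(E))
    if admissible(n)
)
Z_current = 2 ** len(V) * current_sum

Z_ising = sum(
    exp(beta * (s[0] * s[1] + s[1] * s[2] + s[0] * s[2]))
    for s in product([-1, 1], repeat=3)
)
print(Z_current, Z_ising)
```

On the triangle the admissible parities are "all n_e even" or "all n_e odd", so the truncated sum converges to cosh³β + sinh³β and both printed numbers agree up to the (tiny) truncation error.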
The FK-Ising model. The FK model (named after Kasteleyn and Fortuin) associated to the Ising model (it is also called the random cluster model, but we stick here to the FK-Ising percolation terminology in order not to confuse random currents with random clusters) is a probability measure on {0, 1}^E. The edge e is said to be open for the configuration w = (w_e) if w_e = 1, and closed when w_e = 0. To each configuration w, one associates the graph G_w consisting of the sites X and the set of edges that are open for w, and denotes by k(w) its number of connected components.
The FK-Ising model is defined by

P̃_β((w_e)) = Z̃_β^{−1} × 2^{k(w)} × ∏_{e∈E} (1 − exp(−2β_e))^{w_e} (exp(−2β_e))^{1−w_e}

(this is sometimes described as the FK model for q = 2 and edge-probabilities 1 − e^{−2β_e}). The FK model is useful to study the Ising model, as the connectivity properties of the FK model correspond to the correlation functions of the Ising model. Indeed, when one samples the FK model and then chooses in an i.i.d. way a sign for each of the clusters of G_w, one obtains a function (assigning a sign to each site) that follows exactly P_β (and this leads to a simple expression for the ratio between Z_β and Z̃_β); see for instance [6] for basics about FK-percolation. One can easily check that this property ("coloring the FK-clusters at random gives the Ising model") in fact characterizes the law of the FK-clusters (i.e. the information that says which sites are in the same cluster, but not necessarily the information about the state of all edges).
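The "coloring the FK-clusters at random gives the Ising model" property can be verified by exact enumeration. The following sketch does this on a triangle with uniform weight β = 0.6 (illustrative choices): it computes the FK weights 2^{k(w)} ∏ p^{w_e}(1−p)^{1−w_e}, colors each cluster with a fair coin, and compares the resulting law of signs with the Ising measure:

```python
from itertools import product
from math import exp

V = [0, 1, 2]
E = [(0, 1), (1, 2), (0, 2)]
beta = 0.6
p = 1 - exp(-2 * beta)   # FK edge-probability 1 - e^{-2 beta_e}

def clusters(open_edges):
    """Connected components of (V, open_edges), via union-find."""
    parent = {x: x for x in V}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for x, y in open_edges:
        parent[find(x)] = find(y)
    comp = {}
    for x in V:
        comp.setdefault(find(x), []).append(x)
    return list(comp.values())

spin_law = {s: 0.0 for s in product([-1, 1], repeat=len(V))}
Z_fk = 0.0
for w in product([0, 1], repeat=len(E)):
    open_edges = [e for e, we in zip(E, w) if we]
    cl = clusters(open_edges)
    weight = 2.0 ** len(cl)                      # factor 2^{k(w)}
    for we in w:
        weight *= p if we else 1 - p
    Z_fk += weight
    for signs in product([-1, 1], repeat=len(cl)):   # fair coin per cluster
        sigma = [0] * len(V)
        for comp, sg in zip(cl, signs):
            for x in comp:
                sigma[x] = sg
        spin_law[tuple(sigma)] += weight / 2.0 ** len(cl)
spin_law = {s: x / Z_fk for s, x in spin_law.items()}

Z_ising = sum(exp(beta * (s[0] * s[1] + s[1] * s[2] + s[0] * s[2]))
              for s in product([-1, 1], repeat=3))
ising_law = {s: exp(beta * (s[0] * s[1] + s[1] * s[2] + s[0] * s[2])) / Z_ising
             for s in product([-1, 1], repeat=3)}
print(max(abs(spin_law[s] - ising_law[s]) for s in spin_law))
```

The printed maximal discrepancy between the two laws is zero up to floating-point error.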
Bernoulli percolation. Bernoulli bond percolation with probabilities (p_e) is the product probability measure on {0, 1}^E under which each edge e is open with probability p_e. In the sequel, we will use the probabilities p_e := 1 − exp(−β_e).
We are now ready to state and prove the following coupling lemma:

The "Current+Bernoulli=FK" coupling lemma. Let us consider a random current configuration (N_e) with parameters (β_e), and an independent Bernoulli percolation configuration (ξ_e) with probabilities (p_e = 1 − exp(−β_e)). We then define V_e := 1 − 1_{N_e = ξ_e = 0} ∈ {0, 1} for each e, so that V_e is equal to 1 if and only if at least one of N_e or ξ_e is non-zero. Then, the law of (V_e) is exactly the FK-Ising measure on G with weights (β_e).
Note that this provides a direct coupling between the FK-Ising model and the random current model at the level of probability measures, which somehow sheds light on the identity between the partition functions.
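The lemma can also be checked by exact enumeration on a small graph. The sketch below uses a triangle with uniform weight β = 0.8 (illustrative choices) and represents the current through U_e ∈ {0, 1, 2} (N_e zero, odd, or positive even) with weights 1, sinh β, cosh β − 1, as in the proof that follows:

```python
from itertools import product
from math import exp, sinh, cosh

V = [0, 1, 2]
E = [(0, 1), (1, 2), (0, 2)]
beta = 0.8
p = 1 - exp(-beta)                               # Bernoulli parameter p_e
f = {0: 1.0, 1: sinh(beta), 2: cosh(beta) - 1}   # weights of U_e

def admissible(u):
    """Edges with odd current must form an even subgraph."""
    deg = {x: 0 for x in V}
    for (x, y), ue in zip(E, u):
        if ue == 1:
            deg[x] += 1
            deg[y] += 1
    return all(d % 2 == 0 for d in deg.values())

def k(v):
    """Number of clusters of the configuration v (union-find)."""
    parent = {x: x for x in V}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for (x, y), ve in zip(E, v):
        if ve:
            parent[find(x)] = find(y)
    return len({find(x) for x in V})

# law of V_e = 1 - 1_{N_e = xi_e = 0}
law_V = {v: 0.0 for v in product([0, 1], repeat=len(E))}
for u in product([0, 1, 2], repeat=len(E)):
    if not admissible(u):
        continue
    wu = f[u[0]] * f[u[1]] * f[u[2]]
    for xi in product([0, 1], repeat=len(E)):
        wxi = 1.0
        for xe in xi:
            wxi *= p if xe else 1 - p
        v = tuple(1 if (ue or xe) else 0 for ue, xe in zip(u, xi))
        law_V[v] += wu * wxi
total = sum(law_V.values())
law_V = {v: w / total for v, w in law_V.items()}

# FK-Ising law with edge-probabilities 1 - e^{-2 beta}
q = 1 - exp(-2 * beta)
fk = {}
for v in product([0, 1], repeat=len(E)):
    w = 2.0 ** k(v)
    for ve in v:
        w *= q if ve else 1 - q
    fk[v] = w
Z_fk = sum(fk.values())
fk = {v: w / Z_fk for v, w in fk.items()}
print(max(abs(law_V[v] - fk[v]) for v in fk))
```

The printed maximal difference between the two laws on {0,1}^E is zero up to floating-point error.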
Proof. As one can expect for such a simple statement, the proof is fairly simple as well. Let us first define U_e to be equal to 0, 1 or 2, depending on whether N_e is 0, odd, or positive even. Note that this is enough information in order to construct V, and that the law of (U_e) is the probability measure on N ∩ {0, 1, 2}^E proportional to

∏_{e∈E} f_e(u_e),

where f_e(0) = 1, f_e(1) = sinh β_e and f_e(2) = (cosh β_e) − 1 (these weights are obtained by resumming β_e^{n}/n! over the n's in each parity class).
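The three weights f_e come from resumming β^n/n! over n = 0, over odd n, and over positive even n. A quick numerical check (the truncation level and the value β = 0.9 are arbitrary choices of this sketch):

```python
from math import factorial, sinh, cosh

beta = 0.9
odd = sum(beta ** n / factorial(n) for n in range(1, 40, 2))       # n odd
even_pos = sum(beta ** n / factorial(n) for n in range(2, 40, 2))  # n >= 2 even
print(odd, sinh(beta), even_pos, cosh(beta) - 1)
```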
Let us define V_e as in the statement of the lemma. We define α_e to be 1 if N_e is even, and −1 if N_e is odd. Then, we define Ṽ_e = α_e V_e. Note that Ṽ_e = −1 if and only if U_e = 1 (the corresponding weight contribution in the probability is therefore sinh(β_e)), and that Ṽ_e = 1 if either U_e = 2, or if U_e = 0 and ξ_e = 1 (the total weight contribution in the probability is then (cosh β_e) − 1 + (1 − exp(−β_e)) = sinh(β_e) as well; the fact that these two quantities are equal is the key point in the proof). Finally, Ṽ_e = 0 exactly when U_e = 0 and ξ_e = 0, with weight contribution exp(−β_e). It therefore follows that the probability that (Ṽ_e) = (ṽ_e) for an admissible (ṽ_e) (meaning that each site must have an even number of incoming edges with negative ṽ_e) is proportional to

∏_{e : ṽ_e ≠ 0} sinh(β_e) × ∏_{e : ṽ_e = 0} exp(−β_e).

Hence, the probability that (V_e) = (v_e) is proportional to

K_v × ∏_{e : v_e = 1} sinh(β_e) × ∏_{e : v_e = 0} exp(−β_e),

where K_v denotes the number of admissible choices for ṽ that are compatible with v (meaning that |ṽ_e| = v_e). In other words, K_v is the number of ways to assign a sign to each open edge e of the configuration v, in such a way that each site has an even number of negative incoming signs. But this quantity is easily shown (see below) to be equal to 2^{o(v)+k(v)−|X|}, where o(v) is the number of open edges for v and |X| the number of sites in the graph, so that the law of (V_e) is proportional to

2^{k(v)} × ∏_{e : v_e = 1} 2 sinh(β_e) × ∏_{e : v_e = 0} exp(−β_e)

(the constant factor 2^{−|X|} is absorbed into the normalization). Since 2 sinh β_e = e^{β_e}(1 − e^{−2β_e}) and e^{−β_e} = e^{β_e} e^{−2β_e}, factoring out the constant ∏_{e∈E} e^{β_e} shows that this is proportional to

2^{k(v)} × ∏_{e∈E} (1 − exp(−2β_e))^{v_e} (exp(−2β_e))^{1−v_e},

which is indeed the same as the FK-Ising measure.
In order to see that K_v = 2^{o(v)+k(v)−|X|}, one can for instance proceed by induction, adding edges one by one to a forest-like graph (if G_v is a forest, all of its edges have to be of positive sign) and noting that for each new edge that one adds to v without joining two connected components, one gets an additional multiplicative factor 2 (this is classical; the number o(v) + k(v) − |X| is the first Betti number, also known as Kirchhoff's cyclomatic number, of the graph G_v).
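The identity K_v = 2^{o(v)+k(v)−|X|} can be verified exhaustively on a small graph; the sketch below does this for every subconfiguration v of the complete graph K4 (an illustrative choice), counting directly the sign assignments with an even number of minus signs at each vertex:

```python
from itertools import product, combinations

V = [0, 1, 2, 3]
E = [(x, y) for x, y in combinations(V, 2)]  # complete graph K4

def k(open_edges):
    """Number of connected components of (V, open_edges)."""
    parent = {x: x for x in V}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for x, y in open_edges:
        parent[find(x)] = find(y)
    return len({find(x) for x in V})

ok = True
for v in product([0, 1], repeat=len(E)):
    open_edges = [e for e, ve in zip(E, v) if ve]
    K = 0
    for signs in product([1, -1], repeat=len(open_edges)):
        deg = {x: 0 for x in V}  # number of incident minus signs per vertex
        for (x, y), s in zip(open_edges, signs):
            if s == -1:
                deg[x] += 1
                deg[y] += 1
        if all(d % 2 == 0 for d in deg.values()):
            K += 1
    ok = ok and (K == 2 ** (len(open_edges) + k(open_edges) - len(V)))
print(ok)
```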
It is worth noticing that this property of the trace of the random current measure does in fact characterize the distribution of the configuration (W_e) := (1_{N_e ≠ 0}) that describes which edges are occupied by the random current. More precisely, the distribution of (W_e) is the only one such that if one considers an independent Bernoulli percolation (ξ_e) with parameters (p_e) and looks at the collection (max(ξ_e, W_e)), one obtains exactly the FK-Ising model with parameters (β_e). Indeed, one can recover, by induction over n ≥ 0, the probability of all configurations with n occupied edges (for instance, the probability that all edges are unoccupied for W is the ratio between the probability that they are all closed for the FK model and the probability that they are all closed for the percolation, and then one can work out the probability of a configuration where just one given edge is occupied, etc.).
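This inductive recovery amounts to solving the triangular system P_FK(v) = ∑_{w ≤ v} P_W(w) ∏_{e: w_e=0, v_e=1} p_e ∏_{e: v_e=0} (1−p_e) by induction on the number of open edges. A sketch on the triangle with uniform weight β = 0.8 (illustrative choices), comparing the recovered law with the law of W computed directly from the current model:

```python
from itertools import product
from math import exp, sinh, cosh

V = [0, 1, 2]
E = [(0, 1), (1, 2), (0, 2)]
beta = 0.8
p = 1 - exp(-beta)
f = {0: 1.0, 1: sinh(beta), 2: cosh(beta) - 1}  # weights of U_e in {0,1,2}

def admissible(u):
    deg = {x: 0 for x in V}
    for (x, y), ue in zip(E, u):
        if ue == 1:
            deg[x] += 1
            deg[y] += 1
    return all(d % 2 == 0 for d in deg.values())

def k(v):
    parent = {x: x for x in V}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for (x, y), ve in zip(E, v):
        if ve:
            parent[find(x)] = find(y)
    return len({find(x) for x in V})

# direct law of the occupied-edge set W_e = 1_{N_e != 0}
law_W = {w: 0.0 for w in product([0, 1], repeat=len(E))}
for u in product([0, 1, 2], repeat=len(E)):
    if admissible(u):
        law_W[tuple(1 if ue else 0 for ue in u)] += f[u[0]] * f[u[1]] * f[u[2]]
tot = sum(law_W.values())
law_W = {w: x / tot for w, x in law_W.items()}

# FK-Ising law
q = 1 - exp(-2 * beta)
fk = {}
for v in product([0, 1], repeat=len(E)):
    x = 2.0 ** k(v)
    for ve in v:
        x *= q if ve else 1 - q
    fk[v] = x
Z = sum(fk.values())
fk = {v: x / Z for v, x in fk.items()}

def c(w, v):
    """P(max(xi, w) = v) for w <= v componentwise."""
    out = 1.0
    for we, ve in zip(w, v):
        if ve == 0:
            out *= 1 - p
        elif we == 0:
            out *= p
    return out

# induction on the number of open edges
recovered = {}
for v in sorted(fk, key=sum):
    s = sum(recovered[w] * c(w, v) for w in recovered
            if all(we <= ve for we, ve in zip(w, v)))
    recovered[v] = (fk[v] - s) / c(v, v)
print(max(abs(recovered[w] - law_W[w]) for w in law_W))
```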

Relation to loop-soup clusters
Let us now explain how the previous considerations can be embedded in the setting of the coupling between loop-soup clusters and the Gaussian free field (GFF) pointed out in [10], using the relation between random currents and loop-soups described in [16]. This will follow by combining the following observations:

• Consider a discrete GFF h on the graph G, where we view the (β_e)'s as conductances of an electric network. This is the Gaussian random vector (h_x) with density proportional to

exp(−(1/2) ∑_{e∈E} β_e |∇_e h|^2),

where |∇_e h| := |h(x) − h(y)| and x and y are the two extremities of the edge e. This GFF is in fact only defined "up to an additive constant" (i.e. it is not well-defined) because the previous quantity is invariant when one adds the same constant to all h_x's, but we can for instance artificially (and arbitrarily for what will follow, because we will then anyway condition this GFF by the value of its square) add an edge to our connected graph, joining a site x_0 ∈ X to a boundary site o, and add the condition that h(o) = 0, which amounts to multiplying the previous expression by exp(−h_{x_0}^2) and ensures that it is integrable on R^X. One can then easily make sense of the GFF conditioned by the values of its square (h(x)^2), i.e. by h(x)^2 = u(x)^2 for a given vector (u(x)) in (0, ∞)^X: The unknown random quantities are then the signs σ_x of h(x), and the conditional distribution of (σ_x) is just proportional to the corresponding Gaussian densities at (σ_x u(x)). One can note that for a given (u(x)), this density is proportional to the product over all edges e = (x, y) of exp(β^u_e σ_x σ_y), where the modified weights β^u_e are defined by β^u_e := β_e × u(x) × u(y).
In other words, the conditional distribution of these signs (σ_x) given (h(x)^2) = (u(x)^2) is exactly an Ising model with weights (β^u_e) on the graph G. For instance, in the case where one conditions h(x)^2 to be equal to 1 at each site, one gets exactly the Ising model with edge-weight function (β_e). As explained before, one way to sample this is to choose the signs by tossing independent fair coins for each of the clusters of an FK model with parameters (β_e).
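The identification of the modified weights β^u_e is a one-line expansion of the Gaussian density. A sketch of the computation, writing h_x = σ_x u(x) with u(x) = |h(x)|:

```latex
\exp\Big(-\tfrac{1}{2}\sum_{e=(x,y)} \beta_e\,\big(\sigma_x u(x)-\sigma_y u(y)\big)^2\Big)
= \exp\Big(-\tfrac{1}{2}\sum_{e=(x,y)} \beta_e\,\big(u(x)^2+u(y)^2\big)\Big)
\times \prod_{e=(x,y)} \exp\big(\beta_e\, u(x)\,u(y)\,\sigma_x\sigma_y\big),
```

since (σ_x u(x) − σ_y u(y))^2 = u(x)^2 + u(y)^2 − 2 u(x) u(y) σ_x σ_y. The first factor depends only on (u(x)), so the conditional law of the signs is indeed the Ising model with weights β^u_e = β_e u(x) u(y).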
• Let us now recall the relation between the square of the GFF and loop-soups on the graph (see [9]): The squared GFF is the cumulated occupation time of a continuous-time loop-soup defined on the discrete graph G (where one adds the boundary point o at which the process is killed). As noted in [10], these continuous-time loop-soups can also be viewed as the trace on the sites of a Brownian loop-soup defined on the cable system associated with the graph (each edge is replaced by a one-dimensional segment on which the Brownian motion can move continuously); the time spent by the discrete loops at sites corresponds to the local time spent at these sites by the corresponding Brownian loops. This provides a natural coupling between the GFF on the discrete graph, the GFF on the cable system, the continuous-time loop-soup on the discrete graph, and the continuous loop-soup on the cable system, that we will from now on always implicitly use. A key observation in [10] is that conditionally on this loop-soup (which determines the square of the GFF), the sign of the GFF is chosen to be constant on each "cluster" of the cable-system loop-soup, independently for each cluster.

• But it is also easy to make sense of the distribution of the loop-soups (on the discrete graph and on the cable system) conditioned by the square of the GFF at the sites. Indeed, as explained in [16], the conditional distribution of the numbers of jumps of the loop-soup on G along the unoriented edges of G, when conditioned by (h^2(x)) = (u(x)^2), is exactly the random current model with edge-weights (β^u_e). For instance, when one conditions h^2(x) to be equal to 1 at each site, one gets exactly the random current model on G with edge-weights (β_e).
In the loop-soup on the cable system, it is easy to see (this type of observation is already present in [10]) that on top of the excursions made by the loops between different sites (these correspond to the discrete jumps that we just described via the random current distribution), one adds an independent contribution in each edge e (corresponding to the excursions away from the two extremities of the edge that do not cross the edge, and to the loops that are totally contained in this edge). When the occupation time of the loop-soup at both extremities of e is equal to one, the conditional probability that these contributions join them into the same cluster is equal to p_e. Hence, the conditional distribution (given that h(x)^2 = 1 at all sites) of the clusters created by the loop-soup on the cable system is exactly given by the clusters defined by superimposing a random current (given by (1)) and a Bernoulli percolation on the edges. Comparing this with the previous description of the conditional distribution of σ, we conclude that the FK-clusters are indeed distributed like the clusters of the superposition of the random current with the Bernoulli percolation.
This therefore provides an alternative explanation of the relation between the Ising random current and the FK-Ising + Bernoulli percolation pointed out in the previous section. In fact, it is this interpretation of the random current in terms of loop-soups conditioned by the values of the GFF at sites that led us to realize that the relation derived in the first section should hold (and then, once one guesses that this result holds, it is actually easy to find a direct proof).
Let us note that the notions of loop-soup clusters and their relation with the GFF have nice SLE/CLE-type properties in two-dimensional continuous space via the Brownian loop-soup introduced in [8]; see [13, 11, 7] and the references therein.
To conclude, let us note that it is quite possible that some of these random current-loop-soup-FK features have been observed before (explicitly or in some slightly hidden way); the study of the Ising model has proved to be prone to recurrent rediscoveries of such simple combinatorial identities. As in [4, 16], the present considerations are reminiscent of some ideas in [3, 5, 14, 9].
In the recent work of Sabot and Tarrès [12] on vertex-reinforced jump processes and Ray-Knight theorems, one can for instance find some traces of the relation between loop-soups, fields and the Ising model. More precisely, one can interpret their "magnetized inverse VRJP" as a reconstruction of the loop-soup conditioned on its occupation field, that is to say on h^2, which also samples a random current given the edge-weights (β_e): In their setting, edge-weights evolve over time, and one first discovers the loops that go through a point x_1, then the loops that go through x_2 without visiting x_1, and so on. Loosely speaking, tracing the loops then progressively eats up the available time at each site, and the evolving edge-weights represent this remaining available time.
More precisely, let x_1, . . . , x_k be an arbitrary enumeration of the vertices of G. One defines by induction over i ≤ k the processes (β^{(i)}_e(t))_{t≥0} and (X^{(i)}(t))_{t≥0} as follows. The weights β^{(1)} start from the initial values (β_e), and for i ≥ 2, β^{(i)}(0) is the limiting value of β^{(i−1)}(t) as t → ∞; the process X^{(i)} is started at x_i. The evolution of β^{(i)}_e(t) is described by

dβ^{(i)}_e(t) = −1_{e adjacent to X^{(i)}(t)} × β^{(i)}_e(t) dt,

and the dynamics of the jump process X^{(i)} is that when it is at x at time t, it jumps to a neighbour y via the edge e with a rate expressed in terms of β^{(i)}_e(t) and of ⟨σ_x σ_y⟩^{(i)}_t := E_{(β^{(i)}_e(t))}(σ_x σ_y), which can be interpreted as the time-evolving two-point function of the Ising model associated to the time-evolving weights. It turns out that almost surely, X^{(i)}(t) = x_i for all large t. Hence, for any edge e adjacent to x_i, lim_{t→∞} β^{(i)}_e(t) = 0. In particular, lim_{t→∞} β^{(k)}_e(t) = 0 for all e. The family (N_e), where N_e denotes the total number of jumps across the edge e by the k processes X^{(i)}, is then distributed like a random current with weights (β_e). For details on this model, see [12].