Explicit formula for the density of local times of Markov Jump Processes

In this note we show a simple formula for the joint density of the local times, last-exit tree and cycling numbers of continuous-time Markov Chains on finite graphs, which involves the modified Bessel function of the first kind.


Introduction
Let $G = (V, E, \sim)$ be a nonoriented connected finite graph with no multiple edges, endowed with conductances $W_e \in \mathbb{R}_+ \setminus \{0\}$, $e \in E$. For any $i \in V$, let $W_i = \sum_{j \sim i} W_{ij}$. Consider the associated Markov Jump Process $(X_t)_{t \ge 0}$ started at $i_0 \in V$, that is, the continuous-time discrete-space random walk which jumps from a vertex $i \in V$ to a neighbor $j$ at rate $W_{ij} = W_{ji}$, i.e. with generator
$$Lf(i) = \sum_{j \in V : j \sim i} W_{ij}\,(f(j) - f(i)), \quad \text{for any } i \in V.$$
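For illustration (this sketch is not part of the original note), such a process can be sampled by drawing, at each visited vertex $i$, an exponential holding time of rate $W_i$ and then a neighbor with probability proportional to the conductances. A minimal Python sketch, assuming the conductances are encoded as a nested dict `W[i][j]` $= W_{ij}$:

```python
import random

def simulate_mjp(W, i0, sigma, rng=random):
    """Simulate the Markov jump process with jump rates W[i][j] = W_ij,
    started at i0, up to time sigma; returns the list of (jump time, state)."""
    path = [(0.0, i0)]
    t, i = 0.0, i0
    while True:
        W_i = sum(W[i].values())        # total jump rate out of i
        t += rng.expovariate(W_i)       # exponential holding time at i
        if t >= sigma:
            return path
        r = rng.uniform(0.0, W_i)       # pick neighbour j w.p. W_ij / W_i
        for j, w in W[i].items():
            r -= w
            if r <= 0.0:
                break
        i = j                           # j is the selected neighbour
        path.append((t, i))
```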
Let $\vec{E} = \{(i, j) : \{i, j\} \in E\}$ be the set of directed edges, where each undirected edge in $E$ is replaced by two directed edges with opposite orientations. For any oriented spanning tree $T$, we call its root the unique site from which no edge goes out. We denote by $\delta_i(j) = \mathbf{1}\{i = j\}$ the Kronecker delta.
Let $\mathcal{I}$ be the set of currents on the graph, i.e.
$$\mathcal{I} = \{a \in \mathbb{Z}^{\vec{E}} : a_{ij} = -a_{ji} \text{ for all } (i,j) \in \vec{E}\}.$$
For any $a \in \mathbb{Z}^{\vec{E}}$ and $i \in V$, we let $a_i = \sum_{j \sim i} a_{ij}$. If $a \in \mathcal{I}$, then $a_i$ can be interpreted as the divergence of $a$ at site $i$. For any $k \in \mathbb{N}^{\vec{E}}$, let $a(k) \in \mathcal{I}$ be defined by $a(k)_{ij} = k_{ij} - k_{ji}$. For any $a \in \mathcal{I}$ and any oriented spanning tree $T$ of $G$, let $\tilde{a}$ be defined by
$$\tilde{a}_{ij} = a_{ij} - \mathbf{1}\{(i,j) \in T\} + \mathbf{1}\{(j,i) \in T\}, \quad (i,j) \in \vec{E}.$$

Key words and phrases: Markov Jump Process, density of local times, last exit trees, cycling numbers, modified Bessel function.
For any $\sigma > 0$ and any right-continuous path $x = (x(t))_{t \ge 0}$, let us define $\ell(x, \sigma) \in \mathbb{R}_+^V$ as the vector of local times at time $\sigma$, that is,
$$\ell_i(x, \sigma) = \int_0^\sigma \mathbf{1}\{x(t) = i\}\, dt, \quad i \in V.$$
Let us also define $k(x, \sigma) = (k_{ij}(x, \sigma))_{(i,j) \in \vec{E}}$, the vector of oriented crossings up to time $\sigma$, that is,
$$k_{ij}(x, \sigma) = \#\{t \in (0, \sigma] : x(t^-) = i,\ x(t) = j\}, \quad (i,j) \in \vec{E}.$$
Let $T(x, \sigma)$ be the last-exit tree of the path $x$ on the interval $[0, \sigma]$, that is, the collection of directed edges taken by the path $x$ at its last departure from each vertex visited in that time interval, except the endpoint $x(\sigma)$. For any $\nu \in \mathbb{R}$, the modified Bessel function of the first kind is defined by
$$I_\nu(z) = \sum_{m=0}^\infty \frac{1}{m!\, \Gamma(m + \nu + 1)} \left(\frac{z}{2}\right)^{2m + \nu}.$$
Recall that $I_\nu(z) = I_{-\nu}(z)$ for integer $\nu$. Therefore, for any $a \in \mathcal{I}$, any nonoriented edge $e = \{i, j\} \in E$ and $z \in \mathbb{R}$, we may define $I_{a_e}(z) = I_{a_{ij}}(z) = I_{a_{ji}}(z)$.
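To make these definitions concrete, here is a small Python helper (an illustration added to this note; the encoding of a piecewise-constant path as a list of (jump time, state) pairs is an assumption of the sketch) extracting the local times, the oriented crossing numbers and the last-exit tree:

```python
def path_statistics(path, sigma):
    """Given path = [(t0, x0), (t1, x1), ...] (jump times and states) on
    [0, sigma], return local times ell, crossing numbers k, last-exit tree T."""
    ell, k, last_exit = {}, {}, {}
    for m, (t, i) in enumerate(path):
        t_next = path[m + 1][0] if m + 1 < len(path) else sigma
        ell[i] = ell.get(i, 0.0) + (t_next - t)    # time spent at i
        if m + 1 < len(path):
            j = path[m + 1][1]
            k[(i, j)] = k.get((i, j), 0) + 1       # crossing i -> j
            last_exit[i] = (i, j)                  # latest departure from i
    endpoint = path[-1][1]
    T = {e for i, e in last_exit.items() if i != endpoint}
    return ell, k, T
```

For instance, for the path that sits at $0$ on $[0, 0.5)$, at $1$ on $[0.5, 1.2)$, at $0$ on $[1.2, 2.0)$ and at $1$ on $[2.0, 3.0]$, it returns $\ell_0 = 1.3$, $\ell_1 = 1.7$, $k_{01} = 2$, $k_{10} = 1$ and $T = \{(0,1)\}$, the tree rooted at the endpoint $1$.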
The main result of this note is the following.
Theorem 1. Let $T$ be an oriented spanning tree of the graph with root $i_1$, and let $a \in \mathcal{I}$ be such that $a_i = \delta_{i_0}(i) - \delta_{i_1}(i)$ for all $i \in V$. Then, for any $\sigma > 0$,
$$\mathbb{P}\left(\ell(X, \sigma) \in d\ell,\ T(X, \sigma) = T,\ a(k(X, \sigma)) = a\right) = e^{-\sum_{i \in V} W_i \ell_i} \prod_{(i,j) \in T} W_{ij} \prod_{i \in V} \ell_i^{\tilde{a}_i/2} \prod_{e = \{i,j\} \in E} I_{\tilde{a}_e}\!\left(2 W_e \sqrt{\ell_i \ell_j}\right) \prod_{i \in V} d\ell_i.$$

Remark 1. It is easy to extend Theorem 1 to the case where $\sigma$ is a stopping time. In the case where $G = T$ is a (finite) tree and $X_0 = i_0$, consider the inverse local time $\sigma^{i_0}_u = \inf\{t \ge 0 : \ell_{i_0}(X, t) > u\}$. At time $\sigma^{i_0}_u$, on the event that $\ell_i(X, \sigma^{i_0}_u) > 0$ for all $i \in V$, there is only one possible last-exit tree $T(X, \sigma^{i_0}_u)$, namely $T$ itself oriented towards $i_0$, and on the other hand $a(k(X, \sigma^{i_0}_u)) = 0$. Hence in that case the formula of Theorem 1 only involves, on each edge, the Bessel function $I_1 = I_{-1}$. This is consistent with the second Ray-Knight theorem, which relates the local times of Brownian motion on $\mathbb{R}$ at time $\sigma^0_u$ to a $0$-dimensional squared Bessel process.
Remark 2. One could obtain a formula for the density of the local times alone by summing the formula obtained in Theorem 1 over all possible spanning trees and cycling numbers. For this purpose, one should note that, for any fixed oriented spanning tree $T$, any arbitrary choice of cycling numbers on $\vec{E} \setminus T$ can be uniquely extended to cycling numbers on the whole graph via a linear map.
Explicit formulas for the joint density of the local times of continuous-time Markov Chains have already been proposed; see for instance [6, 1]. Merkl, Rolles and Tarrès proposed in [7] a formula for the joint density of the oriented edge crossings, local times and last-exit tree for the Vertex-Reinforced Jump Process on a general graph, whose counterpart in the context of continuous-time Markov Chains is stated in Proposition 2.1 below.
Le Jan independently obtained in Theorem 4.1 of [5], in the context of loop soups $\mathcal{L}_1$ with intensity 1, an expression for the joint density of the cycling numbers and local times, which also involves the modified Bessel function of the first kind. We can deduce that result from the construction of those loop soups by Wilson's algorithm, in the following manner.
Let us first quickly recall that algorithm: we order the sites of our finite graph as $V = \{i_0, \ldots, i_{|V|-1}\}$, and we assume that the walk is transient, with cemetery $\Delta$. We first run a loop-erased Markov chain $\eta_1$ starting from $i_0$ and ending at $\Delta$. Then, from the next site of $V$ not in $\eta_1$, we run a loop-erased Markov chain $\eta_2$ ending in $\eta_1 \cup \{\Delta\}$, and so on. The union of all the $\eta_i$ is a spanning tree, whose leaves are the starting sites of the successive loop-erased chains.
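A minimal Python sketch of this algorithm (added for illustration; the nested-dict encoding of the conductances and the killing rate `kappa`, standing in for the transience towards the cemetery $\Delta$, are assumptions of the sketch). It uses the standard implementation in which the successor of each vertex is overwritten along the walk, which performs the loop erasure implicitly:

```python
import random

def wilson_spanning_tree(W, kappa, rng=random):
    """Sample a spanning tree rooted at the cemetery Delta by Wilson's
    algorithm for the discrete walk with weights W[i][j], killed at rate kappa."""
    DELTA = "Delta"
    in_tree = {DELTA}
    nxt = {}
    for start in sorted(W):                  # fixed ordering of the sites
        i = start
        while i not in in_tree:              # walk; successors get overwritten,
            W_i = sum(W[i].values())         # which erases loops implicitly
            r = rng.uniform(0.0, W_i + kappa)
            if r >= W_i:
                nxt[i] = DELTA               # the walk is killed at this step
            else:
                for j, w in W[i].items():
                    r -= w
                    if r <= 0.0:
                        break
                nxt[i] = j
            i = nxt[i]
        i = start                            # keep the loop-erased branch
        while i not in in_tree:
            in_tree.add(i)
            i = nxt[i]
    return nxt
```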
Given a fixed spanning tree $T$, we can easily obtain a formula similar to the one in Theorem 1 for the joint density of the succession of Markov chains started successively at the leaves of $T$, with respect to the order on sites given above, and killed at the cemetery $\Delta$. Now the loop soup extracted from that succession of Markov chains by Wilson's algorithm has the same local times at all sites (see for instance Chapter 8 of [4]), and its cycling numbers are $k = \tilde{a}$ after extraction of the spanning tree; $\tilde{a}$ satisfies $\tilde{a}_i = 0$ for all $i \in V$, so that the term $\prod_{i \in V} \ell_i^{\tilde{a}_i/2}$ in the density is equal to 1. Summing $\prod_{(i,j) \in T} W_{ij}$ over all spanning trees of $G$ yields a determinant by the matrix-tree theorem, which enables one to deduce Theorem 4.1 of [5].

Proof of Theorem 1
We first show the following Proposition 2.1. Its proof relies on an argument similar to that of Theorem 1.6 in [7]; the technique for determining the cardinality of the set of paths with a given last-exit tree is from Lemma 6 of Keane and Rolles [3].
Proposition 2.1. Let $T$ be an oriented spanning tree of the graph with root $i_1$, and let $k \in \mathbb{N}^{\vec{E}}$ be such that $a(k)_i = \delta_{i_0}(i) - \delta_{i_1}(i)$ for all $i \in V$. Then
$$(2.1)\quad \mathbb{P}\left(\ell(X, \sigma) \in d\ell,\ k(X, \sigma) = k,\ T(X, \sigma) = T\right) = e^{-\sum_{i \in V} W_i \ell_i} \prod_{(i,j) \in \vec{E}} \frac{W_{ij}^{k_{ij}}\, \ell_i^{k_{ij} - \mathbf{1}\{(i,j) \in T\}}}{(k_{ij} - \mathbf{1}\{(i,j) \in T\})!} \prod_{i \in V} d\ell_i.$$

Proof. It follows from a simple argument (similar to, but simpler than, Lemma 1 of [3]) that, for any $k \in \mathbb{N}^{\vec{E}}$ such that $a(k) = \delta_{i_0} - \delta_{i_1}$, there exists a path from $i_0$ to $i_1$ realizing the edge crossings prescribed by $k$. Consider adding to the event on the l.h.s. of (2.1) the additional requirement that $(X_t)_{0 \le t \le \sigma}$ takes a given path $\gamma = (\gamma_0 = i_0, \gamma_1, \ldots, \gamma_{n-1}, \gamma_n = i_1)$ with given jump times $0 < t_1 < t_2 < \ldots < t_n < \sigma$; the probability turns out to be independent of such choices, and is equal to
$$e^{-\sum_{i \in V} W_i \ell_i} \prod_{(i,j) \in \vec{E}} W_{ij}^{k_{ij}}.$$
The number of ways to distribute the jump times out of each vertex within its given local time contributes a multiplicative factor of
$$\prod_{i \in V} \frac{\ell_i^{k_i - \mathbf{1}\{i \ne i_1\}}}{(k_i - \mathbf{1}\{i \ne i_1\})!}\, d\ell_i,$$
where $k_i = \sum_{j \sim i} k_{ij}$ is the number of departures from $i$. Further, with the last-exit tree $T$ fixed, the number of relative orders of the exits from each vertex contributes another factor of
$$\prod_{i \in V} \frac{(k_i - \mathbf{1}\{i \ne i_1\})!}{\prod_{j \sim i} (k_{ij} - \mathbf{1}\{(i,j) \in T\})!}.$$
Multiplying the preceding three displays, and noting that $\sum_{j \sim i} \mathbf{1}\{(i,j) \in T\} = \mathbf{1}\{i \ne i_1\}$, proves the proposition.
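As a numerical sanity check of this bookkeeping (an illustration added here, on a hypothetical three-vertex example; the explicit forms of the two counting factors are assumptions of the sketch), one can verify that their product collapses to the edge-by-edge expression on the r.h.s. of (2.1); the key point is that every vertex other than the root has exactly one outgoing tree edge:

```python
from math import factorial, isclose

# Hypothetical example: triangle on V = {0, 1, 2}, i0 = 0, root i1 = 2,
# last-exit tree T = {(0, 2), (1, 2)}, and crossing numbers k whose
# divergence a(k) is +1 at 0, -1 at 2, and 0 at 1.
V = [0, 1, 2]
i1 = 2
T = {(0, 2), (1, 2)}
k = {(0, 1): 1, (1, 0): 0, (1, 2): 1, (2, 1): 0, (0, 2): 1, (2, 0): 1}
ell = {0: 0.7, 1: 1.3, 2: 0.4}

k_out = {i: sum(n for (a, b), n in k.items() if a == i) for i in V}

# assumed factor distributing the jump times within the local times
f2 = 1.0
for i in V:
    m = k_out[i] - (1 if i != i1 else 0)
    f2 *= ell[i] ** m / factorial(m)

# assumed factor counting exit orders compatible with the last-exit tree
f3 = 1.0
for i in V:
    f3 *= factorial(k_out[i] - (1 if i != i1 else 0))
for (a, b), n in k.items():
    f3 /= factorial(n - (1 if (a, b) in T else 0))

# edge-by-edge form of the same product, as on the r.h.s. of (2.1)
rhs = 1.0
for (a, b), n in k.items():
    m = n - (1 if (a, b) in T else 0)
    rhs *= ell[a] ** m / factorial(m)
```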
Let us now prove Theorem 1. Let $a \in \mathcal{I}$ be such that $a_i = \delta_{i_0}(i) - \delta_{i_1}(i)$ for all $i \in V$. For each nonoriented edge $e \in E$, let us choose the orientation $\vec{e} = (i, j)$ of $e = \{i, j\}$ such that $a_{ij} \ge 0$, and let $\vec{E}_+ = \{\vec{e} : e \in E\}$.
In order to compute the probability considered in the statement of the theorem, we need to sum the contributions from Proposition 2.1 over all $k \in \mathbb{N}^{\vec{E}}$ such that $a(k) = a$. For each $(i,j) \in \vec{E}_+$, we sum over all $k_{ji} \ge 0$, and $k_{ij}$ is then determined by $k_{ij} = k_{ji} + a_{ij} \ge k_{ji}$. Therefore, using Proposition 2.1, recalling that $\ell_i > 0$ for every $i \in V$ and joining the contributions from $(i,j)$ and $(j,i)$ for each $(i,j) \in \vec{E}_+$, we have
$$\mathbb{P}\left(\ell(X, \sigma) \in d\ell,\ T(X, \sigma) = T,\ a(k(X, \sigma)) = a\right) = e^{-\sum_{i \in V} W_i \ell_i} \prod_{(i,j) \in \vec{E}_+} \sum_{k_{ji} \ge \mathbf{1}\{(j,i) \in T\}} \frac{W_{ij}^{k_{ij}}\, \ell_i^{k_{ij} - \mathbf{1}\{(i,j) \in T\}}}{(k_{ij} - \mathbf{1}\{(i,j) \in T\})!} \cdot \frac{W_{ji}^{k_{ji}}\, \ell_j^{k_{ji} - \mathbf{1}\{(j,i) \in T\}}}{(k_{ji} - \mathbf{1}\{(j,i) \in T\})!} \prod_{i \in V} d\ell_i,$$
where we used that, if $(j,i) \in T$, the summand is $0$ when $k_{ji} = 0$, which allows starting the sum at $k_{ji} = \mathbf{1}\{(j,i) \in T\}$. Let $k'_{ji} = k_{ji} - \mathbf{1}\{(j,i) \in T\}$. Then
$$k_{ij} - \mathbf{1}\{(i,j) \in T\} = k'_{ji} + \tilde{a}_{ij}, \qquad k_{ij} + k_{ji} = 2 k'_{ji} + \tilde{a}_{ij} + \mathbf{1}\{e \in \bar{T}\},$$
where $\bar{T}$ is the nonoriented tree associated to $T$, so that the sum on each edge $e = \{i,j\}$ equals
$$\sum_{k'_{ji} \ge 0} \frac{W_e^{2 k'_{ji} + \tilde{a}_{ij} + \mathbf{1}\{e \in \bar{T}\}}\, \ell_i^{k'_{ji} + \tilde{a}_{ij}}\, \ell_j^{k'_{ji}}}{(k'_{ji} + \tilde{a}_{ij})!\, k'_{ji}!} = W_e^{\mathbf{1}\{e \in \bar{T}\}} \left(\frac{\ell_i}{\ell_j}\right)^{\tilde{a}_{ij}/2} I_{\tilde{a}_{ij}}\!\left(2 W_e \sqrt{\ell_i \ell_j}\right).$$
We conclude the proof by the observation that
$$\prod_{(i,j) \in \vec{E}_+} \left(\frac{\ell_i}{\ell_j}\right)^{\tilde{a}_{ij}/2} = \prod_{i \in V} \ell_i^{\tilde{a}_i/2}.$$
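The last summation step rests on the series expansion of the modified Bessel function at integer order, $I_n(z) = \sum_{m \ge 0} (z/2)^{2m+n} / (m!\,(m+n)!)$. As an added numerical illustration, one can check this series against the classical integral representation $I_n(z) = \frac{1}{\pi} \int_0^\pi e^{z \cos \theta} \cos(n\theta)\, d\theta$:

```python
from math import factorial, cos, exp, pi

def bessel_I_series(n, z, terms=60):
    """I_n(z) for integer n >= 0, from the defining power series."""
    return sum((z / 2.0) ** (2 * m + n) / (factorial(m) * factorial(m + n))
               for m in range(terms))

def bessel_I_integral(n, z, steps=20000):
    """I_n(z) from the integral representation, by the midpoint rule."""
    h = pi / steps
    return sum(exp(z * cos((j + 0.5) * h)) * cos(n * (j + 0.5) * h)
               for j in range(steps)) * h / pi
```

Note that the symmetry $I_n = I_{-n}$ used throughout holds precisely at integer order, since $1/\Gamma$ vanishes at nonpositive integers.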