Pruning a Lévy continuum random tree

Given a general critical or sub-critical branching mechanism, we define a pruning procedure of the associated Lévy continuum random tree. This pruning procedure is defined by adding some marks on the tree, using Lévy snake techniques. We then prove that the resulting subtree after pruning is still a Lévy continuum random tree. This last result is proved using the exploration process that codes the CRT, a special Markov property and martingale problems for exploration processes. We finally give the joint law, under the excursion measure, of the lengths of the excursions of the initial exploration process and of the pruned one.

Continuous state branching processes (CSBP) were first introduced by Jirina [25] and it is known since Lamperti [27] that these processes are the scaling limits of Galton-Watson processes. They hence model the evolution of a large population on a long time interval. The law of such a process is characterized by the so-called branching mechanism function ψ. We will be interested mainly in critical or sub-critical CSBP. In those cases, the branching mechanism ψ is given by

ψ(λ) = αλ + βλ² + ∫_{(0,+∞)} (e^{−λℓ} − 1 + λℓ) π(dℓ),   (1)

with α ≥ 0, β ≥ 0 and the Lévy measure π a positive σ-finite measure on (0, +∞) such that ∫_{(0,+∞)} (ℓ ∧ ℓ²) π(dℓ) < ∞. We shall say that the branching mechanism ψ has parameters (α, β, π). Let us recall that α represents a drift term, β is a diffusion coefficient and π describes the jumps of the CSBP.
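As a concrete illustration, the branching mechanism can be evaluated numerically. The sketch below assumes the standard form ψ(λ) = αλ + βλ² + ∫ (e^{−λℓ} − 1 + λℓ) π(dℓ) and, purely for illustration, replaces the σ-finite Lévy measure π by a finite atomic one (the function name `psi` and the dictionary encoding are ours, not the paper's):

```python
import math

def psi(lam, alpha, beta, pi_atoms):
    """Branching mechanism psi(lam) = alpha*lam + beta*lam**2
    + sum over atoms l of pi({l}) * (exp(-lam*l) - 1 + lam*l).
    `pi_atoms` maps a jump size l to its mass pi({l}); a purely
    atomic Levy measure stands in here for a general sigma-finite one."""
    jump_part = sum(w * (math.exp(-lam * l) - 1.0 + lam * l)
                    for l, w in pi_atoms.items())
    return alpha * lam + beta * lam ** 2 + jump_part
```

With π = 0 and α = 0 this reduces to the quadratic (Brownian) case ψ(λ) = βλ²; for any parameters, ψ(0) = 0 and ψ is convex increasing on [0, ∞).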
As for discrete Galton-Watson processes, we can associate with a CSBP a genealogical tree, see [30] or [22]. These trees can be considered as continuum random trees (CRT) in the sense that the branching points along a branch form a dense subset. We call the genealogical tree associated with a branching mechanism ψ the ψ-Lévy CRT (the term "Lévy" will be explained later). The prototype of such a tree is the Brownian CRT introduced by Aldous [9].
In a discrete setting, it is easy to consider and study percolation on a tree (for instance, see [11] for percolation on the branches of a Galton-Watson tree, or [6] for percolation on the nodes of a Galton-Watson tree). The goal of this paper is to introduce a general pruning procedure of a genealogical tree associated with a branching mechanism ψ of the form (1), which is the continuous analogue of the above percolation (although no formal link is made between the two). We first add some marks on the skeleton of the tree according to a Poisson measure with intensity α₁λ, where λ is the length measure on the tree (see the definition of that measure further on) and α₁ is a non-negative parameter. We next add some marks on the nodes of infinite index of the tree: with such a node s is associated a "weight", say ∆_s (see later for a formal definition), and each infinite node is then marked with probability p(∆_s), where p is a non-negative measurable function satisfying the integrability condition

∫_{(0,+∞)} ℓ p(ℓ) π(dℓ) < +∞.   (2)

We then prune the tree according to these marks and consider the law of the pruned subtree containing the root. The main result of the paper is the following theorem.

Theorem 0.1. Let ψ be a (sub-)critical branching mechanism of the form (1). We define

π₀(dℓ) = (1 − p(ℓ)) π(dℓ)  and  α₀ = α + α₁ + ∫_{(0,+∞)} ℓ p(ℓ) π(dℓ),

and set

ψ₀(λ) = α₀λ + βλ² + ∫_{(0,+∞)} π₀(dℓ) (e^{−λℓ} − 1 + λℓ),   (5)

which is again the branching mechanism of a critical or sub-critical CSBP.
Then, the pruned subtree is a Lévy CRT with branching mechanism ψ₀.
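The pruned mechanism can be checked numerically. The sketch below assumes the definitions stated in Theorem 0.1, namely α₀ = α + α₁ + ∫ ℓ p(ℓ) π(dℓ) and π₀(dℓ) = (1 − p(ℓ)) π(dℓ), together with the exponent φ₁(λ) = α₁λ + ∫ (1 − e^{−λℓ}) p(ℓ) π(dℓ) of the pruning marks; it verifies the algebraic identity ψ₀ = ψ + φ₁ on an atomic toy Lévy measure (all function names and the atomic encoding are ours):

```python
import math

def psi(lam, alpha, beta, atoms):
    # atoms: {jump size l: mass pi({l})} for a toy atomic Levy measure
    return alpha * lam + beta * lam ** 2 + sum(
        w * (math.exp(-lam * l) - 1.0 + lam * l) for l, w in atoms.items())

def pruned_mechanism(alpha, beta, atoms, alpha1, p):
    """Parameters (alpha0, beta, pi0) of Theorem 0.1:
    alpha0 = alpha + alpha1 + int l p(l) pi(dl),  pi0 = (1 - p) pi."""
    alpha0 = alpha + alpha1 + sum(l * p(l) * w for l, w in atoms.items())
    atoms0 = {l: (1.0 - p(l)) * w for l, w in atoms.items()}
    return alpha0, beta, atoms0

def phi1(lam, alpha1, atoms, p):
    """Assumed exponent of the marks: alpha1*lam + int (1-exp(-lam*l)) p(l) pi(dl)."""
    return alpha1 * lam + sum(
        p(l) * w * (1.0 - math.exp(-lam * l)) for l, w in atoms.items())
```

Expanding the definitions shows that ψ₀(λ) − ψ(λ) − φ₁(λ) vanishes term by term, which is what the test below confirms up to rounding.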
In order to make the previous statement more rigorous, we must first describe more precisely the geometric structure of a continuum random tree and define the so-called exploration process that codes the CRT in the next subsection. In a second subsection, we describe the pruning procedure and state rigorously the main results of the paper. Eventually, we give some biological motivations for studying the pruning procedure and other applications of this work.

The Lévy CRT and its coding by the exploration process
We first give the definition of a real tree, see e.g. [24] or [28].
Definition 0.2. A metric space (𝒯, d) is a real tree if the following two properties hold for every v₁, v₂ ∈ 𝒯.
(i) There is a unique isometric map f_{v₁,v₂} from [0, d(v₁, v₂)] into 𝒯 such that f_{v₁,v₂}(0) = v₁ and f_{v₁,v₂}(d(v₁, v₂)) = v₂.
(ii) If q is a continuous injective map from [0, 1] into 𝒯 such that q(0) = v₁ and q(1) = v₂, then we have q([0, 1]) = f_{v₁,v₂}([0, d(v₁, v₂)]).
A rooted real tree is a real tree (𝒯, d) with a distinguished vertex v_∅ called the root.
Let (𝒯, d) be a rooted real tree. The range of the mapping f_{v₁,v₂} is denoted by [[v₁, v₂]] (this is the line between v₁ and v₂ in the tree). In particular, for every vertex v ∈ 𝒯, [[v_∅, v]] is the path going from the root to v, which we call the ancestral line of vertex v. More generally, we say that a vertex v is an ancestor of a vertex v′ if v ∈ [[v_∅, v′]]. If v, v′ ∈ 𝒯, there is a unique a ∈ 𝒯 such that [[v_∅, v]] ∩ [[v_∅, v′]] = [[v_∅, a]]. We call a the most recent common ancestor of v and v′. By definition, the degree of a vertex v ∈ 𝒯 is the number of connected components of 𝒯 \ {v}. A vertex v is called a leaf if it has degree 1. Finally, we denote by λ the one-dimensional Hausdorff measure on 𝒯.
The coding of a compact real tree by a continuous function is now well known and is a key tool for defining random real trees. We consider a continuous function g : [0, +∞) −→ [0, +∞) with compact support and such that g(0) = 0. We also assume that g is not identically 0. For every 0 ≤ s ≤ t, we set m g (s, t) = inf u∈ [s,t] g(u), and d g (s, t) = g(s) + g(t) − 2m g (s, t).
We then introduce the equivalence relation s ∼ t if and only if d_g(s, t) = 0. Let 𝒯_g be the quotient space [0, +∞)/∼. It is easy to check that d_g induces a distance on 𝒯_g. Moreover, (𝒯_g, d_g) is a compact real tree (see [21], Theorem 2.1). The function g is the so-called height process of the tree 𝒯_g. This construction can be extended to more general measurable functions. In order to define a random tree, instead of taking a tree-valued random variable, it suffices to take a stochastic process for g. For instance, when g is a normalized Brownian excursion, the associated real tree is Aldous' CRT [10].
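The pseudo-distance d_g(s, t) = g(s) + g(t) − 2 m_g(s, t) above is easy to experiment with on a sampled excursion. A minimal sketch (the function name `d_g` and the list encoding of g are ours; indices play the role of times):

```python
def d_g(g, s, t):
    """Tree pseudo-distance coded by a sampled excursion g (a list of
    heights): d_g(s, t) = g[s] + g[t] - 2 * min of g between s and t."""
    lo, hi = min(s, t), max(s, t)
    return g[lo] + g[hi] - 2 * min(g[lo:hi + 1])
```

Two times at the same height whose intermediate minimum equals that height are identified (d_g = 0), i.e. they code the same vertex of the quotient tree.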
The construction of a height process that codes a tree associated with a general branching mechanism is due to Le Gall and Le Jan [30]. Let ψ be a branching mechanism given by (1) and let X be a Lévy process with Laplace exponent ψ: 𝔼[e^{−λX_t}] = e^{tψ(λ)} for all λ ≥ 0. Following [30], we also assume that X is of infinite variation a.s., which implies that β > 0 or ∫_{(0,1)} ℓ π(dℓ) = ∞. Notice that these conditions are satisfied in the stable case: ψ(λ) = λ^c, c ∈ (1, 2] (the quadratic case ψ(λ) = λ² corresponds to the Brownian case).
We then set

H_t = lim inf_{ε→0} (1/ε) ∫_0^t 1_{{X_s < I_s^t + ε}} ds,   (6)

where, for 0 ≤ s ≤ t, I_s^t = inf_{s≤r≤t} X_r. If the additional assumption

∫^{+∞} dλ/ψ(λ) < +∞   (7)

holds, then the process H admits a continuous version. In this case, we can consider the real tree associated with an excursion of the process H, and we say that this real tree is the Lévy CRT associated with ψ. If we denote by L_t^a(H) the local time of the process H at level a and time t, and set T_x = inf{t ≥ 0, L_t^0(H) = x}, then the process (L_{T_x}^a(H), a ≥ 0) is a CSBP starting from x with branching mechanism ψ, and the tree with height process H can be viewed as the genealogical tree of this CSBP. Let us remark that the latter property also holds for a discontinuous H (i.e. if (7) does not hold), and we still say that H describes the genealogy of the CSBP associated with ψ.
In general, the process H is not a Markov process. So we introduce the so-called exploration process ρ = (ρ_t, t ≥ 0), which is a measure-valued process defined by

ρ_t(dr) = β 1_{[0,H_t]}(r) dr + Σ_{0<s≤t, X_{s−}<I_s^t} (I_s^t − X_{s−}) δ_{H_s}(dr).

The height process can easily be recovered from the exploration process as H_t = H(ρ_t), where H(µ) denotes the supremum of the closed support of the measure µ (with the convention H(0) = 0). If we endow the set ℳ_f(ℝ₊) of finite measures on ℝ₊ with the topology of weak convergence, then the exploration process ρ is a càd-làg strong Markov process in ℳ_f(ℝ₊) (see [22], Proposition 1.2.3).
To understand the meaning of the exploration process, let us use the queuing system representation of [30] when β = 0. We consider a preemptive LIFO (Last In, First Out) queue with one server. A jump of X at time s corresponds to the arrival of a new customer requiring a service equal to ∆_s := X_s − X_{s−}. The server interrupts his current job and starts immediately the service of this new customer (preemptive LIFO procedure). When this new service is finished, the server resumes the previous job. When π is infinite, all services suffer interruptions. The customer arrived at time s will still be in the system at time t > s if and only if X_{s−} < inf_{s≤r≤t} X_r and, in this case, the quantity ρ_t({H_s}) represents the remaining service required by the customer s at time t. Observe that ρ_t([0, H_t]) corresponds to the load of the server at time t and is equal to X_t − I_t, where I_t = inf_{0≤s≤t} X_s. In view of the Markov property of ρ and the Poisson representation of Lemma 1.6, we can view ρ_t as a measure placed on the ancestral line of the individual labeled by t, which gives the intensity of the sub-trees that are grafted "on the right" of this ancestral line. The continuous part of the measure ρ_t gives binary branching points (i.e. vertices of degree 3 in the tree), which are dense along that ancestral line since the excursion measure that appears in Lemma 1.6 is an infinite measure, whereas the atomic part of the measure ρ_t gives nodes of infinite degree for the same reason.
Consequently, the nodes of the tree coded by H are of two types: nodes of degree 3 and nodes of infinite degree. Moreover, we see that each node of infinite degree corresponds to a jump of the Lévy process X, and so we associate with such a node a "weight" given by the height of the corresponding jump of X (this will be formally stated in Section 1.4). From now on, we will only handle the exploration process, although we will often use vocabulary taken from the real tree (coded by this exploration process). In particular, the theorems will be stated in terms of the exploration process and also hold when H is not continuous.
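The LIFO queue picture above can be sketched with a finite-activity toy model (contrary to the paper's infinite-variation setting, which is why this is only an illustration): customers arrive with service requirements, the server drains the most recent unfinished customer at unit rate, and the stack of remaining services plays the role of the measure ρ_t. All names here are ours:

```python
def lifo_stack(arrivals, t):
    """State of a preemptive-LIFO queue at time t.  `arrivals` is a
    time-sorted list of (arrival_time, required_service) pairs; the
    server works at unit rate on the most recently arrived unfinished
    customer.  Returns remaining services, oldest customer first."""
    events = [(a, s) for (a, s) in arrivals if a <= t] + [(t, None)]
    stack, clock = [], 0.0
    for a, srv in events:
        budget = a - clock          # work performed during [clock, a]
        while budget > 0 and stack:
            if stack[-1] > budget:
                stack[-1] -= budget   # partially serve the top customer
                budget = 0.0
            else:
                budget -= stack.pop()  # finish the top customer
        clock = a
        if srv is not None:
            stack.append(srv)       # new arrival preempts the server
    return stack
```

The total of the stack is the server's load, the analogue of ρ_t([0, H_t]) = X_t − I_t.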

The pruned exploration process
We now consider the Lévy CRT associated with a general critical or sub-critical branching mechanism ψ (or rather the exploration process that codes that tree) and we add marks on the tree. There will be two kinds of marks: some marks will be set only on the nodes of infinite degree, whereas the others will be 'uniformly distributed' on the skeleton of the tree.

Marks on the nodes
Let p : [0, +∞) −→ [0, 1] be a measurable function satisfying condition (2). Recall that each node of infinite degree of the tree is associated with a jump ∆ s of the process X . Conditionally on X , we mark such a node with probability p(∆ s ), independently of the other nodes.

Marks on the skeleton
Let α₁ be a non-negative constant. The marks associated with this parameter will be distributed on the skeleton of the tree according to a Poisson point measure with intensity α₁ λ(dr) (recall that λ denotes the one-dimensional Hausdorff measure on the tree).

The marked exploration process
As we do not use the real-trees framework but only the exploration processes that code the Lévy CRTs, we must describe all these marks in terms of exploration processes. Therefore, we define a measure-valued process 𝒮 := ((ρ_t, m_t^nod, m_t^ske), t ≥ 0), called the marked exploration process, where the process ρ is the usual exploration process, whereas the processes m^nod and m^ske keep track of the marks, respectively on the nodes and on the skeleton of the tree.
The measure m_t^nod is just the sum of Dirac measures at the marked nodes (up to some weights, for technical reasons) which are ancestors of t.
To define the measure m ske t , we first consider a Lévy snake (ρ t , W t ) t≥0 with spatial motion W a Poisson process of parameter α 1 (see [22], Chapter 4 for the definition of a Lévy snake). We then define the measure m ske t as the derivative of the function W t . Let us remark that in [22], the height process is supposed to be continuous for the construction of Lévy snakes. We explain in the appendix how to remove this technical assumption.

Main result
We denote by A_t the Lebesgue measure of the set of individuals prior to t whose lineage does not contain any mark, i.e.

A_t = ∫_0^t 1_{{m_s^nod = 0, m_s^ske = 0}} ds.
We consider its right-continuous inverse C_t := inf{r ≥ 0, A_r > t} and we define the pruned exploration process ρ̃ by ρ̃_t = ρ_{C_t}. In other words, we remove from the CRT all the individuals who have a marked ancestor, and the exploration process ρ̃ codes the remaining tree.
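A discrete toy version of this time change is straightforward: A counts the unmarked steps of a step-indexed traversal, and its right-continuous inverse C lists the steps kept by the pruned process. The names `pruned_indices` and `prune` are ours:

```python
def pruned_indices(unmarked):
    """Discrete analogue of the time change: with A_r the number of
    unmarked steps before r, C[t] = inf{r : A_r > t} enumerates the
    steps whose lineage carries no mark."""
    return [r for r, ok in enumerate(unmarked) if ok]

def prune(heights, unmarked):
    """Height process of the pruned tree: H~_t = H_{C_t}."""
    return [heights[r] for r in pruned_indices(unmarked)]
```

Composing the height process with C simply skips the marked portions of the traversal, which is the content of ρ̃_t = ρ_{C_t}.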
We can now restate Theorem 0.1 rigorously in terms of exploration processes.
Theorem 0.3. The pruned exploration process ρ̃ is distributed as the exploration process associated with a Lévy process with Laplace exponent ψ₀.
The proof relies on a martingale problem for ρ̃ and a special Markov property, Theorem 3.2. Roughly speaking, the special Markov property gives the conditional distribution of the individuals with marked ancestors with respect to the tree of individuals with no marked ancestors. This result is of independent interest. Notice that the proof of this result in the general setting is surprisingly much more involved than in the two previously known particular cases: the quadratic case (see Proposition 6 in [7] or Proposition 7 in [16]) and the case without quadratic term (see Theorem 3.12 in [2]).
Finally, we give the joint law of the length of the excursion of the exploration process and the length of the excursion of the pruned exploration process, see Proposition 5.1.

Motivations and applications
A first approach to this construction is to consider the CSBP Y⁰ associated with the pruned exploration process ρ̃ as an initial Eve-population which undergoes some neutral mutations (the marks on the genealogical tree), the CSBP Y associated with the exploration process ρ being the total population (the Eve-population and the mutants). We see from our construction that when the population Y⁰ jumps, so does the population Y. By these remarks, we can see that our pruning procedure is quite general. Let us however remark that the diffusion coefficient β is the same for ψ and ψ₀, which might imply that more general prunings exist (in particular, we would like to remove some of the vertices of degree 3).
As we consider a general critical or sub-critical branching mechanism, this work extends previous work by Abraham and Serlet [7] on the Brownian CRT (π = 0) and by Abraham and Delmas [2] on the CRT without Brownian part (β = 0). See also Bertoin [14] for an approach using Galton-Watson trees and p = 0, or [4] for an approach using CSBP with immigration. Let us remark that this paper follows the same general ideas as [2] (the theorems and the intermediate lemmas are the same), but the proofs of each of them are more involved and use quite different techniques based on martingale problems.
This work also has other applications. Our method in fact separates the genealogical tree associated with Y into several components. For some values of the parameters of the pruning procedure, we can construct, via our pruning procedure, a fragmentation process as defined by Bertoin [13] but which is not self-similar, see for instance [7; 2; 31]. On the other hand, we can view our method as a way of increasing the size of a tree, starting from the CRT associated with ψ₀ to get the CRT associated with ψ. We can even construct a tree-valued process which makes the tree grow, starting from a trivial tree containing only the root up to infinite super-critical trees, see [5].

Organization of the paper
We first recall in the next section the construction of the exploration process, how it codes a CRT, and its main properties that we shall use. We also define the marked exploration process that is used for pruning the tree. In Section 2, we define rigorously the pruned exploration process ρ̃ and restate precisely Theorem 0.3. The rest of the paper is devoted to the proof of that theorem. In Section 3, we state and prove a special Markov property of the marked exploration process, which gives the law of the exploration process "above" the marks, conditionally on ρ̃. We use this special property in Section 4 to derive, from the martingale problem satisfied by ρ, introduced in [1] when β = 0, a martingale problem for ρ̃ which allows us to obtain the law of ρ̃. Finally, we compute in the last section, under the excursion measure, the joint law of the lengths of the excursions of ρ and ρ̃. The Appendix is devoted to an extension of the Lévy snake to the case where the height process is not continuous.

The exploration process: notations and properties
We recall here the construction of the height process and the exploration process that codes a Lévy continuum random tree. These objects have been introduced in [30; 29] and developed later in [22]. The results of this section are mainly extracted from [22], except for Section 1.4.
We denote by ℝ₊ the set of non-negative real numbers. Let ℳ(ℝ₊) (resp. ℳ_f(ℝ₊)) be the set of σ-finite (resp. finite) measures on ℝ₊, endowed with the topology of vague (resp. weak) convergence. If E is a Polish space, let ℬ(E) (resp. ℬ₊(E)) be the set of real-valued measurable (resp. measurable and non-negative) functions defined on E endowed with its Borel σ-field. For any measure µ ∈ ℳ(ℝ₊) and f ∈ ℬ₊(ℝ₊), we write

⟨µ, f⟩ = ∫ f(x) µ(dx).

The underlying Lévy process
We consider an ℝ-valued Lévy process X = (X_t, t ≥ 0) starting from 0. We assume that X is the canonical process on the Skorohod space 𝔻(ℝ₊, ℝ) of càd-làg real-valued paths, endowed with the canonical filtration. The law of the process X starting from 0 will be denoted by ℙ and the corresponding expectation by 𝔼. Most of the following facts on Lévy processes can be found in [12].
In this paper, we assume that X
• has no negative jumps,
• has first moments,
• is of infinite variation,
• does not drift to +∞.
The law of X is characterized by its Laplace transform: 𝔼[e^{−λX_t}] = e^{tψ(λ)}, λ ≥ 0, where, as X does not drift to +∞, its Laplace exponent ψ can be written in the form (1): the Lévy measure π is a Radon measure on (0, +∞) (positive jumps) that satisfies the integrability condition ∫_{(0,+∞)} (ℓ ∧ ℓ²) π(dℓ) < +∞ (X has first moments), the drift coefficient α is non-negative (X does not drift to +∞) and β ≥ 0. As we ask X to be of infinite variation, we must additionally suppose that β > 0 or ∫_{(0,1)} ℓ π(dℓ) = +∞.
As X is of infinite variation, we have (see Corollary VII.5 in [12])

lim_{λ→+∞} λ/ψ(λ) = 0.

Let I = (I_t, t ≥ 0) be the infimum process of X, I_t = inf_{0≤s≤t} X_s, and let S = (S_t, t ≥ 0) be the supremum process, S_t = sup_{0≤s≤t} X_s. We will also consider, for every 0 ≤ s ≤ t, the infimum of X over [s, t]:

I_s^t = inf_{s≤r≤t} X_r.

We denote by 𝒥 the set of jumping times of X:

𝒥 = {s > 0, ∆_s > 0},   (10)

where, for t ≥ 0, ∆_t := X_t − X_{t−} is the height of the jump of X at time t. Of course, ∆_t > 0 ⟺ t ∈ 𝒥.
The point 0 is regular for the Markov process X − I, and −I is the local time of X − I at 0 (see [12], Chapter VII). Let ℕ be the associated excursion measure of the process X − I away from 0, and let σ = inf{t > 0; X_t − I_t = 0} be the length of the excursion of X − I under ℕ. We will assume that, under ℕ, X_0 = I_0 = 0.
Since X is of infinite variation, 0 is also regular for the Markov process S − X. The local time L = (L_t, t ≥ 0) of S − X at 0 will be normalized so that

𝔼[e^{−λ S_{L_t^{−1}}}] = e^{−t ψ(λ)/λ},

where L_t^{−1} = inf{s ≥ 0; L_s ≥ t} (see also [12], Theorem VII.4 (ii)).

The height process
We now define the height process H associated with the Lévy process X. Following [22], we give an alternative definition of H instead of the one given by formula (6) in the introduction.
For each t ≥ 0, we consider the process reversed at time t, X̂^{(t)} = (X̂_s^{(t)}, 0 ≤ s ≤ t), defined by

X̂_s^{(t)} := X_t − X_{(t−s)−},

with the convention X_{0−} = X_0. The two processes (X̂_s^{(t)}, 0 ≤ s ≤ t) and (X_s, 0 ≤ s ≤ t) have the same law. Let Ŝ^{(t)} be the supremum process of X̂^{(t)} and let L̂^{(t)} be the local time at 0 of Ŝ^{(t)} − X̂^{(t)}, with the same normalization as L. We then set H_t := L̂_t^{(t)}. This definition gives a modification of the process defined by (6) (see [22], Lemma 1.1.3). In general, H takes its values in [0, +∞], but we have, ℙ-a.s. for every t ≥ 0, H_s < ∞ for every s < t such that X_{s−} ≤ I_s^t, and H_t < +∞ if ∆_t > 0 (see [22], Lemma 1.2.1). The process H does not admit a continuous (or even càd-làg) version in general, but it has continuous sample paths ℙ-a.s. iff (7) is satisfied, see [22], Theorem 1.4.3.
To end this section, let us remark that the height process is also well-defined under the excursion measure ℕ, and all the previous results remain valid under ℕ.

The exploration process
The height process is not Markov in general. But it is a very simple function of a measure-valued Markov process, the so-called exploration process.
The exploration process ρ = (ρ_t, t ≥ 0) is an ℳ_f(ℝ₊)-valued process defined as follows: for every t ≥ 0,

ρ_t(dr) = β 1_{[0,H_t]}(r) dr + Σ_{0<s≤t, X_{s−}<I_s^t} (I_s^t − X_{s−}) δ_{H_s}(dr).

In particular, the total mass of ρ_t is ⟨ρ_t, 1⟩ = X_t − I_t.
For µ ∈ ℳ(ℝ₊), we set H(µ) = sup Supp µ, where Supp µ is the closed support of µ, with the convention H(0) = 0. We have:
• ℙ-a.s. H(ρ_t) = H_t for every t ≥ 0,
• ρ_t = 0 if and only if H_t = 0.
In the definition of the exploration process, as X starts from 0, we have ρ_0 = 0 a.s. To state the Markov property of ρ, we must first define the process ρ started at any initial measure µ ∈ ℳ_f(ℝ₊).
For a ∈ [0, ⟨µ, 1⟩], we define the erased measure k_aµ by

k_aµ([0, r]) = µ([0, r]) ∧ (⟨µ, 1⟩ − a), for r ≥ 0.

If a > ⟨µ, 1⟩, we set k_aµ = 0. In other words, the measure k_aµ is the measure µ erased by a mass a backward from H(µ).
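For an atomic µ, the erasure operation is concrete: remove a total mass a starting from the atom closest to H(µ) and working downward. A small sketch (the list-of-pairs encoding and the name `erase` are ours):

```python
def erase(mu, a):
    """k_a(mu) for an atomic measure mu, given as a list of
    (height, mass) pairs sorted by height: remove total mass `a`
    backward from the top of the support H(mu); k_a(mu) = 0 when
    a >= <mu, 1>."""
    if a >= sum(w for _, w in mu):
        return []
    out = [list(atom) for atom in mu]
    while a > 0:
        h, w = out[-1]            # topmost remaining atom
        if w > a:
            out[-1][1] = w - a    # partially erase it
            a = 0
        else:
            a -= w                # erase it entirely and continue down
            out.pop()
    return [tuple(atom) for atom in out]
```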
For ν, µ ∈ ℳ_f(ℝ₊), with µ of compact support, we define the concatenation [µ, ν] ∈ ℳ_f(ℝ₊) of the two measures by:

⟨[µ, ν], f⟩ = ⟨µ, f⟩ + ⟨ν, f(H(µ) + ·)⟩, f ∈ ℬ₊(ℝ₊).

Finally, we set, for every µ ∈ ℳ_f(ℝ₊) and every t > 0,

ρ_t^µ = [k_{−I_t}µ, ρ_t].

We say that (ρ_t^µ, t ≥ 0) is the process ρ started at ρ_0^µ = µ, and write ℙ_µ for its law. Unless there is an ambiguity, we shall write ρ_t for ρ_t^µ.
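For atomic measures the concatenation amounts to shifting the support of ν upward by H(µ) and appending, matching ⟨[µ, ν], f⟩ = ⟨µ, f⟩ + ⟨ν, f(H(µ) + ·)⟩. A sketch in the same ad-hoc encoding as above (the name `concat` is ours):

```python
def concat(mu, nu):
    """[mu, nu] for atomic measures as (height, mass) lists:
    graft nu on top of mu by shifting its atoms by H(mu)."""
    H = max((h for h, _ in mu), default=0.0)   # H(mu), with H(0) = 0
    return mu + [(H + h, w) for h, w in nu]
```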

Proposition 1.3. ([22], Proposition 1.2.3) For any initial finite measure µ ∈ ℳ_f(ℝ₊), the process (ρ_t^µ, t ≥ 0) is a càd-làg strong Markov process in ℳ_f(ℝ₊).
Remark 1.4. From the construction of ρ, we get that a.s. ρ_t = 0 if and only if −I_t ≥ ⟨ρ_0, 1⟩ and X_t − I_t = 0. This implies that 0 is also a regular point for ρ. Notice that ℕ is also the excursion measure of the process ρ away from 0, and that σ, the length of the excursion, is ℕ-a.e. equal to inf{t > 0; ρ_t = 0}.
Exponential formula for the Poisson point process of jumps of the inverse subordinator of −I gives (see also the beginning of Section 3.2.2 in [22]) that for λ > 0,

ℕ[1 − e^{−λσ}] = ψ^{−1}(λ).

The marked exploration process
As presented in the introduction, we add random marks on the Lévy CRT coded by ρ. There will be two kinds of marks: marks on the nodes of infinite degree and marks on the skeleton.

Marks on the skeleton
Let α₁ ≥ 0. We want to construct a "Lévy Poisson snake" (i.e. a Lévy snake whose spatial motion is a Poisson process), whose jumps give the marks on the branches of the CRT. More precisely, we denote by 𝒲 the space of killed càd-làg paths w : [0, ζ) → ℝ, where ζ ∈ (0, +∞) is called the lifetime of the path w. We equip 𝒲 with a distance d (defined in [22], Chapter 4, and whose expression is not important for our purpose) such that (𝒲, d) is a Polish space.
By Proposition 4.4.1 of [22] when H is continuous, or by the results of the appendix in the general case, there exists a probability measure ℙ̃ on Ω̃ = 𝔻(ℝ₊, ℳ_f(ℝ₊) × 𝒲) under which the canonical process (ρ_s, W_s) satisfies:
1. the process ρ is the exploration process starting at 0 associated with the branching mechanism ψ;
2. for every s ≥ 0, the path W_s is distributed as a Poisson process with intensity α₁ stopped at time H_s := H(ρ_s);
3. the process (ρ, W) satisfies the so-called snake property: for every s < s′, conditionally given ρ, the paths W_s(·) and W_{s′}(·) coincide up to time H_{s,s′} := inf{H_u, s ≤ u ≤ s′} and then are independent.
So, for every t ≥ 0, the path W t is a.s. càd-làg with jumps equal to one. Its derivative m ske t is an atomic measure on [0, H t ); it gives the marks (on the skeleton) on the ancestral line of the individual labeled t.
We shall denote by ℕ̃ the corresponding excursion measure away from (0, 0).

Marks on the nodes
Let p be a measurable function defined on ℝ₊, taking values in [0, 1], such that

∫_{(0,+∞)} ℓ p(ℓ) π(dℓ) < +∞.   (16)

We define the measures π₁ and π₀ by their densities with respect to π:

π₁(dℓ) = p(ℓ) π(dℓ)  and  π₀(dℓ) = (1 − p(ℓ)) π(dℓ).

Let (Ω′, 𝒜′, P′) be a probability space with no atom. Recall that 𝒥, defined by (10), denotes the set of jumping times of the Lévy process X and that ∆_s represents the height of the jump of X at time s ∈ 𝒥. As 𝒥 is countable, we can construct on the product space Ω̃ × Ω′ (with the product probability measure ℙ̃ ⊗ P′) a family (U_s, s ∈ 𝒥) of random variables which are, conditionally on X, independent, uniformly distributed over [0, 1] and independent of (∆_s, s ∈ 𝒥) and (W_s, s ≥ 0). We set, for every s ∈ 𝒥,

V_s = 1_{{U_s ≤ p(∆_s)}},

so that, conditionally on X, the random variables (V_s, s ∈ 𝒥) are independent Bernoulli variables with respective parameters p(∆_s).
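The marking mechanism V_s = 1{U_s ≤ p(∆_s)} is a one-liner to simulate. A sketch (the dictionary encoding of the jump set and the name `mark_jumps` are ours):

```python
import random

def mark_jumps(deltas, p, rng):
    """Draw the marks V_s = 1{U_s <= p(Delta_s)}: conditionally on the
    jump heights `deltas` (a dict time -> Delta_s), the V_s are
    independent Bernoulli variables with parameters p(Delta_s)."""
    return {s: rng.random() <= p(d) for s, d in deltas.items()}
```

With p ≡ 1 every node of infinite degree is marked, with p ≡ 0 none is; intermediate p interpolates, with larger jumps marked more often whenever p is non-decreasing.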
We set 𝒥₁ = {s ∈ 𝒥, V_s = 1}, the set of marked jumps, and 𝒥₀ = 𝒥 \ 𝒥₁ = {s ∈ 𝒥, V_s = 0}, the set of non-marked jumps. For t ≥ 0, we consider the measure on ℝ₊

m_t^nod(dr) = Σ_{0<s≤t, s∈𝒥₁, X_{s−}<I_s^t} (I_s^t − X_{s−}) δ_{H_s}(dr).

The atoms of m_t^nod give the marked nodes of the exploration process at time t.
The definition of the measure-valued process m^nod also holds under ℕ̃ ⊗ P′. For convenience, we shall write ℙ for ℙ̃ ⊗ P′ and ℕ for ℕ̃ ⊗ P′.

Decomposition of X
At this stage, we can introduce a decomposition of the process X. Thanks to the integrability condition (16) on p, we can define the process X^{(1)} by, for every t ≥ 0,

X_t^{(1)} = α₁ t + Σ_{s∈𝒥₁, s≤t} ∆_s.

The process X^{(1)} is a subordinator with Laplace exponent φ₁ given by

φ₁(λ) = α₁λ + ∫_{(0,+∞)} (1 − e^{−λℓ}) π₁(dℓ).

We then set X^{(0)} = X − X^{(1)}, which is a Lévy process with Laplace exponent ψ₀, independent of the process X^{(1)} by standard properties of Poisson point measures.
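The jump part of this decomposition can be sketched directly: given the jump heights and the marks of the previous subsection, the marked jumps accumulate into (the jump part of) X^{(1)} and the rest into X^{(0)}. The names below are ours:

```python
def split_jumps(jumps, marks):
    """Split the jumps of X into the marked sum (the jump part of
    X^(1)) and the unmarked sum (the jump part of X^(0)).  `jumps`
    maps times s to Delta_s and `marks` maps the same times to V_s."""
    marked = sum(d for s, d in jumps.items() if marks[s])
    unmarked = sum(d for s, d in jumps.items() if not marks[s])
    return unmarked, marked
```

By construction the two sums recombine into the total jump contribution of X, mirroring X = X^{(0)} + X^{(1)}.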

The marked exploration process
We consider the process 𝒮 = ((ρ_t, m_t^nod, m_t^ske), t ≥ 0) on the product probability space Ω̃ × Ω′ under the probability ℙ and call it the marked exploration process. Let us remark that, as the process is defined under the probability ℙ, we have ρ_0 = 0, m_0^nod = 0 and m_0^ske = 0 a.s. Let us first define the state space of the marked exploration process. We consider the set 𝐒 of triplets (µ, Π₁, Π₂) such that
• µ ∈ ℳ_f(ℝ₊),
• Π₁ is a finite measure on ℝ₊ absolutely continuous with respect to µ,
• Π₂ is an atomic measure on [0, H(µ)).
We endow 𝐒 with a distance d′ built from a distance D on ℳ_f(ℝ₊) that defines the topology of weak convergence and such that the metric space (ℳ_f(ℝ₊), D) is complete.
To get the Markov property of the marked exploration process, we must define the process started at any initial value (µ, Π) ∈ 𝐒.
Notice the definition of (m ske ) (µ,Π) t is coherent with the construction of the Lévy snake, with W 0 being the cumulative function of Π ske over [0, H 0 ].
We shall write m^nod for (m^nod)^{(µ,Π)} and similarly for m^ske. Finally, we write m = (m^nod, m^ske). By construction, and since ρ is a homogeneous Markov process, the marked exploration process 𝒮 = (ρ, m) is a homogeneous Markov process.
From now on, we suppose that the marked exploration process is defined on the canonical space (𝐒, ℬ′), where ℬ′ is the Borel σ-field associated with the metric d′. We denote by 𝒮 = (ρ, m^nod, m^ske) the canonical process, by ℙ_{µ,Π} the probability measure under which the canonical process is distributed as the marked exploration process starting at time 0 from (µ, Π), and by ℙ*_{µ,Π} the probability measure under which the canonical process is distributed as the marked exploration process killed when ρ reaches 0. For convenience, we shall write ℙ_µ if Π = 0 and ℙ if (µ, Π) = 0, and similarly for ℙ*. Finally, we still denote by ℕ the distribution of 𝒮 when ρ is distributed under the excursion measure ℕ.

Let ℱ = (ℱ_t, t ≥ 0) be the canonical filtration. Using the strong Markov property of (X, X^{(1)}) and Proposition 6.2, or Theorem 4.1.2 in [22] if H is continuous, we get the following result.

Proposition 1.5. The marked exploration process is a càd-làg 𝐒-valued strong Markov process.
Let us remark that the marked exploration process satisfies a snake property analogous to property 3 above for (ρ, W).

Poisson representation
We decompose the path of 𝒮 under ℙ*_{µ,Π} according to the excursions of the total mass of ρ above its past minimum, see Section 4.2.3 in [22]. More precisely, let (a_i, b_i), i ∈ ℐ, be the excursion intervals of X − I above 0 under ℙ*_{µ,Π}. For every i ∈ ℐ, we define h_i = H_{a_i} and the shifted excursion 𝒮̄^i = (ρ̄^i, m̄^i), obtained by restricting ρ and m over the time interval (a_i, b_i) to the heights above h_i. It is easy to adapt Lemma 4.2.4 of [22] to get the following lemma.

The dual process and representation formula
We shall need the dual process η of ρ: the process η is the dual process of ρ under ℕ (see Corollary 3.1.6 in [22]).
The next Lemma on time reversibility can easily be deduced from Corollary 3.1.6 of [22] and the construction of m.
We present a Poisson representation of (ρ, η, m) under ℕ, built from auxiliary Poisson point measures 𝒩₀(dx dℓ du) and 𝒩₁(dx dℓ du). For every a > 0, let us denote by 𝕄_a the law of the quadruple (µ, ν, m^nod, m^ske) of measures on ℝ₊ with finite mass defined, for any f ∈ ℬ₊(ℝ₊), in terms of these point measures. We finally set 𝕄 = ∫_0^{+∞} da e^{−αa} 𝕄_a. Using the construction of the snake, it is easy to deduce from Proposition 3.1.3 in [22] the following Poisson representation.

The pruned exploration process
We define the following continuous additive functional of the process ((ρ_t, m_t), t ≥ 0):

A_t = ∫_0^t 1_{{m_s = 0}} ds, t ≥ 0.

Lemma 2.1. We have the following properties.
(i) For every λ > 0, ℕ[1 − e^{−λA_σ}] = ψ₀^{−1}(λ).
(ii) ℕ-a.e. 0 and σ are points of increase for A; more precisely, ℕ-a.e. for all ε > 0 we have A_ε > 0.

Proof. We first prove (i). Let λ > 0. Writing 1 − e^{−λA_σ} = λ ∫_0^σ e^{−λ(A_σ−A_t)} dA_t, we replace e^{−λ(A_σ−A_t)} in the last equality by 𝔼*_{ρ_t,m_t}[e^{−λA_σ}], its optional projection, and use that dA_t-a.e. m_t = 0. In order to compute this last expression, we use the decomposition of 𝒮 under ℙ*_µ according to the excursions of the total mass of ρ above its minimum, see Lemma 1.6, and the notations of that lemma. By Lemma 1.6, setting v := ℕ[1 − e^{−λA_σ}], we obtain a closed equation for v, where we used Proposition 1.9 for the third and fourth equalities. Notice that if v = 0, then (30) implies v = λ/ψ₀′(0), which is absurd since ψ₀′(0) = α₀ > 0 thanks to (19). Therefore we have v ∈ (0, ∞), and we can divide (31) by v to get ψ₀(v) = λ. This proves (i).

Now we prove (ii). If we let λ go to infinity in (i) and use that lim_{r→∞} ψ₀(r) = +∞, we get that ℕ[A_σ > 0] = +∞. Notice that for (µ, Π) ∈ 𝐒 we have, under ℙ*_{µ,Π}, A_∞ ≥ Σ_{i∈ℐ} A_∞^i, with A^i defined by (28). Thus Lemma 1.6 implies that if µ ≠ 0, then ℙ*_{µ,Π}-a.s. ℐ is infinite and A_∞ > 0. Using the Markov property at time t of the snake under ℕ, we get that for any t > 0, ℕ-a.e. on {σ > t}, we have A_σ − A_t > 0. This implies that σ is ℕ-a.e. a point of increase of A. By time reversibility, see Lemma 1.7, we also get that ℕ-a.e. 0 is a point of increase of A. This gives (ii).
We set C_t = inf{r > 0; A_r > t}, the right-continuous inverse of A, with the convention inf ∅ = ∞. From the excursion decomposition, see Lemma 1.6, (ii) of Lemma 2.1 implies the following corollary. We define the pruned exploration process ρ̃ = (ρ̃_t = ρ_{C_t}, t ≥ 0) and the pruned marked exploration process 𝒮̃ = (ρ̃, m̃), where m̃ = (m_{C_t}, t ≥ 0) = 0. Notice that C_t is an ℱ-stopping time for any t ≥ 0 and is finite a.s. from Corollary 2.2. Notice that the process ρ̃, and thus the process 𝒮̃, is càd-làg. We also set H̃_t = H_{C_t} and σ̃ = inf{t > 0; ρ̃_t = 0}. Let ℱ̃ = (ℱ̃_t, t ≥ 0) be the filtration generated by the pruned marked exploration process 𝒮̃, completed the usual way. In particular, ℱ̃_t ⊂ ℱ_{C_t}, where, if τ is an ℱ-stopping time, ℱ_τ is the σ-field associated with τ.
We are now able to restate Theorem 0.3 precisely. Let ρ^{(0)} be the exploration process of a Lévy process with Laplace exponent ψ₀: the pruned exploration process ρ̃ is distributed as ρ^{(0)}. The proof of this theorem is given at the end of Section 4.

A special Markov property
In order to define the excursions of the marked exploration process away from {s ≥ 0; m_s = 0}, we define O as the interior of {s ≥ 0, m_s = 0}. We shall see that the complement of O has positive Lebesgue measure. We write O = ⋃_{i∈ℐ} (α_i, β_i) and say that the (α_i, β_i), i ∈ ℐ, are the excursion intervals of the marked exploration process 𝒮 = (ρ, m) away from {s ≥ 0, m_s = 0}. For every i ∈ ℐ, we define the measure-valued process 𝒮^i = (ρ^i, m^i) by restricting ρ and m over the time interval (α_i, β_i) to the heights above H_{α_i}. Notice that the mass located at H_{α_i}, if there is any, is kept in the definition of ρ^i, whereas it is removed in the definition of m^i. In particular, if ∆_{α_i} > 0, then ρ_0^i = ∆_{α_i}δ_0 and, for every t < β_i − α_i, the measure ρ_t^i charges 0; on the contrary, m_0^i = 0. Let ℱ̃_∞ be the σ-field generated by 𝒮̃ = ((ρ_{C_t}, m_{C_t}), t ≥ 0). Recall that ℙ*_{µ,Π}(d𝒮) denotes the law of the marked exploration process started at (µ, Π) ∈ 𝐒 and stopped when ρ reaches 0. For ℓ ∈ (0, +∞), we will write ℙ*_ℓ for ℙ*_{ℓδ_0,0}. If Q is a measure on 𝐒 and ϕ is a non-negative measurable function defined on the corresponding measurable space, the integration concerns only the third component of the function ϕ.
We can now state the Special Markov Property.

Theorem 3.2 (Special Markov property). Let ϕ be a non-negative measurable function defined on
In other words, this theorem describes the law, under the excursion measure, of the excursion process ∑_{i∈I} δ_{S^i}, conditionally on the pruned process. Informally speaking, it gives the distribution of the marked exploration process "above" the pruned CRT. The end of this section is devoted to its proof.
Let us first remark that, if lim_{λ→+∞} φ_1(λ) < +∞, then α_1 = 0 and π_1 is a finite measure. Hence there are no marks on the skeleton and the number of marks on the nodes is finite on every bounded time interval. The proof of Theorem 3.2 in that case is easy and left to the reader. For the rest of this section, we assume that lim_{λ→+∞} φ_1(λ) = +∞. We consider non-negative bounded functions ϕ satisfying the assumptions of Theorem 3.2 together with the following four conditions:

Stopping times
Let R(dt, du) be a Poisson point measure on ℝ₊² (defined on (Ω, F)), independent of F_∞, with intensity the Lebesgue measure. We denote by F^R_t the σ-field generated by R(· ∩ [0, t] × ℝ₊). For every ε > 0, the process R^ε_t := R([0, t] × [0, 1/ε]) is a Poisson process with intensity 1/ε. We denote by (e^ε_k, k ≥ 1) the time intervals between the jumps of (R^ε_t, t ≥ 0). The random variables (e^ε_k, k ≥ 1) are i.i.d. exponential random variables with mean ε, independent of F_∞. They define a mesh of ℝ₊ which becomes finer and finer as ε decreases to 0.
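The ε-mesh can be simulated directly: the jump times of a rate-1/ε Poisson process are obtained by summing i.i.d. exponential gaps of mean ε. A small Python sketch (ours, for illustration only; the function name and seeding convention are assumptions, not the paper's):

```python
import random

def poisson_mesh(eps, horizon, seed=0):
    """Jump times of a Poisson process with intensity 1/eps on [0, horizon]:
    the gaps e_k are i.i.d. exponential with mean eps, so the mesh
    refines as eps decreases to 0."""
    rng = random.Random(seed)
    t, grid = 0.0, []
    while True:
        t += rng.expovariate(1.0 / eps)  # e_k ~ Exp(mean eps)
        if t > horizon:
            return grid
        grid.append(t)
```

With eps = 0.01 the grid on [0, 10] contains on the order of a thousand points, against roughly twenty for eps = 0.5, illustrating the refinement of the mesh.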
For ε > 0, we consider T^ε_0 = 0, M^ε_0 = 0 and, for k ≥ 0, define T^ε_{k+1} and S^ε_{k+1} recursively, with the convention inf ∅ = +∞. For every t ≥ 0, we also set τ^ε_t. Notice that T^ε_k and S^ε_k are F^e-stopping times. Now we introduce a notation for the process defined above the marks on the intervals (S^ε_k, T^ε_k). We set, for a ≥ 0, Ĥ_a the level of the first mark, ρ^-_a the restriction of ρ_a strictly below it and ρ^+_a the restriction of ρ_a above it, so that ρ_a = [ρ^-_a, ρ^+_a]; that is, for any f ∈ B₊(ℝ₊), the corresponding decomposition holds. For k ≥ 1 and ε > 0 fixed, we define S^{k,ε} = (ρ^{k,ε}, m^{k,ε}) in the following way: for s > 0 and f ∈ B₊(ℝ₊). For k ≥ 1, we consider the σ-field G^{(ε),k} generated by the family of processes above.

Approximation of the functional
Let S be a marked exploration process and g be a function defined on the corresponding path space. We decompose the path of ρ according to the excursions of the total mass of ρ above its minimum, as in Section 1.5, with a slight difference if the initial measure µ charges {0}. More precisely, we perform the same decomposition as in Section 1.5 until the height process reaches 0. If µ({0}) = 0, then H_t = 0 ⟺ ρ_t = 0 and the decompositions are the same. If not, there remains a part of the process which is not decomposed and is gathered into a single excursion (defined as (a_{i_0}, b_{i_0}) in Figure 1). We set Y_t = ⟨ρ_t, 1⟩ and J_t = inf_{0≤u≤t} Y_u. Recall that (Y_t, t ≥ 0) is distributed under P*_µ as a Lévy process with Laplace exponent ψ started at ⟨µ, 1⟩ and killed when it reaches 0. Let (a_i, b_i), i ∈ I, be the excursion intervals of Y − J away from 0. For every i ∈ I, we define h_i = H_{a_i} = H_{b_i}. We set Ĩ = {i ∈ I; h_i > 0} and, for i ∈ Ĩ, let S̄^i = (ρ̄^i, m̄^i) be defined by (22) and (23). If the initial measure µ does not charge 0, we have Ĩ = I and we set ϕ* by the corresponding sums. If the initial measure µ charges 0, we consider i_0 ∈ Ĩ and set ϕ* accordingly; see Figure 1 to get the picture of the different excursions. The sums have a finite number of non-zero terms.
Figure 1: Definition of the different excursions.
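On a discretized path, the excursion intervals of Y − J away from 0 are easy to extract: run the running minimum J along the path and record the maximal index ranges on which Y − J stays positive. A hedged Python sketch (our own helper, illustrating the decomposition rather than reproducing the paper's construction):

```python
def excursion_intervals(Y):
    """Excursion intervals (a_i, b_i) of Y - J away from 0, where
    J_t = inf_{u <= t} Y_u, for a path Y given as a list of floats.
    Intervals are returned as half-open index ranges [a, b)."""
    running_min = float("inf")
    J = []
    for y in Y:
        running_min = min(running_min, y)
        J.append(running_min)
    intervals, start = [], None
    for k, (y, j) in enumerate(zip(Y, J)):
        if y - j > 0 and start is None:
            start = k                      # excursion opens
        elif y - j <= 0 and start is not None:
            intervals.append((start, k))   # excursion closes
            start = None
    if start is not None:
        intervals.append((start, len(Y)))
    return intervals
```

For instance, the path [0, 1, 2, 1, 0, -1, 0, 1, -2] has running minimum [0, 0, 0, 0, 0, -1, -1, -1, -2], so Y − J is positive exactly on the index ranges [1, 4) and [6, 8).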
Proof. First equality. By assumptions (h_1) and (h_3), the event [T_η < +∞] has finite measure, so the set I′ is finite. Therefore, for ε small enough and for every j ∈ I′, the mesh defined by (36) intersects the interval (α_j, β_j): in other words, there exists a unique integer k_j such that S^ε_{k_j} ∈ (α_j, β_j).
Moreover, for every j ∈ I′, we can choose ε small enough so that S^ε_{k_j} < T_η(ρ^j). Finally, as the mark at α_j is still present at time S^ε_{k_j}, the additive functional A is constant on that time interval. Therefore, we get the first equality. Second equality. Let j ∈ I′. We consider the decomposition of S^{k_j,ε} according to ρ^{k_j,ε} above its minimum, described at the beginning of this subsection. We must consider two cases. First case: the mass at α_j is on a node. Then, for ε > 0 small enough, we have T_η > a_{i_0}, and all the terms in the sum that defines ϕ* vanish except the one for i = i_0.
Second case: the mass at α_j is on the skeleton. In that case, we can again choose ε small enough so that the interval [T_η, L_η] is included in one excursion interval above the minimum of the total mass process of the exploration process S^{k_j,ε}. We then conclude as in the previous case.
Proof. This proof is rather long and technical. We decompose it into several steps.
Step 1. We first introduce a special form of the random variable Z.
Let p ≥ 1. Recall that H_{t,t′} denotes the minimum of H between t and t′ and that Ĥ_t, defined by (38), represents the height of the lowest mark. Step 2. We apply the strong Markov property to get rid of the terms which involve S^ε_p and T^ε_p. We first apply the strong Markov property at time T^ε_p, conditioning with respect to F^e_{T^ε_p}. Recall notation (38) and (39). Notice that ρ_{T^ε_p} = ρ^-_{S^ε_p}, and consequently ρ_{T^ε_p} is measurable with respect to F^e_{S^ε_p}. So, when we use the strong Markov property at time S^ε_p, we get the corresponding identity thanks to (40). Using the strong Markov property at time T^ε_{p-1} and the lack of memory of the exponential random variables, we get an expression where τ′ is distributed under P*_ν as S^ε_1.
Step 3. We compute the function φ given by (46). To simplify the formulas, we set F as follows (the dependence of F on b and µ is omitted). The proof of the following technical lemma is postponed to the end of this subsection.
Using w and v defined in (44), and plugging the resulting formula into (48) together with the function K_ε introduced in (43), we obtain an expression for φ. Step 4. Induction.
Plugging the expression (49) for φ into (45), and using the arguments backward from (45), we get the announced identity. In particular, by the monotone class theorem, this equality holds for any non-negative Z measurable w.r.t. the σ-field F̄^ε_∞. So we may iterate the previous argument and let p go to infinity to finally get the identity for any non-negative random variable Z ∈ F̄^ε_∞. Intuitively, F̄^ε_∞ is the σ-field generated by F̃_∞ and the mesh (e^ε_k, k ≥ 0). As F̄^ε_∞ contains F̃_∞, the lemma is proved.
Proof of Lemma 3.6. We consider the Poisson decomposition of S under P*_ν given in Lemma 1.6. Notice that there exists a unique excursion i_1 ∈ I such that a_{i_1} < τ′ < b_{i_1}.
By hypothesis on Z_1, under P*_ν, we can write Z_1 = Φ_1(ν, ∑_{i∈I; a_i<a_{i_1}} δ_{(h_i, S̄^i)}) for a measurable function Φ_1. We can also write Z_2 = Φ_2(ρ_u, ξ^0_d ≤ u < ξ^1_g) for a measurable function Φ_2, as m_u = 0 for u ∈ [ξ^0_d, ξ^1_g). Then, using the compensation formula of excursion theory, see Corollary IV.11 in [12], we get the first identity. Let (R_k, k ≥ 1) be the increasing sequence of the jumping times of a Poisson process with intensity 1/ε, independent of S. Then, by time-reversal, we get the corresponding expression with τ_k = inf{t > R_k; m_t = 0}. We then apply the strong Markov property at time R_k and the Poisson representation of the marked exploration process, with τ_0 = inf{t > 0; m_t = 0}. Now, let us remark that, if m_0 = 0, then m_s = 0 for s ∈ [0, τ_0] and A_{τ_0} = 0. Therefore, m_{R_1} = 0 implies R_1 > τ_0. The strong Markov property at time τ_0 then applies. Using the Poisson representation of Lemma 1.6 and (15), we identify the law under P*_{ρ^+_{R_k}}, with γ = ψ^{-1}(1/ε). As ∑_{k≥1} δ_{R_k} is a Poisson point process with intensity 1/ε, we deduce the next identity, which Proposition 1.9 allows us to compute further. For r > 0 and µ a measure on ℝ₊, let us define the measures µ_{≥r} and µ_{<r} as the restrictions of µ to [r, +∞) and [0, r). Using the Palm formula and the independence of the Poisson point measures, we deduce the desired expression. Using this and (50) with similar arguments (in reverse order), we obtain (48).

Computation of the limit

The process t → N^{ε,t} = max{i; ∑_{j=1}^{i} e^ε_j ≤ A^ε_t} is a Poisson process with intensity 1/ε, and the process s → N^{ε,t}_s is a marked Poisson process with intensity 1_{{m_τ = 0}}/ε, where τ is an exponential random variable with mean ε independent of S. We first study the process t → N^{ε,t}.

Lemma 3.7. The process t → N^{ε,t} is a Poisson process with intensity φ_1(γ)/(εψ_0(γ)).
Proof. We have, by computations similar to those in the proof of Lemma 3.5, the corresponding identity. By time-reversibility, and using the optional projection and (15), we can compute the conditional probability. The proof of Lemma 2.1, see (29) and (31), gives that P(m_τ = 0) = 1/(εψ_0(γ)).
We then get the following corollary.
Recall that (A^ε_{S^ε_k}, k ≥ 1) are the jumping times of the Poisson process t → N^{ε,t} with parameter φ_1(γ)/(εψ_0(γ)). Standard results on Poisson processes (see also Lemma XI.11.1 in [18]) imply the vague convergence in distribution of the associated point measure towards the Lebesgue measure on ℝ₊ as ε goes down to 0. Since the limit is deterministic, the convergence holds in probability, and a.s. along a decreasing sub-sequence (ε_j, j ∈ ℕ). In particular, if g is a continuous function on ℝ₊ with compact support (hence bounded), we have the a.s. convergence of the corresponding integrals.
Notice that A^ε_s ≥ A_s and that a.s. A^ε_s → A_s as ε goes down to 0. This implies that a.s. (A^ε_s, s ≥ 0) converges uniformly on compacts to (A_s, s ≥ 0). Therefore, if g is continuous with compact support, we have the corresponding convergence a.s.
This convergence also holds for a càd-làg function g with compact support, as the Lebesgue measure does not charge the points of discontinuity of g.
Let h be a continuous function defined on ℝ₊ × M_f(ℝ₊) such that h(u, µ) = 0 for u ≥ t_0. First let us remark that ρ^-_{S^ε_k} = ρ_{T^ε_k} and that m_{T^ε_k} = 0. Using the strong Markov property at time T^ε_k and the second part of Corollary 2.2, we deduce that, a.s. for all k ∈ ℕ*, the corresponding identity holds. Applying the convergence (51) to the associated càd-làg function then gives the result of the lemma.
We now study K_ε given by (43). We keep the same notation as in Lemma 3.5. The previous results allow us to compute the following limit.

Proof of Theorem 3.2
Now we can prove the special Markov property in the case lim_{γ→∞} φ_1(γ) = +∞.
Let Z ∈ F̃_∞ be non-negative such that E[Z] < ∞, and let ϕ satisfy hypotheses (h_1)–(h_3) of Theorem 3.2. Using the notation of the previous sections, we obtain a chain of equalities, where we used Lemma 3.4 and dominated convergence for the first equality, Lemma 3.5 for the second equality, and Lemma 3.10 and dominated convergence for the last equality. By the monotone class theorem and monotonicity, we can remove hypotheses (h_1)–(h_3). To end the proof of the first part, notice that the quantity involving w(ℓ, u) is F̃_∞-measurable and so is a.e. equal to the conditional expectation (i.e. the left-hand side of (34)).

Law of the pruned exploration process
Let ρ^{(0)} be the exploration process of a Lévy process with Laplace exponent ψ_0. The aim of this section is to prove Theorem 2.3.

A martingale problem for ρ̃
Let σ̃ = inf{t > 0; ρ̃_t = 0}. In this section, we shall compute the law of the total mass process (⟨ρ̃_{t∧σ̃}, 1⟩, t ≥ 0) under P_µ = P_{µ,0}, using a martingale problem characterization. We will first show how a martingale problem for ρ can be translated into a martingale problem for ρ̃, see also [1]. Unfortunately, we were not able to use the standard techniques of random time change, as developed in Chapter 6 of [23] and used for the Poisson snake in [7], since the quantity t^{-1}(E_µ[f(ρ̃_t)] − f(µ)) may not have a limit as t goes down to 0, even for exponential functionals.
Let F, K ∈ B(M_f(ℝ₊)) be bounded, and suppose that the associated process M = (M_t, t ≥ 0) is a martingale. Notice that E*_µ[sup_{t≥0} |M_t|] < ∞. Consequently, we can define, for t ≥ 0, the process N = (N_t, t ≥ 0). Proposition 4.1. The process N = (N_t, t ≥ 0) is an F̃-martingale, and we have the representation formula for N_t, with K̃ given by (56). Proof. Notice that N = (N_t, t ≥ 0) is an F̃-martingale. Indeed, we have for t, s ≥ 0 the corresponding identity, where we used the optional stopping theorem for the last equality.
where the integrand is defined for u ≥ 0. Recall that C_0 = 0 P_µ-a.s. by Corollary 2.2. In particular, we get the identity, where we used the time change u = A_s for the last equality. In particular, as σ̃ is an F̃-stopping time, we get that the process (N′_t, t ≥ 0) is F̃-adapted.
We keep the notations of Section 3. We consider (ρ^i, m^i), i ∈ I, the excursions of the process (ρ, m) outside {s; m_s = 0} before σ, and let (α_i, β_i), i ∈ I, be the corresponding excursion intervals. In particular we can write the decomposition, with σ(ρ) = inf{v > 0; ρ_v = 0}. We deduce from the second part of Theorem 3.2 that, P_µ-a.s.,
Proof. Let X = (X_t, t ≥ 0) be, under P*_x, a Lévy process with Laplace exponent ψ started at x > 0 and stopped when it reaches 0. Under P_µ, the total mass process (⟨ρ_{t∧σ}, 1⟩, t ≥ 0) is distributed as X under P*_{⟨µ,1⟩}. Let c > 0. From the theory of Lévy processes, we know that the process e^{-cX_t} − ψ(c) ∫_0^t e^{-cX_s} ds is a martingale. This implies that, for any µ ∈ M_f(ℝ₊), the quantity E_µ[∫_0^σ K(ρ_s) ds] is finite. Using the Poisson representation, see Proposition 1.9, it is easy to get that the corresponding quantity under the excursion measure is in particular also finite.
From Proposition 4.1, we get that N = (N_t, t ≥ 0) is, under P_µ, an F̃-martingale, where N_t is given for t ≥ 0 by the above formula with K̃ given by (56). We can compute K̃, where we used (59) and the excursion decomposition for the second equality, and ψ_0 = ψ + φ_1 for the last one.
Thus, the process (N_t, t ≥ 0) is a martingale. Let X^{(0)} = (X^{(0)}_t, t ≥ 0) be, under P*_x, a Lévy process with Laplace exponent ψ_0 started at x > 0 and stopped when it reaches 0. The two non-negative càd-làg processes (⟨ρ̃_{t∧σ̃}, 1⟩, t ≥ 0) and X^{(0)} solve the same martingale problem: for any c ≥ 0, the process defined for t ≥ 0 by e^{-cY_{t∧σ′}} − ψ_0(c) ∫_0^{t∧σ′} e^{-cY_s} ds, where σ′ = inf{s ≥ 0; Y_s ≤ 0}, is a martingale. From Corollary 4.4.4 in [23], we deduce that these two processes have the same distribution. To finish the proof, notice that the total mass process of ρ^{(0)} under P*_µ is distributed as X^{(0)} under P*_{⟨µ,1⟩}.

Identification of the law of ρ̃
To begin with, let us mention some useful properties of the process ρ̃.

Lemma 4.3.
We have the following properties for the process ρ̃.
Proof. (i) This is a direct consequence of the strong Markov property of the process (ρ, m).
Since the processes ρ̃ and ρ^{(0)} are both Markov processes, to show that they have the same law it is enough to show that they have the same one-dimensional marginals. We first prove this result under the excursion measure.
Proof. On the one hand, using the definition of the pruned process ρ̃, we compute the quantity of interest. We then make the change of variables t = A_u. By a time-reversibility argument, see Lemma 1.7, we obtain an expression to which we apply Lemma 2.1 (i) for the last equality. Now, using Proposition 1.9 and the usual properties of Poisson point measures, we have, with c = α_1 + ∫_{(0,∞)} ℓ π_1(dℓ) and, with the notations of Proposition 1.9, for any f ∈ B₊(ℝ₊), the corresponding formula. Proposition 3.1.3 in [22] then directly implies that the left-hand side of the previous equality can be computed explicitly. On the other hand, similar computations as above yield that this quantity is equal to the same expression. This ends the proof. Now, we prove the same result under P*_{µ,0}, that is: for every λ > 0, every bounded f ∈ B₊(ℝ₊) and every finite measure µ, the analogous identity holds under P*_{µ,0}. Proof. From the Poisson representation, see Lemma 1.6, and using the notations of this lemma and of (28), we have under P*_{µ,0} an expression where the function f_r is defined in terms of the maximal element of the closed support of k_r µ (see (12)). We recall that −I is the local time at 0 of the reflected process X − I, and that τ_r = inf{s; −I_s > r} is the right-continuous inverse of −I. From the excursion formula, and using the time change −I_s = r (or equivalently τ_r = s), we get under P*_{µ,0} an expression in terms of a function G(r). The same kind of computation gives the analogous expression under P*_µ in terms of a function G^{(0)}, where τ^{(0)} is the right-continuous inverse of the infimum process −I^{(0)} of the Lévy process with Laplace exponent ψ_0. In fact this equality holds for every r by right-continuity.
Finally, as G = G^{(0)}, we have, thanks to (60) and (61), that the two Laplace functionals coincide for every bounded non-negative measurable function f, which ends the proof. Proof. Let f ∈ B₊(ℝ₊) be bounded. Proposition 4.5 can be re-written as an identity of Laplace transforms. By uniqueness of the Laplace transform, we deduce that the corresponding equality holds for almost every t > 0, and in fact for every t by right-continuity. As Laplace functionals characterize the law of a random measure, we deduce that, for fixed t > 0, the law of ρ̃_t under P*_{µ,0} is the same as the law of ρ^{(0)}_t under P*_µ. The Markov property then gives the equality in law of the càd-làg processes ρ̃ and ρ^{(0)}.
Proof of Theorem 2.3. The point 0 is recurrent for the càd-làg Markov processes ρ̃ and ρ^{(0)}. These two processes have no sojourn time at 0 and, when killed at the first hitting time of 0, they have the same law thanks to Lemma 4.6. From Theorem 4.2 of [17], Section 5, we deduce that ρ̃ under P_{µ,0} is distributed as ρ^{(0)} under P_µ.

Law of the excursion lengths
Recall that σ̃ = ∫_0^σ 1_{{m_s=0}} ds denotes the length of the excursion of the pruned exploration process. We can compute the joint law of (σ̃, σ); this determines uniquely the law of σ̃ conditionally on σ = r.
Notice that σ under P*_ℓ is distributed as τ_ℓ, the first time at which the infimum of X, started at 0, reaches −ℓ. Since (τ_ℓ, ℓ ≥ 0) is a subordinator with Laplace exponent ψ^{-1}, we have E*_ℓ[e^{-λσ}] = e^{-ℓψ^{-1}(λ)}. Thanks to (15), we get the joint transform. Since ψ_0 is increasing and continuous, we get the result.
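The Laplace transform e^{-ℓψ^{-1}(λ)} is easy to evaluate numerically once ψ^{-1} is computed, e.g. by bisection, since ψ is increasing and continuous on ℝ₊. The sketch below is ours and takes the purely quadratic mechanism ψ(λ) = αλ + βλ² (i.e. π = 0) as a toy example; the function names and default parameters are assumptions for illustration only:

```python
import math

def psi(lam, alpha=0.0, beta=0.5):
    # toy branching mechanism: psi(l) = alpha*l + beta*l^2 (no jump part)
    return alpha * lam + beta * lam * lam

def psi_inv(x, alpha=0.0, beta=0.5):
    # psi is increasing and continuous on [0, +inf): invert by bisection
    lo, hi = 0.0, 1.0
    while psi(hi, alpha, beta) < x:
        hi *= 2.0                      # bracket the root
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if psi(mid, alpha, beta) < x:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def laplace_sigma(lam, ell, alpha=0.0, beta=0.5):
    # E*_ell[exp(-lam * sigma)] = exp(-ell * psi^{-1}(lam))
    return math.exp(-ell * psi_inv(lam, alpha, beta))
```

For α = 0 and β = 1/2 one has ψ^{-1}(λ) = √(2λ), so for instance ψ^{-1}(2) = 2 and E*_1[e^{-2σ}] = e^{-2}.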

Appendix
In a first subsection, we present how one can extend the construction of the Lévy snake from [22] to a weighted Lévy snake, when the height process may not be continuous and the lifetime process is given by the total mass of the exploration process (instead of the height of the exploration process as in [22]). Then, using this construction, we define in a second subsection a general Lévy snake when the height process is not continuous.

Weighted Lévy snake
Let D be a distance on M_f(ℝ₊) which defines the topology of weak convergence. Let us recall that (M_f(ℝ₊), D) is a Polish space, see [19], Section 3.1.
Let E be a Polish space whose topology is defined by a metric δ, and let ∂ be a cemetery point added to E. Let W_x be the space of all E-valued weighted killed paths started at x ∈ E. An element w̄ = (µ, w) of W_x consists of a mass measure µ ∈ M_f(ℝ₊) and a càd-làg mapping w : [0, ⟨µ, 1⟩) → E such that w(0) = x. By convention, the point x is also considered as a weighted killed path, with mass measure µ = 0. We set W = ⋃_{x∈E} W_x and equip W with a distance d built from δ(w(0), w′(0)), D(µ, µ′) and the Skorokhod metrics d_t on the spaces of paths on [0, t], where w_{≤t} denotes the restriction of w to the interval [0, t]. Notice that d is a distance on W. Indeed, we have that: • d is symmetric.
Thus the space (W, d) is a Polish space.
The last property corresponds to the Markov property conditionally on the mass measure. We shall assume that the mapping (x, µ) → Π̄_{x,µ} is measurable.
Since ρ and Y are P_µ-a.s. càd-làg, the mapping s → W̄_s is P_{(µ,w̄)}-a.s. càd-làg on [0, ∞). Hence there is a unique càd-làg extension to the positive real line, whose law we still denote by P_{(µ,w̄)}. The process (W̄_s, s ≥ 0) is, under P_{(µ,w̄)}, a time-homogeneous Markov process living in the space of càd-làg paths with values in M_f(ℝ₊) × W. We call this distribution the distribution of the weighted Lévy snake associated with Π̄.
We define the set Θ_x accordingly. We denote by (G_s, s ≥ 0) the canonical filtration on the space of càd-làg paths with values in M_f(ℝ₊) × W. One can readily adapt the proofs of Propositions 4.1.1 and 4.1.2 of [22] to get the following result. Theorem 6.1. The process (W̄_s, s ≥ 0; P_{(µ,w)}, (µ, w) ∈ Θ_x) is a càd-làg Markov process in Θ_x and is strong Markov with respect to the filtration (G_{s+}, s ≥ 0). Let us remark that, when the family of probability measures Π̄_{x,µ} is just the law of a homogeneous Markov process ξ started at x and stopped at time ⟨µ, 1⟩, the previous construction gives a snake with spatial motion ξ and lifetime process X − I, which is the total mass of the exploration process. Notice that in [22] the lifetime process is given by the height of the exploration process.

The general Lévy snake
However, we need some dependence between the spatial motion and the exploration process ρ in order to recover the usual Lévy snake from the weighted Lévy snake. Informally, we keep the spatial motion from moving when time t is "on a mass" of ρ_s. This idea can be compared to a subordination and has already been used in the snake framework by Bertoin, Le Gall and Le Jan in [16] in order to construct a kind of Lévy snake from the usual Brownian snake.