An upper bound for the probability of visiting a distant point by critical branching random walk in $\mathbb{Z}^4$

In this paper, we study the probability that a critical branching random walk started from the origin visits a distant point $a\in \mathbb{Z}^4$. We prove that this probability is bounded above by $1/(|a|^2\log |a|)$ up to a constant factor.


Introduction
A branching random walk is a discrete-time particle system in $\mathbb{Z}^d$ defined as follows. Fix a distribution $\mu$ on $\mathbb{N}$, called the offspring distribution, and a distribution $\theta$ on $\mathbb{Z}^d$, called the jump distribution. At time $0$, there is a single particle at the origin $0\in\mathbb{Z}^d$. At each time step $n\in\mathbb{N}$, every particle, say at site $x\in\mathbb{Z}^d$, gives birth to a random number of offspring (and dies afterwards) according to $\mu$; each of these offspring independently moves to a site distributed as $x+\theta$. If the expectation of $\mu$ is one, we say that the branching random walk is critical.
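The particle system just described can be made concrete in a few lines of code. The following Monte Carlo sketch is ours and is not part of the paper's argument; for concreteness it assumes the critical offspring law $\mu(0)=\mu(2)=1/2$ and simple-random-walk unit jumps in $\mathbb{Z}^4$, both choices made only for illustration.

```python
import random

# Unit steps of the simple random walk in Z^4 (an illustrative jump law).
STEPS = [(1, 0, 0, 0), (-1, 0, 0, 0), (0, 1, 0, 0), (0, -1, 0, 0),
         (0, 0, 1, 0), (0, 0, -1, 0), (0, 0, 0, 1), (0, 0, 0, -1)]

def visits(a, rng, max_particles=10**5):
    """One critical branching random walk from the origin in Z^4.

    Offspring law: 0 or 2 children with probability 1/2 each (critical and
    nondegenerate).  Each child jumps by a uniform unit step.  Returns True
    iff some particle ever occupies the site a."""
    generation = [(0, 0, 0, 0)]
    while generation:
        next_generation = []
        for x in generation:
            for _ in range(rng.choice((0, 2))):
                y = tuple(u + v for u, v in zip(x, rng.choice(STEPS)))
                if y == a:
                    return True
                next_generation.append(y)
            if len(next_generation) > max_particles:
                # Safety cap only; a critical BRW dies out almost surely.
                return False
        generation = next_generation
    return False

rng = random.Random(0)
trials = 2000
p_hat = sum(visits((2, 0, 0, 0), rng) for _ in range(trials)) / trials
```

For genuinely distant $a$ the visit probability is of order $1/(|a|^2\log|a|)$ and far too small to estimate this way; the sketch is only meant to make the model concrete.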
The asymptotic behavior of the probability of visiting a distant point $a\in\mathbb{Z}^d$ by a critical branching random walk in low dimensions ($d\le 3$) was established recently by Le Gall and Lin (Theorem 7 in [4]). Their theorem implies that (under some assumptions on the branching random walk)
$$P(\text{visiting } a)\asymp \frac{1}{|a|^{2}}\quad\text{in } \mathbb{Z}^d,\ d\le 3,$$
where we write $f(a)\succeq g(a)$ ($f(a)\preceq g(a)$ resp.) if there exists a positive constant $c$ (depending only on $d$, the offspring distribution $\mu$ and the jump distribution $\theta$ of the critical branching random walk) such that $f(a)\ge c\,g(a)$ ($f(a)\le c\,g(a)$ resp.), and $f(a)\asymp g(a)$ if both $f(a)\succeq g(a)$ and $f(a)\preceq g(a)$.
It is also pointed out there (Section 5.1 in [4]) that a simple calculation of the first and second moments gives
$$P(\text{visiting } a)\preceq \frac{1}{|a|^{2}}\quad\text{and}\quad P(\text{visiting } a)\succeq \frac{1}{|a|^{2}\log |a|}\quad\text{in } \mathbb{Z}^4. \tag{1.1}$$
It is conjectured there that
$$P(\text{visiting } a)\asymp \frac{1}{|a|^{2}\log |a|}\quad\text{in } \mathbb{Z}^4. \tag{1.2}$$
In this paper, we prove the upper bound in (1.2) under some assumptions on $\theta$; we assume almost nothing about $\mu$, only that $\mu$ is critical and nondegenerate, i.e. $\mu(1)<1$. Let us state the main theorem.

Theorem 1.1. Let $\mu$ be a critical and nondegenerate probability measure on $\mathbb{N}$, and let $\theta$ be a probability measure on $\mathbb{Z}^4$ with zero mean and finite fifth moment (i.e. $E\theta=0$ and $E|\theta|^5<\infty$) that is not supported on a strict subgroup of $\mathbb{Z}^4$. If $S$ is a critical branching random walk with offspring distribution $\mu$ and jump distribution $\theta$, then there exists a positive constant $C$, depending on $\mu$ and $\theta$, such that for any $a\in\mathbb{Z}^4$ with $|a|$ sufficiently large, we have
$$P(S\text{ visits } a)\le \frac{C}{|a|^{2}\log |a|}. \tag{1.3}$$

Remark 1. Note that (1.1) requires $\mu$ to have finite variance. Hence, if $\mu$ has finite variance in addition to the assumptions above, then
$$P(S\text{ visits } a)\asymp \frac{1}{|a|^{2}\log |a|}.$$

Proof of the main theorem
Before the formal proof, let us discuss the main idea. A simple calculation shows that the expected number of visits to $a$ is $G(a)=G(0,a)$, where $G$ is the Green function. Our assumptions on $\theta$ guarantee that $G(z)\asymp |z|^{-2}$ (see Theorem 4.3.5 in [2]). Hence, if the expected number of visits, conditioned on visiting, is of order $\log|a|$, then (1.3) follows. In fact, we will prove that this is true with high probability.

Let us introduce some notation. Classically, a branching random walk can be regarded as a random function $S:V(T)\to\mathbb{Z}^d$, where $T$ is a random plane tree, i.e. a rooted ordered tree, and $V(T)$ is the set of all vertices of $T$. In our case, $T$ is a Galton-Watson tree with offspring distribution $\mu$. First, the root is mapped to the origin under $S$. Then we assign to every edge $e$ of $T$ an independent random variable $Y_e$ with distribution $\theta$, and $S(v)$, for any $v\in V(T)$, is the sum of the random variables $Y_e$ over all edges $e$ belonging to the unique simple path from the root to $v$ in the tree. Since we have an order $\prec$ on the offspring of each vertex in $T$, we obtain the classical order (according to depth-first search) on $V(T)$ as follows. Let $v$ and $v'$ be different vertices, and let $\omega=(v_0,v_1,\dots,v_m)$ and $\omega'=(v'_0,v'_1,\dots,v'_n)$ be the unique simple paths in the tree from the root (hence $v_0=v'_0$ is the root) to $v$ and $v'$ respectively. We say that $v$ is on the left of $v'$ if either $v$ is an ancestor of $v'$, or $v_j\prec v'_j$ at the first index $j$ where the two paths differ. On the event that $S$ visits $a$, let $u$ be the leftmost vertex with $S(u)=a$; the image under $S$ of the simple path from the root to $u$ is a path in $\mathbb{Z}^4$ from the origin to $a$. We denote this path by $\hat\gamma(S)$. Let $N$ be the number of visits to $a$. For any path $\gamma$ from the origin to $a$, define $p(\gamma)=P(N>0,\ \hat\gamma(S)=\gamma)$ and $e(\gamma)=E(N\mid N>0,\ \hat\gamma(S)=\gamma)$. Note that $N>0$ iff $S$ visits $a$. For any path $\gamma=(z_0,\dots,z_n)$ in $\mathbb{Z}^4$, define $g(\gamma)=\sum_{i=0}^{n}G(z_i,a)=\sum_{i=0}^{n}G(a-z_i)$, where $G$ is the Green function with respect to the distribution $\theta$. Let $\mathcal{G}=1\{S\text{ visits } a\}\cdot g(\hat\gamma(S))$. The following lemmas are the key ingredients for the main theorem.
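To make the depth-first order and the path $\hat\gamma(S)$ concrete, here is a small sketch of ours (not part of the proof). It explores a Galton-Watson tree in depth-first order, again assuming for illustration the offspring law $\mu(0)=\mu(2)=1/2$ and unit jumps in $\mathbb{Z}^4$, and returns the spatial path from the root to the leftmost vertex mapped to $a$:

```python
import random

# Unit steps in Z^4 (an illustrative jump law, not the paper's general theta).
STEPS = [(1, 0, 0, 0), (-1, 0, 0, 0), (0, 1, 0, 0), (0, -1, 0, 0),
         (0, 0, 1, 0), (0, 0, -1, 0), (0, 0, 0, 1), (0, 0, 0, -1)]

def first_visit_path(a, rng, cap=10**5):
    """Depth-first exploration of a critical GW tree (offspring law
    P(0) = P(2) = 1/2) with unit-step displacements in Z^4.

    Returns the spatial path from the root to the first vertex mapped to a
    in depth-first order -- the analogue of gamma-hat(S) -- or None if the
    tree dies (or the cap is reached) without visiting a."""
    origin = (0, 0, 0, 0)
    stack = [(origin, (origin,))]   # (position, spatial path from the root)
    explored = 0
    while stack and explored < cap:
        x, path = stack.pop()
        explored += 1
        if x == a:
            return path             # first visit in depth-first order
        children = []
        for _ in range(rng.choice((0, 2))):
            y = tuple(u + v for u, v in zip(x, rng.choice(STEPS)))
            children.append((y, path + (y,)))
        stack.extend(reversed(children))  # leftmost child is popped first
    return None

rng = random.Random(1)
gamma_hat = next((p for p in (first_visit_path((1, 0, 0, 0), rng)
                              for _ in range(300)) if p is not None), None)
```

Because ancestors precede descendants in this order, the returned path visits $a$ only at its final vertex, matching the definition of $\hat\gamma(S)$.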
Lemma 2.1. There exists a positive constant $c_0$ such that for any path $\gamma$ with $p(\gamma)>0$,
$$e(\gamma)\ge c_0\,g(\gamma). \tag{2.1}$$

Lemma 2.2. There exist positive constants $c_1,c_2$ such that for all $a\in\mathbb{Z}^4$ with $|a|$ sufficiently large, we have
$$P\bigl(S\text{ visits } a,\ g(\hat\gamma(S))\le c_1\log|a|\bigr)\le \frac{c_2}{|a|^{2}\log|a|}. \tag{2.2}$$

We postpone the proofs of these two lemmas and start the proof of Theorem 1.1. Since $\mu$ is critical, we have
$$EN=G(a)\asymp\frac{1}{|a|^{2}}.$$
By Lemma 2.1, we have
$$E\mathcal{G}=\sum_{\gamma}p(\gamma)g(\gamma)\le\frac{1}{c_0}\sum_{\gamma}p(\gamma)e(\gamma)=\frac{EN}{c_0}\preceq\frac{1}{|a|^{2}}.$$
Therefore, by Markov's inequality,
$$P\bigl(S\text{ visits } a,\ g(\hat\gamma(S))>c_1\log|a|\bigr)\le\frac{E\mathcal{G}}{c_1\log|a|}\preceq\frac{1}{|a|^{2}\log|a|}.$$
Then, combining this with Lemma 2.2, we have
$$P(S\text{ visits } a)\le P\bigl(S\text{ visits } a,\ g(\hat\gamma(S))\le c_1\log|a|\bigr)+P\bigl(S\text{ visits } a,\ g(\hat\gamma(S))>c_1\log|a|\bigr)\preceq\frac{1}{|a|^{2}\log|a|},$$
which is (1.3).

Proof of Lemma 2.1. Fix a path $\gamma=(z_0,z_1,\dots,z_k)$ such that $p(\gamma)>0$ (so $z_0=0$ and $z_k=a$). For any branching random walk sample $S$ such that $\hat\gamma(S)=\gamma$, write $a_i$ ($b_i$ respectively) for the number of brothers of $z_i$ on the left of $z_i$ (on the right, respectively), for $i=1,\dots,k$. From the tree structure, one can easily see that, for any $l_1,\dots,l_k,m_1,\dots,m_k\in\mathbb{N}$, we have
$$P\bigl(\hat\gamma(S)=\gamma,\ a_i=l_i,\ b_i=m_i,\ i=1,\dots,k\bigr)=s(\gamma)\prod_{i=1}^{k}\mu(l_i+m_i+1)\,q(a-z_{i-1})^{l_i}, \tag{2.3}$$
where $s(\gamma)$ is the probability weight of $\gamma$ for the random walk with step distribution $\theta$, i.e. $s(\gamma)=\prod_{i=1}^{k}\theta(z_i-z_{i-1})$, and $q(z)$ is the probability that the branching random walk does not visit $z$, conditioned on the initial particle having exactly one child.
Conditioned on the event in (2.3), the expectation of $N$ is
$$E\bigl(N\mid\hat\gamma(S)=\gamma,\ a_i=l_i,\ b_i=m_i,\ i=1,\dots,k\bigr)=1+r(0)+\sum_{i=1}^{k}m_i\,r(a-z_{i-1}),$$
where $r(z)$ is the expected number of visits to $z$ by the branching random walk, conditioned on the initial particle having exactly one child; note that $r(z)=G(z)-\delta_0(z)$, so in particular $r(a-z_{i-1})=G(a-z_{i-1})$ for $i=1,\dots,k$. Thus it suffices to show
$$1+r(0)+\sum_{i=1}^{k}E\bigl(b_i\mid N>0,\ \hat\gamma(S)=\gamma\bigr)\,G(a-z_{i-1})\ge c_0\,g(\gamma).$$
A straightforward computation using (2.3) gives
$$E\bigl(b_i\mid N>0,\ \hat\gamma(S)=\gamma\bigr)\ge c>0,\qquad i=1,\dots,k,$$
for a constant $c$ depending only on $\mu$ (here we use that $\mu$ is nondegenerate); since the leading term $1$ dominates a constant multiple of $G(a-z_k)=G(0)$, the claim follows.

Proof of Lemma 2.2. A straightforward calculation using (2.3) gives $p(\gamma)\le s(\gamma)$: summing over the $l_i,m_i$ and using $q\le 1$ together with $\sum_{l,m\ge 0}\mu(l+m+1)=\sum_{j}j\mu(j)=1$. Hence, we have
$$P\bigl(S\text{ visits } a,\ g(\hat\gamma(S))\le c_1\log|a|\bigr)\le\sum_{\gamma:\,g(\gamma)\le c_1\log|a|}p(\gamma)\le\sum_{\gamma:\,g(\gamma)\le c_1\log|a|}s(\gamma)=P_{RW}\Bigl(\tau_a<\infty,\ \sum_{i=0}^{\tau_a}G(S_i,a)\le c_1\log|a|\Bigr),$$
where $P_{RW}$ denotes probability for the random walk with step distribution $\theta$. Then Lemma 2.2 is implied by the following proposition. There exist $c_1,c_2$ such that for $a\in\mathbb{Z}^4$ with $|a|$ sufficiently large,
$$P\Bigl(\tau_a<\infty,\ \sum_{i=0}^{\tau_a}G(S_i)\le c_1\log|a|\Bigr)\le\frac{c_2}{|a|^{2}\log|a|},$$
where $(S_i)_{i\in\mathbb{N}}$ is the random walk starting from $0$ with step distribution $\theta$ and $\tau_a$ is the hitting time of $a$.
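One useful consequence of (2.3) is the bound $p(\gamma)\le s(\gamma)$, which rests on the identity $\sum_{l,m\ge 0}\mu(l+m+1)=\sum_{j}j\,\mu(j)=1$ for critical $\mu$: for each $j$ there are exactly $j$ ways to split the $j-1$ brothers into left and right groups. As a quick sanity check of this identity, here is a sketch with a hypothetical critical offspring law of our choosing:

```python
from fractions import Fraction

# A hypothetical critical offspring law: mu(0) = mu(2) = 1/2 (mean one).
mu = {0: Fraction(1, 2), 2: Fraction(1, 2)}

# For each j, the pairs (l, m) of left/right brother counts with
# l + m + 1 = j number exactly j, so the double sum equals the mean of mu.
double_sum = sum(mu.get(l + m + 1, Fraction(0))
                 for l in range(20) for m in range(20))
mean = sum(j * p for j, p in mu.items())
```

Exact rationals are used so the check is an identity rather than a floating-point approximation.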
Note that we have changed $G(\cdot,a)$ to $G(\cdot)$; this is justified by considering the reversed random walk. This proposition is an adjusted version of Lemma 10.1.2 (a) in [2]. It is assumed there that $\theta$ has finite support, which is much stronger than our assumption, though the conclusion there is also much stronger than ours. The argument is similar to the one there, with small adjustments; we give an outline here. It suffices to prove
$$P\Bigl(\tau_a<\infty,\ \sum_{i=0}^{\tau_n}G(S_i)\le c_1\log n\Bigr)\le\frac{c_2}{n^{2}\log n}, \tag{2.4}$$
where $n=\lfloor|a|/2\rfloor$ and $\tau_n=\min\{k\ge 0:|S_k|\ge n\}$ (so that $\tau_n\le\tau_a$). Fix $\beta=0.9\in(4/5,1)$ and let $N=\lfloor n^{\beta}\rfloor$. Since $\theta$ has finite fifth moment, we have $P(|\theta|>m)\preceq m^{-5}$. Let $A$ be the event that $|X_i|\le N$ for $i=1,2,\dots,2n^2\wedge\tau_n$ (where $X_i=S_i-S_{i-1}$ and $i\wedge j=\min\{i,j\}$). Then $P(A^c)\preceq n^2/n^{5\beta}\le n^{-2.1}$ for $n$ large. On the event $A$, the jumps of the random walk are bounded by $N$ during the first $2n^2$ steps. Since only the first $2n^2$ steps have bounded jumps, we need to modify the stopping times there slightly. Let $\xi_0=0$ and $\xi_i=\min\{k:|S_k|\ge 2^iN\}\wedge\bigl(\xi_{i-1}+(2^iN)^2\bigr)$, for $i=1,2,\dots,L$, where $L=\max\{k:2^kN\le n\}\asymp\log n$. Now (2.4) can be obtained by following the argument of the proof of Lemma 10.1.2 (a) in [2].
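To make the Green-function sum along a random-walk path from the proposition concrete, here is a small Monte Carlo sketch of ours, purely illustrative: it runs simple-random-walk paths in $\mathbb{Z}^4$ and, for those that hit $a$ within a step budget, accumulates the stand-in $\tilde G(z)=1/\max(|z|^2,1)$ for the true Green function, which has the right order of decay by Theorem 4.3.5 in [2]. Both the step law and the proxy for $G$ are assumptions made for the sketch.

```python
import random

# Unit steps of the simple random walk in Z^4 (illustrative jump law).
STEPS = [(1, 0, 0, 0), (-1, 0, 0, 0), (0, 1, 0, 0), (0, -1, 0, 0),
         (0, 0, 1, 0), (0, 0, -1, 0), (0, 0, 0, 1), (0, 0, 0, -1)]

def green_sum_on_hit(a, max_steps, rng):
    """One SRW path in Z^4 started at 0.

    If the walk hits a within max_steps, return the accumulated sum of
    Gtilde(S_i) = 1 / max(|S_i|^2, 1) over i = 0, ..., tau_a (a crude
    stand-in for the Green function); otherwise return None."""
    x = (0, 0, 0, 0)
    total = 0.0
    for _ in range(max_steps + 1):
        total += 1.0 / max(sum(c * c for c in x), 1)
        if x == a:
            return total
        x = tuple(u + v for u, v in zip(x, rng.choice(STEPS)))
    return None

rng = random.Random(2)
sums = [s for s in (green_sum_on_hit((1, 0, 0, 0), 400, rng)
                    for _ in range(2000)) if s is not None]
```

Each recorded sum includes at least the terms at the origin and at $a$ itself; for genuinely distant targets the conditioning event is far too rare to sample, which is exactly why the paper argues through (2.4) rather than directly.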