Chaos expansion of 2D parabolic Anderson model

We prove a chaos expansion for the 2D parabolic Anderson model at small times, with the expansion coefficients expressed in terms of the annealed density function of the polymer in a white noise environment.

The equation (1.1) was analyzed in [7,8,9] by different approaches, including the theory of regularity structures, paracontrolled calculus, and the method of correctors and two-scale expansions. The main results in these references show that a smoothed version of (1.1) converges to a limit that is independent of the mollification.
More precisely, let $\varphi : \mathbb{R}^2 \to \mathbb{R}_+$ be a smooth, compactly supported function on $\mathbb{R}^2$ satisfying $\varphi(x) = \varphi(-x)$ and $\int_{\mathbb{R}^2} \varphi = 1$. Define $\varphi_\varepsilon(\cdot) = \varepsilon^{-2}\varphi(\cdot/\varepsilon)$ and $\dot W_\varepsilon = \dot W * \varphi_\varepsilon$, and let $u_\varepsilon$ be the solution to the mollified and renormalized equation
$$\partial_t u_\varepsilon = \tfrac12 \Delta u_\varepsilon + u_\varepsilon\,(\dot W_\varepsilon - C_\varepsilon), \tag{1.4}$$
with the renormalization constant $C_\varepsilon$ given in (1.5). Then $u_\varepsilon$ converges in some weighted Hölder space to a limit $u$ that is defined to be the solution to (1.1); see [9, Theorem 4.1]. While the solution to (1.1) is well-defined, its statistical properties remain challenging to describe. We refer to [1,2,5,6,14] for some relevant discussions. The goal of this note is to provide a Wiener chaos expansion of the solution $u$ in the short-time regime. We assume $u_\varepsilon(0,x) = u_0(x)$ for some bounded function $u_0$. Theorem 1.1 below shows that for small $t$, $u_\varepsilon(t,x) \to u(t,x)$ in $L^2(\Omega)$ as $\varepsilon \to 0$, and $u(t,x)$ is written explicitly as a Wiener chaos expansion in terms of the probability density of a polymer in a white noise environment; see (1.17). We hope that the explicit chaos expansion will provide another way of proving convergence to (1.1), e.g. from a discrete system, using the general criteria proved in [3,14]. The tool we use is a combination of the Feynman-Kac representation and Malliavin calculus: by writing $u_\varepsilon(t,x)$ as a chaos expansion, it suffices to pass to the limit in each chaos.
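As a concrete (and entirely standard) choice of mollifier satisfying the assumptions above, one may take a normalized symmetric bump function; the specific form below is only an illustration, not the choice made in [9]:

```latex
% One admissible mollifier: smooth, compactly supported, even, unit mass.
\varphi(x) = c\,\exp\!\Big(-\frac{1}{1-|x|^2}\Big)\,\mathbf{1}_{\{|x|<1\}},
\qquad c \text{ chosen so that } \int_{\mathbb{R}^2}\varphi = 1 .
% Then \varphi_\varepsilon(x) = \varepsilon^{-2}\varphi(x/\varepsilon) concentrates
% at the origin as \varepsilon \to 0, and the mollified noise
% \dot W_\varepsilon = \dot W * \varphi_\varepsilon has covariance function
% R_\varepsilon = \varphi_\varepsilon * \varphi_\varepsilon .
```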

Elements of Malliavin calculus
We give a brief introduction to Malliavin calculus and refer to [15] for more details.
For any function $\varphi \in L^2(\mathbb{R}^2)$, we define $W(\varphi) = \int_{\mathbb{R}^2} \varphi \, dW$. Let $F$ be a smooth and cylindrical random variable of the form
$$F = f(W(\varphi_1), \ldots, W(\varphi_n)), \qquad \varphi_j \in L^2(\mathbb{R}^2), \quad f \in C_p^\infty(\mathbb{R}^n)$$
(namely, $f$ and all its partial derivatives have polynomial growth); then the Malliavin derivative of $F$, denoted by $DF$, is the $L^2(\mathbb{R}^2)$-valued random variable defined by
$$DF = \sum_{j=1}^n \partial_j f(W(\varphi_1), \ldots, W(\varphi_n)) \, \varphi_j.$$
For each positive integer $k$, $D^k F$ is defined to be the $k$-th iterated derivative of $F$, which is a random variable taking values in $L^2(\mathbb{R}^2)^{\otimes k}$, the $k$-th tensor product of $L^2(\mathbb{R}^2)$. The operator $D^k$ is closable from $L^2(\Omega)$ into $L^2(\Omega; L^2(\mathbb{R}^2)^{\otimes k})$, and we define the Sobolev space $\mathbb{D}^{k,2}$ as the closure of the space of smooth and cylindrical random variables under the norm
$$\|F\|_{k,2} = \Big( \mathbb{E}[F^2] + \sum_{j=1}^k \mathbb{E}\big[ \|D^j F\|_{L^2(\mathbb{R}^2)^{\otimes j}}^2 \big] \Big)^{1/2}.$$
Define $\mathbb{D}^{\infty,2} = \bigcap_{k=1}^\infty \mathbb{D}^{k,2}$, and let $L^2(\mathbb{R}^2)^{\odot k}$ denote the $k$-th symmetric tensor product of $L^2(\mathbb{R}^2)$. For any integer $n \geq 0$, we denote by $\mathcal{H}_n$ the $n$-th Wiener chaos of $W$. We recall that $\mathcal{H}_0$ is simply $\mathbb{R}$, and for $n \geq 1$, $\mathcal{H}_n$ is the closed linear subspace of $L^2(\Omega)$ generated by the random variables $H_n(W(\varphi))$ with $\|\varphi\|_{L^2(\mathbb{R}^2)} = 1$, where $H_n$ is the $n$-th Hermite polynomial. For any $n \geq 1$, the mapping $I_n(\varphi^{\otimes n}) = n!\, H_n(W(\varphi))$ can be extended to a linear isometry between $L^2(\mathbb{R}^2)^{\odot n}$ and $\mathcal{H}_n$, with the isometric relation $\mathbb{E}[I_n(h^{\otimes n})^2] = n! \, \|h^{\otimes n}\|_{L^2(\mathbb{R}^2)^{\otimes n}}^2$. Consider now a random variable $F \in L^2(\Omega)$; it can be written as
$$F = \sum_{n=0}^\infty I_n(f_n), \tag{1.9}$$
where the series converges in $L^2(\Omega)$, and the coefficients $f_n \in L^2(\mathbb{R}^2)^{\odot n}$ are determined by $F$. This identity is called the Wiener chaos expansion of $F$. When the above $F \in \mathbb{D}^{\infty,2}$, the $n$-th coefficient $f_n$ in the Wiener chaos expansion of $F$ can be explicitly written as [16, Page 3, equation (7)]
$$f_n = \frac{1}{n!}\, \mathbb{E}[D^n F]. \tag{1.10}$$
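As a simple worked example of (1.9)-(1.10), not taken from this note but directly analogous to the computation carried out for $u_\varepsilon$ below, consider the normalized exponential of a single Gaussian $W(h)$ with $h \in L^2(\mathbb{R}^2)$:

```latex
% Chaos expansion of F = exp( W(h) - \|h\|^2/2 ).
% Each Malliavin derivative brings down one factor of h:
D^n F = F \, h^{\otimes n}
\quad\Longrightarrow\quad
f_n = \frac{1}{n!}\,\mathbb{E}[D^n F] = \frac{1}{n!}\, h^{\otimes n},
% since \mathbb{E}[F] = 1 for the normalized exponential. Hence by (1.9):
F = \sum_{n=0}^{\infty} \frac{1}{n!}\, I_n(h^{\otimes n}),
\qquad
\mathbb{E}[F^2] = \sum_{n=0}^{\infty} \frac{\|h\|_{L^2}^{2n}}{n!}
 = e^{\|h\|_{L^2}^2} .
```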

Brownian self-intersection local time and polymer in white noise
The self-intersection local time of the planar Brownian motion is a classical subject in probability theory [4,12,13,17,18]. In the following, we discuss its connections to the parabolic Anderson model.
Using the Feynman-Kac formula, we write the solution to (1.4) as
$$u_\varepsilon(t,x) = \mathbb{E}_B\Big[ u_0(x + B_t) \exp\Big( \int_0^t \dot W_\varepsilon(x + B_s) \, ds - C_\varepsilon t \Big) \Big], \tag{1.11}$$
where $B$ is a standard Brownian motion starting from the origin, independent of $\dot W$, and $\mathbb{E}_B$ denotes the expectation with respect to $B$. Taking expectation with respect to $\dot W$ and using the fact that, for each realization of the Brownian motion, the exponent inside the expectation in (1.11) is Gaussian, we obtain
$$\mathbb{E}[u_\varepsilon(t,x)] = \mathbb{E}_B\Big[ u_0(x+B_t) \exp\Big( \frac12 \int_0^t \int_0^t R_\varepsilon(B_s - B_r) \, ds \, dr - C_\varepsilon t \Big) \Big], \tag{1.12}$$
where we recall that $R_\varepsilon$ is the covariance function of $\dot W_\varepsilon$. It is well-known that, almost surely,
$$\gamma_\varepsilon(t,B) := \int_0^t \int_0^t R_\varepsilon(B_s - B_r) \, ds \, dr - \mathbb{E}_B\Big[\int_0^t \int_0^t R_\varepsilon(B_s - B_r) \, ds \, dr\Big] \to \gamma(t,B) \quad \text{as } \varepsilon \to 0,$$
and $\gamma(t,B)$ is the so-called renormalized self-intersection local time of the planar Brownian motion, formally written as
$$\gamma(t,B) = \int_0^t \int_0^t \big[ \delta_0(B_s - B_r) - \mathbb{E}_B\,\delta_0(B_s - B_r) \big] \, ds \, dr. \tag{1.13}$$
In addition, there exists some critical $t_c > 0$ such that
$$\mathbb{E}_B\big[e^{\gamma(t,B)}\big] < \infty \quad \text{for } t < t_c. \tag{1.14}$$
The renormalization constant in (1.5) matches the expectation in (1.12) up to an $O(1)$ correction, and a calculation as in [6, Lemma 1.1] shows that there exist constants such that (1.15)-(1.16) hold, where $\widehat{\mathbb{E}}_{t,B}$ denotes the expectation with respect to the Wiener measure tilted by the renormalized self-intersection local time,
$$\widehat{\mathbb{E}}_{t,B}[X] = \frac{\mathbb{E}_B\big[X\, e^{\gamma(t,B)}\big]}{\mathbb{E}_B\big[e^{\gamma(t,B)}\big]}$$
for any bounded $X$. By the formal expression in (1.13), we can view $\widehat{\mathbb{E}}_{t,B}$ as the expectation with respect to the annealed measure of a polymer in a white noise environment. By (1.14), it is clear that this measure is absolutely continuous with respect to the Wiener measure for small $t$. Applying the Radon-Nikodym theorem, for any $n \in \mathbb{Z}_+$ and $0 < s_1 < \ldots < s_n \leq t < t_c$, there exists a non-negative measurable function, denoted by $F_{s_1,\ldots,s_n}(x_1,\ldots,x_n)$, such that
$$\widehat{\mathbb{E}}_{t,B}\big[g(B_{s_1},\ldots,B_{s_n})\big] = \int_{\mathbb{R}^{2n}} g(x_1,\ldots,x_n)\, F_{s_1,\ldots,s_n}(x_1,\ldots,x_n) \, dx_1 \cdots dx_n$$
for any bounded measurable $g$. In other words, $F_{s_1,\ldots,s_n}$ is the joint spatial density function of the polymer path at the times $s_1 < \ldots < s_n$. We note that $F$ actually depends on $t$, since the tilted measure does; for our purpose we use the simplified notation, since $t$ is fixed. It is an elementary exercise to show that $F_{s_1,\ldots,s_n}(x_1,\ldots,x_n)$ is jointly measurable in $(s_1,\ldots,s_n,x_1,\ldots,x_n)$. For the convenience of the reader, we present a proof in the appendix.
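The Gaussian computation behind (1.12), and the resulting logarithmic divergence that the renormalization constant must absorb, can be sketched as follows; the constant in the last line depends on the normalization conventions and should be read as an assumption rather than a statement from this note:

```latex
% Conditional on B, X := \int_0^t \dot W_\varepsilon(x+B_s)\,ds is centered Gaussian:
\mathrm{Var}(X \mid B) = \int_0^t\!\!\int_0^t R_\varepsilon(B_s - B_r)\, ds\, dr,
\qquad
\mathbb{E}_W\big[e^{X} \,\big|\, B\big] = e^{\frac12 \mathrm{Var}(X \mid B)} .
% Since B_s - B_r \sim N(0, |s-r| I_2) with density q_u(x) = (2\pi u)^{-1} e^{-|x|^2/2u},
% u = |s-r|, and R_\varepsilon approximates \delta_0 at scale \varepsilon,
\mathbb{E}_B\Big[\int_0^t\!\!\int_0^t R_\varepsilon(B_s-B_r)\,ds\,dr\Big]
 \approx 2\int_{\varepsilon^2}^{t} \frac{t-u}{2\pi u}\, du
 = \frac{2t}{\pi}\,\log \varepsilon^{-1} + O(t),
% so C_\varepsilon must diverge like a multiple of \log(1/\varepsilon).
```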
where $q_t(x) := (2\pi t)^{-1} e^{-|x|^2/2t}$ is the standard heat kernel. With $\mu_1 = \mu_2 = F = 0$, the expansion coefficient is given by $f_n(y_1, \ldots, y_n; t, x)$ with the convention that $y_0 = x$, $y_{n+1} = z$, $s_{n+1} = 0$. Thus, the resulting chaos expansion is obtained by iterating the mild formulation of (1.19). The missing exponential weight $e^{\gamma(t,B)}$ favors self-attraction of the polymer paths, which underlies the intermittency behavior of the parabolic Anderson model. We refer to the recent monograph [11] for more details.
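For comparison, the chaos coefficients obtained by iterating the mild (Duhamel) formulation of the heat equation, i.e. dropping the tilt $e^{\gamma(t,B)}$, take the standard form sketched below, written with the same conventions $y_0 = x$, $y_{n+1} = z$, $s_{n+1} = 0$; this is only the classical expansion for the unrenormalized model, not the coefficient of this note, which carries the additional density $F$:

```latex
% Chaos coefficients from iterating Duhamel's formula for the heat equation
% (conventions: s_0 = t, s_{n+1} = 0, y_0 = x, y_{n+1} = z)
f_n(y_1,\ldots,y_n; t, x)
= \int_{\mathbb{R}^2} \int_{0 < s_n < \cdots < s_1 < t}
  \prod_{j=0}^{n} q_{s_j - s_{j+1}}(y_j - y_{j+1})\;
  u_0(z)\, ds_1 \cdots ds_n\, dz .
% Each heat kernel transports the path between consecutive noise insertions;
% the time simplex has volume t^n/n!, which is what makes the series summable
% for small t.
```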

Proof of the main result
For fixed $t > 0$, $x \in \mathbb{R}^2$, $\varepsilon > 0$ and each realization of the Brownian motion, we write the exponent in (1.11) as $W(\Phi^\varepsilon_{t,x,B})$ for a kernel $\Phi^\varepsilon_{t,x,B} \in L^2(\mathbb{R}^2)$. Then it is easy to see that $u_\varepsilon(t,x) \in \mathbb{D}^{\infty,2}$, with $D^n u_\varepsilon(t,x)$ expressed through $(\Phi^\varepsilon_{t,x,B}(\cdot))^{\otimes n}$, and applying (1.10) we obtain the chaos expansion $u_\varepsilon(t,x) = \sum_{n \geq 0} I_n(f_{\varepsilon,n}(\cdot; t, x))$. By (1.15), we define in (2.4) a quantity which goes to zero as $\varepsilon \to 0$, and rewrite the expansion as (2.5). To prove Theorem 1.1, it suffices to show (2.6) as $\varepsilon \to 0$; the proof of (2.6) reduces to the following three lemmas.
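Concretely, the coefficients $f_{\varepsilon,n}$ come from applying Stroock's formula (1.10) to the Wiener exponential inside the Feynman-Kac formula. The following sketch records the computation; the kernel written below, denoted $\Phi^\varepsilon_{t,x,B}$, is the one matching the tensors $(\cdot)^{\otimes n}$ appearing in the text:

```latex
% Exponent in (1.11): for fixed B,
%   \int_0^t \dot W_\varepsilon(x+B_s)\,ds = W(\Phi^\varepsilon_{t,x,B}),
% with the (deterministic in W) kernel
\Phi^\varepsilon_{t,x,B}(y) = \int_0^t \varphi_\varepsilon(x + B_s - y)\, ds .
% Differentiating the exponential n times, then taking expectation:
D^n u_\varepsilon(t,x) = \mathbb{E}_B\Big[ u_0(x+B_t)\,
   e^{W(\Phi^\varepsilon_{t,x,B}) - C_\varepsilon t}\,
   \big(\Phi^\varepsilon_{t,x,B}\big)^{\otimes n} \Big],
\qquad
f_{\varepsilon,n}(\cdot;t,x) = \frac{1}{n!}\, \mathbb{E}\big[ D^n u_\varepsilon(t,x) \big] .
```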
In the following, we use the notation $a \lesssim b$ when $a \leq Cb$ for some constant $C > 0$ independent of $\varepsilon$ and $n$.
Proof of Lemma 2.1. The estimates for $f_{\varepsilon,n}$ and $\tilde f_{\varepsilon,n}$ are proved in the same way; we take $f_{\varepsilon,n}$ as an example. Here $B^1, B^2$ stand for two independent Brownian motions. Performing the integral in the $y$ variable, the right-hand side of the above display is bounded by

Now we use the Cauchy-Schwarz inequality and (A.1)-(A.2) to derive
An application of Stirling's approximation yields the desired result. By Lemma 2.3, the same estimate holds for f n . The proof is complete.
Proof of Lemma 2.2. By the same argument as in the proof of Lemma 2.1, we can bound $\|f_{\varepsilon,n}(\cdot;t,x) - \tilde f_{\varepsilon,n}(\cdot;t,x)\|^2$ by an expectation over the Brownian motions. Since $\gamma_\varepsilon \to \gamma$ almost surely as $\varepsilon \to 0$, together with (A.3), the random variable inside this expectation converges to zero in probability. The uniform integrability is guaranteed by (A.1) and (A.2). Thus, the right-hand side goes to zero as $\varepsilon \to 0$.

A.2 Estimates on intersection local time
We collect some standard estimates on the intersection local time of the planar Brownian motion. Recall that $R_{\varepsilon_1, \varepsilon_2} = \varphi_{\varepsilon_1} * \varphi_{\varepsilon_2}$, and assume that the Brownian motion is built on the probability space $(\Sigma, \mathcal{A}, \mathbb{P}_B)$.