Martingale approximations for random fields

In this paper we provide necessary and sufficient conditions for the mean square approximation of a random field with an ortho-martingale. The conditions are formulated in terms of projective criteria. Applications are given to linear and nonlinear random fields with independent innovations.


Introduction
A random field consists of multi-indexed random variables $(X_u)_{u\in\mathbb{Z}^d}$. An important class of random fields is the class of ortho-martingales, which were introduced by Cairoli (1969) and have resurfaced in many recent works. The central limit theorem for stationary ortho-martingales was recently investigated by Volný (2015). It is remarkable that Volný (2015) imposed the ergodicity condition in only one direction of the stationary random field. In order to exploit the richness of the martingale techniques, in this paper we obtain necessary and sufficient conditions for an ortho-martingale approximation in mean square. These approximations extend to random fields the corresponding results obtained for sequences of random variables by Dedecker and Merlevède (2002), Zhao and Woodroofe (2008) and Peligrad (2010). The tools for proving these results consist of projection decompositions. We present applications of our results to linear and nonlinear random fields.
We would like to mention several remarkable recent contributions, which provide interesting sufficient conditions for ortho-martingale approximations, by Gordin (2009), Peligrad and Zhang (2017), and Giraudo (2017). A special type of ortho-martingale approximation, the so-called co-boundary decomposition, was studied by El Machkouri and Giraudo (2017) and Volný (2017). Other recent results involve interesting mixingale-type conditions in Wang and Woodroofe (2013) and mixing conditions in Bradley and Tone (2017).
Our results could also be formulated in the language of dynamical systems, leading to new results in this field.

Results
For the sake of clarity, especially due to the complicated notation, we shall first explain the results for double-indexed random fields and, at the end, formulate the results for general random fields. No technical difficulties arise when the double-indexed random field is replaced by a multiple-indexed one.
We shall introduce a stationary random field adapted to a stationary filtration. In order to construct a flexible filtration it is customary to start with a stationary real-valued random field $(\xi_{n,m})_{n,m\in\mathbb{Z}}$ defined on a probability space $(\Omega,\mathcal{K},P)$ and to introduce another stationary random field $(X_{n,m})_{n,m\in\mathbb{Z}}$ defined by
$$X_{n,m}=f(\xi_{i,j},\ i\le n,\ j\le m), \qquad (1)$$
where $f$ is a measurable function defined on $\mathbb{R}^{\mathbb{Z}^2}$. Note that $X_{n,m}$ is adapted to the filtration $\mathcal{F}_{n,m}=\sigma(\xi_{i,j},\ i\le n,\ j\le m)$. Without restricting the generality we shall define $(\xi_u)_{u\in\mathbb{Z}^2}$ in a canonical way on the probability space $\Omega=\mathbb{R}^{\mathbb{Z}^2}$, endowed with the $\sigma$-field $\mathcal{B}$ generated by cylinders.
We construct a probability measure $P'$ on $\mathcal{B}$ such that for all $B\in\mathcal{B}$ and any $m$ and $u_1,\dots,u_m$ we have
$$P'\bigl(x:(x_{u_1},\dots,x_{u_m})\in B\bigr)=P\bigl((\xi_{u_1},\dots,\xi_{u_m})\in B\bigr).$$
The new field $(\xi'_u)_{u\in\mathbb{Z}^2}$ of coordinate projections, $\xi'_u(x)=x_u$, is distributed as $(\xi_u)_{u\in\mathbb{Z}^2}$ and is re-denoted by $(\xi_u)_{u\in\mathbb{Z}^2}$. We shall also re-denote $P'$ by $P$. Now on $\mathbb{R}^{\mathbb{Z}^2}$ we introduce the translation operators
$$T^u\bigl((x_v)_{v\in\mathbb{Z}^2}\bigr)=(x_{v+u})_{v\in\mathbb{Z}^2}.$$
Two of them will play an important role in our paper, namely $T=T^{(1,0)}$ and $S=T^{(0,1)}$. By interpreting the indexes as notations for the lines and columns of a matrix, we shall call $T$ the vertical shift and $S$ the horizontal shift. We assume that $X_{0,0}$ is centered and square integrable. We notice that the variables $(X_{n,m})_{n,m\in\mathbb{Z}}$ are adapted to the filtration $(\mathcal{F}_{n,m})_{n,m\in\mathbb{Z}}$. To compensate for the fact that, in the context of random fields, the future and the past do not have a unique interpretation, we shall consider commuting filtrations, i.e.
$$\mathbb{E}\bigl(\mathbb{E}(X\mid\mathcal{F}_{a,b})\mid\mathcal{F}_{u,v}\bigr)=\mathbb{E}(X\mid\mathcal{F}_{a\wedge u,\,b\wedge v}). \qquad (3)$$
This type of filtration is induced, for instance, by an initial random field $(\xi_{n,m})_{n,m\in\mathbb{Z}}$ of independent random variables or, more generally, can be induced by stationary random fields $(\xi_{n,m})_{n,m\in\mathbb{Z}}$ where only the columns are independent, i.e. $\eta_m=(\xi_{n,m})_{n\in\mathbb{Z}}$, $m\in\mathbb{Z}$, are independent. This model often appears in statistical applications, when one deals with repeated realizations of a stationary sequence.
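The commuting property under independence can be illustrated on a toy finite probability space. The sketch below is our own illustration, not part of the paper: four independent signs play the role of the innovations $\xi_{i,j}$, $0\le i,j\le 1$, and we verify the identity $\mathbb{E}_{1,0}\mathbb{E}_{0,1}(X)=\mathbb{E}_{0,0}(X)$ for one choice of $X$ by brute-force conditioning.

```python
from itertools import product

# Four independent +-1 innovations xi[(i,j)], (i,j) in {0,1}^2, uniform over
# the 16 outcomes; F_{a,b} is generated by the coordinates (i,j), i<=a, j<=b.
coords = [(0, 0), (0, 1), (1, 0), (1, 1)]
outcomes = list(product([-1, 1], repeat=4))  # equally likely outcomes

def cond_exp(f, known):
    # E(f | sigma(xi_c : c in known)): average f over the other coordinates
    def g(omega):
        def match(w):
            return all(w[i] == omega[i] for i, c in enumerate(coords) if c in known)
        vals = [f(w) for w in outcomes if match(w)]
        return sum(vals) / len(vals)
    return g

def F(a, b):
    # the index set generating F_{a,b}
    return {(i, j) for (i, j) in coords if i <= a and j <= b}

# a test function of the field: X = xi_{0,0} + xi_{0,0} xi_{0,1} xi_{1,0}
X = lambda w: w[0] + w[0] * w[1] * w[2]

lhs = cond_exp(cond_exp(X, F(0, 1)), F(1, 0))  # E_{1,0} E_{0,1} (X)
rhs = cond_exp(X, F(0, 0))                     # E_{0,0} (X), since (1,0)^(0,1)=(0,0)
```

Here `lhs` and `rhs` agree pointwise, as the commuting property predicts for independent innovations.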
It is interesting to point out that commuting filtrations can be described by the following equivalent formulation: for $a\ge u$ we have
$$\mathbb{E}_{u,v}\,\mathbb{E}_{a,b}(X)=\mathbb{E}_{u,\,v\wedge b}(X),$$
where, as usual, $a\wedge b$ stands for the minimum of $a$ and $b$. This follows from a Markovian-type property (see, for instance, Problem 34.11 in Billingsley, 1995). Below we use the notations
$$\mathbb{E}_{a,b}(X)=\mathbb{E}(X\mid\mathcal{F}_{a,b}),\qquad S_{n,m}=\sum_{i=1}^{n}\sum_{j=1}^{m}X_{i,j}.$$
For an integrable random variable $X$, we introduce the projection operators defined by
$$P_{u,v}(X)=\bigl(\mathbb{E}_{u,v}-\mathbb{E}_{u-1,v}-\mathbb{E}_{u,v-1}+\mathbb{E}_{u-1,v-1}\bigr)(X).$$
Note that, by (3), an easy computation and stationarity, the projections $P_{u,v}$ are pairwise orthogonal in $L^2$. We shall now introduce the definition of an ortho-martingale, which will be referred to as a martingale with multiple indexes or, simply, a martingale.

Definition 1 Let $d$ be a measurable function and define
$$D_{n,m}=d(\xi_{i,j},\ i\le n,\ j\le m). \qquad (5)$$
Assume integrability. We say that $(D_{n,m})_{n,m\in\mathbb{Z}}$ is a field of martingale differences if $\mathbb{E}_{a,b}(D_{n,m})=0$ when either $a<n$ or $b<m$.
In the sequel we shall denote by $\|\cdot\|$ the norm in $L^2$. By $\Rightarrow$ we denote convergence in distribution.

Definition 2
We say that a random field $(X_{n,m})_{n,m\in\mathbb{Z}}$ defined by (1) admits a martingale approximation if there is a field of martingale differences $(D_{n,m})_{n,m\in\mathbb{Z}}$ defined by (5) such that
$$\lim_{n\wedge m\to\infty}\frac{1}{nm}\,\mathbb{E}\Bigl(S_{n,m}-\sum_{i=1}^{n}\sum_{j=1}^{m}D_{i,j}\Bigr)^{2}=0. \qquad (6)$$

Theorem 3 Assume that (3) holds. The random field $(X_{n,m})_{n,m\in\mathbb{Z}}$ defined by (1) admits a martingale approximation if and only if condition (7) holds and both conditions in (8) hold.

Remark 4 Condition (8) in Theorem 3 can be replaced by condition (9).

Theorem 5 Assume that (3) holds. The random field $(X_{n,m})_{n,m\in\mathbb{Z}}$ defined by (1) admits a martingale approximation if and only if condition (10) and condition (9) hold.

Corollary 6
Assume that the vertical shift $T$ (or the horizontal shift $S$) is ergodic and that either the conditions of Theorem 3 or those of Theorem 5 hold. Then
$$\frac{S_{n,m}}{\sqrt{nm}}\Rightarrow N(0,c^{2})\quad\text{as } n\wedge m\to\infty,$$
where $c^{2}=\mathbb{E}(D_{0,0}^{2})$.
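The normalization in the corollary can be illustrated numerically. The following Monte Carlo sketch is our own toy choice, not the paper's: we take the simplest admissible field, an i.i.d. standard normal field, for which $D_{n,m}=X_{n,m}$ and $c^2=1$, and estimate $\mathbb{E}(S_{n,m}^2)/nm$.

```python
import random

def partial_sum(n, m, rng):
    # S_{n,m}: sum over an n x m rectangle of i.i.d. standard normal innovations
    return sum(rng.gauss(0.0, 1.0) for _ in range(n * m))

rng = random.Random(0)
n, m, reps = 20, 20, 2000
# E(S_{n,m}^2) / (nm) estimates c^2, the variance of the normal limit
estimate = sum(partial_sum(n, m, rng) ** 2 for _ in range(reps)) / (reps * n * m)
```

For this field the estimate is close to $c^2=1$, consistent with $S_{n,m}/\sqrt{nm}\Rightarrow N(0,1)$.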

Proofs
Proof of Theorem 3. We start from the following orthogonal representation:
$$S_{n,m}=\sum_{i=1}^{n}\sum_{j=1}^{m}P_{i,j}(S_{n,m})+R_{n,m},$$
with $R_{n,m}=\mathbb{E}_{n,0}(S_{n,m})+\mathbb{E}_{0,m}(S_{n,m})-\mathbb{E}_{0,0}(S_{n,m})$.
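For completeness, the double telescoping behind this representation can be written out, assuming the projection operators are $P_{i,j}=\mathbb{E}_{i,j}-\mathbb{E}_{i-1,j}-\mathbb{E}_{i,j-1}+\mathbb{E}_{i-1,j-1}$:

```latex
\begin{align*}
\sum_{i=1}^{n}\sum_{j=1}^{m} P_{i,j}(S_{n,m})
&= \sum_{i=1}^{n}\sum_{j=1}^{m}
   \bigl(\mathbb{E}_{i,j}-\mathbb{E}_{i-1,j}-\mathbb{E}_{i,j-1}+\mathbb{E}_{i-1,j-1}\bigr)(S_{n,m}) \\
&= \bigl(\mathbb{E}_{n,m}-\mathbb{E}_{0,m}-\mathbb{E}_{n,0}+\mathbb{E}_{0,0}\bigr)(S_{n,m}) \\
&= S_{n,m}-R_{n,m},
\end{align*}
```

where the last equality uses that $S_{n,m}$ is $\mathcal{F}_{n,m}$-measurable, so $\mathbb{E}_{n,m}(S_{n,m})=S_{n,m}$.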

Proof of Theorem 5
Let us first note that $D_{1,1}$ is defined by (10). A simple computation involving the properties of the conditional expectation and the martingale property shows that, by (10), the martingale approximation holds by (9). Let us assume now that we have a martingale approximation. According to Theorem 3, condition (7) is satisfied. In order to show that (7) implies (10) we apply the Cauchy-Schwarz inequality twice. Also, by the triangle inequality, (9) follows.

Proof of Remark 4
If we have a martingale approximation, then by Theorem 3 we have (7) and by Theorem 5 we have (9). Now, in the opposite direction, just note that (7) implies (10) and then apply Theorem 5.

Multidimensional index sets
The extensions to random fields indexed by $\mathbb{Z}^d$, for $d>2$, are straightforward, following the same lines of proof as for a double-indexed random field. By $u\le n$ we understand $u=(u_1,\dots,u_d)$, $n=(n_1,\dots,n_d)$ and $1\le u_1\le n_1,\dots,1\le u_d\le n_d$. We shall start with a strictly stationary real-valued random field $\xi=(\xi_u)_{u\in\mathbb{Z}^d}$, defined on the canonical probability space $\mathbb{R}^{\mathbb{Z}^d}$, and define the filtrations $\mathcal{F}_u=\sigma(\xi_j:\ j\le u)$. We shall say that the filtration is commuting if $\mathbb{E}_u\mathbb{E}_a(X)=\mathbb{E}_{u\wedge a}(X)$, where the minimum is taken coordinate-wise and $\mathbb{E}_u(X)=\mathbb{E}(X\mid\mathcal{F}_u)$. We define $S_n=\sum_{u\le n}X_u$. We also define the coordinate-wise translations $T_i$, $1\le i\le d$. Let $d$ be a measurable function and define $D_m=d(\xi_j,\ j\le m)$. Assume integrability. We say that $(D_m)_{m\in\mathbb{Z}^d}$ is a field of martingale differences if $\mathbb{E}_a(D_m)=0$ if at least one coordinate of $a$ is strictly smaller than the corresponding coordinate of $m$. We have to introduce the $d$-dimensional projection operator; by using the fact that the filtration is commuting, it is convenient to define it as the composition of the one-dimensional projections, applied in each coordinate. We say that a random field $(X_n)_{n\in\mathbb{Z}^d}$ admits a martingale approximation if there is a field of martingale differences $(D_m)_{m\in\mathbb{Z}^d}$ such that
$$\lim_{\min_{1\le i\le d}n_i\to\infty}\frac{1}{|n|}\,\mathbb{E}\Bigl(S_n-\sum_{u\le n}D_u\Bigr)^{2}=0,$$
where $|n|=n_1\cdots n_d$. Let us also introduce the regularity condition (17).

Theorem 7 Assume that the filtration is commuting. The following statements are equivalent: (a) The random field $(X_n)_{n\in\mathbb{Z}^d}$ admits a martingale approximation. (b) The random field satisfies (17) and (18). (c) The random field satisfies (18) and, for all $j$, $1\le j\le d$, condition (19) holds, where $n_j\in\mathbb{Z}^d$ has the $j$-th coordinate equal to $0$ and the other coordinates equal to the coordinates of $n$.
(d) The random field satisfies (17) and condition (20) holds.

Corollary 8 Assume that one of the shifts $(T_i)_{1\le i\le d}$ is ergodic and that either one of the conditions of Theorem 7 holds. Then
$$\frac{S_n}{\sqrt{|n|}}\Rightarrow N(0,c^{2})\quad\text{as }\min_{1\le i\le d}n_i\to\infty,$$
where $c^{2}=\|D_0\|^{2}$.

Examples
Let us apply these results to linear and nonlinear random fields with independent innovations.
Example 9 (Linear field) Let $(\xi_n)_{n\in\mathbb{Z}^d}$ be a random field of independent, identically distributed random variables which are centered and have finite second moment. Define
$$X_n=\sum_{j\ge 0}a_j\,\xi_{n-j}.$$
Assume that $\sum_{j\ge 0}a_j^2<\infty$ and denote $b_j=\sum_{k=1}^{j}a_k$. Under two additional conditions on the coefficients, the martingale approximation holds.
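A quick numerical sanity check of the second-moment bookkeeping in this example; the finite coefficient set below is our own hypothetical choice (all other $a_j=0$), so the square-summability condition holds trivially, and with standard normal innovations ($\sigma^2=1$) independence gives $\mathbb{E}(X_0^2)=\sum_{j\ge 0}a_j^2$.

```python
import random

# hypothetical finite set of coefficients a_j, j in Z_+^2 (all other a_j = 0)
coeffs = {(0, 0): 1.0, (1, 0): 0.5, (0, 1): -0.5, (2, 3): 0.25}

def X0(innov):
    # X_0 = sum_{j >= 0} a_j xi_{-j}; only the innovations entering X_0 matter
    return sum(a * innov[j] for j, a in coeffs.items())

rng = random.Random(1)
reps = 20000
second_moment = 0.0
for _ in range(reps):
    innov = {j: rng.gauss(0.0, 1.0) for j in coeffs}  # i.i.d. standard normals
    second_moment += X0(innov) ** 2
second_moment /= reps

exact = sum(a * a for a in coeffs.values())  # E(X_0^2) by independence
```

The empirical second moment of $X_0$ matches $\sum_j a_j^2$ up to Monte Carlo error.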
Proof of Example 9. The result follows by simple computations and by applying Theorem 7 (d).
Example 10 (Volterra field) Let $(\xi_n)_{n\in\mathbb{Z}^d}$ be a random field of independent, identically distributed, centered random variables with finite second moment $\sigma^2$. Define
$$X_n=\sum_{u,v\ge 0}a_{u,v}\,\xi_{n-u}\,\xi_{n-v},$$
where the $a_{u,v}$ are real coefficients with $a_{u,u}=0$ and $\sum_{u,v\ge 0}a_{u,v}^2<\infty$, and assume that
$$\frac{\mathbb{E}(S_n^2)}{|n|}\to c^2\sigma^2\quad\text{when }\min_{1\le i\le d}n_i\to\infty.$$
Then the martingale approximation holds.
Note that $P_0(\xi_{k-u}\xi_{k-v})\neq 0$ if and only if $k-u=0$ and $k-v=w$ with $w\le 0$, or $k-v=0$ and $k-u=t$ with $t\le 0$. Therefore
$$P_0(X_k)=\sum_{w\le 0}\bigl(a_{k,k-w}+a_{k-w,k}\bigr)\,\xi_0\,\xi_w.$$
It remains to apply Theorem 7 (d).
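The displayed projection formula can be checked symbolically. The sketch below is our own, for $d=1$ and a hypothetical finite coefficient array: it applies $P_0$ monomial by monomial and compares the result with the closed form.

```python
# Symbolic check (d = 1) of the projection formula for the Volterra field.
# A monomial xi_i xi_j (i != j) is stored as the sorted pair (i, j); for
# independent centered innovations, E_0(xi_i xi_j) = xi_i xi_j when
# max(i, j) <= 0 and vanishes when an index is positive, so
# P_0 = E_0 - E_{-1} keeps exactly the monomials with max(i, j) == 0.

K, k = 4, 2   # hypothetical coefficient range 0 <= u, v <= K; time index of X_k
a = {(u, v): 0.1 * u - 0.3 * v
     for u in range(K + 1) for v in range(K + 1) if u != v}  # a_{u,u} = 0

def P0_by_projection():
    # apply P_0 term by term to X_k = sum a_{u,v} xi_{k-u} xi_{k-v}
    out = {}
    for (u, v), coef in a.items():
        i, j = k - u, k - v
        if max(i, j) == 0:
            key = (min(i, j), max(i, j))
            out[key] = out.get(key, 0.0) + coef
    return {key: c for key, c in out.items() if abs(c) > 1e-12}

def P0_by_formula():
    # P_0(X_k) = sum_{w <= 0} (a_{k,k-w} + a_{k-w,k}) xi_0 xi_w
    out = {}
    for w in range(k - K, 1):
        coef = a.get((k, k - w), 0.0) + a.get((k - w, k), 0.0)
        if abs(coef) > 1e-12:
            out[(w, 0)] = coef
    return out
```

Both routines produce the same polynomial in the innovations, confirming the formula for this coefficient array.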