Weak dependence and GMM estimation of supOU and mixed moving average processes

We consider a mixed moving average (MMA) process X driven by a Lévy basis and prove that it is weakly dependent with rates computable in terms of the moving average kernel and the characteristic quadruple of the Lévy basis. Using this property, we show conditions ensuring that sample mean and autocovariances of X have a limiting normal distribution. We extend these results to stochastic volatility models and then investigate a Generalized Method of Moments estimator for the supOU process and the supOU stochastic volatility model after choosing a suitable distribution for the mean reversion parameter. For these estimators, we analyze the asymptotic behavior in detail.


Introduction
Lévy-driven continuous-time moving average processes, i.e. processes $(X_t)_{t\in\mathbb{R}}$ of the form $X_t = \int_{\mathbb{R}} f(t-s)\,dL_s$ with $f$ a deterministic function and $L$ a Lévy process, are frequently used to model time series, especially when dealing with data observed at high frequency. Moreover, causal moving averages can be used to model the volatility process when the dynamics of a logarithmic financial asset price are modeled. Popular examples include, for instance, CARMA processes [16,39], the increments of fractionally integrated Lévy processes [38] and non-Gaussian Ornstein-Uhlenbeck type processes [9], where $f(s) = e^{as}\mathbf{1}_{[0,\infty)}(s)$ with $a\in\mathbb{R}_-$. By allowing $f$ to depend on a random parameter $A$ and replacing the Lévy process by a Lévy basis one arrives at so-called mixed moving averages (MMA in short), as for instance in [3,10,27].
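As a minimal numerical sketch (our illustration, not from the paper), such a moving average with the Ornstein-Uhlenbeck kernel $f(s) = e^{as}\mathbf{1}_{[0,\infty)}(s)$ can be approximated on a grid by convolving the kernel with the increments of the driving Lévy process; here a Brownian driver stands in for a general Lévy process and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Grid approximation of X_t = int_R f(t - s) dL_s with the causal OU kernel
# f(s) = exp(a*s) on [0, inf), a < 0.  The driving Levy process is taken to
# be a Brownian motion purely for illustration.
dt, n, a = 0.01, 20_000, -0.5
dL = rng.normal(0.0, np.sqrt(dt), size=n)       # increments of L on the grid
kernel = np.exp(a * np.arange(n) * dt)          # f evaluated on the grid
X = np.convolve(dL, kernel, mode="full")[:n]    # X_i ~ sum_{j<=i} f((i-j) dt) dL_j

# The stationary variance should be close to 1 / (2|a|) = 1 for these values.
```

Any Lévy driver with simulatable increments (e.g. a compound Poisson process) could replace the Gaussian increments above without changing the convolution step.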
An important example of MMA processes are the supOU processes studied in [3,10,25,27]. In the univariate case, assume $\int_{|x|>1}\log(|x|)\,\nu(dx)<\infty$ and $\int_{\mathbb{R}_-} -\frac{1}{A}\,\pi(dA)<\infty$, where $\nu$ is a Lévy measure and $\pi$ is the probability distribution on $\mathbb{R}_-$ of the random parameter $A$; see Definition 2.1 for details. If $\Lambda$ is a Lévy basis with those characteristics, then the resulting process is called a supOU process. Whereas a non-Gaussian Ornstein-Uhlenbeck process necessarily exhibits the autocorrelation $e^{ah}$ for $h\in\mathbb{N}$, the supOU process has a flexible dependence structure. For example, its autocorrelations can show a polynomial decay depending on the probability distribution $\pi$. Moreover, when a discrete probability distribution $\pi$ for the random parameter $A$ is considered, we obtain a popular model used, for example, in stochastic volatility modeling [3], in modeling fractal activity times [36,37] and in astrophysics [35].
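To illustrate the flexibility gained by randomizing the mean reversion parameter, the following sketch (our illustration, not the paper's construction) approximates a supOU process by a finite superposition of independent OU processes whose rates $A_i$ are drawn from a Gamma distribution on $\mathbb{R}_-$; the Gaussian driver, the number of components and all parameters are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Finite-superposition approximation of a supOU process: a sum of m
# independent OU processes, each with its own mean-reversion rate A_i drawn
# from pi = distribution of -G with G ~ Gamma(alpha, 1).
m, n, dt, alpha = 200, 5_000, 0.01, 1.5
A = -rng.gamma(shape=alpha, scale=1.0, size=m)   # mean-reversion rates in R_-
ou = np.zeros(m)                                  # current state of each component
path = np.empty(n)
for t in range(n):
    dL = rng.normal(0.0, np.sqrt(dt), size=m)     # independent increments per component
    ou = ou + A * ou * dt + dL                     # Euler step for each OU component
    path[t] = ou.sum() / np.sqrt(m)               # normalized superposition

# Unlike a single OU process, the autocovariance of `path` mixes the
# exponential rates exp(A_i * h) over the Gamma distribution, producing a
# polynomial rather than exponential decay.
```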
MMA processes can also be used, under suitable conditions, as building blocks for more complex models. In this paper we study the class of MMA stochastic volatility models. An example of this class is the supOU SV model, defined in [10,11], where the log-price process (of some financial asset) is defined for $t\in\mathbb{R}_+$ as $J_t = \int_0^t X_s\,dW_s$, $J_0 = 0$, where $(W_s)_{s\in\mathbb{R}_+}$ is a standard Brownian motion independent of the process $(X_s)_{s\in\mathbb{R}_+}$, which is a non-negative supOU process. Some examples of applications of the supOU SV model can be found in [12,32,49].
The aim of this paper is twofold: first, to show that the sample moments of an MMA process and of the returns of an MMA SV model have a limiting normal distribution; secondly, to develop a statistical estimation procedure for the MMA and MMA SV model in a semi-parametric framework, where the distribution of the random parameter $A$ is specified in detail, and to establish its asymptotic properties.
To this end, it is of high importance to understand the dependence structure of the class of MMA processes. In [27], it is shown that an MMA process driven by a Lévy basis is mixing. However, in order to prove distributional limit theorems, which enable valid asymptotic inference, stronger notions of asymptotic independence are needed. Often one applies strong mixing properties (see [21,46]) to this end. Usually they are established by using a Markovian representation and showing its geometric ergodicity. In turn, this often requires smoothness conditions on the driving random noise, and it is well known that even autoregressive processes of order one are not strongly mixing when the distribution of the noise is not sufficiently regular (see [1]). We want to obtain results for MMA processes in general, which typically have no suitable Markovian representation, and without regularity conditions on the driving Lévy basis apart from moment conditions. As will become obvious later on, the weak dependence concepts introduced by Doukhan and Louhichi [23] and Doukhan and Dedecker [18], respectively called η-weak dependence and θ-weak dependence, are very suitable for our purposes. For an extensive introduction to the weak dependence of causal and non-causal processes we refer the reader to [19]. We then show the asymptotic normality of the sample mean and the sample autocovariance functions of an MMA process in its non-causal and causal specification. Moreover, for the MMA stochastic volatility models, we show the θ-weak dependence of the return process and the distributional limit of its sample moments. In [29,30,31], the limiting behavior of integrated supOU processes and of their partial sums is analyzed in relation to the growth rate of their moments, called intermittency when the growth rate is fast.
This leads to conclusions regarding their asymptotic finite-dimensional distributions and identifies different limit theorems depending on whether the supOU process exhibits short or long memory. In our paper, for short memory supOU processes and the more general MMA processes and MMA SV models, we can additionally, exploiting the weak dependence properties, give conditions under which functional central limit theorems hold in distribution, as well as consider general moments.
Later in the paper, we discuss a Generalized Method of Moments (GMM in short) procedure to estimate the parameters of a supOU process and a supOU SV model. Unfortunately, the classical and efficient maximum likelihood approach does not seem applicable in this case, since the density of a supOU process is not known in general. However, the supOU process has a known moment structure, and GMM estimators can be defined as in [48]. In a semi-parametric framework, we consider in detail the case in which the random parameter $A$ is Gamma distributed and the moment functions are known in closed form. For the GMM estimators of the supOU process and of the return process of a supOU SV model we show asymptotic normality (their consistency has been shown in [48]). Finally, via an explicit computation of the third and fourth order cumulants of the supOU and return processes, we give the explicit form of the asymptotic covariance matrices of the GMM estimators.
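The GMM idea can be sketched numerically as follows: given sample moments and a model-implied moment map, one minimizes a quadratic form in their difference. The moment map below is a hypothetical stand-in with an autocovariance decaying like $(1+h)^{-\alpha}$ (the true closed-form supOU moments under a Gamma-distributed mean reversion parameter are those of [48] and Section 6, not reproduced here):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical model moments: mean, variance and lag-1 autocovariance of a
# process whose autocovariance decays like (1 + h)^(-alpha).
def model_moments(theta):
    mu, sigma2, alpha = theta
    return np.array([mu, sigma2, sigma2 * 2.0 ** (-alpha)])

# GMM objective: quadratic form g' W g in the moment mismatch g.
def gmm_objective(theta, m_hat, W):
    g = model_moments(theta) - m_hat
    return g @ W @ g

m_hat = np.array([0.1, 1.0, 0.35])     # toy "sample" moments
W = np.eye(3)                          # identity weighting matrix
res = minimize(gmm_objective, x0=np.array([0.0, 0.5, 1.0]),
               args=(m_hat, W), method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-12, "maxiter": 5000})
```

A two-step GMM would re-estimate the weighting matrix W from the data; the identity matrix keeps the sketch minimal.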
Interestingly, our results can also be seen as a first step towards an estimation theory for (homogeneous and stationary) ambit processes, which include an additional multiplicative random input in the definition of an MMA process, see [4,5,8].
The paper is organized as follows. In Section 2, the definition of a Lévy basis and MMA process is given. In Section 3, the weak dependence properties of an MMA process are discussed. In Section 4, the asymptotic distributions of the moments of non-causal and causal MMA processes are shown. In Section 5, the definition of an MMA SV model is given and the θ-weak dependence of the return process is analyzed along with the asymptotics of its sample moments. In Section 6, the asymptotic normality of the GMM estimators of the supOU process and of the supOU SV model is then proven.

Lévy bases and mixed moving average processes
We start with some preliminary results leading to the definition of an MMA process. Throughout, we assume that all random variables and processes are defined on a given complete probability space $(\Omega, \mathcal{A}, P)$, equipped with a filtration when relevant. Let $S$ denote a non-empty topological space, $B(S)$ the Borel σ-field on $S$, $\pi$ some probability measure on $(S, B(S))$ and $B_b(S\times\mathbb{R})$ the bounded Borel sets of $S\times\mathbb{R}$. The underlying Lévy process of a Lévy basis $\Lambda$ is given by $L_t = \Lambda(S\times(0, t])$ and $L_{-t} = -\Lambda(S\times(-t, 0))$ for $t\in\mathbb{R}_+$.
The quadruple (γ, Σ, ν, π) determines the distribution of the Lévy basis completely and therefore it is called the generating quadruple.
In the following, norms of vectors or matrices are denoted by $\|\cdot\|$. We work with the Euclidean norm or its induced operator norm unless otherwise stated. However, due to the equivalence of all norms, none of the results in the paper depends on the choice of the norm. For more information on $\mathbb{R}^d$-valued Lévy bases see [43] and [45].
Following [43], it can be shown that a Lévy basis has a Lévy-Itô decomposition.
Furthermore, the integral with respect to µ exists as a Lebesgue integral for all ω ∈ Ω.
Here an $\mathbb{R}^d$-valued Lévy basis $\tilde\Lambda$ on $S\times\mathbb{R}$ is called a modification of a Lévy basis $\Lambda$ if $\tilde\Lambda(B) = \Lambda(B)$ a.s. for all $B\in B_b(S\times\mathbb{R})$. We refer the reader to [34, Section 2.1] for further details on the integration with respect to Poisson random measures.
We also recall the following multivariate extension of [45,Theorem 2.7]. We denote by A ′ the transpose of a matrix A in what follows.
Then $f$ is $\Lambda$-integrable as a limit in probability in the sense of Rajput and Rosiński [45] if and only if the integrability conditions (2.4)-(2.6) are satisfied, the latter for all Borel sets $B\subseteq\mathbb{R}^n\setminus\{0\}$.
imsart-ejs ver. 2020/08/06 file: weak_dependence_final_CuratoStelzer.tex

Implicitly, we assume that $\Sigma_{int}$ or $\nu_{int}$ are different from zero throughout the paper, to rule out the deterministic case.
When the underlying Lévy process has finite variation we can do ω-wise Lebesgue integration; that is, the integral can be obtained as a Lebesgue integral for each ω ∈ Ω.
Corollary 2.1. Let $\Lambda$ be an $\mathbb{R}^d$-valued Lévy basis with characteristic quadruple $(\gamma, 0, \nu, \pi)$ satisfying $\int_{\|x\|\le 1}\|x\|\,\nu(dx)<\infty$, and define $\gamma_0$ as in (2.3). Then (2.12) holds, and the right-hand side is a Lebesgue integral for every $\omega\in\Omega$.

The above corollary follows immediately from the Lévy-Itô decomposition (2.2) and the usual integration theory with respect to a Poisson random measure. We notice that (2.12) is an immediate consequence of working with an underlying Lévy process of finite variation, as no compensation of the small jumps is needed if $\int_{\|x\|\le 1}\|x\|\,\nu(dx)<\infty$.
We can now introduce an MMA process driven by a Lévy basis.
The process (2.13) is well defined for each $t\in\mathbb{R}$, infinitely divisible and strictly stationary. It is called an $n$-dimensional mixed moving average process and $f$ its kernel function. We conclude the section by giving sufficient conditions ensuring the finiteness of moments of an MMA process.
Proposition 2.1. Let X be an n-dimensional MMA process driven by a Lévy basis Λ satisfying the conditions of Theorem 2.2.
Proof. Following [47, Corollary 25.8], we have to show the finiteness of the relevant moment integrals; since $\nu$ is a Lévy measure, (i) and (ii) then follow.
If the underlying Lévy process $L$ is of finite variation, an analogous proof gives the following result.
Corollary 2.2. Let X be an n-dimensional MMA process driven by a Lévy basis Λ satisfying the conditions of Corollary 2.1.

Weak dependence properties of a mixed moving average process
Let $(\mathcal{A}_t)_{t\in\mathbb{R}}$ be the filtration generated by $\Lambda$, i.e. $\mathcal{A}_t$ is the σ-algebra generated by the set of random variables $\{\Lambda(B) : B\in B(S\times(-\infty, t])\}$ for $t\in\mathbb{R}$. If an MMA process is adapted to $(\mathcal{A}_t)_{t\in\mathbb{R}}$, we call it causal; otherwise it is referred to as non-causal. In the following, we denote by $\mathbb{N}$ the set of non-negative integers, by $\mathbb{N}^*$ the set of positive integers, by $\mathbb{R}_-$ the set of negative real numbers and by $\mathbb{R}_+$ the set of non-negative real numbers.
where $F_u$ is the class of bounded functions from $(\mathbb{R}^n)^u$ to $\mathbb{R}$ that are Lipschitz with respect to the distance $\delta_1$ on $(\mathbb{R}^n)^u$ defined by $\delta_1(x^*, y^*) = \sum_{i=1}^u \|x_i - y_i\|$, where $x^* = (x_1,\dots,x_u)$, $y^* = (y_1,\dots,y_u)$ and $x_i, y_i\in\mathbb{R}^n$ for all $i = 1,\dots,u$.
We consider $\mathbb{R}^n$ equipped with the Euclidean norm.

Definition 3.1. A process $X = (X_t)_{t\in\mathbb{R}}$ with values in $\mathbb{R}^n$ is called an η-weakly dependent process if there exists a sequence $(\eta(r))_{r\in\mathbb{R}_+}$ converging to 0 such that

$|\mathrm{Cov}(F(X_{i_1},\dots,X_{i_u}), G(X_{j_1},\dots,X_{j_v}))| \le c\,(u\,\mathrm{Lip}(F) + v\,\mathrm{Lip}(G))\,\eta(r)$

for all $(u, v)\in\mathbb{N}^*\times\mathbb{N}^*$, $r\in\mathbb{R}_+$, indices $i_1\le\dots\le i_u\le i_u + r\le j_1\le\dots\le j_v$ and functions $F, G\in H$, where $c$ is a constant independent of $r$. We call $(\eta(r))_{r\in\mathbb{R}_+}$ the sequence of the η-coefficients.
The above definition makes the asymptotic independence between past and future explicit: the past is progressively forgotten. In terms of the process $X$, past and future are elementary events defined through the finite-dimensional marginals $A_u = (X_{i_1},\dots,X_{i_u})$ and $B_v = (X_{j_1},\dots,X_{j_v})$, respectively. The weak dependence property, as stated in Definition 3.1, depends upon the class of functions $H = \{f\in F : \|f\|_\infty\le 1\}$; it can also be defined on $F$, as discussed in [23]. Note that a similar definition can be given for the strong mixing property introduced by Rosenblatt [46]. Let $\sigma(A_u)$ and $\sigma(B_v)$ be the σ-algebras generated by the finite-dimensional marginals $A_u$ and $B_v$, and let $F^* = \cup_{u\in\mathbb{N}} F^*_u$, where $F^*_u$ is the class of bounded functions from $(\mathbb{R}^n)^u$ to $\mathbb{R}$. Defining the corresponding coefficient (3.3) over $H^* = \{f\in F^* : \|f\|_\infty\le 1\}$ yields the α-strong mixing coefficient. We notice that definition (3.2) involves functions in $H$, whereas (3.3) involves functions belonging to $H^*$. This means that if a process $X$ is strongly mixing, it is also weakly dependent, but the reverse implication does not necessarily hold. The only known case of equivalence of the two definitions can be found in [22, Proposition 1], where it is shown that an η-weakly dependent integer-valued process satisfies the strong mixing condition.
We now show that a non-causal MMA process is an η-weakly dependent process.
Proposition 3.1. Let $\Lambda$ be an $\mathbb{R}^d$-valued Lévy basis with characteristic quadruple $(\gamma, \Sigma, \nu, \pi)$ such that $E[L_1] = 0$ and $\int_{\|x\|>1}\|x\|^2\,\nu(dx)<\infty$, and let $f : S\times\mathbb{R}\to M_{n\times d}(\mathbb{R})$ be a $B(S\times\mathbb{R})$-measurable function with $f\in L^2(S\times\mathbb{R}, \pi\otimes\lambda)$. Then the resulting MMA process $X$ is an η-weakly dependent process.

Proof. Since the kernel function $f$ is square integrable, properties (2.5) and (2.6) hold, so $f$ is $\Lambda$-integrable (Theorem 2.2) and $X$ is well defined. Moreover, Proposition 2.1 applies, $E[\|X_t\|^2]<\infty$ for all $t\in\mathbb{R}$, and we can determine an upper bound for the expectation $E\|X_t - X_t^{(m)}\|$, which, due to the stationarity of $X$, is independent of $t$. Let $F$ and $G$ belong to the class of bounded functions $H$, $(u, v)\in\mathbb{N}^*\times\mathbb{N}^*$, $r\in\mathbb{R}_+$, $(i_1,\dots,i_u)\in\mathbb{R}^u$ and $(j_1,\dots,j_v)\in\mathbb{R}^v$. The resulting covariance bound converges to zero as $r$ goes to infinity by applying the dominated convergence theorem.
The following corollary establishes the η-weak dependence of an MMA process when its underlying Lévy process has finite mean, possibly different from zero.
Corollary 3.1. Let $\Lambda$ be an $\mathbb{R}^d$-valued Lévy basis with characteristic quadruple $(\gamma, \Sigma, \nu, \pi)$ and $\int_{\|x\|>1}\|x\|^2\,\nu(dx)<\infty$, and let $f : S\times\mathbb{R}\to M_{n\times d}(\mathbb{R})$ be a $B(S\times\mathbb{R})$-measurable function satisfying assumption (2.4) with $f\in L^2(S\times\mathbb{R}, \pi\otimes\lambda)$. Then the resulting MMA process $X$ is an η-weakly dependent process.

Proof. Let $\tilde X_t = X_t - E[X_t]$; the centered process satisfies the assumptions of Proposition 3.1, and the result carries over to $X$ since weak dependence is unaffected by deterministic shifts.

When the underlying Lévy process is of finite variation we can lighten the moment assumptions on the MMA process. The result below applies to all finite variation MMA processes with finite mean.
Proof. As the kernel function $f$ is in $L^1$, properties (2.10) and (2.11) are satisfied. Thus, $f$ is $\Lambda$-integrable and $X$ well defined. Moreover, because of Corollary 2.2, $E\|X_t\|<\infty$. Using the notation of Proposition 3.1, consider for all $t\in\mathbb{R}$ and $m\ge 0$ the truncated sequence (3.5). Choosing $m = r/2$ and arguing as before with $F$, $G$ and the truncated variables $X^*_i$, we conclude by applying the dominated convergence theorem.
The η-coefficients have some hereditary properties. For example, let $h : \mathbb{R}^n\to\mathbb{R}$ be a Lipschitz function; then, if the sequence $(X_t)_{t\in\mathbb{R}}$ is η-weakly dependent, the same is true for the sequence $(h(X_t))_{t\in\mathbb{R}}$. The latter can be readily checked directly based on Definition 3.1. Hereditary properties for functions that are not Lipschitz on the whole space $\mathbb{R}^n$ can be found in [2, Lemma 6] for stationary processes. Below follows a multivariate extension of this lemma for $h : \mathbb{R}^n\to\mathbb{R}^m$.

Proposition 3.2. Let $(X_t)_{t\in\mathbb{R}}$ be an $\mathbb{R}^n$-valued stationary η-weakly dependent process with $E[\|X_t\|^p]<\infty$ for some $p > 1$, and assume there exist constants $c > 0$ and $a\in[1, p)$ such that

$\|h(x) - h(y)\| \le c\,\|x - y\|\,(1 + \|x\|^{a-1} + \|y\|^{a-1})$   (3.10)

for all $x, y\in\mathbb{R}^n$. Then $(h(X_t))_{t\in\mathbb{R}}$ is η-weakly dependent with coefficients $\eta_{h(X)}(r) = C\,\eta_X(r)^{\frac{p-a}{p-1}}$, with the constant $C$ independent of $r$.

I.V. Curato and R. Stelzer/Weak dependence and GMM estimation of MMA
We have that, by assumption, for each $l = 1,\dots,u$, the truncated variables satisfy the stated moment bound. Therefore, (3.12) is less than or equal to $6cu\,\mathrm{Lip}(F)\,C_p M^{a-p}$. An analogous bound holds for the covariance term involving $G$. To conclude, (3.11) is bounded by choosing $M = \eta_X(r)^{\frac{1}{1-p}}$ and setting $C = 6c(C_p + \frac{1}{2})$, which yields the claimed coefficients.

For a polynomial function $h(x)$ we have the following.

Corollary 3.3. Let $(X_t)_{t\in\mathbb{R}}$ be an $\mathbb{R}^n$-valued stationary η-weakly dependent process with $E[\|X_t\|^p]<\infty$ for some $p > 1$, and let $h$ be a polynomial of degree $a < p$. Then $(h(X_t))_{t\in\mathbb{R}}$ is η-weakly dependent with coefficients $C\,\eta_X(r)^{\frac{p-a}{p-1}}$, with the constant $C$ independent of $r$.
Proof. The function h satisfies the assumption (3.10) for each polynomial degree a less than p. Proposition 3.2 can then be applied.

Causal case
A process $X = (X_t)_{t\in\mathbb{R}}$ with values in $\mathbb{R}^n$ is called a θ-weakly dependent process if there exists a sequence $(\theta(r))_{r\in\mathbb{R}_+}$ converging to 0 such that

$|\mathrm{Cov}(F(X_{i_1},\dots,X_{i_u}), G(X_{j_1},\dots,X_{j_v}))| \le c\,v\,\mathrm{Lip}(G)\,\theta(r)$

for all $(u, v)\in\mathbb{N}^*\times\mathbb{N}^*$, $r\in\mathbb{R}_+$, indices $i_1\le\dots\le i_u\le i_u + r\le j_1\le\dots\le j_v$, bounded functions $F$ with $\|F\|_\infty\le 1$ and bounded Lipschitz functions $G$, where $c$ is a constant independent of $r$. We call $(\theta(r))_{r\in\mathbb{R}_+}$ the sequence of the θ-coefficients.
The θ-weak dependence condition is stronger than the η-weak dependence one. Hence, moment conditions and decay requirements on the θ-coefficients for central limit theorems are typically weaker than in the case of η-weak dependence, see [18]. It should also be noticed that $\eta(r)\le\theta(r)$ and that, in the case of integer-valued processes [22], θ-weak dependence implies the strong mixing condition.
A causal MMA process is defined as follows: the process (3.15) is well defined for each $t\in\mathbb{R}$, infinitely divisible and strictly stationary. It is called a causal $n$-dimensional mixed moving average process and $f$ its kernel function.
Then, the resulting causal MMA process $X$ is a θ-weakly dependent process.

Proof. First, we define for all $t\in\mathbb{R}$ and $m\ge 0$ the truncated sequence (3.17). Since the kernel function $f$ is square integrable, properties (2.5) and (2.6) hold, so $f$ is $\Lambda$-integrable (Theorem 2.2) and $X$ is well defined. Thus, because of Proposition 2.1, $E[\|X_t\|^2]<\infty$ for all $t\in\mathbb{R}$ and we can determine an upper bound for $E\|X_t - X_t^{(m)}\|$, which, due to the stationarity of $X$, is independent of $t$. Let $F$ and $G$ belong respectively to the classes of bounded functions $H^*$ and $H$. If $j_1 - m - i_u\ge 0$, which can also be expressed as $j_1 - i_u\ge m$, then $I_u = S\times(-\infty, i_u]$ and $J_1 = S\times[j_1 - m, j_1]$ are disjoint sets, or they intersect in $S\times\{j_1 - m\}$ when $j_1 - m = i_u$. Noting that $\pi\times\lambda(S\times\{j_1 - m\}) = 0$, by the definition of a Lévy basis the two sequences $(X_{i_l})_{l=1,\dots,u}$ and $(X^{(m)}_{j_l})_{l=1,\dots,v}$ are independent. Using the result (3.18), the covariance bound converges to zero as $r$ goes to infinity by applying the dominated convergence theorem.
Also in the case of θ-weak dependence, the θ-coefficients change when the underlying Lévy process has mean different from zero.
Then, the resulting causal MMA process $X$ is a θ-weakly dependent process. We conclude the study of the θ-weak dependence properties of an MMA process with the computation of the θ-coefficients for an underlying Lévy process of finite variation.
Proof. As the kernel function $f$ is in $L^1$, properties (2.10) and (2.11) are satisfied; hence $f$ is $\Lambda$-integrable and $X$ is well defined with finite mean by Corollary 2.2. Using the notation of Proposition 3.3, the argument proceeds as before for all $t\in\mathbb{R}$ and $m\ge 0$.
Remark 3.1. The η-coefficients of a causal MMA process can be chosen equal to the θ-coefficients for each $r\ge 0$. This can easily be seen by noticing that the truncated sequences (3.7) in Proposition 3.1 are equal to the truncated sequences (3.19) in Proposition 3.3, which leads to selecting the parameter $m = r$ in both proofs. Moreover, (3.6) is equal to (3.18), and hence $\eta_X(r) = \theta_X(r)$. The same observations hold when comparing the results in Corollary 3.1 or Corollary 3.2 with Corollary 3.4 or Corollary 3.5.
If $E[L_1] = \mu$ and $\int_{|x|>1}|x|^2\,\nu(dx)<\infty$, the supOU process is θ-weakly dependent. If the underlying Lévy process is a subordinator, then (3.24) admits the θ-coefficients (3.27). Note that in the finite superposition case strong mixing of the supOU process has been shown in [36,37], based on Masuda's result [40]. As this crucially hinges on an embedding into a finite-dimensional Markov process, it does not readily extend to the general case.
Remark 3.2. The necessary and sufficient condition $\int_{\mathbb{R}_-} -\frac{1}{A}\,\pi(dA)<\infty$ for the supOU process to exist is satisfied by many continuous and discrete distributions $\pi$, see [48, Section 2.4] for more details. For example, a probability measure $\pi$ that is absolutely continuous with density $\pi'(x) = (-x)^\alpha l(x)$, regularly varying at zero from the left with $\alpha > 0$ (see [13]), i.e. $l$ slowly varying at zero, satisfies the above condition. If, moreover, $l(x)$ is continuous on $(-\infty, 0)$, the autocovariance function decays polynomially; for $\alpha\in(0, 1)$ the supOU process exhibits long memory and for $\alpha > 1$ short memory, see [28, Definition 3.1.2]. Concrete examples where the covariances are calculated explicitly in this set-up can be found in [7].
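For the Gamma specification used later for GMM estimation, the polynomial decay can be checked directly: if $-A\sim\Gamma(\alpha, 1)$, the Laplace transform gives $E[e^{Ar}] = (1+r)^{-\alpha}$, so the mixed exponential autocovariance factor decays like $r^{-\alpha}$. A quick Monte Carlo verification (our illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# If -A ~ Gamma(alpha, scale=1), the Gamma Laplace transform yields
# E[exp(A * r)] = (1 + r)^(-alpha): polynomial, not exponential, decay.
alpha = 2.5
G = rng.gamma(shape=alpha, scale=1.0, size=1_000_000)
for r in (1.0, 5.0, 20.0):
    mc = np.exp(-r * G).mean()               # Monte Carlo estimate of E[e^{Ar}]
    closed_form = (1.0 + r) ** (-alpha)
    assert abs(mc - closed_form) < 1e-3
```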

Remark 3.3.
A natural question is whether one can improve the weak dependence coefficients that we obtain. [24, Lemma 4.1] shows that stationary processes with finite moments of order $m$ ($m > 2 + \delta$ for some $\delta > 0$) which are λ-weakly dependent (cf. [24, Definition 2.1]), and thus η-weakly dependent, satisfy a covariance bound in terms of a power of the coefficients. The above arguments can easily be adapted to the causal case and θ-weak dependence, where we likewise get an analogous bound. If the stationary process has finite moments of any order, we thus obtain the inequalities $|\mathrm{Cov}(X_0, X_r)|\le 9\,\eta(r)$ and $|\mathrm{Cov}(X_0, X_r)|\le 9\,\theta(r)$. Equation (3.28) shows that our weak dependence coefficients are sharp for a supOU process having as underlying Lévy process a subordinator with finite second moment. Note that "sharp" here means that the right- and left-hand sides of the inequalities (3.29) only differ by a constant, as for the weak dependence coefficients one usually, like in the upcoming CLTs, only cares about their summability/integrability in $r$. The inequalities (3.29), compared to (3.25) and (3.26), show that we might obtain smaller weak dependence coefficients for a supOU process having an underlying Lévy process of infinite variation. In fact, following Remark 3.2, if $\mathrm{Cov}(X_0, X_r)\sim r^{-\alpha}$ for $\alpha > 0$, then the left-hand side in (3.29) decays like $r^{-\alpha}$ whereas the right-hand side decays like $r^{-\alpha/2}$.
Inspecting the proofs of Corollaries 3.2 and 3.5 and of Propositions 3.1 and 3.3, where the η- and θ-coefficients are determined, the crucial issue is that we use the equality (2.12) to bound the term $E\|X_t - X_t^{(m)}\|$ in the finite variation case, whereas we bound $E\|X_t - X_t^{(m)}\|$ by means of a second moment in the infinite variation one. We do the latter because, to the best of our knowledge, no sharper bounds are known for the first absolute moment of an infinitely divisible distribution that are suitably expressible in terms of the characteristic triplet in this set-up.
To conclude, we state the hereditary property of a θ-weakly dependent process.
Proposition 3.4. Let $(X_t)_{t\in\mathbb{R}}$ be an $\mathbb{R}^n$-valued stationary θ-weakly dependent process with $E[\|X_t\|^p]<\infty$ for some $p > 1$, and assume that $h$ satisfies condition (3.10) for some $c > 0$ and $a\in[1, p)$. Then $(h(X_t))_{t\in\mathbb{R}}$ is θ-weakly dependent with coefficients $\theta_{h(X)}(r) = C\,\theta_X(r)^{\frac{p-a}{p-1}}$, with the constant $C$ independent of $r$.

Sample moments of an MMA process
We consider a sample of $N$ observations of a univariate MMA process $\{X_\Delta,\dots,X_{N\Delta}\}$, where $\Delta$ is a positive integer. If the underlying Lévy process $L$ has finite first moment, the sample mean of the process $X$ is defined as

$\hat\mu_N = \frac{1}{N}\sum_{i=1}^N X_{i\Delta}$,   (4.2)

and its sample autocovariance function at lag $k\in\mathbb{N}$ as

$\hat\gamma_N(k) = \frac{1}{N}\sum_{i=1}^{N-k}(X_{i\Delta} - \hat\mu_N)(X_{(i+k)\Delta} - \hat\mu_N)$.   (4.3)

W.l.o.g., we consider below $E[X_0] = 0$ and $\Delta = 1$ in order to lighten the notation. When the asymptotic properties of the sample autocovariance functions are investigated, we focus on the processes $Y_{j,k} = X_jX_{j+k} - D(k)$, $j\in\mathbb{Z}$, where $D(k)$ denotes the covariance at lag $k$, defined, when $E[X_0] = 0$, by $D(k) = E[X_0X_k]$ (4.5). We start by analyzing the asymptotic properties of the sample mean (4.2) for a non-causal and a causal MMA process.
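In code, the sample mean and sample autocovariances take the following form (a sketch; the $1/N$ normalization follows the convention above, and the i.i.d. input is merely a placeholder for an observed MMA path):

```python
import numpy as np

def sample_mean(x):
    """Sample mean of observations X_1, ..., X_N."""
    return x.mean()

def sample_autocovariance(x, k):
    """Sample autocovariance at lag k, normalized by N."""
    n = len(x)
    xbar = x.mean()
    return float(np.sum((x[: n - k] - xbar) * (x[k:] - xbar)) / n)

rng = np.random.default_rng(3)
x = rng.normal(size=10_000)          # placeholder observations
acvf = [sample_autocovariance(x, k) for k in range(5)]
```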
In the case of a causal MMA process, the required decay rate of the θ-coefficients is lower than in the η-weak dependence case.
Remark 4.1. In the special case of a supOU process representable as a finite sum of independent Ornstein-Uhlenbeck processes with gamma or inverse Gaussian marginals, a comparable result can be found in [37, Theorem 2].
Remark 4.2. Theorems 4.1 and 4.2, as well as all the upcoming central limit theorems, can also be formulated as functional central limit theorems, following [24] and [18] respectively. However, we state the theorems just for the sample moments we are interested in (and which we use in Section 6) to lighten the notation. For example, denote by $S_n(t)$, for $t\in[0, 1]$ and $n\ge 1$, the partial-sum process of a non-causal MMA process, and by $S^*_n(t)$ the one of a causal MMA process. Then, under the assumptions of Theorem 4.1 or 4.2, $n^{-1/2}S_n(t)$ converges in distribution in the Skorohod space $D[0, 1]$ to $\sigma_\eta W$ and $n^{-1/2}S^*_n(t)$ converges in distribution in the space $C[0, 1]$ to $\sigma_\theta W$, respectively. Here, $W$ denotes a standard Brownian motion.

Remark 4.3. In the finite variation case, (4.7) or (4.9), respectively, hold under $\int_{\|x\|>1}\|x\|^{2+\delta}\,\nu(dx)<\infty$ for some $\delta > 0$, and $f\in L^{2+\delta}(S\times\mathbb{R}, \pi\otimes\lambda)\cap L^1(S\times\mathbb{R}, \pi\otimes\lambda)$ or $f\in L^{2+\delta}(S\times\mathbb{R}_+, \pi\otimes\lambda)\cap L^1(S\times\mathbb{R}_+, \pi\otimes\lambda)$, respectively.

Remark 4.4. Assuming that $f$ is not equal to zero π-almost everywhere and that $f\ge 0$ or $f\le 0$, the asymptotic variance in Theorems 4.1 and 4.2 is not degenerate. This is the case, for example, when working with the supOU process (3.24). Moreover, it is worth observing that the assumptions in Theorem 4.2 clearly indicate that we obtain asymptotic normality of the sample mean for a causal MMA process only in the short memory case.
To find the asymptotic distribution of the sample autocovariance functions (4.3), we first show that $(Y_{j,k})_{j\in\mathbb{Z}}$ is an η-weakly or θ-weakly dependent process. In addition to the hereditary properties in Propositions 3.2 and 3.4, we need to establish when the weak dependence properties of a univariate MMA process are inherited by the process $Z_t = (X_t, X_{t+1},\dots,X_{t+k})$ for all $k\in\mathbb{N}$.
Proposition 4.1. Let $\Lambda$ be an $\mathbb{R}^d$-valued Lévy basis with generating quadruple $(\gamma, \Sigma, \nu, \pi)$ and let $f : S\times\mathbb{R}\to M_{1\times d}(\mathbb{R})$ be a $B(S\times\mathbb{R})$-measurable function satisfying the assumptions of Theorem 2.2. If $X$ is a non-causal or causal MMA process as defined in (2.13) or (3.15) respectively, then, for all $t\in\mathbb{R}$ and $k\in\mathbb{N}$, $Z_t = (X_t, X_{t+1},\dots,X_{t+k})'$ is an MMA process with kernel taking values in $M_{(k+1)\times d}(\mathbb{R})$. Moreover, if $X$ satisfies the assumptions of Proposition 3.1 (Corollary 3.1) or Proposition 3.3 (Corollary 3.4), then $Z$ is η- or θ-weakly dependent respectively with coefficients

$\eta_Z(r) = D\,\eta_X(r - 2k)$ for $r\ge 2k$ or $\theta_Z(r) = D\,\theta_X(r - k)$ for $r\ge k$,   (4.10)

where $D = (k + 1)^{1/2}$. In the case when the assumptions of Corollary 3.2 or 3.5 hold, the process $Z$ is respectively η- or θ-weakly dependent with coefficients

$\eta_Z(r) = D\,\eta_X(r - 2k)$ for $r\ge 2k$ or $\theta_Z(r) = D\,\theta_X(r - k)$ for $r\ge k$,   (4.11)

and $D = k + 1$.
Proof. For $k = 1$, the first step of the proof consists of checking that $g$ is a $\Lambda$-integrable function as prescribed by Theorem 2.2. W.l.o.g., we consider in our calculations the norm $\|(x, y)\| = \|x\| + \|y\|$ for $x, y\in M_{1\times d}(\mathbb{R})$. It then holds that (4.12) is finite. Passing to the second condition, $f$ being a kernel of an MMA process, the corresponding integral is finite as well. Thus the kernel function $g$ is $\Lambda$-integrable. By induction the statement can be shown for each $k\in\mathbb{N}$. Because all the assumptions of Theorem 2.2 hold, $Z$ is an MMA process. Depending on the properties of the underlying Lévy process, we can distinguish three different scenarios for the η- and θ-weak dependence. When $X$ satisfies the assumptions of Proposition 3.1, $Z$ is a $(k + 1)$-dimensional MMA process with η-coefficients $\eta_Z(r) = (k + 1)^{1/2}\eta_X(r - 2k)$ for each $r > 2k$. If $X$ satisfies the assumptions of Proposition 3.3, it can be shown similarly that $Z$ is θ-weakly dependent with coefficients $\theta_Z(r) = (k + 1)^{1/2}\theta_X(r - k)$. Similar calculations in the finite variation case lead to the statements (4.11).
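On observed data, the stacked process $Z_t = (X_t,\dots,X_{t+k})$ of Proposition 4.1 corresponds to a simple sliding-window embedding:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

# Build the lagged vectors Z_t = (X_t, X_{t+1}, ..., X_{t+k}) from a path.
x = np.arange(10.0)                              # placeholder path X_0, ..., X_9
k = 2
Z = sliding_window_view(x, window_shape=k + 1)   # row t is (X_t, ..., X_{t+k})
```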
Proof. Let us consider the 2-dimensional process $Z = (X_j, X_{j+k})_{j\in\mathbb{Z}}$ with $k\in\mathbb{N}$.
The η- or θ-coefficients of the process $Z$ are given by Proposition 4.1. The $(2 + \delta)$-th moment, for $\delta > 0$, of the MMA process exists because of Proposition 2.1. Let us now consider $h(x, y) = xy$. The function $h$ satisfies assumption (3.10) for $p = 2 + \delta$, $c = 1$ and $a = 2$. Then Proposition 3.2 or 3.4 applies, and $h(Z_j) = X_jX_{j+k}$, as well as $Y_{j,k}$, is weakly dependent with the corresponding coefficients. The finite variation case follows easily by applying Proposition 4.1 and using the coefficients (4.11).
We can now give a distributional limit theorem for the processes $Y_{j,k}$, namely determining the asymptotic distribution of $\frac{1}{N}\sum_{j=1}^N Y_{j,k}$ for all $k\in\mathbb{N}$.
The asymptotic variance $\gamma_k^2$ is finite and non-negative, and the distributional limit (4.13) holds as $N\to\infty$.

Proof. Since the results of Proposition 2.1 apply, using [24, Theorem 2.2] in the case of an η-weakly dependent process or [20, Theorem 1] when the process $X$ is θ-weakly dependent, the distributional limit (4.13) holds.
Remark 4.5. The asymptotic variance $\gamma_k^2$ can be expressed in terms of the fourth-order cumulant of a zero mean MMA process and its covariances as follows. Let us consider an $\mathbb{R}^4$-valued MMA process $X = (X_i, X_j, X_k, X_l)$ with $(i, j, k, l)\in\mathbb{R}^4$ and kernel function $g(A, s)$ built from $f$ as in Proposition 4.1. The Lévy basis $\Lambda$ underlying the definition of $X$ satisfies the assumptions of Corollary 4.1. Thus, $X$ is infinitely divisible with characteristic triplet $(\gamma_{int}, \Sigma_{int}, \nu_{int})$ as given in Theorem 2.2, and we denote by $\kappa(i, j, k, l)$ the fourth-order cumulant of $X$, computable from the characteristic exponent. On the other hand, cf. [28, Definition 4.2.1], $\gamma_k^2$ admits the representation (4.14), where $D(l)$ is defined in (4.5).
Analogously, the formula for the third-order cumulant $\kappa(i, j, k)$ can be derived. In fact, for a zero mean MMA process it holds that $E[X_iX_jX_k] = \kappa(i, j, k)$. This computation is useful in Section 6.
Proof. Let us consider the vector $Z$ as defined in Proposition 4.1. Given the assumptions of the corollary, $Z$ is η-weakly dependent with coefficients $\eta_Z(r) = D\eta_X(r - 2k)$ or θ-weakly dependent with coefficients $\theta_Z(r) = D\theta_X(r - k)$, because of Proposition 4.1 and since the results of Proposition 2.1 apply. We now apply a function $f : \mathbb{R}^{k+1}\to\mathbb{R}^{k+1}$ to the vector $Z$. The assumptions of Proposition 3.2 hold with $p = 4 + \delta$, $c = 1$ and $a = 2$; then $f(Z_t)$ is η- or θ-weakly dependent with coefficients $C(D\eta_X(r - 2k))^{\frac{2+\delta}{3+\delta}}$ or $C(D\theta_X(r - k))^{\frac{2+\delta}{3+\delta}}$. For all $a\in\mathbb{R}^{k+1}$, $a'Z$ is an η- or θ-weakly dependent process, because a linear function is Lipschitz, with the same coefficients as the process $Z$. By [24, Theorem 2.2] or [20, Theorem 1], the central limit theorem then holds as $N\to\infty$. Applying the Cramér-Wold device, the asymptotic normality of the vector is shown.
Remark 4.6. In this paper we consider the classical case of equidistant observations. In many applications different sampling schemes are also highly relevant, and for some special cases results have been obtained. For example, [14] considers the asymptotics of the autocovariance function for Lévy-driven moving average processes sampled at an independent renewal sequence, and [26] considers the asymptotics of the pathwise Fourier transform/periodogram for Lévy-driven CARMA processes sampled at deterministic irregular grids. Considering independently renewal sampled Lévy-driven MMA processes is beyond the scope of the present paper and is the content of future research, started in [15], where the preservation of strong mixing and weak dependence properties is discussed in general.

Sample moments of an MMA SV model
Let us consider a Lévy basis with characteristic quadruple (γ, σ², ν, π) and values in R, and the respective univariate causal MMA process X with kernel function f : (S × R_+) → R. Its dependence structure is controlled by the probability measure π. If we choose a causal MMA process as the model for the volatility of a logarithmic asset price, its dependence structure can be modeled in a versatile way by choosing the distribution π. Then, the typical decay of the autocovariances of the squared returns, see [17], can be more easily reproduced. Let the logarithmic asset price (J_t)_{t∈R_+} be given as in (5.1), where (W_t)_{t∈R_+} is a standard Brownian motion and (X_t)_{t∈R_+} is an adapted, stationary and square-integrable causal MMA process with values in R_+ that is independent of W. We call (5.1) an MMA SV model. In the literature, stochastic volatility models where X is given by a sum of independent non-Gaussian OU type processes appear in [6,9]; these have later been extended to supOU processes in [10,11]. The latter is an example of an MMA SV model whose dependence structure is analysed in this section. Other financial market models using supOU processes as building blocks and allowing for short and long range dependence can be found in [33,36,37]. We show the θ-weak dependence of the return process over equidistant time intervals, where for convenience of notation we consider t ∈ R, and the asymptotic normality of its related sample moments. To this aim, using the Itô isometry as in [44], the moments of the return process turn out to be determined as a function of the moments of the integrated process, for t ∈ R and ∆ a positive constant. Note that (V_t)_{t∈R} corresponds to the integrated volatility process computed over the time interval [(t − 1)∆, t∆].
It is immediate from the definitions (5.2) and (5.3) that the strict stationarity of the process (X_s)_{s∈R} and its square-integrability imply the stationarity and the square-integrability of the processes (Y_t)_{t∈R}, (Y²_t)_{t∈R} and (V_t)_{t∈R}. Note that under the square-integrability assumption, the moments of the return process can be determined up to the 4th order.
In general, we consider all processes adapted with respect to the filtration (A t ) t∈R generated by the set of random variables {Λ(B) : B ∈ B(S × (−∞, t])} and the increments of the Brownian motion {(W u − W s ) s≤u≤t } for all t ∈ R.
Sufficient conditions for an MMA process to have càdlàg sample paths can be found in [42,Theorem 3.1] and the references therein.
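To make the construction concrete, the following toy sketch (a minimal illustration, not the paper's general setting: the MMA volatility is replaced by a single subordinator-driven OU-type process, the subordinator by a compound Poisson process with exponential jumps, and all parameter values are hypothetical) simulates a nonnegative volatility path X, the returns Y_t of (5.2) over intervals of length ∆, and the integrated volatility V_t of (5.3):

```python
import numpy as np

rng = np.random.default_rng(0)

# --- hypothetical parameters (illustration only) ---
a, jump_rate, jump_mean = -1.0, 5.0, 0.2   # OU mean reversion, subordinator jumps
Delta, n_intervals, steps = 0.1, 200, 50   # return interval, sample size, steps per interval
dt = Delta / steps
T = n_intervals * steps

# Nonnegative OU-type volatility: exact mean reversion between jumps,
# driven by a compound Poisson subordinator (positive jumps only).
X = np.empty(T + 1)
X[0] = jump_rate * jump_mean / (-a)        # start near the stationary mean
for t in range(T):
    n_jumps = rng.poisson(jump_rate * dt)
    jump = rng.exponential(jump_mean, n_jumps).sum()
    X[t + 1] = np.exp(a * dt) * X[t] + jump

# Returns over [(t-1)*Delta, t*Delta] and integrated volatility, cf. (5.2)-(5.3)
dW = rng.normal(0.0, np.sqrt(dt), T)
increments = np.sqrt(X[:-1]) * dW          # dJ = sqrt(X) dW (zero drift for simplicity)
Y = increments.reshape(n_intervals, steps).sum(axis=1)
V = (X[:-1] * dt).reshape(n_intervals, steps).sum(axis=1)
```

Since the driving noise of the volatility has only positive jumps and the mean reversion is applied exactly, the simulated X stays nonnegative, as required of a volatility process.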
We now show the weak dependence properties of the return process, where (θ_X(r))_{r∈R_+} are the coefficients (3.21), for all r ≥ 1.
Proof. The assumptions (H) imply that the resulting MMA process is nonnegative and that the process Y_t is well defined and square-integrable (cf. the quantity defined in (3.17)). The last inequality follows by (3.22). Let F and G belong respectively to the classes of bounded functions H* and H, and let (u, v) ∈ N* × N*, r ∈ R_+, (i_1, . . . , i_u) ∈ R^u and (j_1, . . . , j_v) ∈ R^v. Therefore, let m = (r − 1)∆; the resulting bound converges to zero as r goes to infinity by the dominated convergence theorem.
Let us consider a supOU process X defined as in (3.24) such that the underlying Lévy process L is a subordinator. It can be shown that the process is adapted and càdlàg under assumptions (ii) and (iii) of [10, Theorem 3.12]. Then Assumptions (H) are satisfied, and we can define a supOU SV model and the resulting return process as in (5.6). By applying Proposition 5.1, Y is θ-weakly dependent with coefficients in which µ denotes the mean of the underlying Lévy process.
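A supOU process can be approximated by a finite superposition of independent OU-type components whose mean-reversion rates are drawn from π, here a scaled Gamma law as in Assumption 6.1 of Section 6. The following sketch (all parameter values hypothetical, compound Poisson subordinator with exponential jumps, exact exponential decay between jumps) illustrates the idea:

```python
import numpy as np

rng = np.random.default_rng(1)

# --- hypothetical parameters ---
alpha_pi, B = 3.0, -1.0          # pi: law of B * Gamma(alpha_pi, 1), B < 0
n_comp = 500                     # OU components approximating the superposition
jump_rate, jump_mean = 2.0, 0.5  # compound Poisson subordinator, exponential jumps
dt, n_steps = 0.01, 2000

A = B * rng.gamma(alpha_pi, 1.0, n_comp)   # mean-reversion rates, all negative
decay = np.exp(A * dt)
x = np.zeros(n_comp)                       # component states
X = np.empty(n_steps)
for t in range(n_steps):
    # each Lévy jump is routed to one component; per-component rate is
    # jump_rate / n_comp so the total jump activity stays O(jump_rate)
    n_jumps = rng.poisson(jump_rate * dt / n_comp, n_comp)
    jumps = np.where(n_jumps > 0,
                     rng.gamma(np.maximum(n_jumps, 1), jump_mean),  # sum of Exp jumps
                     0.0)
    x = decay * x + jumps
    X[t] = x.sum()
```

As n_comp grows, the empirical distribution of the rates A approaches π, which is what produces the flexible (e.g. polynomial) decay of the autocovariances discussed in the Introduction.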
We consider a sample of N observations of Y and define the following sample moments for the return process: the sample moments (5.9) and the fourth order (non-centered) sample moments for k ∈ N. When the asymptotic properties of the sample autocovariance functions are investigated, we focus on the processes Z, where we denote by T(k) the covariances of order k; in the case of the fourth order sample moments we focus instead on Z̃, where we denote by D*(k) the covariances of order k, with V the integrated process defined in (5.3). Analogously to Proposition 4.1, we show that the θ-weak dependence of the return process is inherited by the process Z_t = (Y_t, Y_{t+1}, . . . , Y_{t+k}) for all k ∈ N.
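In practice, the sample moments just introduced are computed directly from the observed returns. A minimal sketch (the helper name is hypothetical, and the per-lag normalization 1/(N − k) is a choice; the paper's exact normalization may differ):

```python
import numpy as np

def sample_moments(Y, max_lag):
    """Sample mean, autocovariances of Y, and lagged moments of Y^2 up to max_lag."""
    Y = np.asarray(Y, dtype=float)
    N = len(Y)
    mean = Y.mean()
    # second order: empirical Cov(Y_t, Y_{t+k})
    acov = [((Y[: N - k] - mean) * (Y[k:] - mean)).mean() for k in range(max_lag + 1)]
    # fourth order (non-centered): empirical E[Y_t^2 * Y_{t+k}^2]
    m4 = [(Y[: N - k] ** 2 * Y[k:] ** 2).mean() for k in range(max_lag + 1)]
    return mean, np.array(acov), np.array(m4)
```

These are exactly the statistics whose joint asymptotic normality is studied below via the processes Z and Z̃.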
Lemma 5.1. Let Λ be a Lévy basis, X an MMA process satisfying Assumptions (H), and W a standard Brownian motion independent of Λ. We consider the process where Y_t is the return process defined in (5.2) and G_s is an R^{k+1} × R valued function; the stated coefficients hold for r ≥ k + 1, with D* = (k + 1) and θ_X given in (3.21).
by means of the triangle inequality. Proceeding as in Proposition 4.1 and in Proposition 5.1, the claim follows.
It can also be shown that the process Z t is mixing, and thus ergodic, proceeding as in the proof of [27,Theorem 4.2].
The following asymptotic result holds for (5.8).
Theorem 5.1. Assume that Assumptions (H) hold, that ∫_{|x|>1} |x|^{1+δ} ν(dx) < ∞ for some δ > 0, and that f belongs to L^{1+δ}(S × R_+, π ⊗ λ) ∩ L^1(S × R_+, π ⊗ λ). If (Y_i)_{i∈R} as defined in (5.2) is a θ-weakly dependent process such that the volatility process X admits coefficients θ_X(r) = O(r^{−α}) with α > 2(1 + 1/δ), then the central limit theorem (5.14) holds.

Proof. Corollary 2.2 applies, and the return process is ergodic because of its mixing properties shown in [27, Theorem 4.2]. [20, Theorem 1] can be applied, analogously to Theorem 4.2, ensuring the result (5.14), where the asymptotic variance of the sample mean is given by the absolutely summable series of autocovariances. Applying Proposition 5.1 and Proposition 3.4, the following can be easily shown.
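The asymptotic variance in (5.14) is an absolutely summable series of autocovariances. A standard way to estimate such a long-run variance from data is a truncated, Bartlett-weighted sum of sample autocovariances; the sketch below is one common choice (the bandwidth rule is hypothetical, not prescribed by the paper):

```python
import numpy as np

def long_run_variance(y, bandwidth):
    """Bartlett-weighted estimate of sum_k Cov(Y_0, Y_k) (Newey-West style)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    yc = y - y.mean()
    gamma = lambda k: (yc[: n - k] * yc[k:]).sum() / n  # biased sample autocovariance
    lrv = gamma(0)
    for k in range(1, bandwidth + 1):
        lrv += 2.0 * (1.0 - k / (bandwidth + 1)) * gamma(k)
    return lrv
```

The Bartlett weights guarantee a nonnegative estimate, and the truncation is justified precisely by the absolute summability of the autocovariances established by the weak dependence rates.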
Proof. Since Corollary 2.2 holds, Z is a θ-weakly dependent process with coefficients given in Proposition 5. For all a ∈ R^{k+1}, a′Z is a θ-weakly dependent process, because a linear function is Lipschitz, and ergodic, with the same θ-coefficients as the process Z. Under the assumptions of the Corollary, [20, Theorem 1] then applies and yields the stated convergence as N → ∞. Applying the Cramér–Wold device, the asymptotic normality of the vector Z is shown.
Proof. The proof follows as in Corollary 5.1, using the θ-coefficients of the process Z̃ as determined in Proposition 5.2.
Remark 5.1. In view of Section 6, let us give explicit formulas for the third and fourth order cumulants of an integrated process V and for the covariances Cov(W_{0,p}, W_{l,q}) for p, q ∈ {0, . . . , k} and k ∈ N under the assumptions of the following corollary.

Table 1: Explicit closed formula for the summand A(i, j, k, l) for (i, j, k, l) ∈ Z^4 (columns: (i, j, k, l) and A(i, j, k, l)).

Corollary 5.2. Let us consider an integrated process as defined in (5.3) with mean E[V_1] = C*. For all (i, j, k, l) ∈ R^4, we call K(i, j, k) and K(i, j, k, l) the centered cumulants of order three and four, which are respectively given in terms of κ(s, t, u) as in (4.15) and κ(s, t, u, z) as defined in (4.14). Moreover, D(·) is the covariance function, as in (4.5), of the centered MMA process underlying the definition of the integrated process. Hence, by means of the Itô formula, the independence of the process (X_t)_{t∈R} from (W_t)_{t∈R} and arguments similar to formula (40) in [44],

Cov(W_{0,p}, W_{l,q}) = K(0, p, l, l + q) + C*(K(0, p, l) + K(0, p, l + q) + K(p, l, l + q) + K(0, l, l + q))

plus terms involving A(i, j, k, l), which is defined in Table 1 for (i, j, k, l) ∈ Z^4.

Generalized method of moments for the supOU process and supOU SV model
In this section, we apply the developed asymptotic theory to determine the asymptotic normality of GMM estimators of the supOU process and of the supOU SV model defined in [48].
Let X and Y be a supOU process and a return process as defined in (3.24) and (5.6), respectively. We assume ∫_{|x|>1} x² ν(dx) < ∞; then, as shown in [48, Theorem 2.3 and Theorem 2.8], the supOU process X and the return process Y have known moments given in terms of µ = E[L_1] and σ² = Var[L_1], the mean and variance of the underlying Lévy process, and s_k := e^{A∆k}.
Assumption 6.1. Let us assume that the mean reversion parameter A is Gamma distributed. That is, we assume that π is the distribution of Bξ where B ∈ R − and ξ is Γ(α π , 1) distributed with α π > 2.
We emphasize that setting the second parameter of the Gamma distribution equal to one does not restrict the model, since this is equivalent to varying B. Under Assumption 6.1, we obtain the decay of the autocovariances of the supOU process as given in Remark 3.2; notice that in this set-up α = α_π − 1.
The moments of the supOU process X and of the return process Y under Assumption 6.1 have been computed in [48, Section 2.2], where f_k := (1 − B∆k)^{3−α_π}. Therefore, the moment structure, up to the 2nd order for the supOU process and up to the 4th order for the return process, depends only on the parameter vector θ := (µ, σ², α_π, B).
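The factor f_k = (1 − B∆k)^{3−α_π} is of polynomial order: f_k / k^{3−α_π} → (−B∆)^{3−α_π} as k → ∞, so in particular f_k decays when α_π > 3. A quick numerical check of this order (parameter values hypothetical):

```python
# Polynomial order of f_k = (1 - B*Delta*k)**(3 - alpha_pi) for large lags k.
alpha_pi, B, Delta = 4.0, -0.5, 1.0   # hypothetical values with alpha_pi > 3, B < 0

f = lambda k: (1.0 - B * Delta * k) ** (3.0 - alpha_pi)

# the ratio f_k / k^(3 - alpha_pi) tends to (-B*Delta)^(3 - alpha_pi)
k = 10_000
ratio = f(k) / k ** (3.0 - alpha_pi)
limit = (-B * Delta) ** (3.0 - alpha_pi)
assert abs(ratio - limit) < 1e-3      # close to the limit at large k
assert f(1) > f(2) > f(10)            # monotone decay for alpha_pi > 3
```

This is the same polynomial behaviour that, via Remark 3.2 with α = α_π − 1, governs the decay of the autocovariances of X.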

SupOU process
Suppose we observe a sample {X_t : t = ∆, . . . , N∆} of the supOU process with ∆ a positive constant. We construct the following moment functions, as in [48], by using the autocovariances up to a lag m ≥ 2 of X.
Remark 6.2. Under the assumptions of Corollary 6.1, a slower decay of the autocovariances of a supOU process is required to obtain asymptotic normality compared to Theorem 6.1. Moreover, if all the moments of the underlying Lévy process exist, then the asymptotic result of Corollary 6.1 holds assuming that α_π > 2. The latter assumption results in the slowest decay of the autocovariances of X that can be reached under short memory, see Remark 3.2, recalling that in the notation of this section α = α_π − 1, whereas the asymptotic result of Theorem 6.1 holds, under these assumptions, for α_π > 3.
Several assumptions have to be made to show the asymptotic normality of the GMM estimator (6.5). Assumption 6.2. The parameter space Θ is compact and large enough to include the true parameter vector θ_0. In our set-up we always need to choose a parameter space such that µ ≥ 0, σ² > 0, α_π > 2 and B < 0. However, Assumption 6.2 remains reasonable, because typically an optimization procedure is used to determine θ̂_{N,m}, and some parameter bounds are always imposed in practice.
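The estimator (6.5) minimizes a quadratic form in the averaged moment discrepancies. The following generic sketch shows the structure of such a GMM fit (identity weight matrix by default, a deliberately simple toy moment function in place of the paper's h, and all names hypothetical; it is not the paper's estimator):

```python
import numpy as np
from scipy.optimize import minimize

def gmm_objective(theta, data, moment_fn, W=None):
    """Quadratic form g_N(theta)' W g_N(theta) in the averaged moments."""
    g = moment_fn(np.asarray(data, dtype=float), theta).mean(axis=0)
    W = np.eye(len(g)) if W is None else W
    return g @ W @ g

# Toy moment function (illustration only, NOT the paper's h): it matches the
# first two moments, so the fit recovers the sample mean and variance.
def toy_moments(x, theta):
    mu, var = theta
    return np.column_stack([x - mu, (x - mu) ** 2 - var])

data = [1.0, 2.0, 3.0, 4.0, 5.0]
res = minimize(gmm_objective, x0=[0.0, 1.0], args=(data, toy_moments),
               method="Nelder-Mead",
               options={"xatol": 1e-10, "fatol": 1e-14, "maxiter": 5000})
mu_hat, var_hat = res.x
```

In the exactly identified toy case the weight matrix is irrelevant; for the over-identified moment vectors of (6.5), a positive definite weight matrix A_{N,m} determines the efficiency of the estimator.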
Theorem 6.2. Let Λ be a real valued Lévy basis with generating quadruple (γ, Σ, ν, π) and X a supOU process satisfying assumptions (3.23) such that ∫_{|x|>1} |x|^{4+δ} ν(dx) < ∞ for some δ > 0. Let Assumption 6.1 hold with α_π − 1 > (…). Moreover, if Assumptions 6.2, 6.3 and 6.4 hold, then the estimator θ̂_{N,m} is asymptotically normal as N goes to infinity.

Proof. We follow the steps of the proof in [41]; we show that the relevant quantity converges uniformly in probability to zero for each θ ∈ Θ. Assumptions 1.4–1.6 in [41] are three sufficient conditions which, if fulfilled, imply Assumption 1.2 in [41]. We then show that all three of them hold in our set-up. Assumption 1.4 in [41] corresponds to our Assumption 6.2, and Assumption 1.5 in [41] follows from the ergodicity of the supOU process. Showing Assumption 1.6 in [41] means proving that a stochastic Lipschitz-type condition holds for each component of the function h(X_t, θ). Let θ^i = (µ_i, σ²_i, α_{π,i}, B_i) be parameter vectors belonging to Θ for i = 1, 2. Then, for example, for the first component:
That means that, by construction, the terms in which X_t appears cancel out, and the stochastic Lipschitz-type condition reduces to a Lipschitz continuity condition on the non-random terms in each component of h(X_t, θ). Regarding ∂h(X_t, θ)/∂θ′, Assumption 1.8 in [41] requires that a weak law of large numbers applies to ∂h(X_t, θ)/∂θ′ in a neighborhood of θ_0; that is, for each sequence θ*_N such that θ*_N converges in probability to θ_0, G_{N,m}(X, θ*_N) → G_0. We have that the matrix ∂h(X_t, θ)/∂θ′

SupOU SV model
We now work with a sample {Y_t : t = 1, . . . , N} of the return process and define the corresponding sample moments. The moment function is given by the measurable function h̃ in (6.11). In this case, the sample moment function of the return process is as in (6.12), and θ_0 can be estimated by minimizing the objective function

θ̂_{N,m} = argmin g_{N,m}(Y, θ)′ A_{N,m} g_{N,m}(Y, θ), (6.13)

where A_{N,m} is a positive definite matrix weighting the m + 2 different moments collected in g_{N,m}(Y, θ).

Table 3: Explicit closed formula for the summand A(i, j, k) for (i, j, k) ∈ Z³.

The consistency of the estimator (6.13) is shown in [48, Theorem 3.2], and as before we need to show that the moment function h̃(Y, θ) satisfies a central limit theorem.

Theorem 6.3. Let Λ be a real valued Lévy basis with generating quadruple (γ, 0, ν, π), let Assumptions (H) be satisfied such that ∫_{|x|>1} |x|^{4+δ} ν(dx) < ∞ for some δ > 0, and let Assumption 6.1 hold with α_π − 1 > (1 + 1/δ)(6 + 2δ)/δ. Let (Y_t)_{t∈R} be the resulting return process of a supOU SV model; then h̃(Y, θ) satisfies a central limit theorem.

Proof. Proceeding as in Theorem 6.1, it can be shown that h̃(Y_t, θ_0) is a θ-weakly dependent process with zero mean, by using Lemma 5.1 and Proposition 3.4.
In [48, Section 4], a simulation study of the estimators (6.5) and (6.13) examines their finite sample performance (Theorem 6.2 and 6.4 are applicable to the setup of the study). The analysis shows results in line with the asymptotic normality derived theoretically in this paper. To obtain reliable estimates of supOU processes and of the supOU SV model, a substantial amount of data is needed.