Complete integral convergence for weighted sums of negatively dependent random variables under sub-linear expectations

Abstract: In this paper, complete convergence and complete integral convergence for weighted sums of negatively dependent random variables under sub-linear expectations are established. These results extend several complete moment convergence theorems from the classical probability space to the sub-linear expectation space.


Introduction
Probability limit theory is an important research topic in mathematical statistics that has found extensive application in mathematics, statistics, and finance. However, the limitations of classical limit theory have become increasingly apparent as limit theory is applied in finance, risk measurement and other areas, where mathematical models are characterized by uncertainty and the relevant expectations are sub-linear rather than linear. To address this issue, academician Peng [1][2][3] put forward the concept of the sub-linear expectation space, constructed a complete theoretical system for it, and thereby overcame limitations of traditional probability theory in statistics, economics, and other fields. In recent years, an increasing number of scholars have conducted extensive research in this area, yielding numerous relevant results. Notably, Peng [1][2][3] and Zhang [4][5][6] derived a series of significant conclusions, including the strong law of large numbers, the exponential inequality and Rosenthal's inequality under sub-linear expectations. These findings have established a solid groundwork for investigating the limit theory of sub-linear expectation spaces, and the results obtained by Peng and Zhang have greatly advanced the understanding of sub-linear expectation theory.
The concepts of complete convergence and complete moment convergence hold significant importance in probability limit theory. Complete convergence was initially introduced by Hsu and Robbins [7], and Chow [8] introduced the concept of complete moment convergence of independent random variables, which has since been expanded upon. Compared with complete convergence, complete moment convergence is a more precise notion, which has prompted further investigation by scholars. Qiu and Chen [9] established complete moment convergence for independent and identically distributed random variables, while Yang and Hu [10] demonstrated complete moment convergence for pairwise NQD random variables. Song and Zhu [11] derived a complete convergence theorem for extended negatively dependent random variables. Notably, in the sub-linear expectation space, complete moment convergence is equivalent to complete integral convergence. In recent years, an increasing number of scholars have studied complete convergence and complete integral convergence in the context of sub-linear expectations, thereby significantly enriching the associated theory. For example, Li and Wu [12] studied complete integral convergence for arrays of row-wise extended negatively dependent random variables. Similarly, Lu and Weng [13] examined the complete and complete integral convergence of arrays of row-wise widely negatively dependent random variables. Additionally, Chen and Wu [14] investigated the complete convergence and complete integral convergence of partial sums for moving average processes. It is noteworthy that complete convergence and complete integral convergence with maxima under sub-linear expectation spaces have so far been obtained only when the sequences are independent or negatively dependent.
For example, Feng and Zeng [15] proved a complete convergence theorem for the maximum of partial sums under sub-linear expectations, and Xu and Kong [16,17] discussed complete convergence and complete integral convergence for negatively dependent sequences. These findings suggest that the theory of complete integral convergence merits further development. The objective of this research is to extend the complete moment convergence results established by Wu and Wang [18] to the sub-linear expectation space through a probabilistic approach and, subsequently, to derive the relevant results.
The present article is structured as follows: Section 2 introduces basic notation, concepts and related properties in the context of sub-linear expectations, along with several lemmas. Section 3 establishes complete convergence and complete integral convergence for weighted sums of negatively dependent random variables under sub-linear expectations. Finally, Section 4 uses the aforementioned lemmas to prove the main results of this study. Throughout, c denotes an arbitrary constant independent of n, ln x denotes log_2 x, and I(·) denotes the indicator function.

Preliminaries
We use the framework and notation of Peng [1][2][3] and Zhang [6]. Let (Ω, F) be a given measurable space and let H be a linear space of real functions defined on (Ω, F) such that if X_1, X_2, . . . , X_n ∈ H, then ϕ(X_1, . . . , X_n) ∈ H for each ϕ ∈ C_{l,Lip}(R^n), where C_{l,Lip}(R^n) denotes the linear space of (local Lipschitz) functions ϕ satisfying

|ϕ(x) − ϕ(y)| ≤ c(1 + |x|^m + |y|^m)|x − y|, for all x, y ∈ R^n,

for some c > 0 and m ∈ N depending on ϕ. H is considered a space of random variables; in this case we write X ∈ H.

A functional Ê : H → R is called a sub-linear expectation if it is monotone, constant-preserving, sub-additive and positively homogeneous; the conjugate expectation ε is defined by ε(X) := −Ê(−X), X ∈ H. From the definition, it is easily shown that ε(X) ≤ Ê(X) for all X ∈ H. The corresponding pair of capacities is given by

V(A) := inf{Ê(ξ) : I_A ≤ ξ, ξ ∈ H},  𝒱(A) := 1 − V(A^c), A ∈ F,

where A^c is the complement set of A. It is obvious that V is sub-additive, i.e., V(A ∪ B) ≤ V(A) + V(B). Since I(|X| ≥ x) ≤ |X|^p/x^p ∈ H, this implies the Markov inequality: for any x > 0 and p > 0,

V(|X| ≥ x) ≤ Ê(|X|^p)/x^p.

By Lemma 4.1 in Zhang [5], we also have the Hölder inequality: for all X, Y ∈ H and p, q > 1 satisfying p^{−1} + q^{−1} = 1,

Ê(|XY|) ≤ (Ê(|X|^p))^{1/p} (Ê(|Y|^q))^{1/q}.
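As a concrete illustration of these notions (not taken from the paper), a sub-linear expectation can be realized as the upper expectation over a family of probability models. The following Python sketch, with a hypothetical sample space and hypothetical family of weights, checks sub-additivity, the relation ε(X) ≤ Ê(X), and the Markov inequality numerically.

```python
import numpy as np

# Illustration only (not from the paper): a sub-linear expectation E_hat
# can be realized as the supremum of ordinary linear expectations over a
# family of probability models on a common sample space. Sample points
# and the family of weights below are hypothetical.
omega = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
family = [np.array([0.10, 0.20, 0.40, 0.20, 0.10]),
          np.array([0.30, 0.20, 0.00, 0.20, 0.30]),
          np.array([0.05, 0.15, 0.60, 0.15, 0.05])]

def E_hat(f):
    """Upper (sub-linear) expectation: sup_P E_P[f(omega)]."""
    return max(float(np.dot(p, f(omega))) for p in family)

X = lambda w: w          # X(w) = w
Y = lambda w: w ** 2     # Y(w) = w^2

# Sub-additivity: E_hat(X + Y) <= E_hat(X) + E_hat(Y).
lhs = E_hat(lambda w: X(w) + Y(w))
rhs = E_hat(X) + E_hat(Y)
assert lhs <= rhs + 1e-12

# Conjugate expectation eps(X) = -E_hat(-X) satisfies eps(X) <= E_hat(X).
eps = -E_hat(lambda w: -X(w))
assert eps <= E_hat(X) + 1e-12

# Markov-type inequality: V(|X| >= x) <= E_hat(|X|^p) / x^p, where the
# capacity V(A) is realized as the upper probability sup_P P(A).
x, p_exp = 1.5, 2
V = max(float(np.dot(p, (np.abs(omega) >= x).astype(float))) for p in family)
assert V <= E_hat(lambda w: np.abs(w) ** p_exp) / x ** p_exp
```

In this toy example the supremum for X, Y and X + Y happens to be attained by the same model, so the sub-additivity check holds with equality; for other families the inequality is strict.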
Definition 2.5. (Negative dependence) In a sub-linear expectation space (Ω, H, Ê), a random vector Y = (Y_1, · · · , Y_n) (Y_i ∈ H) is said to be negatively dependent (ND) to another random vector X = (X_1, · · · , X_m) (X_i ∈ H) under Ê if for each pair of test functions ϕ_1 ∈ C_{l,Lip}(R^m) and ϕ_2 ∈ C_{l,Lip}(R^n) that are either both coordinatewise non-decreasing or both coordinatewise non-increasing, we have

Ê[ϕ_1(X)ϕ_2(Y)] ≤ Ê[ϕ_1(X)] Ê[ϕ_2(Y)],

whenever ϕ_1(X) ≥ 0, Ê[ϕ_2(Y)] ≥ 0 and the expectations involved are finite. {X_n, n ≥ 1} is said to be a sequence of negatively dependent random variables if X_{n+1} is negatively dependent to (X_1, · · · , X_n) for each n ≥ 1. It is obvious that if {X_n, n ≥ 1} is a sequence of negatively dependent random variables and the functions f_1(x), f_2(x), ... ∈ C_{l,Lip}(R) are all non-decreasing (resp. all non-increasing), then {f_n(X_n), n ≥ 1} is also a sequence of negatively dependent random variables.

Definition 2.6. A sub-linear expectation Ê : H → R is called countably sub-additive if Ê(X) ≤ Σ_{n=1}^∞ Ê(X_n) whenever X ≤ Σ_{n=1}^∞ X_n, where X, X_n ∈ H and X, X_n ≥ 0, n ≥ 1.

We need the following lemmas to prove the main results.
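The ND inequality in Definition 2.5 contains the classical notion of negative dependence as the special case where the family of models reduces to a single probability. A minimal numerical check of that special case (not from the paper), using the standard antithetic pair (X, −X) and two hypothetical non-decreasing test functions:

```python
import numpy as np

# Classical sanity check (single-probability special case, not from the
# paper): for an antithetic pair (X, Y) = (X, -X) and non-decreasing
# test functions phi1, phi2, the ND inequality
#   E[phi1(X) * phi2(Y)] <= E[phi1(X)] * E[phi2(Y)]
# holds; the test functions below are hypothetical choices.
vals = np.linspace(-1.0, 1.0, 201)   # X uniform on a symmetric grid

def E(values):
    """Expectation under uniform weights on the grid."""
    return float(np.mean(values))

phi1 = lambda x: np.maximum(x, 0.0)              # non-decreasing
phi2 = lambda y: 1.0 / (1.0 + np.exp(-3.0 * y))  # non-decreasing

X, Y = vals, -vals
lhs = E(phi1(X) * phi2(Y))
rhs = E(phi1(X)) * E(phi2(Y))
assert lhs <= rhs + 1e-12
```

The inequality holds here because phi1(X) is non-decreasing while phi2(−X) is non-increasing in X; the monotone-transform closure property stated above is the sub-linear analogue of this mechanism.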
Here c_q is a positive constant depending only on q.
(ii) By the proof of (i), we can obtain (2.4); then, for any β > 1 and c > 0, the stated inequality follows. Hence, the proof of (ii) is established.

Main results
Theorem 3.1. Assume that {X, X_n, n ≥ 1} is a sequence of negatively dependent and identically distributed random variables under sub-linear expectations. Suppose that {a_nk, 1 ≤ k ≤ n, n ≥ 1} is an array of positive real numbers and Ê is countably sub-additive. Set b_n = n^{1/α} ln^{1/γ} n, where 0 < α ≤ 2.

Theorem 3.2. Assume that the conditions of Theorem 3.1 are satisfied. Then, for 0 < θ < 2 and ε > 0, the corresponding complete integral convergence holds, where (·)^+ denotes the positive part.

Proof of Theorem 3.1.
For fixed n ≥ 1 and 1 ≤ k ≤ n, denote the truncated random variables Y_nk. We can easily see that for any ε > 0, in order to prove (3.4) we just need to prove (4.1) and (4.2). First of all, we prove (4.1). We know that in a probability space EI(|X| ≤ a) = P(|X| ≤ a) holds; nevertheless, in the sub-linear expectation space the function I(|x| ≤ a) is not necessarily continuous, so I(|X| ≤ a) need not belong to H and ÊI(|X| ≤ a) does not necessarily exist. Therefore, we need to replace the indicator function by functions in C_{l,Lip}(R). We define g(x) ∈ C_{l,Lip}(R) as follows: for 0 < μ < 1, let g be decreasing in x ≥ 0, with 0 ≤ g(x) ≤ 1 for all x, g(x) = 1 if |x| ≤ μ, and g(x) = 0 if |x| > 1, so that I(|x| ≤ μ) ≤ g(x) ≤ I(|x| ≤ 1).
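A concrete piecewise-linear choice of such a g (one of many admissible ones; the level μ = 0.5 below is an arbitrary illustration, not from the paper) can be checked numerically for the sandwich property and for a global Lipschitz constant:

```python
import numpy as np

# A concrete piecewise-linear instance of the modifier g (one admissible
# choice; mu = 0.5 is an arbitrary illustration):
#   g(x) = 1 for |x| <= mu,  g(x) = 0 for |x| >= 1,  linear in between.
mu = 0.5

def g(x):
    x = np.abs(np.asarray(x, dtype=float))
    return np.clip((1.0 - x) / (1.0 - mu), 0.0, 1.0)

xs = np.linspace(-2.0, 2.0, 4001)
# Sandwich property: I(|x| <= mu) <= g(x) <= I(|x| <= 1).
ind_mu = (np.abs(xs) <= mu).astype(float)
ind_one = (np.abs(xs) <= 1.0).astype(float)
assert np.all(ind_mu <= g(xs) + 1e-12)
assert np.all(g(xs) <= ind_one + 1e-12)
# g is decreasing in x >= 0 with global Lipschitz constant 1/(1 - mu).
slopes = np.abs(np.diff(g(xs))) / np.diff(xs)
assert np.all(slopes <= 1.0 / (1.0 - mu) + 1e-9)
```

By the same sandwich, 1 − g(x) satisfies I(|x| > 1) ≤ 1 − g(x) ≤ I(|x| > μ), which is the standard way tail events are handled in such proofs.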
According to (3.3) and the non-negativity of a_nk, we can get Ê(a_nk X_k) = a_nk Ê(X_k) = 0. (4.5) Combining this with (4.3), we have the required bound. When 0 < α < 1, according to (2.9) and (3.1), we can get Ê(|X|) ≤ C_V(|X|^2 ln^{1−2/γ}|X|) < ∞. Noting that |Y_nk| ≤ |X_k| and γ > 0, we have the desired estimate. Thus, for ε > 0 and all n large enough, (4.1) holds. In order to prove (4.2), it suffices to show the corresponding moment bound, where c_p is a positive constant depending only on p.
We know that a_nk is non-negative. By Definition 2.5, for fixed n ≥ 1, {a_nk Y_nk − Ê(a_nk Y_nk), 1 ≤ k ≤ n} is still a negatively dependent sequence of random variables. By (4.9), the Markov inequality and (2.3) with q = 2, we can bound the tail probability. By (4.3) and the C_r inequality, for any λ > 0, we can obtain the corresponding moment estimate, and then for any r > 0 the series bound follows. It is noted that, according to (2.5), (3.1), (4.11), (4.12), 0 < γ < 2 and the fact that g(x) is decreasing in x ≥ 0, the remaining terms are easily estimated.
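The compressed chain of estimates in this step follows a standard pattern. A hedged reconstruction of the generic step (consistent with the cited Markov inequality and (2.3) with q = 2, but not necessarily the paper's exact display) reads:

```latex
% Markov inequality followed by the moment inequality (2.3) with q = 2
% for the negatively dependent, truncated and centered summands:
\begin{aligned}
V\Bigl(\Bigl|\sum_{k=1}^{n}\bigl(a_{nk}Y_{nk}
      -\hat{\mathbb{E}}(a_{nk}Y_{nk})\bigr)\Bigr|\ge \varepsilon b_n\Bigr)
&\le \frac{1}{(\varepsilon b_n)^{2}}\,
     \hat{\mathbb{E}}\Bigl(\Bigl|\sum_{k=1}^{n}\bigl(a_{nk}Y_{nk}
      -\hat{\mathbb{E}}(a_{nk}Y_{nk})\bigr)\Bigr|^{2}\Bigr)\\
&\le \frac{c_{2}}{(\varepsilon b_n)^{2}}
     \sum_{k=1}^{n} a_{nk}^{2}\,\hat{\mathbb{E}}\bigl(Y_{nk}^{2}\bigr),
\end{aligned}
```

where c_2 is the constant from (2.3); the remaining work is to show that this last sum, weighted as in (4.2), yields a convergent series.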
Next, we show that H_5 < ∞. Similarly to the estimation of I_5, we have the analogous bound. Hence, the proof of Theorem 3.2 is established.

Conclusions
This paper examines complete convergence and complete integral convergence in the sub-linear expectation space. The proof methodology differs from that used in probability spaces, since V and Ê need not be countably sub-additive in a sub-linear expectation space. Additionally, the definition of identical distribution under sub-linear expectations is based on Ê rather than V.
Therefore, suitable auxiliary tools are crucial for a thorough investigation in the sub-linear expectation space. This study relies primarily on Zhang's [5] upper expectation inequalities, which serve as useful tools in our proofs. Our findings indicate that the complete integral convergence of maxima obtained here is more comprehensive than previous results. In upcoming research, we aim to explore further interesting results.

Use of AI tools declaration
The authors declare they have not used Artificial Intelligence (AI) tools in the creation of this article.