Abstract

The aim of this paper is to study and establish precise asymptotics for complete integral convergence theorems in the setting of a sublinear expectation space. As applications, precise asymptotics for order complete integral convergence theorems are generalized to the sublinear expectation space context. We extend precise asymptotics for complete moment convergence theorems from the traditional probability space to the sublinear expectation space. Our results generalize the corresponding results of Liu and Lin (2006). To our knowledge, precise asymptotics under sublinear expectation have not been reported before, and we provide a method for studying this subject.

1. Introduction

The sublinear expectation space has the advantage of modelling uncertainty in probability and distribution; consequently, limit theorems for sublinear expectations have recently raised a large number of issues of interest. Limit theorems are important research topics in probability and statistics and are widely used in finance and other fields. Classical limit theorems hold only under model certainty. In practice, however, such a model certainty assumption is unrealistic in many areas of application, because uncertainty phenomena cannot be modeled under model certainty. Motivated by modelling uncertainty in practice, Peng [1] introduced a new notion of sublinear expectation. As an alternative to the traditional probability/expectation, capacity/sublinear expectation has been studied in many fields such as statistics, finance, economics, and measures of risk (see Denis and Martini [2]; Gilboa [3]; Marinacci [4]; Peng [1, 5–7]; etc.). The general framework of sublinear expectation on a general function space was introduced by Peng [1, 7, 8], and sublinear expectation is a natural extension of the classical linear expectation.

Because sublinear expectation provides a very flexible framework for modelling sublinear probability problems, limit theorems under sublinear expectation have received increasing attention, and a series of useful results have been established. Peng [1, 7, 8] constructed the basic framework and basic properties and proved a central limit theorem under sublinear expectations; Zhang [9–11] established exponential inequalities, Rosenthal's inequalities, strong laws of large numbers, and a law of the iterated logarithm; Hu [12], Chen [13], and Wu and Jiang [14] studied strong laws of large numbers; Wu et al. [15] studied the asymptotic approximation of inverse moments; Xi et al. [16] and Lin and Feng [17] studied complete convergence; and so on. In general, extending the limit properties of conventional probability space to the sublinear expectation setting is highly desirable and of considerable significance in both theory and applications. Because sublinear expectations and capacities are not additive, many powerful tools and common methods for linear expectations and probabilities are no longer valid, so the study of limit theorems under sublinear expectation becomes much more complex and difficult.

Since the concept of complete convergence of a sequence of random variables was introduced by Hsu and Robbins [18], there have been extensions in several directions. One of them is to study the precise rate, which is more exact than complete convergence. Precise asymptotics for complete convergence and complete moment convergence are among the most important problems in probability theory, and many related results have been obtained in the probability space setting. Recent results can be found in the studies of Heyde [19]; Liu and Lin [20]; Zhao [21]; Li [22]; Zhou [23]; Gut and Steinebach [24]; He and Xie [25]; Wang et al. [26–29]; Spătaru [30]; and Kong and Dai [31]. However, under sublinear expectations, due to the uncertainty of expectation and capacity, precise asymptotics are essentially different from the ordinary probability space, and the study of precise asymptotics of complete convergence and complete integral convergence for sublinear expectations is much more complex and difficult. Precise asymptotics theorems under sublinear expectation have not been reported. The purpose of this paper is to establish precise asymptotics theorems for () order complete integral convergence for independent and identically distributed random variables under sublinear expectation. As a result, the corresponding results of Liu and Lin [20] are generalized to the sublinear expectation space context.

In the next section, we summarize some basic notations and concepts and related properties under the sublinear expectations.

2. Preliminaries

We use the framework and notations of Peng [8]. Let $(\Omega, \mathcal{F})$ be a given measurable space, and let $\mathcal{H}$ be a linear space of real functions defined on $\Omega$ such that if $X_1, X_2, \ldots, X_n \in \mathcal{H}$, then $\varphi(X_1, X_2, \ldots, X_n) \in \mathcal{H}$ for each $\varphi \in C_{l,\mathrm{Lip}}(\mathbb{R}^n)$, where $C_{l,\mathrm{Lip}}(\mathbb{R}^n)$ denotes the linear space of (local Lipschitz) functions $\varphi$ satisfying
$$|\varphi(x) - \varphi(y)| \le C(1 + |x|^m + |y|^m)\,|x - y|, \quad \forall x, y \in \mathbb{R}^n,$$
for some $C > 0$ and $m \in \mathbb{N}$ depending on $\varphi$. $\mathcal{H}$ is considered as a space of “random variables.” In this case, we denote $X = (X_1, X_2, \ldots, X_n) \in \mathcal{H}^n$.

Definition 1. A sublinear expectation $\hat{\mathbb{E}}$ on $\mathcal{H}$ is a function $\hat{\mathbb{E}}: \mathcal{H} \to [-\infty, \infty]$ satisfying the following properties: for all $X, Y \in \mathcal{H}$,
(a) Monotonicity: if $X \ge Y$, then $\hat{\mathbb{E}}[X] \ge \hat{\mathbb{E}}[Y]$.
(b) Constant preserving: $\hat{\mathbb{E}}[c] = c$ for $c \in \mathbb{R}$.
(c) Subadditivity: $\hat{\mathbb{E}}[X + Y] \le \hat{\mathbb{E}}[X] + \hat{\mathbb{E}}[Y]$ whenever $\hat{\mathbb{E}}[X] + \hat{\mathbb{E}}[Y]$ is not of the form $+\infty - \infty$ or $-\infty + \infty$.
(d) Positive homogeneity: $\hat{\mathbb{E}}[\lambda X] = \lambda \hat{\mathbb{E}}[X]$ for $\lambda \ge 0$.
The triple $(\Omega, \mathcal{H}, \hat{\mathbb{E}})$ is called a sublinear expectation space.
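As a concrete example (our addition, standard in the literature on upper expectations): the supremum of a family of linear expectations is a sublinear expectation, and properties (a)–(d) can be checked directly.

```latex
% A standard example of a sublinear expectation: the upper expectation over
% a family \mathcal{P} of probability measures on (\Omega, \mathcal{F}),
%     \hat{\mathbb{E}}[X] := \sup_{P \in \mathcal{P}} E_P[X], \qquad X \in \mathcal{H}.
% Monotonicity and constant preserving are inherited from each E_P, and
% positive homogeneity follows from E_P[\lambda X] = \lambda E_P[X], \lambda \ge 0.
% Subadditivity is the elementary estimate
\[
  \hat{\mathbb{E}}[X + Y]
  = \sup_{P \in \mathcal{P}} \bigl( E_P[X] + E_P[Y] \bigr)
  \le \sup_{P \in \mathcal{P}} E_P[X] + \sup_{P \in \mathcal{P}} E_P[Y]
  = \hat{\mathbb{E}}[X] + \hat{\mathbb{E}}[Y].
\]
```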
Given a sublinear expectation $\hat{\mathbb{E}}$, let us denote the conjugate expectation $\hat{\varepsilon}$ of $\hat{\mathbb{E}}$ by
$$\hat{\varepsilon}[X] := -\hat{\mathbb{E}}[-X], \quad X \in \mathcal{H}.$$
From the definition, it is easily shown that for all $X, Y \in \mathcal{H}$,
$$\hat{\varepsilon}[X] \le \hat{\mathbb{E}}[X], \quad \hat{\mathbb{E}}[X + c] = \hat{\mathbb{E}}[X] + c, \quad \hat{\mathbb{E}}[X - Y] \ge \hat{\mathbb{E}}[X] - \hat{\mathbb{E}}[Y].$$
If $\hat{\mathbb{E}}[Y] = \hat{\varepsilon}[Y]$, then $\hat{\mathbb{E}}[X + Y] = \hat{\mathbb{E}}[X] + \hat{\mathbb{E}}[Y]$ for any $X \in \mathcal{H}$.
Next, we consider the capacities corresponding to the sublinear expectations. Let $\mathcal{G} \subset \mathcal{F}$. A function $V: \mathcal{G} \to [0, 1]$ is called a capacity if
$$V(\emptyset) = 0, \quad V(\Omega) = 1, \quad V(A) \le V(B) \ \text{for} \ A \subset B, \ A, B \in \mathcal{G}.$$
It is said to be subadditive if $V(A \cup B) \le V(A) + V(B)$ for all $A, B \in \mathcal{G}$ with $A \cup B \in \mathcal{G}$. In the sublinear expectation space $(\Omega, \mathcal{H}, \hat{\mathbb{E}})$, we denote a pair $(\mathbb{V}, \mathcal{V})$ of capacities by
$$\mathbb{V}(A) := \inf\{\hat{\mathbb{E}}[\xi] : I_A \le \xi, \ \xi \in \mathcal{H}\}, \quad \mathcal{V}(A) := 1 - \mathbb{V}(A^c), \quad A \in \mathcal{F},$$
where $A^c$ is the complement set of $A$. By the definition of $\mathbb{V}$ and $\mathcal{V}$, it is obvious that $\mathbb{V}$ is subadditive, and
$$\mathcal{V}(A) \le \mathbb{V}(A), \quad \hat{\mathbb{E}}[f] \le \mathbb{V}(A) \le \hat{\mathbb{E}}[g], \quad \hat{\varepsilon}[f] \le \mathcal{V}(A) \le \hat{\varepsilon}[g], \quad \text{if } f \le I_A \le g, \ f, g \in \mathcal{H}.$$
This implies the Markov inequality: for all $x > 0$, $p > 0$,
$$\mathbb{V}(|X| \ge x) \le \frac{\hat{\mathbb{E}}[|X|^p]}{x^p},$$
from $I(|X| \ge x) \le |X|^p / x^p$. By Lemma 4.1 in Zhang [10], we have the Hölder inequality: for $p, q > 1$ satisfying $p^{-1} + q^{-1} = 1$,
$$\hat{\mathbb{E}}[|XY|] \le \left(\hat{\mathbb{E}}[|X|^p]\right)^{1/p} \left(\hat{\mathbb{E}}[|Y|^q]\right)^{1/q}.$$
And particularly, we have the Jensen inequality: for $0 < r < s$,
$$\left(\hat{\mathbb{E}}[|X|^r]\right)^{1/r} \le \left(\hat{\mathbb{E}}[|X|^s]\right)^{1/s}.$$
Also, we define the Choquet integrals/expectations $(C_{\mathbb{V}}, C_{\mathcal{V}})$ by
$$C_V(X) := \int_0^\infty V(X \ge t)\, dt + \int_{-\infty}^0 \left(V(X \ge t) - 1\right) dt,$$
with $V$ being replaced by $\mathbb{V}$ and $\mathcal{V}$, respectively.
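As a quick sanity check on the Choquet expectation (our addition, not in the original): for a nonnegative random variable the second integral vanishes, and for an indicator the Choquet expectation returns the capacity itself.

```latex
% For X \ge 0 we have V(X \ge t) = 1 for t < 0, so the integral over
% (-\infty, 0] contributes 0 and
%     C_V(X) = \int_0^\infty V(X \ge t)\, dt .
% Taking X = I_A with A \in \mathcal{F}, note \{I_A \ge t\} = A for
% 0 < t \le 1 and = \emptyset for t > 1, hence
\[
  C_V(I_A) = \int_0^\infty V(I_A \ge t)\, dt = \int_0^1 V(A)\, dt = V(A),
\]
% so the Choquet expectation is consistent with the underlying capacity.
```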

Definition 2 (Peng [1] and Zhang [9]).
(i) Identical distribution: let $X_1$ and $X_2$ be two $n$-dimensional random vectors defined, respectively, in sublinear expectation spaces $(\Omega_1, \mathcal{H}_1, \hat{\mathbb{E}}_1)$ and $(\Omega_2, \mathcal{H}_2, \hat{\mathbb{E}}_2)$. They are called identically distributed if
$$\hat{\mathbb{E}}_1[\varphi(X_1)] = \hat{\mathbb{E}}_2[\varphi(X_2)], \quad \forall \varphi \in C_{l,\mathrm{Lip}}(\mathbb{R}^n),$$
whenever the subexpectations are finite. A sequence $\{X_n; n \ge 1\}$ of random variables is said to be identically distributed if, for each $i \ge 1$, $X_i$ and $X_1$ are identically distributed.
(ii) Independence: in a sublinear expectation space $(\Omega, \mathcal{H}, \hat{\mathbb{E}})$, a random vector $Y = (Y_1, \ldots, Y_n)$, $Y_i \in \mathcal{H}$, is said to be independent of another random vector $X = (X_1, \ldots, X_m)$, $X_i \in \mathcal{H}$, under $\hat{\mathbb{E}}$ if for each test function $\varphi \in C_{l,\mathrm{Lip}}(\mathbb{R}^m \times \mathbb{R}^n)$, we have $\hat{\mathbb{E}}[\varphi(X, Y)] = \hat{\mathbb{E}}\left[\hat{\mathbb{E}}[\varphi(x, Y)]\big|_{x = X}\right]$, whenever $\overline{\varphi}(x) := \hat{\mathbb{E}}[|\varphi(x, Y)|] < \infty$ for all $x$ and $\hat{\mathbb{E}}[|\overline{\varphi}(X)|] < \infty$.
(iii) Independent random variables: a sequence of random variables $\{X_n; n \ge 1\}$ is said to be independent if $X_{n+1}$ is independent of $(X_1, \ldots, X_n)$ for each $n \ge 1$.
In the following, let $\{X_n; n \ge 1\}$ be a sequence of random variables in $(\Omega, \mathcal{H}, \hat{\mathbb{E}})$ and $S_n = \sum_{i=1}^n X_i$. The symbol $C$ stands for a generic positive constant which may differ from one place to another. Let $a_n \sim b_n$ denote $\lim_{n \to \infty} a_n / b_n = 1$, let $a_n \ll b_n$ denote that there exists a constant $C > 0$ such that $a_n \le C b_n$ for sufficiently large $n$, and let $I(\cdot)$ denote an indicator function.
To prove our results, we need the following four lemmas.

Lemma 1 (Theorem 3.1 in Zhang [10]). Let be a sequence of independent random variables in with . Then,(i)For any ,(ii)If , thenwhere .

Lemma 2. For any , we have

Proof. We only prove (16). Let and denote the inverse function of . Then, (16) follows from the following three equations:Therefore, (16) holds.
Here, we give the notation of the G-normal distribution, which was introduced by Peng [7].

Definition 3 (G-normal random variable). For $0 \le \underline{\sigma}^2 \le \overline{\sigma}^2 < \infty$, a random variable $\xi$ in a sublinear expectation space $(\Omega, \mathcal{H}, \hat{\mathbb{E}})$ is called a G-normal distributed random variable (write $\xi \sim N(0, [\underline{\sigma}^2, \overline{\sigma}^2])$ under $\hat{\mathbb{E}}$) if for any $\varphi \in C_{l,\mathrm{Lip}}(\mathbb{R})$, the function $u(t, x) := \hat{\mathbb{E}}[\varphi(x + \sqrt{t}\,\xi)]$ ($t \ge 0$, $x \in \mathbb{R}$) is the unique viscosity solution of the following heat equation:
$$\partial_t u - G\left(\partial_{xx}^2 u\right) = 0, \quad u(0, x) = \varphi(x),$$
where $G(\alpha) = \frac{1}{2}\left(\overline{\sigma}^2 \alpha^+ - \underline{\sigma}^2 \alpha^-\right)$.
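To connect this with the classical case (a standard observation, added here for orientation): when the two variance bounds coincide, the generator $G$ becomes linear and the G-heat equation reduces to the classical heat equation.

```latex
% With \underline{\sigma}^2 = \overline{\sigma}^2 = \sigma^2, using
% \alpha^+ - \alpha^- = \alpha,
%   G(\alpha) = \tfrac{1}{2}\sigma^2(\alpha^+ - \alpha^-) = \tfrac{1}{2}\sigma^2\alpha,
% so the nonlinear equation \partial_t u - G(\partial_{xx}^2 u) = 0 becomes
\[
  \partial_t u = \frac{\sigma^2}{2}\,\partial_{xx}^2 u, \qquad u(0, x) = \varphi(x),
\]
% whose solution is u(t,x) = E[\varphi(x + \sqrt{t}\,\sigma N)] with
% N \sim N(0,1), recovering the classical normal distribution N(0, \sigma^2 t).
```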

Lemma 3 (Theorem 3.3 and Remark 3.4 in Peng [7] (CLT)). Suppose that $\{X_n; n \ge 1\}$ is a sequence of independent and identically distributed random variables with $\hat{\mathbb{E}}[X_1] = \hat{\varepsilon}[X_1] = 0$. Write $\overline{\sigma}^2 := \hat{\mathbb{E}}[X_1^2]$ and $\underline{\sigma}^2 := \hat{\varepsilon}[X_1^2]$. Then, for any continuous function $\varphi$ satisfying the linear growth condition $|\varphi(x)| \le C(1 + |x|)$,
$$\lim_{n \to \infty} \hat{\mathbb{E}}\left[\varphi\left(\frac{S_n}{\sqrt{n}}\right)\right] = \hat{\mathbb{E}}[\varphi(\xi)],$$
where $\xi \sim N(0, [\underline{\sigma}^2, \overline{\sigma}^2])$ under $\hat{\mathbb{E}}$.

In particular, if $\underline{\sigma}^2 = \overline{\sigma}^2$, then Lemma 3 becomes a classical central limit theorem.

Remark 1. For any , by , , and under , (19) becomes

Lemma 4 (Lemma 3 in Chen and Hu [32]). Suppose that under . Let be a probability measure and be a bounded continuous function on . If is a Brownian motion under , thenwhere

From Peng [8], if $\xi \sim N(0, [\underline{\sigma}^2, \overline{\sigma}^2])$ under $\hat{\mathbb{E}}$, then for each convex function $\varphi$,
$$\hat{\mathbb{E}}[\varphi(\xi)] = \frac{1}{\sqrt{2\pi \overline{\sigma}^2}} \int_{-\infty}^{\infty} \varphi(x) \exp\left(-\frac{x^2}{2\overline{\sigma}^2}\right) dx,$$
but if $\varphi$ is a concave function, $\overline{\sigma}^2$ above must be replaced by $\underline{\sigma}^2$. If $\underline{\sigma}^2 = \overline{\sigma}^2 = \sigma^2$, then $N(0, [\underline{\sigma}^2, \overline{\sigma}^2]) = N(0, \sigma^2)$, which is a classical normal distribution.
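For instance (our worked example, using the convex/concave dichotomy just stated): taking $\varphi(x) = x^2$ (convex) and $\varphi(x) = -x^2$ (concave) shows that the upper and lower second moments of $\xi$ are exactly the two variance bounds.

```latex
% Convex case \varphi(x) = x^2: the formula with \overline{\sigma}^2 gives
\[
  \hat{\mathbb{E}}[\xi^2]
  = \frac{1}{\sqrt{2\pi\overline{\sigma}^2}} \int_{-\infty}^{\infty}
    x^2 e^{-x^2/(2\overline{\sigma}^2)}\, dx
  = \overline{\sigma}^2 .
\]
% Concave case \varphi(x) = -x^2: with \underline{\sigma}^2 in place of
% \overline{\sigma}^2, we get \hat{\mathbb{E}}[-\xi^2] = -\underline{\sigma}^2, i.e.,
\[
  \hat{\varepsilon}[\xi^2] = -\hat{\mathbb{E}}[-\xi^2] = \underline{\sigma}^2 .
\]
```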

In particular, notice that $|x|^p$, $p \ge 1$, is a convex function; taking $\varphi(x) = |x|^p$ in (23), we get

(24) implies that
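The convex case of (23) can also be checked numerically. The sketch below (our illustration; the bounds $\underline{\sigma} = 0.5$, $\overline{\sigma} = 1.5$ and the test function $\varphi(x) = x^4$ are assumptions for the demo) verifies that, for a convex $\varphi$, the map $\sigma \mapsto E[\varphi(\sigma N)]$ with $N$ standard normal is nondecreasing, so the supremum over $\sigma \in [\underline{\sigma}, \overline{\sigma}]$ is attained at $\overline{\sigma}$, consistent with the convex formula picking the upper variance.

```python
# Numerical check (illustrative; sigma_lo, sigma_hi and phi(x) = x**4 are
# assumed for the demo) that for the convex phi(x) = x**4 the map
# sigma -> E[phi(sigma * N)] is nondecreasing on [sigma_lo, sigma_hi],
# so its supremum is attained at sigma_hi.
# For N ~ N(0, 1), the closed form is E[(sigma * N)**4] = 3 * sigma**4.

def fourth_moment(sigma):
    """Closed form for E[(sigma * N)^4] with N standard normal."""
    return 3.0 * sigma ** 4

sigma_lo, sigma_hi = 0.5, 1.5
grid = [sigma_lo + k * (sigma_hi - sigma_lo) / 100 for k in range(101)]
values = [fourth_moment(s) for s in grid]

# The values are nondecreasing along the grid, so the supremum sits at sigma_hi.
assert all(values[k] <= values[k + 1] for k in range(100))
print(max(values))  # 3 * 1.5**4 = 15.1875
```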

Definition 4. A sublinear expectation $\hat{\mathbb{E}}$ is said to be continuous if it satisfies:
Continuity from below: $\hat{\mathbb{E}}[X_n] \uparrow \hat{\mathbb{E}}[X]$ if $0 \le X_n \uparrow X$, where $X_n, X \in \mathcal{H}$.
Continuity from above: $\hat{\mathbb{E}}[X_n] \downarrow \hat{\mathbb{E}}[X]$ if $0 \le X_n \downarrow X$, where $X_n, X \in \mathcal{H}$.

Lemma 5. Suppose that the conditions of Lemma 3 hold and is continuous, set , here and later, under ; then,

Remark 2. Lemma 5 is a powerful tool for studying the uniform convergence of the central limit theorem under sublinear expectations, which plays a key role in proving the theorems in this paper.

Proof of Lemma 5. If $\underline{\sigma}^2 = \overline{\sigma}^2$, then Lemma 3 reduces to a classical central limit theorem. In classical probability, (26) follows from the central limit theorem together with the fact that the limiting normal distribution function is continuous in $x$ (so that the convergence is uniform). Therefore, we only need to prove the case $\underline{\sigma}^2 < \overline{\sigma}^2$.
Obviously, ; thus, .
For , let be a Lipschitz even function and nondecreasing for such that , for all and if and if . Then,This combines equation (7), for ,where and .
On the other hand,Thus,Therefore, in order to prove (26), it suffices to show thatWrite and .
Obviously, ; and are nonincreasing functions on . Thus, for any , the limit exists. Actually, taking and , by continuity of , we haveHence, is continuous for . As well as from .
Therefore, let be an arbitrary positive number; there exist points satisfying the conditionsFurthermore, by (19) and Remark 1, there exists a number such that for and we haveIf , then for , we getIf , then for ,If , then for ,Thus, for all and . That is, (31) holds.
Next, we prove (32).
Because is continuous on , is uniformly continuous on . Therefore, for any , there is (can be assumed ), such that ; if , thenLet , for any ; we have and . Hence,Thereby,For , let and ; combining any , under . By Lemma 4,where is the Lipschitz constant of .
Therefore, combining with (41), (32) is established. This completes the proof of Lemma 5.

3. Main Results and Proofs

Our results are stated as follows.

Theorem 1. Let be a sequence of independent and identically distributed random variables in . We assume that is continuous andThen,here and later, under .

Theorem 2. Under the conditions of Theorem 1, for ,

For , we have the following theorem.

Theorem 3. Let be a sequence of independent and identically distributed random variables in . We assume that is continuous andThen,

Remark 3. Theorems 1–3 extend the corresponding results obtained by Liu and Lin [20] from the probability space to the sublinear expectation space.

Proof of Theorem 1. Note thatHence, in order to establish (45), it suffices to prove thatObviously, (50) follows fromWithout loss of generality, here and later, we assume that . Let ,Let , from (26), as . So, by Toeplitz’s lemma, if , and , then ,Proceeding as in the proof of Lemma 5, by (7), (27), and the identical distribution of , for any ,Hence, for , taking and in Lemma 1 (i),from .
Since also satisfies the conditions of Theorem 1, we may replace the with the in the above:Therefore,More generally, for any , we haveThis implies, from Markov’s inequality and (24), thatLetting first and then letting , we getfrom (44) and (15).
From this, combining with (53) and (54), (51) is established. This completes the proof of Theorem 1.
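For the reader's convenience, the version of Toeplitz's lemma invoked in this and the following proofs (a standard statement; the notation here is ours) can be written as follows.

```latex
% Toeplitz's lemma: let a_k \ge 0 with b_n := \sum_{k=1}^{n} a_k \to \infty
% as n \to \infty, and let x_k \to x. Then the weighted averages converge:
\[
  \lim_{n \to \infty} \frac{1}{b_n} \sum_{k=1}^{n} a_k x_k = x .
\]
% In these proofs it is applied with x_k the quantities converging by the
% central limit theorem (Lemma 5) and a_k the weights arising from the series.
```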

Proof of Theorem 2. Since when , by Theorem 1, we only need to discuss the case . Note thatHence, from Theorem 1, in order to establish (46), it suffices to prove thatLet . Note thatHence, in order to establish (63), it suffices to prove thatWe first prove (65); let ; thenfrom (25).
Now, we prove (66). Let . Then,whereSince implies , one can easily obtain thatBy Lemma 1 (ii), Markov’s inequality, and (24), we getFrom (69)–(72) and , using Toeplitz’s lemma, we getThat is, (66) is established.
Finally, we prove (67):By (59), Markov’s inequality, and (24),Letting first and then letting , we get (67) from (44) and (15). This completes the proof of Theorem 2.

Proof of Theorem 3. Note thatHence, by Theorem 1, in order to establish (48), it suffices to prove thatHowever,Hence, in order to establish (77), it suffices to prove thatWe first prove (79). Because , there is , such that for any . Therefore, combining with Markov’s inequality and (24),Thus, (79) followsNow, we prove (80). Let . Then,whereSince implies , one can easily obtain thatBy (59), Markov’s inequality, (24), and , we getFrom (84)–(87) and , using Toeplitz’s lemma, (80) followsFinally, we prove (81) as follows:By (59), Markov’s inequality, and (24),Let first and then let ; we get (81) from (47) and (16). This completes the proof of Theorem 3.

Data Availability

No data were used to support this study.

Conflicts of Interest

The authors declare no conflicts of interest.

Acknowledgments

This research was supported by the National Natural Science Foundation of China (11661029) and the Support Program of the Guangxi China Science Foundation (2018GXNSFAA281011).