
On the complete convergence for weighted sums of a class of random variables

Abstract

In this article, some new results on complete convergence for weighted sums \(\sum_{i=1}^{n}a_{ni}X_{i}\) of random variables satisfying the Rosenthal type inequality are established under mild conditions. These results extend the corresponding theorems of Deng et al. (Filomat 28(3):509-522, 2014) and Gan and Chen (Acta Math. Sci. 28(2):269-281, 2008).

1 Introduction

In this paper we are interested in the complete convergence of a sequence of random variables which satisfies the Rosenthal type inequality. First let us recall some definitions and well-known results.

1.1 Complete convergence

The following concept of complete convergence of a sequence of random variables, which plays an important role in the limit theory of probability, was first introduced by Hsu and Robbins [3]. A random sequence \(\{X_{n}, n \geq1\}\) is said to converge completely to the constant C (written \(X_{n}\rightarrow C\) completely) if for any \(\varepsilon> 0\),

$$\sum_{n=1}^{\infty}P\bigl(\vert X_{n}-C\vert >\varepsilon\bigr)< \infty. $$

By the Borel-Cantelli lemma, complete convergence implies that, with probability 1, the event \(\{|X_{n}-C|>\varepsilon\}\) occurs for only finitely many n; since \(\varepsilon>0\) is arbitrary, it follows that \(X_{n}\rightarrow C\) almost surely (a.s.). For the case of i.i.d. random variables, Hsu and Robbins [3] proved that the sequence of arithmetic means of the random variables converges completely to the expected value if the variance of the summands is finite. Somewhat later, Erdős [4] proved the converse. These results are summarized as follows.

Hsu-Robbins-Erdős strong law. Let \(\{X_{n}, n\ge1\}\) be a sequence of i.i.d. random variables with mean zero and set \(S_{n}=\sum_{i=1}^{n} X_{i}\), \(n\ge1\). Then \(E X_{1}^{2}<\infty\) is equivalent to the condition that

$$\sum_{n=1}^{\infty}P\bigl(\vert S_{n}\vert >\varepsilon n\bigr)< \infty\quad \text{for all } \varepsilon>0. $$
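As a numerical illustration (not part of the original paper), suppose the \(X_{i}\) are i.i.d. standard normal, so that \(S_{n}\sim N(0,n)\) and \(P(|S_{n}|>\varepsilon n)=\operatorname{erfc}(\varepsilon\sqrt{n/2})\) exactly. These tail probabilities decay faster than any power of n, so the Hsu-Robbins series is finite; the sketch below sums the terms directly:

```python
import math

def tail_prob(n, eps=0.5):
    # For i.i.d. N(0,1) summands, S_n ~ N(0, n), so
    # P(|S_n| > eps*n) = P(|Z| > eps*sqrt(n)) = erfc(eps*sqrt(n/2)).
    return math.erfc(eps * math.sqrt(n / 2.0))

# Partial sums of the Hsu-Robbins series sum_n P(|S_n| > eps*n).
partial = [sum(tail_prob(n) for n in range(1, N + 1)) for N in (10, 100, 1000)]
print(partial)
```

The partial sums stabilize already for moderate N, which is exactly the summability the theorem asserts under \(EX_{1}^{2}<\infty\).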

The Hsu-Robbins-Erdős strong law is a fundamental theorem in probability theory and has been intensively investigated in several directions by many authors over the past decades. One of the most important results is the Baum-Katz [5] strong law of large numbers.

Baum and Katz strong law. Let \(\alpha p\ge1\), \(p> 2\), and let \(\{ X_{n}\}\) be a sequence of i.i.d. random variables and \(E|X_{1}|^{p}<\infty\). If \(\frac{1}{2}<\alpha\le1\), assume that \(E X_{1}=0\). Then

$$\sum_{n=1}^{\infty}n^{\alpha p-2}P \Biggl( \max_{1\le j\le n}\Biggl\vert \sum_{i=1}^{j} X_{i}\Biggr\vert >\varepsilon n^{\alpha} \Biggr)< \infty \quad \text{for all } \varepsilon>0. $$

The Baum and Katz strong law bridges the integrability of summands and the rate of convergence in the Marcinkiewicz-Zygmund strong law of large numbers.

It is well known that the analysis of weighted sums plays an important role in statistics, for example in the jackknife estimate, the nonparametric regression function estimate, and so on. Many authors have considered the complete convergence of weighted sums of random variables. Thrum [6] studied the almost sure convergence of weighted sums of i.i.d. random variables; Li et al. [7] obtained complete convergence of weighted sums without the identically distributed assumption. Liang and Su [8] extended the results of Thrum [6] and Li et al. [7] and showed the complete convergence of weighted sums of negatively associated sequences. The reader can refer to further literature on complete convergence of weighted sums, such as Xue et al. [9] for NSD sequences, Gan and Chen [10] for NOD sequences, and so on.

1.2 Rosenthal type inequality

The Rosenthal type inequality is expressed as follows: a sequence of random variables \(\{Z_{n}, n \geq1\}\) is said to satisfy the Rosenthal type inequality for some \(r \geq2\) if there exists a positive constant C such that, for every \(n \geq1\),

$$ E \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}(Z_{i}-EZ_{i})\Biggr\vert ^{r} \Biggr)\leq C \Biggl[\sum_{i=1}^{n}E|Z_{i}-EZ_{i}|^{r}+ \Biggl(\sum_{i=1}^{n}E(Z_{i}-EZ_{i})^{2} \Biggr)^{\frac{r}{2}} \Biggr]. $$
(1.1)

Many dependent sequences satisfy the Rosenthal type inequality. We refer to Shao [11] for a negatively associated sequence; Utev and Peligrad [12] for a ρ̃-mixing sequence; Shen [13] and Stout [14] for an extended negatively dependent sequence (END); Hu [15] and Wang et al. [16] for a negatively superadditive dependent sequence (NSD); Asadian et al. [17] and Wu [18] for a negatively orthant dependent sequence (NOD); Yuan and An [19] for an asymptotically almost negatively associated sequence (AANA).

The concept of stochastic domination is presented as follows.

Definition 1.1

A sequence \(\{X_{n}, n \geq1\}\) of random variables is said to be stochastically dominated by a random variable X if there exists a positive constant C such that

$$P\bigl(\vert X_{n}\vert >x\bigr)\leq C P\bigl(\vert X\vert >x \bigr) $$

for all \(x\geq0\) and \(n\geq1\).
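As a concrete example of Definition 1.1 (an illustration, not from the paper): if \(X_{n}\) is uniform on \([0,a_{n}]\) with \(0<a_{n}\leq1\) and X is uniform on \([0,1]\), then \(P(X_{n}>x)=\max(0,1-x/a_{n})\leq1-x=P(X>x)\) for \(0\leq x\leq1\), so the sequence is stochastically dominated by X with \(C=1\). A short deterministic check:

```python
def tail_uniform(a, x):
    # P(U > x) for U uniform on [0, a], x >= 0.
    return max(0.0, 1.0 - x / a)

a_seq = [1.0 - 1.0 / (n + 1) for n in range(1, 50)]  # a_n in (0, 1)
C = 1.0
# Verify P(X_n > x) <= C * P(X > x) on a grid of x values in [0, 1].
dominated = all(
    tail_uniform(a, x) <= C * tail_uniform(1.0, x)
    for a in a_seq
    for x in [k / 100.0 for k in range(101)]
)
print(dominated)
```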

In the present paper, we study the complete convergence of weighted sums of a random sequence under the assumption that the random variables satisfy the Rosenthal type inequality. Our main results are stated in Section 2 and the proofs are given in Section 3. Throughout this paper, C denotes a positive constant, which may take different values in different expressions; \(a_{n}=O(b_{n})\) means \(|a_{n}/b_{n}|\leq C\); and \(I(\cdot)\) stands for the indicator function.

2 Main results

Theorem 2.1

Let \(\{X_{n}, n\geq1\}\) be a sequence of random variables with zero means, which is stochastically dominated by a random variable X with \(E|X|^{p}<\infty\) for some \(p\geq1\). Let \(\{a_{ni}, 1\leq i\leq n, n\geq1\}\) be an array of real numbers satisfying \(|a_{ni}|\leq C\) for \(1\leq i \leq n\) and \(n\geq1\), where C is a positive constant. Let \(\{b_{n},n \geq1\}\) and \(\{c_{n},n \geq1\}\) be two sequences of positive constants such that, for some \(r\geq\max\{2,p\}\),

$$ \begin{aligned} &\frac{n}{c_{n}^{p}}\rightarrow0 \quad \textit{and} \quad \sum_{n=1}^{k}nb_{n}=O \bigl(c_{k}^{p}\bigr), \\ &\sum_{n=1}^{\infty}\frac{n^{\frac{r}{2}}b_{n}}{c_{n}^{r}}< \infty \quad \textit{and} \quad \sum_{n=k}^{\infty} \frac {nb_{n}}{c_{n}^{r}}=O\bigl(c_{k}^{p-r}\bigr). \end{aligned} $$
(2.1)

Suppose that Rosenthal type inequality of \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq c_{n})\) (\(1\leq i\leq n\)) holds for the above r. Then we have

$$ \sum_{n=1}^{\infty}b_{n}P \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni}X_{i}\Biggr\vert >\varepsilon c_{n} \Biggr)< \infty \quad \textit{for all } \varepsilon>0. $$
(2.2)

Remark 2.1

Under the conditions of Theorem 2.1, if we take \(b_{n}=n^{p\alpha-2}\), \(c_{n}=n^{\alpha}\) for \(1/2<\alpha\leq1\) and \(p\alpha>1\), and suppose that the Rosenthal type inequality of \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq n^{\alpha})\) (\(1\leq i\leq n\)) holds for

$$\textstyle\begin{cases} r>\max \{p, \frac{p\alpha-1}{\alpha-2^{-1}} \},& \text{if } p\geq2, \\ r=2,& \text{if } 1< p< 2; \end{cases} $$

then it is easy to see that the conditions in (2.1) hold. Hence we have

$$ \sum_{n=1}^{\infty}n^{p\alpha-2}P \Biggl(\max _{1\leq j \leq n} \Biggl\vert \sum_{i=1}^{j}a_{ni}X_{i} \Biggr\vert >\varepsilon n^{\alpha} \Biggr)< \infty \quad \textit{for all } \varepsilon>0. $$
(2.3)
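The verification behind Remark 2.1 is elementary arithmetic; as a sketch (with \(b_{n}=n^{p\alpha-2}\), \(c_{n}=n^{\alpha}\), \(p\alpha>1\), \(1/2<\alpha\leq1\)), the conditions in (2.1) become

```latex
% b_n = n^{p\alpha-2}, c_n = n^{\alpha}, with p\alpha > 1 and 1/2 < \alpha \le 1
\begin{aligned}
&\frac{n}{c_n^{p}} = n^{1-p\alpha} \to 0, \qquad
 \sum_{n=1}^{k} n b_n = \sum_{n=1}^{k} n^{p\alpha-1}
 = O\bigl(k^{p\alpha}\bigr) = O\bigl(c_k^{p}\bigr),\\
&\sum_{n=1}^{\infty}\frac{n^{r/2} b_n}{c_n^{r}}
 = \sum_{n=1}^{\infty} n^{p\alpha-2-r(\alpha-1/2)} < \infty
 \iff r\Bigl(\alpha-\tfrac{1}{2}\Bigr) > p\alpha-1,\\
&\sum_{n=k}^{\infty}\frac{n b_n}{c_n^{r}}
 = \sum_{n=k}^{\infty} n^{p\alpha-1-r\alpha}
 = O\bigl(k^{(p-r)\alpha}\bigr) = O\bigl(c_k^{p-r}\bigr)
 \quad \text{for } r > p.
\end{aligned}
```

For \(p\geq2\) the third line gives exactly the threshold \(r>\frac{p\alpha-1}{\alpha-2^{-1}}\); for \(1<p<2\) with \(r=2\) the exponent is \(p\alpha-2\alpha-1<-1\) because \(p<2\), so all the conditions hold and (2.3) follows.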

Remark 2.2

Under the conditions of Theorem 2.1, let \(b_{n}=n^{s-2}\), \(c_{n}=n^{s/p}\) for \(s>p\), \(p>1\), and let the Rosenthal type inequality of \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq n^{s/p})\) (\(1\leq i\leq n\)) hold for

$$\textstyle\begin{cases} r>\max \{p, \frac{1-s}{\frac{1}{2}-\frac{s}{p}} \},& \text{if } p\geq2, \\ r=2,& \text{if } 1< p< 2; \end{cases} $$

then it is clear that the conditions in (2.1) hold. Hence we have

$$ \sum_{n=1}^{\infty}n^{s-2}P \Biggl(\max _{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}X_{i} \Biggr\vert >\varepsilon n^{s/p} \Biggr)< \infty\quad \mbox{for all } \varepsilon>0. $$
(2.4)

Remark 2.3

Under the conditions of Theorem 2.1, if we take \(b_{n}=\frac {\log n}{n}\), \(c_{n}=(n\log n)^{1/p}\) for some \(1\le p\le2\), and suppose that the Rosenthal type inequality of \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq(n\log n)^{1/p})\) (\(1\leq i\leq n\)) holds for

$$\textstyle\begin{cases} r=2,& \text{if } 1\le p < 2, \\ r>4,& \text{if } p=2. \end{cases} $$

then it is easy to check that the conditions in (2.1) hold. Hence we have

$$ \sum_{n=1}^{\infty}\frac{\log n}{n}P \Biggl( \max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}X_{i} \Biggr\vert >\varepsilon(n\log n)^{1/p} \Biggr)< \infty\quad \mbox{for all } \varepsilon>0. $$
(2.5)
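A sketch of the verification behind Remark 2.3 (an added note): with \(b_{n}=\frac{\log n}{n}\) and \(c_{n}=(n\log n)^{1/p}\), the third condition in (2.1) reads

```latex
\sum_{n=2}^{\infty}\frac{n^{r/2} b_n}{c_n^{r}}
  = \sum_{n=2}^{\infty} n^{\,r/2-1-r/p}\,(\log n)^{1-r/p}.
```

For \(1\le p<2\) and \(r=2\) the exponent of n equals \(-2/p<-1\), so the series converges regardless of the logarithmic factor. For \(p=2\) the exponent of n is exactly \(-1\), and the Bertrand series \(\sum_{n}(\log n)^{1-r/2}/n\) converges precisely when \(r/2-1>1\), i.e. \(r>4\), which explains the cutoff. The remaining conditions are immediate: \(n/c_{n}^{p}=1/\log n\to0\), \(\sum_{n=1}^{k}nb_{n}=\sum_{n=1}^{k}\log n=O(k\log k)=O(c_{k}^{p})\), and the tail condition is checked in the same way.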

Remark 2.4

Under the conditions of Theorem 2.1, if we take \(b_{n}=\frac {1}{n\log n}\), \(c_{n}=(n\log\log n)^{1/p}\) for some \(1\le p\le2\), and suppose that the Rosenthal type inequality of \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq(n\log\log n)^{1/p})\) (\(1\leq i\leq n\)) holds for

$$\textstyle\begin{cases} r=2,& \text{if } 1\le p < 2, \\ r>2,& \text{if } p=2. \end{cases} $$

then it is easy to check that the conditions in (2.1) hold. Hence we have

$$ \sum_{n=1}^{\infty}\frac{1}{n\log n}P \Biggl( \max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}X_{i} \Biggr\vert >\varepsilon(n\log\log n)^{1/p} \Biggr)< \infty\quad \mbox{for all } \varepsilon>0. $$
(2.6)

Theorem 2.2

Let \(\{X_{n}, n\geq1\}\) be a sequence of random variables with zero means and let \(\{a_{ni}, 1\leq i\leq n, n\geq1\}\) be an array of real numbers satisfying \(|a_{ni}|\leq C\) for \(1\leq i \leq n\) and \(n\geq1\), where C is a positive constant. Let \(\{c_{n},n \geq1\}\) be a sequence of positive constants with \(c_{n}\uparrow\infty\) and let \(\{\Psi_{n}(t), n \geq1\}\) be a sequence of nonnegative even functions such that \(\Psi_{n}(t)>0\) for \(t>0\) and each \(n\geq1\). Suppose that the Rosenthal type inequality of \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq c_{n})\) (\(1\leq i\leq n\)) holds for \(r=2\). In addition, assume that

$$ \frac{\Psi_{n}(|t|)}{|t|} \uparrow ,\qquad \frac{\Psi _{n}(|t|)}{t^{2}} \downarrow \quad \textit{as } |t| \uparrow $$
(2.7)

and

$$ \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi_{i}(X_{i})}{\Psi _{i}(c_{n})}< \infty. $$
(2.8)

Then we have

$$ \sum_{n=1}^{\infty}P \Biggl(\max _{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}X_{i} \Biggr\vert >\varepsilon c_{n} \Biggr)< \infty\quad \textit{for all } \varepsilon>0. $$
(2.9)

Theorem 2.3

Let \(\{X_{n}, n\geq1\}\) be a sequence of random variables with zero means and let \(\{a_{ni}, 1\leq i\leq n, n\geq1\}\) be an array of real numbers satisfying \(|a_{ni}|\leq C\) for \(1\leq i \leq n\) and \(n\geq1\), where C is a positive constant. Let \(\{c_{n},n \geq1\}\) be a sequence of positive constants with \(c_{n}\uparrow\infty\) and let \(\{\Psi_{n}(t), n \geq1\}\) be a sequence of nonnegative even functions such that \(\Psi_{n}(t)>0\) for \(t>0\) and each \(n\geq1\). Suppose that the Rosenthal type inequality of \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq c_{n})\) (\(1\leq i\leq n\)) holds for \(r=2\). In addition, assume that for some \(1\leq p< q\leq2\) and each \(n\geq1\),

$$ \frac{\Psi_{n}(|t|)}{|t|^{p}} \uparrow \quad \textit{and} \quad \frac {\Psi_{n}(|t|)}{t^{q}} \downarrow \quad \textit{as } |t| \uparrow $$
(2.10)

and

$$ \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi_{i}(X_{i})}{\Psi _{i}(c_{n})}< \infty. $$
(2.11)

Then we have

$$ \sum_{n=1}^{\infty}P \Biggl(\max _{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}X_{i} \Biggr\vert >\varepsilon c_{n} \Biggr)< \infty\quad \textit{for all } \varepsilon>0. $$
(2.12)

Corollary 2.1

Let \(\{\Psi_{n}(t)\}\) be a sequence of positive even functions satisfying (2.10) for some \(1\leq p< q\) and \(q>2\). Under the conditions in Theorem 2.3, suppose that the Rosenthal type inequality of \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq c_{n})\) (\(1\leq i\leq n\)) holds for \(r=q\) and

$$ \sum_{n=1}^{\infty}c_{n}^{-r} \Biggl(\sum_{i=1}^{n}EX_{i}^{2} \Biggr)^{r/2}< \infty \quad \textit{for } r= q, $$
(2.13)

Then (2.12) holds.

3 Proofs of main results

In order to prove the main theorems, we need the following lemma, which collects the basic properties of stochastic domination. One can refer to Shen [20], Wang et al. [21], Wu [22], or Shen and Wu [23] for the proof.

Lemma 3.1

Let \(\{X_{n}, n\geq1\}\) be a sequence of random variables which is stochastically dominated by a random variable X. Then for any \(\alpha >0 \) and \(b>0\),

$$ E\vert X_{n}\vert ^{\alpha}I\bigl(\vert X_{n}\vert \leq b\bigr)\leq C \bigl[E\vert X\vert ^{\alpha}I \bigl(\vert X\vert \leq b\bigr)+b^{\alpha}P\bigl(\vert X\vert >b \bigr) \bigr] $$
(3.1)

and

$$ E\vert X_{n}\vert ^{\alpha}I\bigl(\vert X_{n}\vert \geq b\bigr)\leq CE\vert X\vert ^{\alpha}I\bigl( \vert X\vert \geq b\bigr). $$
(3.2)

Consequently, \(E|X_{n}|^{\alpha}\leq C E|X|^{\alpha}\).
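To illustrate Lemma 3.1 (a numerical sanity check, not part of the proof), take \(X_{n}\) uniform on \([0,a]\) with \(a\leq1\), which is stochastically dominated by X uniform on \([0,1]\). For this family the truncated moments have closed forms, and (3.1) and (3.2) can be checked on a grid of parameters; in this particular case even \(C=1\) suffices:

```python
def trunc_moment_below(a, alpha, b):
    # E[U^alpha * 1{U <= b}] for U uniform on [0, a].
    m = min(a, b)
    return m ** (alpha + 1) / ((alpha + 1) * a)

def trunc_moment_above(a, alpha, b):
    # E[U^alpha * 1{U >= b}] for U uniform on [0, a].
    if b >= a:
        return 0.0
    return (a ** (alpha + 1) - b ** (alpha + 1)) / ((alpha + 1) * a)

def tail(a, b):
    # P(U > b) for U uniform on [0, a].
    return max(0.0, 1.0 - b / a)

C = 1.0  # for this uniform family the lemma's constant can be taken as 1
ok = True
for alpha in (0.5, 1.0, 2.0):
    for a in (0.3, 0.6, 0.9, 1.0):      # X_n ~ U(0, a), dominated by X ~ U(0, 1)
        for b in (0.1, 0.25, 0.5, 0.9, 1.0):
            # Inequality (3.1): truncated-below moment of X_n.
            lhs1 = trunc_moment_below(a, alpha, b)
            rhs1 = C * (trunc_moment_below(1.0, alpha, b) + b ** alpha * tail(1.0, b))
            # Inequality (3.2): truncated-above moment of X_n.
            lhs2 = trunc_moment_above(a, alpha, b)
            rhs2 = C * trunc_moment_above(1.0, alpha, b)
            ok = ok and lhs1 <= rhs1 + 1e-12 and lhs2 <= rhs2 + 1e-12
print(ok)
```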

Proof of Theorem 2.1

For \(1\leq i\leq n\) and \(n\geq1\), denote \(X_{ni}'=X_{i}I(|X_{i}|\leq c_{n})\). Since \(EX_{i}=0\), and using the conditions \(nc_{n}^{-p}\to0\) and \(|a_{ni}|\leq C\), we have

$$\begin{aligned}& c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni}EX_{ni}' \Biggr\vert \\& \quad = c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}EX_{i}I \bigl(\vert X_{i}\vert >c_{n}\bigr)\Biggr\vert \\& \quad \leq c_{n}^{-1}\max_{1\leq j \leq n}\sum _{i=1}^{j} \bigl\vert a_{ni}EX_{i}I \bigl(\vert X_{i}\vert >c_{n}\bigr)\bigr\vert \\& \quad \leq c_{n}^{-1}\sum_{i=1}^{n} \vert a_{ni}\vert E\vert X_{i}\vert I\bigl(\vert X_{i}\vert >c_{n}\bigr) \\& \quad \leq C c_{n}^{-1}\sum_{i=1}^{n} \vert a_{ni}\vert E\vert X\vert I\bigl(\vert X\vert >c_{n}\bigr) \\& \quad \leq C n c_{n}^{-1}E\vert X\vert I\bigl(\vert X \vert >c_{n}\bigr) \\& \quad \leq C n c_{n}^{-p}E\vert X\vert ^{p} \rightarrow0\quad \mbox{as } n\rightarrow \infty. \end{aligned}$$

Hence for any \(\varepsilon>0\), it follows that for all n large enough

$$ c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}EX_{ni}' \Biggr\vert < \frac{\varepsilon}{2}. $$
(3.3)

From (3.3), it is easy to see that

$$\begin{aligned}& \sum_{n=1}^{\infty}b_{n}P \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni}X_{i}\Biggr\vert >\varepsilon c_{n} \Biggr) \\& \quad \leq C\sum_{n=1}^{\infty}b_{n} \sum_{i=1}^{n}P\bigl(\vert X_{i} \vert >c_{n}\bigr)+C\sum_{n=1}^{\infty}b_{n}P \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni}X_{ni}' \Biggr\vert >\varepsilon c_{n} \Biggr) \\& \quad \leq C\sum_{n=1}^{\infty}nb_{n}P \bigl(\vert X\vert >c_{n}\bigr)+C\sum_{n=1}^{\infty}b_{n}P \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni} \bigl(X_{ni}'-EX_{ni}' \bigr)\Biggr\vert >\frac {\varepsilon c_{n}}{2} \Biggr) \\& \quad =: CI+CJ. \end{aligned}$$
(3.4)

In order to prove (2.2), it suffices to prove that \(I<\infty\) and \(J<\infty\). First, for I, by the condition (2.1), it is easy to check that

$$\begin{aligned} I =& \sum_{n=1}^{\infty}nb_{n}P \bigl(\vert X\vert >c_{n}\bigr) \\ \leq& C\sum_{n=1}^{\infty}nb_{n}\sum _{k=n}^{\infty}P \bigl(c_{k}< \vert X \vert \leq c_{k+1} \bigr) \\ \leq& C\sum_{k=1}^{\infty}P \bigl(c_{k}< \vert X\vert \leq c_{k+1} \bigr)\sum _{n=1}^{k}nb_{n} \\ \leq& C\sum_{k=1}^{\infty}c_{k}^{p}P \bigl(c_{k}< \vert X\vert \leq c_{k+1} \bigr) \\ \leq& CE|X|^{p}< \infty. \end{aligned}$$
(3.5)

Second, we will show \(J<\infty\). It follows by the Markov inequality and the Rosenthal type inequality that, for \(r\geq2\),

$$\begin{aligned} J \leq& \biggl(\frac{2}{\varepsilon}\biggr)^{r}\sum _{n=1}^{\infty }b_{n}c_{n}^{-r}E \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni}\bigl(X_{ni}'-EX_{ni}' \bigr)\Biggr\vert ^{r}\Biggr) \\ \leq& C\sum_{n=1}^{\infty}b_{n}c_{n}^{-r} \Biggl\{ \Biggl(\sum_{i=1}^{n}a_{ni}^{2}E \bigl(X_{ni}'-EX_{ni}' \bigr)^{2}\Biggr)^{\frac{r}{2}} +\sum_{i=1}^{n}a_{ni}^{r}E \bigl\vert X_{ni}'-EX_{ni}'\bigr\vert ^{r}\Biggr\} \\ \leq& C\sum_{n=1}^{\infty}b_{n}c_{n}^{-r} \Biggl\{ \Biggl(\sum_{i=1}^{n}a_{ni}^{2}E \bigl(X_{ni}'\bigr)^{2}\Biggr)^{\frac{r}{2}} + \sum_{i=1}^{n}a_{ni}^{r}E \bigl\vert X_{ni}'\bigr\vert ^{r}\Biggr\} \\ \leq& C\sum_{n=1}^{\infty}b_{n}c_{n}^{-r} \Biggl(\sum_{i=1}^{n}E \vert X_{i}\vert ^{2}I\bigl(\vert X_{i}\vert \leq c_{n}\bigr)\Biggr)^{\frac{r}{2}} \\ &{}+C\sum _{n=1}^{\infty}b_{n}c_{n}^{-r} \sum_{i=1}^{n}E \vert X_{i} \vert ^{r}I\bigl(\vert X_{i}\vert \leq c_{n} \bigr) \\ =:& CJ_{1}+CJ_{2}. \end{aligned}$$
(3.6)

For the case \(p \geq2\), from Lemma 3.1, Markov’s inequality, and the condition (2.1), we have

$$\begin{aligned} J_{1} \leq &C\sum_{n=1}^{\infty}b_{n}c_{n}^{-r} \Biggl(\sum_{i=1}^{n} \bigl(E\vert X \vert ^{2}I\bigl(\vert X\vert \leq c_{n} \bigr)+c_{n}^{2}P\bigl(\vert X\vert > c_{n} \bigr) \bigr) \Biggr)^{\frac{r}{2}} \\ \leq& C\sum_{n=1}^{\infty}b_{n}c_{n}^{-r} \Biggl(\sum_{i=1}^{n} \bigl(E\vert X \vert ^{2}I\bigl(\vert X\vert \leq c_{n}\bigr)+E\vert X\vert ^{2}I\bigl(\vert X\vert > c_{n}\bigr) \bigr) \Biggr)^{\frac {r}{2}} \\ \leq& C\sum_{n=1}^{\infty}b_{n}c_{n}^{-r}n^{\frac{r}{2}}< \infty. \end{aligned}$$
(3.7)

Since \(r\ge p\), it follows by Lemma 3.1 again, (3.5), and the condition (2.1) that

$$\begin{aligned} J_{2} =& \sum_{n=1}^{\infty}b_{n}c_{n}^{-r} \sum_{i=1}^{n}E \vert X_{i} \vert ^{r}I\bigl(\vert X_{i}\vert \leq c_{n} \bigr) \\ \leq& C\sum_{n=1}^{\infty}b_{n}c_{n}^{-r} \sum_{i=1}^{n} \bigl[E\vert X\vert ^{r}I\bigl(\vert X\vert \leq c_{n}\bigr)+c_{n}^{r}P \bigl(\vert X\vert > c_{n}\bigr)\bigr] \\ =& C\sum_{n=1}^{\infty}nb_{n}c_{n}^{-r}E \vert X\vert ^{r}I\bigl(\vert X\vert \leq c_{n} \bigr)+C\sum_{n=1}^{\infty}nb_{n}P\bigl( \vert X\vert > c_{n}\bigr) \\ \leq& C\sum_{n=1}^{\infty}nb_{n}c_{n}^{-r} \sum_{k=1}^{n}E\vert X\vert ^{r}I\bigl(c_{k-1}< \vert X\vert \leq c_{k} \bigr)+C \\ =& C\sum_{k=1}^{\infty}E \vert X\vert ^{r}I\bigl(c_{k-1}< \vert X\vert \leq c_{k} \bigr)\sum_{n=k}^{\infty}nb_{n}c_{n}^{-r}+C \\ \leq& C\sum_{k=1}^{\infty}E \vert X\vert ^{p}I\bigl(c_{k-1}< \vert X\vert \leq c_{k} \bigr)+C \\ \leq& CE\vert X\vert ^{p}+C< \infty. \end{aligned}$$
(3.8)

For the case \(1\le p<2\), we have \(r\ge2\), and we can take \(r=2\) in the Rosenthal type inequality. By an argument similar to (3.8), we get

$$\begin{aligned} J \leq& C\sum_{n=1}^{\infty}b_{n}c_{n}^{-2} \sum_{i=1}^{n}E \vert X_{i} \vert ^{2}I\bigl(\vert X_{i}\vert \leq c_{n} \bigr) \\ \le&CE|X|^{p}+C< \infty. \end{aligned}$$

Hence from the above discussions the claim (2.2) holds. □

Proof of Theorem 2.2

For \(1\leq i\leq n\) and \(n\geq1\), define \(X_{ni}'=X_{i}I(|X_{i}|\leq c_{n})\). From the conditions (2.7), \(EX_{i}=0\), and \(|a_{ni}|\leq C\), we have

$$\begin{aligned}& c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni}EX_{ni}' \Biggr\vert \\& \quad = c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}EX_{i}I \bigl(\vert X_{i}\vert >c_{n}\bigr)\Biggr\vert \\& \quad \leq c_{n}^{-1}\max_{1\leq j \leq n}\sum _{i=1}^{j} \bigl\vert a_{ni}EX_{i}I \bigl(\vert X_{i}\vert >c_{n}\bigr)\bigr\vert \\& \quad \leq \sum_{i=1}^{n}c_{n}^{-1} \vert a_{ni}\vert E\vert X_{i}\vert I\bigl(\vert X_{i}\vert >c_{n}\bigr) \\& \quad \leq C\sum_{i=1}^{n}E \frac{\Psi_{i}(\vert X_{i}\vert )}{\Psi _{i}(c_{n})}I\bigl(\vert X_{i}\vert >c_{n}\bigr) \\& \quad \leq C\sum_{i=1}^{n}E \frac{\Psi_{i}(X_{i})}{\Psi _{i}(c_{n})}\rightarrow0 \quad \mbox{as } n\rightarrow\infty. \end{aligned}$$

Hence, for any \(\varepsilon>0\), we have for all n large enough,

$$ c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}EX_{ni}' \Biggr\vert < \frac{\varepsilon}{2}. $$
(3.9)

From (3.9), it follows that

$$\begin{aligned}& \sum_{n=1}^{\infty}P \Biggl(\max _{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}X_{i} \Biggr\vert >\varepsilon c_{n} \Biggr) \\& \quad \leq C\sum_{n=1}^{\infty}\sum _{i=1}^{n}P\bigl(\vert X_{i}\vert >c_{n}\bigr)+C\sum_{n=1}^{\infty}P \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni}X_{ni}' \Biggr\vert >\varepsilon c_{n} \Biggr) \\& \quad \leq C\sum_{n=1}^{\infty}\sum _{i=1}^{n}P\bigl(\vert X_{i}\vert >c_{n}\bigr)+C\sum_{n=1}^{\infty}P \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni} \bigl(X_{ni}'-EX_{ni}' \bigr)\Biggr\vert >\frac {\varepsilon c_{n}}{2} \Biggr) \\& \quad =: CI+CJ. \end{aligned}$$
(3.10)

Now it suffices to control the terms I and J. For the term I, by the condition (2.8), we can get

$$\begin{aligned}& \sum_{n=1}^{\infty}\sum _{i=1}^{n}P\bigl(\vert X_{i}\vert >c_{n}\bigr) \\& \quad \leq \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(|X_{i}|)}{\Psi_{i}(c_{n})} \\& \quad = \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi_{i}(X_{i})}{\Psi _{i}(c_{n})}< \infty. \end{aligned}$$
(3.11)

For the term J, using the Markov inequality, the Rosenthal type inequality for \(r=2\), and the conditions (2.7) and (2.8), we have

$$\begin{aligned} J \leq& \biggl(\frac{2}{\varepsilon}\biggr)^{2}\sum _{n=1}^{\infty }c_{n}^{-2}E \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni}\bigl(X_{ni}'-EX_{ni}' \bigr)\Biggr\vert ^{2}\Biggr) \\ \leq& C\sum_{n=1}^{\infty}c_{n}^{-2} \sum_{i=1}^{n}a_{ni}^{2}E \bigl(X_{ni}'-EX_{ni}' \bigr)^{2} \\ \leq& C\sum_{n=1}^{\infty}c_{n}^{-2} \sum_{i=1}^{n}\vert a_{ni}\vert ^{2}E \bigl\vert X_{ni}'\bigr\vert ^{2} \\ \leq& C\sum_{n=1}^{\infty}c_{n}^{-2} \sum_{i=1}^{n}E \vert X_{i} \vert ^{2}I\bigl(\vert X_{i}\vert \leq c_{n} \bigr) \\ \leq& C\sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(\vert X_{i}\vert )}{\Psi_{i}(c_{n})}I\bigl(\vert X_{i}\vert \leq c_{n}\bigr) \\ \leq& C\sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(X_{i})}{\Psi_{i}(c_{n})}< \infty. \end{aligned}$$
(3.12)

Hence the proof of Theorem 2.2 is completed. □

Proof of Theorem 2.3

For \(1\leq i\leq n\) and \(n\geq1\), define \(X_{ni}'=X_{i}I(|X_{i}|\leq c_{n})\). Similar to the proof of Theorem 2.2, it suffices to show that, for any \(\varepsilon>0\),

$$\begin{aligned}& c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}EX_{ni}' \Biggr\vert \rightarrow0 \quad \mbox{as } n\rightarrow \infty, \end{aligned}$$
(3.13)
$$\begin{aligned}& \sum_{n=1}^{\infty}\sum _{i=1}^{n}P\bigl(\vert X_{i}\vert >c_{n}\bigr)< \infty, \end{aligned}$$
(3.14)

and

$$ \sum_{n=1}^{\infty}P \Biggl(\max _{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni} \bigl(X_{ni}'-EX_{ni}' \bigr)\Biggr\vert >\frac {\varepsilon c_{n}}{2} \Biggr)< \infty. $$
(3.15)

First, it follows from the conditions (2.10), (2.11), \(EX_{i}=0\), and \(|a_{ni}|\leq C\) that

$$\begin{aligned}& c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}EX_{ni}' \Biggr\vert \\& \quad = c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}EX_{i}I \bigl(\vert X_{i}\vert >c_{n}\bigr)\Biggr\vert \\& \quad \leq c_{n}^{-1}\max_{1\leq j \leq n}\sum _{i=1}^{j} \bigl\vert a_{ni}EX_{i}I \bigl(\vert X_{i}\vert >c_{n}\bigr)\bigr\vert \\& \quad \leq \sum_{i=1}^{n}c_{n}^{-1}|a_{ni}|E \vert X_{i}\vert I\bigl(\vert X_{i}\vert >c_{n}\bigr) \\& \quad \leq C\sum_{i=1}^{n} \frac{E\vert X_{i}\vert ^{p}I(\vert X_{i}\vert >c_{n})}{c_{n}^{p}} \\& \quad \leq C\sum_{i=1}^{n}E \frac{\Psi_{i}(\vert X_{i}\vert )}{\Psi _{i}(c_{n})}I\bigl(\vert X_{i}\vert >c_{n}\bigr) \\& \quad \leq C\sum_{i=1}^{n}E \frac{\Psi_{i}(X_{i})}{\Psi _{i}(c_{n})}\rightarrow0 \quad \mbox{as } n\rightarrow\infty. \end{aligned}$$
(3.16)

Second, by the condition (2.10), \(\Psi_{n}(t)\) is an increasing function for \(t>0\). Therefore, by the condition (2.11),

$$\begin{aligned}& \sum_{n=1}^{\infty}\sum _{i=1}^{n}P\bigl(\vert X_{i}\vert >c_{n}\bigr) \\& \quad \leq \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(|X_{i}|)}{\Psi_{i}(c_{n})} \\& \quad = \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi_{i}(X_{i})}{\Psi _{i}(c_{n})}< \infty. \end{aligned}$$
(3.17)

Finally, for \(1\leq p < q\leq2\), by the Markov inequality, the Rosenthal type inequality for \(r=2\), and the conditions (2.10) and (2.11), we have

$$\begin{aligned}& \sum_{n=1}^{\infty}P \Biggl(\max _{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni} \bigl(X_{ni}'-EX_{ni}' \bigr)\Biggr\vert >\frac {\varepsilon c_{n}}{2} \Biggr) \\& \quad \leq C\sum_{n=1}^{\infty}c_{n}^{-2} \sum_{i=1}^{n}E \vert X_{i} \vert ^{2}I\bigl(\vert X_{i}\vert \leq c_{n} \bigr) \\& \quad \leq \sum_{n=1}^{\infty}\sum _{i=1}^{n}\frac{E\vert X_{i} \vert ^{q}I(\vert X_{i}\vert \leq c_{n})}{c_{n}^{q}} \\& \quad \leq \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(\vert X_{i}\vert )}{\Psi_{i}(c_{n})}I\bigl(\vert X_{i}\vert \leq c_{n}\bigr) \\& \quad \leq \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(X_{i})}{\Psi_{i}(c_{n})}< \infty. \end{aligned}$$
(3.18)

The proof of this theorem is completed. □

Proof of Corollary 2.1

From (3.16), (3.17) in the proof of Theorem 2.3, and the condition (2.10) holding for some \(1\leq p< q\) and \(q>2\), we only need to show (3.15) holds. By Markov’s inequality, the Rosenthal type inequality and the conditions (2.10), (2.11), (2.13), we have, for \(r=q>2\),

$$\begin{aligned}& \sum_{n=1}^{\infty}P \Biggl(\max _{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni} \bigl(X_{ni}'-EX_{ni}' \bigr)\Biggr\vert >\frac {\varepsilon c_{n}}{2} \Biggr) \\& \quad \leq C\sum_{n=1}^{\infty}c_{n}^{-r} \Biggl(\sum_{i=1}^{n}E \vert X_{i}\vert ^{2}I\bigl(\vert X_{i}\vert \leq c_{n}\bigr) \Biggr)^{\frac{r}{2}} +C\sum _{n=1}^{\infty}c_{n}^{-r}\sum _{i=1}^{n}E\vert X_{i} \vert ^{r}I\bigl(\vert X_{i}\vert \leq c_{n}\bigr) \\& \quad \leq C+C\sum_{n=1}^{\infty}c_{n}^{-q} \sum_{i=1}^{n}E \vert X_{i} \vert ^{q}I\bigl(\vert X_{i}\vert \leq c_{n} \bigr) \\& \quad \leq C+C\sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(\vert X_{i}\vert )}{\Psi_{i}(c_{n})}I\bigl(\vert X_{i}\vert \leq c_{n}\bigr) \\& \quad \leq C+C\sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(X_{i})}{\Psi_{i}(c_{n})}< \infty, \end{aligned}$$
(3.19)

which completes the proof. □

4 Conclusions

The present work establishes some new results on complete convergence for weighted sums of random variables which satisfy the Rosenthal type inequality. These results extend some known results.

References

  1. Deng, X, Ge, MM, Wang, XJ, Liu, YF, Zhou, Y: Complete convergence for weighted sums of a class of random variables. Filomat 28(3), 509-522 (2014)


  2. Gan, SX, Chen, PY: Some limit theorems for sequences of pairwise NQD random variables. Acta Math. Sci. 28(2), 269-281 (2008)


  3. Hsu, PL, Robbins, H: Complete convergence and the law of large numbers. Proc. Natl. Acad. Sci. USA 33(2), 25-31 (1947)


  4. Erdős, P: On a theorem of Hsu and Robbins. Ann. Math. Stat. 20, 286-291 (1949)


  5. Baum, LE, Katz, M: Convergence rates in the law of large numbers. Trans. Am. Math. Soc. 120, 108-123 (1965)


  6. Thrum, R: A remark on almost sure convergence of weighted sums. Probab. Theory Relat. Fields 75(3), 425-430 (1987)


  7. Li, DL, Rao, MB, Jiang, TF, Wang, XC: Complete convergence and almost sure convergence of weighted sums of random variables. J. Theor. Probab. 8(1), 49-76 (1995)


  8. Liang, HY, Su, C: Complete convergence for weighted sums of NA sequences. Stat. Probab. Lett. 45(1), 85-95 (1999)


  9. Xue, Z, Zhang, LL, Lei, YJ, Chen, ZG: Complete moment convergence for weighted sums of negatively superadditive dependent random variables. J. Inequal. Appl. 2015, 117 (2015)


  10. Gan, SX, Chen, PY: Some limit theorems for weighted sums of arrays of NOD random variables. Acta Math. Sci. Ser. B Engl. Ed. 32(6), 2388-2400 (2012)


  11. Shao, QM: A comparison theorem on inequalities between negatively associated and independent random variables. J. Theor. Probab. 13(2), 343-356 (2000)


  12. Utev, S, Peligrad, M: Maximal inequalities and an invariance principle for a class of weakly dependent random variables. J. Theor. Probab. 16(1), 101-115 (2003)


  13. Shen, AT: Probability inequalities for END sequence and their applications. J. Inequal. Appl. 2011, 98 (2011)


  14. Stout, WF: Almost Sure Convergence. Academic Press, New York (1974)


  15. Hu, TZ: Negatively superadditive dependence of random variables with applications. Chinese J. Appl. Probab. Statist. 16(2), 133-144 (2000)


  16. Wang, XJ, Deng, X, Zheng, LL, Hu, SH: Complete convergence for arrays of rowwise negatively superadditive-dependent random variables and its applications. Statistics 48(4), 834-850 (2014)


  17. Asadian, N, Fakoor, V, Bozorgnia, A: Rosenthal’s type inequalities for negatively orthant dependent random variables. JIRSS 5(1-2), 69-75 (2006)


  18. Wu, QY: Complete convergence for weighted sums of sequences of negatively dependent random variables. J. Probab. Stat. 2011, Article ID 202015 (2011)


  19. Yuan, D, An, J: Rosenthal type inequalities for asymptotically almost negatively associated random variables and applications. Sci. China Ser. A, Math. 52(9), 1887-1904 (2009)


  20. Shen, AT: On the strong convergence rate for weighted sums of arrays of rowwise negatively orthant dependent random variables. Rev. R. Acad. Cienc. Exactas Fís. Nat., Ser. A Mat. 107(2), 257-271 (2013)


  21. Wang, XJ, Xu, C, Hu, TC, Volodin, A, Hu, SH: On complete convergence for widely orthant-dependent random variables and its applications in nonparametric regression models. Test 23(3), 607-629 (2014)


  22. Wu, QY: Probability Limit Theory for Mixing Sequences. Science Press of China, Beijing (2006)


  23. Shen, AT, Wu, RC: Strong and weak convergence for asymptotically almost negatively associated random variables. Discrete Dyn. Nat. Soc. 2013, Article ID 235012 (2013)



Acknowledgements

The work is supported by Science and Technology Project of Beijing Municipal Education Commission (KM201611232019), NSFC (71501016) and IRTSTHN (14IRTSTHN023), NSFC (11471104).

Author information

Correspondence to Jinghuan Zhao.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

All authors contributed equally to the manuscript, and they read and approved the final manuscript.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Zhao, J., Sun, Y. & Miao, Y. On the complete convergence for weighted sums of a class of random variables. J Inequal Appl 2016, 218 (2016). https://doi.org/10.1186/s13660-016-1165-2

