2.5. Convergence of random series

$\newcommand{\argmin}{\mathop{\mathrm{argmin}}\limits}$ $\newcommand{\argmax}{\mathop{\mathrm{argmax}}\limits}$

As the last section of Chapter 2, we cover convergence of random series. Since I already explained what tail $\sigma$-fields and tail events are, our focus will be on Kolmogorov’s maximal inequality and the three series theorem.


Kolmogorov’s maximal inequality

If $X_i$'s are independent, $EX_i=0$, $\mathrm{Var}(X_i) < \infty$, $S_n := \sum\limits_{k=1}^n X_k$, then for any $x > 0$ the following inequality holds. $$P(\max_{1 \leq k \leq n} |S_k| > x) \leq \frac{1}{x^2} \mathrm{Var}(S_n)$$

Let $A_k = \{ |S_k| > x,\: |S_j| \leq x \text{ for } 1\leq j < k \}$, the event that $k$ is the first index at which the partial sums exceed $x$ in absolute value. The $A_k$'s are disjoint and $\bigcup\limits_{k=1}^n A_k = \{ \max_{1 \leq k \leq n} |S_k| > x \}$. Since $S_k\mathbf{1}_{A_k}$ is $\sigma(X_1,\dots,X_k)$-measurable while $S_n - S_k$ is independent of it with mean $0$, the cross term below vanishes. $$\begin{align} \mathrm{Var}(S_n) = \mathrm{E}S_n^2 &\geq \mathrm{E}(S_n^2; \cup_{k=1}^n A_k) = \sum\limits_{k=1}^n \int_{A_k} S_n^2 \: dP \\ &= \sum\limits_{k=1}^n \int_{A_k} (S_n - S_k)^2 + S_k^2 + 2S_k (S_n - S_k) \: dP \\ &\geq \sum\limits_{k=1}^n \Big\{ \int_{A_k} S_k^2 \: dP + 2\int_{\Omega}S_k\mathbf{1}_{A_k}\: dP \int_{\Omega}(S_n - S_k) \: dP \Big\} \\ &= \sum\limits_{k=1}^n \int_{A_k} S_k^2 \: dP \geq \sum\limits_{k=1}^n \int_{A_k} x^2 \: dP \\ &= x^2P(\cup_{k=1}^n A_k) = x^2 P(\max_{1 \leq k \leq n} |S_k| > x) \end{align}$$


As a result, we obtain an analogue of Chebyshev’s inequality for the maximum of the partial sums of a random series. In the previous post, I also proved that the same result holds for partial sums with a shifted starting point, i.e., for $\max_{m \leq k \leq n} |S_k - S_m|$.
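The bound is easy to sanity-check numerically. Below is a minimal Monte Carlo sketch (my own addition, not from the textbook), assuming standard normal increments; the values of `n`, `x`, and `n_sims` are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, x, n_sims = 100, 15.0, 20_000

# X_i ~ N(0, 1): independent, mean zero, so Var(S_n) = n.
X = rng.standard_normal((n_sims, n))
S = np.cumsum(X, axis=1)                  # partial sums S_1, ..., S_n per path

lhs = np.mean(np.abs(S).max(axis=1) > x)  # estimate of P(max_k |S_k| > x)
rhs = n / x**2                            # Var(S_n) / x^2

print(f"P(max|S_k| > {x}) ~= {lhs:.3f} <= bound {rhs:.3f}")
```

With these values the estimated probability should land comfortably below the bound $n/x^2 \approx 0.44$.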

Kolmogorov’s three series theorem

We need two more results to prove convergence of random series under certain conditions.

If $(X_n)$ decreases almost surely, then $X_n \overset{P}{\to} 0$ if and only if $X_n \to 0 \text{ a.s.}$
(Theorem 2.5.6) $X_1,X_2,\cdots$ are independent with $EX_i = 0$, $\sum_{n=1}^\infty \text{Var}(X_n) < \infty$.
$\implies S_n$ converges a.s.

Let $W_m = \sup_{n,k \ge m} |S_n - S_k|$; this sequence is monotonically decreasing in $m$. By the maximal inequality, $$ P(\max_{m\le k\le N} |S_k - S_m| > \epsilon) \le \frac{1}{\epsilon^2}\sum\limits_{k=m+1}^N \text{Var}(X_k). $$ Letting $N \to \infty$ on both sides, $$ P(\sup_{k \ge m} |S_k - S_m| > \epsilon) \le \frac{1}{\epsilon^2}\sum\limits_{k=m+1}^\infty \text{Var}(X_k) \to 0 \quad \text{as } m \to \infty. $$ Since $W_m \le 2\sup_{k \ge m} |S_k - S_m|$ by the triangle inequality, this implies $W_m \overset{P}{\to} 0$, and by the lemma $W_m \to 0$ a.s. Hence for every $\omega \in \Omega_0 = \{W_m \to 0\}$, which satisfies $P(\Omega_0) = 1$, the sequence $(S_n(\omega))$ is Cauchy in $\mathbb{R}$. Therefore $S_n$ converges a.s.
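To see the theorem in action, here is a small simulation sketch (an assumed example of mine, not from the text): take $X_n = g_n/n$ with $g_n$ i.i.d. standard normal, so that $\sum_n \mathrm{Var}(X_n) = \sum_n n^{-2} < \infty$ and the tail oscillation should shrink as $m$ grows.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
n = np.arange(1, N + 1, dtype=np.float64)

# X_n = g_n / n: independent, mean zero, Var(X_n) = 1/n^2 (summable).
S = np.cumsum(rng.standard_normal(N) / n)

# The tail oscillation sup_{k >= m} |S_k - S_m| should decrease in m.
for m in (10, 100, 1_000, 10_000):
    print(f"m={m:>6}: sup_(k>=m) |S_k - S_m| ~= {np.max(np.abs(S[m-1:] - S[m-1])):.4f}")
```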

Now we state the main theorem.

$X_1,X_2,\cdots$ are independent. Let $Y_i = X_i \mathbf{1}_{|X_i|\le A}$ for some $A > 0.$ Then $S_n=X_1+\cdots+X_n$ converges a.s. if and only if the following three conditions hold.
(i) $\sum_{n=1}^\infty P(|X_n|>A) < \infty.$
(ii) $\sum_{n=1}^\infty EY_n$ converges.
(iii) $\sum_{n=1}^\infty \text{Var}(Y_n) < \infty.$

Intuitively, (i) assures that $X_n = Y_n$ eventually, while (ii) and (iii) make the series $S_n$ converge with the help of Theorem 2.5.6. We will only prove the ($\Leftarrow$) direction and leave the other direction for the next chapter.


($\Leftarrow$) Let $Z_n = Y_n - EY_n.$ Then the $Z_n$'s are independent with mean 0 and $\sum_{n=1}^\infty \text{Var}(Z_n) = \sum_{n=1}^\infty \text{Var}(Y_n) <\infty.$ By 2.5.6, $\sum_{n=1}^\infty Z_n$ converges a.s., and thus by (ii), $\sum_{n=1}^\infty Y_n$ converges a.s. as well. Since the first Borel–Cantelli lemma and (i) imply that $X_n = Y_n$ for all but finitely many $n$ almost surely, $\sum_{n=1}^\infty X_n$ also converges almost surely.
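As a concrete toy example (mine, not the book's), let $X_n = \pm n^{-q}$ with probability $1/2$ each. Taking $A = 1$ gives $Y_n = X_n$, so (i) and (ii) hold trivially, while (iii) reads $\sum_n n^{-2q} < \infty$, i.e., $q > 1/2$. A quick simulation contrasts the two regimes:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 1_000_000
signs = rng.choice([-1.0, 1.0], size=N)
n = np.arange(1, N + 1, dtype=np.float64)

# X_n = ±n^{-q}; with A = 1 we have Y_n = X_n, so
# (i)  P(|X_n| > 1) = 0,  (ii) E[Y_n] = 0,  (iii) Var(Y_n) = n^{-2q}.
for q in (0.8, 0.3):  # (iii) holds iff q > 1/2
    S = np.cumsum(signs * n ** (-q))
    print(f"q={q}: S_n at n = 10^4, 10^5, 10^6 -> "
          f"{S[10**4 - 1]:.3f}, {S[10**5 - 1]:.3f}, {S[10**6 - 1]:.3f}")
```

For $q = 0.8$ the printed partial sums settle quickly, whereas for $q = 0.3$ they keep wandering, consistent with the converse direction deferred above.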


Application

The three series theorem can be used to determine convergence of random series. It is especially useful for determining the asymptotic rate of $S_n$: in conjunction with Kronecker’s lemma, it can be applied to expressions of the form $\frac{S_n}{f(n)}$ (just like the LIL!). The next theorem is one example.

(Kronecker’s lemma) If $a_n \uparrow \infty$ and $\sum\limits_{n=1}^\infty \frac{b_n}{a_n}$ converges, then $\frac{1}{a_n} \sum\limits_{k=1}^n b_k \to 0$ as $n \to \infty.$
$X_1,X_2,\cdots$ are i.i.d. with $EX_1 = 0$, $E|X_1|^p < \infty$ for $1<p<2$.
$\implies \frac{S_n}{n^{1/p}} \to 0 \text{ a.s.}$

Let $Y_k = X_k \mathbf{1}_{|X_k|\le k^{1/p}}$. One can check that $(Y_k / k^{1/p})$ satisfies the conditions of the three series theorem, so $\sum_k Y_k / k^{1/p}$ converges a.s. Kronecker’s lemma with $a_n = n^{1/p}$ then gives $\frac{1}{n^{1/p}} \sum_{k=1}^n Y_k \to 0$ a.s., and since (i) and the Borel–Cantelli lemma imply $X_k = Y_k$ for all but finitely many $k$, the same holds with $Y_k$ replaced by $X_k$.
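As a numerical illustration (with a distribution of my own choosing, not from the book): symmetric Pareto-type variables with tail index $\alpha = 1.5$ have $E|X_1|^p < \infty$ for every $p < 1.5$ and $EX_1 = 0$ by symmetry, so with $p = 1.2$ the ratio $S_n / n^{1/p}$ should drift toward 0, if slowly.

```python
import numpy as np

rng = np.random.default_rng(3)
alpha, p = 1.5, 1.2  # E|X|^p < inf requires p < alpha; EX = 0 by symmetry

# |X| = U^{-1/alpha} is Pareto(alpha); a random sign makes X symmetric.
# Using 1 - U keeps the base in (0, 1] and avoids a zero denominator.
N = 1_000_000
X = rng.choice([-1.0, 1.0], size=N) * (1.0 - rng.uniform(size=N)) ** (-1.0 / alpha)
S = np.cumsum(X)

for n in (10**3, 10**4, 10**5, 10**6):
    print(f"n=10^{int(np.log10(n))}: S_n / n^(1/p) ~= {S[n - 1] / n ** (1 / p):.4f}")
```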



Acknowledgement

This post series is based on the textbook Probability: Theory and Examples, 5th edition (Durrett, 2019) and the lecture at Seoul National University, Republic of Korea (instructor: Prof. Johan Lim).