4.5. Square integrable martingales
In this section, we look into martingales with a special property: square integrability. Square integrability provides a bound on the expectation of the maximum of the martingale, which can in turn be used to determine whether the sequence converges.
Square integrable martingales
In the following discussion, $X_n$ denotes a square integrable martingale with $X_0 = 0.$ Notice that $X_n^2$ is a submartingale. Let $A_n = A_{n-1} + E(X_n^2 | \mathcal{F}_{n-1}) - X_{n-1}^2,$ $A_0=0,$ be the predictable increasing process from Doob's decomposition of $X_n^2.$ Then $EX_n^2 = EA_n$ and
\[\begin{aligned} A_n &= \sum\limits_{m=1}^n \left( E(X_m^2|\mathcal{F}_{m-1}) - X_{m-1}^2 \right) \\ &= \sum\limits_{m=1}^n E\left( (X_m - X_{m-1})^2 | \mathcal{F}_{m-1} \right). \end{aligned}\]
With this notation, the following hold.

(i) $E \sup_n X_n^2 \le 4 EA_\infty.$
(ii) $E \sup_n |X_n| \le 3 EA_\infty^{1/2}.$
(iii) $\lim_n X_n$ exists and is almost surely finite on $\{A_\infty < \infty\}.$
(iv) If $f:\mathbb{R} \to \mathbb{R}$ is increasing with $f(t) \ge 1$ for all $t$ and $\int_0^\infty f^{-2}(t)\,dt < \infty,$ then $\frac{X_n}{f(A_n)} \to 0$ a.s. on $\{A_\infty = \infty\}.$
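To make the notation concrete, here is a simple illustrative special case (my own illustration, not part of the theorem): let $X_n = \sum_{m=1}^n \xi_m$ with $\xi_m$ independent, $E\xi_m = 0$ and $E\xi_m^2 = \sigma_m^2 < \infty.$ Since $\xi_m$ is independent of $\mathcal{F}_{m-1} = \sigma(\xi_1, \dots, \xi_{m-1}),$ $$A_n = \sum_{m=1}^n E\left( (X_m - X_{m-1})^2 | \mathcal{F}_{m-1} \right) = \sum_{m=1}^n E\xi_m^2 = \sum_{m=1}^n \sigma_m^2,$$ which is deterministic. In this case, (iii) recovers the classical fact that $\sum_m \sigma_m^2 < \infty$ implies $\sum_m \xi_m$ converges a.s.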
Proof.
(i) follows directly from the $L^2$ maximal inequality: $E \sup_{m \le n} X_m^2 \le 4EX_n^2 = 4EA_n,$ and letting $n \to \infty$ (monotone convergence on the left, $A_n \uparrow A_\infty$ on the right) gives the claim.
(ii) Let $N = N_a = \inf\{n: A_{n+1} > a^2\};$ since $A_{n+1}$ is $\mathcal{F}_n$-measurable, $N$ is a stopping time. Then $$\begin{aligned} P(\sup_n|X_n| > a) &= P(\sup_n |X_n| > a, N < \infty) + P(\sup_n |X_n| > a, N = \infty) \\ &\le P(N<\infty) + P(\sup_n |X_{n \wedge N}| > a) \\ &= P(N<\infty) + \lim_n P(\sup_{m\le n} |X_{m \wedge N}| > a) \\ &\le P(N<\infty) + \frac{1}{a^2} \lim_n E|X_{n \wedge N}|^2 \\ &= P(N<\infty) + \frac{1}{a^2} \lim_n E A_{n\wedge N} \\ &\le P(N<\infty) + \frac{1}{a^2} E(A_\infty \wedge a^2) \\ &= P(A_\infty > a^2) + \frac{1}{a^2} E(A_\infty \wedge a^2). \end{aligned}$$ The last inequality follows from the pathwise bounds $$A_{n\wedge N} \le A_N \le a^2 \text{ on } \{N<\infty\}, \qquad A_{n\wedge N} = A_n \le A_\infty \le a^2 \text{ on } \{N=\infty\},$$ so that $A_{n\wedge N} \le A_\infty \wedge a^2$ everywhere. Using this, Fubini's theorem, and the substitution $b = c^2,$ we get $$\begin{aligned} E \sup_n |X_n| &= \int_0^\infty P(\sup_n |X_n| > a)\, da \\ &\le \int_0^\infty P(A_\infty^{1/2} > a)\, da + \int_0^\infty \frac{1}{a^2} E(A_\infty \wedge a^2)\, da \\ &= EA_\infty^{1/2} + \int_0^\infty \frac{1}{a^2} \int_0^{a^2} P(A_\infty > b)\, db\, da \\ &= EA_\infty^{1/2} + \int_0^\infty P(A_\infty > b) \int_{b^{1/2}}^\infty \frac{1}{a^2}\, da\, db \\ &= EA_\infty^{1/2} + \int_0^\infty \frac{P(A_\infty > b)}{b^{1/2}}\, db \\ &= EA_\infty^{1/2} + 2\int_0^\infty P(A_\infty > c^2)\, dc \\ &= EA_\infty^{1/2} + 2EA_\infty^{1/2} = 3EA_\infty^{1/2}. \end{aligned}$$

(iii) Fix $a>0.$ Applying (i) to the stopped martingale $X_{n\wedge N_a},$ whose increasing process is $A_{n \wedge N_a} \le a^2,$ gives $E \sup_n X_{n \wedge N_a}^2 \le 4a^2 < \infty.$ By the martingale convergence theorem, $X_{n\wedge N_a}$ converges a.s. (and in $L^2$). Now let $C_k = \{X_{n\wedge N_k} \text{ converges}\}$ for $k \in \mathbb{N}.$ Then $P(C_k) = 1$ and hence $P(\cap_k C_k) = 1$ as well. For an arbitrary $\omega \in (\cap_k C_k) \cap \{A_\infty < \infty\},$ we have $N_k(\omega) = \inf\{n: A_{n+1}(\omega) > k^2\} = \infty$ for large enough $k$ since $A_\infty(\omega) < \infty.$ Hence $X_{n\wedge N_k}(\omega) = X_n(\omega)$ converges.
(iv) Let $H_m = \frac{1}{f(A_m)}.$ Since $A_m$ is $\mathcal{F}_{m-1}$-measurable and $f \ge 1,$ $H$ is a bounded predictable sequence, so $Y_n := (H \cdot X)_n = \sum\limits_{m=1}^n \frac{X_m - X_{m-1}}{f(A_m)}$ is a square integrable martingale. Let $B_n = \sum\limits_{m=1}^n E\left( (Y_m - Y_{m-1})^2 | \mathcal{F}_{m-1} \right)$ be its increasing process. Then $$\begin{aligned} B_\infty &= \sum\limits_{m=0}^\infty \frac{A_{m+1} - A_m}{f(A_{m+1})^2} \\ &\le \sum\limits_{m=0}^\infty \int_{A_m}^{A_{m+1}} f^{-2}(t)\, dt \\ &\le \int_0^\infty f^{-2}(t)\, dt < \infty, \end{aligned}$$ where the first inequality holds because $f$ is increasing, so $f(A_{m+1})^{-2} \le f(t)^{-2}$ for $t \le A_{m+1}.$ By (iii), $\lim_n Y_n$ exists and is finite almost surely. By Kronecker's lemma, it suffices to show that $f(A_n) \uparrow \infty$ on $\{A_\infty = \infty\}.$ Since $\int_0^\infty f^{-2}(t)\, dt < \infty,$ we must have $f(t) \to \infty$ as $t \to \infty$ (otherwise $f^{-2}$ would be bounded below by a positive constant and the integral would diverge). Since $A_n \uparrow \infty$ on $\{A_\infty = \infty\}$ and $f$ is increasing, $f(A_n) \uparrow \infty$ there, which completes the proof.
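To get a feel for (iii) and (iv), here is a minimal simulation sketch (my own illustration, not from the textbook), assuming the simplest case of independent Gaussian increments, for which $A_n$ is just the sum of the increment variances:

```python
# Minimal sketch of (iii) and (iv) for independent Gaussian increments xi_m,
# where A_n = sum of increment variances is deterministic.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
m = np.arange(1, N + 1, dtype=float)

# Case 1 (illustrates (iii)): Var(xi_m) = 1/m^2, so A_inf < inf and X_n converges.
sigma2 = 1.0 / m**2
X1 = np.cumsum(rng.normal(0.0, np.sqrt(sigma2)))
print("A_inf ~", sigma2.sum(), "; last few values of X_n:", X1[-3:])

# Case 2 (illustrates (iv) with f(t) = max(1, t)): Var(xi_m) = 1, so A_n = n -> inf
# and X_n / f(A_n) = X_n / n should tend to 0 (here this is just the SLLN).
X2 = np.cumsum(rng.normal(0.0, 1.0, size=N))
for k in (1_000, 10_000, 100_000):
    print(f"n = {k:>6}: X_n / f(A_n) = {X2[k - 1] / max(1.0, float(k)):.5f}")
```

In the first case the partial sums stabilize near a finite limit, while in the second the ratio $X_n / f(A_n)$ shrinks toward $0,$ as (iv) predicts.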
From these facts, we obtain another form of the conditional Borel-Cantelli lemma: if $B_n \in \mathcal{F}_n$ and $p_m = P(B_m | \mathcal{F}_{m-1}),$ then $$\frac{\sum_{m=1}^n \mathbf{1}_{B_m}}{\sum_{m=1}^n p_m} \to 1 \text{ a.s. on } \left\{\sum_{n=1}^\infty p_n = \infty\right\}.$$
Let $X_n = X_{n-1} + \mathbf{1}_{B_n} - P(B_n | \mathcal{F}_{n-1}),$ $X_0=0;$ this is a square integrable martingale. Then the process $A_n$ from Doob's decomposition satisfies $A_m - A_{m-1} = E\left( (\mathbf{1}_{B_m} - p_m)^2 | \mathcal{F}_{m-1} \right) = p_m - p_m^2,$ so $A_n = \sum\limits_{m=1}^n (p_m - p_m^2) \le \sum\limits_{m=1}^n p_m.$
On $\{A_\infty < \infty\},$ $X_n$ converges a.s. by (iii), so $$\frac{X_n}{\sum_{m=1}^n p_m} = \frac{\sum_{m=1}^n \mathbf{1}_{B_m}}{\sum_{m=1}^n p_m} - 1 \to 0 \text{ a.s. on } \left\{\sum_{n=1}^\infty p_n = \infty\right\}.$$ On $\{A_\infty = \infty\},$ let $f(t) = 1 \vee t,$ which satisfies the conditions in (iv) of the previous theorem. Then $\frac{X_n}{f(A_n)} = \frac{X_n}{A_n \vee 1} \to 0$ a.s. on $\{A_\infty = \infty\}.$ Since $A_n \le \sum_{m=1}^n p_m$ and $A_n \uparrow \infty,$ we get $\frac{X_n}{\sum_{m=1}^n p_m} \to 0$ a.s. on $\{A_\infty = \infty\},$ where $\sum_{n=1}^\infty p_n = \infty$ holds automatically. In both cases, $\frac{\sum_{m=1}^n \mathbf{1}_{B_m}}{\sum_{m=1}^n p_m} \to 1$ a.s. on $\left\{\sum_{n=1}^\infty p_n = \infty\right\}.$
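Here is a minimal simulation sketch of this conclusion (my own illustration, not from the textbook), assuming independent events $B_n$ with $p_n = P(B_n) = 1/n,$ so that the conditional probabilities are deterministic and $\sum_n p_n = \infty$:

```python
# Minimal sketch of the conditional Borel-Cantelli conclusion: for independent
# events B_n with p_n = P(B_n) = 1/n (so sum p_n = infinity), the ratio
# (number of B_m, m <= n, that occur) / (p_1 + ... + p_n) tends to 1.
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000
n = np.arange(1, N + 1, dtype=float)

p = 1.0 / n                      # p_n = 1/n
hits = rng.random(N) < p         # indicator of B_n
ratio = np.cumsum(hits) / np.cumsum(p)

for k in (10**3, 10**4, 10**5, 10**6):
    print(f"n = {k:>8}: ratio = {ratio[k - 1]:.4f}")
```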
Acknowledgement
This post series is based on the textbook Probability: Theory and Examples, 5th edition (Durrett, 2019) and lectures at Seoul National University, Republic of Korea (instructor: Prof. Sangyeol Lee).