4.6.1. Uniform integrability and convergence in $L^1$

$\newcommand{\argmin}{\mathop{\mathrm{argmin}}\limits}$ $\newcommand{\argmax}{\mathop{\mathrm{argmax}}\limits}$

In section 4.4, we covered conditions under which martingales converge in $L^p,$ but only for $p>1.$ In this section, the notion of uniform integrability is introduced to handle the remaining case $p=1.$


Uniform integrability

If a random variable $X$ is integrable, then for every $\epsilon > 0$ there exists $a>0$ such that $\int_{|X|\ge a} |X| dP < \epsilon,$ and conversely. Intuitively, a random variable is integrable exactly when the integral of its tail can be made arbitrarily small. Uniform integrability asks for this to hold uniformly over a whole family, and is defined accordingly.
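To spell out this equivalence (a short argument added here for completeness): if $E|X|<\infty,$ then $|X|\mathbf{1}_{|X|\ge a} \to 0$ a.s. as $a\to\infty$ and is dominated by the integrable $|X|,$ so by dominated convergence $\int_{|X|\ge a}|X|dP \to 0.$ Conversely, if $\int_{|X|\ge a}|X|dP < \infty$ for some $a,$ then $$E|X| \le a + \int_{|X|\ge a}|X|dP < \infty.$$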

A family $(X_t)_{t\in T}$ is uniformly integrable if $\lim_{a\to\infty} \sup_{t\in T} \int_{|X_t|\ge a} |X_t| dP = 0.$
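A standard counterexample (not from the text) shows why the uniformity matters: on $([0,1],\mathcal{B},\lambda)$ let $X_n = n\mathbf{1}_{(0,1/n)}.$ Each $X_n$ is integrable with $E|X_n|=1,$ but for every $a>0$ and all $n\ge a,$ $$\int_{|X_n|\ge a}|X_n|dP = \int_0^{1/n} n\, dx = 1,$$ so $\lim_{a\to\infty} \sup_n \int_{|X_n|\ge a}|X_n|dP = 1 \ne 0$ and the family $(X_n)$ is not uniformly integrable.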

If $|X_t| \le X$ for all $t\in T$ where $X$ is integrable, then $(X_t)$ is uniformly integrable. If $(X_t)$ and $(Y_t)$ are uniformly integrable, then $(X_t + Y_t)$ is uniformly integrable, since for any $a>0$

\[\begin{aligned} &\int_{|X_t+Y_t|\ge a} |X_t+Y_t|\, dP \\ &\le \int_{|X_t|+|Y_t|\ge a,\; |X_t|\ge |Y_t|} (|X_t|+|Y_t|)\, dP \\ &\;\;\;\;+ \int_{|X_t|+|Y_t|\ge a,\; |X_t| < |Y_t|} (|X_t|+|Y_t|)\, dP \\ &\le \int_{2|X_t|\ge a} 2|X_t|\, dP + \int_{2|Y_t|\ge a} 2|Y_t|\, dP. \end{aligned}\]
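Two quick consequences (examples of my own, not from the text): a family bounded by a constant $C$ is uniformly integrable, by domination with $X=C;$ and an identically distributed family with a finite first moment is uniformly integrable directly from the definition, since every tail integral depends only on the common distribution: $$\sup_{t\in T}\int_{|X_t|\ge a}|X_t|dP = \int_{|X_{t_0}|\ge a}|X_{t_0}|dP \to 0 \quad (a\to\infty)$$ for any fixed $t_0\in T.$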

The next theorem, sometimes referred to as Vitali's lemma, gives a necessary and sufficient condition for uniform integrability.

$(X_t)_{t\in T}$ is uniformly integrable if and only if the following hold.
(i) $\sup_{t\in T} E|X_t| < \infty.$
(ii) $\forall \epsilon>0,\exists\delta>0$ such that $\sup_{t\in T} \int_A |X_t|dP \le \epsilon$ for all $A\in\mathcal{F}$ where $P(A)\le\delta.$

($\Rightarrow$) For (i), pick $a$ with $\sup_t \int_{|X_t|\ge a}|X_t|dP \le 1;$ then $\sup_t E|X_t| \le a + 1 < \infty.$ For (ii), given $\epsilon>0$ choose $a>0$ such that $\sup_t \int_{|X_t|\ge a}|X_t|dP \le \epsilon/2$ and set $\delta = \epsilon/(2a).$ For any $A\in\mathcal{F}$ with $P(A)\le\delta,$ $$\begin{aligned} & \int_A |X_t| dP \\ &= \int_{A\cap\{|X_t|\ge a\}} |X_t| dP + \int_{A\cap\{|X_t|< a\}} |X_t| dP \\ &\le \int_{|X_t|\ge a} |X_t| dP + aP(A) \end{aligned}$$ Thus $\sup_t \int_A |X_t| dP \le \epsilon/2 + a\delta = \epsilon.$
($\Leftarrow$) Given $\epsilon>0,$ take $\delta$ from (ii), let $M = \sup_t E|X_t| < \infty$ and $a_0 = M/\delta.$ By Markov's inequality, $P(|X_t|\ge a_0) \le E|X_t|/a_0 \le M/a_0 = \delta$ for every $t,$ so by (ii) $\sup_t \int_{|X_t|\ge a_0} |X_t| dP \le \epsilon.$ Since the tail integrals are nonincreasing in $a$ and $\epsilon$ was arbitrary, $\lim_{a\to\infty}\sup_t \int_{|X_t|\ge a} |X_t| dP = 0.$
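As an illustration of the criterion (a standard example, not from the text), any family bounded in $L^p$ for some $p>1$ is uniformly integrable: if $\sup_t E|X_t|^p = M < \infty,$ then (i) holds by Jensen's inequality, and for any $A\in\mathcal{F},$ Hölder's inequality gives $$\int_A |X_t| dP \le \left(E|X_t|^p\right)^{1/p} P(A)^{1-1/p} \le M^{1/p}P(A)^{1-1/p},$$ which is uniformly small when $P(A)$ is small. This is one way to see why the $p>1$ case of section 4.4 did not need uniform integrability as a separate assumption.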

We now state the main theorem of this subsection.

Suppose $X_n \to X$ in probability and $X_n \in L^p,$ $p\ge 1.$ The following are equivalent.
(i) $(|X_n|^p)$ is uniformly integrable.
(ii) $X_n \to X$ in $L^p.$
(iii) $E|X_n|^p \to E|X|^p < \infty.$

((i)$\Rightarrow$(ii)) By Fatou's lemma (applied along a subsequence converging a.s.), $E|X|^p < \infty.$ Since $|X_n - X|^p \le 2^p(|X_n|^p+|X|^p),$ the family $(|X_n-X|^p)$ is uniformly integrable. By the previous theorem, given $\epsilon>0$ there exists $\delta>0$ such that $\sup_{n} \int_A |X_n-X|^p dP \le \epsilon$ for all $A\in\mathcal{F}$ with $P(A)\le\delta.$ There exists $N$ such that for all $n\ge N,$ $P(|X_n-X|^p \ge \epsilon) \le \delta.$ Thus $$E|X_n-X|^p = E|X_n-X|^p\mathbf{1}_{|X_n-X|^p\ge\epsilon} + E|X_n-X|^p\mathbf{1}_{|X_n-X|^p<\epsilon} \le 2\epsilon.$$ ((ii)$\Rightarrow$(iii)) Trivial by $|\|X_n\|_p - \|X\|_p| \le \|X_n - X\|_p.$
((iii)$\Rightarrow$(i)) Fix $a>0$ such that $P(|X|^p=a)=0.$
claim: $|X_n|^p\mathbf{1}_{|X_n|^p \le a} \overset{P}{\to} |X|^p\mathbf{1}_{|X|^p \le a}.$
For all $\delta>0$ and $\epsilon\in(0,1),$ $$\begin{aligned} &P(|\mathbf{1}_{|X_n|^p \le a} - \mathbf{1}_{|X|^p \le a}|>\epsilon)\\ &\le P(|X_n|^p\le a, |X|^p>a) + P(|X_n|^p>a, |X|^p \le a) \\ &\le P(|X_n|^p\le a, |X|^p>a+\delta) + P(a < |X|^p \le a+\delta) \\ &\;\;\;\;+ P(|X_n|^p > a, |X|^p \le a-\delta) + P(a-\delta < |X|^p \le a) \\ &\le 2P(||X_n|^p-|X|^p|>\delta) + P(a<|X|^p\le a+\delta) + P(a-\delta<|X|^p\le a) \end{aligned}$$ Since $|X_n|^p \to |X|^p$ in probability, the first term vanishes as $n\to\infty$ for each fixed $\delta,$ so letting $\delta\to0,$ $$\limsup_n P(|\mathbf{1}_{|X_n|^p \le a} - \mathbf{1}_{|X|^p \le a}|>\epsilon) \le P(|X|^p = a) = 0,$$ which, combined with $|X_n|^p \overset{P}{\to} |X|^p,$ proves the claim. By the claim and since $|X_n|^p\mathbf{1}_{|X_n|^p\le a}$ is bounded by $a,$ $(|X_n|^p\mathbf{1}_{|X_n|^p\le a})$ is uniformly integrable. By (i)$\Rightarrow$(ii)$\Rightarrow$(iii), $E|X_n|^p\mathbf{1}_{|X_n|^p \le a} \to E|X|^p\mathbf{1}_{|X|^p \le a}.$ Combined with the assumption (iii), $E|X_n|^p\mathbf{1}_{|X_n|^p > a} \to E|X|^p\mathbf{1}_{|X|^p > a}.$ Given $\epsilon>0,$ there exists $a_0>0$ such that $E|X|^p\mathbf{1}_{|X|^p > a_0}<\epsilon/2$ and $P(|X|^p=a_0)=0.$ Pick $N$ such that $\left|E|X_n|^p\mathbf{1}_{|X_n|^p > a_0} - E|X|^p\mathbf{1}_{|X|^p > a_0}\right| < \epsilon/2$ for all $n\ge N.$ Then $E|X_n|^p\mathbf{1}_{|X_n|^p > a_0} < \epsilon$ for all $n\ge N.$ For the finitely many $n < N,$ since each $X_n \in L^p,$ there exists $a_1>0$ such that $\max_{n<N} E|X_n|^p\mathbf{1}_{|X_n|^p > a_1} < \epsilon.$ With $a_2 = \max(a_0,a_1),$ $\sup_n E|X_n|^p\mathbf{1}_{|X_n|^p > a_2} \le \epsilon,$ and since these tail expectations are nonincreasing in $a,$ $(|X_n|^p)$ is uniformly integrable.
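Revisiting the earlier example $X_n = n\mathbf{1}_{(0,1/n)}$ (my own illustration): $X_n\to0$ a.s. and hence in probability, but $$E|X_n - 0| = \int_0^{1/n} n\,dx = 1 \quad\text{for all } n,$$ so $E|X_n| \not\to E|0| = 0;$ by the theorem (with $p=1$) the convergence fails in $L^1,$ consistent with the family not being uniformly integrable.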


$L^1$ convergence of martingales

With uniform integrability we obtain $L^1$ convergence of martingales. First we define regular and closable martingales to simplify the statement of the theorem.


A martingale $(X_n)$ is regular if there exists a random variable $X \in L^1$ such that $X_n = E(X|\mathcal{F}_n)$ a.s. $(X_n)$ is closable if there exists a random variable $X_\infty \in L^1$ such that $X_n \to X_\infty$ a.s. and $E(X_\infty | \mathcal{F}_n) = X_n$ a.s. for all $n.$

If $X_n$ is closable, then it is clearly a regular martingale.
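A canonical example of a regular martingale (my own illustration): on $([0,1),\mathcal{B},\lambda)$ let $\mathcal{F}_n$ be generated by the dyadic intervals $[k2^{-n},(k+1)2^{-n}),$ $0\le k<2^n,$ and for an integrable $X$ set $$X_n = E(X|\mathcal{F}_n) = \sum_{k=0}^{2^n-1}\left(2^n\int_{k2^{-n}}^{(k+1)2^{-n}} X\,d\lambda\right)\mathbf{1}_{[k2^{-n},(k+1)2^{-n})}.$$ By definition $(X_n)$ is regular, and the theorem below shows it converges a.s. and in $L^1$ to a closing random variable.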

Let $(X_n)$ be a martingale. The following are equivalent.
(i) $X_n$ is regular.
(ii) $X_n$ is uniformly integrable.
(iii) $X_n$ converges a.s. and in $L^1.$
(iv) $X_n$ is closable.

((i)$\Rightarrow$(ii)) There exists $X \in L^1$ such that $X_n = E(X|\mathcal{F}_n)$ a.s., so $|X_n| \le E(|X||\mathcal{F}_n)$ a.s. and, since $\{E(|X||\mathcal{F}_n)\ge a\}\in\mathcal{F}_n,$ $$\int_{|X_n|\ge a}|X_n|dP \le \int_{|X_n|\ge a} E(|X||\mathcal{F}_n)dP \le \int_{E(|X||\mathcal{F}_n)\ge a}E(|X||\mathcal{F}_n)dP = \int_{E(|X||\mathcal{F}_n)\ge a}|X|dP.$$ Since $X$ is integrable, for a given $\epsilon>0$ there exists $\delta>0$ such that $\int_A |X|dP < \epsilon$ for all $A$ with $P(A)\le\delta,$ and by Markov's inequality we can choose $a>0$ such that $P(E(|X||\mathcal{F}_n)\ge a)\le \frac{1}{a} E|X| \le \delta$ for every $n.$ Hence $\sup_n \int_{|X_n|\ge a}|X_n|dP \le \epsilon,$ so $(X_n)$ is uniformly integrable.
((ii)$\Rightarrow$(iii)) Uniform integrability implies $\sup_n E|X_n| < \infty,$ so by the martingale convergence theorem $X_n$ converges a.s., hence in probability. By Vitali's lemma (the previous theorem with $p=1$), the convergence also holds in $L^1.$
((iii)$\Rightarrow$(iv)) There exists $X \in L^1$ such that $E|X_n-X|\to0$ as $n\to\infty,$ and by assumption $X_n$ also converges a.s.; write $X_\infty$ for the a.s. limit. Since $L^1$ convergence implies convergence in probability, $X=X_\infty$ a.s., so $X_\infty \in L^1$ and $E|X_m - X_\infty|\to0.$ For $m \ge n,$ $$\begin{aligned} &E\left| E(X_\infty|\mathcal{F}_n) - X_n \right| \\ &= E\left| E(X_\infty|\mathcal{F}_n) - E(X_m|\mathcal{F}_n) \right| \\ &\le E\left[ E(|X_\infty - X_m| \mid \mathcal{F}_n) \right] \\ &= E|X_\infty - X_m| \to 0 \end{aligned}$$ as $m\to\infty.$ Since the left-hand side does not depend on $m,$ $E(X_\infty|\mathcal{F}_n) = X_n$ a.s.
((iv)$\Rightarrow$(i)) Trivial: take $X = X_\infty$ in the definition of regularity.
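A martingale failing all four conditions (a standard example, not from the text): let $X_0=1$ and, given $X_n,$ let $X_{n+1}=2X_n$ or $0$ with probability $1/2$ each, independently of the past. Then $$E(X_{n+1}|\mathcal{F}_n) = \tfrac12\cdot2X_n + \tfrac12\cdot0 = X_n,$$ so $(X_n)$ is a martingale, and $X_n\to0$ a.s. since a zero eventually occurs, yet $EX_n=1$ for all $n.$ Thus $X_n\not\to0$ in $L^1,$ and $(X_n)$ is neither uniformly integrable, nor regular, nor closable.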


Acknowledgement

This post series is based on the textbook Probability: Theory and Examples, 5th edition (Durrett, 2019) and the lecture at Seoul National University, Republic of Korea (instructor: Prof. Sangyeol Lee).