3.10. Limit theorems in ℝᵈ


This part covers limit theorems for random vectors $\mathbf{X} = (X_1,\cdots,X_d)'\in\mathbb{R}^d.$


Definitions

$F:\mathbb{R}^d\to\mathbb{R}$ is a distribution function if it is
(i) non-decreasing.
(ii) right-continuous, where $y\downarrow x$ means $y_i\downarrow x_i$ for all $i=1,\cdots,d.$
(iii) $\lim_{x\to-\infty}F(x)=0$ and $\lim_{x\to\infty}F(x)=1,$ where $x\to\pm\infty$ means $x_i\to\pm\infty$ for all $i=1,\cdots,d.$
(iv) $\Delta_A F := \sum_{v\in V}\text{sgn}(v)F(v) \ge 0$ for every rectangle $A=(a_1,b_1]\times\cdots\times(a_d,b_d],$ where $V=\{a_1,b_1\}\times\cdots\times\{a_d,b_d\}$ is the vertex set of $A$ and $\text{sgn}(v) = (-1)^{(\text{\# of } a_i \text{'s in } v)}.$ If $F$ is the distribution function of $\mathbf{X},$ then $\Delta_A F = P(\mathbf{X}\in A)$ by inclusion-exclusion.
Unlike the case $d=1,$ conditions (i)-(iii) alone do not guarantee that $F$ induces a probability measure; (iv) is a genuinely extra requirement.
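To make the sign convention in (iv) concrete, here is a minimal Python sketch (the helper `rectangle_measure` and the uniform-product example are illustrative choices, not from the text) that evaluates $\Delta_A F$ by summing over the $2^d$ vertices of $A$:

```python
import itertools
import numpy as np

def rectangle_measure(F, a, b):
    """Delta_A F = sum over vertices v of sgn(v) * F(v) for
    A = (a_1, b_1] x ... x (a_d, b_d]; sgn(v) = (-1)^(# of a_i's in v)."""
    d = len(a)
    total = 0.0
    for choice in itertools.product((0, 1), repeat=d):  # 0 -> a_i, 1 -> b_i
        v = np.array([a[i] if c == 0 else b[i] for i, c in enumerate(choice)])
        total += (-1) ** (d - sum(choice)) * F(v)       # d - sum(choice) = # of a_i's in v
    return total

# Sanity check: F of two independent Uniform(0,1) coordinates, so
# Delta_A F is the area of A intersected with [0,1]^2.
F = lambda x: float(np.prod(np.clip(x, 0.0, 1.0)))
print(rectangle_measure(F, a=[0.2, 0.1], b=[0.7, 0.4]))  # 0.5 * 0.3 = 0.15
```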


Limit theorems

$f$ is Lipschitz continuous on a metric space $(S,\rho)$ if there exists $c\in\mathbb{R}$ such that $|f(x)-f(y)|\le c\cdot\rho(x,y)$ for all $x,y\in S.$

We arrive at another portmanteau theorem.

The following are equivalent.
(i) $Eg(X_n) \to Eg(X_\infty),~ \forall g:$ bounded, continuous.
(ii) $Eg(X_n) \to Eg(X_\infty),~ \forall g:$ bounded, Lipschitz continuous.
(iii) $\limsup_n P(X_n \in F) \le P(X_\infty \in F),~ \forall F:$ closed.
(iv) $\liminf_n P(X_n \in G) \ge P(X_\infty \in G),~ \forall G:$ open.
(v) $\lim_n P(X_n\in A) = P(X_\infty \in A),~ \forall A:~ P(X_\infty \in \partial A)=0.$
(vi) $Ef(X_n) \to Ef(X_\infty),~ \forall f:$ bounded, measurable with $P(X_\infty \in D_f) = 0,$ where $D_f$ is the set of discontinuity points of $f.$
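For instance, take $X_n \equiv 1/n$ and $X_\infty \equiv 0$ (point masses on $\mathbb{R}$). With the closed set $F=\{0\}$ we get $\limsup_n P(X_n\in F) = 0 < 1 = P(X_\infty\in F),$ so the inequality in (iii) can be strict. And for $A=(0,1]$ we have $P(X_n\in A)=1 \not\to 0 = P(X_\infty\in A);$ this does not contradict (v), since $\partial A = \{0,1\}$ carries the full mass of $X_\infty.$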

In addition, we get a multi-dimensional version of the tightness theorem.

$(\mu_n)$ is uniformly tight if for all $\epsilon > 0,$ there exists $M$ such that $\liminf_n \mu_n([-M,M]^d) \ge 1-\epsilon.$
If $(\mu_n)$ is uniformly tight, there exists a weakly convergent subsequence.

The proof uses both Helly’s selection theorem and uniform tightness.
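Uniform tightness cannot be dropped: for the point masses $\mu_n = \delta_{(n,0,\cdots,0)},$ all mass escapes to infinity, and no subsequence converges weakly to a probability measure; the subsequential limits of the distribution functions are identically $0,$ which is not a distribution function.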

Characteristic functions and inversion

$$\varphi(\mathbf{t}) := Ee^{i\mathbf{t}\cdot\mathbf{X}} = Ee^{i(t_1X_1+\cdots+t_dX_d)}$$ where $\mathbf{t}=(t_1,\cdots,t_d)$ and $\mathbf{X}=(X_1,\cdots,X_d).$
Let $A=[a_1,b_1]\times\cdots\times[a_d,b_d]$ with $\mu(\partial A)=0,$ and let $\varphi$ be the ch.f. of $\mu.$ Define $$\psi_j(t_j) = \frac{1}{it_j}\left( e^{-it_ja_j} - e^{-it_jb_j} \right);$$ then $$\mu(A) = \lim_{T\to\infty} \frac{1}{(2\pi)^d} \int_{[-T,T]^d} \prod_{j=1}^d \psi_j(t_j)\, \varphi(\mathbf{t})\,d\mathbf{t}.$$

The result follows from Fubini's theorem together with the one-dimensional inversion formula.
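As a quick numerical sanity check of the inversion formula (a sketch only; the standard bivariate normal target, the truncation $T=8,$ and the grid size are arbitrary choices, not from the text):

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import norm

# Target mu = N_2(0, I), whose ch.f. is phi(t) = exp(-|t|^2 / 2).
a, b = np.array([-1.0, 0.0]), np.array([1.0, 2.0])  # A = [a1,b1] x [a2,b2]
T, m = 8.0, 801
t = np.linspace(-T, T, m)
t1, t2 = np.meshgrid(t, t, indexing="ij")
phi = np.exp(-(t1**2 + t2**2) / 2.0)

def psi(tj, aj, bj):
    """psi_j(t) = (e^{-i t a_j} - e^{-i t b_j}) / (i t), value b_j - a_j at t = 0."""
    safe = np.where(tj == 0, 1.0, tj)  # avoid 0/0; the t = 0 entry is overwritten below
    val = (np.exp(-1j * safe * aj) - np.exp(-1j * safe * bj)) / (1j * safe)
    return np.where(tj == 0, bj - aj, val)

integrand = psi(t1, a[0], b[0]) * psi(t2, a[1], b[1]) * phi
approx = trapezoid(trapezoid(integrand, t, axis=1), t).real / (2 * np.pi) ** 2

# Exact probability for independent standard normal coordinates.
exact = (norm.cdf(b[0]) - norm.cdf(a[0])) * (norm.cdf(b[1]) - norm.cdf(a[1]))
print(approx, exact)  # the two numbers should agree to several decimals
```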

Central limit theorem in $\mathbb{R}^d$

Let $\mathbf{X}_n$, $1\le n\le\infty,$ be random vectors with ch.f.s $\varphi_n.$ Then $\mathbf{X}_n \overset{w}{\to} \mathbf{X}_\infty$ if and only if $\varphi_n(\mathbf{t}) \to \varphi_\infty(\mathbf{t})$ for all $\mathbf{t}\in\mathbb{R}^d.$
$$\mathbf{\theta}\cdot\mathbf{X}_n \overset{w}{\to} \mathbf{\theta}\cdot\mathbf{X}_\infty,~ \forall \mathbf{\theta} \in \mathbb{R}^d \implies \mathbf{X}_n \overset{w}{\to} \mathbf{X}_\infty.$$

The Cramér-Wold device says that if every linear combination $\mathbf{\theta}\cdot\mathbf{X}_n$ converges weakly to $\mathbf{\theta}\cdot\mathbf{X}_\infty,$ then the random vectors themselves converge weakly. In fact, for normal random vectors we can prove that $\mathbf{X} \sim \mathcal{N}_d(0,\mathbf{\Gamma})$ if and only if $\mathbf{t}\cdot\mathbf{X} \sim \mathcal{N}(0,\mathbf{t}'\mathbf{\Gamma}\mathbf{t})$ for all $\mathbf{t} \in \mathbb{R}^d.$
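A minimal Monte Carlo sketch of this characterization (the covariance $\mathbf{\Gamma}$ and the directions $\mathbf{t}$ below are arbitrary choices): every projection of a multivariate normal sample should have variance close to $\mathbf{t}'\mathbf{\Gamma}\mathbf{t}.$

```python
import numpy as np

rng = np.random.default_rng(0)
Gamma = np.array([[2.0, 0.6], [0.6, 1.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=Gamma, size=200_000)

for t in ([1.0, 0.0], [1.0, -2.0], [0.5, 3.0]):
    t = np.array(t)
    proj = X @ t                        # samples of t . X
    print(np.var(proj), t @ Gamma @ t)  # Monte Carlo variance vs. t' Gamma t
```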

Let $\mathbf{X}_n$ be i.i.d. random vectors with mean $E\mathbf{X}_n=\mathbf{\mu}$ and covariance matrix $\mathbf{\Gamma},$ $\Gamma_{ij} = E(X_{ni}-\mu_i)(X_{nj}-\mu_j).$
Let $\mathbf{S}_n = \mathbf{X}_1+\cdots+\mathbf{X}_n.$ Then $\frac{\mathbf{S}_n - n\mathbf{\mu}}{\sqrt n} \overset{w}{\to} \mathcal{N}_d(\mathbf{0}, \mathbf{\Gamma}).$

Without loss of generality, let $\mathbf{\mu}=\mathbf{0}.$ Given $\mathbf{t} \in \mathbb{R}^d,$ the scalars $\mathbf{t}\cdot\mathbf{X}_i$ are i.i.d. with mean $0$ and variance $\sigma_t^2 = \text{Var}(\mathbf{t}\cdot\mathbf{X}_i) = \mathbf{t}'\mathbf{\Gamma}\mathbf{t},$ so the one-dimensional CLT gives $$\frac{1}{\sqrt n} \mathbf{t}\cdot\mathbf{S}_n = \frac{1}{\sqrt n} \sum\limits_{i=1}^n \mathbf{t}\cdot\mathbf{X}_i \overset{w}{\to} \mathcal{N}(0,\sigma_t^2).$$ By the Cramér-Wold device and the normal characterization above, the desired result follows.
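To see the theorem in action, a short simulation sketch (the exponential-based vectors, the sample size $n,$ and the replication count are arbitrary choices): the normalized sums should have covariance close to $\mathbf{\Gamma}.$

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 200, 10_000

# i.i.d. non-Gaussian vectors X_i = (E1 - 1, E1 + E2 - 2) with E1, E2 ~ Exp(1),
# so mu = 0 and Gamma = [[1, 1], [1, 2]].
E = rng.exponential(size=(reps, n, 2))
X = np.stack([E[..., 0] - 1.0, E[..., 0] + E[..., 1] - 2.0], axis=-1)

S = X.sum(axis=1) / np.sqrt(n)  # one sample of (S_n - n mu) / sqrt(n) per row
print(np.cov(S.T))              # should be close to Gamma = [[1, 1], [1, 2]]
```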


Acknowledgement

This post series is based on the textbook Probability: Theory and Examples, 5th edition (Durrett, 2019) and the lecture at Seoul National University, Republic of Korea (instructor: Prof. Johan Lim).