3.4. Central limit theorem
Now that we have all the right tools, we state and prove the central limit theorem (CLT for short), starting from the simplest form for i.i.d. sequences and then moving on to the Lindeberg and Lyapounov conditions.
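For reference, the simplest i.i.d. form states that if $X_1, X_2, \ldots$ are i.i.d. with $EX_i = \mu$ and $\mathrm{Var}(X_i) = \sigma^2 \in (0, \infty)$, then

$$\frac{\sum_{i=1}^n X_i - n\mu}{\sigma\sqrt{n}} \overset{w}{\to} N(0,1).$$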
» continue reading
3.3. Characteristic functions
In undergraduate statistics, we learned about moment generating functions and that they uniquely determine a distribution whenever they exist. However, the moment generating function does not exist for every distribution. The characteristic function, in contrast, exists for every real-valued random variable and provides an alternative approach to working with distributions.
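To recall the definition, the characteristic function of a real-valued random variable $X$ is

$$\varphi_X(t) = E\left[e^{itX}\right] = E[\cos tX] + i\,E[\sin tX], \quad t \in \mathbb{R},$$

which is well-defined for every distribution since $|e^{itX}| = 1$.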
» continue reading
3.2.2. Vague convergence and uniform tightness
Our next interest is in whether a sequence of distribution functions converges weakly. To be more specific, subsequential convergence of distribution functions is the topic of this subsection. Helly's selection theorem shows that there always exists a vaguely convergent subsequence. Uniform tightness of the sequence strengthens this result to weak convergence.
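In short, Helly's selection theorem says that for any sequence of distribution functions $(F_n)$ there exist a subsequence $(F_{n_k})$ and a right-continuous nondecreasing function $F$ with values in $[0,1]$ such that $F_{n_k}(x) \to F(x)$ at every continuity point $x$ of $F$; uniform tightness prevents mass from escaping to $\pm\infty$, so that the limit $F$ is a genuine distribution function.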
» continue reading
3.2.1. Weak convergence
Now that we have covered convergence of point estimates (specifically, the sample mean), our next interest is in a weaker notion of convergence, one in which convergence in probability is not guaranteed. In undergraduate statistics, this is called convergence in distribution. Here we prefer to borrow terminology from measure theory, call it weak convergence, and write $X_n \overset{w}{\to} X$.
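As a reminder, $X_n \overset{w}{\to} X$ means that $F_{X_n}(x) \to F_X(x)$ at every point $x$ where $F_X$ is continuous, or equivalently that $Ef(X_n) \to Ef(X)$ for every bounded continuous function $f$.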
» continue reading
2.S1. Convergence concepts
This is a quick review of convergence concepts and their relations.
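As a quick preview, the relations can be summarized by the standard implication diagram

$$X_n \overset{a.s.}{\to} X \;\Rightarrow\; X_n \overset{p}{\to} X \;\Rightarrow\; X_n \overset{w}{\to} X, \qquad X_n \overset{L^p}{\to} X \;\Rightarrow\; X_n \overset{p}{\to} X,$$

with the reverse implications failing in general.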
» continue reading