Lévy’s continuity theorem is an important tool in the statistical machinery. For example, it yields simple proofs of two classical results: the Law of Large Numbers and the Central Limit Theorem.

Let us first state a lemma without proof.

**Lemma** Random vectors $latex {X}&fg=000000$ and $latex {Y}&fg=000000$ in $latex {{\mathbb R}^{k}}&fg=000000$ are equal in distribution if and only if $latex {\mathbb E e^{it^{\top}X}=\mathbb E e^{it^{\top}Y}}&fg=000000$ for every $latex {t\in{\mathbb R}^{k}}&fg=000000$.

Remember also that the characteristic function of the multivariate $latex {N_{k}\left(\mu,\Sigma\right)}&fg=000000$ distribution is the function

$latex \displaystyle t\mapsto \exp\left( it^{\top}\mu-\frac{1}{2}t^{\top}\Sigma t \right). &fg=000000$
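As a quick numerical sanity check of this formula in the scalar case $latex {k=1}&fg=000000$, one can compare a Monte Carlo estimate of $latex {\mathbb E e^{itX}}&fg=000000$ with the closed form above. This is only an illustrative sketch using the Python standard library; the helper names `normal_cf` and `empirical_cf` are my own, not part of any package.

```python
import cmath
import math
import random

def normal_cf(t, mu, sigma2):
    """Exact characteristic function of N(mu, sigma2): exp(i t mu - sigma2 t^2 / 2)."""
    return cmath.exp(1j * t * mu - 0.5 * sigma2 * t * t)

def empirical_cf(t, samples):
    """Monte Carlo estimate of E[exp(i t X)] from a list of draws."""
    return sum(cmath.exp(1j * t * x) for x in samples) / len(samples)

random.seed(0)
mu, sigma = 1.0, 2.0
samples = [random.gauss(mu, sigma) for _ in range(200_000)]

# With 200k draws the Monte Carlo error is around 1/sqrt(n) ~ 0.002,
# so the empirical estimate should sit well within 0.02 of the exact value.
for t in (0.3, 0.7, 1.1):
    assert abs(empirical_cf(t, samples) - normal_cf(t, mu, sigma ** 2)) < 0.02
```

The same check works in higher dimensions with $latex {t^{\top}X}&fg=000000$ in place of $latex {tX}&fg=000000$.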

It is also important to recall that the characteristic function of a sum of independent random variables equals the product of the characteristic functions of the individual variables. With these results and Lévy’s continuity theorem we are ready to prove the weak law of large numbers.
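The product rule for independent variables can itself be checked numerically. The sketch below, again stdlib-only and purely illustrative, compares the empirical characteristic function of $latex {X+Y}&fg=000000$ with the product of the individual empirical characteristic functions for two independent Exp(1) draws, whose sum is Gamma(2, 1) with exact characteristic function $latex {1/(1-it)^{2}}&fg=000000$.

```python
import cmath
import random

random.seed(1)
n = 100_000
xs = [random.expovariate(1.0) for _ in range(n)]
ys = [random.expovariate(1.0) for _ in range(n)]
sums = [x + y for x, y in zip(xs, ys)]

def emp_cf(t, data):
    """Monte Carlo estimate of E[exp(i t X)]."""
    return sum(cmath.exp(1j * t * x) for x in data) / len(data)

for t in (0.5, 1.0, 2.0):
    lhs = emp_cf(t, sums)                # cf of the sum X + Y
    rhs = emp_cf(t, xs) * emp_cf(t, ys)  # product of the individual cfs
    assert abs(lhs - rhs) < 0.02
    # Both agree with the exact cf of Gamma(2, 1): 1 / (1 - i t)^2
    assert abs(lhs - 1.0 / (1.0 - 1j * t) ** 2) < 0.02
```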

**Proposition (Weak law of large numbers)** *Let $latex {Y_{1},\ldots,Y_{n}}&fg=000000$ be i.i.d. random variables with characteristic function $latex {\phi}&fg=000000$. Then $latex {\overline{Y}_{n}\stackrel{P}{\rightarrow}\mu}&fg=000000$ for a real number $latex {\mu}&fg=000000$ if $latex {\phi}&fg=000000$ is differentiable at zero with $latex {i\mu=\phi^{\prime}(0)}&fg=000000$.*

*Proof:* Because $latex {\phi(0)=1}&fg=000000$, by differentiability of $latex {\phi}&fg=000000$ at zero we have $latex {\phi(t)=1+t\phi^{\prime}(0)+o(t)}&fg=000000$ as $latex {t\rightarrow0}&fg=000000$. Thus, by independence:

$latex \displaystyle \mathbb E\exp\left({it\overline{Y}_{n}}\right) =\phi^{n}\left(\frac{t}{n}\right)=\left(1+\frac{t}{n}i\mu+o\left(\frac{1}{n}\right)\right)^{n}\rightarrow\exp\left({it\mu}\right). &fg=000000$

The right side is the characteristic function of the constant $latex {\mu}&fg=000000$. By Lévy’s continuity theorem $latex {\overline{Y}_{n}}&fg=000000$ converges in distribution to $latex {\mu}&fg=000000$. Convergence in distribution to a constant is equivalent to convergence in probability.$latex \Box&fg=000000$
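The conclusion of the proposition can be seen in a small simulation: the probability that the sample mean misses $latex {\mu}&fg=000000$ by more than a fixed $latex {\varepsilon}&fg=000000$ shrinks as $latex {n}&fg=000000$ grows. A minimal sketch with i.i.d. Exp(1) variables (so $latex {\mu=1}&fg=000000$); the thresholds in the assertions are loose, illustrative choices.

```python
import random

random.seed(2)
mu = 1.0  # mean of an Exp(1) random variable

def sample_mean(n):
    """One realisation of the sample mean of n i.i.d. Exp(1) draws."""
    return sum(random.expovariate(1.0) for _ in range(n)) / n

# Fraction of repetitions where the sample mean misses mu by more than eps.
eps, reps = 0.1, 200
miss_rate = {
    n: sum(abs(sample_mean(n) - mu) > eps for _ in range(reps)) / reps
    for n in (10, 100, 10_000)
}

# Convergence in probability: the miss rate shrinks toward zero as n grows.
assert miss_rate[10_000] < miss_rate[10]
assert miss_rate[10_000] <= 0.05
```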

**Remark:** If $latex {\mathbb E|Y|<\infty}&fg=000000$ then we can show that $latex {\phi^{\prime}(t)}&fg=000000$ exists for every $latex {t}&fg=000000$ and $latex {\phi^{\prime}(0)=i\mathbb E(Y)}&fg=000000$. By the dominated convergence theorem, we may interchange the expectation and differentiation signs in

$latex \displaystyle \phi^{\prime}(t)=\frac{d}{dt}\mathbb E \exp\left({itY}\right)=\mathbb E iY\exp\left({itY}\right). &fg=000000$

In particular $latex {\phi^{\prime}(0)=i\mathbb E Y}&fg=000000$, so that $latex {\overline{Y}_{n}\stackrel{P}{\rightarrow}\mathbb E Y}&fg=000000$.
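The identity $latex {\phi^{\prime}(0)=i\mathbb E Y}&fg=000000$ can also be verified numerically: a central difference on the empirical characteristic function at zero should land on $latex {i}&fg=000000$ times the sample mean. This is a hypothetical illustration, not part of the post's argument.

```python
import cmath
import random

random.seed(4)
# Exp(rate 1/2) draws, so the population mean is 2.
samples = [random.expovariate(0.5) for _ in range(100_000)]

def emp_cf(t):
    """Empirical characteristic function evaluated at t."""
    return sum(cmath.exp(1j * t * x) for x in samples) / len(samples)

h = 1e-4
deriv = (emp_cf(h) - emp_cf(-h)) / (2 * h)  # central difference for phi'(0)
mean = sum(samples) / len(samples)

# phi'(0) = i * E[Y], up to the O(h^2) truncation error of the difference.
assert abs(deriv - 1j * mean) < 1e-3
```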

**Remark 2:** The converse of this Proposition is also true. See Révész, P. (1968). *The laws of large numbers*, Academic Press, New York. MR0245079, Zbl 0203.50403.

Assuming $latex {\mathbb E Y^{2}<\infty}&fg=000000$ and adding another term to the Taylor expansion we get the Central Limit Theorem.

**Proposition (Central Limit Theorem)** *Let $latex {Y_{1},\dots,Y_{n}}&fg=000000$ be i.i.d. random variables with $latex {\mathbb E Y_{i}=0}&fg=000000$ and $latex {\mathbb E Y_{i}^{2}=1}&fg=000000$. Then the sequence $latex {\sqrt{n}\overline{Y}_{n}}&fg=000000$ converges in distribution to the standard normal distribution.*

*Proof:* Differentiating a second time under the expectation sign shows that $latex {\phi^{\prime\prime}(0)=i^{2}\mathbb E Y^{2}}&fg=000000$. Because $latex {\phi^{\prime}(0)=i\mathbb E Y=0}&fg=000000$, we get

$latex \displaystyle \mathbb E \exp\left({it\sqrt{n}\overline{Y}_{n}}\right)=\phi^{n}\left(\frac{t}{\sqrt{n}}\right)=\left(1-\frac{1}{2}\frac{t^{2}}{n}\mathbb E Y^{2}+o\left(\frac{1}{n}\right)\right)^{n}\rightarrow \exp\left({-\frac{1}{2}t^{2}\mathbb E Y^{2}}\right). &fg=000000$

The right side is the characteristic function of the normal distribution with mean zero and variance $latex {\mathbb E Y^{2}}&fg=000000$. The proposition follows from Lévy’s continuity theorem. $latex \Box&fg=000000$
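As with the law of large numbers, the CLT can be watched in a simulation: for centred, unit-variance $latex {Y_{i}}&fg=000000$ the empirical distribution of $latex {\sqrt{n}\overline{Y}_{n}}&fg=000000$ should track the standard normal CDF. A minimal sketch using uniform variables rescaled to variance one; the sample sizes and tolerances are illustrative choices.

```python
import math
import random

random.seed(3)
a = math.sqrt(3.0)  # Uniform(-a, a) has mean 0 and variance 1

def scaled_mean(n):
    """One draw of sqrt(n) * mean(Y_1, ..., Y_n)."""
    return sum(random.uniform(-a, a) for _ in range(n)) / math.sqrt(n)

reps, n = 20_000, 50
draws = [scaled_mean(n) for _ in range(reps)]

# Compare the empirical CDF at a few points with the standard normal CDF.
for q in (-1.0, 0.0, 1.0):
    emp = sum(d <= q for d in draws) / reps
    phi = 0.5 * (1.0 + math.erf(q / math.sqrt(2.0)))
    assert abs(emp - phi) < 0.02
```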
