
Suppose we are given i.i.d. random variables $X_1, X_2, \dots$ in $L^2$ with $\mathrm{E}[X_i]=0$ and $\mathrm{Var}[X_i] = 1$, say. Let $S_n = \sum_{i=1}^n X_i$. It is well known that in this case we have the following convergence theorems:

Law of large numbers: $\frac{S_n}n \to 0$ almost surely.

Central limit theorem: $\frac{S_n}{\sqrt{n}}$ converges in distribution to a $\mathcal N(0,1)$ random variable.

Does anything interesting happen if we normalize $S_n$ differently? For instance, is it possible to normalize in such a way that $\frac{S_n}{c_n}$ converges to a non-constant random variable (with $\sqrt{n} \ll c_n\ll n$, I'd imagine), or something like that?
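Both classical normalizations, and one intermediate candidate, are easy to probe numerically. A minimal sketch using NumPy (the step distribution, sample sizes, and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

trials, n = 2_000, 10_000
# i.i.d. steps with E[X] = 0 and Var[X] = 1 (standard normal is one choice)
X = rng.standard_normal((trials, n))
S_n = X.sum(axis=1)

# Law of large numbers: S_n / n concentrates at 0
print(np.abs(S_n / n).mean())        # small, of order n^{-1/2}

# Central limit theorem: S_n / sqrt(n) has spread of order 1
print((S_n / np.sqrt(n)).std())      # close to 1

# An intermediate normalization sqrt(n) << c_n << n, e.g. c_n = n^{3/4}:
# the variance of S_n / c_n is n / c_n^2, which tends to 0
print((S_n / n**0.75).std())         # of order n^{-1/4}, tending to 0
```

The last line already hints at the obstruction: with unit-variance steps, $\mathrm{Var}[S_n/c_n] = n/c_n^2$, which vanishes whenever $c_n \gg \sqrt n$.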

This really is just a random thought... If there is a fundamental reason why we only care about the above two normalizations, then I'd be happy to know what it is (apart from the fact that these normalizations are particularly natural).

2 Answers


There is also the law of the iterated logarithm, by which $ \limsup_{n \rightarrow \infty} \frac{S_n}{\sqrt{2n \log \log n}} = 1 $

almost surely. Also, almost surely,

$ \liminf_{n \rightarrow \infty} \frac{S_n}{\sqrt{2n \log \log n}} = -1 $
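As a rough numerical illustration (a sketch: one long walk with an arbitrary seed; convergence in the law of the iterated logarithm is notoriously slow, so this only shows that the running extremes stay of order 1):

```python
import numpy as np

rng = np.random.default_rng(1)

# One long random walk with i.i.d. mean-0, variance-1 steps
N = 1_000_000
S = np.cumsum(rng.standard_normal(N))

n = np.arange(1, N + 1)
mask = n >= 100                      # keep log(log(n)) safely positive
ratio = np.abs(S[mask]) / np.sqrt(2 * n[mask] * np.log(np.log(n[mask])))

# The LIL says limsup S_n / sqrt(2 n log log n) = 1 a.s. (and liminf = -1),
# so the running maximum of |S_n| / sqrt(2 n log log n) is of order 1
print(ratio.max())
```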

Comment (2012-12-31): Thanks. That's the sort of thing I've been looking for.

The weakest mode of convergence is convergence in law, so let us ask for a sequence $c_n$ such that $S_n/c_n$ converges in law to a non-constant random variable. The characteristic function of $S_n/c_n$ is $\varphi(t/c_n)^n$, where $\varphi$ is the characteristic function of $X_1$. Since $\mathrm E[X_1]=0$ and $\mathrm{Var}[X_1]=1$, we have $\varphi(s) = 1 - \frac{s^2}2 + o(s^2)$, hence $\log\varphi(t/c_n)^n = n\log\left(1-\frac{t^2}{2c_n^2}+o(c_n^{-2})\right) \sim -\frac{nt^2}{2c_n^2}$. For this to converge to the logarithm of a non-constant characteristic function, $\frac n{c_n^2}$ must converge to a positive limit, so $\{c_n\}$ must behave as a constant multiple of $\sqrt n$.
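This expansion can be checked numerically. A sketch for Rademacher steps ($\mathrm P(X_i=\pm 1)=\frac12$), whose characteristic function is $\varphi(t)=\cos t$, so that $S_n/c_n$ has characteristic function $\cos(t/c_n)^n$:

```python
import numpy as np

t = 1.0
n = 10**7

# c_n = sqrt(n): n * log(cos(t/c_n)) -> -t^2/2, matching the N(0,1)
# characteristic function exp(-t^2/2)
print(np.cos(t / n**0.5) ** n, np.exp(-t**2 / 2))

# c_n = n^{3/4} (so sqrt(n) << c_n << n): here n / c_n^2 -> 0 and the
# characteristic function tends to 1, i.e. the limit law is the constant 0
print(np.cos(t / n**0.75) ** n)
```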