
Assume that $X_1,X_2,\ldots$ are independent random variables (not necessarily identically distributed) with $Var[X_n]>0$ for all $n$. Assume also that

$\sum_{n=1}^\infty \frac{Var[X_n]}{n^2}<\infty,$

that

$\frac{1}{n}\sum_{i=1}^n(X_i-E[X_i])\to 0 \textrm{ almost surely as $n\to\infty$},$

that

$E[X_n]>0 \textrm{ for all $n$},$

and that

$\liminf_{n\to\infty} E[X_n] > 0.$

How can we prove that $\sum_{i=1}^n X_i \to \infty\text{ almost surely as $n\to\infty$?}$ This seems intuitively clear, but a formal proof escapes me. Also, what can we say if $E[X_n] > 0$ for all $n$ but $\lim_{n\to\infty} E[X_n] = 0$?

  • Not quite the same as http://math.stackexchange.com/questions/152041/almost-sure-convergence-of-random-variables/152049#152049 – 2012-06-01

2 Answers


By the strong law of large numbers, $\frac1n\sum\limits_{k=1}^n\left(X_k-E(X_k)\right)$ converges almost surely to $0$. The hypothesis on the expectations implies that $\liminf\limits_{n\to\infty}\frac1n\sum\limits_{k=1}^nE(X_k)\geqslant c$ for some $c\gt0$. Adding these two facts yields $\liminf\limits_{n\to\infty}\frac1n\sum\limits_{k=1}^nX_k\geqslant c$, hence $\sum\limits_{k=1}^nX_k$ diverges to $+\infty$ almost surely (and it does so at least linearly).
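As a numerical sanity check of this conclusion (not part of the proof), here is a quick simulation with one arbitrary concrete choice that satisfies all the hypotheses: independent $X_k \sim N(0.5,\,1)$, so $E[X_k]=0.5$, $\liminf_k E[X_k]>0$, and $\sum_k Var[X_k]/k^2 = \sum_k 1/k^2 < \infty$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical concrete choice: E[X_k] = 0.5, Var[X_k] = 1 for all k,
# so Kolmogorov's variance condition sum Var[X_k]/k^2 < infinity holds.
X = 0.5 + rng.standard_normal(n)

partial_sums = np.cumsum(X)
averages = partial_sums / np.arange(1, n + 1)

# SLLN: (1/n) * sum(X_k - E[X_k]) -> 0, so the averages approach 0.5
# and the partial sums grow roughly linearly, like 0.5 * n.
print(averages[-1])       # close to 0.5
print(partial_sums[-1])   # on the order of 0.5 * n
```

The averages settling near $c=0.5$ is exactly the statement $\liminf_n \frac1n\sum_{k\le n} X_k \geqslant c$; multiplying by $n$ shows the linear divergence of the partial sums.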

  • The difference I see is that one proof is by contradiction and the other is not. Apart from that... – 2012-06-03

If $\liminf_{n\to\infty} E[X_n] >0$ then there exists a constant $a>0$ such that $\sum_{i=1}^{n}E[X_i]>an$ for sufficiently large $n$.

Now, reasoning with a fixed $\omega$: if $\sum_i X_i < \infty$, then the partial sums are bounded, say $\sum_{i=1}^n X_i \le M$ for all $n$, so for $n$ large enough $\frac{1}{n}\sum_{i=1}^n \left(X_i -E[X_i]\right) \le \frac{M}{n}-a$, where the right-hand side is strictly negative for large $n$. This implies that $\frac{1}{n}\sum_{i=1}^n \left(X_i -E[X_i]\right)$ does not tend to zero.

We conclude that $\mathbb{P}\left(\frac{1}{n}\sum_{i=1}^n \left(X_i -E[X_i]\right)\rightarrow 0\right)=1$ implies $\mathbb{P}\left(\sum_i X_i<\infty\right)=0$.
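The contradiction can be displayed in one line (writing $M=\sup_n \sum_{i=1}^n X_i$, which is finite on the event $\{\sum_i X_i<\infty\}$, and $a$ as above):

$$\frac{1}{n}\sum_{i=1}^n \left(X_i-E[X_i]\right)\;\le\;\frac{M}{n}-a\;\xrightarrow[n\to\infty]{}\;-a<0,$$

so on that event the centered averages are eventually bounded away from $0$, which is incompatible with the almost-sure convergence hypothesis except on a null set.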