Let $X_1,X_2,\dotsc$ be defined jointly. I'm not entirely sure what this means, but I take it to mean they are all defined on the same probability space. Let $E[X_i]=0$ and $E[X_i^2]=1$ for all $i$. Show that $P(X_n\geq n \text{ infinitely often})=0$.

Saying that the events occur infinitely often is equivalent to taking the $\limsup$ of the events $(X_n\geq n)$ (at least I'm fairly sure that's the correct way to write it), where the $\limsup$ is defined as $$ \bigcap_{n=1}^\infty \bigcup_{k=n}^\infty (X_k \geq k). $$

Since each $X_i$ has mean $0$, it intuitively makes sense that the probability would go to $0$. My concern, though, is that I'm not using the second-moment hypothesis, and I can't really see how it fits in. Thanks!
Showing the probability of an event occurring infinitely often is $0$
1 Answer
Hint: According to the first Borel-Cantelli lemma, the limsup of the events has probability zero as soon as the series $(*)$ $\sum\limits_n\mathrm P(X_n\geqslant n)$ converges. Hence if one shows $(*)$ converges, the proof is over.
How to show that $(*)$ converges? Luckily, one is given only one hypothesis on $X_n$, hence one knows that one must use it somehow. Since the hypothesis is that $\mathrm E(X_n)=0$ and $\mathrm E(X_n^2)=1$ for every $n$, the problem is to bound $\mathrm P(X\geqslant n)$ for any random variable $X$ such that $\mathrm E(X)=0$ and $\mathrm E(X^2)=1$. Any idea?
One might begin with the obvious inclusion $[X\geqslant n]\subseteq[|X-\mathrm E(X)|\geqslant n]$ and try to use one of the not-so-many inequalities one knows which allow one to bound $\mathrm P(|X-\mathrm E(X)|\geqslant n)$...
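(Added for completeness, in case the hint is too terse: assuming the inequality being hinted at is Chebyshev's, the computation would run as follows.)

$$\mathrm P(X_n\geqslant n)\leqslant \mathrm P(|X_n-\mathrm E(X_n)|\geqslant n)\leqslant \frac{\mathrm{Var}(X_n)}{n^2}=\frac{1}{n^2},$$

hence $\sum_n\mathrm P(X_n\geqslant n)\leqslant\sum_n n^{-2}=\pi^2/6<\infty$, and the first Borel-Cantelli lemma gives $\mathrm P(X_n\geqslant n\text{ i.o.})=0$.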
This gives $Var[X]=1$ but I'm not quite sure how to apply that. It's incredibly late here and I know there's a mistake in my logic, but this is what I'm thinking: $E[X]=P(X_n=1)=\lim_{n\to\infty} \frac{1}{n}=0$ $\Rightarrow P(X_n=k)=0 \;\forall\; k$. Then $P(X\geq n)=P(X_n=n)+P(X_n=n+1)+\cdots = 0$ so the sum is $<\infty$. – 2011-10-12
@bret, right, let us continue this once you have had some sleep. – 2011-10-12
@Bret: Consider how $\int_{|x|>\alpha}\alpha^2\;\mathrm{d}\mu(x)$ compares to $\int_{|x|>\alpha}x^2\;\mathrm{d}\mu(x)$. – 2011-10-12
@Didier: So does my idea for showing the sum is $<\infty$ work? And I'm not entirely sure how to apply your suggestion, rob; we haven't used any integrals yet, so I'm not sure it's the right approach. – 2011-10-12
@bret, sorry but I see no upper bound of $\mathrm P(X\geqslant n)$ in what you wrote... Question: what probabilistic inequalities do you know? – 2011-10-12
@bret, Edited version. Tell me if this helps. – 2011-10-14
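As a quick numerical sanity check of the convergence in $(*)$ (a sketch, not part of the proof): Chebyshev's inequality dominates the series by $\sum_n 1/n^2$, whose partial sums can be seen approaching $\pi^2/6$:

```python
import math

# Chebyshev gives P(X_n >= n) <= Var(X_n)/n^2 = 1/n^2, so the series in (*)
# is dominated by sum 1/n^2. Its partial sums converge to pi^2/6 (Basel problem);
# the tail beyond N is roughly 1/N.
partial = sum(1.0 / n**2 for n in range(1, 1_000_001))
print(partial)            # close to pi^2/6
print(math.pi**2 / 6)     # ~1.6449
```

Since the dominating series converges, the first Borel-Cantelli lemma applies and the limsup event has probability zero.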