Let $X_1, X_2, \dotsc$ be defined jointly. I'm not entirely sure what this means, but I think it means they're all defined on the same probability space. Let $E[X_i]=0, E[X_i^2]=1 \;\forall\; i$. Show that $P(X_n\geq n \text{ infinitely often})=0$. Saying the event occurs infinitely often is equivalent to taking the $\limsup$ of the events $(X_n \geq n)$ (at least I'm pretty sure that's the correct way to write it), which is defined as $$ \limsup_{n\to\infty}\,(X_n \geq n) = \bigcap_{n=1}^\infty \bigcup_{k=n}^\infty (X_k \geq k). $$ Now, since each $X_i$ has mean $0$, it makes intuitive sense that this probability should be $0$. My concern, though, is that I'm not using the second-moment hypothesis, and I can't really see how it fits in. Thanks!
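A sketch of one route that would use the second-moment hypothesis, assuming Chebyshev's inequality and the Borel–Cantelli lemma are available:
$$
P(X_n \geq n) \leq P(|X_n| \geq n) \leq \frac{E[X_n^2]}{n^2} = \frac{1}{n^2},
\qquad\text{so}\qquad
\sum_{n=1}^\infty P(X_n \geq n) \leq \sum_{n=1}^\infty \frac{1}{n^2} < \infty,
$$
and the Borel–Cantelli lemma would then give $P(X_n \geq n \text{ infinitely often}) = 0$. I'm not certain this is the intended argument.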
Showing the probability of an event occurring infinitely often is $0$