
Let $(\Omega, \mathcal{A}, P)$ be a probability triple and $(X_n)$ a sequence of random variables. By definition, $X_n$ converges almost surely to $X$ if and only if $P\bigl(\{\omega \in \Omega : X_n(\omega) \to X(\omega) \text{ as } n \to \infty\}\bigr) = 1$.

How do we know that this set is measurable?

Thanks.

Christian


1 Answer

If every $X_n$ is measurable and the event $[X_n\to X]$ is almost sure, then $X$ is measurable${}^*$, since
$$
[X_n\to X]\cap[X\leqslant x]=[X_n\to X]\cap\bigcap_{n\geqslant1}\bigcap_{k\geqslant1}\bigcup_{i\geqslant k}[X_i\leqslant x+\tfrac1n].
$$

If $X$ and every $X_n$ are measurable, the set $[X_n\to X]$ is measurable, since
$$
[X_n\to X]=\bigcap_{n\geqslant1}\bigcup_{k\geqslant1}\bigcap_{i\geqslant k}\,[|X_i-X|\leqslant\tfrac1n].
$$

Edit: ${}^*$ This assumes that the sigma-algebra $\mathcal A$ is complete, that is, that $\mathcal A$ contains every subset of every $A$ in $\mathcal A$ such that $\mathbb P(A)=0$. Otherwise, assume that $[X_n\to X]=\Omega$.
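To spell out why the second identity gives measurability, here is a sketch of the quantifier bookkeeping, assuming only that $X$ and every $X_i$ are measurable. By definition, $X_n(\omega)\to X(\omega)$ means: for every $n\geqslant1$ there exists $k\geqslant1$ such that $|X_i(\omega)-X(\omega)|\leqslant\tfrac1n$ for every $i\geqslant k$. Translating "for every", "there exists", "for every" into intersection, union, intersection yields
$$
[X_n\to X]=\bigl\{\omega:\forall n\,\exists k\,\forall i\geqslant k,\ |X_i(\omega)-X(\omega)|\leqslant\tfrac1n\bigr\}
=\bigcap_{n\geqslant1}\bigcup_{k\geqslant1}\bigcap_{i\geqslant k}\,[|X_i-X|\leqslant\tfrac1n].
$$
Each event $[|X_i-X|\leqslant\tfrac1n]$ lies in $\mathcal A$ because $X_i-X$ is measurable, and the right-hand side uses only countably many unions and intersections, so $[X_n\to X]\in\mathcal A$.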

  • Continuous functions are measurable... (2012-11-01)