I'm looking for a proof of the following statement: Given a sequence of independent random variables $X_n$ satisfying $ \lim_{n\to \infty} E[X_n] = T, $ where T is a constant, then
$ \lim_{n\to \infty} V[X_n] = 0 $ implies that $X_n$ converges to $T$ in mean square. This statement is supplied without proof or reference in Shreve's Stochastic Calculus book.
Convergence in mean square from expected value/variance
5
probability-theory
stochastic-processes
@t-laarhoven: "Mean-square" convergence means $L^2$ convergence, i.e. we want to show $E[(X_n - T)^2] \to 0$. – 2011-08-24
1 Answer
7
I assume $V[X_n]$ is the variance.
Let $\mu_n = E[X_n]$ for convenience, and write $\begin{align*} E[(X_n - T)^2] &= E[(X_n - \mu_n + \mu_n - T)^2] \\ &= E[(X_n - \mu_n)^2] + (\mu_n - T)^2.\end{align*}$
(The cross term vanishes since $E[X_n - \mu_n] = 0$.) Now both terms go to 0 as $n \to \infty$ by assumption: the first is $V[X_n]$ and the second is the squared bias $(\mu_n - T)^2$.
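The decomposition above can be checked numerically. Here is a small Monte Carlo sketch (not from the original post; the normal distribution and the rates $\mu_n = T + 1/n$, $\sigma_n = 1/n$ are arbitrary choices, since only the mean and variance matter):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 2.0

def mean_square_error(n, samples=200_000):
    """Estimate E[(X_n - T)^2] for X_n ~ Normal(mu_n, sigma_n^2)."""
    mu_n = T + 1.0 / n   # E[X_n] -> T
    sigma_n = 1.0 / n    # V[X_n] = 1/n^2 -> 0
    x = rng.normal(mu_n, sigma_n, size=samples)
    return np.mean((x - T) ** 2)

# Theoretical value: (mu_n - T)^2 + V[X_n] = 2 / n^2
errors = [mean_square_error(n) for n in (1, 10, 100)]
print(errors)  # decreasing toward 0
```

The estimates track the theoretical value $2/n^2$, matching the bias–variance split in the answer.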
By the way, this is why $E(X)$ is the value of $x$ which minimizes $E((X-x)^2)$ (and why, as a consequence, the minimal value is the variance of $X$). – 2011-08-28
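This comment's claim follows from the same decomposition used in the answer (a short derivation, not part of the original post), writing $\mu := E[X]$:

$\begin{align*} E\big[(X-x)^2\big] &= E\big[(X-\mu+\mu-x)^2\big] \\ &= E\big[(X-\mu)^2\big] + (\mu-x)^2, \end{align*}$

which is minimized exactly at $x = \mu$, where it equals $V[X]$.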