
For $X_i$ independent, is $\operatorname{Var}\left(\sum \limits_{i = 0}^\infty X_i \right) = \sum\limits_{i=0}^\infty \operatorname{Var}(X_i)$?

Thanks!

  • I may be missing a simpler approach, but it seems that one has to use e.g. the [Dominated Convergence Theorem](http://en.wikipedia.org/wiki/Dominated_convergence_theorem), so there may be a counterexample. (2011-11-15)
  • In the finite case one can weaken the hypothesis of independence to that of uncorrelatedness: that all covariances are $0$. I'll be surprised if that is not also true of the infinite case. (2011-11-15)
  • Maybe this is more pathological than what you are looking for, and I am not sure whether it should be a comment or an answer. Let $X_n = (-1)^n (2n-1)$ with probability one. Then the $X_n$ are independent and $\sum_{i=1}^\infty \sigma_i^2 = 0$, but $\mathrm{Var}(\sum_i X_i)$ does not exist: $S_n = \sum_{i=1}^n X_i = (-1)^n n$, so $S_n$ has no limit, hence no mean, hence no variance. (2011-11-15)
  • What if the sequence of variances of $\small X_i$ is $\small 1,4,9,16,25,\ldots,k^2,\ldots$, so that their sum equals (formally?) $\small \zeta(-2)$? My point is, independently of the choice of $\small\zeta()$ as an example here, that when infinitely many terms are involved we might get unexpected zeros which are counterintuitive and not a consequence of approximation. (2011-11-16)

1 Answer


Yes, as soon as the RHS is finite and the series $\sum\limits_{n}\mathrm E(X_n)$ converges.

To see this, assume without loss of generality that the random variables $X_n$ are centered with variances $\sigma_n^2$, and that the series $\sum\limits_n\sigma_n^2$ converges, with $\sigma^2$ as its sum. Let $S_n=\sum\limits_{k\leqslant n}X_k$ and note that, for every $n\leqslant m$,
$$ \mathrm E((S_m-S_n)^2)=\sum\limits_{k=n+1}^m\sigma_k^2, $$
which converges to zero when $n\to\infty$; hence $(S_n)_n$ is a Cauchy sequence in $L^2$. Let $S$ denote its limit in $L^2$. Then $\mathrm E(S_n^2)\to\mathrm E(S^2)$ when $n\to\infty$ and, for every $n$,
$$ \mathrm E(S_n^2)=\sum\limits_{k\leqslant n}\sigma_k^2, $$
hence $\mathrm E(S^2)=\sigma^2$.

Since $S_n\to S$ in $L^2$, a subsequence converges almost surely to $S$. Kolmogorov's maximal inequality then shows that the whole sequence converges almost surely to $S$, hence $S_n\to S$ in the almost sure sense as well, and the proof is complete.
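As a quick numerical sanity check (a sketch with a hypothetical choice of distributions, not part of the proof), take independent centered Gaussians with $\sigma_n^2 = 2^{-n}$, so that $\sigma^2 = \sum_n 2^{-n} = 1$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: independent centered Gaussians X_n with
# Var(X_n) = 2^{-n}, so sigma^2 = sum_{n>=1} 2^{-n} = 1.
n_terms, samples = 30, 200_000      # tail variance beyond n_terms is ~2^-30
sigmas = np.sqrt(0.5 ** np.arange(1, n_terms + 1))

X = rng.normal(size=(samples, n_terms)) * sigmas   # column k has sd sigmas[k]
S = X.sum(axis=1)                                  # truncated series sum S_n

print(S.var())   # ≈ 1.0, matching the sum of the variances
```

The empirical variance of $S$ agrees with $\sigma^2$ up to Monte Carlo error, as the $L^2$ argument above predicts.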

If the sum of the series $\sum\limits_n\sigma_n^2$ is infinite, $(S_n)_n$ diverges almost surely.
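In the divergent case one can watch $\mathrm{Var}(S_n)$ grow without bound. A minimal sketch using fair $\pm 1$ coin flips (a hypothetical choice, with $\sigma_n^2=1$ for every $n$):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical divergent case: independent fair +/-1 coin flips, so each
# X_n is centered with Var(X_n) = 1 and the variance series diverges.
samples, n = 5_000, 1_000
X = rng.choice([-1.0, 1.0], size=(samples, n))
S = X.cumsum(axis=1)                 # S[:, k-1] holds the partial sum S_k

for k in (10, 100, 1_000):
    print(k, S[:, k - 1].var())      # empirically ≈ k = Var(S_k)
```

Here $\mathrm{Var}(S_k)=k$ has no finite limit, consistent with the almost sure divergence of $(S_n)_n$.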