
Exercise 4.5.9 of Jeff Rosenthal's "A First Look at Rigorous Probability Theory" asks whether we can prove the linearity of variance for independent random variables by induction; in the text it is proved by noting that in general

\begin{equation} Var(\sum_{i=1}^n X_i) = \sum_{i=1}^n Var(X_i) + 2 \sum_{i < j} Cov(X_i, X_j). \end{equation}

With independent random variables the covariance terms vanish and we get the desired formula.
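
For completeness: the covariance terms vanish because, assuming the relevant expectations are finite, independence gives $E[XY] = E[X]E[Y]$, and hence

\begin{equation} Cov(X, Y) = E[XY] - E[X]E[Y] = 0. \end{equation}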

In particular, he asks the following: if we know that $Var(X + Y) = Var(X) + Var(Y)$ whenever $X$ and $Y$ are independent, can we conclude that $Var(X + Y + Z) = Var(X) + Var(Y) + Var(Z)$ when $X$, $Y$, and $Z$ are independent?

I think the answer is yes. We proved earlier in the text that under these assumptions $X + Y$ and $Z$ are independent. Thus

\begin{equation} Var(X + Y + Z) = Var((X + Y) + Z) = Var(X + Y) + Var(Z) = Var(X) + Var(Y) + Var(Z). \end{equation}

Is this correct?

1 Answer


This is correct, provided you have a convincing proof that, for any $n \geqslant 3$, if $(X_1, X_2, \ldots, X_n)$ is independent, then $(X_1 + X_2 + \cdots + X_{n-1}, X_n)$ is independent. The fact that if $(X_1, X_2, X_3)$ is independent, then $(X_1 + X_2, X_3)$ is independent, is not by itself sufficient to perform the induction.
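
To spell out the induction this lemma enables (a sketch, writing $S_{n-1} = \sum_{i=1}^{n-1} X_i$): if $(X_1, \ldots, X_n)$ is independent, then so is $(X_1, \ldots, X_{n-1})$, and the lemma makes $S_{n-1}$ and $X_n$ independent, so

\begin{equation} Var(\sum_{i=1}^n X_i) = Var(S_{n-1} + X_n) = Var(S_{n-1}) + Var(X_n) = \sum_{i=1}^{n-1} Var(X_i) + Var(X_n) = \sum_{i=1}^n Var(X_i), \end{equation}

where the second equality is the two-variable case and the third is the inductive hypothesis.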