Exercise 4.5.9 of Jeff Rosenthal's "A First Look at Rigorous Probability Theory" asks whether we can prove the additivity of variance for independent random variables by induction; in the text it is proved by noting that in general
\begin{equation} Var(\sum_{i=1}^n X_i) = \sum_{i=1}^n Var(X_i) + 2 \sum_{i < j} Cov(X_i, X_j). \end{equation}
With independent random variables the covariance terms vanish and we get the desired formula.
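For reference, here is where the decomposition comes from; writing $\mu_i = E[X_i]$ (notation introduced only for this expansion), expanding the square gives

\begin{equation} Var(\sum_{i=1}^n X_i) = E[(\sum_{i=1}^n (X_i - \mu_i))^2] = \sum_{i=1}^n E[(X_i - \mu_i)^2] + 2 \sum_{i < j} E[(X_i - \mu_i)(X_j - \mu_j)], \end{equation}

and the two sums are exactly $\sum_{i=1}^n Var(X_i)$ and $2 \sum_{i < j} Cov(X_i, X_j)$.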
In particular, he asks the following: if we know that $Var(X + Y) = Var(X) + Var(Y)$ when $X$ and $Y$ are independent, can we conclude that $Var(X + Y + Z) = Var(X) + Var(Y) + Var(Z)$ when $X$, $Y$, and $Z$ are independent?
I think the answer is yes. We proved earlier in the text that if $X$, $Y$, and $Z$ are independent, then $X + Y$ and $Z$ are independent. Thus
\begin{equation} Var(X + Y + Z) = Var((X + Y) + Z) = Var(X + Y) + Var(Z) = Var(X) + Var(Y) + Var(Z). \end{equation}
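If that is right, the same step should carry the full induction: assuming the formula holds for $n$ independent random variables, and granting the analogous fact that $X_1 + \cdots + X_n$ and $X_{n+1}$ are independent (I am assuming the text's two-variable independence result extends to this case), we would get

\begin{equation} Var(\sum_{i=1}^{n+1} X_i) = Var(\sum_{i=1}^n X_i) + Var(X_{n+1}) = \sum_{i=1}^n Var(X_i) + Var(X_{n+1}) = \sum_{i=1}^{n+1} Var(X_i). \end{equation}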
Is this correct?