
I have a question about part of a proof of a lemma in a book (Casella's Statistical Inference) I'm reading. This is how it goes.

Let $X_1, \ldots, X_n$ be a random sample from a population and let $g(x)$ be a function such that $\mathbb{E}\,g(X_1)$ and $\text{Var}\,g(X_1)$ exist. Then $ \text{Var}\,\left(\sum_{i=1}^{n}g(X_i)\right)=n\left(\text{Var}\,g(X_1)\right).$


So this is how I proceeded to prove it.
Since the $X_i$'s are independent, we have that
$ \begin{align*} \text{Var}\,\left(\sum_{i=1}^{n}g(X_i)\right)&= \text{Var}\,g(X_1)+\cdots +\text{Var}\,g(X_n)\\ &= n\,\text{Var}\, g(X_1), \end{align*}$ where the last equality holds because the $X_i$'s are identically distributed. Can I do this? I'm asking because the proof in the book started from the definition of the variance and somewhere along the line involved the covariance matrix.
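Just as a numerical sanity check (not a proof; the exponential population and the choice $g(x)=x^2$ below are arbitrary, purely for illustration), a quick simulation agrees with the identity:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5            # sample size in each replicate
reps = 200_000   # number of simulated samples

def g(x):
    return x ** 2  # arbitrary g with finite mean and variance under Exp(1)

# Each row is one random sample X_1, ..., X_n from an Exp(1) population
X = rng.exponential(scale=1.0, size=(reps, n))

lhs = np.var(g(X).sum(axis=1))   # empirical Var(sum_i g(X_i))
rhs = n * np.var(g(X[:, 0]))     # n * empirical Var(g(X_1))

print(lhs, rhs)  # the two numbers should agree up to Monte Carlo error
```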
Thanks.


1 Answer


Shai has answered Nana's question, but in the interest of this question being "officially" answered let's prove the elementary result Shai cites; namely, that if $Y_1, \ldots, Y_n$ are independent random variables with finite mean and variance then $\newcommand{\Var}{\mathrm{Var}}\Var(Y_1 + \cdots + Y_n) = \Var(Y_1) + \cdots + \Var(Y_n)$.

First, let's prove it in the $n=2$ case. If $Y_1$ and $Y_2$ are independent then we know that $E[Y_1 Y_2] = E[Y_1] E[Y_2]$. By a basic property of the variance,

$\begin{align*} \Var(Y_1 + Y_2) &= E[(Y_1 + Y_2)^2] - (E[Y_1 + Y_2])^2 \\ &= E[Y_1^2 + 2Y_1Y_2 + Y_2^2] - (E[Y_1] + E[Y_2])^2 \\ &= E[Y_1^2] + 2E[Y_1]E[Y_2] + E[Y_2^2] - E[Y_1]^2 - 2E[Y_1]E[Y_2] - E[Y_2]^2 \\ &= E[Y_1^2] - E[Y_1]^2 + E[Y_2^2] - E[Y_2]^2 = \Var(Y_1) + \Var(Y_2), \end{align*}$

where the third equality uses independence to replace $E[Y_1Y_2]$ with $E[Y_1]E[Y_2]$, so the cross terms cancel.
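As a side note, the product rule $E[Y_1Y_2]=E[Y_1]E[Y_2]$ used above is easy to check numerically; here is a small sketch with two independent variables whose distributions are chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
reps = 500_000

# Two independent draws with arbitrary (different) distributions
Y1 = rng.normal(loc=1.0, scale=2.0, size=reps)
Y2 = rng.uniform(low=-1.0, high=3.0, size=reps)

# E[Y1*Y2] vs E[Y1]*E[Y2]: approximately equal for independent Y1, Y2
print(np.mean(Y1 * Y2), np.mean(Y1) * np.mean(Y2))
```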

Then, applying the result for the $n=2$ case successively to the general case (at each step the first variable is independent of the sum of the remaining ones, so the $n=2$ result applies), we have $\Var(Y_1 + \cdots + Y_n) = \Var(Y_1) + \Var(Y_2 + \cdots + Y_n) = \Var(Y_1) + \Var(Y_2) + \Var(Y_3 + \cdots + Y_n)$ $= \cdots = \Var(Y_1) + \cdots + \Var(Y_n).$
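Finally, a quick simulation of the additivity statement itself, with independent but deliberately non-identically distributed summands (again, the specific distributions below are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
reps = 500_000

# Independent Y_1, Y_2, Y_3 with different distributions
Y1 = rng.normal(loc=0.0, scale=2.0, size=reps)
Y2 = rng.uniform(low=-1.0, high=3.0, size=reps)
Y3 = rng.poisson(lam=4.0, size=reps)

lhs = np.var(Y1 + Y2 + Y3)                   # empirical Var(Y_1 + Y_2 + Y_3)
rhs = np.var(Y1) + np.var(Y2) + np.var(Y3)   # sum of empirical variances

print(lhs, rhs)  # should agree up to Monte Carlo error
```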