
$\newcommand{\Var}{\operatorname{Var}}$

The formula for the variance of the sum of two independent random variables is given by $$ \Var (X +X) = \Var(2X) = 2^2\Var(X).$$

How then, does this happen:

Rolling one die results in a variance of $\frac{35}{12}$. Rolling two dice should give a variance of $2^2\Var(\text{one die}) = 4 \times \frac{35}{12} \approx 11.67$. Instead, my Excel spreadsheet sample (and other sources) give me 5.83, which is equal to only $2 \times \Var(X)$.

What am I doing wrong?
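A quick Monte Carlo sketch in Python (a stand-in for the Excel sample; the variable names are my own) reproduces the 5.83 figure:

```python
import random

random.seed(0)
n = 100_000
# Each trial: sum of two independent die rolls
sums = [random.randint(1, 6) + random.randint(1, 6) for _ in range(n)]
mean = sum(sums) / n
var = sum((s - mean) ** 2 for s in sums) / n
print(round(var, 2))  # close to 35/6 ≈ 5.83, not 35/3 ≈ 11.67
```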

  • If only you could enclose the Excel sheet. How are you computing the variance of both dice? (2012-11-14)
  • Michael Hardy's answer, though downvoted, is correct. (2012-11-14)
  • Symbols should stand for the same thing wherever they appear in a formula. While it is perfectly acceptable to use $X$ to denote the result of the first roll of the die, it is _not_ appropriate to use $X$ to _also_ denote the result of the second roll of the die, unless you are considering a weird die that _always_ shows the _same_ number on two successive rolls. That is, for an ordinary die, $X+X$ is **not** the sum of the results of two successive rolls, and the variance of the sum is _not_ $4\Var(X)$. Instead, the variance is $\Var(X) + \Var(Y) = 2\Var(X)$, as Michael Hardy points out. (2012-11-14)

1 Answer


$\newcommand{\Var}{\operatorname{Var}}$

The formula you give is not for two independent random variables. It's for random variables that are as far from independent as you can get. If $X,Y$ are independent, then you have $\Var(X+Y)=\Var(X)+\Var(Y)$. If, in addition, $X$ and $Y$ both have the same distribution, then this is equal to $2\Var(X)$. It is also the case that, as you say, $\Var(X+X)=4\Var(X)$. But that involves random variables that are nowhere near independent.
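To make the distinction concrete, here is a short Python sketch (my own, exact rather than simulated) that computes all three variances directly from the distributions:

```python
from fractions import Fraction
from itertools import product

faces = range(1, 7)
p = Fraction(1, 6)  # probability of each face of a fair die

def var(dist):
    """Variance of a distribution given as {value: probability}."""
    mean = sum(v * pr for v, pr in dist.items())
    return sum(pr * (v - mean) ** 2 for v, pr in dist.items())

# One die: X
one = {v: p for v in faces}
# Sum of two independent dice: X + Y
two = {}
for a, b in product(faces, faces):
    two[a + b] = two.get(a + b, 0) + p * p
# Doubling a single roll: X + X = 2X
double = {2 * v: p for v in faces}

print(var(one))     # 35/12
print(var(two))     # 35/6  = 2 Var(X), the independent case
print(var(double))  # 35/3  = 4 Var(X), the fully dependent case
```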

  • Thanks for clearing that up for me! When would you use $\Var(X+X)$? i.e., what do you mean by "nowhere near independent"? (2012-11-14)
  • I would write $\operatorname{Var}(X+X)$ only when that is what I meant. Suppose $X=\left.\begin{cases} 0 & \text{with probability }1/3, \\ 1 & \text{with probability }1/2, \\ 2 & \text{with probability }1/6. \end{cases}\right\}$ Then $X+X=\left.\begin{cases} 0 & \text{with probability }1/3, \\ 2 & \text{with probability }1/2, \\ 4 & \text{with probability }1/6. \end{cases}\right\}$ On the other hand, suppose $Y$ is _independent_ of $X$ and has that same distribution. Then $X+Y$ could be $0$, $1$, $2$, $3$, or $4$, each with some probability that follows from the above. (2012-11-15)
  • Specifically, $X+Y=\left.\begin{cases} 0 & \text{with probability }1/9, \\ 1 & \text{with probability }1/3, \\ 2 & \text{with probability }13/36, \\ 3 & \text{with probability }1/6, \\ 4 & \text{with probability }1/36. \end{cases}\right\}$ So the distribution of $X+X$ is quite different from the distribution of $X+Y$. (2012-11-15)