
I understand that the variance of the sum of two independent normally distributed random variables is the sum of the variances, but how does this change when the two random variables are correlated?

  • You have to add twice the covariance.
  • There is also a good (and simple) explanation on [Insight Things](http://insight-things.com/why-you-can-add-variances).

3 Answers


For any two random variables: $$\text{Var}(X+Y) =\text{Var}(X)+\text{Var}(Y)+2\text{Cov}(X,Y).$$ If the variables are uncorrelated (that is, $\text{Cov}(X,Y)=0$), then

$$\tag{1}\text{Var}(X+Y) =\text{Var}(X)+\text{Var}(Y).$$ In particular, if $X$ and $Y$ are independent, then equation $(1)$ holds.

In general $$ \text{Var}\Bigl(\,\sum_{i=1}^n X_i\,\Bigr)= \sum_{i=1}^n\text{Var}( X_i)+ 2\sum_{i< j} \text{Cov}(X_i,X_j). $$ If for each $i\ne j$, $X_i$ and $X_j$ are uncorrelated, in particular if the $X_i$ are pairwise independent (that is, $X_i$ and $X_j$ are independent whenever $i\ne j$), then $$ \text{Var}\Bigl(\,\sum_{i=1}^n X_i\,\Bigr)= \sum_{i=1}^n\text{Var}( X_i) . $$
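As a quick numerical sanity check, here is a short NumPy sketch of the two-variable identity; the construction $Y = X + \text{noise}$ is only an assumed example used to induce correlation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Construct correlated variables: Y = X + noise, so Cov(X, Y) = Var(X) = 1.
n = 1_000_000
x = rng.normal(size=n)
y = x + rng.normal(size=n)

lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2 * np.cov(x, y, bias=True)[0, 1]
print(lhs, rhs)  # both close to 5: Var(X)=1, Var(Y)=2, Cov(X,Y)=1
```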

  • I am unfamiliar with the summation over $i<j$.
  • @soo You calculate all covariances $\text{Cov}(X_i,X_j)$ with $i<j$ and sum them up. Another way to write $2\sum_{i<j}\text{Cov}(X_i,X_j)$ in this case is to write $\sum_{i\ne j}\text{Cov}(X_i,X_j)$. (The 2 is there in the first sum because in the second sum you calculate, e.g., $\text{Cov}(X_1,X_2)$ and $\text{Cov}(X_2,X_1)$, but these are equal.)
  • David, excellent explanation; the 2 in $2\,\text{Cov}(\cdot,\cdot)$ makes more sense now. Also, can you explain why you wouldn't define an upper limit $n$ in the summation over $i<j$?
  • @soo For your first comment, that's correct. I'll just let your comment be the addendum, if that's okay.
  • @soo To be rigorous, I should have written $\sum\limits_{i<j}$ or something like that. No upper limit, though: the subscript condition $i<j$ already describes the index set completely.
  • Are the statements "the covariance is zero" and "the variables are independent" equivalent?
  • @IanHaggerty No. See [this question](http://math.stackexchange.com/questions/249422/why-can-2-uncorrelated-random-variables-be-dependent) for example.
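For a concrete illustration of that last point, here is a NumPy sketch of the classic counterexample: $Y = X^2$ with $X$ standard normal is uncorrelated with $X$ (since $\text{Cov}(X, X^2) = E[X^3] = 0$) but certainly not independent of it:

```python
import numpy as np

rng = np.random.default_rng(1)

# X standard normal, Y = X**2: Cov(X, Y) = E[X^3] = 0,
# yet Y is a deterministic function of X, so they are not independent.
x = rng.normal(size=1_000_000)
y = x**2
print(np.cov(x, y)[0, 1])  # approximately 0
```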

You can also think in vector form:

$$\text{Var}(a^T X) = a^T \text{Var}(X) a$$

where $a$ is a constant vector (or matrix) and $X = (X_1, X_2, \dots, X_n)^T$ is a vector of random variables; $\text{Var}(X)$ denotes the covariance matrix of $X$.

If $a = (1, 1, \dots, 1)^T$, then $a^T X$ is the sum of all the $X_i$'s.
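A quick NumPy sketch of this identity (the $3 \times 3$ covariance matrix below is a made-up example):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical covariance matrix, just for illustration.
cov = [[1.0, 0.5, 0.2],
       [0.5, 2.0, 0.3],
       [0.2, 0.3, 1.5]]
X = rng.multivariate_normal(mean=[0.0, 0.0, 0.0], cov=cov, size=500_000)

a = np.ones(3)                   # a = (1, 1, 1)^T picks out the plain sum
sigma = np.cov(X, rowvar=False)  # sample estimate of Var(X)
print(a @ sigma @ a)             # a^T Var(X) a
print(np.var(X @ a, ddof=1))     # Var(X_1 + X_2 + X_3); agrees up to noise
```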


Let's work this out from the definitions. Say we have two random variables $x$ and $y$ with means $\mu_x$ and $\mu_y$, observed as $N$ paired samples $(x_i, y_i)$. Then the variances of $x$ and $y$ are:

$${\sigma_x}^2 = \frac{\sum_i(\mu_x-x_i)(\mu_x-x_i)}{N}$$ $${\sigma_y}^2 = \frac{\sum_i(\mu_y-y_i)(\mu_y-y_i)}{N}$$

Covariance of $x$ and $y$ is:

$${\sigma_{xy}} = \frac{\sum_i(\mu_x-x_i)(\mu_y-y_i)}{N}$$

Now, let us consider the weighted sum $p$ of $x$ and $y$, with samples $p_i = w_x x_i + w_y y_i$. Its mean is

$$\mu_p = w_x\mu_x + w_y\mu_y$$

and its variance expands as:

$$\begin{aligned}
{\sigma_p}^2 &= \frac{\sum_i(\mu_p-p_i)(\mu_p-p_i)}{N} \\
&= \frac{\sum_i(w_x\mu_x + w_y\mu_y - w_xx_i - w_yy_i)^2}{N} \\
&= \frac{\sum_i\bigl(w_x(\mu_x - x_i) + w_y(\mu_y - y_i)\bigr)^2}{N} \\
&= \frac{\sum_i\bigl(w^2_x(\mu_x - x_i)^2 + w^2_y(\mu_y - y_i)^2 + 2w_xw_y(\mu_x - x_i)(\mu_y - y_i)\bigr)}{N} \\
&= w^2_x\frac{\sum_i(\mu_x-x_i)^2}{N} + w^2_y\frac{\sum_i(\mu_y-y_i)^2}{N} + 2w_xw_y\frac{\sum_i(\mu_x-x_i)(\mu_y-y_i)}{N} \\
&= w^2_x\sigma^2_x + w^2_y\sigma^2_y + 2w_xw_y\sigma_{xy}
\end{aligned}$$
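Setting $w_x = w_y = 1$ recovers the unweighted identity from the accepted answer. A short NumPy check of the weighted version (the weights and the way $y$ is generated are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical weights and correlated samples, just to exercise the identity.
w_x, w_y = 0.3, 0.7
x = rng.normal(size=1_000_000)
y = 0.5 * x + rng.normal(size=1_000_000)  # correlated with x

p = w_x * x + w_y * y
lhs = np.var(p)
rhs = (w_x**2 * np.var(x)
       + w_y**2 * np.var(y)
       + 2 * w_x * w_y * np.cov(x, y, bias=True)[0, 1])
print(lhs, rhs)  # equal up to floating-point rounding
```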