
Wikipedia states that the standard deviation of $X-Y$ is:

$\sigma_{x-y} = \sqrt { \sigma_x^2 + \sigma_y^2 - 2\rho\sigma_x\sigma_y }$

I have a number (say 3) of correlated random variables to be subtracted from another random variable that is correlated with them.

Every pair of random variables has the same correlation $\rho$.

Can I subtract each one in turn like this:

$\sigma_{x-1} = \sqrt { \sigma_x^2 + \sigma_1^2 - 2\rho\sigma_x\sigma_1 }$

$\sigma_{x-2} = \sqrt { \sigma_{x-1}^2 + \sigma_2^2 - 2\rho\sigma_{x-1}\sigma_2 }$

$\sigma_{x-3} = \sqrt { \sigma_{x-2}^2 + \sigma_3^2 - 2\rho\sigma_{x-2}\sigma_3 }$

The application is determining the carrier-to-interference ratio of multiple co-channel interferers in an environment where each interferer can be assumed to have identical, correlated variation.
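Not part of the original question, but one way to sanity-check the stepwise rule above is to compare it against a direct Monte Carlo estimate. The sketch below assumes jointly normal variables with equal standard deviations `sigma` and a common pairwise correlation `rho`; all names and numbers are illustrative choices, not values from the post.

```python
import numpy as np

# Illustrative check: compare the stepwise subtraction rule against a direct
# Monte Carlo estimate of std(X - X1 - X2 - X3), assuming every pair of
# variables shares the same correlation rho and, for simplicity, the same
# standard deviation sigma (assumed values, not from the question).
rho, sigma, n = 0.5, 1.0, 1_000_000
cov = sigma**2 * (np.full((4, 4), rho) + (1 - rho) * np.eye(4))

rng = np.random.default_rng(0)
x, x1, x2, x3 = rng.multivariate_normal(np.zeros(4), cov, size=n).T

def sub_std(sa, sb, r):
    """Stepwise rule: std of A - B, treating corr(A, B) as r."""
    return np.sqrt(sa**2 + sb**2 - 2 * r * sa * sb)

s = sub_std(sigma, sigma, rho)   # std of X - X1
s = sub_std(s, sigma, rho)       # stepwise std of (X - X1) - X2
s = sub_std(s, sigma, rho)       # stepwise std of ((X - X1) - X2) - X3

print("stepwise formula :", s)
print("Monte Carlo      :", np.std(x - x1 - x2 - x3))
```

Whether the two numbers agree hinges on whether the partial difference (e.g. $X - X_1$) still has correlation $\rho$ with the next variable, which is exactly what the stepwise rule assumes.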

1 Answer


$\newcommand{\Var}{\operatorname{Var}} \newcommand{\Cov}{\operatorname{Cov}}$ I think it is easier to work with variance and covariance (which are easily computed from correlation and standard deviation, and vice versa). The key identity is
$$\Var(X \pm Y) = \Var(X) + \Var(Y) \pm 2 \Cov(X,Y).$$
(This is very simple to prove; it is nothing but linearity of expectation and some algebra.) Similarly, you can show
$$\Cov(X \pm Y, Z) = \Cov(X,Z) \pm \Cov(Y,Z).$$
Now by using these identities repeatedly, we could show
$$\begin{align*} \Var(X_1 - X_2 - X_3) &= \Var(X_1 - (X_2 + X_3)) \\ &= \Var(X_1) + \Var(X_2 + X_3) - 2 \Cov(X_1, X_2 + X_3) \\ &= \Var(X_1) + \Var(X_2) + \Var(X_3) + 2 \Cov(X_2, X_3) \\ &\quad - 2 \Cov(X_1, X_2) - 2 \Cov(X_1, X_3). \end{align*}$$
I'll leave you to perform the corresponding computation for 4 random variables.
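As a supplement (not part of the original answer), the expansion can be checked numerically: for a weight vector $a$, $\Var(a_1 X_1 + a_2 X_2 + a_3 X_3) = a^\top \Sigma a$, where $\Sigma$ is the covariance matrix. The covariance matrix below is an arbitrary illustrative choice.

```python
import numpy as np

# Sketch with an assumed covariance matrix: verify the three-variable expansion
#   Var(X1 - X2 - X3) = Var(X1) + Var(X2) + Var(X3)
#                       + 2 Cov(X2, X3) - 2 Cov(X1, X2) - 2 Cov(X1, X3)
# against the quadratic form a' Sigma a with weights a = (1, -1, -1).
Sigma = np.array([[4.0, 1.2, 0.8],
                  [1.2, 9.0, 2.1],
                  [0.8, 2.1, 1.0]])    # arbitrary valid covariance matrix
a = np.array([1.0, -1.0, -1.0])        # weights for X1 - X2 - X3

quadratic_form = a @ Sigma @ a
expansion = (Sigma[0, 0] + Sigma[1, 1] + Sigma[2, 2]
             + 2 * Sigma[1, 2] - 2 * Sigma[0, 1] - 2 * Sigma[0, 2])

print(quadratic_form, expansion)       # the two values agree
```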

Thanks. I get $\Var(X_1)+\Var(X_2)+\Var(X_3)+\Var(X_4)$; interestingly, the covariance terms cancel. (2011-08-02)
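For what it's worth, a small check of that observation under the question's assumptions (equal $\sigma$ for all four variables and the same pairwise correlation $\rho$); the specific numbers are arbitrary.

```python
import numpy as np

# Under equal sigma and a common pairwise correlation rho (assumed values),
# the three positive covariance terms among X2, X3, X4 offset the three
# negative ones between X1 and each of X2, X3, X4, leaving Var = 4 * sigma**2.
rho, sigma = 0.3, 2.0
Sigma = sigma**2 * (np.full((4, 4), rho) + (1 - rho) * np.eye(4))
a = np.array([1.0, -1.0, -1.0, -1.0])   # weights for X1 - X2 - X3 - X4

print(a @ Sigma @ a, 4 * sigma**2)      # both equal 4 * sigma**2 (up to floating point)
```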