I want to add two variances. Example: $\sigma_1^2=10$ and $\sigma_2^2=30.$ Will $\sigma_{total}^2 = \sigma_1^2+\sigma_2^2 =40?$
Can I add the two variances directly, or is there some rule for adding variances?
Please help
Here is a quick demonstration that independence is crucial. Let $X$ be a random variable and let $Y$ be exactly the same random variable: $Y \equiv X$ (not merely a random variable with the same distribution). Then $X$ and $Y$ are certainly dependent, not independent. Let $V(X) = V(Y) = \sigma^2.$
If we could ignore independence, then we would have equality throughout in the following relationship, and thus be able to prove that $2 = 4.$
$$ 2\sigma^2 = V(X) + V(Y) \ne V(X + Y) = V(2X) = 4V(X) = 4\sigma^2.$$
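The counterexample above is easy to check numerically. Here is a minimal simulation sketch (the seed and sample size are arbitrary choices): with $Y \equiv X$, the variance of the sum is $4\sigma^2$, not the $2\sigma^2$ you would get by naively adding variances.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0, np.sqrt(10), 1_000_000)  # Var(X) = 10
y = x                                      # Y is exactly X, not merely same-distributed

# Naive sum of variances: 2*sigma^2 = 20
print(np.var(x) + np.var(y))

# Actual variance of the sum: since X + Y = 2X, this is 4*sigma^2 = 40
print(np.var(x + y))
```

The second printed value is exactly four times the sample variance of `x`, because `x + y` is literally `2 * x` here.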
Note: Here are some important formulas about expectations and variances of random variables. You should find these (or similar) formulas in your book and look at any examples given there.
$E(a + bX + cY) = a + bE(X) + cE(Y),$ regardless of independence. In particular, setting $c=0,$ we have $E(a + bX) = a + bE(X),$
$V(a + bX + cY) = b^2V(X) + c^2V(Y),$ provided $X$ and $Y$ are uncorrelated. In particular, $V(a + bX) = b^2V(X).$ [Above I used this with $a = 0,\, b=2.$]
Also, setting $a=0,\, b=1,\, c = -1,$ we have $V(X - Y) = V(X) + V(Y),$ again provided $X$ and $Y$ are uncorrelated. Note the plus sign: the $(-1)^2$ makes the variances add even for a difference.
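To connect back to your numbers: with *independent* $X$ and $Y$ where $V(X)=10$ and $V(Y)=30$, the variances do add to $40$, for both the sum and the difference. A quick simulation sketch (means, seed, and sample size are arbitrary assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.normal(5, np.sqrt(10), n)   # Var(X) = 10
y = rng.normal(2, np.sqrt(30), n)   # Var(Y) = 30, drawn independently of X

print(np.var(x + y))            # approx 40: V(X+Y) = V(X) + V(Y)
print(np.var(x - y))            # approx 40 as well: V(X-Y) = V(X) + V(Y)
print(np.mean(1 + 2*x + 3*y))   # approx 1 + 2*5 + 3*2 = 17, independence not needed
```

Both sample variances come out near $40$, while the expectation formula holds regardless of whether the samples are independent.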