
I am relatively new to probability distributions. After understanding the concept of the expected value of a discrete random variable, I am trying to understand its variance. Here is an extract from my book:

$Var(X) = E[(X - E(X))^2] = \sum_{\text{all }x} (x - E(X))^2 p_X(x) = E(X^2) - (E(X))^2$

My question is that I don't understand this simplification. Any pointers in this regard will be highly appreciated.

  • Exactly, this gave me some woes in understanding, but Mat's answer clears everything up in a very lucid way :) 2011-01-27

3 Answers

6

Forget the middle bit. What you need to know is that $E$ is linear: $E(X + Y) = E(X) + E(Y)$ and $E(aX) = aE(X)$, where $a$ is a constant.

By definition, $Var(X) = E((X - E(X))^2)$.

Multiplying out the argument of $E$ gives you

$\begin{align} E((X - E(X))^2) ~=~& E(X^2 - 2XE(X) + (E(X))^2) \\ =~& E(X^2) - E(2XE(X)) + E((E(X))^2)\\ =~& E(X^2) - 2E(X)E(X) + (E(X))^2\\ =~& E(X^2) - (E(X))^2\end{align}$

where the third equality holds because $E(X)$ is a constant, so $E(XE(X)) = E(X)E(X)$ and $E((E(X))^2) = (E(X))^2$.
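As a sanity check, you can verify numerically that the definition and the shortcut formula agree. Here is a minimal sketch, using a fair six-sided die as an assumed example distribution:

```python
# Numerical check that Var(X) = E[(X - E(X))^2] equals E(X^2) - (E(X))^2
# for a fair six-sided die (an assumed example, not from the book).
xs = [1, 2, 3, 4, 5, 6]
p = [1 / 6] * 6  # uniform pmf

ex = sum(x * px for x, px in zip(xs, p))       # E(X) = 3.5
ex2 = sum(x**2 * px for x, px in zip(xs, p))   # E(X^2)

var_def = sum((x - ex)**2 * px for x, px in zip(xs, p))  # definition
var_short = ex2 - ex**2                                  # shortcut formula

print(ex, var_def, var_short)  # both variances equal 35/12
```

Both computations give $35/12 \approx 2.9167$, as the algebra above predicts.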

0

The first equality is the definition of variance. The second follows from the definition of expected value. To see the third, expand the square, use linearity of expectation, and simplify.

0

If you know $E(X)$, then you can figure out how far your random variable lies from the mean $E(X)$ by computing $Var(X)$. That way you can see how tightly the values of $X$ are bundled around the mean. To compute it, either use the definition of $Var(X)$ directly, or find $E(X^2)$ and subtract $(E(X))^2$.
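To illustrate the "bundled up" intuition, here is a small sketch comparing two assumed toy distributions that share the same mean but differ in spread; the one whose values sit farther from the mean has the larger variance:

```python
# Variance measures how tightly values bundle around the mean.
# Two assumed toy pmfs with the same mean E(X) = 5 but different spread.
def mean_var(xs, p):
    """Return (E(X), Var(X)) via the shortcut Var(X) = E(X^2) - (E(X))^2."""
    ex = sum(x * px for x, px in zip(xs, p))
    var = sum(x**2 * px for x, px in zip(xs, p)) - ex**2
    return ex, var

tight = mean_var([4, 5, 6], [0.25, 0.5, 0.25])    # values close to the mean
spread = mean_var([0, 5, 10], [0.25, 0.5, 0.25])  # same mean, values farther out

print(tight)   # mean 5, small variance
print(spread)  # mean 5, large variance
```

Both means are 5, but the variances come out 0.5 and 12.5 respectively.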