
I've had a look for intuitive explanations of the variance of an RV (e.g. Intuitive explanation of variance and moment in Probability) but unfortunately for me, I still don't feel comfortable with the concept. Why would you opt to use variance over the standard deviation (which usefully is in the same units as the expectation)?

Also, if the expectation,

$E(X) = \sum_{i=0}^n i\, P(X = i)$,

what is $E(X^2)$? Is it simply

$E(X^2) = \sum_{i=0}^n i\, P(X^2 = i)$?

  • For independent random variables, "the variance of the sum is the sum of the variances," while "the standard deviation of the sum is the square root of the sum of the squares of the standard deviations"; the former is briefer, though they say the same thing. (2011-04-06)

3 Answers


The variance is easier to deal with in intermediate computations, because it doesn't have a square root. For example, if $X$ and $Y$ are independent, then $Var(X+Y) = Var(X) + Var(Y)$, which is a simpler formula than $SD(X+Y) = \sqrt{SD(X)^2 + SD(Y)^2}$. Basically, if you want to work in terms of standard deviation all the time then you end up doing a lot of squaring and square-rooting.
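If it helps to see this numerically, here is a minimal simulation sketch; NumPy and the particular distributions are my own arbitrary choices:

```python
# Numerical check that Var(X + Y) = Var(X) + Var(Y) for independent X, Y.
# The exponential/uniform choices here are arbitrary illustrations.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1_000_000)   # independent draws for X
y = rng.uniform(low=0.0, high=5.0, size=1_000_000)  # independent draws for Y

print(np.var(x + y))          # Var(X + Y), estimated from the sample
print(np.var(x) + np.var(y))  # Var(X) + Var(Y); should match closely
```

Both printed values should be close to $4 + 25/12 \approx 6.08$, the exact sum of the two variances.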

Your claimed formula for $E(X^2)$ is almost true -- you're thinking of $X^2$ as a new random variable unrelated to $X$. To be strictly correct you'd need $E(X^2) = \sum_{i=0}^{n^2} i P(X^2 = i)$, since if $X$ can take values from $0$ to $n$ then $X^2$ can take values as large as $n^2$. But then you're faced with the problem of getting $P(X^2 = i)$. In practice one uses picakhu's formula $E(X^2) = \sum_{i=0}^n i^2 P(X=i)$.
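To check that the corrected formula and picakhu's formula agree, here is a small sketch using a fair six-sided die as a hypothetical stand-in for $X$ (only the standard library is needed):

```python
# Both formulas for E(X^2), evaluated exactly for a fair six-sided die
# (a hypothetical example; any discrete distribution would do).
from fractions import Fraction

pmf = {i: Fraction(1, 6) for i in range(1, 7)}  # P(X = i)

# picakhu's formula: sum of i^2 * P(X = i)
e_x2_direct = sum(i**2 * p for i, p in pmf.items())

# The question's formula: build the pmf of X^2 first, then sum j * P(X^2 = j)
pmf_sq = {}
for i, p in pmf.items():
    pmf_sq[i**2] = pmf_sq.get(i**2, 0) + p  # P(X^2 = j) for j = i^2
e_x2_via_sq = sum(j * p for j, p in pmf_sq.items())

print(e_x2_direct, e_x2_via_sq)  # both print 91/6
```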

  • @atem, mind the "flaw of averages": while the mean and variance completely characterize Gaussian RVs, these statistics are not sufficient for (1) multimodal distributions (even one-dimensional ones), or (2) power-law RVs as opposed to exponential ones. In the extreme case, the Cauchy distribution doesn't even possess a convergent mean. (2012-08-30)

Variance and standard deviation are the same, in the sense that if you know one you know the other. The importance of variance is that if $X,Y$ are independent then $V[X+Y] = V[X]+V[Y],$ i.e. variance is additive. If you substitute the standard deviation you'd get a more complicated formula.

You might ask why people do not use the median instead of the expectation, or one of $E[|X-E[X]|]$ and $E[|X-M[X]|]$ (where $M[X]$ denotes the median) instead of the standard deviation. The reason is that the expectation and variance enjoy many nice properties, like additivity, and are much easier to work with analytically.
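As a concrete illustration of that difference, here is a minimal sketch (again assuming NumPy, with arbitrarily chosen standard normals) showing that variance adds across independent variables while the mean absolute deviation $E[|X-E[X]|]$ does not:

```python
# Variance adds across independent variables; mean absolute deviation (MAD)
# generally does not. Standard normals are an arbitrary illustrative choice.
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)
y = rng.standard_normal(1_000_000)

def mad(a):
    return np.mean(np.abs(a - a.mean()))  # estimate of E|X - E[X]|

print(np.var(x + y), np.var(x) + np.var(y))  # ~2.0 vs ~2.0: additive
print(mad(x + y), mad(x) + mad(y))           # ~1.13 vs ~1.60: not additive
```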

Your formula for $E[X^2]$ is valid (provided the range of values encompasses all possible values of $X^2$, which is $0$ to $n^2$ in your case), though a more useful one is $E[X^2] = \sum_i \Pr[X=i] i^2.$

  • I think it's interesting that we gave essentially the same answer. (2011-04-05)

Variance is a natural parameter for the normal distribution.
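For context, the usual parameterization writes the density directly in terms of the mean $\mu$ and the variance $\sigma^2$:

$$f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right).$$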