
Chebyshev's inequality is $P(|X-E(X)|>\varepsilon)\leq \frac{\operatorname{Var}(X)}{\varepsilon^2}.$

I saw a proof which goes like this:

$ \begin{align} \operatorname{Var}(X) &= E((X-E(X))^2) \\ &= \sum_{x\in S}(x-E(X))^2\cdot P(X=x) \\ &\geq \sum_{|x-E(X)|>\varepsilon}(x-E(X))^2\cdot P(X=x) \\ &> \sum_{|x-E(X)|>\varepsilon}\varepsilon^2\cdot P(X=x) \\ &= \varepsilon^2 P(|X-E(X)|>\varepsilon) \end{align} $

from which the inequality should follow by dividing by $\varepsilon^2$.
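For concreteness, here is a small sketch (a fair six-sided die with $\varepsilon = 2$, values chosen only for illustration) that evaluates each quantity in this chain:

```python
# Minimal sketch: a fair six-sided die and eps = 2, chosen only for
# illustration, to evaluate each quantity in the chain above.
from fractions import Fraction

support = range(1, 7)                     # outcomes of a fair die
p = {x: Fraction(1, 6) for x in support}  # P(X = x)

EX = sum(x * p[x] for x in support)                   # E(X) = 7/2
var = sum((x - EX) ** 2 * p[x] for x in support)      # Var(X) = 35/12

eps = Fraction(2)
tail = [x for x in support if abs(x - EX) > eps]      # outcomes with |x - E(X)| > eps

restricted = sum((x - EX) ** 2 * p[x] for x in tail)  # sum restricted to the tail = 25/12
bound = eps ** 2 * sum(p[x] for x in tail)            # eps^2 * P(|X - E(X)| > eps) = 4/3

print(EX, var, restricted, bound)   # 7/2 35/12 25/12 4/3
assert var >= restricted > bound    # matches the chain of inequalities
```

With these numbers the restricted sum ($25/12$) is strictly larger than $\varepsilon^2 P(|X-E(X)|>\varepsilon) = 4/3$, so here even the strict step holds.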

What I don't understand here is the 4th step:

$\sum_{|x-E(X)|>\varepsilon}(x-E(X))^2\cdot P(X=x) > \sum_{|x-E(X)|>\varepsilon}\varepsilon^2\cdot P(X=x) $

Doesn't this imply $P(|X-E(X)|>\varepsilon)< \frac{\operatorname{Var}(X)}{\varepsilon^2}$ rather than $P(|X-E(X)|>\varepsilon)\leq \frac{\operatorname{Var}(X)}{\varepsilon^2}$?

Why is this correct?

  • @AndréNicolas: I understand, thank you! (2012-01-14)

1 Answer


Note that $P(|X-E(X)|>\varepsilon)< \frac{\operatorname{Var}(X)}{\varepsilon^2}$ is sometimes false. For example, let $X=a$ with probability $1$: the variance of $X$ is $0$, but no probability can be $<0$. If, however, the variance is non-zero, your reasoning is correct.
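Spelled out, this constant case gives $E(X)=a$, $\operatorname{Var}(X)=0$ and $P(|X-E(X)|>\varepsilon)=0$, so $P(|X-E(X)|>\varepsilon)\le \frac{\operatorname{Var}(X)}{\varepsilon^2}$ holds with equality ($0\le 0$), while the strict version would require $0<0$.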

The usual version of Chebyshev's inequality is
$P(|X-E(X)|\ge \varepsilon) \le \frac{\operatorname{Var}(X)}{\varepsilon^2}.$
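For the record, running the same computation with non-strict inequalities throughout gives this version directly:

$ \begin{align} \operatorname{Var}(X) &= \sum_{x\in S}(x-E(X))^2\cdot P(X=x) \\ &\geq \sum_{|x-E(X)|\ge\varepsilon}(x-E(X))^2\cdot P(X=x) \\ &\geq \sum_{|x-E(X)|\ge\varepsilon}\varepsilon^2\cdot P(X=x) \\ &= \varepsilon^2 P(|X-E(X)|\ge\varepsilon). \end{align} $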

  • Fine - I was really responding to "So this Wikibooks entry is in general wrong, then?", to which the answer is no. (2012-01-14)