
I am curious how to get $P(a < X < b)$ if I know the expected value $E(X)$ and the variance $D^{2}(X)$.

Off the top of my head I thought it was $(b-a)/(D^{2}(X))^{1/2}$, but that can easily be greater than 1.

So what could it be? Is it even possible to get it from these two quantities alone, and if not, are there additional assumptions under which it becomes possible?

With regards.

2 Answers


You cannot, without knowing more about the distribution of $X$.

Suppose for example that $X_1$ has a standard normal distribution, and $X_2$ has a Bernoulli distribution (so that $P(X_2 = 1) = P(X_2 = -1) = 1/2$). Then $X_1, X_2$ both have mean zero and variance 1, but $P(a < X_1 < b)$ is completely different from $P(a < X_2 < b)$. (For instance, if $a=-1$, $b=1$, then $P(a < X_1 < b) \approx 0.683$ but $P(a < X_2 < b) = 0$.)
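The two probabilities in this counterexample can be checked numerically. The sketch below (standard library only) computes $P(-1 < X_1 < 1)$ for the standard normal via the error function, $\Phi(x) = \tfrac{1}{2}\left(1 + \operatorname{erf}(x/\sqrt{2})\right)$, and notes that the two-point variable $X_2$ puts no mass strictly inside $(-1, 1)$:

```python
import math

a, b = -1.0, 1.0

# Standard normal CDF via the error function.
def phi(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# X1 ~ N(0, 1): P(a < X1 < b) = Phi(b) - Phi(a) ~= 0.683.
p_normal = phi(b) - phi(a)

# X2 takes only the values -1 and +1 (each with probability 1/2),
# so the OPEN interval (-1, 1) has probability zero under X2.
p_bernoulli = 0.0

print(round(p_normal, 3))  # ~0.683
print(p_bernoulli)         # 0.0
```

Both variables have mean 0 and variance 1, yet the interval probabilities differ by about 0.683, which is the whole point of the counterexample.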

Intuitively, the mean and variance contain too little information to determine a distribution.


From knowledge of the mean and variance of $X$ alone, one cannot in general compute $P(a < X < b)$. One can obtain certain inequalities for $P(a < X < b)$, for example by using the Chebyshev Inequality. In most practical situations, however, the Chebyshev Inequality is too weak to be useful.
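As a sketch of the Chebyshev route: for an interval symmetric about the mean, $P(\mu - k\sigma < X < \mu + k\sigma) \geq 1 - 1/k^{2}$. The helper name below is illustrative, not standard:

```python
import math

def chebyshev_lower_bound(mu, var, a, b):
    """Lower bound on P(a < X < b) via Chebyshev's inequality.

    Assumes the interval (a, b) is symmetric about mu, i.e.
    b - mu == mu - a > 0, so (a, b) = (mu - k*sigma, mu + k*sigma).
    """
    sigma = math.sqrt(var)
    k = (b - mu) / sigma
    return max(0.0, 1.0 - 1.0 / k**2)

# Mean 0, variance 1, interval (-2, 2): the bound gives 1 - 1/4 = 0.75,
# whereas for a normal X the true probability is about 0.954 --
# illustrating how weak the bound can be.
print(chebyshev_lower_bound(0.0, 1.0, -2.0, 2.0))  # 0.75
```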

Moderately often in practice, much more is known about a random variable than just its mean and variance. We may, for example, know that the distribution of $X$ is reasonably well-approximated by the normal distribution. Then from the mean and variance we can obtain good approximations to $P(a < X < b)$.
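Under that normality assumption, the approximation is $P(a < X < b) \approx \Phi\!\left(\frac{b-\mu}{\sigma}\right) - \Phi\!\left(\frac{a-\mu}{\sigma}\right)$, which can be sketched as follows (the function name and example numbers are illustrative):

```python
import math

def normal_approx(mu, var, a, b):
    """Approximate P(a < X < b), ASSUMING X is roughly normal(mu, var)."""
    sigma = math.sqrt(var)
    # Standard normal CDF via the error function.
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return phi((b - mu) / sigma) - phi((a - mu) / sigma)

# Example: mean 10, variance 4 (sigma = 2); the interval (8, 12) is
# mu +/- 1 sigma, so the approximation gives about 0.683.
print(round(normal_approx(10.0, 4.0, 8.0, 12.0), 3))  # ~0.683
```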