
From a book, I found these two questions, which I have not understood.

(1) Suppose $X$ is a discrete random variable with probability function

| $x$ | 0 | 1 | 2 | 3 |
|---|---|---|---|---|
| $f(x)$ | 27/64 | 27/64 | 9/64 | 1/64 |

Find the probability $P[\mu - 2\sigma < X < \mu + 2\sigma]$. Compare this probability with the bound you get from Chebyshev's inequality.

In the book's solution they used the $\leq$ sign, i.e. $P[\mu - 2\sigma \leq X \leq \mu + 2\sigma]$, but the question asks for $P[\mu - 2\sigma < X < \mu + 2\sigma]$. Why do they use the weak inequality? In the discrete case, doesn't including the endpoints make a big difference?
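As a sanity check (my own computation, not from the book), one can work out $\mu$, $\sigma$, and both versions of the probability exactly. For this particular distribution, which is $\text{Binomial}(3, 1/4)$, neither endpoint $\mu \pm 2\sigma$ lands on a support point, so the strict and weak inequalities give the same answer:

```python
from fractions import Fraction as F

# pmf from the question: f(0)=27/64, f(1)=27/64, f(2)=9/64, f(3)=1/64
pmf = {0: F(27, 64), 1: F(27, 64), 2: F(9, 64), 3: F(1, 64)}

mu = sum(x * p for x, p in pmf.items())               # E[X] = 3/4
var = sum((x - mu) ** 2 * p for x, p in pmf.items())  # Var[X] = 9/16
sigma = F(3, 4)  # sqrt(9/16), kept exact by hand

lo, hi = mu - 2 * sigma, mu + 2 * sigma  # interval (-3/4, 9/4)
p_open = sum(p for x, p in pmf.items() if lo < x < hi)
p_closed = sum(p for x, p in pmf.items() if lo <= x <= hi)

print(p_open, p_closed)  # both 63/64: no support point sits on an endpoint
print(1 - F(1, 2 ** 2))  # Chebyshev lower bound for K = 2: 3/4
```

So here $P = 63/64 \approx 0.984$, comfortably above Chebyshev's guarantee of $3/4$, whichever inequality sign is used.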

Again, when applying Chebyshev's inequality they used the $\leq$ sign, but in the formula I know, it is $P[\mu - K\sigma < X < \mu + K\sigma]$.

(2) It is found from Markov's inequality that for any non-negative random variable $X$ whose mean is $1$, the maximum possible value of $P[X \leq 100]$ is $0.01$.

But the formula for Markov's inequality is $$P[X \geq k] \leq E[X]/k.$$ So how can $P[X \leq 100]$ be $0.01$? By my reasoning with the formula,

$$P[X \geq 100] \leq 1/100 \ \ \Rightarrow \ \ 1 - P[X < 100] \leq 1/100 \ \ \Rightarrow \ \ P[X < 100] \geq 1 - (1/100) = 0.99.$$

If this reasoning is right, how does one conclude that $P[X \leq 100]$ is $0.01$?
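A quick check of my suspicion that the book meant $P[X \geq 100]$ (this extremal distribution is my own construction, not from the book): taking $X = 0$ with probability $99/100$ and $X = 100$ with probability $1/100$ gives a non-negative variable with mean $1$ for which Markov's bound $P[X \geq 100] \leq 0.01$ is attained with equality, while $P[X \leq 100] = 1$:

```python
from fractions import Fraction as F

# Extremal example (my construction): X = 0 w.p. 99/100, X = 100 w.p. 1/100
pmf = {0: F(99, 100), 100: F(1, 100)}

mean = sum(x * p for x, p in pmf.items())
p_ge_100 = sum(p for x, p in pmf.items() if x >= 100)
p_le_100 = sum(p for x, p in pmf.items() if x <= 100)

print(mean)      # 1
print(p_ge_100)  # 1/100 -- attains Markov's bound E[X]/100 = 0.01
print(p_le_100)  # 1 -- P[X <= 100] can be as large as 1
```

So $0.01$ can only be the maximum of $P[X \geq 100]$, not of $P[X \leq 100]$, which supports the reading that the book's statement has the inequality reversed.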
