
For an upper bound on the probability $P\{X \ge t\}$, we have Markov's inequality, the Chernoff bound, and moment bounds.

But how can we deal with a lower bound? Are there any similar inequalities for lower-bound analysis?

  • [Chebyshev’s inequality](http://en.wikipedia.org/wiki/Chebyshev%27s_inequality)?
  • A discrete random variable with probability masses $\frac{1}{2}$ at $\mu \pm \sigma$ has mean $\mu$ and variance $\sigma^2$, so $P\{X > t\} = 0$ for all $t > \mu + \sigma$ for this variable. Thus it is going to be difficult to get a lower bound better than $0$ that applies to random variables _in general_, though with additional conditions one might get a lower bound.
  • @J.D. Chebyshev's inequality is still an upper bound on a probability (for a non-negative variable). It is derived from Markov's inequality, so it's likely the poster already knew of this bound. One can expand the absolute value usually used in it into two events, then use complements and multiply by $-1$ to get a statement with a lower-bounded probability, but the result won't be a useful bound, as it will always involve the probability of the other "half-event" split out of the absolute value.

1 Answer


The indicator function satisfies $I_{x \ge t} \ge 1 - (x-a)^2/(t-a)^2$ for every $a > t$: the right-hand side is at most $1$, and it is nonpositive for $x < t$ since then $|x-a| > |t-a|$. Taking expectations and using $E[(X-a)^2] = \sigma^2 + (a-\mu)^2$ gives $$P(X \ge t) \ge 1 - \frac{\sigma^2+(a-\mu)^2}{(t-a)^2}.$$ If $\mu > t$, the right-hand side is maximized at $a = \mu + \sigma^2/(\mu-t)$, which yields $$P(X \ge t) \ge \frac{(\mu-t)^2}{(\mu-t)^2+\sigma^2} \quad \text{for}\ t < \mu.$$
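As a quick sanity check (my own illustration, not part of the original answer), one can compare this lower bound against an exact tail probability. For $X \sim \text{Exponential}(1)$ we have $\mu = 1$, $\sigma^2 = 1$, and the exact tail is $P(X \ge t) = e^{-t}$, so the bound $(\mu-t)^2/((\mu-t)^2+\sigma^2)$ can be verified directly for $t < \mu$:

```python
import math

# X ~ Exponential(1): mean mu = 1, variance var = 1,
# exact tail P(X >= t) = exp(-t).
mu, var = 1.0, 1.0

for t in [0.0, 0.25, 0.5, 0.75, 0.9]:
    exact = math.exp(-t)                       # exact tail probability
    bound = (mu - t)**2 / ((mu - t)**2 + var)  # one-sided lower bound
    assert exact >= bound                      # bound must hold for t < mu
    print(f"t={t:4.2f}  P(X>=t)={exact:.4f}  bound={bound:.4f}")
```

For example, at $t = 0.5$ the exact tail is $e^{-0.5} \approx 0.6065$, while the bound gives $0.25/1.25 = 0.2$, so the inequality holds with room to spare; the bound tightens as $\sigma^2$ shrinks relative to $(\mu - t)^2$.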