
Say I have a random variable $X$ and I am interested in finding a good bound on $\mathbb{P}(X \geq a)$ for $a \in \mathbb{R}$. I could do this through the one-sided Chebyshev inequality, or I could rely on the moment generating function and use the Chernoff bound. Assuming you have the $s$ that minimizes the Chernoff expression $e^{-sa} M_X(s)$, my question is: which would you expect to be tighter to the actual probability $\mathbb{P}(X \geq a)$ for a given $a$? It seems like it would be Chernoff, since the MGF contains information about all the moments, while Chebyshev uses only the first two.

  • Chebyshev's inequality is more general; consequently, weaker. Chebyshev's inequality for the normal distribution is very poor, for instance. So definitely Chernoff bounds if you need to be accurate. (2017-02-02)
  • ... or an explicit computation if that is viable, of course. (2017-02-02)
  • Cheers. Quick question: is there a way to recover the CDF (and hence the probability) directly from the MGF? (2017-02-02)
  • That is an instance of the moment problem (have a look at Wikipedia), which can be solved through the (inverse) Laplace and Fourier transforms. (2017-02-02)
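The comparison above can be checked numerically. As a concrete example (my choice, not from the question), take $X \sim N(0,1)$: the one-sided Chebyshev (Cantelli) bound gives $\mathbb{P}(X \geq a) \leq 1/(1+a^2)$, while optimizing the Chernoff bound over $s$ (here the minimizer is $s = a$, since $M_X(s) = e^{s^2/2}$) gives $e^{-a^2/2}$. A minimal sketch comparing both against the exact tail:

```python
import math

def exact_tail(a):
    # Exact P(X >= a) for X ~ N(0,1), via the complementary error function
    return 0.5 * math.erfc(a / math.sqrt(2))

def cantelli_bound(a):
    # One-sided Chebyshev (Cantelli): P(X >= mu + a) <= sigma^2 / (sigma^2 + a^2),
    # with mu = 0 and sigma = 1 here, for a > 0.
    return 1.0 / (1.0 + a * a)

def chernoff_bound(a):
    # Chernoff: P(X >= a) <= inf_s e^{-s a} M_X(s).
    # For N(0,1), M_X(s) = e^{s^2/2}, minimized at s = a, giving e^{-a^2/2}.
    return math.exp(-a * a / 2.0)

for a in [1.0, 2.0, 3.0]:
    print(f"a={a}: exact={exact_tail(a):.5f}, "
          f"Cantelli={cantelli_bound(a):.5f}, "
          f"Chernoff={chernoff_bound(a):.5f}")
```

For this distribution the Chernoff bound is tighter once $a$ is moderately large, since it decays exponentially while Cantelli decays only quadratically; interestingly, for small $a$ (e.g. $a = 1$) Cantelli is actually the smaller of the two, so "Chernoff is always tighter" is not quite right near the mean.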

0 Answers