
I'm working on an exercise from Shorack's _Probability for Statisticians_, Exercise 4.6 (Wellner):

Suppose $T \simeq$ Binomial$(n,p)$. Then use the inequality $$\mu(|X| \ge \lambda) \le \frac{\textrm{E}\,g(X)}{g(\lambda)} \quad \textrm{for all } \lambda > 0$$ with $g(x) = e^{rx}$ and $r > 0$ to show that $$P\!\left(\frac{T}{n} \ge p\epsilon\right) \le e^{-np\, h(\epsilon)},$$ where $h(\epsilon) \equiv \epsilon(\log(\epsilon)-1)+1$.

Using the binomial theorem, I can show that $$\textrm{E}\,g(T) = \sum_{t=0}^n e^{rt} {{n}\choose{t}} p^t (1-p)^{n-t} = (e^r p + 1-p)^n. $$ Using $\lambda = np\epsilon$, I therefore have $$P(T\ge np\epsilon) \le \exp\bigl(n\log(e^r p+1-p) - rnp\epsilon\bigr),$$ and it only remains to prove that $$n\log(e^r p+1-p) - rnp\epsilon \le -np \bigl(\epsilon(\log(\epsilon) -1) +1\bigr).$$
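As a quick sanity check on the moment generating function step (not part of the proof), the closed form can be compared with the direct sum; the values of $n$, $p$, and $r$ below are arbitrary examples:

```r
# Compare the direct sum for E[e^{rT}] with the closed form (e^r*p + 1 - p)^n.
n <- 20; p <- 0.3; r <- 0.7              # arbitrary example values
k <- 0:n
direct_sum  <- sum(exp(r * k) * dbinom(k, size = n, prob = p))
closed_form <- (exp(r) * p + 1 - p)^n
all.equal(direct_sum, closed_form)       # TRUE up to floating-point tolerance
```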

However, I've written R code to check this last inequality for particular values of $r$, $p$, and $\epsilon$, and it does not always hold; a sketch of the kind of check I mean is below.
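A minimal sketch of such a check, for one arbitrary choice of $n$, $p$, $\epsilon$, and $r$ (the values are examples only):

```r
# Check whether  n*log(e^r*p + 1 - p) - r*n*p*eps  <=  -n*p*h(eps)
# for one particular choice of n, p, eps, r.
n <- 50; p <- 0.2; eps <- 2; r <- 0.1    # arbitrary example values
h   <- function(e) e * (log(e) - 1) + 1
lhs <- n * log(exp(r) * p + 1 - p) - r * n * p * eps
rhs <- -n * p * h(eps)
lhs <= rhs                               # FALSE for this choice of r
```

For these values the left side is about $-0.96$ while the right side is about $-3.86$, so the displayed inequality fails.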

Can anybody give me tips or perhaps an alternative way to do this?

  • This kind of result is commonly needed in information theory and communication theory. I do not know the Wellner inequality, but a common way of approaching questions like this is to use the [Chernoff bound](http://en.wikipedia.org/wiki/Chernoff_bound), a simple version of which says that for all $\lambda \geq 0$, $$P\{X \geq a\} \leq E[\exp(\lambda(X-a))] = \exp(-\lambda a)M(\lambda),$$ where $M(\lambda)$ is the moment-generating function of $X$. Look at Chapter 2 of _Principles of Communication Engineering_ by Wozencraft and Jacobs (Wiley 1964) to see how to get to bounds of the form you seek. (A small numerical sketch along these lines appears after these comments.) (2012-10-01)
  • @DilipSarwate Thank you very much! Will look into that. :) (2012-10-01)
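Following the Chernoff-bound suggestion in the comment above, here is a rough numerical sketch (the values are arbitrary examples, and the minimization over $\lambda$ is done numerically rather than in closed form) comparing the minimized exponential bound with the target exponent $-np\,h(\epsilon)$:

```r
# Chernoff-style bound for T ~ Binomial(n, p):  P(T >= a) <= exp(-lambda*a) * M(lambda),
# with M(lambda) = (p*e^lambda + 1 - p)^n and a = n*p*eps.  Minimize the log of the
# bound over lambda and compare with the target exponent -n*p*h(eps).
n <- 50; p <- 0.2; eps <- 2                       # arbitrary example values
h <- function(e) e * (log(e) - 1) + 1
log_bound <- function(lambda) n * log(p * exp(lambda) + 1 - p) - lambda * n * p * eps
opt <- optimize(log_bound, interval = c(0, 10))   # numerical minimum over lambda in [0, 10]
opt$objective      # minimized exponent (about -5.2 for these values)
-n * p * h(eps)    # target exponent -n*p*h(eps) (about -3.9 for these values)
```

For these example values the minimized exponent lies below $-np\,h(\epsilon)$, which is consistent with the bound stated in the exercise.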

1 Answer