
Let $x_1,\dots,x_n$ be independent, identically distributed random variables, each with mean $M$ and variance $V$. Let $\bar{x}=\frac{1}{n}\sum_{i=1}^{n}x_i.$

Then we can say that:

$\mathbb{P}(|\bar{x}-M|>\varepsilon)\leq\frac{V}{n\varepsilon^2}. $
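This inequality (Chebyshev's bound for the sample mean) can be checked numerically. The sketch below is illustrative only: the exponential distribution, sample size, and tolerance are my own choices, not part of the question.

```python
import random

# Monte Carlo check of P(|xbar - M| > eps) <= V / (n * eps^2).
# Illustrative setup: i.i.d. Exponential(1) samples, so M = 1 and V = 1.
random.seed(0)
n, eps, trials = 50, 0.5, 20000
M, V = 1.0, 1.0

exceed = 0
for _ in range(trials):
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    if abs(xbar - M) > eps:
        exceed += 1

empirical = exceed / trials
chebyshev = V / (n * eps ** 2)
# The empirical tail probability should sit (far) below the bound.
assert empirical <= chebyshev
```

For this setup the bound is $0.08$ while the empirical frequency is orders of magnitude smaller, which already hints that the bound is far from tight for light-tailed distributions.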

Is this bound sharp under these conditions? If it is not, can you give me a hint toward an example showing that it can't be tightened?

Thanks!

  • One way to prove the inequality is to write $\Pr(|\bar X - M|>\varepsilon) = \Pr((\bar X - M)^2>\varepsilon^2)$ and apply Markov's inequality, which says that if $\Pr(Y\ge 0)=1$ then $\Pr(Y>y)\le \mathbb{E}(Y)/y$. Markov's inequality itself is sharp in that there is a distribution for which equality holds, namely any distribution supported on a set of the form $\{0,y\}$ where $y>0$. However, I think S.B. may be right when he says the central limit theorem may show the inequality is not sharp. – 2012-10-01
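The sharpness claim in the comment above can be verified directly: for a distribution supported on $\{0,y\}$, Markov's inequality holds with equality in the form $\Pr(Y\ge y)=\mathbb{E}(Y)/y$ (note the non-strict inequality; with $\Pr(Y>y)$ the probability is $0$). A minimal check, with $y$ and the probability $p$ chosen arbitrarily:

```python
from fractions import Fraction

# Two-point distribution: Y = y with probability p, Y = 0 otherwise.
# Then E[Y] = p*y, so the Markov bound E[Y]/y equals P(Y >= y) = p exactly.
y = Fraction(3)
p = Fraction(1, 5)

EY = p * y
markov_bound = EY / y   # = p
prob_ge_y = p           # P(Y >= y)
assert markov_bound == prob_ge_y
```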

1 Answer


Define $V$ as in the question, and assume $V<\infty$. Then as $n\to\infty$, $ \Pr\left(a < \frac{\sqrt{n}(\bar x-M)}{\sqrt{V}} < b\right) \to \frac{1}{\sqrt{2\pi}}\int_a^b e^{-x^2/2} \, dx = \int_a^b \varphi(x)\,dx. $ So $ \Pr\left(|\bar x-M|>\varepsilon \right) \cong 2\int_{\varepsilon\sqrt{n/V}}^\infty \varphi(x)\,dx. $

So now I'd see if I can prove that that last integral is less than $\dfrac{V}{2n\varepsilon^2}$.
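A rough numerical comparison (not a proof, since $\cong$ is only an asymptotic statement) of the Gaussian tail $2\int_{\varepsilon\sqrt{n/V}}^\infty \varphi(x)\,dx = 2\bigl(1-\Phi(\varepsilon\sqrt{n/V})\bigr)$ against the Chebyshev bound $V/(n\varepsilon^2)$; the values of $V$, $\varepsilon$, and $n$ below are illustrative choices:

```python
import math

def normal_upper_tail(t):
    # 1 - Phi(t), computed via the complementary error function
    return math.erfc(t / math.sqrt(2)) / 2

V, eps = 1.0, 0.5
rows = []
for n in (10, 50, 200):
    t = eps * math.sqrt(n / V)
    clt_tail = 2 * normal_upper_tail(t)       # CLT approximation of the tail
    chebyshev = V / (n * eps ** 2)            # Chebyshev bound
    rows.append((n, clt_tail, chebyshev))
    assert clt_tail < chebyshev / 2           # well below even half the bound
```

The Gaussian tail decays like $e^{-n\varepsilon^2/(2V)}$ while the Chebyshev bound decays only like $1/n$, so for large $n$ the gap is enormous, which is consistent with the CLT intuition in the comment above.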

Later note: Apparently it's somewhat more delicate than what the last two sentences assume. Maybe I'll be back later…

  • I think I was hoping "$\cong$" might get me past that, but I see your point. We need to get subtler than this. – 2012-10-03