
Let $x_1,..., x_n$ be independent identically distributed variables with means $M$ and variances $V$. Let $$\bar{x}=\frac{1}{n}\sum_{i=1}^{n}x_i.$$

Then we can say that:

$$\mathbb{P}(|\bar{x}-M|>\varepsilon)\leq\frac{V}{n\varepsilon^2}. $$

Is this bound sharp under these conditions? If so, can you give me a hint toward an example showing that it can't be tightened?

Thanks!
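As a quick numerical sanity check (my addition, not part of the question), one can compare the empirical tail probability of $\bar{x}$ against the Chebyshev bound $V/(n\varepsilon^2)$ for a concrete distribution. Here I pick Uniform$(0,1)$ purely for illustration, so $M=1/2$ and $V=1/12$; the function name and parameters are mine:

```python
import random

def chebyshev_check(n, eps, trials=20000, seed=0):
    """Empirically estimate P(|xbar - M| > eps) for n iid Uniform(0,1)
    samples, and return it alongside the Chebyshev bound V/(n*eps^2)."""
    rng = random.Random(seed)
    M, V = 0.5, 1.0 / 12.0           # mean and variance of Uniform(0,1)
    exceed = 0
    for _ in range(trials):
        xbar = sum(rng.random() for _ in range(n)) / n
        if abs(xbar - M) > eps:
            exceed += 1
    empirical = exceed / trials
    bound = V / (n * eps ** 2)
    return empirical, bound

emp, bnd = chebyshev_check(n=50, eps=0.1)
# Chebyshev guarantees emp <= bnd; for this light-tailed distribution
# the empirical probability is far below the bound, which is exactly
# why one suspects the bound is not sharp for any *fixed* distribution.
```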

  • 1
    It really depends on the distribution of $x_i$'s. For example, if they are sub-Gaussian random variables you can get bounds that decay exponentially in $n$ using Chernoff's inequality. The bound you have here is a direct application of Markov's inequality.2012-10-01
  • 0
    But if nothing is known about the distribution of $x_i$? I think that this bound cannot be sharpened but I can't come up with an example.2012-10-01
  • 0
    The problem is that defining the "sharpest bound" is a little vague. Assuming that by sharpest you mean fastest decay rate in terms of $n$, you want a bound that holds uniformly for all distributions (given a fixed mean and variance). I don't know whether or not your claim holds in that sense, but I think Central Limit Theorem may contradict the claim.2012-10-01
  • 0
    @S.B. : "Sharp" in this context would mean the _slowest_ possible decay rate of the variances, and you need just _one_ distribution in which the slowest rate is achieved. In other words, you would want the variance to be exactly equal to $V/(n\varepsilon^2)$. That would prove that there is no tighter bound. There's nothing vague about that.2012-10-01
  • 0
    One way to prove the inequality is to say $\Pr(|\bar X - M|>\varepsilon)$ $=\Pr((\bar X - M)^2>\varepsilon^2)$ and apply Markov's inequality, which says that if $\Pr(Y\ge 0)=1$ then $\Pr(Y\ge y)\le \mathbb{E}(Y)/y$. Markov's inequality itself is sharp in that there is a distribution for which equality holds, namely any distribution supported on a set of the form $\{0,y\}$ where $y>0$. However, I think S.B. may be right when he says the central limit theorem may show the inequality is not sharp.2012-10-01
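The sharpness of Markov's inequality mentioned in the last comment is easy to verify symbolically: for a distribution supported on $\{0,y\}$ with $\Pr(Y=y)=q$, we have $\mathbb{E}(Y)/y = qy/y = q = \Pr(Y\ge y)$, so equality holds. A minimal sketch with arbitrarily chosen values of $q$ and $y$ (exact arithmetic via `fractions` to avoid floating-point noise):

```python
# Equality in Markov's inequality for a two-point distribution
# supported on {0, y}: Pr(Y = y) = q, Pr(Y = 0) = 1 - q.
from fractions import Fraction

q = Fraction(1, 3)   # probability mass at y (arbitrary choice)
y = Fraction(5)      # positive support point (arbitrary choice)

expectation = q * y                # E[Y] = q*y
markov_bound = expectation / y     # E[Y]/y
tail = q                           # Pr(Y >= y)

assert tail == markov_bound        # Markov's bound is attained exactly
```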

1 Answer

Define $V$ as in the question, and assume $V<\infty$. Then as $n\to\infty$, $$ \Pr\left(a < \frac{\sqrt{n}(\bar x-M)}{\sqrt{V}} < b\right) \to \frac{1}{\sqrt{2\pi}}\int_a^b e^{-x^2/2} \, dx = \int_a^b \varphi(x)\,dx. $$ So $$ \Pr\left(|\bar x-M|>\varepsilon \right) \cong 2\int_{\varepsilon\sqrt{n/V\,{}}}^\infty \varphi(x)\,dx. $$

So now I'd see whether I can prove that the last integral is less than $\dfrac{V}{2n\varepsilon^2}$.

Later note: Apparently it's somewhat more delicate than what the last two sentences assume. Maybe I'll be back later…
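For what it's worth, the normal-tail approximation above can at least be compared numerically with the Chebyshev bound. Since $2\int_{\varepsilon\sqrt{n/V}}^\infty \varphi(x)\,dx = \operatorname{erfc}\!\bigl(\varepsilon\sqrt{n/(2V)}\bigr)$, a rough illustration (my own sketch; it inherits the caveat about letting the cutoff depend on $n$) is:

```python
import math

def gaussian_tail(n, eps, V):
    """2 * P(Z > eps*sqrt(n/V)) for standard normal Z, computed via erfc."""
    return math.erfc(eps * math.sqrt(n / (2.0 * V)))

def chebyshev_bound(n, eps, V):
    """Chebyshev's bound V/(n*eps^2) on P(|xbar - M| > eps)."""
    return V / (n * eps ** 2)

V, eps = 1.0, 0.5
for n in (10, 50, 200):
    g, c = gaussian_tail(n, eps, V), chebyshev_bound(n, eps, V)
    # the Gaussian tail decays exponentially in n, Chebyshev only like 1/n
    assert g < c
```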

  • 0
    The deduction after **So** is definitely not kosher (and has many counterexamples) since it assumes that $a$ and $b$ are $\pm\varepsilon\sqrt{n}$ while the CLT is valid when $a$ and $b$ do not depend on $n$.2012-10-03
  • 0
    I think I was hoping "$\cong$" might get me past that, but I see your point. We need to get subtler than this.2012-10-03