
Let $x$ be a random variable such that $E(x)=0$, $E(x^2)=1$ and $P(x^2\geq s^2)\geq\displaystyle\frac{C}{s^t}$, where $C>0$, $s\geq 1$, $t>0$. Let $m,n$ be natural numbers, both very large, and let $L\geq 1$. Consider $1-\left(1-\frac{n}{2}P(x^2\geq Ln)\right)^m$.

Assume (*): $\frac{n}{2}P(x^2\geq Ln)\leq \frac{2c_0}{m}$, where $c_0>0$ is a constant;

using the inequality $(1-y)^m\leq 1-\displaystyle\frac{my}{2}$, valid for $y\in [0, c_0]$ and natural $m$, we get

$1-(1-\frac{n}{2}P(x^2\geq Ln))^m\geq\displaystyle\frac{nm}{4}P(x^2\geq Ln),$

using the assumption $P(x^2\geq s^2)\geq\displaystyle\frac{C}{s^t}$ with $s^2=Ln$, we get
$$1-\left(1-\frac{n}{2}P(x^2\geq Ln)\right)^m\geq \displaystyle\frac{nm}{4}P(x^2\geq Ln)\geq \frac{mnC'}{(Ln)^{\frac{t}{2}}},$$
where $C'=C/4$.
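A minimal numerical sanity check of the first step (illustrative only; the grid below assumes $c_0\leq\tfrac12$, so that (*) keeps $m\cdot\frac{n}{2}P(x^2\geq Ln)\leq 1$):

```python
import numpy as np

# Sanity check (illustrative) of the elementary bound 1 - (1-y)^m >= m*y/2
# in the regime m*y <= 1, which assumption (*) provides when c_0 <= 1/2.
rng = np.random.default_rng(0)
for m in [10, 100, 1000, 100000]:
    y = rng.uniform(0.0, 1.0 / m, size=10000)   # sampled points with m*y <= 1
    lhs = 1.0 - (1.0 - y) ** m
    rhs = m * y / 2.0
    assert np.all(lhs >= rhs)
print("1 - (1-y)^m >= m*y/2 held at every sampled point with m*y <= 1")
```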

How can one show that if $t\geq 4$, then (*) holds?

  • @David: You need to enclose your math within $$. At the moment it is unreadable. (2011-12-01)

2 Answers


Let us first recall Markov's inequality: let $Z$ denote a nonnegative integrable random variable. Then, for every positive $z$, $ \mathrm P(Z\geqslant z)\leqslant z^{-1}\mathrm E(Z). $ Application: consider $Z=X^2$ for a square integrable random variable $X$ such that $\mathrm E(X^2)=1$, and $z=n$ for any positive integer $n$. Then $\mathrm P(X^2\geqslant n)\leqslant n^{-1}$.
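As a concrete illustration (not part of the argument above), here is a small Monte Carlo sketch using a standard normal $X$, an arbitrary example with $\mathrm E(X^2)=1$; the empirical tail stays below the Markov bound $1/n$:

```python
import numpy as np

# Monte Carlo illustration of Markov's inequality P(X^2 >= n) <= E(X^2)/n = 1/n,
# using a standard normal X as an arbitrary example with E(X^2) = 1.
rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)
for n in [1, 2, 5, 10, 20]:
    empirical = np.mean(x ** 2 >= n)
    print(f"n = {n:2d}:  P(X^2 >= n) ~= {empirical:.4f}   bound 1/n = {1 / n:.4f}")
```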

Note that this proof uses neither the hypothesis that $\mathrm E(X)=0$ nor any lower bound on the tail of the distribution of $X^2$.


Now, what you seem to be asking for is a proof that $\mathrm P(X^2\geqslant n)\leqslant\Theta(n^{-2})$ under the additional hypotheses that $\mathrm E(X)=0$ and that $\mathrm P(X^2\geqslant s)\geqslant Cs^{-t/2}$ for every $s\geqslant1$, for some $t\geqslant4$.

But there is no hope such a result could hold, is there? Consider a symmetric random variable such that $\mathrm P(X^2\geqslant s)=\Theta(s^{-u})$ when $s\to\infty$, for some positive $u$. Then your hypothesis may hold as soon as $u\leqslant t/2$ (for the tail estimate) and $u\gt1$ (for the square integrability) and your conclusion asks that $u\geqslant2$. These simply do not fit.
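To make the obstruction concrete, here is a small sketch for one hypothetical choice: take $X$ symmetric with $\mathrm P(X^2\geqslant s)=s^{-u}$ for $s\geqslant1$ and $u=3/2$ (so $X^2$ is Pareto with index $u$). Then $\mathrm E(X^2)=u/(u-1)$ is finite (rescaling makes it $1$ without changing the order of the tail), the tail lower bound holds whenever $u\leqslant t/2$, in particular for $t\geqslant4$, and yet $n^2\,\mathrm P(X^2\geqslant n)=n^{2-u}\to\infty$:

```python
# One hypothetical counterexample: P(X^2 >= s) = s^{-u} for s >= 1, with 1 < u < 2.
# E(X^2) = u/(u-1) is finite, but n^2 * P(X^2 >= n) = n^(2-u) blows up,
# so a bound of the form P(X^2 >= n) = O(n^{-2}) cannot hold.
u = 1.5
print("E(X^2) =", u / (u - 1))
for n in [10, 100, 10_000, 10**6]:
    tail = n ** (-u)                     # exact tail of X^2 for this example
    print(f"n = {n:>7}:  n^2 * P(X^2 >= n) = {n ** 2 * tail:.1f}")
```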

  • I have no idea what you are talking about. Is the part of my post which reads *But there is no hope such a result could hold* not clear enough? (2011-12-09)

Probably it is better to reformulate the question I posted. Please tell me whether my solution is correct and where I have mistakes. Thank you for your advice.

Let $x$ be a random variable with mean zero and variance $1$, and such that for some $C>0$, $t>0$ and every natural $n$, $P(x^2\ge n)\geq \frac{C}{n^t}$. We want to show that if $t\ge 2$, then $P(x^2\ge n)\leq \frac{2}{n^2}$.

First, we show that for any $n\geq \widetilde{n}$,
\begin{align}
P(x^2\geq n)\leq \frac{2}{n^2}.
\end{align}
Suppose to the contrary that
\begin{align}
P(x^2 \geq n_i) > \frac{2}{n_i^2}
\end{align}
for some infinite sequence $n_1$, $n_2$, $\ldots$, $n_i$, $\ldots$, chosen such that
\begin{align}
P(x^2 > n_i^2) > 2P(x^2 > n_{i+1}^2).
\end{align}
By assumption,
$$ 1=E x^2 = \int_{0}^\infty x^2 \, d\mu, $$
where $\mu$ is a probability measure. We can break this integral up into the sum
$$ E x^2 = \sum_{i=0}^\infty \int_{n_i}^{n_{i+1}} x^2 \, d\mu. $$
Consider now
\begin{align}
\int_{n_i}^{n_{i+1}} x^2 \, d\mu \geq n_i^2 \int_{n_i}^{n_{i+1}} d\mu
&= n_i^2\bigl(P(x^2\geq n_i^2)-P(x^2 \geq n_{i+1}^2)\bigr) \\
&> \tfrac{1}{2}\, n_i^2\, P(x^2\geq n_i^2) \\
&> 1.
\end{align}
This contradicts $E x^2=1$.

Thus, for $n\geq \widetilde{n}$,
\begin{align}
P(x^2\geq n)\leq \frac {2}{n^2}.
\end{align}
Combining this with the assumed lower bound,
\begin{align}
\frac{C}{n^t}\leq P(x^2\geq n)\leq \frac{2}{n^2},
\end{align}
which is possible only when $\displaystyle{n\geq \left(\frac{C}{2}\right)^{\frac{1}{t-2}}=\widetilde{\widetilde{n}}}$ (for $t>2$). Thus, for any $n\geq \max\{\widetilde{n},\widetilde{\widetilde{n}}\}$, $\frac{C}{n^t}\leq P(x^2\geq n)\leq \frac{2}{n^2}$.
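A quick numeric check of this threshold, with illustrative values of $C$ and $t$ (not taken from the problem):

```python
# Check of the threshold n >= (C/2)^(1/(t-2)) beyond which C/n^t <= 2/n^2.
# C and t below are illustrative values only.
C, t = 8.0, 4.0
n_threshold = (C / 2) ** (1 / (t - 2))      # equals 2 for these values
for n in [1, 2, 3, 10]:
    print(n, n >= n_threshold, C / n ** t <= 2 / n ** 2)
```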

If $t\geq 2$, then for any $n \in \mathbb{N}$ and $C=2$, $P(x^2\geq n)=\frac{2}{n^2}$. Thus, $\frac{C}{n^t}\leq P(x^2\geq n)\leq \frac{2}{n^2}$ for any natural $n$.