6

I need to lower-bound the tail probability of a non-negative random variable, and I have a lower bound on its expected value. I am aware of a reverse Markov inequality that does the job when the random variable is bounded above. Unfortunately, that is not my case.

Is there any other inequality that may be useful to me in this regard?

Thanks,

NR

  • Maybe you can give the actual distribution you're interested in. (2011-11-19)
  • For any given finite expectation, no matter how large it is, it is possible to construct a bounded positive r.v. with that expectation, i.e. there is no general lower bound on the tail probability. The fact that your r.v. is unbounded is easily accounted for in the counterexample: take a bounded r.v. and add to it a positive unbounded r.v. whose tail probability is as small as we want. (2011-11-19)
  • @cardinal: I am trying to show that a set of $k$ noncentral chi-squared random variables are all bigger than a threshold $\tau$ with high probability. I have managed to lower-bound the expected value of the minimum of the variables, so something like the reverse Markov inequality would come in handy, if only the variables were bounded. (2011-11-20)
  • So, you have $X_1, \ldots, X_k$, each distributed as a noncentral chi-squared random variable (independent? identically distributed?), and you want $\mathbb P(\cap_{i=1}^k \{X_i > \tau\}) > 1 - \delta$ for small $\delta > 0$? (2011-11-20)
  • @gortaur: please refer to my comment above to cardinal. As far as bounding the variables and coming up with a counterexample goes, I'm not sure I understand (I am not a pure mathematician). If I'm understanding correctly, I just want to clarify that I do not want a counterexample; I want an inequality saying that the tail probability of a non-negative random variable (the minimum over noncentral chi-squares, in my case) is lower-bounded. I could directly use the CDF, but those are in terms of Marcum Q-functions, and deriving simple bounds for them seemed beyond my capacity. (2011-11-20)
  • @cardinal: they are not i.i.d., but the rest of your statement holds. I tried the union bound over the complement of this event, but it became very messy thanks to (perhaps my inability to handle) Marcum Q-functions. (2011-11-20)
  • Yes, thanks! Appreciate it. (2015-08-15)

1 Answer

7

You want a lower bound on the probability of $[X\gt x]$, hence an upper bound on the probability of the event $A=[X\leqslant x]$. As explained by others, there is little hope of achieving such a bound that depends on $\mathrm E(X)$ only and is valid for every nonnegative random variable $X$.

However, for every nonnegative decreasing function $u$, one has $A\subseteq[u(X)\geqslant u(x)]$, hence Markov's inequality yields $$ \mathrm P(A)\leqslant u(x)^{-1}\mathrm E(u(X)). $$ Two frequently used cases are $$u(x)=\mathrm e^{-tx}$$ and $$u(x)=\frac1{1+tx}$$ for some positive $t$, related to the Laplace and Stieltjes transforms, respectively. In both cases, one can choose the value of the parameter $t$ which yields an optimal, or nearly optimal, upper bound.
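A quick numerical sketch (illustrative only, not part of the answer) of the Laplace-transform case, on a noncentral chi-squared variable as in the asker's setting; the parameters `df`, `nonc`, `x`, and `t` below are arbitrary choices:

```python
import numpy as np

# Check Markov's inequality P(X <= x) <= u(x)^{-1} E[u(X)] with the
# nonnegative decreasing function u(y) = exp(-t y), via Monte Carlo
# on a noncentral chi-squared sample.
rng = np.random.default_rng(0)
df, nonc = 3.0, 5.0      # degrees of freedom and noncentrality (assumed values)
x, t = 2.0, 0.5          # threshold and transform parameter (assumed values)

X = rng.noncentral_chisquare(df, nonc, size=200_000)

lhs = np.mean(X <= x)                            # Monte Carlo estimate of P(X <= x)
rhs = np.exp(t * x) * np.mean(np.exp(-t * X))    # u(x)^{-1} E[u(X)]

print(f"P(X <= x) ~ {lhs:.4f}, upper bound ~ {rhs:.4f}")
assert lhs <= rhs  # Markov's inequality holds on this sample
```

In practice one would then minimize the right-hand side over $t>0$ to tighten the bound.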

This yields $$ \mathrm P(X\gt x)\geqslant 1-u(x)^{-1}\mathrm E(u(X)). $$ A simple consequence is the fact that, for every positive $s$ (and for $s=0$ as well, provided $1/X$ is integrable), $$ \mathrm P(X\gt x)\geqslant \mathrm E\left(\frac{X-x}{s+X}\right). $$
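To see where the last inequality comes from, one can spell out the Stieltjes case: taking $u(y)=\frac1{1+ty}$ with $t=1/s$ in the bound above gives $$ \mathrm P(X\gt x)\geqslant 1-(1+tx)\,\mathrm E\left(\frac1{1+tX}\right)=\mathrm E\left(\frac{(1+tX)-(1+tx)}{1+tX}\right)=\mathrm E\left(\frac{X-x}{s+X}\right), $$ where the last step divides the numerator and the denominator by $t$. Letting $t\to\infty$ recovers the case $s=0$, which requires $1/X$ to be integrable.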