Let $X_{1},X_{2},\ldots$ be i.i.d. random variables and let $S_{n}=X_{1}+ \cdots +X_{n}$. Given that $1\leq a\leq a'$ are integers and $0<\sigma<\lambda$, how do I show that if $\sup_{1\leq b \leq a'-a}P(|S_{b}|\geq \sigma)\leq \frac{1}{2}$, then $P\big( \sup_{a\leq b\leq a'} S_{b}\geq \lambda\big)\leq 2P(S_{a'}\geq \lambda - \sigma)$?
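For context, here is a standard Ottaviani-type first-passage argument that proves an inequality of this form (a sketch, not taken from the source; the stopping index $T$ is introduced for the argument):

```latex
Let $T=\min\{b:\; a\le b\le a',\; S_b\ge\lambda\}$, with $T=\infty$ if no such $b$ exists,
so that $\{\sup_{a\le b\le a'} S_b\ge\lambda\}=\{T\le a'\}$ and the events $\{T=b\}$ are disjoint.
On $\{T=b\}\cap\{|S_{a'}-S_b|<\sigma\}$ we have $S_{a'}>S_b-\sigma\ge\lambda-\sigma$.
Since $\{T=b\}$ depends only on $X_1,\dots,X_b$ while $S_{a'}-S_b$ depends only on
$X_{b+1},\dots,X_{a'}$, independence gives
\[
P(S_{a'}\ge\lambda-\sigma)\;\ge\;\sum_{b=a}^{a'} P(T=b)\,P\big(|S_{a'}-S_b|<\sigma\big).
\]
By the i.i.d.\ assumption $S_{a'}-S_b$ has the same law as $S_{a'-b}$, and $0\le a'-b\le a'-a$,
so the hypothesis $\sup_{1\le b\le a'-a}P(|S_b|\ge\sigma)\le\frac12$ yields
$P(|S_{a'}-S_b|<\sigma)\ge\frac12$ for every $b$ in the sum (the case $b=a'$ is trivial,
since $S_{a'}-S_{a'}=0<\sigma$). Hence
\[
P(S_{a'}\ge\lambda-\sigma)\;\ge\;\tfrac12\,P(T\le a')
  \;=\;\tfrac12\,P\Big(\sup_{a\le b\le a'}S_b\ge\lambda\Big),
\]
and multiplying by $2$ gives the claimed bound.
```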
Probability inequalities
probability-theory
- Please, do not make it look as if you are giving us homework. Show us what you already did and where you got stuck. Also, people will stop answering your questions if you do not accept a single one of them. – 2011-11-20
- Just what Dimitri said. Strongly suggested reading: [How-to-ask-a-homework-question](http://meta.math.stackexchange.com/questions/1803/how-to-ask-a-homework-question). – 2011-11-20