
I am trying to show that whenever we have a sequence of independent random variables that are symmetrically distributed about the origin and have a finite fourth moment, the probability in Kolmogorov's inequality is bounded above by the fourth moment of the sum divided by $t^4$.

Originally, the bound is the second moment (variance) of the sum divided by $t^2$.

My approach has been to use a binomial expansion and bound the resulting terms, but I have not been able to make it work. Please give me some advice.

OK, I think I have made progress on this now and I have a solution. Please give me your comments in the answers. Here it goes:

First of all, write $x_1+x_2+\cdots+x_n=(x_1+\cdots+x_k)+(x_{k+1}+\cdots+x_n)$ and expand $(x_1+x_2+\cdots+x_n)^4$ binomially in these two blocks; this is how I evaluate $\mathrm E(x_1+\cdots+x_n)^4$. The general idea is that the odd moments cancel, so I have to show that the cross terms are either zero or nonnegative. This gives $\mathrm E(x_1+\cdots+x_n)^4\geq\mathrm E(x_1+\cdots+x_k)^4$. Then Jensen's inequality with the convex function $x\mapsto x^2$ on the nonnegative reals finishes the job, since we already know Kolmogorov's inequality is true. Hence, we have refined it.
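To make the step explicit (this is only a sketch of the expansion I have in mind, writing $A=x_1+\cdots+x_k$ and $B=x_{k+1}+\cdots+x_n$, which are independent):
$$\mathrm E\big[(A+B)^4\big]=\mathrm E[A^4]+4\,\mathrm E[A^3]\,\mathrm E[B]+6\,\mathrm E[A^2]\,\mathrm E[B^2]+4\,\mathrm E[A]\,\mathrm E[B^3]+\mathrm E[B^4],$$
and since $B$ is again symmetric about the origin, $\mathrm E[B]=\mathrm E[B^3]=0$, so the remaining terms $6\,\mathrm E[A^2]\,\mathrm E[B^2]$ and $\mathrm E[B^4]$ are nonnegative and $\mathrm E\big[(A+B)^4\big]\geqslant\mathrm E[A^4]$.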

  • @erik, **Please** edit your question to remove the unnecessary verbiage: it adds *nothing* to the question, and much less to its title. – 2012-05-11

1 Answer


The idea of the classical proof goes through. For every $n\geqslant1$, let $S_n=X_1+\cdots+X_n$. Fix $t\gt0$ and let $T=\inf\{n\geqslant1\mid |S_n|\geqslant t\}$. Then
$$\Big[\max_{1\leqslant k\leqslant n}|S_k|\geqslant t\Big]=[T\leqslant n].$$
On each event $[T=k]$, $S_k^4\geqslant t^4$, hence
$$t^4\mathrm P(T\leqslant n)=\sum_{k=1}^nt^4\mathrm P(T=k)\leqslant\sum_{k=1}^n\mathrm E(S_k^4;T=k).$$
The task now is to compare $\mathrm E(S_k^4;T=k)$ and $\mathrm E(S_n^4;T=k)$ for every $1\leqslant k\leqslant n$.

Note that $S_n=S_k+R_k$, where $R_k=S_n-S_k$ is independent of $\mathcal F_k=\sigma(X_i,1\leqslant i\leqslant k)$, while $S_k$ is $\mathcal F_k$-measurable and $[T=k]$ is in $\mathcal F_k$.

Expanding $S_n^4=(S_k+R_k)^4$ and using the independence of $R_k$ from $\mathcal F_k$, one sees that $\mathrm E(S_n^4;T=k)$ is a linear combination of the products $\mathrm E(R_k^i)\,\mathrm E(S_k^{4-i};T=k)$ for $0\leqslant i\leqslant 4$. Furthermore, the distribution of $R_k$ is symmetric, hence $\mathrm E(R_k)=\mathrm E(R_k^3)=0$, and one is left with
$$\mathrm E(S_n^4;T=k)=\mathrm E(S_k^4;T=k)+6\,\mathrm E(R_k^2)\,\mathrm E(S_k^2;T=k)+\mathrm E(R_k^4)\,\mathrm P(T=k)\geqslant\mathrm E(S_k^4;T=k).$$
Summing these yields
$$t^4\mathrm P(T\leqslant n)\leqslant\sum_{k=1}^n\mathrm E(S_n^4;T=k)=\mathrm E(S_n^4;T\leqslant n)\leqslant\mathrm E(S_n^4),$$
hence the result holds, that is,
$$\color{red}{\mathrm P(T\leqslant n)\leqslant\mathrm E(S_n^4)/t^4}.$$
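As a quick sanity check (not part of the proof), one can verify the red bound numerically, say for i.i.d. uniform$(-1,1)$ increments; the distribution, the horizon $n$, the threshold $t$ and the number of simulated paths below are arbitrary choices of mine, for illustration only.

```python
# Monte Carlo sanity check (not part of the proof) of the bound
#     P( max_{1<=k<=n} |S_k| >= t )  <=  E(S_n^4) / t^4
# for partial sums of i.i.d. variables symmetric about the origin.
import numpy as np

rng = np.random.default_rng(0)

n, t, num_paths = 50, 12.0, 100_000

# Each row is one path of increments; uniform(-1,1) is symmetric about 0
# and has all moments finite.
X = rng.uniform(-1.0, 1.0, size=(num_paths, n))
S = np.cumsum(X, axis=1)                    # partial sums S_1, ..., S_n

lhs = np.mean(np.abs(S).max(axis=1) >= t)   # empirical P(max_k |S_k| >= t)
rhs = np.mean(S[:, -1] ** 4) / t ** 4       # empirical E(S_n^4) / t^4

print(f"P(max_k |S_k| >= t) ~= {lhs:.5f}")
print(f"E(S_n^4) / t^4      ~= {rhs:.5f}")
print("bound holds:", lhs <= rhs)
```

With these parameters the empirical probability should come out well below $\mathrm E(S_n^4)/t^4$, as the inequality predicts.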

One should probably add that the algebraic manipulations above, while sufficient to yield the result, are not necessary. Let us now explain why.

Replace $x\mapsto x^4$ by any nonnegative convex function $u$ that is even and nondecreasing on $[0,+\infty)$, replace the centered and symmetric random walk $(S_n)_n$ by any martingale, and go directly to the step where one wants to compare $\mathrm E(u(S_k);T=k)$ and $\mathrm E(u(S_n);T=k)$. Jensen's conditional inequality yields
$$\mathrm E(u(S_n)\mid\mathcal F_k)\geqslant u(\mathrm E(S_n\mid\mathcal F_k))=u(S_k),$$
and $[T=k]$ is in $\mathcal F_k$, hence
$$\mathrm E(u(S_n);T=k)\geqslant\mathrm E(u(S_k);T=k).$$
On $[T=k]$ one has $|S_k|\geqslant t$, hence $u(S_k)=u(|S_k|)\geqslant u(t)$, and summing these yields
$$\color{red}{u(t)\mathrm P(T\leqslant n)\leqslant\mathrm E(u(S_n);T\leqslant n)\leqslant\mathrm E(u(S_n))}.$$
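For instance (these specializations are mine, not part of the argument above), taking $u(x)=x^2$ and $u(x)=x^4$, both nonnegative, even, nondecreasing on $[0,+\infty)$ and convex, recovers the classical Kolmogorov inequality and the fourth-moment bound of the first part:
$$t^2\,\mathrm P\Big(\max_{1\leqslant k\leqslant n}|S_k|\geqslant t\Big)\leqslant\mathrm E(S_n^2),\qquad t^4\,\mathrm P\Big(\max_{1\leqslant k\leqslant n}|S_k|\geqslant t\Big)\leqslant\mathrm E(S_n^4).$$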