
Kolmogorov's maximal inequality states the following: let $X_1,\dots,X_n$ be mutually independent random variables, each with finite variance, and set $S_j=X_1+\cdots+X_j$ for $1 \le j\le n$. Then, for each $\epsilon>0$, $\Pr(\max_{1 \le j \le n}|S_j-\mathbb{E}(S_j)| \ge \epsilon) \le \frac{\operatorname{Var}(S_n)}{\epsilon^2}$.
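For intuition, here is a minimal simulation sketch (my own, purely illustrative; the name `kolmogorov_check` and the chosen parameters are not from any reference) comparing the empirical left-hand side with the bound $\operatorname{Var}(S_n)/\epsilon^2$ for i.i.d. centered uniform summands:

```python
import numpy as np

rng = np.random.default_rng(0)

def kolmogorov_check(n=50, eps=5.0, trials=20000):
    # i.i.d. Uniform(-1, 1) summands: E X = 0, Var X = 1/3
    X = rng.uniform(-1.0, 1.0, size=(trials, n))
    S = np.cumsum(X, axis=1)                         # partial sums S_1, ..., S_n per trial
    lhs = np.mean(np.max(np.abs(S), axis=1) >= eps)  # empirical Pr(max_j |S_j| >= eps)
    bound = (n / 3.0) / eps**2                       # Var(S_n) / eps^2
    return lhs, bound

print(kolmogorov_check())  # the empirical probability should not exceed the bound
```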

I consider the following question: given a number $n$, take a random (strong) composition of $n$ into $k$ positive parts. This gives $k$ random variables $Y_1, Y_2,\dots, Y_k$ with $Y_1+Y_2+\cdots+Y_k=n$.
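One way to sample such a composition (assuming it is meant to be uniform over all $\binom{n-1}{k-1}$ strong compositions; the helper name `random_composition` is mine, for illustration) is the stars-and-bars construction:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_composition(n, k):
    """Sample a strong composition of n into k positive parts, uniformly at random:
    choose k-1 distinct cut points in {1, ..., n-1} and take the gaps between them."""
    cuts = np.sort(rng.choice(np.arange(1, n), size=k - 1, replace=False))
    return np.diff(np.concatenate(([0], cuts, [n])))

print(random_composition(10, 4))  # e.g. [3 1 4 2]; the parts always sum to 10
```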

Clearly, in my case the random variables are dependent. How can one apply Kolmogorov's inequality, or some related inequality, to these random variables?

That is, let $S_j'=Y_1+\cdots+Y_j$; give a bound on $\Pr(\max_{1 \le j \le k} |S_j'-\mathbb{E}(S_j')| \ge \epsilon)$ for $\epsilon > 0$.
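For concreteness, here is an illustrative sketch (my own; it assumes the uniform sampler above and uses $\mathbb{E}(Y_j)=n/k$, which holds when the composition is uniform, so that $\mathbb{E}(S_j')=jn/k$) estimating this tail probability by simulation:

```python
import numpy as np

rng = np.random.default_rng(2)

def random_composition(n, k):
    # uniform strong composition of n into k positive parts (stars and bars)
    cuts = np.sort(rng.choice(np.arange(1, n), size=k - 1, replace=False))
    return np.diff(np.concatenate(([0], cuts, [n])))

def tail_estimate(n=1000, k=20, eps=100.0, trials=5000):
    """Estimate Pr(max_j |S_j' - E(S_j')| >= eps), using E(S_j') = j*n/k."""
    expected = np.arange(1, k + 1) * (n / k)  # E(S_j') for j = 1, ..., k
    exceed = 0
    for _ in range(trials):
        S = np.cumsum(random_composition(n, k))  # partial sums S_1', ..., S_k'
        if np.max(np.abs(S - expected)) >= eps:
            exceed += 1
    return exceed / trials

print(tail_estimate())
```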

2 Answers


From this MSE answer you know that $(Y_j-\mathbb{E}(Y_j))_{j=1}^k$ forms an exchangeable sequence of random variables. The desired inequality follows, for instance, from Theorem 1 of the reference below.

Reference: "A Maximal Inequality for Partial Sums of Finite Exchangeable Sequences of Random Variables" by Alexander R. Pruss. Proceedings of the American Mathematical Society Volume 126, Number 6, June 1998, Pages 1811-1819.
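For completeness, a sketch of why exchangeability holds, assuming the composition is chosen uniformly at random from all strong compositions of $n$ into $k$ parts: for any permutation $\sigma$ of $\{1,\dots,k\}$, the map $(y_1,\dots,y_k)\mapsto(y_{\sigma(1)},\dots,y_{\sigma(k)})$ is a bijection of this finite set onto itself, so $(Y_{\sigma(1)},\dots,Y_{\sigma(k)})$ has the same (uniform) law as $(Y_1,\dots,Y_k)$. In particular $\mathbb{E}(Y_j)=n/k$ for every $j$, and the centered sequence $(Y_j-\mathbb{E}(Y_j))_{j=1}^k$ is exchangeable as well.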

  • Also, there may be a better solution in your particular case. I didn't mean for my answer to close off discussion of the problem. (2011-12-30)

You can use Pruss's paper as follows. (Here $n$ denotes the number of parts, which the question calls $k$.) Let $S_k = Y_1+\cdots+Y_k$ and $T_k = Y_{n/2+1}+\cdots+Y_{n/2+k}$. (If $n$ is odd, round $n/2$ in either direction.) First prove an inequality like

$\text{Pr}\left(\max_{1\le k \le n} |S_k| > t\right) \leq \text{Pr}\left(\max_{1\le k \le n/2} |S_k| > t/2\right) + \text{Pr}\left(\max_{1\le k \le n/2} |T_k| > t/2\right).$
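In case it is not obvious, this splitting step is just the triangle inequality plus a union bound: for $n/2 < k \le n$ we have $S_k = S_{n/2} + T_{k-n/2}$, so $|S_k| \le \max_{1\le j\le n/2}|S_j| + \max_{1\le j\le n/2}|T_j|$; hence if $\max_{1\le k\le n}|S_k| > t$, then at least one of the two maxima on the right exceeds $t/2$.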

Then apply Pruss's result to each half, and use Kolmogorov's argument (replacing each $Y_j$ by $Y_j-\mathbb{E}(Y_j)$, without loss of generality) on each half to obtain an upper bound of some universal constant times $\text{Var}(S_{n/2})/t^2$.