
The following is from Kai Lai Chung's *A Course in Probability Theory*, page 207.

$X_{nj}$, $j=1,\dots,k_n$, $n=1,2,\dots$, are random variables, where $k_n \to \infty$ as $n\to \infty$.

For every $\epsilon > 0$:

(a) for each $j$, $\lim_n P(|X_{nj}|>\epsilon) = 0$;

(b) $\lim_n \max_j P(|X_{nj}| > \epsilon) = 0$; (holospoudic)

(c) $\lim_n P(\max_j |X_{nj}| > \epsilon) = 0$;

(d) $\lim_n \sum_j P(|X_{nj}| > \epsilon) = 0$;

It is clear that (d) $\Rightarrow$ (c) $\Rightarrow$ (b) $\Rightarrow$ (a).

I can understand (d) $\Rightarrow$ (c), because by the union bound $\sum_j P(|X_{nj}| > \epsilon) \geq P(\max_j |X_{nj}| > \epsilon)$.

I can understand (b) $\Rightarrow$ (a), because for each fixed $j$, $\max_{j'} P(|X_{nj'}| > \epsilon) \geq P(|X_{nj}|>\epsilon)$.

Questions:

  1. Why does (c) $\Rightarrow$ (b) hold? I also do not see why $P(\max_j |X_{nj}| > \epsilon) \geq \max_j P(|X_{nj}| > \epsilon)$, if that inequality is indeed true.
  2. Also, why is (d) $\equiv$ (c) when $X_{nj}$, $j=1,\dots,k_n$, are independent for each $n$? This is Exercise 1 on page 214.

Thanks and regards!

1 Answer


Fix $\varepsilon$, and consider some random variables $(Y_k)_k$ and $Y=\sup\limits_kY_k$. For every $i$, $Y\geqslant Y_i$ hence $[Y_i\gt\varepsilon]\subseteq[Y\gt\varepsilon]$. This implies that $\mathrm P(Y_i\gt\varepsilon)\leqslant\mathrm P(Y\gt\varepsilon)$. This inequality holds for every $i$ and the RHS does not depend on $i$, hence $\sup\limits_i\mathrm P(Y_i\gt\varepsilon)\leqslant\mathrm P(Y\gt\varepsilon)=\mathrm P(\sup\limits_iY_i\gt\varepsilon)$. Applying this to the finite family $Y_j=|X_{nj}|$, $1\leqslant j\leqslant k_n$ (where the supremum is a maximum), yields $\max\limits_j\mathrm P(|X_{nj}|\gt\varepsilon)\leqslant\mathrm P(\max\limits_j|X_{nj}|\gt\varepsilon)$, hence (c) implies (b).
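
Just to make this concrete, here is a small Monte Carlo sketch in Python; the triangular array of $\mathrm{Exp}(1)/n$ variables is an invented example (not one from Chung), used only to illustrate the inequality for one fixed $n$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical triangular array (invented for illustration): for fixed n,
# the row X_{n1}, ..., X_{n k_n} consists of i.i.d. Exp(1)/n variables.
n, k_n, trials, eps = 8, 50, 200_000, 0.5
X = rng.exponential(size=(trials, k_n)) / n

p_each = (np.abs(X) > eps).mean(axis=0)        # estimates P(|X_{nj}| > eps), one per j
p_max  = (np.abs(X).max(axis=1) > eps).mean()  # estimates P(max_j |X_{nj}| > eps)

# max_j P(|X_{nj}| > eps) <= P(max_j |X_{nj}| > eps), as derived above
print(p_each.max(), "<=", p_max)
```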

Edit: When, furthermore, the random variables $(Y_k)_k$ are assumed independent, the estimation of the distribution of $Y$ proceeds as usual. First, for every $\varepsilon$, $[Y\leqslant\varepsilon]=\bigcap\limits_k[Y_k\leqslant\varepsilon]$, hence $ \mathrm P(Y\gt\varepsilon)=1-\prod\limits_k(1-\mathrm P(Y_k\gt\varepsilon)). $ Now, $1-x\leqslant\mathrm e^{-x}$ for every $x$, hence $ \mathrm P(Y\gt\varepsilon)\geqslant1-\exp\left(-\sum\limits_k\mathrm P(Y_k\gt\varepsilon)\right), $ which shows that (c) implies (d): if the left-hand side tends to $0$, the exponent $\sum\limits_k\mathrm P(Y_k\gt\varepsilon)$ must tend to $0$ as well. On the other hand, (d) implies (c) because $[Y\gt\varepsilon]=\bigcup\limits_k[Y_k\gt\varepsilon]$ and the probability of the union is at most the sum of the probabilities.
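
To see both bounds at work numerically, here is a rough sketch with the same invented array as above (independent columns, so the product identity and the union bound apply exactly): it sandwiches a Monte Carlo estimate of the quantity in (c) between $1-\exp\left(-\sum_j p_{nj}\right)$ and the quantity $\sum_j p_{nj}$ in (d), so the two visibly tend to $0$ together.

```python
import numpy as np

rng = np.random.default_rng(1)

# Same invented array as before; the columns are independent by construction.
n, k_n, trials, eps = 8, 50, 200_000, 0.5
X = rng.exponential(size=(trials, k_n)) / n

p = (X > eps).mean(axis=0)            # p_j ~ P(X_{nj} > eps)
p_sum = p.sum()                       # the quantity in (d)
p_max = (X.max(axis=1) > eps).mean()  # the quantity in (c)

lower = 1 - np.exp(-p_sum)  # from 1 - x <= exp(-x) applied to the product
# lower <= p_max <= p_sum should hold, up to Monte Carlo error
print(lower, "<=", p_max, "<=", p_sum)
```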
