The following is from Kai Lai Chung's *A Course in Probability Theory*, page 207.
$X_{nj},\ j=1,\dots,k_n,\ n=1,2,\dots$ are random variables, where $k_n \to \infty$ as $n\to \infty$.
For every $\epsilon > 0$:
(a) for each $j$, $\lim_n P(|X_{nj}|>\epsilon) = 0$;
(b) $\lim_n \max_j P(|X_{nj}| > \epsilon) = 0$; (holospoudic)
(c) $\lim_n P(\max_j |X_{nj}| > \epsilon) = 0$;
(d) $\lim_n \sum_j P(|X_{nj}| > \epsilon) = 0$.
It is clear that (d) => (c) => (b) => (a).
I can understand (d) => (c), because $\{\max_j |X_{nj}| > \epsilon\} = \bigcup_j \{|X_{nj}| > \epsilon\}$, so by subadditivity $\sum_j P(|X_{nj}| > \epsilon) \geq P(\max_j |X_{nj}| > \epsilon).$
I can understand (b) => (a), because $\max_j P(|X_{nj}| > \epsilon) \geq P(|X_{nj}|>\epsilon)$ for every $j$.
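The two inequalities above can be checked numerically. Below is a small Monte Carlo sanity check; the choice of distribution (i.i.d. scaled normals) and all parameter values are my own illustrative assumptions, not anything from Chung's text.

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.5        # the epsilon in the conditions (arbitrary choice)
n_samples = 100_000
k = 5            # plays the role of k_n for one fixed n

# Illustrative assumption: X_{n1}, ..., X_{nk} i.i.d. N(0, 0.4^2).
X = rng.normal(scale=0.4, size=(n_samples, k))

exceed = np.abs(X) > eps                       # indicators of {|X_{nj}| > eps}
p_each = exceed.mean(axis=0)                   # Monte Carlo P(|X_{nj}| > eps), j = 1..k
p_max = (np.abs(X).max(axis=1) > eps).mean()   # Monte Carlo P(max_j |X_{nj}| > eps)

# Subadditivity: sum_j P(|X_{nj}| > eps) >= P(max_j |X_{nj}| > eps)  -- behind (d) => (c)
assert p_each.sum() >= p_max

# Trivially, the max over j dominates each term  -- behind (b) => (a)
assert all(p_each.max() >= p for p in p_each)

print(p_each.sum(), p_max, p_each.max())
```

Of course a simulation proves nothing; it only illustrates the event inclusion $\{|X_{nj}| > \epsilon\} \subseteq \{\max_j |X_{nj}| > \epsilon\}$ behind the estimates.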
Questions:
- Why does (c) => (b) hold? I also don't see why $P(\max_j |X_{nj}| > \epsilon) \geq \max_j P(|X_{nj}| > \epsilon)$, if that inequality is even true.
- Also, why is (d) $\equiv$ (c) when $X_{nj},\ j=1,\dots,k_n$, are independent for each $n$? This is Exercise 1 on page 214.
Thanks and regards!