How do I find a necessary and sufficient condition for a sequence $X_i \sim \text{Exp}(\lambda_i)$ of independent random variables to converge to zero (a) in distribution and (b) almost surely?
Thank you very much for your help in advance!
Let's look at the first part. Convergence in distribution to $0$ means that $F_{X_n}(x) \to_{n \to \infty} F_0(x)$ at every continuity point $x$ of the limit $F_0$, i.e. for every $x \neq 0$ (for $x < 0$ both sides are $0$, so only $x > 0$ matters).
Thus: $F_{X_i}(x) \to F_0(x) \iff 1-e^{-\lambda_i x} \to 1 \text{ for every } x > 0 \iff \lambda_i \to \infty$.
So the necessary and sufficient condition for convergence in distribution is $\lambda_i \to \infty$.
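For intuition only (not part of the proof), here is a minimal numerical sketch in Python; the rates $\lambda_i = i$ are an illustrative assumption. It simply evaluates $F_{X_i}(x)=1-e^{-\lambda_i x}$ at a few fixed $x>0$ and shows the values approaching $1 = F_0(x)$:

```python
import numpy as np

# Minimal sketch (illustrative only): with the assumed rates lambda_i = i,
# F_{X_i}(x) = 1 - exp(-lambda_i * x) tends to 1 for every fixed x > 0.
for x in [0.01, 0.1, 1.0]:
    for i in [1, 10, 100, 1000]:
        lam = i  # assumed rate sequence lambda_i = i -> infinity
        print(f"x = {x:4.2f}, i = {i:4d}:  F_Xi(x) = {1 - np.exp(-lam * x):.6f}")
```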
Let's look at the other part: almost sure convergence means that $\mathbb{P}(\lim_{n \to \infty} X_n = X)=1$, here with $X = 0$.
Consider the following: $\mathbb{P}(\neg(X_i\to 0))=\mathbb{P}(\bigcup_{m=1}^{\infty}\{\text{for infinitely many } i \;\;|X_i|>\frac1m\}) \leq \sum_{m=1}^{\infty}\mathbb{P}(\text{for infinitely many } i \;\;|X_i|>\frac1m)$
But: $\mathbb{P}(X_i > \frac1m) \leq \frac{\mathbb{E}(X_i)}{1/m}=\frac{m}{\lambda_i}$ by Markov's inequality, since $\mathbb{E}(X_i)=\frac{1}{\lambda_i}$.
Thus, for each fixed $m$: if $\sum_{i=1}^{\infty}\frac{1}{\lambda_i}<\infty$, then $\sum_{i=1}^{\infty}\mathbb{P}(X_i > \frac1m)\leq \sum_{i=1}^{\infty}\frac{m}{\lambda_i}=m\sum_{i=1}^{\infty}\frac{1}{\lambda_i}<\infty$, and the (first) Borel–Cantelli lemma gives $\mathbb{P}(|X_i|>\frac1m \text{ for infinitely many } i)=0$.
Plugging this into the bound above: $\mathbb{P}(\neg(X_i\to 0))\leq \sum_{m=1}^{\infty}\mathbb{P}(|X_i|>\frac1m \text{ for infinitely many } i)=0$, so $\sum_{i=1}^{\infty}\frac{1}{\lambda_i}<\infty$ is a sufficient condition for $X_i\to 0$ almost surely.
Note, however, that this argument gives only sufficiency: Markov's inequality is too weak here to prove necessity (see the answer below for the sharp condition).
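To make the sufficiency direction concrete, here is a small simulation sketch; the rates $\lambda_i = i^2$ (so that $\sum_i 1/\lambda_i < \infty$) and the truncation at $N$ terms are illustrative assumptions. It draws one realization of the independent sequence and shows the (truncated) tail supremum $\max_{n \le i \le N} X_i$ shrinking towards $0$:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10**6
i = np.arange(1, N + 1)
lam = i.astype(float) ** 2            # assumed rates lambda_i = i^2, so sum 1/lambda_i < infinity
X = rng.exponential(scale=1.0 / lam)  # one realization of the independent sequence X_1, ..., X_N

# Approximate sup_{i >= n} X_i by the running maximum over the simulated tail.
tail_sup = np.maximum.accumulate(X[::-1])[::-1]
for n in [1, 10, 100, 1_000, 10_000, 100_000]:
    print(f"n = {n:6d}:  max over n <= i <= N of X_i = {tail_sup[n - 1]:.3e}")
```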
Part 1 is answered by Dawson. For part 2, we need to use both Borel–Cantelli lemmas: since $\mathbb{P}(X_i>\epsilon)=e^{-\lambda_i\epsilon}$ exactly, the first lemma shows that if $\sum_i e^{-\lambda_i\epsilon}<\infty$ for every $\epsilon>0$, then $X_i\to 0$ almost surely; conversely, the second lemma (using independence) shows that if $\sum_i e^{-\lambda_i\epsilon}=\infty$ for some $\epsilon>0$, then $X_i>\epsilon$ infinitely often almost surely, so $X_i\not\to 0$.
This means that $\sum_i e^{-\lambda_i\epsilon}< \infty$ for every $\epsilon>0$ is both necessary and sufficient.
PS: As noted in a comment, it is better not to use Markov's inequality in such proofs: it is weak here (the exact tail $e^{-\lambda_i\epsilon}$ is far smaller than the Markov bound $\frac{1}{\lambda_i\epsilon}$), which is why that approach yields only a sufficient condition.
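To illustrate both points numerically, here is a hedged sketch with two illustrative rate sequences (these parameters are assumptions, not part of the answer). With $\lambda_i = \log i$ we have $\mathbb{P}(X_i > \epsilon) = i^{-\epsilon}$, so $\sum_i \mathbb{P}(X_i > 1)$ diverges and, by the second Borel–Cantelli lemma, $X_i > 1$ happens infinitely often almost surely, even though $\lambda_i \to \infty$ gives convergence in distribution; with $\lambda_i = i$ the series converges for every $\epsilon > 0$ and the exceedances stop:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10**6
i = np.arange(2, N + 2)          # start at i = 2 so that log(i) > 0
eps = 1.0

for name, lam in [("lambda_i = log i", np.log(i)), ("lambda_i = i", i.astype(float))]:
    X = rng.exponential(scale=1.0 / lam)   # one realization of the independent sequence
    exceed = np.flatnonzero(X > eps)       # positions where X_i > eps
    last = exceed[-1] + 2 if exceed.size else None   # convert back to the index i
    print(f"{name:16s}: number of i <= N with X_i > {eps}: {exceed.size:3d}, "
          f"last such i: {last}")
```

For $\lambda_i = \log i$ the count of exceedances keeps growing (roughly like $\log N$) as $N$ increases, while for $\lambda_i = i$ there are typically none at all.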