1

How can I find a necessary and sufficient condition for a sequence $X_i \sim \text{Exp}(\lambda_i)$ of independent random variables to converge to zero in distribution, and likewise almost surely?

Thank you very much for your help in advance!

  • 0
    benny: Surely you tried something, you scribbled some estimates on a piece of paper. Please show these.2012-04-14
  • 0
    I am afraid I have no valuable results. I would be grateful if you could help me.2012-04-14
  • 2
    @benny Are these random variables independent? If so, this is crucial information that you left out of the question.2012-04-15
  • 1
    benny: Do not be shy! Surely this question does not come out from nowhere... Surely you solved similar, simpler ones and/or were given some tools, or related results, whatever. As it is, you make it look like you are only interested in a full written solution, and not at all in **thinking** about this problem. Come on, just prove me wrong on this!2012-04-15

2 Answers

1

Let's look at the first part. Convergence in distribution to $0$ means that $F_{X_n}(x) \to_{n \to \infty} F_0(x)$ at every continuity point $x$ of the limit $F_0 = \mathbf{1}_{[0,\infty)}$, i.e. for every $x \neq 0$.

Thus, for $x > 0$: $F_{X_i}(x) \to F_0(x) = 1 \iff 1-e^{-\lambda_i x} \to 1 \iff \lambda_i \to \infty$ (for $x < 0$ both sides are $0$, and $x = 0$ is not a continuity point of $F_0$).

So $\lambda_i \to \infty$ is the necessary and sufficient condition for convergence in distribution.
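
As a quick sanity check (an illustrative sketch, not part of the proof; the specific rates and $\epsilon$ below are arbitrary choices), one can compare the exact tail $P(X_i > \epsilon) = e^{-\lambda_i \epsilon}$ with a Monte Carlo estimate and watch it vanish as $\lambda_i$ grows:

```python
import math
import random

random.seed(0)

def tail_exact(lam, eps):
    # For X ~ Exp(lam): P(X > eps) = exp(-lam * eps)
    return math.exp(-lam * eps)

def tail_empirical(lam, eps, n=100_000):
    # Monte Carlo estimate of P(X > eps); expovariate takes the rate lam
    return sum(random.expovariate(lam) > eps for _ in range(n)) / n

eps = 0.5
for lam in (1, 10, 100):
    # As lam grows, the tail vanishes, i.e. F_{X_i}(eps) -> 1 for eps > 0
    print(lam, tail_exact(lam, eps), tail_empirical(lam, eps))
```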

Now the other part: almost sure convergence means that $ \mathbb{P}(\lim_{n \to \infty} X_n=X)=1 $.

Consider the following: $\mathbb{P}(\neg(X_i\to 0))=\mathbb{P}(\bigcup_{m=1}^{\infty}\{\text{for infinitely many } i \;\;|X_i|>\frac1m\}) \leq \sum_{m=1}^{\infty}\mathbb{P}(\text{for infinitely many } i \;\;|X_i|>\frac1m)$

But by Markov's inequality: $\mathbb{P}(X_i > \frac1m) \leq \frac{\mathbb{E}(X_i)}{1/m}=\frac{m}{\lambda_i}$.

Thus, if $\sum_{i=1}^{\infty}\frac{1}{\lambda_i}<\infty$, then for each fixed $m$: $\sum_{i=1}^{\infty}\mathbb{P}(X_i > \frac1m)\leq m\sum_{i=1}^{\infty}\frac{1}{\lambda_i}<\infty$, so by the first Borel–Cantelli lemma $\mathbb{P}(\text{for infinitely many } i \;\; X_i>\frac1m)=0$.

Thus $\mathbb{P}(\neg(X_i\to 0))\leq \sum_{m=1}^{\infty}\mathbb{P}(\text{for infinitely many } i \;\; X_i>\frac1m)=0$, i.e. $\sum_{i=1}^{\infty}\frac{1}{\lambda_i}<\infty$ implies $X_i \to 0$ almost surely.

So $\sum_{i=1}^{\infty}\frac{1}{\lambda_i}<\infty$ is a sufficient condition. (As the comments below point out, it is not necessary: the Markov bound is too crude for that direction.)
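
To see numerically that $\sum_i 1/\lambda_i < \infty$ is not needed for almost sure convergence (a sketch; the choice $\lambda_i = i$ is just a hypothetical example): the harmonic sum $\sum_i 1/i$ diverges, yet $\sum_i e^{-i\epsilon}$ converges for every $\epsilon > 0$, so only finitely many $X_i$ exceed $\epsilon$ in a typical realization:

```python
import math
import random

random.seed(2)

# Illustrative (assumed) rates lambda_i = i: the harmonic sum diverges,
# but the Borel-Cantelli sum converges for every eps > 0.
N = 10_000
eps = 0.1
lams = range(1, N + 1)

harmonic = sum(1.0 / lam for lam in lams)            # ~ log N, diverges as N grows
bc_sum = sum(math.exp(-lam * eps) for lam in lams)   # bounded geometric-type sum

# Simulate one realization of the independent sequence: only a handful of
# exceedances occur, consistent with X_i -> 0 a.s. despite sum 1/lambda_i = inf.
exceedances = sum(random.expovariate(lam) > eps for lam in lams)
print(harmonic, bc_sum, exceedances)
```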

  • 0
    This only shows that, if $\sum\limits_i1/\lambda_i$ converges, then $X_i\to0$ almost surely, not the other implication.2012-04-16
  • 1
@Dawson there's something wrong here. Firstly, in the second line you have used the tail cdf instead of the proper cdf. It is clear that $\lambda_i$ should go to infinity. In the second part I think the condition should follow from $\sum_n e^{-\lambda_n \varepsilon}<+ \infty$. For example $\lambda_n =n$ satisfies the convergence.2012-04-16
  • 0
    @Kolmo Whoops! I made those changes in error. I will change them back.2012-04-16
1

Part 1 is answered by Dawson. For part 2, we need both Borel–Cantelli lemmas:

  • First Borel–Cantelli lemma: if for every $\epsilon>0$ we have $\sum_i P(X_i>\epsilon) = \sum_i e^{-\lambda_i\epsilon} < \infty$, then $P(X_i>\epsilon \text{ infinitely often})=0$ for every $\epsilon>0$, hence $X_i \to 0$ almost surely. This gives sufficiency.
  • Second Borel–Cantelli lemma (applicable because the $X_i$ are independent): if $\sum_i P(X_i>\epsilon) = \sum_i e^{-\lambda_i\epsilon} = \infty$ for some $\epsilon>0$, then $P(X_i>\epsilon \text{ infinitely often})=1$, hence $X_i$ does not converge to $0$ almost surely. This gives necessity.

This means that $\sum_i e^{-\lambda_i\epsilon}< \infty$ for every $\epsilon>0$ is both necessary and sufficient for $X_i \to 0$ almost surely.
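
A small simulation (a sketch; both rate sequences are hypothetical examples) contrasts a sequence satisfying the condition, $\lambda_i = i$, with one violating it for small $\epsilon$, namely $\lambda_i = \log(i+1)$, for which $P(X_i > \epsilon) = (i+1)^{-\epsilon}$ is not summable when $\epsilon \le 1$:

```python
import math
import random

random.seed(1)

def count_exceedances(lams, eps):
    # One realization of independent X_i ~ Exp(lambda_i); count how many exceed eps
    return sum(random.expovariate(lam) > eps for lam in lams)

N = 20_000
eps = 0.25
fast = [float(i) for i in range(1, N + 1)]         # sum e^{-i*eps} < inf: BC I applies
slow = [math.log(i + 1) for i in range(1, N + 1)]  # sum (i+1)^{-eps} = inf: BC II applies

# fast: finitely many exceedances, consistent with X_i -> 0 a.s.
# slow: exceedances keep occurring, so X_i does not converge to 0 a.s.
print(count_exceedances(fast, eps), count_exceedances(slow, eps))
```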

PS: As noted in a comment, it is better not to rely on Markov's inequality in such proofs: it is too weak to capture the necessity direction and yields only a sufficient condition.

  • 0
Hi, could you elaborate on your proof a little? I'm struggling to follow it at the moment.2018-04-18