
Suppose $X_1, X_2, ...$ are independent random variables with $P(X_n=\sqrt n)=1/\sqrt n $ and $P(X_n=0)=1-1/\sqrt n$. Let $S_n=X_1+X_2+\cdots+X_n$ for all $n$. Show that $S_n/n \rightarrow 1 \space a.s.$
I tried to apply the Borel–Cantelli lemma, but it's not working. I also tried to truncate the random variables by defining $Y_n=X_n\mathbf{1}\{ X_n<1\}$, but that is not working either.
Can anybody help?

  • Can you elaborate on why Borel–Cantelli failed? What happens if you try Chebyshev's inequality? Just to be clear, I'm suggesting you try the form of Borel–Cantelli which says that if $\sum_i P(A_i>e_i)<\infty$ for some $e_i\rightarrow 0$, then the probability of $A_i>0$ infinitely often is zero. (2012-12-08)
  • I tried to show $\sum_i P(|S_n-1|>e)<\infty$, but the series diverges. (2012-12-08)
  • Aren't you showing that $S_n/n$ converges to 1? So shouldn't that be $S_n/n$ in your sum? (2012-12-08)
  • I'm sorry, it is $S_n/n$ in the sum; it was a typo. (2012-12-08)
  • Can you show your work for the divergence? Did you try Chebyshev's inequality? (2012-12-08)
  • That's what I have: $\sum_i P(|S_n/n-1|>e)<\sum_i\frac{Var(S_n/n)}{e^2}=\frac{\sum_i\sqrt n}{n^2e^2}=\infty$ (2012-12-08)
  • I made a mistake when calculating the variance. The variance is complicated since each $X_k$ depends on $k$; $\sqrt k-1$ is just the variance of each $X_k$. (2012-12-08)
  • @BigMike Yes, you are right. Now I think what remains to be done, if you are interested only in convergence in probability, is to show $\frac{1}{n^2} \sum_{k = 1}^n \left( \sqrt{k} - 1 \right) \xrightarrow{n \rightarrow \infty} 0$. (2012-12-08)
  • @Learner It's easy to show this converges in probability, but I don't think that will help us prove almost sure convergence. (2012-12-08)
  • Where did you get this question? Was it in some book, for example? (2012-12-08)
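
For completeness, here is a sketch of the convergence-in-probability step mentioned in the comments (the bound $\sum_{k=1}^n \sqrt k \leq \frac{2}{3}(n+1)^{3/2}$ comes from comparing the sum with an integral). By Chebyshev's inequality, for any $\varepsilon > 0$,

$$P\left(\left|\frac{S_n}{n}-1\right|>\varepsilon\right) \leq \frac{\operatorname{Var}(S_n)}{n^2\varepsilon^2} = \frac{\sum_{k=1}^n(\sqrt k -1)}{n^2\varepsilon^2} \leq \frac{\frac{2}{3}(n+1)^{3/2}}{n^2\varepsilon^2} \xrightarrow{n\to\infty} 0.$$

Note that these bounds are of order $n^{-1/2}$, hence not summable, which is why the direct Borel–Cantelli attempt from the comments does not yield almost sure convergence.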

1 Answer


I think it's easier to use the Strong Law of Large Numbers. The following theorem is useful:

Theorem (Kolmogorov's strong law) Let $X_j \in L^2$ be independent random variables such that $\sum_{j \geq 1} \frac{\text{var}(X_j)}{j^2}<\infty$. Then $(X_j)$ satisfies the Strong Law of Large Numbers, i.e.

$$\frac{1}{n} \sum_{j=1}^n (X_j-\mathbb{E}X_j) \to 0 \quad \text{a.s.}$$

(See Sen & Singer (1993, Theorem 2.3.10)).

In this case we have $\mathbb{E}X_j = 1$ and $\text{var}(X_j) = \sqrt{j}-1 \leq \sqrt{j}$, hence $\sum_{j \geq 1} \frac{\text{var}(X_j)}{j^2} \leq \sum_{j \geq 1} j^{-3/2} < \infty$. Thus (by Kolmogorov's strong law)

$$ \frac{S_n}{n}- \underbrace{\frac{\mathbb{E}S_n}{n}}_{1} \to 0 \quad \text{a.s.}$$
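As a sanity check (an illustration, not a proof), one can simulate the $X_n$ and watch $S_n/n$ settle near 1; the sample size `N` and the seed below are arbitrary choices for this sketch.

```python
import numpy as np

# Monte Carlo illustration only: simulate independent X_n with
# P(X_n = sqrt(n)) = 1/sqrt(n) and P(X_n = 0) = 1 - 1/sqrt(n),
# then look at the running average S_n / n.
rng = np.random.default_rng(12345)

N = 200_000
n = np.arange(1, N + 1)
p = 1.0 / np.sqrt(n)                   # P(X_n = sqrt(n))
x = np.sqrt(n) * (rng.random(N) < p)   # one realization of each X_n
s_over_n = np.cumsum(x) / n            # S_n / n for n = 1..N

print(s_over_n[-1])                    # close to 1 for large N
```

Since $\operatorname{Var}(S_N/N)$ is of order $N^{-1/2}$, the final value should be within a few hundredths of 1 for this `N`.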

  • Thanks for the advice, but I think I figured out how to prove it. The trick is to use Kronecker's Lemma. (2012-12-09)
  • To which sequence do you apply Kronecker's Lemma? (2012-12-09)
  • Define a new random variable $Y_n=X_n-1$. Then this new RV has mean zero, and $\sum_{n=1}^\infty Y_n/n$ converges a.s. Then apply Kronecker's Lemma. (2012-12-09)
  • You could post it as an answer (to your own question); it would be interesting to see your way of solving it. (2012-12-09)
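
A sketch of the Kronecker-lemma route from the comments (assuming Kolmogorov's one-series theorem for independent, mean-zero random variables with summable variances): with $Y_n = X_n - 1$,

$$\mathbb{E}Y_n = 0, \qquad \sum_{n \geq 1} \operatorname{Var}\!\left(\frac{Y_n}{n}\right) = \sum_{n \geq 1} \frac{\sqrt{n}-1}{n^2} < \infty,$$

so $\sum_{n \geq 1} Y_n/n$ converges a.s. (to a finite random limit, not necessarily 0). Kronecker's lemma then gives

$$\frac{1}{n}\sum_{k=1}^{n} Y_k \to 0 \ \text{a.s.}, \qquad \text{i.e.} \qquad \frac{S_n}{n} \to 1 \ \text{a.s.}$$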