
Could anyone help solve this problem in probability? Thank you very much!

Let $\{X_{k}\}_{k\in \mathbb{N}}$ be i.i.d. $0$-$1$ random variables. Determine the asymptotic behavior of the random walk $S_{n}:=\sum\limits_{k=1}^{n}\frac{1}{k}X_{k}$.

You may begin by showing that the tail of the random walk decays to zero. Namely, $\forall \epsilon>0$, $$ \Pr [ |T_{k}|>\epsilon \text{ infinitely often}]=0 ,$$ where $T_{k}:=\sum\limits_{i=k}^{\infty}\frac{1}{i}X_{i}$.

  • Is this homework? What have you tried? (2011-09-09)
  • Are you sure the values are supposed to be $0$ and $1$, not $-1$ and $1$? As it is, $T_k = \infty$ almost surely (except in the trivial case where all $X_k = 0$ almost surely). (2011-09-09)
  • @Robert I don't think $\pm 1$ alone is enough; shouldn't they be zero-mean random variables as well? (2011-09-09)
  • Perhaps a bit nitpicky, but I do not understand why $T_k$ is indexed with $k$, when $k$ is already used for the $X_k$. Writing $T_n = \sum_{k=n}^{\infty} \frac{1}{k} X_k$ would fit better with the notation $S_n$ and $X_k$. (2011-09-09)
  • @Srivatsan: yes, of course, but the point is that with $0$ and $1$ there is no nontrivial case that works. Actually, I suspect that what was meant was $N(0,1)$, i.e. normal with mean $0$ and variance $1$. (2011-09-10)
  • @Robert No, the $X_k$ are assumed to be i.i.d. $0$-$1$ variables, and we do not know the distribution. Also, $T$ is indexed by $k$ because there is some kind of symmetry here. (2011-09-13)

1 Answer


Assume that $\mathrm P(X_n=1)=x$ and $\mathrm P(X_n=0)=1-x$, hence $\mathrm E(X_n)=x$ for every $n$, and let $H_n=\sum\limits_{k=1}^n\frac1k$ denote the $n$th harmonic number, hence $H_n=\log(n)+\gamma+o(1)$.

Then $S_n-xH_n$ converges almost surely and in $L^2$ to an almost surely finite centered random variable $Y$ with variance $\mathrm E(Y^2)=x(1-x)\frac{\pi^2}6$.

In particular, $\frac1{\log n}S_n\to x$ almost surely.
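One can check the claim $\frac1{\log n}S_n\to x$ numerically. The sketch below simulates a single long trajectory of the walk with a hypothetical choice $x=0.3$ (the success probability is not specified in the question, so this value is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = 0.3      # assumed P(X_k = 1); hypothetical choice for illustration
n = 10**6

# Draw X_1, ..., X_n as i.i.d. Bernoulli(x) and form S_n = sum_{k=1}^n X_k / k.
X = (rng.random(n) < x).astype(float)
S_n = np.sum(X / np.arange(1, n + 1))

# By the answer's claim, S_n / log(n) should be close to x for large n.
ratio = S_n / np.log(n)
print(ratio)
```

For $n=10^6$ the ratio lands near $0.3$; the residual discrepancy is of order $(\gamma + Y)/\log n$, consistent with $S_n = xH_n + Y + o(1)$.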

To see this, consider $Y_n=\sum\limits_{k=1}^n\frac1k(X_k-x)$. The random variables $X_k-x$ are centered, square integrable with variance $x(1-x)$, and independent. Hence, for every $n$, $Y_n$ is centered with variance $\sum\limits_{k=1}^n\frac1{k^2}\mathrm E((X_k-x)^2)=\sum\limits_{k=1}^n\frac1{k^2}x(1-x)$. The series $\sum\limits_k\frac1{k^2}x(1-x)$ converges hence $(Y_n)$ converges in $L^2$. Since $(Y_n)$ is the sequence of the partial sums of some independent random variables, a result due to Paul Lévy ensures that $(Y_n)$ converges almost surely as well.
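The variance computation above can also be checked by Monte Carlo: simulating many independent copies of $Y_n=\sum_{k=1}^n\frac1k(X_k-x)$ for moderately large $n$, the empirical variance should approach $x(1-x)\frac{\pi^2}6$. The parameters below (number of trials, truncation level $n$, and $x=0.5$) are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
x, n, trials = 0.5, 2000, 5000   # hypothetical simulation parameters

k = np.arange(1, n + 1)
# Each row is one realization of (X_1, ..., X_n) with X_k ~ Bernoulli(x).
X = (rng.random((trials, n)) < x).astype(float)
# Partial sums Y_n = sum_{k=1}^n (X_k - x) / k, one per trial.
Y_n = ((X - x) / k).sum(axis=1)

empirical = Y_n.var()
theoretical = x * (1 - x) * np.pi**2 / 6
print(empirical, theoretical)
```

The truncation at $n=2000$ only perturbs the variance by $x(1-x)\sum_{k>n}k^{-2}\approx x(1-x)/n$, which is negligible next to the Monte Carlo sampling error.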