
I'm curious about what I believe to be a necessary condition for a sequence of partial sums $S_n$ of i.i.d. random variables $X_i$ to converge almost surely to $+\infty$. Intuitively it seems obvious that $EX_i \neq 0$ is necessary, but I cannot prove it.

Some things I'm sure won't work: the SLLN (it can't be applied, since it could be that $E|X_i| = +\infty$), and I don't think Kolmogorov's three-series theorem is useful here. Neither are the Monotone or Dominated Convergence theorems, clearly. Any other ideas? Or is there a counterexample to the statement?

  • Look at the section of Durrett's book dealing with random walks. In particular, for a mean-zero random walk you either have $S_n=0$ for all $n$, or $-\infty=\liminf S_n<\limsup S_n=\infty$ with probability one. You can find a link to his book online in my answer here: http://math.stackexchange.com/questions/2114109/random-walk-with-maximum-and-minimum/2116747#2116747 (2017-02-02)
  • It was an exercise in Durrett that prompted the question. :) Exercise 4.1.11 (i), specifically. I see how $P(\overline{\beta} = \infty) = 0$ follows if $P(\alpha < \infty) < 1$. If $P(\alpha < \infty) = 1$, then $P(\sup S_n = \infty) = 1$ by an earlier exercise. Then Hewitt–Savage implies either $S_n \to \infty$ a.s. or $P(\limsup S_n = \infty) = P(\liminf S_n = -\infty) = 1$. I'm not sure what to do in the $S_n \to \infty$ case. (2017-02-03)

1 Answer


You can prove this exercise using Wald's equation, proven just below in Durrett's book.

Suppose that $E X_i=0.$ If $E\alpha$ were finite, then Wald's equation would give $0 = EX_1 \, E\alpha = ES_\alpha$. But $\alpha$ is the first time the walk is strictly positive, so $S_\alpha > 0$ a.s. and hence $ES_\alpha > 0$, a contradiction. Therefore $E\alpha = \infty$.
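As a numerical sanity check on Wald's equation $ES_T = EX_1 \, ET$ (my own illustration, not part of the answer: it uses a hypothetical biased $\pm 1$ walk with $P(X=1)=0.6$, so $EX_1 = 0.2$, and the stopping time $T = \inf\{n : S_n = 1\}$, for which classically $ET = 1/(p-q) = 5$):

```python
import random

# Monte Carlo check of Wald's equation E[S_T] = E[X_1] * E[T].
# Biased +-1 walk, P(X=1) = 0.6, so E[X_1] = 0.2; T is the first hitting
# time of level 1, finite a.s. because the drift is positive.
random.seed(1)
p, trials = 0.6, 50_000
total_T = 0
for _ in range(trials):
    S, n = 0, 0
    while S < 1:
        S += 1 if random.random() < p else -1
        n += 1
    total_T += n
mean_T = total_T / trials

# S_T = 1 on every trial, so E[S_T] = 1 should match E[X_1] * E[T] = 0.2 * E[T].
print("E[T] estimate:       ", mean_T)        # should be near 5
print("E[X_1]*E[T] estimate:", 0.2 * mean_T)  # should be near 1 = E[S_T]
```

Note this check requires $ET < \infty$, which holds here because of the drift; the point of the exercise is precisely that in the mean-zero case $E\alpha = \infty$, so Wald's identity cannot hold with a finite right-hand side.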