
Let $\{X_n\}_{n \geq 1}$ be a sequence of random variables with $\mathbb{E}[X_n] = u$ for all $n$, and suppose $\lim_{n \to \infty}\mathrm{Var}[X_n] = 0$. Does it follow that $X_n$ converges to the constant $u$ almost surely?


My question actually comes from proving that the quadratic variation of Brownian motion $B(t)$ is $t$. I was wondering how the above argument, applied to $X_n = \sum_i[B(t_i^n)-B(t_{i-1}^n)]^2$, implies that the quadratic variation of $B(t)$ is $t$.
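For reference, here is a sketch of the standard computation for this $X_n$, assuming partitions $0 = t_0^n < \dots < t_{k_n}^n = t$ with mesh $\delta_n = \max_i (t_i^n - t_{i-1}^n) \to 0$. Write $\Delta_i B = B(t_i^n) - B(t_{i-1}^n)$ and $\Delta_i t = t_i^n - t_{i-1}^n$. Then

$$\mathbb{E}[X_n] = \sum_i \mathbb{E}\big[(\Delta_i B)^2\big] = \sum_i \Delta_i t = t,$$

and, by independence of increments and $\mathbb{E}[(\Delta_i B)^4] = 3(\Delta_i t)^2$,

$$\mathrm{Var}[X_n] = \sum_i \mathrm{Var}\big[(\Delta_i B)^2\big] = \sum_i 2(\Delta_i t)^2 \leq 2t\,\delta_n \to 0,$$

so the hypotheses above hold with $u = t$.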

  • 0
    As far as I know, the approximate quadratic variation need not converge a.s. without extra conditions. Refer to [this](http://math.stackexchange.com/questions/1853219/quadratic-variation-of-brownian-motion-doesnt-converge-almost-surely), for instance. I am not sure how you would relate those two problems. (2017-02-23)
  • 0
    It's a theorem in my textbook; see [this book](https://books.google.com/books?id=JYzW0uqQxB0C&lpg=PA63&vq=quadratic%20variance&pg=PA63#v=snippet&q=quadratic%20variance&f=false), page 63. (2017-02-23)

1 Answer


Yes, and the easiest way to prove this is Chebyshev's Inequality.
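Concretely, for any $\varepsilon > 0$, Chebyshev's Inequality gives

$$\mathbb{P}\big(|X_n - u| \geq \varepsilon\big) \leq \frac{\mathrm{Var}[X_n]}{\varepsilon^2} \to 0,$$

which is convergence in probability $X_n \to u$; as the comments below note, this is all that Chebyshev alone yields.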

Edit: Good point; you can get almost sure convergence from Borel-Cantelli if the variances are summable. In your example, I think we already know that the quadratic variation has an a.s. limit, so the variance argument is just identifying what that limit is. I'm blanking on the details now, but that is likely the argument.
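The Borel-Cantelli upgrade, sketched: if $\sum_n \mathrm{Var}[X_n] < \infty$, then for each $\varepsilon > 0$,

$$\sum_n \mathbb{P}\big(|X_n - u| \geq \varepsilon\big) \leq \varepsilon^{-2} \sum_n \mathrm{Var}[X_n] < \infty,$$

so by Borel-Cantelli, almost surely $|X_n - u| \geq \varepsilon$ for only finitely many $n$; taking $\varepsilon \downarrow 0$ along a countable sequence gives $X_n \to u$ a.s. For the dyadic partitions $t_i^n = i t/2^n$ of the quadratic-variation example, $\mathrm{Var}[X_n] = 2 \cdot 2^n (t/2^n)^2 = 2t^2/2^n$, which is summable, so along dyadic partitions the convergence is almost sure.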

  • 2
    I guess Chebyshev's Inequality only implies convergence in probability? (2017-02-23)
  • 0
    Yes, it only implies convergence in probability. (2018-04-23)