
In our probability theory class, we are supposed to solve the following problem:

Let $X_n$, $n \geq 1 $ be a sequence of independent random variables such that $ \mathbb{E}[X_n] = 0, \mathbb{Var}(X_n) = \sigma_n^2 < + \infty $ and $ | X_n | \leq K, $ for some constant $ 0 \leq K < + \infty, \ \forall n \geq 1$.

Use martingale methods to show that $$ \sum \limits_{n = 1}^{\infty} \ X_n \ \mbox{ converges } \mathbb P-\mbox{a.s.} \ \Longrightarrow \ \sum\limits_{n = 1}^{\infty} \ \sigma_n^2 < + \infty .$$

Could anybody give me a hint? Thanks a lot for your help!

Regards, Si

  • What do you know about martingales? – 2011-12-09
  • I know the definition: Let $(\Omega, \mathcal{F}, P)$ be a probability space and $(\mathcal{F}_n)$ a filtration. An adapted sequence of integrable random variables $X_n$ is called a martingale if $E[X_{n+1} \mid \mathcal{F}_n] = X_n, \ \forall n \geq 1$. – 2011-12-09
  • Furthermore, I know the following proposition: Let $X_n, n \geq 1,$ and $X$ be random variables such that $X_n \rightarrow X$ $P$-a.s. Then the following are equivalent: (1) $\{X_n, n \geq 1\}$ is uniformly integrable; (2) $X_n \rightarrow X$ in $L^1$. – 2011-12-09
  • Also, I know the convergence theorem: Let $X_n, n \geq 1,$ be a submartingale such that $\sup_{n \geq 1} E[X_n^+] < +\infty.$ Then the sequence $X_n$ converges $P$-a.s. to some r.v. $X$ satisfying $E[|X|] < +\infty.$ – 2011-12-09
  • Then I know the convergence theorem for $L^p$: Let $X_n, n \geq 1,$ be a martingale such that $\sup_{n \geq 1} E[|X_n|^p] < +\infty$ for some $p > 1.$ Then $X_n \rightarrow X$ $P$-a.s. and in $L^p$, for some r.v. $X.$ – 2011-12-09
  • @DavideGiraudo: Does our exercise have to do with any of the above? Unfortunately, I don't see how any of it could be applied. Thanks again for your help! – 2011-12-09
  • The first idea I would have (it doesn't mean it will work!) is the following: put $S_n:=\sum_{k=1}^n X_k$ and $\mathcal F_n:=\sigma(X_k,\,1\leq k\leq n)$. Then $\{S_n\}$ is an $(\mathcal F_n)$-martingale. There are also some particular results for $L^2$ martingales. Do you know them? – 2011-12-09
  • @DavideGiraudo: Unfortunately, I only know the $L^p$ results which I have stated above. Is there something special about the case $p = 2$? Thanks a lot for your help, Davide! – 2011-12-09
  • Maybe it would be better to use $Y_n:=(\sum_{j=1}^n X_j)^2 - \sum_{j=1}^n \sigma_j^2$. – 2011-12-09
  • Hmmm, I'm not sure it would be better to use $Y_n$. Simply because: how can we make use of the almost sure convergence of $\sum_n X_n$ in this setting? – 2011-12-10
  • Let us [continue this discussion in chat](http://chat.stackexchange.com/rooms/1960/discussion-between-davide-giraudo-and-mad-si). – 2011-12-10

1 Answer


Set $S_n:=\sum_{i=1}^n X_i$ and $S_0=0$. Then $(S_n)$ is a martingale (w.r.t. the natural filtration), so $(S_n^2)$ is a submartingale. By Doob's decomposition we can write $S^2=M+A$, where $M$ is a martingale and $A$ is a predictable increasing process, both null at $0$. It turns out that $A_n=\sum_{i=1}^n \sigma_i^2$.
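To see why the compensator is $A_n=\sum_{i=1}^n \sigma_i^2$, one can compute (a short sketch, using only the independence and the given moments):

```latex
\begin{align*}
E[S_n^2 \mid \mathcal{F}_{n-1}]
  &= E[(S_{n-1}+X_n)^2 \mid \mathcal{F}_{n-1}] \\
  &= S_{n-1}^2 + 2 S_{n-1}\, E[X_n \mid \mathcal{F}_{n-1}]
     + E[X_n^2 \mid \mathcal{F}_{n-1}] \\
  &= S_{n-1}^2 + \sigma_n^2,
\end{align*}
```

since $X_n$ is independent of $\mathcal{F}_{n-1}$ with $E[X_n]=0$ and $E[X_n^2]=\sigma_n^2$. Hence $S_n^2 - \sum_{i=1}^n \sigma_i^2$ is a martingale, and $A_n = \sum_{i=1}^n \sigma_i^2$ is indeed predictable (here even deterministic) and increasing.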

Define the stopping time $T_\alpha=\inf\{n \geq 0 \ : \ |S_n|>\alpha\}$. Since $T_\alpha \wedge n$ is a bounded stopping time, the optional stopping theorem applied to the martingale $M=S^2-A$ gives $$E[S_{T_\alpha \wedge n}^2]-E[A_{T_\alpha \wedge n}]=0 \qquad \forall n.$$

Also $|S_{T_\alpha \wedge n}| \leq \alpha+K$: before time $T_\alpha$ we have $|S_n|\leq\alpha$ by definition, and at time $T_\alpha$ the overshoot beyond $\alpha$ is at most the size of the last step, which is bounded by $K$ since $|X_i| \leq K$. Hence $$E[A_{T_\alpha \wedge n}] \leq (\alpha+K)^2 \qquad \forall n.$$

Since $\sum X_i$ converges a.s., the partial sums $S_n$ are a.s. bounded, so for $\alpha$ large enough the event $\{T_{\alpha}=\infty\}$ has positive probability. On this event $A_{T_\alpha \wedge n}=A_n$, and since $A_n=\sum_{i=1}^n \sigma_i^2$ is deterministic, the last inequality yields $$A_n \, P(T_\alpha=\infty) \leq E[A_{T_\alpha \wedge n}] \leq (\alpha+K)^2.$$ Letting $n\to\infty$ gives $A_{\infty}=\sum\sigma_n^2 \leq (\alpha+K)^2/P(T_\alpha=\infty) < \infty$.
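As a quick numerical sanity check of the identity $E[S_n^2]=\sum_{i=1}^n \sigma_i^2$ that drives the argument, here is a small simulation (not part of the proof; the choice $X_i=\pm\sigma_i$ with equal probability is just one convenient bounded, mean-zero, independent example):

```python
import random

def mean_square_partial_sum(sigmas, trials=20000, seed=0):
    """Estimate E[S_n^2] for S_n = X_1 + ... + X_n with independent
    steps X_i = +sigma_i or -sigma_i (each with probability 1/2),
    so E[X_i] = 0, Var(X_i) = sigma_i^2, and |X_i| <= max(sigmas)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s = 0.0
        for sig in sigmas:
            s += sig if rng.random() < 0.5 else -sig
        total += s * s
    return total / trials

sigmas = [1.0 / k for k in range(1, 11)]   # sigma_k = 1/k, so sum sigma_k^2 < oo
predicted = sum(s * s for s in sigmas)     # the compensator A_n = sum sigma_i^2
empirical = mean_square_partial_sum(sigmas)
```

Here `empirical` should be close to `predicted` (about $\sum_{k=1}^{10} 1/k^2 \approx 1.55$), matching the compensator $A_n$ from Doob's decomposition.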

P.S.: I hope this is clear; I am a new user and I want to say hi to everyone. Also, I noticed that I cannot comment on some posts; is there any restriction? Thanks.