
I'm reading "Stochastic Processes" by Doob, and I have a question about the following corollary:

[image: statement of the corollary]

The proof is here:

[image: Doob's proof of the corollary]

where Theorem 4.1 is:

[image: statement of Theorem 4.1]

My question is: why does $\lim_{n \to \infty} x_n(\omega)$ exist a.e.?

Is it possible that $P\big(\{\limsup_{n} x_n(\omega) = \infty\} \cap \{\liminf_{n} x_n(\omega) = -\infty\}\big)>0$?

1 Answer


The series $\sum_{j=1}^{\infty} y_j(\omega)$ converges for almost all $\omega$ for which $\sum_{j=1}^{\infty} p_j(\omega)$ converges, and conversely.

Let $\omega$ be such that $\sum_{j=1}^{\infty} y_j(\omega)$ converges. Then, by the non-negativity of the $y_j$,

$$x_n(\omega) \leq \sum_{j=1}^n y_j(\omega) \leq \sum_{j=1}^{\infty} y_j(\omega)<\infty,$$

hence $\limsup_{n \to \infty} x_n(\omega)<\infty$. Now it follows from Theorem 4.1 that $\lim_{n \to \infty} x_n(\omega)$ exists and is finite (up to a null set), and this, in turn, implies that $\sum_{j=1}^{\infty} p_j(\omega)<\infty$.

Conversely, if $\sum_{j=1}^{\infty} p_j(\omega)<\infty$ for some $\omega$, then

$$\liminf_{n \to \infty} x_n(\omega) \geq - \sum_{j=1}^{\infty} p_j(\omega)>-\infty$$

and again it follows from Theorem 4.1 that $\lim_{n \to \infty} x_n(\omega)$ exists and is finite (up to a null set), which implies that $\sum_{j=1}^{\infty} y_j(\omega)$ converges.

This proves the assertion.
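As a numerical sanity check, the dichotomy is easy to see in the simplest setting where the $y_j$ are independent: then $p_j = E(y_j \mid \mathscr{F}_{j-1})$ is just the constant $E(y_j)$, and the two series converge or diverge together. Here is a minimal sketch; the Bernoulli choice of $y_j$ and the success probabilities $q(j)$ are my own illustration, not Doob's setup:

```python
import random

random.seed(0)

def simulate(q, N=200_000):
    """Return (sum of y_j, sum of p_j) for j = 1..N, where
    y_j ~ Bernoulli(q(j)) independently, so that the conditional
    expectation p_j = E(y_j | F_{j-1}) is just the constant q(j)."""
    s_y = s_p = 0.0
    for j in range(1, N + 1):
        p = q(j)
        s_y += 1.0 if random.random() < p else 0.0
        s_p += p
    return s_y, s_p

# q(j) = 1/j^2: sum p_j converges, and sum y_j is a.s. finite too.
sy1, sp1 = simulate(lambda j: 1.0 / j**2)

# q(j) = 1/j: sum p_j diverges, and sum y_j grows without bound as well
# (by the second Borel-Cantelli lemma, infinitely many y_j equal 1).
sy2, sp2 = simulate(lambda j: min(1.0, 1.0 / j))

print(sy1, sp1)  # both stay small and stable as N grows
print(sy2, sp2)  # both grow without bound (roughly like log N)
```

This only illustrates the independent special case; the point of Doob's corollary is precisely the conditional (Lévy) extension, in which $p_j$ may itself be random.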

  • But this is just another way to prove the original assertion; my question is about Doob's proof. He uses the conclusion that $P(\lim_{n} x_n(\omega) \text{ exists})=1$ to deduce that $\sum_n y_n$ and $\sum_n E(y_n \mid \mathscr{F}_{n-1})$ must converge and diverge together. This would mean that even when both series diverge, $\lim_{n} x_n(\omega)$ still exists. I wonder how to obtain this assertion, i.e. how to prove that $\lim_{n} x_n(\omega)$ exists almost surely. Thanks in advance! – 2017-02-03
  • @JyChen It's not "another way" to prove the original assertion; I use essentially the same reasoning as in Doob's proof. If $\sum_j y_j$ converges almost everywhere (or $\sum_j p_j$ converges almost everywhere), then my proof shows $$\mathbb{P}(\liminf_n x_n = -\infty, \limsup_n x_n=\infty)=0. \tag{1}$$ Although I don't have a concrete counterexample, I don't think that $(1)$ holds true if $\sum_j y_j(\omega)$ converges only for some $\omega$. – 2017-02-03
  • Yes, if $\sum_j y_j$ converges almost everywhere, then $(1)$ holds true. However, if $P(\sum_j y_j = \infty) > 0$, your proof still shows, without using $(1)$, that the set $\{\omega: \sum_j y_j(\omega) \text{ converges}\}$ coincides (up to a null set) with $\{\omega: \sum_j p_j(\omega) \text{ converges}\}$. In that situation, though, Doob's proof still needs $(1)$ to hold, so that $\lim_n x_n$ exists and the two series converge and diverge together. Or do I misunderstand Doob's proof? – 2017-02-03
  • @JyChen No, I agree with you... I think that "my proof" is actually exactly what Doob had in mind when he wrote his proof, since the idea is exactly the same as in Doob's proof. – 2017-02-04
  • Yes, so I can take it that Doob simply reduces the whole probability space to $\{\sum_j y_j \text{ converges}\} \cup \{\sum_j p_j \text{ converges}\}$ and then proves the assertion there... – 2017-02-04