
I have the following from a book:

Assume that $$ P_x(\tau_C \circ \theta_{(k-1)N} > N \mid \mathcal F_{(k-1)N}) = P_{X_{(k-1)N}}(\tau_C > N). $$ Integrating over $\{ \tau_C > (k-1)N\}$ using the definition of conditional probability we have $$ P_x(\tau_C > kN) = E_x\left(\mathbf{1}\{\tau_C \circ \theta_{(k-1)N} > N\} \cdot\mathbf{1}\{\tau_C > (k-1)N\}\right). $$

I'm a bit unsure how he gets from the first equality to the second one. I can see that multiplying the LHS of the first equality by $\mathbf{1}\{\tau_C > (k-1)N\}$ and then taking the expectation $E_x$ yields the RHS of the second equality, but how does "integrating over" the RHS of the first equality, as claimed, produce the LHS of the second equality?

Here, $\tau_C$ is the first hitting time of some set $C$.

Thanks.

  • The second equality holds for every random process, be it Markov or not, since the events $\{ \tau_C > kN\}$ and $\{\tau_C \circ \theta_{(k-1)N} > N\}\cap\{\tau_C > (k-1)N\}$ coincide. (2012-02-28)
  • I can see that these events are equal when I write it out properly, but the author of the book seems to suggest a different method for calculating this. (2012-02-28)
  • Hence: never take at face value what is written in a book... (2012-02-28)
  • Thanks for the help. Is there an easy way to see this? I find having to write the first set in $\{\tau_C \circ \theta_{(k-1)N} > N\}\cap\{\tau_C > (k-1)N\}$ in terms of the state space, using the shift of the sequence, not very intuitive. (2012-02-28)
  • @Lax: to ask anyone on this website, use the "@" syntax as I did in this message; otherwise the person won't receive your comment in their inbox. Answering your last comment: $$ \{\omega:\tau_C(\omega)>n\} = \bigcap\limits_{i=0}^{n}\{\omega:X_i(\omega)\in C^c\} $$ and $$ \{\omega:(\tau_C\circ\theta_{j})(\omega)>n\} = \bigcap\limits_{i=j}^{n+j}\{\omega:X_i(\omega)\in C^c\} $$ (2012-02-29)
  • Now, if you apply the first formula with $n = kN$ and with $n = (k-1)N$, and the second with $j = (k-1)N$ and $n = N$, you get what you want. With regards to "integrate $f(\omega)$ over the set $A$", it means $$ \int\limits_{A}f(\omega)\,\mathsf P(d\omega) = \mathsf E[f\cdot \mathbf 1_A], $$ which is a hint for what is written in the book. I didn't check, though, whether the second formula in the book follows so easily from the first plus integrating, since that is not the easiest way to obtain the latter formula. (2012-02-29)
  • @Ilya Thanks very much, I'm grateful. (2012-03-03)
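The integration hint in the comments above, $\int_A f\,\mathsf P(d\omega) = \mathsf E[f\cdot\mathbf 1_A]$ applied with $f = P_{X_{(k-1)N}}(\tau_C > N)$ and $A = \{\tau_C > (k-1)N\}$, can be checked numerically. The sketch below uses a hypothetical 3-state Markov chain with $C=\{0\}$ (the transition matrix, the parameter values, and all names are my own illustration, not from the book): restricted to $C^c$, the sub-stochastic matrix $Q$ gives $P_y(\tau_C > n) = (Q^n\mathbf 1)(y)$, and the code verifies that integrating $P_{X_{(k-1)N}}(\tau_C > N)$ over $\{\tau_C > (k-1)N\}$ recovers $P_x(\tau_C > kN)$.

```python
# Hypothetical 3-state Markov chain on {0, 1, 2} with C = {0}.
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]

# Sub-stochastic matrix Q: P restricted to C^c = {1, 2}.
Q = [[P[1][1], P[1][2]],
     [P[2][1], P[2][2]]]

def mat_vec(M, v):
    """M @ v for small lists-of-lists."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def vec_mat(v, M):
    """v @ M (row vector times matrix)."""
    return [sum(v[i] * M[i][j] for i in range(len(v))) for j in range(len(M))]

def survival(n):
    """q_n(y) = P_y(tau_C > n) for y in C^c, computed as q_n = Q^n 1."""
    q = [1.0, 1.0]
    for _ in range(n):
        q = mat_vec(Q, q)
    return q

N, k = 4, 3
# LHS: P_x(tau_C > kN), starting from x = 1 (index 0 of C^c).
lhs = survival(k * N)[0]

# RHS: E_x[ 1{tau_C > (k-1)N} * P_{X_{(k-1)N}}(tau_C > N) ].
# Row vector v(z) = P_x(tau_C > (k-1)N, X_{(k-1)N} = z) for z in C^c:
v = [1.0, 0.0]
for _ in range((k - 1) * N):
    v = vec_mat(v, Q)
rhs = sum(v_z * q_z for v_z, q_z in zip(v, survival(N)))

assert abs(lhs - rhs) < 1e-12  # the two quantities agree up to round-off
```

In matrix form this is just $e_x Q^{(k-1)N}\cdot Q^N\mathbf 1 = e_x Q^{kN}\mathbf 1$, which is exactly the Markov-property step the book is invoking.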

1 Answer


The second equality holds for every random process, be it Markov or not, since, for every $k\geqslant1$, the events $\{ \tau_C \gt kN\}$ and $\{\tau_C \circ \theta_{(k-1)N} \gt N\}\cap\{\tau_C \gt (k-1)N\}$ coincide.

Proof:

Introducing the canonical process $(X_i)_{i\geqslant0}$, for all nonnegative integers $n$ and $m$, $$\{ \tau_C \gt n\}=\{X_i\notin C\ \text{for every}\ 1\leqslant i\leqslant n\}\quad\text{and}\quad\{\tau_C \circ \theta_m \gt n\}=\{X_i\notin C\ \text{for every}\ m+1\leqslant i\leqslant m+n\}.$$ Use the first identity for $n=kN$ and for $n=(k-1)N$, and the second identity for $n=N$ and $m=(k-1)N$.
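Since the identity is purely pathwise, it can also be checked mechanically on simulated trajectories. The following Python sketch (the random walk, the set $C=\{0\}$, and the helper names are my own illustration) uses the convention $\tau_C=\inf\{i\geqslant1: X_i\in C\}$ from the proof above and asserts $\{\tau_C > kN\} = \{\tau_C\circ\theta_{(k-1)N} > N\}\cap\{\tau_C > (k-1)N\}$ on each sample path:

```python
import random

def tau_C(path, C):
    """First hitting time of C: smallest i >= 1 with path[i] in C (inf if never)."""
    for i in range(1, len(path)):
        if path[i] in C:
            return i
    return float("inf")

def shift(path, m):
    """Shift operator theta_m: the path (X_m, X_{m+1}, ...)."""
    return path[m:]

random.seed(0)
C = {0}      # hypothetical target set
N, k = 5, 3
for _ in range(1000):
    # simple symmetric random walk from x = 2, long enough to decide tau_C > kN
    path = [2]
    for _ in range(k * N + 1):
        path.append(path[-1] + random.choice([-1, 1]))
    lhs = tau_C(path, C) > k * N
    rhs = tau_C(shift(path, (k - 1) * N), C) > N and tau_C(path, C) > (k - 1) * N
    assert lhs == rhs  # the two events coincide pathwise
```

Note that the check uses no transition probabilities at all, which is exactly the point of the answer: the identity holds path by path, for any process.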