I recently had great success with my first question here, so I will boldly go on to a second. Here goes:
I'm studying Markov chains in Rick Durrett's *Probability: Theory and Examples*, and I'm stuck on the definition of the strong Markov property. I know more or less what it should say, but I don't understand his way of stating it. I'm going to give you a lot of information, hopefully enough, but please ask for more if you need it.
Some definitions: we have a nice measurable space $(S,\mathcal{S})$ ("nice" meaning there is a 1-1 measurable map between $S$ and $\mathbb{R}$ with measurable inverse), and we define
$$\Omega=\{(\omega_{1},\omega_{2},\dots):\omega_{i}\in S\},\qquad \mathcal{F}=\mathcal{S}\times\mathcal{S}\times\cdots,$$
$$P=\mu\times\mu\times\cdots\ \text{ where }\mu\text{ is the distribution of }X_{i},\qquad X_{n}(\omega)=\omega_{n}.$$
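To make the coordinate-map picture concrete for myself, here is a toy sketch (my own illustration, not from the book): a finite tuple stands in for an infinite sequence $\omega$, $X_n$ just reads off the $n$-th coordinate, and the shift drops initial coordinates.

```python
# A tiny sketch of the coordinate maps and the shift operator; a finite
# tuple stands in for an infinite sequence omega, and all values are
# made up for illustration.
omega = (3, 1, 4, 1, 5)          # an illustrative sample point

def X(n, w):
    return w[n]                  # X_n(omega) = omega_n

def theta(k, w):
    return w[k:]                 # shift: drop the first k coordinates

# The composition rule behind Y o theta_N:
# X_n(theta_k(omega)) = X_{n+k}(omega)
assert X(1, theta(2, omega)) == X(3, omega)
print(X(1, theta(2, omega)))     # prints 1
```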
We have
$$P(X_{j}\in B_{j},\ 0\leq j\leq n)=\int_{B_{0}}\mu(dx_{0})\int_{B_{1}}p(x_{0},dx_{1})\cdots\int_{B_{n}}p(x_{n-1},dx_{n}),$$
where the $p$'s are transition probabilities (for fixed $x$, the first variable, $p(x,\cdot)$ is a probability measure; for a fixed set, the second variable, $p(\cdot,B)$ is a measurable function). These finite-dimensional measures are consistent, so Kolmogorov's extension theorem gives us the measure on the infinite product space (as I understand it).
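For a finite state space the integrals above become sums over paths, which I find easier to see in a small numeric check (a made-up two-state chain of my own, not from Durrett):

```python
# Hypothetical two-state chain illustrating the finite-dimensional
# formula P(X_j in B_j, 0 <= j <= n) as a sum over paths of
# mu(x_0) p(x_0, x_1) ... p(x_{n-1}, x_n). All numbers are made up.
from itertools import product

mu = {0: 0.5, 1: 0.5}                      # initial distribution
p = {0: {0: 0.9, 1: 0.1},                  # transition probabilities p(x, y)
     1: {0: 0.2, 1: 0.8}}

def path_prob(B):
    """P(X_0 in B[0], ..., X_n in B[n]) by summing over all paths."""
    total = 0.0
    for path in product(*B):               # one term per path (x_0, ..., x_n)
        w = mu[path[0]]                    # mu(dx_0)
        for a, b in zip(path, path[1:]):
            w *= p[a][b]                   # p(x_{j-1}, dx_j)
        total += w
    return total

# P(X_0 = 0, X_1 in {0,1}, X_2 = 1): the middle coordinate is summed out
print(path_prob([{0}, {0, 1}, {1}]))       # 0.5*0.9*0.1 + 0.5*0.1*0.8
```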
His definition is then as follows:
Suppose that for each $n$, $Y_{n}:\Omega\rightarrow\mathbb{R}$ is measurable and $|Y_{n}|\leq M$ for all $n$. Then
$$E_{\mu}(Y_{N}\circ\theta_{N}\mid\mathcal{F}_{N})=E_{X_{N}}Y_{N}\quad\text{on }\{N<\infty\},$$
where $N$ is a stopping time and $\theta_{N}$ is the shift operator (it drops the first $N$ elements of the $\omega$-sequence).
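One consequence I can at least check numerically (again with a made-up two-state chain of my own): taking $N=\inf\{n: X_n=1\}$ and $Y=1_{\{X_1=1\}}$, the theorem says the chain restarts from $X_N=1$ at time $N$, so $X_{N+1}$ should have distribution $p(1,\cdot)$.

```python
# Monte Carlo sketch of the strong Markov property for a hypothetical
# two-state chain: at the stopping time N = inf{n : X_n = 1}, the
# post-N chain behaves like a fresh chain started from X_N = 1, so
# P(X_{N+1} = 1) should be p(1, 1). All numbers are illustrative.
import random

random.seed(0)
p = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}

def step(x):
    return 0 if random.random() < p[x][0] else 1

hits = 0
trials = 100_000
for _ in range(trials):
    x = 0                     # start at 0, i.e. mu = delta_0
    while x != 1:             # run until the stopping time N
        x = step(x)
    if step(x) == 1:          # one more step: this is X_{N+1}
        hits += 1

print(hits / trials)          # should be close to p(1, 1) = 0.8
```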
So I know I am being a bit imprecise here. I reckon I know what all the elements of the theorem are, but have trouble adding it all up. I hope someone bothers to help.
Thanks in advance, Henrik
Update v3.b:
I almost figured it out; I will update with my findings shortly (I hope someone cares).
So I still have some problems. Can someone help me with these notions: $P_{x}=P_{\delta_{x}}$, why $P_{\mu}(A)=\int P_{x}(A)\,\mu(dx)$, and what $E_{X_{N}}$ looks like explicitly?
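For the middle formula, here is the sanity check I tried on a finite state space (my own made-up two-state chain), where the integral becomes the mixture $\sum_x \mu(x)\,P_x(A)$:

```python
# Numeric check of P_mu(A) = sum_x mu(x) P_x(A) for a hypothetical
# two-state chain and the event A = {X_1 = 1}; here P_x is the law of
# the chain started from the point mass delta_x. Numbers are made up.
mu = {0: 0.3, 1: 0.7}
p = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}

def P_x(x):
    return p[x][1]            # P_x(X_1 = 1): one step from the point x

# Mixture over the starting state, i.e. the integral against mu(dx)
P_mu = sum(mu[x] * P_x(x) for x in mu)
print(P_mu)                   # 0.3*0.1 + 0.7*0.8
```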