
I am reading Norris's "Markov Chains" and would appreciate an explanation of the following bit.

After stating the Markov property, it is said that (on page 4)

In general, any event A determined by $X_0, ...,X_m$ may be written as a countable disjoint union of elementary events $A=\bigcup_{k=1}^{\infty} A_k$

What are these $A_k$'s and why are there infinitely many of them? How does this follow from the Markov property?

Thanks.

  • According to the answers below, you need to look up the definition of "elementary event". One hopes it occurs in Norris's "Markov Chains" before this point! (2011-10-09)

2 Answers


Norris considers denumerable Markov chains, hence $Y=(X_0,\ldots,X_m)$ takes values in a countable set $\mathcal Y_m$. Thus any event in the sigma-algebra generated by $Y$ can be written as $A=\bigcup_{y\in \mathcal Y(A)}[Y=y]$ for some subset $\mathcal Y(A)\subseteq\mathcal Y_m$, and this union is disjoint since the events $[Y=y]$ are pairwise disjoint. Each $[Y=y]$ is presumably what Norris calls an elementary event.

Thus, this has nothing to do with the Markov property but is a consequence of the fact that the random variables considered can take at most countably many values.
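The decomposition above can be illustrated by brute-force enumeration on a small example. The sketch below assumes a hypothetical finite state space $I=\{0,1,2\}$ and $m=2$, so $Y=(X_0,X_1,X_2)$; the non-elementary event $A=[X_0=0,\,X_2=2]$ splits into the disjoint elementary events $[Y=y]$ with $y\in\{0\}\times I\times\{2\}$ (these names are chosen for illustration, not from Norris):

```python
from itertools import product

# Hypothetical finite state space I; in Norris's setting I may be countably
# infinite, which is why the union of elementary events can be countable.
I = [0, 1, 2]

# All possible values y of Y = (X_0, X_1, X_2); each [Y = y] is an
# elementary event.
all_outcomes = list(product(I, repeat=3))

# The event A = [X_0 = 0, X_2 = 2] collects every y consistent with it,
# i.e. A is the disjoint union of [Y = y] over y in {0} x I x {2}.
A = [y for y in all_outcomes if y[0] == 0 and y[2] == 2]

print(A)  # [(0, 0, 2), (0, 1, 2), (0, 2, 2)]
```

With a countably infinite state space the same filtering would produce countably many elementary events, which is exactly the countable disjoint union in the quoted passage.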

  • Because not every event is elementary. Consider $m=2$ and $A=[X_0=i_0,X_2=i_2]$, for example. Then $A$ is not $[Y=y]$ for any single $y$, but $A$ is the union of the elementary events $[Y=y]$ over every $y$ in the set $\{i_0\}\times I\times\{i_2\}$. (2011-10-09)

My guess is that Norris is assuming a countable state space, and each elementary event is either empty or specifies the outcome for each of $X_0, \ldots, X_m$. There do not have to be infinitely many of them. And this has nothing at all to do with the Markov property.