
I am reading *Markov Chains and Mixing Times* and I do not understand the following passage:

A random mapping representation of a transition matrix $P$ on a state space $\Omega$ is a function $f:\Omega\times\Lambda \rightarrow \Omega$ along with a $\Lambda$-valued random variable $Z$, satisfying $$\textbf{P}\{f(x,Z)=y \}=P(x,y)$$ The reader should check that if $Z_1,Z_2,\dots$ is a sequence of independent random variables, each having the same distribution as $Z$, and $X_0$ has distribution $\mu$, then the sequence $(X_0,X_1,\dots)$ defined by $$X_n=f(X_{n-1}, Z_n)$$ is a Markov chain with transition matrix $P$ and initial distribution $\mu$.
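To make the definition concrete, here is a minimal sketch (not from the book) for a hypothetical two-state chain on $\Omega = \{0, 1\}$: take $\Lambda = [0,1)$, $Z$ uniform on $[0,1)$, and let $f(x,z)$ invert the CDF of row $x$ of $P$. Empirically, $\textbf{P}\{f(x,Z)=y\}$ should match $P(x,y)$.

```python
import random

# Hypothetical two-state chain with transition matrix
# P = [[0.7, 0.3], [0.4, 0.6]] on Omega = {0, 1}.
P = [[0.7, 0.3], [0.4, 0.6]]

def f(x, z):
    """Random mapping: send (current state x, uniform draw z) to the
    next state by inverting the CDF of row x of P."""
    return 0 if z < P[x][0] else 1

random.seed(0)
n = 200_000

# Estimate P{f(x, Z) = 0} for each starting state x.
freq = {}
for x in (0, 1):
    hits = sum(f(x, random.random()) == 0 for _ in range(n))
    freq[x] = hits / n
    print(x, freq[x])  # should be close to P[x][0]
```

Running the chain itself is then just iterating `x = f(x, random.random())`, one fresh independent draw per step.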

I am missing something here, because I could not prove what the author asks. I would like to see the calculation and to know where the independence of the random variables $Z_1,Z_2,\dots$ is used.

Thanks a lot!

EDIT: If $n=1$

$$\textbf{P}(X_1=y|X_0=x)=\textbf{P}(f(X_0,Z_1)=y|X_0=x)=\textbf{P}(f(x,Z_1)=y)=P(x,y)$$

and for $n=2$

$$\textbf{P}(X_2=y|X_1=x,X_0=x_0)=\textbf{P}(f(X_1,Z_2)=y|X_1=x,X_0=x_0)\overset{?}{=}\textbf{P}(f(x,Z_2)=y|X_0=x_0)\overset{?}{=}P(x,y)$$

For general $t$ I do the same: $$\textbf{P}(X_{t+1}=y|X_{t}=x,X_{t-1}=x_{t-1},\dots, X_0=x_0)=\textbf{P}(f(X_t,Z_{t+1})=y|X_{t}=x,X_{t-1}=x_{t-1},\dots,X_0=x_0)=\textbf{P}(f(x,Z_{t+1})=y|X_{t-1}=x_{t-1},\dots,X_0=x_0)\overset{?}{=}\textbf{P}(X_{t+1}=y|X_{t}=x)\overset{?}{=}P(x,y)$$

But I don't know if this is right, and if so, why and where the independence is used.

1 Answer


You need independence because $X_n$ depends on $Z_n$; unless $Z_n$ is independent of, say, $Z_{n-2}$, there is no guarantee that the distribution of $X_n$ conditional on $X_{n-1}$ does not also depend on $X_{n-2}$.

As for the proof it should literally just consist of unpacking the definition. Do you know the precise definition of a Markov chain?

UPDATE

\begin{align*} \mathbb{P}&(X_n = c_n \; | \; X_{n-1} = c_{n-1}, \ldots, X_0 = c_0) \\[3mm] &= \mathbb{P}(f(c_{n-1}, Z_n) = c_n \; | \; f(c_{n-2}, Z_{n-1}) = c_{n-1}, \ldots, f(c_0, Z_{1}) = c_1, X_0 = c_0) \\[3mm] &= \mathbb{P}(f(c_{n-1}, Z_n) = c_n) = P(c_{n-1},c_n) \end{align*}

where the next-to-last step comes from the independence of $Z_n$ from the other $Z_i$. (In fact we also need to require that each $Z_i$ is independent of $X_0$, which the question didn't explicitly mention.)
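One can also see the failure concretely when independence is dropped. The following sketch (my own illustration, not from the book) reuses a single uniform draw $Z$ for both steps of the hypothetical two-state chain $P = [[0.7, 0.3], [0.4, 0.6]]$: conditional on $X_0 = 0$ and $X_1 = 0$, the reused draw satisfies $Z < 0.7$, which forces $X_2 = 0$ with probability $1$ rather than $P(0,0) = 0.7$.

```python
import random

# Same hypothetical two-state chain, but with Z_1 = Z_2 = Z (fully
# dependent draws).  The resulting sequence is NOT a Markov chain
# with transition matrix P.
P = [[0.7, 0.3], [0.4, 0.6]]

def f(x, z):
    return 0 if z < P[x][0] else 1

random.seed(1)
n = 200_000
count_01 = 0   # paths with X_0 = 0, X_1 = 0
count_012 = 0  # paths with X_0 = 0, X_1 = 0, X_2 = 0
for _ in range(n):
    z = random.random()  # one draw reused for BOTH steps
    x1 = f(0, z)
    x2 = f(x1, z)
    if x1 == 0:
        count_01 += 1
        if x2 == 0:
            count_012 += 1

# X_1 = 0 forces z < 0.7, and then X_2 = f(0, z) = 0 as well, so the
# conditional frequency is exactly 1 instead of P(0, 0) = 0.7.
print(count_012 / count_01)  # prints 1.0
```

With independent draws per step (a fresh `random.random()` for each application of `f`), the same conditional frequency comes out near $0.7$, as the transition matrix requires.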

  • I do. I must prove that $\textbf{P}(X_{t+1}=y|X_{t}=x,X_{t-1}=x_{t-1},\dots,X_0=x_0)=\textbf{P}(X_{t+1}=y|X_{t}=x)=P(x,y)$. (2017-01-03)
  • But I am not sure what kind of calculations I can do. Can I post my try and you can tell me what is wrong? (2017-01-03)
  • @YTS: Yes. Edit it into your original question. (2017-01-03)
  • Done with the edit. (2017-01-03)
  • @YTS: See update. (2017-01-03)