
This is a part of an exercise in Durrett's probability book.

Consider the Markov chain on $\{1,2,\cdots,N\}$ with transition probabilities $p_{ij}=1/(i-1)$ when $j<i$ and $p_{ij}=0$ otherwise. Suppose that we start at state $k$. Let $I_j=1$ if the chain visits $j$, that is, if $X_n=j$ for some $n$, and $I_j=0$ otherwise. Then $I_1,I_2,\cdots,I_{k-1}$ are independent.

I don't find it obvious that $I_1,\cdots,I_{k-1}$ are independent. It is possible to prove the independence by computing $P\bigl(\bigcap_{j\in J}\{I_j=1\}\bigr)$ for every subset $J\subset\{1,\cdots,k-1\}$, but this computation is long and tedious. Since the independence is stated as if it were obvious in this exercise, I assume that there is an easier way.
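For concreteness, here is what the direct computation looks like in the smallest nontrivial case $k=4$ (then $I_1\equiv 1$, so only $I_2$ and $I_3$ matter). The chain can only move downwards, so one enumerates the decreasing paths from $4$ to $1$: $ \mathrm P(4\to1)=\tfrac13,\qquad \mathrm P(4\to2\to1)=\tfrac13\cdot1=\tfrac13,\qquad \mathrm P(4\to3\to1)=\tfrac13\cdot\tfrac12=\tfrac16,\qquad \mathrm P(4\to3\to2\to1)=\tfrac13\cdot\tfrac12\cdot1=\tfrac16. $ Hence $\mathrm P(I_2=1)=\tfrac13+\tfrac16=\tfrac12$, $\mathrm P(I_3=1)=\tfrac16+\tfrac16=\tfrac13$, and $\mathrm P(I_2=1,I_3=1)=\tfrac16=\mathrm P(I_2=1)\,\mathrm P(I_3=1)$, so the claim holds here; but for general $k$ one has to check such a product identity for each of the $2^{k-2}$ subsets of $\{2,\cdots,k-1\}$, which is the tedious part.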

2 Answers

Let $A_k$ denote the set of $\mathfrak a=(a_i)_{1\leqslant i\leqslant k}$ such that $a_1=a_k=1$ and $a_i$ is in $\{0,1\}$ for every $2\leqslant i\leqslant k-1$ (note that $I_1=I_k=1$ almost surely, since the chain starts at $k$ and eventually reaches $1$). For every $\mathfrak a$ in $A_k$, let $U(\mathfrak a)=\{2\leqslant i\leqslant k\mid a_i=1\}$. The chain visits exactly the states indexed by $U(\mathfrak a)\cup\{1\}$ precisely when its path runs through these states in decreasing order, and the jump out of each visited state $u\geqslant2$ has probability $1/(u-1)$, hence $ \mathrm P((I_i)_{1\leqslant i\leqslant k}=\mathfrak a)=\prod_{u\in U(\mathfrak a)}\frac1{u-1}=\prod_{i=2}^k\frac1{(i-1)^{a_i}}. $ The product form of the RHS, with one factor per coordinate, ensures that the random variables $(I_i)_{1\leqslant i\leqslant k}$ are independent.
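As a quick sanity check, these probabilities sum to $1$ over $A_k$: the forced coordinate $a_k=1$ contributes $\frac1{k-1}$ and each free coordinate $a_j$, $2\leqslant j\leqslant k-1$, contributes $1+\frac1{j-1}=\frac j{j-1}$, so $ \sum_{\mathfrak a\in A_k}\prod_{i=2}^k\frac1{(i-1)^{a_i}}=\frac1{k-1}\prod_{j=2}^{k-1}\frac j{j-1}=\frac1{k-1}\cdot(k-1)=1 $ by telescoping; the same telescoping yields the marginals below.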

Furthermore, for every $2\leqslant i\leqslant k-1$ (recall that $I_1=1$ almost surely), summing the RHS over every $\mathfrak a=(a_j)_{1\leqslant j\leqslant k}$ in $A_k$ such that $a_i=\alpha$ with $\alpha$ in $\{0,1\}$ shows that $ \mathrm P(I_i=\alpha)=\frac1{k-1}\,\frac1{(i-1)^{\alpha}}\prod_{2\leqslant j\leqslant k-1,\ j\ne i}\left(1+\frac1{j-1}\right)=\frac1{(i-1)^{\alpha}}\,\frac{i-1}i, $ hence $\mathrm P(I_i=1)=\dfrac1i$ and $\mathrm P(I_i=0)=\dfrac{i-1}i$.
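With $k=4$, for instance, this gives $\mathrm P(I_2=1)=\frac12$ and $\mathrm P(I_3=1)=\frac13$, and the product $\mathrm P(I_2=1)\,\mathrm P(I_3=1)=\frac16$ is exactly the probability of the single path $4\to3\to2\to1$, in agreement with the direct enumeration in the question.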

For any $j$, observe that $X_{3}\mid X_{2}=j-1,X_{1}=j$ has the same distribution as $X_{2}\mid X_{2} \neq j-1, X_{1}=j$. Since the chain is strictly decreasing, when started from $j$ it can visit $j-1$ only at time $2$, so $X_{2}=j-1$ iff $I_{j-1}=1$; by Markovianity, conclude that $I_{j-1}$ is independent of $(I_{j-2},\ldots,I_{1})$ given that $X_{1}=j$.
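Indeed, for $j\geqslant3$ and every $1\leqslant m\leqslant j-2$, $ \mathrm P(X_3=m\mid X_2=j-1,X_1=j)=\frac1{j-2} \qquad\text{and}\qquad \mathrm P(X_2=m\mid X_2\neq j-1,X_1=j)=\frac{1/(j-1)}{(j-2)/(j-1)}=\frac1{j-2}, $ so whether or not the chain passes through $j-1$, the first state it reaches below $j-1$ is uniform on $\{1,\ldots,j-2\}$; by the Markov property the rest of the trajectory, hence $(I_{j-2},\ldots,I_1)$, has the same conditional law on $\{I_{j-1}=1\}$ as on $\{I_{j-1}=0\}$.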

Let's prove by (downward) induction on $j$ that $I_{j-1}$ is independent of $(I_{j-2},\ldots,I_{1})$ given that $X_{1}=k$.

I) $j=k$ follows straight from the first paragraph.

II) Now assume that $I_{a-1}$ is independent of $(I_{a-2},\ldots,I_{1})$ for all $a \geq j+1$. Peeling off one index at a time in the joint distribution, this implies that $(I_{k-1},\ldots,I_{j})$ is independent of $(I_{j-1},\ldots,I_{1})$. Hence, in order to prove that $I_{j-1}$ is independent of $(I_{j-2},\ldots,I_{1})$ we can condition on $(I_{k-1}=1,\ldots,I_{j}=1)$. Since the chain is strictly decreasing, this is the same as conditioning on $(X_{2}=k-1,\ldots,X_{k-j+1}=j)$. By Markovianity and temporal homogeneity, $(X_{k-j+2}^{\infty}\mid X_{k-j+1}=j,\ldots,X_{1}=k)$ is identically distributed to $(X_{2}^{\infty}\mid X_{1}=j)$. Using the first paragraph, we know that $I_{j-1}$ is independent of $(I_{j-2},\ldots,I_{1})$ given that $X_{1}=j$. Hence, by the equality of distributions, $I_{j-1}$ is independent of $(I_{j-2},\ldots,I_{1})$ given that $X_{1}=k$.
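For instance, with $k=5$ and $j=4$: the base case gives that $I_4$ is independent of $(I_3,I_2,I_1)$, so the law of $(I_3,I_2,I_1)$ conditioned on $\{I_4=1\}=\{X_2=4\}$ equals its unconditional law; on that event the chain $(X_2,X_3,\ldots)$ is a copy of the chain started at $4$, so the first paragraph (applied with $j=4$) gives that $I_3$ is independent of $(I_2,I_1)$ conditionally on the event, and therefore also unconditionally. Iterating this argument down in $j$ shows that $I_1,\ldots,I_{k-1}$ are mutually independent.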

  • Yes, this version of the induction is better than your previous one. (2012-06-30)