In a Markov chain, we have four states: $0, 1, 2, 3$. The transition matrix is $$ \begin{bmatrix} 1/3 & 2/3 & 0 & 0 \\ 1/3 & 0 & 2/3 & 0 \\ 0 & 1/3 & 0 & 2/3\\ 0& 0 & 0 & 1 \end{bmatrix}$$ The initial state is $1$, and we need to find the probability that $0$ is visited at least twice before reaching $3$ (which is "game over": an absorbing state).
My solution is the following: define $P_{iv}$ as the probability of visiting $0$ at least $v$ more times before reaching $3$, given that the current state is $i$ (not counting the current position if it happens to be $0$).
Then the law of total probability gives: $$ {\left\{\begin{matrix} P_{12}=\frac{2}{3}P_{22}+\frac{1}{3}P_{01}\\ P_{22}=\frac{2}{3}\cdot 0+\frac{1}{3}P_{12}\\ P_{01}=\frac{2}{3}P_{11}+\frac{1}{3}\cdot 1\\ P_{11}=\frac{2}{3}P_{21}+\frac{1}{3}\cdot 1\\ P_{21}=\frac{2}{3}\cdot 0+\frac{1}{3}P_{11}\\ \end{matrix}\right.} $$ which has the solution $$ {\left\{\begin{matrix} P_{12}=13/49\\ P_{22}=13/147\\ P_{01}=13/21\\ P_{11}=3/7\\ P_{21}=1/7\\ \end{matrix}\right.} $$ so the answer seems to be $13/49$. However, the attached solution (which has never been wrong before) says that the answer is $95/243$. So I have two questions:
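To rule out an arithmetic slip, here is the same system solved by substitution with exact rational arithmetic (just a transcription of my equations above, not an independent derivation):

```python
from fractions import Fraction

F = Fraction

# Unknowns: P12, P22, P01, P11, P21, as defined in the post.
# P11 = 2/3 * P21 + 1/3  and  P21 = 1/3 * P11
# => P11 = 2/9 * P11 + 1/3
P11 = F(1, 3) / (1 - F(2, 9))
P21 = F(1, 3) * P11
P01 = F(2, 3) * P11 + F(1, 3)
# P12 = 2/3 * P22 + 1/3 * P01  and  P22 = 1/3 * P12
# => P12 = 2/9 * P12 + 1/3 * P01
P12 = (F(1, 3) * P01) / (1 - F(2, 9))
P22 = F(1, 3) * P12

print(P11, P21, P01, P22, P12)  # 3/7 1/7 13/21 13/147 13/49
```

This reproduces exactly the fractions listed above, so at least the algebra is consistent with my equations.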
- Could someone please verify my solution?
- Does anyone know a shorter solution?
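In case it helps, here is a quick Monte Carlo sanity check of the chain as I understand it (in particular, I assume a self-loop $0 \to 0$ counts as another visit to $0$, as in my equations; if the intended counting is different, that could explain the discrepancy):

```python
import random

# Transition matrix from the problem statement.
P = [
    [1/3, 2/3, 0.0, 0.0],
    [1/3, 0.0, 2/3, 0.0],
    [0.0, 1/3, 0.0, 2/3],
    [0.0, 0.0, 0.0, 1.0],
]

def visits_zero_twice(rng):
    """Run the chain from state 1 until absorption at 3; report
    whether state 0 was entered at least twice along the way."""
    state, zero_visits = 1, 0
    while state != 3:
        state = rng.choices(range(4), weights=P[state])[0]
        if state == 0:
            zero_visits += 1
            if zero_visits >= 2:
                return True
    return False

rng = random.Random(0)
n = 200_000
est = sum(visits_zero_twice(rng) for _ in range(n)) / n
print(est)                # simulation estimate
print(13 / 49, 95 / 243)  # the two candidate answers, for comparison
```

With $n = 200{,}000$ runs the standard error is about $0.001$, so the estimate should clearly distinguish $13/49 \approx 0.265$ from $95/243 \approx 0.391$.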
Thanks!