I'm taking a course on Markov chains, and here's a snippet from the lecture notes:
Let $(X_i,\, i \ge 0)$ be a time-homogeneous Markov chain with state space $V$, and let $\lambda$ be the initial distribution, i.e. $P_\lambda(X_0 = v) = \lambda(v)$. Let $T$ be a stopping time, and let $p_{w,v}$ be the transition probability from state $w$ to state $v$.
Then for each $v \in V$ and $i \ge 1$, we have
\begin{align} & P_\lambda(X_i = v, T \ge i) \\ = & \sum_{w \in V} P_\lambda(X_{i-1}=w, T \ge i-1)\times P_\lambda(T>i-1|X_{i-1} = w, T \ge i-1)\times p_{w,v} \end{align}
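To convince myself the stated identity isn't a typo, I also ran a quick Monte Carlo sanity check on a toy chain of my own (3 states, $T$ = first hitting time of a fixed state; the chain, the initial distribution, and the stopping time are all my own choices, not from the notes), and the two sides do agree numerically:

```python
import numpy as np

# Toy 3-state chain and initial distribution (my own example, not from the notes).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
lam = np.array([0.2, 0.5, 0.3])
rng = np.random.default_rng(0)

def sample_path(n_steps):
    """Sample X_0, ..., X_{n_steps}; T = first hitting time of state 2 (a stopping time)."""
    path = [rng.choice(3, p=lam)]
    for _ in range(n_steps):
        path.append(rng.choice(3, p=P[path[-1]]))
    hits = [t for t, s in enumerate(path) if s == 2]
    T = hits[0] if hits else n_steps + 1   # never hit within the horizon => T > n_steps
    return path, T

i, v = 3, 1                 # check the identity at time i for target state v
n_sims = 200_000
lhs = rhs = 0.0
for _ in range(n_sims):
    path, T = sample_path(i)
    # LHS: P_lambda(X_i = v, T >= i)
    lhs += (path[i] == v) and (T >= i)
    # RHS: sum_w P(X_{i-1}=w, T>=i-1) * P(T>i-1 | X_{i-1}=w, T>=i-1) * p_{w,v}
    #    = sum_w P(X_{i-1}=w, T>=i) * p_{w,v}, estimated path by path
    if T >= i:
        rhs += P[path[i - 1], v]

print(lhs / n_sims, rhs / n_sims)   # the two estimates agree up to Monte Carlo noise
```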
From the context, I can see there's no typo in the equation, but when I try to derive it myself, I only get as far as
\begin{align} & \sum_{w \in V} P_\lambda(X_{i-1}=w, T \ge i-1)\times P_\lambda(T>i-1|X_{i-1} = w, T \ge i-1)\times p_{w,v} \\ = & \sum_{w \in V} P_\lambda(T>i-1, X_{i-1} =w, T \ge i-1) \times p_{w,v} \\ = & \sum_{w \in V} P_\lambda(T \ge i, X_{i-1} =w) \times p_{w,v} \\ = & \sum_{w \in V} P_\lambda(T \ge i, X_{i-1} =w) \times P(X_1=v|X_0=w) \end{align}
But I can't get any further. My guess is that, since $T$ is a stopping time, we can use the strong Markov property to get
$ P_\lambda(X_i =v | T \ge i, X_{i-1}=w) = P(X_1=v|X_0=w). $
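The reason I think this is the missing piece: if that conditional identity holds, then combining it with the last display of my derivation above would give
\begin{align}
\sum_{w \in V} P_\lambda(T \ge i,\, X_{i-1}=w)\, P(X_1=v \mid X_0=w)
& = \sum_{w \in V} P_\lambda(T \ge i,\, X_{i-1}=w)\, P_\lambda(X_i=v \mid T \ge i,\, X_{i-1}=w) \\
& = \sum_{w \in V} P_\lambda(X_i=v,\, T \ge i,\, X_{i-1}=w) \\
& = P_\lambda(X_i = v,\, T \ge i),
\end{align}
which is exactly the left-hand side in the notes.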
Is this correct?
BTW, the original equation comes from a stopping rule called the "filling rule".