Let $(X_n, n \ge 0)$ be a Markov chain with state space $V$. Let $\lambda$ and $\tau$ be two probability distributions on $V$. Can we say that for any $\lambda$ and $\tau$ there is always a stopping rule $T$ such that, for all $v \in V$,
$ P_\lambda(X_T = v) = \tau(v). $
I guess we can: pick $z \in V$ according to $\tau$, independently of the chain, and let $T$ be the hitting time of $z$. Then $T$ is a (randomized) stopping rule, and $X_T = z$ by construction, so $P_\lambda(X_T = v) = \tau(v)$ for all $v$, at least when the chain hits every state almost surely (e.g. if it is irreducible and recurrent). Since this construction works for every pair of distributions, such a stopping rule always exists.
Is this the correct way to prove the existence of a stopping rule carrying one distribution to another?
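As a sanity check, the construction can be simulated. The sketch below uses a hypothetical irreducible chain on three states (the matrix $P$, $\lambda$, and $\tau$ are made up for illustration): draw $z \sim \tau$ independently, run the chain from $X_0 \sim \lambda$ until it hits $z$, and record $X_T$. The empirical distribution of $X_T$ should approximate $\tau$.

```python
import random
from collections import Counter

random.seed(0)

# Hypothetical irreducible transition matrix on V = {0, 1, 2}.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.5, 0.3],
     [0.3, 0.2, 0.5]]
lam = [1.0, 0.0, 0.0]   # initial distribution lambda (start in state 0)
tau = [0.2, 0.5, 0.3]   # target distribution tau

def sample(dist):
    """Draw an index according to the probability vector dist."""
    return random.choices(range(len(dist)), weights=dist)[0]

def stopped_state():
    z = sample(tau)      # extra randomness: makes T a randomized stopping rule
    x = sample(lam)      # X_0 ~ lambda
    while x != z:        # run until the hitting time T of z
        x = sample(P[x])
    return x             # X_T = z by construction

n = 100_000
counts = Counter(stopped_state() for _ in range(n))
empirical = [counts[v] / n for v in range(3)]
print(empirical)  # should be close to tau
```

Note that $T$ here depends on the auxiliary draw $z$, not only on the path of the chain, which is why it is a randomized stopping rule rather than a stopping time in the strict filtration sense.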