
Can you please help me with a proof of the following question:

Prove that in a finite-state Markov chain, a state $i$ is transient if and only if there exists a state $k$ such that $i \rightarrow k$ but $k \nrightarrow i$.

Give a counterexample for the case of an infinite chain.

Thanks!

  • 1
    Try showing that within a communicating class, all states are of the same type: either all are transient or all are recurrent. Thus, if there is a transient state $i$ which communicates with every other state, then all the other states are transient as well. But then there must be a "sink" somewhere, because not every state can be transient in a finite Markov chain! Put differently: at least one state must be recurrent in a finite Markov chain. That last statement becomes false in the case of infinite chains (why!?) – 2012-12-22
  • 0
    Try showing what you did; the current version of the question contains no personal input of your own. – 2012-12-28
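The criterion in the question can be checked numerically on a small finite chain. The sketch below (my own hypothetical 3-state example, not from the thread) computes reachability via powers of $I + P$ and lists the states $i$ for which some $k$ satisfies $i \rightarrow k$ but $k \nrightarrow i$; for a finite chain these are exactly the transient states, and the absorbing "sink" state is the recurrent one.

```python
import numpy as np

# Hypothetical 3-state chain: states 0 and 1 leak into the
# absorbing "sink" state 2, so 0 and 1 should be transient.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.4, 0.4, 0.2],
    [0.0, 0.0, 1.0],
])

n = P.shape[0]

# Reachability: i -> k iff the (i, k) entry of (I + A)^(n-1) is
# positive, where A is the adjacency matrix of positive transitions.
A = (P > 0).astype(float)
R = np.linalg.matrix_power(np.eye(n) + A, n - 1) > 0

# Criterion from the question: i is transient iff there is some k
# reachable from i with i not reachable back from k.
transient = [i for i in range(n)
             if any(R[i, k] and not R[k, i] for k in range(n))]
print(transient)  # states 0 and 1 escape into the sink; 2 is recurrent
```

For the infinite counterexample, note that this reachability argument breaks down: in the simple random walk on $\mathbb{Z}$ with $p \neq 1/2$, every state communicates with every other state (so no such $k$ exists for any $i$), yet every state is transient.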

0 Answers