
Possible Duplicate:
What is the difference between all types of Markov Chains?

I've read a lot about Markov processes and chains, but I still don't understand the difference between the two. Is there any difference?

  • 0
    I know some people take them to be the same thing; however, according to Wikipedia, a Markov chain is traditionally taken to mean a discrete Markov process. (2012-11-08)
  • 1
    I agree. I've only encountered the term Markov chain when both the time-parameter space and the state space are countable. (2012-11-08)
  • 0
    This was asked recently on the site. (2012-11-08)
  • 0
    I raised a similar point on [meta](http://meta.math.stackexchange.com/questions/3902/tags-markov-process-and-markov-chains) some time ago. (2012-12-04)
  • 2
    Duplicate: http://math.stackexchange.com/q/22982 (2012-12-05)
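For concreteness, the discrete case that the comments describe (countable time steps, countable state space) can be sketched as a short simulation. This is only an illustrative sketch; the two-state "weather" chain, the transition matrix `P`, and the function name `simulate_chain` are assumptions made for the example, not anything from the question itself.

```python
import random

def simulate_chain(P, states, start, steps, rng=None):
    """Simulate a discrete-time Markov chain on a finite state space.

    P[i][j] is the probability of moving from states[i] to states[j];
    the next state depends only on the current one (the Markov property).
    """
    rng = rng or random.Random()
    i = states.index(start)
    path = [start]
    for _ in range(steps):
        # Sample the next state from row i of the transition matrix.
        i = rng.choices(range(len(states)), weights=P[i])[0]
        path.append(states[i])
    return path

# Hypothetical two-state chain: both time and state space are discrete.
P = [[0.9, 0.1],   # sunny -> sunny, sunny -> rainy
     [0.5, 0.5]]   # rainy -> sunny, rainy -> rainy
path = simulate_chain(P, ["sunny", "rainy"], "sunny", 10, random.Random(0))
```

A continuous-time or continuous-state Markov process would replace the integer step loop with random holding times or a transition kernel, which is exactly the distinction the comments are drawing.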

0 Answers