
I understand that a Markov chain involves a system that can be in one of a finite number of discrete states, with a probability of moving from each state to each other state, and of emitting a signal.

Thus, an $N \times N$ transition matrix and an $N \times M$ emission matrix of real numbers adequately describe a Markov chain with $N$ states and $M$ emissions.

Is it possible to have a Markov chain with an infinite number of states? For example, if $N=2$ is an LED that can glow blue or red, $N=\infty$ would be an LED that can glow any mixture of blue and red.

Can't an infinitely-large matrix be represented by a function of two variables (the two indices)?
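Along those lines, here is a minimal sketch (a hypothetical example, not from the question itself) of a Markov chain on the countably infinite state space $\mathbb{Z}$, where the "matrix" is replaced by a transition function of the two indices:

```python
import random

def p(i, j):
    """Transition 'matrix' as a function of two indices: the probability
    of moving from state i to state j for a symmetric random walk on the
    integers. Each step goes up or down by 1 with probability 1/2."""
    return 0.5 if abs(i - j) == 1 else 0.0

def step(i):
    """Sample the next state given the current state i, consistent with p."""
    return i + random.choice([-1, 1])

# Simulate a short trajectory starting from state 0.
state = 0
path = [state]
for _ in range(10):
    state = step(state)
    path.append(state)
print(path)
```

Here no finite matrix is ever stored; the function `p` plays the role of an infinitely large transition matrix, and sampling only ever evaluates it locally.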

  • Is the set of states countably infinite, or uncountable? (2012-09-12)

2 Answers