
When will a probabilistic process obtained by an "abstraction" from a deterministic discrete process satisfy the Markov property?

Example #1) Suppose we have some recurrence, e.g., $a_t=a_{t-1}^2$, $t>0$. This is a deterministic process. However, if we make an "abstraction" by considering only one particular digit of each $a_t$, we obtain a probabilistic process. Does it satisfy the Markov property or not?
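Example #1 can be probed concretely. The following is a minimal sketch (the helper names are mine, not from the question): it checks whether a given decimal digit of $a_t$ is a deterministic function of the same digit of $a_{t-1}$ under $a \mapsto a^2$. Being such a function is a simple sufficient condition for the abstracted process to be Markov; its failure does not by itself rule out Markovianity.

```python
def last_digit_map():
    """Transition map for the last decimal digit under a -> a^2:
    the last digit of a square depends only on the last digit of a."""
    return {d: (d * d) % 10 for d in range(10)}

def check_digit_is_markov(digit_pos):
    """Check whether the digit at position `digit_pos` (0 = last) of a^2
    is determined by the same digit of a, i.e., whether the induced map
    on that digit is a well-defined function over sampled values of a."""
    seen = {}
    for a in range(10 ** (digit_pos + 3)):  # sample enough residues
        d_in = (a // 10 ** digit_pos) % 10
        d_out = (a * a // 10 ** digit_pos) % 10
        if d_in in seen and seen[d_in] != d_out:
            return False
        seen[d_in] = d_out
    return True
```

For the last digit the map is well-defined (e.g. any $a$ ending in 3 squares to something ending in 9), so that abstraction is a degenerate Markov chain; for the tens digit it is not, which is exactly the situation the question asks about.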

Example #2) Suppose we have a finite state automaton. Now we make an "abstraction" by grouping the states into sets of states, obtaining a probabilistic finite state automaton. We run this automaton over time and ask whether it satisfies the Markov property or not.

The particular examples are not important; of interest are general conditions under which a deterministic process becomes a Markov process after an "abstraction" of the kind above (in any context). I'm looking for any references on this matter.


Edit: as pointed out in the comments below, examples #1 and #2 were not well specified. Now there is a distribution on the starting state $a_0$, and $a_t=f(a_{t-1})$ is a deterministic function. Then $a_t$, $t\geq 0$, is a degenerate Markov chain. The question is whether grouping some of the states of such a chain can yield a Markov chain (pointers to literature where the conditions are discussed would be appreciated).

A more general problem seems to be: "Given any Markov chain (i.e., not necessarily a degenerate one), group the states into sets: under what conditions does the resulting process satisfy the Markov property?"
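This more general problem is the classical question of *lumpability*. A sketch of the strong lumpability test (in the sense of Kemeny and Snell): a partition yields a Markov chain for every initial distribution iff, within each block, all states have the same total transition probability into each block. The function name and the example matrix below are illustrative assumptions, not from the question.

```python
import numpy as np

def is_strongly_lumpable(P, partition):
    """Strong lumpability test for a row-stochastic matrix P.
    `partition` is a list of blocks (lists of state indices).
    Returns True iff, in every block, all member states have equal
    aggregated transition probability into every block."""
    for block in partition:
        for target in partition:
            probs = [P[i, target].sum() for i in block]
            if not np.allclose(probs, probs[0]):
                return False
    return True

# Illustrative 3-state chain: states 0 and 2 behave identically,
# so merging them preserves the Markov property.
P = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 1.0, 0.0]])
print(is_strongly_lumpable(P, [[0, 2], [1]]))  # True
print(is_strongly_lumpable(P, [[0, 1], [2]]))  # False
```

Weak lumpability (the lumped process is Markov only for some initial distributions) is more delicate and is where the degenerate-chain case of the question naturally lives.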

  • It depends on the way of abstraction, of course; sometimes the abstraction is chosen precisely to yield a Markov process. Unfortunately I cannot give an extended answer now; I'll try to do it tomorrow. (2011-10-25)

3 Answers