
This relates to an unanswered question I posted a few days ago:

Let $\{ X_t : t = 1, 2, 3, \dots \}$ follow a 2-state Markov chain with transition matrix $P$. Does the Markov property mean I can break the following expectations up as follows?

$E_t \Big [ X_{t+1}\ X_{t+2}\Big ] = E_t [ X_{t+1}]\ E_t[ X_{t+2}]$

$E_t \Big [ X_{t+1}\ X_{t+2}\ X_{t+3} \Big ] = E_t [ X_{t+1}]\ E_t[X_{t+2}]\ E_t[X_{t+3} ]$

Intuitively, the answer is 'no', since the conditional expectations are path dependent?

Thanks, Paul

  • $P[X, Y]=P[X]\,P[Y]$ means the random variables are independent. Independent RVs are a very special case of a Markov chain, and independence certainly does not follow from the weak Markov property. (2012-10-13)

1 Answer


No. For instance, the next value could be $+1$ or $-1$ with equal probability, and after that a transition to the other of those two values might be certain; then the individual expectations would be $0$, but the expectation of the product would be $-1$, since the next two values are perfectly anti-correlated.
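
A quick numerical check of this counterexample, as a sketch (the three-state encoding with an auxiliary "current" state of value $0$ is my own assumption, used only to realize the answer's transition probabilities with a single matrix):

```python
import numpy as np

# Hypothetical encoding of the answer's example: an auxiliary "current" state
# of value 0 jumps to +1 or -1 with equal probability; from +1 the chain flips
# to -1 for sure, and from -1 it flips to +1 for sure.
values = np.array([0.0, 1.0, -1.0])        # state values: current, +1, -1
P = np.array([
    [0.0, 0.5, 0.5],   # from the current state: +1 or -1 with equal probability
    [0.0, 0.0, 1.0],   # from +1: certain flip to -1
    [0.0, 1.0, 0.0],   # from -1: certain flip to +1
])
start = 0                                   # condition on X_t = "current"

dist1 = P[start]        # conditional distribution of X_{t+1}
dist2 = P[start] @ P    # conditional distribution of X_{t+2}

E1 = dist1 @ values     # E_t[X_{t+1}] = 0
E2 = dist2 @ values     # E_t[X_{t+2}] = 0

# E_t[X_{t+1} X_{t+2}]: sum over the pair of next states.
E12 = sum(
    dist1[i] * P[i, j] * values[i] * values[j]
    for i in range(3)
    for j in range(3)
)

print(E1 * E2, E12)     # 0.0 vs. -1.0: the product does not factor
```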

  • Right, so even if one starts with Markovian dynamics for the state, to compute expectations like $$ E_t \Big [ X_{t+1}\ X_{t+2}\Big ] $$ one would need to include the covariance term: $$ E_t \Big [ X_{t+1}\ X_{t+2}\Big ] = E_t [X_{t+1}]\, E_t[X_{t+2}] + \operatorname{Cov}_t [X_{t+1} , X_{t+2}], $$ since the problem is now path dependent. This covariance is easy to handle, but with larger products, e.g. $E_t[X_{t+1}\, X_{t+2}\, X_{t+3}]$, it becomes quite messy (see the sketch after these comments). Is there an elegant solution? (2012-10-14)
  • Thanks for your comment anyway. (2012-10-14)
  • @Luap: I don't know. And you're welcome. (2012-10-14)
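
As a rough illustration of the covariance decomposition and the brute-force path sum discussed in the comments above (the transition matrix `P`, the state values, and the conditioning state below are hypothetical choices of mine, not from the thread):

```python
import numpy as np
from itertools import product

# Sketch only: a generic two-state chain with state values +1 and -1.
values = np.array([1.0, -1.0])
P = np.array([
    [0.3, 0.7],   # transitions from value +1
    [0.9, 0.1],   # transitions from value -1
])
start = 0         # condition on X_t = +1

def expect_product(P, values, start, steps):
    """E_t[X_{t+1} * ... * X_{t+steps}], by summing over all length-`steps` paths."""
    total = 0.0
    for path in product(range(len(values)), repeat=steps):
        prob, prod, prev = 1.0, 1.0, start
        for s in path:
            prob *= P[prev, s]   # probability of this transition
            prod *= values[s]    # running product of values along the path
            prev = s
        total += prob * prod
    return total

E1  = P[start] @ values            # E_t[X_{t+1}]
E2  = (P[start] @ P) @ values      # E_t[X_{t+2}]
E12 = expect_product(P, values, start, 2)

# The covariance term from the comment is generally nonzero, which is exactly
# why the naive factorization E_t[X_{t+1} X_{t+2}] = E_t[X_{t+1}] E_t[X_{t+2}] fails.
print(E12, E1 * E2, E12 - E1 * E2)          # e.g. -0.68, -0.176, -0.504

# Longer products are still computable by the same path sum, just with more
# terms -- the "messiness" the comment mentions.
print(expect_product(P, values, start, 3))  # E_t[X_{t+1} X_{t+2} X_{t+3}]
```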