I'm taking this example from a paper I'm reading. I'm having trouble understanding the logic, and I'm hoping someone can help me.
Let $ v_{t}=a \ x_{t}+u_{t} $
where $a$ is some constant and $u$ is a mean zero random variable such that $E[u_{t}x_{t}]=0$ for all $t$.
If $E[x_{t-1} \, x_{t}]=0$ and $E[u_{t} \, x_{t-1}]=0$, then $E[x_{t-1} \, v_{t}] = 0$. This part I can see clearly.
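(Writing it out: $E[x_{t-1}\, v_{t}] = E[x_{t-1}(a\, x_{t}+u_{t})] = a\, E[x_{t-1}\, x_{t}] + E[x_{t-1}\, u_{t}] = 0$.)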
Now suppose $x_t = p\, x_{t-1} + e_t$, where $e$ is a mean zero white noise term and $|p|<1$.
Now we have $E[x_{t-1}\, v_{t}] = a\, E[x_{t-1}\, x_{t}] = a\, p\, {\rm Var}(x_t) \neq 0$, even though $E[u_{t}\, x_{t-1}]=0$. (Still fine so far.)
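As a sanity check on that moment, here is a quick simulation. The parameter values and unit noise variances are arbitrary choices of mine (the paper doesn't specify any); the point is only that the sample analogue of $E[x_{t-1}\, v_{t}]$ matches $a\, p\, {\rm Var}(x_t)$ and is clearly away from zero:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative values -- not from the paper
a, p, T = 0.5, 0.8, 500_000

e = rng.standard_normal(T)   # white noise driving the AR(1)
u = rng.standard_normal(T)   # mean zero, independent of x, so E[u_t x_t] = 0

x = np.zeros(T)
for t in range(1, T):
    x[t] = p * x[t - 1] + e[t]   # x_t = p x_{t-1} + e_t

v = a * x + u                    # v_t = a x_t + u_t

# Sample analogue of E[x_{t-1} v_t] versus the claimed value a p Var(x_t)
print(np.mean(x[:-1] * v[1:]))
print(a * p * np.var(x))
```

With these values both numbers come out near $a\, p/(1-p^2) \approx 1.1$, so the unconditional moment really is nonzero even though $u$ is orthogonal to $x$.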
But then the paper goes on to say that if $E[x_{t-1}\, u_{t} \mid x_t]=0$, then $E[x_{t-1}\, v_{t} \mid x_t] = 0$, and I do not see where that comes from; it seems we still have the same problem as before:
$E[x_{t-1}\, v_{t} \mid x_t] = E[x_{t-1} (a \, x_{t} + u_{t}) \mid x_t] = a \, x_t\, E[x_{t-1} \mid x_t] + E[x_{t-1}\, u_{t} \mid x_t] = a \, x_t\, E[x_{t-1} \mid x_t] \neq 0$ in general, since $x_{t-1}$ and $x_t$ are correlated under the AR(1).
Am I missing something?