
Consider the time series defined by $$Y_t = \phi Y_{t-1}+ \epsilon_t + \theta \epsilon_{t-1}$$

Why is $E(\epsilon_{t} Y_{t}) = \sigma_{\epsilon}^{2}$?

  • You might get better results at stats.stackexchange.com. I'm not sure what the variables here are and therefore can't try to help. (2012-07-03)
  • This is indeed more appropriate for stats.stackexchange.com or dsp.stackexchange.com. Actually, this is an ARMA(1,1) process. (2012-07-03)

2 Answers


Simply work it out. By assumption, the white noise term $\varepsilon_t$ satisfies the following:

  1. $E[\varepsilon_t]=0$

  2. $E[\varepsilon^2_t]=\sigma_\varepsilon^2$

  3. $E[\varepsilon_t\varepsilon_s]=0$, for $t\neq s$

Now multiply $Y_t$ by $\varepsilon_t$ and use linearity of expectation. The white noise terms at different times are uncorrelated, so those cross expectations vanish. Explicitly, write out $Y_t$ as a geometric series by recursively using the equation for $Y_t$, or argue by induction. The point is that $\varepsilon_t$ and $Y_s$ for $s<t$ are uncorrelated, so only the $E[\varepsilon_t^2]=\sigma_\varepsilon^2$ term survives.
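
As a concrete sketch of that expansion (assuming $|\phi|<1$ so the recursion can be iterated indefinitely, which the question does not state):

$$Y_t = \sum_{j=0}^{\infty}\phi^{j}\left(\varepsilon_{t-j}+\theta\varepsilon_{t-j-1}\right),$$

so

$$E[\varepsilon_t Y_t] = \sum_{j=0}^{\infty}\phi^{j}\left(E[\varepsilon_t\varepsilon_{t-j}]+\theta\, E[\varepsilon_t\varepsilon_{t-j-1}]\right) = E[\varepsilon_t^2] = \sigma_\varepsilon^2,$$

since every other term involves $E[\varepsilon_t\varepsilon_s]$ with $s<t$ and therefore vanishes by condition 3.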


Use what Sam mentioned, but it is not necessary to write $Y_t$ out as a geometric series. Substitute $\phi Y_{t-1}+\epsilon_t+\theta\epsilon_{t-1}$ for $Y_t$, multiply by $\epsilon_t$ to get $\phi\epsilon_t Y_{t-1}+\epsilon_t^2+\theta\epsilon_{t-1}\epsilon_t$, and take expectations.

You get $\phi\, E(\epsilon_t Y_{t-1}) + E(\epsilon_t^2) + \theta\, E(\epsilon_{t-1}\epsilon_t)$.

Now, from condition 3 given by Sam, $E(\epsilon_{t-1}\epsilon_t)=0$, and since $\epsilon_t$ is independent of $Y_{t-1}$, $E(\epsilon_t Y_{t-1})=0$. So you are left only with $E(\epsilon_t^2)$, which you know is $\sigma_\epsilon^2$.
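
For a quick numerical sanity check (not part of either answer), the following sketch simulates the ARMA(1,1) recursion with illustrative values $\phi=0.5$, $\theta=0.3$, $\sigma_\epsilon=1$ and compares the sample mean of $\epsilon_t Y_t$ with $\sigma_\epsilon^2$:

```python
import numpy as np

# Monte Carlo check of E[eps_t * Y_t] = sigma_eps^2 for an ARMA(1,1) process.
# phi, theta, sigma_eps are illustrative values, not taken from the question.
rng = np.random.default_rng(0)
phi, theta, sigma_eps = 0.5, 0.3, 1.0
n, burn_in = 200_000, 1_000

eps = rng.normal(0.0, sigma_eps, size=n + burn_in)
Y = np.zeros(n + burn_in)
for t in range(1, n + burn_in):
    # Y_t = phi * Y_{t-1} + eps_t + theta * eps_{t-1}
    Y[t] = phi * Y[t - 1] + eps[t] + theta * eps[t - 1]

# Discard the burn-in so the retained series is approximately stationary.
eps, Y = eps[burn_in:], Y[burn_in:]
print("sample mean of eps_t * Y_t:", np.mean(eps * Y))  # close to 1.0
print("sigma_eps^2:               ", sigma_eps**2)
```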