
I have $X_i=\beta X_{i-1}+\varepsilon_i$ for $i=1,2,\dots$, where the $\varepsilon_i$ are i.i.d. with mean $\mu$ and variance $\sigma^2$, $0<\beta<1$, and $X_0=0$. I've calculated $X_i=\sum_{j=1}^i \varepsilon_j \beta^{i-j}$ and $E(X_i)=\mu\,\frac{\beta^i-1}{\beta-1}$, but I have no idea how to calculate the variance of $X_i$. Specifically, I don't know how to calculate the $E(X_i^2)$ part of $\mathrm{Var}(X_i)=E(X_i^2)-[E(X_i)]^2$. This seems like it shouldn't be too difficult, but I'm having a hard time because of the summation of correlated variables.
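(For the record, the mean formula follows by linearity and a geometric sum, since $E(\varepsilon_j)=\mu$ for every $j$: $E(X_i)=\sum_{j=1}^i \beta^{i-j}E(\varepsilon_j)=\mu\sum_{k=0}^{i-1}\beta^k=\mu\,\frac{1-\beta^i}{1-\beta}$, which is the expression above.)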

  • There should not be a problem, since $E(X_i^2)=\beta^2E(X_{i-1}^2)+2\beta E(X_{i-1}\varepsilon_i)+E(\varepsilon_i^2)$. The middle term is the product of the expectations, since $X_{i-1}$ and $\varepsilon_i$ are independent. (2012-12-13)
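    To spell out how this recursion closes: $E(X_{i-1}\varepsilon_i)=E(X_{i-1})E(\varepsilon_i)=\mu E(X_{i-1})$ and $E(\varepsilon_i^2)=\mathrm{Var}(\varepsilon_i)+[E(\varepsilon_i)]^2=\sigma^2+\mu^2$, so $E(X_i^2)=\beta^2E(X_{i-1}^2)+2\beta\mu E(X_{i-1})+\sigma^2+\mu^2$, which can be iterated starting from $E(X_0^2)=0$.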

1 Answer


This follows from your formula for $X_i$: it writes $X_i$ as a linear combination of the independent random variables $\varepsilon_1,\dots,\varepsilon_i$, so there is no correlation issue and the variances of the summands simply add: $ X_i=\sum_{j=1}^i\beta^{i-j}\varepsilon_j\implies\mathrm{var}(X_i)=\sum_{j=1}^i\beta^{2(i-j)}\mathrm{var}(\varepsilon_j)=\sigma^2\sum_{j=1}^i\beta^{2(i-j)}=\sigma^2\frac{1-\beta^{2i}}{1-\beta^2}.$
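As a quick numerical sanity check (not part of the original answer), here is a minimal simulation sketch in Python/NumPy. The values of $\beta$, $\mu$, $\sigma$, $i$, and the choice of normal noise are arbitrary illustrative assumptions; the problem only fixes the mean and variance of $\varepsilon_i$.

```python
import numpy as np

# Monte Carlo check of var(X_i) = sigma^2 * (1 - beta^(2i)) / (1 - beta^2)
# and E(X_i) = mu * (1 - beta^i) / (1 - beta), for X_0 = 0 and
# X_k = beta * X_{k-1} + eps_k. Normal noise is an arbitrary choice here;
# the formulas only use its mean and variance.
rng = np.random.default_rng(0)
beta, mu, sigma = 0.7, 0.5, 1.0   # illustrative parameter values
i, n_paths = 20, 200_000          # horizon and number of simulated paths

# Simulate n_paths independent trajectories in parallel.
X = np.zeros(n_paths)
for _ in range(i):
    X = beta * X + rng.normal(mu, sigma, n_paths)

print("empirical var :", X.var())
print("formula   var :", sigma**2 * (1 - beta**(2 * i)) / (1 - beta**2))
print("empirical mean:", X.mean())
print("formula   mean:", mu * (1 - beta**i) / (1 - beta))
```

With these settings the empirical variance should land within Monte Carlo error of $\sigma^2\frac{1-\beta^{2i}}{1-\beta^2}\approx 1.96$.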

  • Do you think this is a paradox? What is to explain? The distribution of $X_i$ converges and $\sqrt{i}\to\infty$, hence $X_i/\sqrt{i}\to0$ in distribution. And convergence in distribution to a given constant $c$ is equivalent to convergence in probability to $c$. (2012-12-15)