I have $X_i=\beta X_{i-1}+\varepsilon_i$ for $i=1,2,\dots$, where the $\varepsilon_i$ are iid with mean $\mu$ and variance $\sigma^2$, $0<\beta<1$, and $X_0=0$. I've calculated $X_i=\sum_{j=1}^i \varepsilon_j \beta^{i-j}$ and $E(X_i)=\mu\frac{\beta^i-1}{\beta-1}$, but I have no idea how to calculate the variance of $X_i$. Specifically, I don't know how to calculate the $E(X_i^2)$ part of $\operatorname{Var}(X_i)=E(X_i^2)-[E(X_i)]^2$. This seems like it shouldn't be too difficult, but I'm having a hard time because of the cross terms that appear when squaring the sum.
variance of a sum of correlated variables
probability-theory
There should not be a problem, since $E(X_i^2)=\beta^2E(X_{i-1}^2)+2\beta E(X_{i-1}\varepsilon_i)+E(\varepsilon_i^2)$. The middle term is the product of the expectations, since $X_{i-1}$ and $\varepsilon_i$ are independent. – 2012-12-13
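Following that hint (this part is a sketch, not from the original thread): because $X_{i-1}$ and $\varepsilon_i$ are independent, the same argument gives a recursion directly for the variance, $\operatorname{Var}(X_i)=\beta^2\operatorname{Var}(X_{i-1})+\sigma^2$, which unrolls to $\operatorname{Var}(X_i)=\sigma^2\,\frac{1-\beta^{2i}}{1-\beta^2}$. A minimal Monte Carlo check of that closed form, with the noise distribution and the parameter values below chosen purely for illustration (the question only fixes the mean and variance of $\varepsilon_i$):

```python
import numpy as np

# Sketch: verify Var(X_i) = sigma^2 * (1 - beta^(2i)) / (1 - beta^2) by simulation.
# beta, mu, sigma, n, trials, and the normal noise are illustrative assumptions,
# not values from the original question.
beta, mu, sigma = 0.7, 1.0, 2.0
n, trials = 20, 200_000

rng = np.random.default_rng(0)
eps = rng.normal(mu, sigma, size=(trials, n))  # iid noise with mean mu, sd sigma

X = np.zeros(trials)                 # X_0 = 0 in every trial
for i in range(n):
    X = beta * X + eps[:, i]         # X_i = beta * X_{i-1} + eps_i

empirical = X.var()
closed_form = sigma**2 * (1 - beta**(2 * n)) / (1 - beta**2)
print(f"empirical Var(X_n) ~ {empirical:.4f}, closed form {closed_form:.4f}")
```

The two numbers should agree up to Monte Carlo noise, and the check is insensitive to the actual noise distribution, since only the first two moments of $\varepsilon_i$ enter the variance.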