
I have the following sample variance estimator:

$s^2 = \frac{1}{2n} \sum_{i=2}^{n} (e_i - e_{i-1})^2$, where the $e_i$ are iid with mean $0$ and a homoskedastic variance. I want to show that this estimator converges in probability to the variance of $e_i$. I know that I can reduce this to approximately $$ \frac{1}{n} \sum_{i=1}^{n} e_i^2 - \frac{1}{n} \sum_{i=2}^{n} e_i e_{i-1}, $$

but from here on, I don't see why this converges to the variance of $e_i$.
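For intuition, here is a quick simulation (my own sketch, not part of the question; it assumes normal errors and uses numpy) showing that the estimator settles near the true variance as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)

def s2_diff(e):
    """Difference-based estimator s^2 = (1/2n) * sum_{i=2}^n (e_i - e_{i-1})^2."""
    n = len(e)
    return np.sum(np.diff(e) ** 2) / (2 * n)

sigma2 = 4.0  # true variance of e_i (arbitrary choice for the demo)
for n in (100, 10_000, 1_000_000):
    e = rng.normal(0.0, np.sqrt(sigma2), size=n)
    print(n, s2_diff(e))  # values approach sigma2 = 4.0 as n grows
```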

Second question: assuming it does converge to the variance of $e_i$, what would be the variance of $s^2 - \operatorname{var}(e_i)$?

Thank you.

  • 1
    Found the answer to the first part: the term we subtract converges to $E(e_ie_{i-1}) = 0$, since the $e_i$ are independent with mean $0$. Still interested in the second part. (2017-01-21)

1 Answer


For the first part: note that by the WLLN the first term converges to $Ee_i^2 = \operatorname{var}(e_i) = \sigma^2$ and the second to $E(e_ie_{i-1}) = Ee_i \, Ee_{i-1} = 0$ by independence, thus $s^2 \xrightarrow{p} \sigma^2$ as $n \to \infty$.
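For completeness, here is the expansion behind the reduction in the question, together with the termwise limits:

$$ \frac{1}{2n}\sum_{i=2}^{n}(e_i - e_{i-1})^2 = \frac{1}{2n}\sum_{i=2}^{n} e_i^2 - \frac{1}{n}\sum_{i=2}^{n} e_i e_{i-1} + \frac{1}{2n}\sum_{i=2}^{n} e_{i-1}^2 \xrightarrow{p} \frac{\sigma^2}{2} - 0 + \frac{\sigma^2}{2} = \sigma^2, $$

where each squared-term average tends to $\sigma^2/2$ by the WLLN, and the cross-term average tends to $E(e_ie_{i-1}) = 0$ (independence plus mean zero).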

For the second part, note that $\operatorname{var}(s^2 - \operatorname{var}(e_i)) = \operatorname{var}(s^2) = Es^4 - (Es^2)^2 = Es^4 - \sigma^4$ asymptotically, since $Es^2 = \frac{n-1}{n}\sigma^2 \to \sigma^2$. Assuming that $Ee_i^4 < \infty$, you can get an expression for this with simple algebra, but for a more concrete answer I think you will have to assume some structure on $e_i$.
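To sketch that algebra (my own working, assuming $\mu_4 := Ee_i^4 < \infty$): write $d_i = e_i - e_{i-1}$, so $s^2 = \frac{1}{2n}\sum_{i=2}^{n} d_i^2$. Direct expansion using independence and mean zero gives

$$ \operatorname{var}(d_i^2) = 2\mu_4 + 2\sigma^4, \qquad \operatorname{cov}(d_i^2, d_{i+1}^2) = \mu_4 - \sigma^4, $$

while $d_i^2$ and $d_j^2$ are independent for $|i - j| \ge 2$. Summing the $n-1$ variance terms and the $2(n-2)$ adjacent covariance terms,

$$ \operatorname{var}(s^2) = \frac{(n-1)(2\mu_4 + 2\sigma^4) + 2(n-2)(\mu_4 - \sigma^4)}{4n^2} = \frac{\mu_4}{n} + O(n^{-2}). $$

For normal errors, $\mu_4 = 3\sigma^4$, so $\operatorname{var}(s^2) \approx 3\sigma^4/n$.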

  • 0
    Thank you, but how do you find an expression for $Es^4 - \sigma^4$, taking $E(e_i^4)$ as given? (2017-01-21)
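One can show, under a finite fourth moment $\mu_4 = Ee_i^4$, that $\operatorname{var}(s^2) \approx \mu_4/n$ to leading order. A small Monte Carlo (my own sketch, assuming normal errors so that $\mu_4 = 3\sigma^4$) checks this:

```python
import numpy as np

rng = np.random.default_rng(1)

sigma2 = 1.0
mu4 = 3.0 * sigma2 ** 2     # E e^4 = 3 sigma^4 holds for normal errors (assumption)
n, reps = 2_000, 5_000

s2 = np.empty(reps)
for r in range(reps):
    e = rng.normal(0.0, np.sqrt(sigma2), size=n)
    # difference-based estimator from the question
    s2[r] = np.sum(np.diff(e) ** 2) / (2 * n)

print(s2.var(), mu4 / n)  # empirical var(s^2) vs the leading-order term mu4 / n
```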