
The homework question below comes from Larsen and Marx, 4th edition.

Is the maximum likelihood estimator for $\sigma^{2}$ in a normal pdf, where both $\mu$ and $\sigma^{2}$ are unknown, asymptotically unbiased?

I think I understand the notion that an estimator $\hat{\theta}_n$ is asymptotically unbiased if the limit of its expected value as $n$ goes to infinity is $\theta$, but I'm really not sure where to go with trying to answer the above question.

Any idea where to start?


1 Answer


If $X_1,\ldots,X_n$ are iid $\mathcal{N}(\mu,\sigma^2)$ variables, then the maximum likelihood estimator of $\sigma^2$ is
$$\hat{\sigma}_n^2=\frac{1}{n}\sum_{i=1}^n(X_i-\bar{X})^2,$$
where $\bar{X}=\frac{1}{n}\sum_{i=1}^n X_i$ is the sample mean. Recall that we usually do not use this estimator because it is biased. Instead we often use
$$s^2_n=\frac{1}{n-1}\sum_{i=1}^n (X_i-\bar{X})^2,$$
because $E[s^2_n]=\sigma^2$. Now, since $\hat{\sigma}^2_n = \frac{n-1}{n}s^2_n$, it follows that
$$E[\hat{\sigma}_n^2]=\frac{n-1}{n}\sigma^2\to \sigma^2$$
as $n\to\infty$, and hence the maximum likelihood estimator is asymptotically unbiased.
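
If it helps to see this numerically, here is a minimal simulation sketch (assuming NumPy; the true parameters, seed, and sample sizes are arbitrary choices, not from the text). It compares the empirical mean of $\hat{\sigma}_n^2$ over many repetitions with the theoretical value $\frac{n-1}{n}\sigma^2$, and the bias should visibly shrink as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)   # fixed seed for reproducibility (arbitrary)

mu, sigma2 = 5.0, 4.0            # hypothetical true parameters
reps = 20_000                    # number of simulated samples per n

for n in (5, 20, 100, 1000):
    # Draw `reps` samples of size n from N(mu, sigma2)
    x = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
    xbar = x.mean(axis=1, keepdims=True)
    # MLE: (1/n) * sum of squared deviations from the sample mean
    mle = ((x - xbar) ** 2).mean(axis=1)
    print(f"n={n:5d}  empirical E[mle]={mle.mean():.4f}  "
          f"theory (n-1)/n * sigma^2={(n - 1) / n * sigma2:.4f}")
```

For small $n$ the empirical mean sits clearly below $\sigma^2 = 4$, matching $\frac{n-1}{n}\sigma^2$, and it approaches $4$ as $n$ increases, which is exactly the asymptotic unbiasedness shown above.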