
Suppose $X_1,\ldots,X_n$ are iid normal random variables, $X_i \sim \mathcal{N}(\theta, \sigma^2)$. The method of moments estimates $\theta$ and $\sigma^2$ by solving for them in
$$\bar{X}=\theta, \qquad \frac{1}{n}\sum X_i^2 = \theta^2+\sigma^2,$$
giving
$$\theta = \bar{X}, \qquad \sigma^2=\frac{1}{n}\sum(X_i-\bar{X})^2.$$

Why is this a biased estimator of $\sigma^2$? Is it because it has $1/n$ instead of $1/(n-1)$ in front?

2 Answers


First of all, you are missing some expectations throughout the post: the moment conditions should read $E[\bar{X}]=\theta$, etc. And yes, by replacing $\frac{1}{n}$ with $\frac{1}{n-1}$ you get an unbiased estimator:
$$E\left[\frac{1}{n-1}\sum (X_i - \bar{X})^2\right] = \sigma^2.$$
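You can see both facts numerically with a quick Monte Carlo sketch (the parameter values below are arbitrary choices for illustration): averaging the $1/n$ estimator over many samples lands near $\frac{n-1}{n}\sigma^2$, while the $1/(n-1)$ version lands near $\sigma^2$.

```python
import random

# Monte Carlo comparison of the two variance estimators.
# theta, sigma, n, and trials are arbitrary illustration values.
random.seed(0)
theta, sigma, n, trials = 2.0, 3.0, 5, 20000

mom_sum = 0.0  # method-of-moments estimator: divide by n
unb_sum = 0.0  # corrected estimator: divide by n - 1
for _ in range(trials):
    xs = [random.gauss(theta, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    ss = sum((x - xbar) ** 2 for x in xs)  # sum of squared deviations
    mom_sum += ss / n
    unb_sum += ss / (n - 1)

mom_avg = mom_sum / trials  # approx (n-1)/n * sigma^2 = 7.2
unb_avg = unb_sum / trials  # approx sigma^2 = 9.0
```

With $n=5$ the bias factor $\frac{n-1}{n} = 0.8$ is large enough to be visible well above the simulation noise.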


Denote by $\hat\theta(X_1, \ldots, X_n)$ and $\hat{\sigma}^2(X_1, \ldots, X_n)$ the parameter estimators:
$$\hat\theta = \frac{1}{n} \sum_{k=1}^n X_k, \qquad \hat{\sigma}^2 = \frac{1}{n} \sum_{k=1}^n X_k^2 - \left( \frac{1}{n} \sum_{k=1}^n X_k \right)^2 = \frac{1}{n}\left(1-\frac{1}{n} \right) \sum_{k=1}^n X_k^2 - \frac{2}{n^2} \sum_{k=2}^n \sum_{m=1}^{k-1} X_k X_m.$$
To establish the bias, we compute the expectation of each estimator:
$$\mathbb{E}(\hat{\theta}) = \frac{1}{n} \sum_{k=1}^n \mathbb{E}(X_k) = \frac{1}{n} \sum_{k=1}^n \theta = \theta,$$
$$\begin{aligned} \mathbb{E}(\hat{\sigma}^2) &= \frac{1}{n} \left(1-\frac{1}{n}\right) \sum_{k=1}^n \mathbb{E}(X_k^2) - \frac{2}{n^2} \sum_{k=2}^n \sum_{m=1}^{k-1} \mathbb{E}\left(X_k X_m\right) \\ &= \frac{1}{n} \left(1-\frac{1}{n}\right) \sum_{k=1}^n (\theta^2 + \sigma^2) - \frac{2}{n^2} \sum_{k=2}^n \sum_{m=1}^{k-1} \theta^2 \\ &= \left(1-\frac{1}{n}\right) (\theta^2+\sigma^2) - \frac{2}{n^2} \cdot \frac{n(n-1)}{2}\, \theta^2 \\ &= \left(1-\frac{1}{n}\right) \sigma^2 = \frac{n-1}{n}\, \sigma^2, \end{aligned}$$
where the cross terms use independence: $\mathbb{E}(X_k X_m) = \mathbb{E}(X_k)\mathbb{E}(X_m) = \theta^2$ for $k \neq m$. Since $\mathbb{E}(\hat{\sigma}^2) \neq \sigma^2$, the estimator is biased.
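The derivation above relies on the algebraic identity $\frac{1}{n}\sum X_k^2 - \bar{X}^2 = \frac{1}{n}\sum (X_k - \bar{X})^2$, which links the second-moment form of $\hat\sigma^2$ to the usual sum-of-squared-deviations form. A quick numerical check of that identity (on an arbitrary random sample):

```python
import random

# Check that the two forms of the method-of-moments variance estimator agree:
#   (1/n) sum X_k^2 - Xbar^2  ==  (1/n) sum (X_k - Xbar)^2
random.seed(1)
xs = [random.gauss(0.0, 1.0) for _ in range(10)]
n = len(xs)
xbar = sum(xs) / n

form1 = sum(x * x for x in xs) / n - xbar ** 2     # second-moment form
form2 = sum((x - xbar) ** 2 for x in xs) / n       # deviation form
```

The two values agree up to floating-point rounding, confirming the expansion used in the first displayed equation.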