This is a solution to one of my tutorial questions. Does anybody have a clue about the step
$\frac{1}{nI(\sigma^2)} = \frac{2\sigma^4}{n}$ ?
Thanks.
If you compute the second derivative of the log-likelihood function, or equivalently the first derivative of the score (the score is by definition the first derivative of the log-likelihood), you get $\frac{ds}{d\sigma^2}=\frac{n}{2\sigma^4}-\frac{1}{\sigma^6}\sum_{i=1}^n(x_i-\mu)^2.$ Now taking expectations yields:
$E\left(\frac{ds}{d\sigma^2}\right)=\frac{n}{2\sigma^4}-\frac{n}{\sigma^6}\,E\!\left[\frac{\sum_{i=1}^n(x_i-\mu)^2}{n}\right]=\frac{n}{2\sigma^4}-\frac{n}{\sigma^6}\cdot\sigma^2=\frac{n}{2\sigma^4}-\frac{n}{\sigma^4}=-\frac{n}{2\sigma^4},$ using the fact that $E\left[\frac{1}{n}\sum_{i=1}^n(x_i-\mu)^2\right]=\sigma^2$ when $\mu$ is known.
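If it helps to convince yourself of the key substitution $E\left[\sum_{i=1}^n(x_i-\mu)^2\right]=n\sigma^2$, here is a quick Monte Carlo sanity check. The values $\mu=1.0$, $\sigma^2=2.0$, $n=50$ are arbitrary choices for illustration:

```python
import numpy as np

# Monte Carlo check that E[sum_i (x_i - mu)^2] = n * sigma^2,
# the substitution used in the expectation step above.
# mu, sigma2, n are arbitrary illustrative values.
rng = np.random.default_rng(0)
mu, sigma2, n = 1.0, 2.0, 50
reps = 200_000

x = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
q = ((x - mu) ** 2).sum(axis=1)   # Q = sum_i (x_i - mu)^2, one value per replication

print(q.mean())      # should be close to n * sigma2 = 100
print(n * sigma2)
```

The sample mean of $Q$ over many replications should sit very close to $n\sigma^2$.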
Lastly, by the Fisher information equality we have
$I(\sigma^2)=-E\left(\frac{ds}{d\sigma^2}\right)=\frac{n}{2\sigma^4},$ so the reciprocal is $\frac{1}{I(\sigma^2)}=\frac{2\sigma^4}{n}$, which is the expression in your question.
Remark 1: As you may be aware, we are assuming that $\mu$ is known above. As is standard, I denote the score by $s(\cdot)$, i.e. $s(\sigma^2)=\frac{dl}{d\sigma^2}$.
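The whole derivation can also be reproduced symbolically. The sketch below (using sympy, with `s2` standing for $\sigma^2$ and `Q` for the statistic $\sum_i(x_i-\mu)^2$, whose expectation is $n\sigma^2$) differentiates the log-likelihood twice and substitutes the expectation:

```python
import sympy as sp

# s2 = sigma^2 treated as a single variable, n = sample size,
# Q = sum_i (x_i - mu)^2, a statistic with E[Q] = n * sigma^2.
s2, n, Q = sp.symbols('s2 n Q', positive=True)

# Log-likelihood of n iid N(mu, sigma^2) observations,
# up to an additive constant not depending on sigma^2.
loglik = -sp.Rational(1, 2) * n * sp.log(s2) - Q / (2 * s2)

score = sp.diff(loglik, s2)       # s(sigma^2) = dl/d(sigma^2)
d_score = sp.diff(score, s2)      # ds/d(sigma^2)

# Take expectations by substituting E[Q] = n * sigma^2,
# then flip the sign to get the Fisher information.
info = sp.simplify(-d_score.subs(Q, n * s2))

print(info)   # n/(2*sigma^4), i.e. I(sigma^2) = n / (2 sigma^4)
```

This confirms $I(\sigma^2)=\frac{n}{2\sigma^4}$ and hence $\frac{1}{I(\sigma^2)}=\frac{2\sigma^4}{n}$.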