
Suppose we observe $Y_i\sim \mathcal{N}(\theta_0 + \theta_1 x_i, \sigma_i^2)$, with $x_i$ and $\sigma_i^2$ known for all $i = 1,\ldots,n$ and $Y_1,\ldots,Y_n$ independent. Assume $\theta_0$ is unknown and $\overline{x}=0.$

What is the MLE of $\theta_1$? The fact that the variances are different is throwing me off. I end up getting that I should maximize $\exp\left(\sum_i \frac{2y_i(\theta_0 + \theta_1 x_i) - (\theta_0+\theta_1 x_i)^2}{2\sigma_i^2}\right).$ From there I'm stuck because taking partial derivatives doesn't give me anything.

Thanks!

1 Answer


Up to a constant factor that does not depend on $(\theta_0, \theta_1)$ (and so does not affect the maximizer), the likelihood function is

$L(\theta_0, \theta_1) = \exp\left( -\sum_i \frac{(y_i - \theta_0 - \theta_1 x_i)^2}{2\sigma_i^2}\right)$
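(It is often easier to work with the log-likelihood, which has the same maximizer since $\log$ is increasing:

$\ell(\theta_0, \theta_1) = \log L(\theta_0, \theta_1) = -\sum_i \frac{(y_i - \theta_0 - \theta_1 x_i)^2}{2\sigma_i^2}.$

Differentiating $\ell$ avoids carrying the exponential through the chain rule; differentiating $L$ directly, as below, leads to the same equations.)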

Taking partial derivatives of $L$ gives

$\frac{\partial L}{\partial\theta_0} = \left(\sum_i \frac{y_i - \theta_0 - \theta_1 x_i}{\sigma_i^2} \right) L(\theta_0,\theta_1)$

$\frac{\partial L}{\partial\theta_1} = \left(\sum_i \frac{(y_i - \theta_0 - \theta_1 x_i)x_i}{\sigma_i^2} \right) L(\theta_0,\theta_1)$

Setting both of these to zero (since $L > 0$, the exponential factor can be divided out), we find that we must solve

$\theta_0 \sum_i \frac{1}{\sigma_i^2} + \theta_1 \sum_i \frac{x_i}{\sigma_i^2} = \sum_i \frac{y_i}{\sigma_i^2}$

$\theta_0 \sum_i \frac{x_i}{\sigma_i^2} + \theta_1 \sum_i \frac{x_i^2}{\sigma_i^2} = \sum_i \frac{x_iy_i}{\sigma_i^2}$

which is a $2\times 2$ linear system, easily solved with linear algebra. These are exactly the normal equations of weighted least squares with weights $1/\sigma_i^2$.
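For completeness, writing $S_w = \sum_i \frac{1}{\sigma_i^2}$, $S_x = \sum_i \frac{x_i}{\sigma_i^2}$, $S_{xx} = \sum_i \frac{x_i^2}{\sigma_i^2}$, $S_y = \sum_i \frac{y_i}{\sigma_i^2}$, and $S_{xy} = \sum_i \frac{x_i y_i}{\sigma_i^2}$, Cramer's rule gives

$\hat\theta_1 = \frac{S_w S_{xy} - S_x S_y}{S_w S_{xx} - S_x^2}, \qquad \hat\theta_0 = \frac{S_y - \hat\theta_1 S_x}{S_w}.$

As a quick numerical sanity check, here is a minimal sketch in Python/NumPy (the simulated data and variable names are illustrative, not part of the original problem):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate data with known x_i and known, unequal variances sigma_i^2.
n = 200
x = rng.normal(size=n)
x -= x.mean()                       # enforce x-bar = 0, as in the problem
sigma2 = rng.uniform(0.5, 2.0, n)   # known variances
theta0, theta1 = 1.5, -0.7          # true parameters (illustrative)
y = theta0 + theta1 * x + rng.normal(scale=np.sqrt(sigma2))

# Solve the weighted normal equations derived above.
w = 1.0 / sigma2
Sw, Sx, Sxx = w.sum(), (w * x).sum(), (w * x**2).sum()
Sy, Sxy = (w * y).sum(), (w * x * y).sum()

theta1_hat = (Sw * Sxy - Sx * Sy) / (Sw * Sxx - Sx**2)
theta0_hat = (Sy - theta1_hat * Sx) / Sw

print(theta0_hat, theta1_hat)       # should be close to 1.5 and -0.7
```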

  • Thanks! This helps immensely. I hadn't taken partials with respect to $\theta_0$ along with $\theta_1$, and that was messing me up. (2011-05-10)