
Suppose that $X_1,\ldots,X_n$ are normal with mean $\mu_1$; $Y_1,\ldots,Y_n$ are normal with mean $\mu_2$; and $W_1,\ldots,W_n$ are normal with mean $\mu_1+\mu_2$. Assuming that all $3n$ random variables are independent, with a common variance, find the maximum likelihood estimators of $\mu_1$ and $\mu_2$.

Solving for $\mu_1$ using only $X_1,\ldots,X_n$, I got $\hat\mu_1 =$ the sample mean of the $X_i$. Similarly, solving for $\mu_2$ using only $Y_1,\ldots,Y_n$, I got $\hat\mu_2 =$ the sample mean of the $Y_i$. However, I'm not sure that's what the question intends, especially since I don't understand what the purpose of giving the mean of $W_1,\ldots,W_n$ is.

  • Maybe your title should say "Maximum likelihood estimators of the means of three independent normal random variables with a common variance". (2012-11-19)

2 Answers


The likelihood function based on $X_1,\ldots,X_n,Y_1,\ldots,Y_n,W_1,\ldots,W_n$ is given by $ L(\mu_1,\mu_2,\sigma^2)=\left[\prod_{i=1}^n \frac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-\frac{1}{2\sigma^2}(x_i-\mu_1)^2\right)\right]\times\left[\prod_{i=1}^n \frac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-\frac{1}{2\sigma^2}(y_i-\mu_2)^2\right)\right]\times \left[\prod_{i=1}^n \frac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-\frac{1}{2\sigma^2}(w_i-(\mu_1+\mu_2))^2\right)\right] $

Then the log-likelihood function is $ l(\mu_1,\mu_2,\sigma^2)=-\frac{3n}{2}\log(2\pi\sigma^2)-\frac{1}{2\sigma^2}\left(\sum_{i=1}^n(x_i-\mu_1)^2+\sum_{i=1}^n(y_i-\mu_2)^2+\sum_{i=1}^n(w_i-\mu_1-\mu_2)^2\right) $ and so $ \frac{\partial l}{\partial \sigma^2}(\mu_1,\mu_2,\sigma^2)=-\frac{3n}{2\sigma^2}+\frac{1}{2(\sigma^2)^2}\left(\sum_{i=1}^n(x_i-\mu_1)^2+\sum_{i=1}^n(y_i-\mu_2)^2+\sum_{i=1}^n(w_i-\mu_1-\mu_2)^2\right). $ This is equal to $0$ if and only if $ \sigma^2=\hat{\sigma}^2=\frac{1}{3n}\left(\sum_{i=1}^n(x_i-\mu_1)^2+\sum_{i=1}^n(y_i-\mu_2)^2+\sum_{i=1}^n(w_i-\mu_1-\mu_2)^2\right). $ Thus $ l(\mu_1,\mu_2,\sigma^2)\leq l(\mu_1,\mu_2,\hat{\sigma}^2)=-\frac{3n}{2}\log(2\pi\hat{\sigma}^2)-\frac{3n}{2}. $ Now maximize this expression with respect to $\mu_1$ and $\mu_2$.
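To make that last step concrete (a sketch, using the notation above): the profiled log-likelihood is strictly decreasing in $\hat{\sigma}^2$, so maximizing it over $(\mu_1,\mu_2)$ is equivalent to minimizing the total sum of squares.

```latex
% l(mu_1, mu_2, sigma-hat^2) = -(3n/2) log(2 pi sigma-hat^2) - 3n/2
% is strictly decreasing in sigma-hat^2, hence
\operatorname*{arg\,max}_{\mu_1,\mu_2}\; l(\mu_1,\mu_2,\hat{\sigma}^2)
  = \operatorname*{arg\,min}_{\mu_1,\mu_2}\;
    \sum_{i=1}^n(x_i-\mu_1)^2
    +\sum_{i=1}^n(y_i-\mu_2)^2
    +\sum_{i=1}^n(w_i-\mu_1-\mu_2)^2 .
```

So the maximization over $(\mu_1,\mu_2)$ reduces to an ordinary least-squares problem.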

  • Aha, I see your point :) (2012-11-19)

It is useful to remember that $ \sum_{i=1}^n (x_i-\mu)^2 = n(\overline{x}-\mu)^2 + \sum_{i=1}^n (x_i-\overline{x})^2,\text{ where }\overline{x} = \frac{x_1+\cdots+x_n}{n}. $ Then you can rely on the fact that $\mu$ appears only in the first term on the right. The value of $\mu$ that minimizes this is therefore the value that minimizes the first term. That is $\hat\mu=\overline{x}$.

In the present problem we need to minimize $ \sum_{i=1}^n (x_i-\mu_1)^2 + \sum_{i=1}^n (y_i-\mu_2)^2 + \sum_{i=1}^n (w_i-(\mu_1+\mu_2))^2 $ $ = n(\overline{x}-\mu_1)^2 + n(\overline{y}-\mu_2)^2 + n(\overline{w}-(\mu_1+\mu_2))^2+\text{terms not depending on $\mu_1$ or $\mu_2$}. $ $ = n\left(2\mu_1^2+2\mu_2^2 + 2\mu_1\mu_2-2\mu_1(\overline{w}+\overline{x})-2\mu_2(\overline{w}+\overline{y}) + \text{terms not depending on $\mu_1$ or $\mu_2$}\right). $ Next you could talk about a rotation in the $(\mu_1,\mu_2)$ plane, but maybe it's more efficient at this point just to find partial derivatives with respect to $\mu_1$ and $\mu_2$ and set them to $0$ and solve for those two variables.
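Carrying out that last step explicitly (a sketch; each stationarity equation below has been divided through by $2n$):

```latex
% Setting the partial derivatives of the quadratic above to zero:
%   d/d mu_1:  4 mu_1 + 2 mu_2 - 2(wbar + xbar) = 0
%   d/d mu_2:  4 mu_2 + 2 mu_1 - 2(wbar + ybar) = 0
2\mu_1+\mu_2=\overline{w}+\overline{x},
\qquad
\mu_1+2\mu_2=\overline{w}+\overline{y};
% subtracting gives mu_1 - mu_2 = xbar - ybar; substituting back:
\hat{\mu}_1=\frac{2\overline{x}-\overline{y}+\overline{w}}{3},
\qquad
\hat{\mu}_2=\frac{2\overline{y}-\overline{x}+\overline{w}}{3}.
```

As a sanity check, $E[\hat{\mu}_2]=\frac{2\mu_2-\mu_1+(\mu_1+\mu_2)}{3}=\mu_2$, so the estimator is unbiased.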

I get $ \hat\mu_2=\tfrac{1}{3}\overline{w}+\text{a certain affine combination of $\overline{x}$ and $\overline{y}$}, $ namely $\hat\mu_2=\frac{2\overline{y}-\overline{x}+\overline{w}}{3}$. A similar thing applies to $\mu_1$: $\hat\mu_1=\frac{2\overline{x}-\overline{y}+\overline{w}}{3}$.
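A quick numerical sanity check of this (the parameter values below are arbitrary, chosen only for illustration): simulate the three samples and compare the closed-form estimators $\hat\mu_1=(2\overline{x}-\overline{y}+\overline{w})/3$, $\hat\mu_2=(2\overline{y}-\overline{x}+\overline{w})/3$, obtained by solving the two stationarity equations, against the true means and against the naive estimator $(\overline{x},\overline{y})$ on the least-squares objective.

```python
import numpy as np

rng = np.random.default_rng(0)
mu1, mu2, sigma, n = 2.0, -1.0, 1.5, 100_000  # arbitrary illustrative values

# Simulate the three independent samples with a common variance.
x = rng.normal(mu1, sigma, n)
y = rng.normal(mu2, sigma, n)
w = rng.normal(mu1 + mu2, sigma, n)

xbar, ybar, wbar = x.mean(), y.mean(), w.mean()

# Closed-form MLEs from solving the two stationarity equations.
mu1_hat = (2 * xbar - ybar + wbar) / 3
mu2_hat = (2 * ybar - xbar + wbar) / 3

def sse(m1, m2):
    """Total sum of squares that the MLE minimizes."""
    return ((x - m1)**2).sum() + ((y - m2)**2).sum() + ((w - m1 - m2)**2).sum()

# The MLE should do at least as well as the naive (xbar, ybar) on the objective.
assert sse(mu1_hat, mu2_hat) <= sse(xbar, ybar)
print(round(mu1_hat, 3), round(mu2_hat, 3))
```

With $n$ this large the estimates land very close to the true means, and the closed form never loses to the naive estimator on the objective, since the quadratic is convex and these are its exact minimizers.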

  • Now I'm a bit suspicious, because I'm wondering if it shouldn't have been a weighted average of all three... (2012-11-19)