
$x_1$, $x_2$, and $x_3$ are i.i.d. normal random variables with distribution $N(0, \sigma_x^{2})$

$\epsilon_1$, $\epsilon_2$, and $\epsilon_3$ are i.i.d. normal random variables with distribution $N(0, \sigma_n^{2})$

Let's define:

$y_1 = x_1 + \epsilon_1$

$y_2 = x_2 + \epsilon_2$

$y_3 = x_3 + \epsilon_3$

Given the observed values of:

$s_1 = y_1 + y_2$

$s_2 = y_1 + y_3$

What is the posterior distribution $f(x_1,x_2,x_3|s_1,s_2)$?

What is the ML estimate of $x_1$, $x_2$, and $x_3$?

As an example, assume $\sigma_x=2$, $\sigma_n=1$, $s_1=2$, $s_2=1$; we are interested in estimates for $x_1$, $x_2$, and $x_3$.

Thanks, MG

1 Answer


(Edited: I seem to have forgotten an $\epsilon$ in my earlier answer and would now say)

$\hat{x}_1 = \frac{4}{5}$, $\hat{x}_2 = \frac{4}{5}$, and $\hat{x}_3 = 0$, with $\hat{\epsilon}_1 = \frac{1}{5}$, $\hat{\epsilon}_2 = \frac{1}{5}$, and $\hat{\epsilon}_3 = 0$.

Generalising further with some justification added as a response to a comment:

This stat.SE question and answer gives an explanation of how to derive this. We will take all means to be zero.

The slightly harder part is estimating $Y_1$, $Y_2$, and $Y_3$. They each have variance $\sigma_y^2 = \sigma_x^2 + \sigma_{\epsilon}^2$, so $S_1$ and $S_2$ each have variance $2\sigma_y^2$ and have covariance $\sigma_y^2$. Let's define

$T_1 = 2Y_1$

$T_2 = Y_2 + Y_3$

$T_3 = S_1 + S_2 = 2Y_1 + Y_2 + Y_3 = T_1 + T_2$

$T_4 = S_1 - S_2 = Y_2 - Y_3$

so $T_1$ has variance $4\sigma_y^2$, $T_2$ has variance $2\sigma_y^2$, and they are uncorrelated (since $Y_1$ is independent of $Y_2$ and $Y_3$). This means $T_3$ has variance $6\sigma_y^2$ and $T_4$ has variance $2\sigma_y^2$, and they too are uncorrelated.
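The variance bookkeeping above can be checked mechanically: writing $T = AY$ for independent $Y_i$, we have $\operatorname{Cov}(T) = A \operatorname{Cov}(Y) A^{\top}$. A quick numpy sketch, taking $\sigma_y^2 = 1$ since everything scales with it:

```python
import numpy as np

# Y1, Y2, Y3 are independent with common variance sigma_y^2 (taken to be 1 here).
cov_Y = np.eye(3)

# Each row is one of T1 = 2*Y1, T2 = Y2 + Y3, T3 = T1 + T2, T4 = Y2 - Y3
# expressed as a linear map of (Y1, Y2, Y3).
A = np.array([
    [2.0, 0.0,  0.0],  # T1
    [0.0, 1.0,  1.0],  # T2
    [2.0, 1.0,  1.0],  # T3
    [0.0, 1.0, -1.0],  # T4
])
cov_T = A @ cov_Y @ A.T

print(np.diag(cov_T))            # variances of T1..T4: [4. 2. 6. 2.]
print(cov_T[0, 1], cov_T[2, 3])  # Cov(T1,T2) and Cov(T3,T4), both 0.0
```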

The observation of $s_1$ and $s_2$ now tells us that $t_3 = s_1 + s_2$. From this we can get maximum likelihood estimates of $t_1$ and $t_2$ by splitting $t_3$ in proportion to their variances, so

$\hat{t}_1 = \frac{2(s_1 + s_2)}{3}$

$\hat{t}_2 = \frac{s_1 + s_2}{3}$

but since $2Y_1 = T_1$ this gives

$\hat{y}_1 = \frac{s_1 + s_2}{3}$

We also observe $t_4 = s_1 - s_2$ directly, and from the definitions $2Y_2 = T_2 + T_4$ and $2Y_3 = T_2 - T_4$, giving us

$\hat{y}_2 = \frac{2s_1 - s_2}{3}$

$\hat{y}_3 = \frac{2s_2 - s_1}{3}$
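At the example values $s_1 = 2$, $s_2 = 1$, these $\hat{y}_i$ formulas can be evaluated directly (a plain-Python sketch); note the estimates reproduce the observations exactly:

```python
# Example values from the question: s1 = 2, s2 = 1.
s1, s2 = 2.0, 1.0

y1_hat = (s1 + s2) / 3
y2_hat = (2 * s1 - s2) / 3
y3_hat = (2 * s2 - s1) / 3
print(y1_hat, y2_hat, y3_hat)  # 1.0 1.0 0.0

# Consistency: y1_hat + y2_hat = s1 and y1_hat + y3_hat = s2.
print(y1_hat + y2_hat, y1_hat + y3_hat)  # 2.0 1.0
```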

Getting from $\hat{y}_i$ to $\hat{x}_i$ is easier, since from the stat.SE question, we simply do this in proportion to variance, so $\hat{x}_i = \hat{y}_i \frac{\sigma_x^2}{\sigma_x^2+\sigma_{\epsilon}^2}$ giving

$\hat{x}_1 = \frac{(s_1 + s_2)\sigma_x^2}{3(\sigma_x^2+\sigma_{\epsilon}^2)}$

$\hat{x}_2 = \frac{(2s_1-s_2)\sigma_x^2}{3(\sigma_x^2+\sigma_{\epsilon}^2)}$

$\hat{x}_3 = \frac{(2s_2-s_1)\sigma_x^2}{3(\sigma_x^2+\sigma_{\epsilon}^2)}$
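Plugging in the question's example values $\sigma_x = 2$, $\sigma_n = 1$, $s_1 = 2$, $s_2 = 1$ recovers the numbers stated at the top of this answer (a plain-Python sketch):

```python
# sigma_x = 2 and sigma_n = 1, so the variances are 4 and 1.
sigma_x2 = 2.0 ** 2    # sigma_x^2
sigma_eps2 = 1.0 ** 2  # sigma_eps^2 (the question's sigma_n^2)
s1, s2 = 2.0, 1.0

shrink = sigma_x2 / (sigma_x2 + sigma_eps2)  # = 4/5
x1_hat = (s1 + s2) / 3 * shrink
x2_hat = (2 * s1 - s2) / 3 * shrink
x3_hat = (2 * s2 - s1) / 3 * shrink
print(x1_hat, x2_hat, x3_hat)  # 0.8 0.8 0.0
```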

and

$\hat{\epsilon}_i = \hat{x}_i \frac{\sigma_{\epsilon}^2}{\sigma_x^2}.$
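As a cross-check, the same numbers fall out of the standard multivariate-Gaussian conditioning formula $\hat{x} = \Sigma_{xs}\Sigma_{ss}^{-1}s$, without going through the $T_i$ at all. A numpy sketch; the covariance entries follow directly from the definitions of the $y_i$ and $s_j$:

```python
import numpy as np

sigma_x2, sigma_eps2 = 4.0, 1.0      # sigma_x = 2, sigma_n = 1
sigma_y2 = sigma_x2 + sigma_eps2
s = np.array([2.0, 1.0])             # observed s1, s2

# Cov(x_i, s_j): s1 = y1 + y2, s2 = y1 + y3, and Cov(x_i, y_j) = sigma_x2 if i = j.
cov_xs = sigma_x2 * np.array([
    [1.0, 1.0],  # x1 appears in both s1 and s2 (via y1)
    [1.0, 0.0],  # x2 appears only in s1 (via y2)
    [0.0, 1.0],  # x3 appears only in s2 (via y3)
])
# Cov(s): Var(s_i) = 2*sigma_y2 and Cov(s1, s2) = Var(y1) = sigma_y2.
cov_ss = sigma_y2 * np.array([[2.0, 1.0], [1.0, 2.0]])

x_hat = cov_xs @ np.linalg.solve(cov_ss, s)
eps_hat = x_hat * sigma_eps2 / sigma_x2
print(x_hat)    # matches x1_hat = x2_hat = 4/5, x3_hat = 0 above
print(eps_hat)  # matches eps1_hat = eps2_hat = 1/5, eps3_hat = 0
```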

@Mahmoud Ghandi: I expect you are correct. My answer came from having started down a road, found an error, expanded the discussion, and so I continued down the same road. – 2011-04-06