
Let's say I have three random variables $X_1, X_2, X_b$, where "$b$" stands for "background". Each of them is Gaussian, $X_i \sim N(\mu_i, \sigma_i^2)$ for $i \in \{1, 2, b\}$, and I will assume $\mu_b = 0$. Now I perform $N$ experiments, each of which measures $X_1 + X_b$ and $X_2 + X_b$ (where $X_b$ is measured at the same time for both of them), and I want to estimate all the $\mu_i$'s and $\sigma_i$'s.

I know how to estimate the means by taking the average of the results (because $\mu_b = 0$). I can also easily estimate $\sigma_j^2 + \sigma_b^2$, using the fact that $X_j + X_b \sim N(\mu_j, \sigma_j^2 + \sigma_b^2)$... but I need another estimator so I can separate out all the individual $\sigma$'s!
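To make the setup concrete, here is a minimal simulation sketch of these two estimators (the sample mean and the sample variance of each measured series). The parameter values are illustrative assumptions, not taken from the question:

```python
# Sketch: simulate N experiments measuring Y_j = X_j + X_b and
# estimate mu_j and sigma_j^2 + sigma_b^2. Parameter values below
# are assumed for illustration only.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
mu1, mu2 = 1.0, 2.0
s1, s2, sb = 0.5, 0.8, 0.3   # sigma_1, sigma_2, sigma_b (assumed)

xb = rng.normal(0.0, sb, N)        # shared background, mu_b = 0
y1 = rng.normal(mu1, s1, N) + xb   # measured X_1 + X_b
y2 = rng.normal(mu2, s2, N) + xb   # measured X_2 + X_b

mu1_hat = y1.mean()        # estimates mu_1 (since mu_b = 0)
mu2_hat = y2.mean()        # estimates mu_2
v1_hat = y1.var(ddof=1)    # estimates sigma_1^2 + sigma_b^2
v2_hat = y2.var(ddof=1)    # estimates sigma_2^2 + sigma_b^2
```

As the question says, this only pins down the sums $\sigma_j^2 + \sigma_b^2$, so one more estimator is needed.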

I thought about using the off-diagonal element of the covariance matrix (which is supposed to equal $\sigma_b^2$), but I run into a big problem when the sample value comes out negative. Can someone help me find the missing estimator?
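A quick sketch of that off-diagonal estimator: assuming $X_1$, $X_2$, $X_b$ are independent, $\operatorname{Cov}(Y_1, Y_2) = \sigma_b^2$, so the sample covariance of the two measured series targets $\sigma_b^2$, but for small samples it can indeed be negative (parameter values are again illustrative assumptions):

```python
# Sketch: the sample covariance of Y_1 and Y_2 estimates sigma_b^2,
# assuming X_1, X_2, X_b are mutually independent. Parameters assumed.
import numpy as np

rng = np.random.default_rng(1)
sb = 0.3   # sigma_b (assumed)

def sample_cov(n):
    """Simulate n paired measurements and return their sample covariance."""
    xb = rng.normal(0.0, sb, n)
    y1 = rng.normal(1.0, 0.5, n) + xb
    y2 = rng.normal(2.0, 0.8, n) + xb
    return np.cov(y1, y2)[0, 1]   # off-diagonal entry

cov_big = sample_cov(100_000)   # close to sb**2 = 0.09
cov_small = sample_cov(5)       # noisy; can come out negative
```

The estimator is unbiased for $\sigma_b^2$, but nothing constrains each individual sample value to be non-negative, which is exactly the problem raised above.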

  • But I know that $V(Y_1) - V(Y_2) = \sigma_1^2 - \sigma_2^2$, and then, using your equation, I find that $\sigma_2^2 = \frac{V(Y_1 - Y_2) - V(Y_1) + V(Y_2)}{2}$, and I think the right side can be negative for some values of $Y_1, Y_2$ :( (2017-02-20)
  • I know they are not equal. But using the fact that $V(Y_j) = \sigma_j^2 + \sigma_b^2$, I can get what I wrote above after some algebra... and it's not always negative (I think). (2017-02-20)
  • If my comment is not helpful, please disregard it and try something else. (2017-02-20)
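The identity in the first comment can be checked numerically: since the shared background cancels in $Y_1 - Y_2$, we get $V(Y_1 - Y_2) = \sigma_1^2 + \sigma_2^2$, and hence $\big(V(Y_1 - Y_2) - V(Y_1) + V(Y_2)\big)/2 = \sigma_2^2$. A sketch under assumed parameter values:

```python
# Numerical check of the comment's identity:
#   sigma_2^2 = (V(Y1 - Y2) - V(Y1) + V(Y2)) / 2,
# which holds because X_b cancels exactly in Y1 - Y2.
# Parameter values are assumed for illustration.
import numpy as np

rng = np.random.default_rng(2)
N = 200_000
s1, s2, sb = 0.5, 0.8, 0.3   # sigma_1, sigma_2, sigma_b (assumed)

xb = rng.normal(0.0, sb, N)
y1 = rng.normal(1.0, s1, N) + xb
y2 = rng.normal(2.0, s2, N) + xb

s2sq_hat = (np.var(y1 - y2, ddof=1)
            - np.var(y1, ddof=1)
            + np.var(y2, ddof=1)) / 2   # targets s2**2 = 0.64
```

For large $N$ this lands near $\sigma_2^2$, though, as with the covariance estimator, individual sample values are not guaranteed to be non-negative.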

0 Answers