
$\hat{\theta}_{1}$ and $\hat{\theta}_{2}$ are unbiased, dependent estimators of $\theta$, with variances $\sigma_1^2$ and $\sigma_2^2$ and correlation coefficient $\rho$. I found that when $$\lambda = \frac{\sigma_2^{2} - \rho \sigma_1 \sigma_2}{\sigma_1^{2} + \sigma_2^{2} - 2 \rho \sigma_1 \sigma_2},$$ the unbiased estimator $\lambda \hat{\theta}_{1} + (1 - \lambda) \hat{\theta}_{2}$ minimizes the mean squared error.
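As a quick sanity check of that $\lambda$ formula, the sketch below compares the closed-form minimizer against a grid search over the variance of the combination, $\operatorname{Var}(\lambda\hat{\theta}_1 + (1-\lambda)\hat{\theta}_2) = \lambda^2\sigma_1^2 + (1-\lambda)^2\sigma_2^2 + 2\lambda(1-\lambda)\rho\sigma_1\sigma_2$. The numeric values of $\sigma_1$, $\sigma_2$, $\rho$ are arbitrary choices for illustration, not from the question:

```python
import numpy as np

# Arbitrary illustrative values (not given in the question)
s1, s2, rho = 1.0, 2.0, 0.3

def var_comb(lam, s1, s2, rho):
    # Variance of lam*theta1_hat + (1-lam)*theta2_hat for
    # unbiased estimators with correlation rho
    return (lam**2 * s1**2
            + (1 - lam)**2 * s2**2
            + 2 * lam * (1 - lam) * rho * s1 * s2)

# Closed-form minimizer from the question
lam_star = (s2**2 - rho * s1 * s2) / (s1**2 + s2**2 - 2 * rho * s1 * s2)

# Numeric check: brute-force grid search over lambda
grid = np.linspace(-1.0, 2.0, 300001)
lam_num = grid[np.argmin(var_comb(grid, s1, s2, rho))]

print(lam_star, lam_num)
```

The grid minimizer should agree with the closed-form $\lambda$ to within the grid spacing, and the resulting variance should be no larger than $\min(\sigma_1^2, \sigma_2^2)$.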

For what values of $\rho$ does the linear combination of the unbiased estimators fail to reduce the variance (equivalently, the MSE, since the combination is unbiased)?

Thanks.

  • What is $\rho$, and what do you mean by the linear combination not reducing variance or MSE? (2017-02-22)
  • $\rho$ is the correlation coefficient. How do I find some $\rho$ such that the unbiased estimator $\lambda \hat{\theta}_1 + (1 - \lambda) \hat{\theta}_2$ does not reduce the MSE compared to the MSEs of $\hat{\theta}_1$ and $\hat{\theta}_2$? (2017-02-22)
  • What you're asking, then, is when the optimal linear combination is the one with $\lambda = 0$ or $\lambda = 1$. This occurs when $\rho = \frac{\sigma_2}{\sigma_1}$ or $\rho = \frac{\sigma_1}{\sigma_2}$, respectively. (2017-02-22)
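The claim in the last comment can be checked by substituting those values of $\rho$ into the formula for $\lambda$ (assuming $\sigma_1 \neq \sigma_2$, and noting that $|\rho| \le 1$ means only the ratio with the smaller variance on top is feasible):

```latex
\lambda\Big|_{\rho = \sigma_2/\sigma_1}
= \frac{\sigma_2^2 - \tfrac{\sigma_2}{\sigma_1}\,\sigma_1\sigma_2}
       {\sigma_1^2 + \sigma_2^2 - 2\,\tfrac{\sigma_2}{\sigma_1}\,\sigma_1\sigma_2}
= \frac{\sigma_2^2 - \sigma_2^2}{\sigma_1^2 - \sigma_2^2} = 0,
\qquad
\lambda\Big|_{\rho = \sigma_1/\sigma_2}
= \frac{\sigma_2^2 - \sigma_1^2}{\sigma_2^2 - \sigma_1^2} = 1.
```

In the first case the combination collapses to $\hat{\theta}_2$ alone, and in the second to $\hat{\theta}_1$ alone, so no linear combination improves on the better single estimator.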

0 Answers