$\hat{\theta}_{1}$ and $\hat{\theta}_{2}$ are unbiased, dependent estimators of $\theta$, with standard deviations $\sigma_1$, $\sigma_2$ and correlation $\rho$. I found that when $$\lambda = \frac{{\sigma_2}^{2} - \rho \sigma_1 \sigma_2}{{\sigma_1}^{2} + {\sigma_2}^{2} - 2 \rho \sigma_1 \sigma_2},$$ the unbiased combination $\lambda \hat{\theta}_{1} + (1 - \lambda) \hat{\theta}_{2}$ minimizes the mean squared error (equivalently, since the combination is unbiased, its variance).
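As a sanity check (not part of the question itself), here is a small numerical sketch that compares the stated $\lambda$ against a brute-force grid minimization of $\operatorname{Var}(\lambda \hat{\theta}_{1} + (1-\lambda)\hat{\theta}_{2}) = \lambda^2 \sigma_1^2 + (1-\lambda)^2 \sigma_2^2 + 2\lambda(1-\lambda)\rho\sigma_1\sigma_2$. The values of $\sigma_1$, $\sigma_2$, $\rho$ are arbitrary example inputs:

```python
import numpy as np

# Arbitrary example parameters (assumed for illustration only)
s1, s2, rho = 1.0, 2.0, 0.3

def comb_var(lam):
    # Variance of lam*t1 + (1-lam)*t2 for correlated estimators
    return (lam**2 * s1**2
            + (1 - lam)**2 * s2**2
            + 2 * lam * (1 - lam) * rho * s1 * s2)

# Closed-form minimizer from the question
lam_star = (s2**2 - rho * s1 * s2) / (s1**2 + s2**2 - 2 * rho * s1 * s2)

# Brute-force check on a fine grid
grid = np.linspace(-1.0, 2.0, 300001)
lam_grid = grid[np.argmin(comb_var(grid))]

print(lam_star, lam_grid)  # the two should agree to grid resolution
```

The variance is a quadratic in $\lambda$ with positive leading coefficient $\sigma_1^2 + \sigma_2^2 - 2\rho\sigma_1\sigma_2$ (positive whenever the estimators are not perfectly correlated with equal variances), so the grid minimum should land on the closed-form value.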
How do I find the value of $\rho$ for which this optimal linear combination fails to reduce the variance (equivalently, the MSE) below that of the individual estimators?
Thanks.