Consider two processes, both with mean $\mu$. The variance of the first process is $\sigma^2$ (with sample size $n$), and the variance of the second is $4\sigma^2$ (with sample size $m$). First I proved that $$\hat{\mu}=a\bar{X}+(1-a)\bar{Y}$$ is an unbiased estimator of $\mu$. Now I want to find the value of $a$ that minimizes the variance of $\hat{\mu}$.
My attempt: I found that the variance of $\hat{\mu}$ is $$\operatorname{Var}(\hat{\mu})=a^2\frac{\sigma^2}{n}+(1-a)^2\frac{4\sigma^2}{m}.$$ Can I minimize this by setting the derivative with respect to $a$ equal to $0$?
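For reference, a sketch of carrying that differentiation through (assuming the two samples are independent, so the variance expression above holds):

$$\frac{d}{da}\left(a^2\frac{\sigma^2}{n}+(1-a)^2\frac{4\sigma^2}{m}\right)
= 2a\frac{\sigma^2}{n}-2(1-a)\frac{4\sigma^2}{m}=0
\;\Longrightarrow\;
\frac{a}{n}=\frac{4(1-a)}{m}
\;\Longrightarrow\;
a=\frac{4n}{m+4n}.$$

Since the second derivative $2\sigma^2/n+8\sigma^2/m>0$, the variance is a convex quadratic in $a$, so this critical point is indeed the minimum.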