Suppose that $X$ and $Y$ are independent Poisson random variables with means $3\theta$ and $\theta$, respectively. Consider the combined estimator of $\theta$, $\tilde\theta = k_1 X + k_2 Y$, where $k_1$ and $k_2$ are arbitrary constants. (a) Find the condition on $k_1$ and $k_2$ such that $\tilde\theta$ is an unbiased estimator of $\theta$. (b) For what values of $k_1$ and $k_2$ will the combined estimator $\tilde\theta = k_1 X + k_2 Y$ be an unbiased estimator with the smallest variance among all such linear combinations? (c) Given observations $x$ and $y$, find the maximum likelihood estimate of $\theta$.

I've got part (a), which is $k_1 + k_2 = 1$, and part (b), which is $k_1 = 3/4$ and $k_2 = 1/4$. I just can't get part (c).
Finding the maximum likelihood estimate of $\theta$.
1 Answer
The idea is to write out a likelihood function and find the $\theta$ in the parameter space that maximizes this likelihood given the observations. The likelihood is proportional to the joint density: $$\mathcal L(\theta \mid x,y) \propto f_{X,Y}(x,y \mid \theta) \overset{\text{ind}}{=} e^{-3\theta} \frac{(3\theta)^x}{x!} \, e^{-\theta} \frac{\theta^y}{y!}, \quad \theta > 0, \quad x, y \in \{0, 1, 2, \ldots \}.$$

Thus, as a function of $\theta$, the likelihood is proportional to $$\mathcal L(\theta \mid x,y) \propto e^{-4\theta} \theta^{x+y}.$$ Note we have discarded any factors that are not functions of $\theta$. The log-likelihood is then $$\ell(\theta \mid x,y) = -4\theta + (x+y) \log \theta,$$ and its derivative is $$\frac{\partial \ell}{\partial \theta} = -4 + \frac{x+y}{\theta}.$$

Thus $\ell$ is maximized at a critical point $\hat\theta$ satisfying $\partial \ell/\partial \theta = 0$, or $$\hat \theta = \frac{x+y}{4},$$ and we can check that this is indeed a maximum: the second derivative is $\partial^2 \ell / \partial \theta^2 = -(x+y)/\theta^2 < 0$ whenever $x + y > 0$, so $\ell$ is concave there.
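As a quick sanity check, here is a minimal numerical sketch (assuming hypothetical observations, say $x = 5$ and $y = 2$, which are not part of the original question): maximizing the log-likelihood over $\theta > 0$ numerically recovers the closed-form $\hat\theta = (x+y)/4$.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical observed counts, for illustration only.
x, y = 5, 2

def neg_log_likelihood(theta):
    # Negative of ell(theta | x, y) = -4*theta + (x + y)*log(theta),
    # with terms not involving theta dropped.
    return 4 * theta - (x + y) * np.log(theta)

# Maximize ell by minimizing its negative over a bounded interval of theta > 0.
result = minimize_scalar(neg_log_likelihood, bounds=(1e-9, 100), method="bounded")

print("numerical MLE:       ", result.x)      # approximately 1.75
print("closed form (x+y)/4: ", (x + y) / 4)   # 1.75
```

Any observations with $x + y > 0$ give the same agreement; only the degenerate case $x = y = 0$ has no interior maximizer, since the likelihood is then decreasing in $\theta$.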
This result should call into question your calculation for part (a): note that the expectation of $X$ is $\operatorname{E}[X] = 3\theta$, and the expectation of $Y$ is $\operatorname{E}[Y] = \theta$, thus the expectation of a linear combination of the two is $$\operatorname{E}[k_1 X + k_2 Y] = k_1 \operatorname{E}[X] + k_2 \operatorname{E}[Y] = k_1 (3\theta) + k_2 \theta = (3k_1 + k_2)\theta,$$ and in order for this to equal $\theta$, you must have $3k_1 + k_2 = 1$, not $k_1 + k_2 = 1$. This would also suggest your calculation of the minimum variance unbiased estimator among this family of estimators is not correct.
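To see the corrected constraint in action, here is a small Monte Carlo sketch (using a hypothetical true value $\theta = 2$, chosen only for illustration): any weights satisfying $3k_1 + k_2 = 1$, for example $k_1 = k_2 = 1/4$, give an empirical mean close to $\theta$, whereas the proposed $k_1 = 3/4$, $k_2 = 1/4$ do not.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0      # hypothetical true parameter value
n = 200_000      # number of simulated (X, Y) pairs

X = rng.poisson(3 * theta, size=n)
Y = rng.poisson(theta, size=n)

# Weights satisfying 3*k1 + k2 = 1, e.g. k1 = k2 = 1/4: estimator is unbiased.
print(np.mean(0.25 * X + 0.25 * Y))   # close to theta = 2.0

# The weights proposed in the question, k1 = 3/4, k2 = 1/4:
# E[0.75*X + 0.25*Y] = (3*0.75 + 0.25)*theta = 2.5*theta, so this is biased.
print(np.mean(0.75 * X + 0.25 * Y))   # close to 5.0, not 2.0
```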
@Pkr96 No, that is not correct, since $k_1 = 1/4$ and $k_2 = 3/4$ does not satisfy the constraint $3k_1 + k_2 = 1$. – 2017-02-22