
A single observation is made from a Poisson distribution with unknown mean $\lambda \geq 0$; however, any value greater than $2$ has been rounded down to $2$. Thus we observe a single random variable $X$ whose distribution depends on $\lambda$ and is given by
$$P(X=0) = e^{-\lambda}, \qquad P(X=1) = \lambda e^{-\lambda}, \qquad P(X=2) = 1 - (1+\lambda)e^{-\lambda}.$$
Parameterise the distribution by $\theta = e^{-\lambda} \in (0,1]$ and show that there is a unique unbiased estimator of $\theta$.

So I parameterise it: $P(X=0) = \theta$, $P(X=1) = -\theta\log\theta$, $P(X=2) = 1-(1-\log\theta)\theta$.
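As a quick numerical self-check on this parameterisation (a minimal sketch; the function name `probs` is mine, not from the question):

```python
import math

def probs(theta):
    """Censored-Poisson pmf parameterised by theta = exp(-lambda), theta in (0, 1]."""
    p0 = theta                               # P(X = 0) = e^{-lambda}
    p1 = -theta * math.log(theta)            # P(X = 1) = lambda * e^{-lambda}
    p2 = 1 - (1 - math.log(theta)) * theta   # P(X = 2) = 1 - (1 + lambda) e^{-lambda}
    return p0, p1, p2

# sanity check: probabilities are non-negative and sum to 1 exactly
for theta in (0.1, 0.5, 0.9, 1.0):
    p0, p1, p2 = probs(theta)
    assert all(p >= -1e-12 for p in (p0, p1, p2))
    assert abs(p0 + p1 + p2 - 1) < 1e-12
```

The sum telescopes to $1$ algebraically, so the check passes for every $\theta \in (0,1]$.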

But I have no idea how to show that there is a unique unbiased estimator. (This is not a homework question; it is a practice-paper question.)


1 Answer


Go back to the definitions: an estimator is a function of the observation, so let $u_0$ denote the estimate when the observation is $0$, $u_1$ when it is $1$, and $u_2$ when it is $2$. The estimator is unbiased if
$$ u_0\mathrm e^{-\lambda}+u_1\lambda\mathrm e^{-\lambda}+u_2\bigl(1-(1+\lambda)\mathrm e^{-\lambda}\bigr)=\mathrm e^{-\lambda}. $$
This identity must hold for every $\lambda$, and the estimates $u_0$, $u_1$ and $u_2$ must not depend on $\lambda$; multiplying through by $\mathrm e^{\lambda}$, one asks that
$$ u_0+u_1\lambda+u_2(\mathrm e^{\lambda}-1-\lambda)=1 $$
identically in $\lambda\gt0$. Thus, $\underline{\qquad\qquad\qquad}$.
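The displayed condition can also be probed numerically: the bias of a candidate $(u_0, u_1, u_2)$ must vanish for *every* $\lambda > 0$, so evaluating it on a grid of $\lambda$ values rules candidates out (though a finite grid cannot by itself prove unbiasedness). A minimal sketch, assuming only the pmf from the question; the function name `bias` and the candidate values are mine:

```python
import math

def bias(u0, u1, u2, lam):
    """E[u(X)] - theta for the estimator taking values (u0, u1, u2),
    where theta = exp(-lam), under the censored Poisson model above."""
    p0 = math.exp(-lam)
    p1 = lam * math.exp(-lam)
    p2 = 1 - (1 + lam) * math.exp(-lam)
    return u0 * p0 + u1 * p1 + u2 * p2 - math.exp(-lam)

# arbitrary placeholder candidate, not the answer to the exercise
candidate = (0.8, 0.3, 0.1)
worst = max(abs(bias(*candidate, lam)) for lam in (0.1, 0.5, 1.0, 2.0, 5.0))
print(f"max |bias| on grid for {candidate}: {worst:.4f}")
```

A nonzero maximum on the grid shows the placeholder candidate is biased; solving the displayed identity exactly (e.g. by matching power-series coefficients in $\lambda$) fills in the blank.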
