
A single observation is made from a Poisson distribution with unknown mean $\lambda \geq 0$; however, any value greater than 2 has been rounded down to 2. Thus we have the observed value of a single random variable $X$ whose distribution, depending on $\lambda$, is given by $$P(X=0) = e^{-\lambda}, \qquad P(X=1) = \lambda e^{-\lambda}, \qquad P(X=2) = 1 - (1+\lambda)e^{-\lambda}.$$
Parameterise the distribution by $\theta = e^{-\lambda} \in (0,1]$. Show that there is a unique unbiased estimator of $\theta$.

So I parameterise it: $P(X=0) = \theta$, $P(X=1) = -\theta\log\theta$, $P(X=2) = 1-(1-\log\theta)\theta$.
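(Not part of the question itself, but here is a quick numerical sanity check of this parameterisation, sketched in Python; the test value $\lambda = 1.3$ and the sample size are arbitrary choices of mine.)

```python
import numpy as np

lam = 1.3                       # arbitrary test value for the unknown mean
theta = np.exp(-lam)            # theta = e^{-lambda}

# Probabilities of the censored observation X = min(Poisson(lam), 2),
# first in the lambda-parameterisation, then rewritten in terms of theta.
p_lam = np.array([np.exp(-lam), lam * np.exp(-lam), 1 - (1 + lam) * np.exp(-lam)])
p_theta = np.array([theta, -theta * np.log(theta), 1 - (1 - np.log(theta)) * theta])

print(p_lam, p_lam.sum())           # the three probabilities sum to 1
print(np.allclose(p_lam, p_theta))  # True: the two parameterisations agree

# Empirical check of the censoring: any Poisson draw above 2 is rounded down to 2.
rng = np.random.default_rng(0)
x = np.minimum(rng.poisson(lam, size=200_000), 2)
print(np.bincount(x, minlength=3) / x.size)  # close to p_lam
```

With $\lambda = 1.3$ the three probabilities come out to roughly $0.27$, $0.35$, $0.37$ in both forms.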

But I have no idea how to show there is a unique unbiased estimator. Also, this is not a homework question; it is a practice paper question.

  • 1
    Just a suggestion, I haven't checked if it works: $\theta$ is a function of $X$. Let $\hat \theta$ and $\tilde \theta$ be two estimators of $\theta$, and enumerate $X$ ($\hat \theta$ and $\tilde \theta$ have unique values for all 3 possible X).2012-05-27
  • 0
    Oops, what I meant, of course, was that any *estimator* of $\theta$ is a function of $X$.2012-05-27

1 Answer


Go back to the definitions: an estimator is a function of the observation, hence let us call $u_0$ the estimate if the observation is $0$, $u_1$ if the observation is $1$, and $u_2$ if the observation is $2$. There is no bias if $$ u_0\mathrm e^{-\lambda}+u_1\lambda\mathrm e^{-\lambda}+u_2(1-(1+\lambda)\mathrm e^{-\lambda})=\mathrm e^{-\lambda}. $$ This identity should hold for every $\lambda$ and the estimates $u_0$, $u_1$ and $u_2$ should be independent of $\lambda$, hence one asks that $$ u_0+u_1\lambda+u_2(\mathrm e^{\lambda}-1-\lambda)=1, $$ uniformly over $\lambda\gt0$. Thus, $\underline{\qquad\qquad\qquad}$.
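(To see how the last display pins down the estimates, here is a small numerical check of my own, not part of the answer above: it evaluates $u_0+u_1\lambda+u_2(\mathrm e^{\lambda}-1-\lambda)$ on a grid of $\lambda$ for a few illustrative candidate triples $(u_0,u_1,u_2)$. Since $1$, $\lambda$ and $\mathrm e^{\lambda}-1-\lambda$ are linearly independent as functions of $\lambda$, at most one triple can make the expression identically $1$; the grid and the candidates are arbitrary choices of mine.)

```python
import numpy as np

# Candidate estimators written as triples (u0, u1, u2):
# the value reported when X = 0, X = 1, X = 2 respectively.
# These triples are illustrative guesses, not taken from the answer.
candidates = {
    "(1, 0, 0)": (1.0, 0.0, 0.0),
    "(0, 1, 0)": (0.0, 1.0, 0.0),
    "(1, 1, 0)": (1.0, 1.0, 0.0),
    "(1, 0, 1)": (1.0, 0.0, 1.0),
}

lam = np.linspace(0.01, 10.0, 500)  # grid of lambda values to test the identity on

for name, (u0, u1, u2) in candidates.items():
    # e^lambda times E[u(X)]; unbiasedness for theta means this equals 1 for every lambda.
    lhs = u0 + u1 * lam + u2 * (np.exp(lam) - 1 - lam)
    print(name, "identically 1 on the grid:", bool(np.allclose(lhs, 1.0)))
```

Only the first triple passes on the grid, which is the kind of observation the comments below are driving at.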

  • 0
    I don't understand: $u_2(1-(1+\lambda)e^{-\lambda}) = u_2(1-e^{-\lambda}-\lambda e^{-\lambda})$? So how have you divided by $e^{-\lambda}$?2012-05-27
  • 0
    Yes. Or, equivalently, multiplied everything by $e^\lambda$.2012-05-27
  • 0
    I still don't understand how you have divided by $e^{-\lambda}$? You would end up with $(u_2/e^{-\lambda})-1-\lambda$.2012-06-02
  • 0
    To divide by $e^{-\lambda}$ is to multiply by $e^\lambda$. And $e^\lambda(1-(1+\lambda)e^{-\lambda})=e^\lambda-(1+\lambda)=e^\lambda-1-\lambda$ since $e^\lambda e^{-\lambda}=1$.2012-06-02
  • 0
    Of course! Sorry for missing that; I don't know how I did. But I'm sorry to say I don't really understand what's going on here. I can see why you have said and done what you have, but where does this lead us?2012-06-03
  • 0
    Do you know some $u_0$, $u_1$ and $u_2$ such that $u_0+u_1\lambda+u_2(\mathrm e^\lambda-1-\lambda)=1$ for every $\lambda\gt0$?2012-06-03
  • 0
    No? Sorry to be useless, but I just really don't understand this question. Are $u_1$ etc. meant to be numbers, or...?2012-06-04
  • 0
    Yes: do you know some numbers $u_0$, $u_1$ and $u_2$ such that for every positive $\lambda$...2012-06-04
  • 0
    An estimator is not a function of observations.2012-12-07
  • 1
    @SeyhmusGüngören Oh yeah? Please share your thoughts.2013-01-09
  • 0
    @did Yes. An estimate is a function of observations, not an estimator. How can you calculate the variance of a function of some data samples? There is a single variance for each estimator, independent of what the observations are.2013-01-09
  • 1
    @SeyhmusGüngören An estimate is the image of the observations under some function. The function itself is an estimator. Hence the phrase you object to is correct, while the subsequent uses of *estimator*, which you did not object to, can be replaced, if one is nitpicky, by *estimate*.2013-01-09
  • 0
    @did Who is your other fan, apart from me, upvoting your comments? :)2013-01-09