
Let $X_1, \ldots, X_n$ be i.i.d. random variables with pdf $\frac{1}{\lambda} e^{-x/\lambda}\, I(x > 0)$.

The goal is to find the best unbiased estimator of $h(\lambda) = e^{-\lambda}$ (incidentally, this equals $P(X_1 > \lambda^2)$). I'm currently tutoring some graduate students for a first-year qualifying examination, and in an embarrassing turn of events I've been unable to figure out the solution to this problem from a past exam. I'm sure I'm missing something obvious.

Some thoughts: the MLE of $h(\lambda)$ is $e^{-\bar X}$. Its expectation is easy to compute, and it shows the MLE is hopelessly biased:

$$E\left[e^{-\bar X}\right] = M_{\bar X}(-1) = \left[M_{X_1}\!\left(-\tfrac{1}{n}\right)\right]^n = \left(1 + \frac{\lambda}{n}\right)^{-n},$$

where $M_Y$ is the moment generating function of $Y$. I can think of only two approaches available to the students. One is to "guess" a $g(\bar X)$ such that $E[g(\bar X)] = e^{-\lambda}$. The other is to find an unbiased estimator of $e^{-\lambda}$ and Rao-Blackwellize it (an earlier part of the same question essentially sets up the machinery for such a calculation, so it is somewhat hinted that this might work).
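As a quick sanity check on that expectation formula, here is a short Monte Carlo sketch (not part of the exam problem; the values $\lambda = 2$, $n = 5$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n, reps = 2.0, 5, 200_000

# Scale parametrization, as in the question: E[X] = lam, pdf (1/lam) * exp(-x/lam).
X = rng.exponential(scale=lam, size=(reps, n))
mle = np.exp(-X.mean(axis=1))        # the MLE of e^{-lambda}

print(mle.mean())                    # ~ 0.1859, matching the formula below
print((1 + lam / n) ** (-n))         # exact expectation: (1 + lam/n)^{-n}
print(np.exp(-lam))                  # target e^{-lam} ~ 0.1353 -- clearly biased
```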

  • @Didier Piau: $\bar X$ is complete and sufficient, so the best unbiased estimator must be a function of $\bar X$.

2 Answers


The distribution of $S = n\bar X_n$ has density
$$P_S(\mathrm{d}s) = \frac{s^{n-1}}{(n-1)!}\,\mathrm{e}^{-s/\lambda}\,\frac{\mathrm{d}s}{\lambda^n}.$$
Hence, for every $k \ge 0$,
$$E(S^k) = \frac{(n+k-1)!}{(n-1)!}\,\lambda^k.$$
This yields
$$\mathrm{e}^{-\lambda} = \sum_{k \ge 0} (-1)^k \frac{\lambda^k}{k!} = \sum_{k \ge 0} (-1)^k \frac{(n-1)!}{k!\,(n+k-1)!}\, E(S^k),$$
which proves that an unbiased estimator of $h(\lambda) = \mathrm{e}^{-\lambda}$ is $H_n(\bar X_n)$, where
$$H_n(x) = (n-1)! \sum_{k \ge 0} (-1)^k \frac{n^k}{k!\,(n+k-1)!}\, x^k = {}_0F_1(;\, n;\, -nx).$$

  • Thanks! It seems obvious now. An estimator based on just $S$ is fine as well, so $\sum_{k \ge 0} \frac{(-1)^k\, \Gamma(n)\, S^k}{k!\, \Gamma(n+k)}$ works as the final answer.
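For anyone who wants to double-check the series numerically: ${}_0F_1$ is available as `scipy.special.hyp0f1`, and a Monte Carlo average of $H_n(\bar X_n)$ does land on $e^{-\lambda}$. A sketch, with arbitrary $\lambda = 2$, $n = 5$:

```python
import numpy as np
from scipy.special import hyp0f1

rng = np.random.default_rng(0)
lam, n, reps = 2.0, 5, 200_000

# Scale parametrization, as in the question.
X = rng.exponential(scale=lam, size=(reps, n))

# H_n(xbar) = 0F1(; n; -n * xbar), the unbiased estimator derived above.
est = hyp0f1(n, -n * X.mean(axis=1))

print(est.mean())      # ~ e^{-lam}
print(np.exp(-lam))    # 0.1353...
```

One quirk worth knowing: ${}_0F_1(;\,n;\,-nx)$ oscillates for large $x$, so this estimator can go negative even though $e^{-\lambda} > 0$; that is the usual price of exact unbiasedness here.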

$e^{-\lambda} = \Pr(X_1 > 1)$. So let $ Y = \begin{cases}1 & \text{if }X_1> 1 \\ 0 & \text{otherwise}\end{cases} $ Then $Y$ is an unbiased estimator of $e^{-\lambda}$. By Rao-Blackwell, you just need $E(Y\mid X_1+\cdots+X_n) = \Pr(X_1>1 \mid X_1+\cdots+X_n)$.

I think that's not hard to find.

And since you have completeness, you're done!

  • This would be correct, but unfortunately you are using the rate parametrization whereas we are given the scale parametrization: $e^{-\lambda} = P(X_1 > \lambda^2) \ne P(X_1 > 1)$.
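For completeness, under the rate parametrization (pdf $\lambda e^{-\lambda x}$), where the identity $e^{-\lambda} = \Pr(X_1 > 1)$ from this answer does hold, the Rao-Blackwell step comes out in closed form: given $S = X_1 + \cdots + X_n$, one has $X_1/S \sim \mathrm{Beta}(1, n-1)$, so $\Pr(X_1 > 1 \mid S) = (1 - 1/S)^{n-1}$ on $\{S > 1\}$ and $0$ otherwise. A sketch checking this numerically (the values $\lambda = 1.3$, $n = 10$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n, reps = 1.3, 10, 200_000

# Rate parametrization: pdf lam * exp(-lam * x), so numpy's scale = 1/lam.
X = rng.exponential(scale=1 / lam, size=(reps, n))
S = X.sum(axis=1)

# Rao-Blackwellized indicator: E[1{X_1 > 1} | S] = (1 - 1/S)^{n-1} on {S > 1}.
est = np.clip(1 - 1 / S, 0, None) ** (n - 1)

print(est.mean())      # ~ e^{-lam}
print(np.exp(-lam))    # 0.2725...
```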