Let $X_1, \ldots, X_n$ be iid random variables with pdf $\frac{1}{\lambda} e^{-x/\lambda} I(x > 0)$.
The goal is to find the best unbiased estimator of $h(\lambda) = e^{-\lambda}$ (incidentally, this equals $P(X_1 > \lambda^2)$). I'm currently tutoring some graduate students for a first-year qualifying examination, and in an embarrassing turn of events I've been unable to figure out the solution to this problem from a past exam. I'm sure I'm missing something obvious.
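To spell out the parenthetical claim: the survival function of this density is $P(X_1 > t) = e^{-t/\lambda}$ for $t > 0$, so taking $t = \lambda^2$ gives
$$P(X_1 > \lambda^2) = e^{-\lambda^2/\lambda} = e^{-\lambda} = h(\lambda).$$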
Some thoughts: the MLE of $h(\lambda)$ is $e^{-\bar X}$, by invariance, since $\bar X$ is the MLE of $\lambda$. Its expectation is easy to compute, but the estimator is hopelessly biased:
$$E\left[e^{-\bar X}\right] = M_{\bar X}(-1) = \left[M_{X_1}\left(-\tfrac{1}{n}\right)\right]^n = \left(1 + \frac{\lambda}{n}\right)^{-n},$$
where $M_Y$ denotes the moment generating function of $Y$. I can think of only two approaches available to the students. One is to "guess" a $g(\bar X)$ such that $E[g(\bar X)] = e^{-\lambda}$. The other is to find an unbiased estimator of $e^{-\lambda}$ and Rao-Blackwellize it (a prior part of the same question essentially sets up the calculation needed to carry this out, so it is somewhat hinted that this approach might work).
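Not that it helps with the actual question, but here is a quick Monte Carlo sanity check of the bias calculation above (Python/NumPy; the particular choices $\lambda = 2$, $n = 10$, and the number of replications are arbitrary):

```python
# Quick Monte Carlo check that E[exp(-Xbar)] = (1 + lambda/n)^{-n},
# i.e. that exp(-Xbar) is biased for exp(-lambda). Values below are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
lam, n, reps = 2.0, 10, 200_000

# reps samples of size n from the Exponential distribution with mean lam
xbar = rng.exponential(scale=lam, size=(reps, n)).mean(axis=1)

print(np.exp(-xbar).mean())    # Monte Carlo estimate of E[exp(-Xbar)]
print((1 + lam / n) ** (-n))   # closed form (1 + lambda/n)^{-n}
print(np.exp(-lam))            # target h(lambda) = exp(-lambda)
```

For $\lambda = 2$ and $n = 10$ the first two numbers should agree (around $0.16$), noticeably above $e^{-2} \approx 0.135$.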