Let $X_1, \ldots, X_n$ be iid copies of a random variable $X$. The goal is to find an estimator of $\theta = \mathbb{P}(X > t)$ for a given $t > 0$.
1) Show that $\hat{\theta}_1 = 1 - F_n(t)$ is an unbiased estimator of $\theta$ and find its MSE ($F_n$ is the empirical distribution function).
2) Now let $X_1, \ldots, X_n$ be exponential($\lambda$). Find the UMVUE of $\theta$. (Hint: $\sum_{i=1}^n X_i$ and $X_1/\sum_{i=1}^n X_i$ are independent, and the pdf of $X_1/\sum_{i=1}^n X_i$ is $(n-1)(1-x)^{n-2} \mathbb{I}_{\{ x \in (0,1) \}}$.)
Some thoughts
1) If I'm not wrong:
$ \hat{\theta}_1 = 1 - F_n(t) = \frac{1}{n} \sum_{i=1}^{n} \mathbb{I}_{\{X_i > t \}} $
Therefore $n \hat{\theta}_1 \sim \mathrm{Binomial}(n, 1 - F_X(t))$ and then:
$ \mathbb{E}(\hat{\theta}_1) = \frac{1}{n}\mathbb{E}(n\hat{\theta}_1) = 1 - F_X(t) = \mathbb{P}(X > t) = \theta \qquad \mathrm{(unbiased)} $
$ \mathrm{MSE}(\hat{\theta}_1) = \mathbb{V}(\hat{\theta}_1) = \frac{1}{n^2}\mathbb{V}(n\hat{\theta}_1) = \frac{\theta(1-\theta)}{n} $
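As a quick sanity check, here is a minimal Monte Carlo sketch (the distribution of $X$, the values of $n$ and $t$, and all other parameters are arbitrary choices of mine, not part of the exercise) confirming unbiasedness and the MSE formula:

```python
import numpy as np

rng = np.random.default_rng(0)
n, t, reps = 20, 1.0, 200_000

# Illustrative choice: X ~ Exponential with mean 2, so theta = exp(-t/2)
theta = np.exp(-t / 2)
samples = rng.exponential(scale=2, size=(reps, n))

# theta_hat_1 = 1 - F_n(t) = fraction of the sample exceeding t
theta_hat = (samples > t).mean(axis=1)

print("mean of estimator:", theta_hat.mean())            # ~ theta (unbiased)
print("theta:            ", theta)
print("empirical MSE:    ", ((theta_hat - theta) ** 2).mean())
print("theta*(1-theta)/n:", theta * (1 - theta) / n)      # theoretical MSE
```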
2) We know that $\hat{\theta}_1$ is an unbiased estimator of $\theta = e^{-t/\lambda}$ (parameterizing the exponential by its mean $\lambda$) and $S = \sum_{i=1}^n X_i$ is a complete sufficient statistic for $\lambda$. Therefore, by Lehmann–Scheffé, $U = \mathbb{E}(\hat{\theta}_1 \mid S)$ is the UMVUE of $\theta$. The problem is that I'm unable to compute $U$ or find an alternative solution.
Edit: Alternative solution
I think I've found an alternative (and rather tedious) solution that doesn't make use of the hint. I was trying to write it as a comment, but it was too long. Mr. Hardy's answer is clearer and better in every sense, so I apologize in advance.
From Michael Hardy's answer:
> Notice that $W=\mathbb{I}_{\{X_1>t\}}$ is an unbiased estimator of $\theta$, so the Rao–Blackwell estimator is $\mathbb{E}(W\mid S)$. Because of completeness, [...] all unbiased estimators based on the sufficient statistic $S$ will be the same. So we seek $\mathbb{E}(W\mid S) = \Pr(X_1>t\mid S)$, and this must be equal to $\mathbb{E}(\hat{\theta}_1 \mid S)$.
The conditional pdf of $X_1$ given $S = s$ is:
$f_{X_1 | S = s}(x) = \frac{f_{X_1,S}(x,s)}{f_S(s)} $
We know that $S$ is a Gamma r.v. because it is the sum of $n$ iid Exponential r.v.s: $f_S(s) = \frac{\lambda^{-n} s^{n-1} e^{-s/\lambda}}{(n-1)!}$
The joint density can be factored as $f_{X_1,S}(x,s) = f_{X_1}(x)\,f_{S | X_1 = x}(s)$.
It should also be noted that: $ \begin{align} \mathbb{P}(S \leq s \mid X_1 = x) &= \mathbb{P} \left(\left. x + \sum_{i=2}^n X_i \leq s \; \right|\; X_1 = x \right) \\ &= \mathbb{P} \left(\left. \sum_{i=2}^n X_i \leq s -x \; \right|\; X_1 = x\right) \\ &= \mathbb{P} \left(\sum_{i=2}^n X_i \leq s -x\right) \end{align} $ where the last equality uses the independence of $X_1$ and $X_2, \ldots, X_n$, and $\sum_{i=2}^n X_i$ is a Gamma$(n-1, \lambda)$ r.v.
Thus, differentiating with respect to $s$:
$ f_{S | X_1 = x}(s) = f_{\sum_{i=2}^n X_i}(s-x) = \frac{\lambda^{-(n-1)} (s-x)^{n-2}e^{-(s-x)/\lambda}}{(n-2)!} $
If $S = s$, then $0 \leq X_1 \leq s$. Substituting the three densities above and simplifying, it turns out that:
$ f_{X_1 | S = s}(x) = (n-1) \frac{(s-x)^{n-2}}{s^{n-1}} \, \, \mathbb{I}_{\{0 \leq x \leq s\}} $
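Spelling out the simplification (all three densities come from the formulas above):

$ f_{X_1 | S = s}(x) = \frac{f_{X_1}(x)\, f_{S | X_1 = x}(s)}{f_S(s)} = \frac{\lambda^{-1} e^{-x/\lambda} \cdot \dfrac{\lambda^{-(n-1)} (s-x)^{n-2} e^{-(s-x)/\lambda}}{(n-2)!}}{\dfrac{\lambda^{-n} s^{n-1} e^{-s/\lambda}}{(n-1)!}} = \frac{(n-1)!}{(n-2)!} \cdot \frac{(s-x)^{n-2}}{s^{n-1}} = (n-1) \frac{(s-x)^{n-2}}{s^{n-1}} $

The powers of $\lambda$ and the exponential factors cancel completely, which is why the conditional density is free of $\lambda$, as it must be since $S$ is sufficient.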
Finally:
$ \mathbb{P}(X_1 > t \mid S = s) = \int_{t}^{s} (n-1)\frac{(s-x)^{n-2}}{s^{n-1}} \, \mathrm{d}x = \frac{n-1}{s^{n-1}}\left[- \frac{(s-x)^{n-1}}{n-1} \right]^{s}_{t} = \left(1 - \frac{t}{s}\right)^{n-1} $ for $s > t$ (and the probability is $0$ when $s \leq t$, since then $X_1 \leq s \leq t$).
This is exactly the same result as in Michael Hardy's answer.
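For comparison, here is a sketch of the route the hint suggests: writing $V = X_1/S$, the independence of $V$ and $S$ gives, for $s > t$,

$ \mathbb{P}(X_1 > t \mid S = s) = \mathbb{P}\left(V > \frac{t}{s} \,\middle|\, S = s\right) = \mathbb{P}\left(V > \frac{t}{s}\right) = \int_{t/s}^{1} (n-1)(1-v)^{n-2} \, \mathrm{d}v = \left(1 - \frac{t}{s}\right)^{n-1} $

And a quick numerical sanity check (again a minimal sketch with arbitrary parameter choices of mine) that $U = \left(1 - t/S\right)^{n-1}\mathbb{I}_{\{S > t\}}$ is unbiased for $\theta = e^{-t/\lambda}$ and improves on $\hat{\theta}_1$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, t, lam, reps = 10, 1.0, 2.0, 200_000   # lam is the mean of the exponential
theta = np.exp(-t / lam)

x = rng.exponential(scale=lam, size=(reps, n))
s = x.sum(axis=1)

theta_hat_1 = (x > t).mean(axis=1)                    # 1 - F_n(t)
umvue = np.where(s > t, (1 - t / s) ** (n - 1), 0.0)  # (1 - t/S)^(n-1), 0 if S <= t

print("theta:", theta)
print("means:", theta_hat_1.mean(), umvue.mean())     # both ~ theta (unbiased)
print("MSEs :", ((theta_hat_1 - theta) ** 2).mean(),
      ((umvue - theta) ** 2).mean())                  # the UMVUE's is smaller
```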