  1. If $X_1,X_2,\ldots,X_n$ are i.i.d. $\mathrm{B}(1,p)$, find the best unbiased estimator of $p^n$.

Attempt: Use indicator functions to show every observation has mean equal to 1, so this is the same as the sum of the $x_i$'s equaling $n$. So this is a sufficient statistic and the best unbiased estimator.

  2. Let $X_1,X_2,\ldots,X_n$ be i.i.d. $\mathcal{N}(\mu,1)$. If $\bar{x}$ attains the Cramér–Rao lower bound as an unbiased estimator of $\mu$, find the Fisher information in $(X_1,X_2,\ldots,X_n)$.

Fisher information = 1/(Cramér–Rao lower bound).

  3. If $X_1,X_2,\ldots,X_n$ are i.i.d. Uniform on $(\theta-\frac{1}{4},\theta + \frac{1}{4})$, find a sufficient statistic for $\theta$, find an unbiased estimator of $\theta$ based on $\bar{x}$, and determine whether you can improve it.

So $\bar{x}$ is a sufficient statistic, and $E[\text{unbiased estimator} \mid \text{sufficient statistic}]$ is the best unbiased estimator.

  • B(1,p) indicates a Bernoulli variable. I meant $p^n$ in the first question. (2011-10-01)

2 Answers


For the first question, the best unbiased estimator is $\chi\left(\sum_i x_i = n\right)$ as you wrote, because the joint probability function for the $n$ observations:

$ \mathbb{P}\left( X_1=x_1, \ldots, X_n=x_n \right)=p^{x_1}(1-p)^{1-x_1} \cdots p^{x_n} (1-p)^{1-x_n} = p^{\sum_i x_i} (1-p)^{n - \sum_i x_i} $

depends on the sample only through $\sum_i x_i$, so $\sum_i x_i$ is a complete sufficient statistic. The indicator is unbiased, since $\mathbb{E}\left[\chi\left(\sum_i x_i = n\right)\right] = \mathbb{P}(X_1 = \cdots = X_n = 1) = p^n$, and it is a function of the complete sufficient statistic, so by the Lehmann–Scheffé theorem it is the best unbiased estimator.
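As a quick numerical sanity check (not part of the original argument), here is a minimal Monte Carlo sketch, assuming Python with numpy; the values of `n`, `p`, and the replication count are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, reps = 5, 0.7, 200_000

# Draw `reps` samples of size n from Bernoulli(p).
x = rng.binomial(1, p, size=(reps, n))

# The proposed estimator: indicator that all n observations equal 1.
estimates = (x.sum(axis=1) == n).astype(float)

print("empirical mean of estimator:", estimates.mean())
print("target p^n:                 ", p**n)
```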

For the second question, $\bar{x}=\frac{1}{n} \sum_{i=1}^n x_i$ is the BUE for $\mu$. The factor of the likelihood that depends on this statistic is $\exp\left(-\frac{n}{2} \left( \mu - \bar{x} \right)^2 \right)$.

The variance of $\bar{x}$ is $\mathrm{Var}(\bar{x}) = \frac{1}{n^2} \sum_i \mathrm{Var}(x_i) = \frac{1}{n^2} \cdot n = \frac{1}{n}$. Since $\bar{x}$ attains the Cramér–Rao lower bound by assumption, the Fisher information is $\mathcal{I}(\mu) = \frac{1}{\mathrm{Var}(\bar{x})} = n$.
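This can be illustrated numerically (an addition, not part of the original answer): a simulation sketch in Python with numpy can estimate the Fisher information as the variance of the score $\frac{\partial}{\partial\mu}\log L = \sum_i (x_i - \mu)$ and cross-check $\mathrm{Var}(\bar{x}) = 1/n$. The specific `n`, `mu`, and replication count are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n, mu, reps = 10, 2.0, 200_000

x = rng.normal(mu, 1.0, size=(reps, n))

# Score of the N(mu, 1) sample: d/dmu of the log-likelihood = sum_i (x_i - mu).
score = (x - mu).sum(axis=1)

# Fisher information is the variance of the score; should be close to n.
print("empirical Fisher information:", score.var())
print("n:                           ", n)

# Cross-check: Var(x_bar) should be 1/n, the Cramer-Rao bound.
print("empirical Var(x_bar):", x.mean(axis=1).var(), " 1/n:", 1 / n)
```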

For the third question, the joint density of the sample is $ f = 2^n \chi_{\theta-\frac{1}{4} \le \min(x_1,\ldots, x_n)} \chi_{\theta+\frac{1}{4} \ge \max(x_1,\ldots,x_n)} = 2^n \chi_{ \max(x_1,\ldots,x_n) -\frac{1}{4} \le \theta \le \min(x_1, \ldots,x_n) + \frac{1}{4} }. $ Thus the pair $\left(\min_i x_i, \max_i x_i\right)$, suitably shifted, is a sufficient statistic, and $\theta$ can lie anywhere in the indicated interval. By the symmetry of the uniform distribution about $\theta$, the midrange $\frac{1}{2}\left(\min_i x_i + \max_i x_i\right)$ is an unbiased estimator of $\theta$ and a natural choice; see the sketch below.
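Here is a minimal simulation sketch (not in the original answer, assuming Python with numpy) comparing the midrange with the sample mean; both come out unbiased, but the midrange's variance shrinks like $1/n^2$ rather than $1/n$. The parameter values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta, reps = 20, 1.0, 100_000

x = rng.uniform(theta - 0.25, theta + 0.25, size=(reps, n))

midrange = (x.min(axis=1) + x.max(axis=1)) / 2
xbar = x.mean(axis=1)

# Both estimators are unbiased for theta, but the midrange has far smaller
# variance (order 1/n^2 versus order 1/n for the sample mean).
print("mean of midrange:", midrange.mean(), " variance:", midrange.var())
print("mean of x_bar:   ", xbar.mean(), " variance:", xbar.var())
```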

  • Yes, because $X_1 \times X_2 \times \cdots \times X_n = 1$ implies $X_1 + \ldots + X_n = n$. (2011-10-02)

In your first problem, the mean of each observation is $p$, not $1$. The sum is indeed a sufficient statistic, as can be shown by applying the definition directly and seeing the $p$ cancel out, or by Fisher's factorization criterion. Notice that the expected value of the product $X_1 X_2 \cdots X_n$ is $p^n$. Then apply the Rao–Blackwell theorem. (I'm not sure what you get.)
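Completing this hint with a sketch (the conclusion is not stated in this answer, though the other answer reaches it): since $\prod_i X_i = 1$ exactly when $\sum_i X_i = n$, Rao–Blackwellizing the product with respect to the sum should give $E\left[\prod_i X_i \,\middle|\, \sum_i X_i\right] = \chi\left(\sum_i X_i = n\right)$. A minimal Python/numpy check of the conditional means, with arbitrary `n`, `p`, and replication count:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, reps = 4, 0.6, 200_000

x = rng.binomial(1, p, size=(reps, n))
product = x.prod(axis=1)   # crude unbiased estimator of p^n
total = x.sum(axis=1)      # sufficient statistic

# Rao-Blackwellization: average the crude estimator within each value of
# the sufficient statistic. The conditional mean is 1 when the sum equals
# n and 0 otherwise, recovering the indicator estimator.
for t in range(n + 1):
    mask = total == t
    if mask.any():
        print(f"E[product | sum = {t}] ~ {product[mask].mean():.4f}")
```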

In your last problem, it took me a number of seconds to figure out that by $\bar{x}$ you did not mean the sample average. Here's a hint: I think you'll get a sufficient statistic consisting of a pair of scalar-valued random variables. Later note: I see that another answer has gone beyond my hint: it is indeed the sample minimum and the sample maximum.