
Let $X_1,X_2,X_3,\ldots,X_n$ be a random sample from a $\mathrm{Bernoulli}(\theta)$ distribution with probability function

$P(X=x) = (\theta^x)(1 - \theta)^{(1 - x)}$, $x=0,1$; $0<\theta<1$.

Is $\hat\theta(1 - \hat\theta)$ an unbiased estimator of $\theta(1 - \theta)$? Prove or disprove.

I tried $x=\theta(1-\theta)$, $\bar x=\hat\theta(1-\hat\theta)$,

$E[\bar x]=x$

$E[\bar x(1-\bar x)]=E[\bar x]-E[\bar x^2]$

but I'm not sure what to do now or how to prove it. I have an exam tomorrow so any help is really appreciated! Hopefully this is the last stats question I'll have to ask!


1 Answer


$\newcommand{\var}{\operatorname{var}}$ $\newcommand{\E}{\mathbb{E}}$

Your notation is confusing: you use $x$ to refer to two different things, and you seem to use the lower-case $\bar x$ to refer to the sample mean after using capital letters to refer to random variables initially.

Remember that the variance of a random variable is equal to the expected value of its square minus the square of its expected value. That enables us to find the expected value of its square if we know its variance and its expected value.
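For a single Bernoulli observation this identity is easy to check directly, since $X_i^2 = X_i$ when $X_i$ takes only the values $0$ and $1$:

$$\var(X_i) = \E(X_i^2) - \big(\E(X_i)\big)^2 = \theta - \theta^2 = \theta(1-\theta).$$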

I surmise that by $\hat\theta$ you mean $(X_1+\cdots+X_n)/n$. That makes $\hat\theta$ an unbiased estimator of $\theta$.

So $\E(\hat\theta) = \theta$ and $ \var(\hat\theta) = \var\left( \frac{X_1+\cdots+X_n}{n} \right) = \frac{1}{n^2}\var(X_1+\cdots+X_n) = \frac{1}{n^2}(\var(X_1)+\cdots+\var(X_n)) $ $ =\frac{1}{n^2}\cdot n\var(X_1) = \frac 1 n \var(X_1) = \frac 1 n \theta(1-\theta). $

Now we want $\mathbb{E}(\hat\theta(1-\hat\theta))$: $ \mathbb{E}(\hat\theta(1-\hat\theta)) = \mathbb{E}(\hat\theta) - \mathbb{E}(\hat\theta^2) = \theta - \Big( \var(\hat\theta) + \left(\E(\hat\theta)\right)^2 \Big) = \theta - \left( \frac{\theta(1-\theta)}{n} + \theta^2 \right) $ $ = \frac{n\theta - \theta(1-\theta) - n\theta^2}{n} = \frac{n-1}{n}\theta(1-\theta). $

From this you can draw a conclusion about whether $\hat\theta(1-\hat\theta)$ is an unbiased estimator of $\theta(1-\theta)$.
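As a sanity check (not part of the original derivation), a quick Monte Carlo simulation with illustrative choices $\theta = 0.3$ and $n = 10$ reproduces the factor $\frac{n-1}{n}$:

```python
import random

random.seed(0)
theta = 0.3       # illustrative true parameter
n = 10            # sample size
trials = 200_000  # number of simulated samples

total = 0.0
for _ in range(trials):
    # theta_hat is the sample mean of n Bernoulli(theta) draws
    theta_hat = sum(random.random() < theta for _ in range(n)) / n
    total += theta_hat * (1 - theta_hat)

empirical = total / trials
exact = (n - 1) / n * theta * (1 - theta)  # the derived expectation
print(empirical, exact)
```

The empirical average of $\hat\theta(1-\hat\theta)$ lands close to $\frac{n-1}{n}\theta(1-\theta) = 0.189$ rather than $\theta(1-\theta) = 0.21$, in line with the derivation above.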

(By the way, $\hat\theta(1-\hat\theta)$ is the maximum-likelihood estimator of $\theta(1-\theta)$.)

  • It is, however, an asymptotically unbiased estimator. (2012-08-08)