
To show whether an MLE I just found is biased or unbiased, would I need to find the expectation of the estimator? And would I do this by integrating $\text{MLE} \cdot \text{pdf}$?

My MLE is $ \frac{1}{\bar x} $. I've heard the expectation of this is the same as the expectation of $ \frac{1}{x} $.

http://www2.imperial.ac.uk/~ayoung/m2s1/Exercises8.PDF, question 6 part 2: I differentiated the log-likelihood and set it to zero to get $ \hat\theta = \frac{n}{\sum_i x_i} = \frac{1}{\bar x} $.

  • 0
    Do you want to state what `pdf` is in your case and, maybe, show how you derived the MLE? (2012-11-11)
  • 0
    See edit, but it looks like you beat me to it. (2012-11-11)
  • 0
    Could you nevertheless make your post self-contained? One should not have to refer to the content of the link to understand what is going on. (2012-11-11)

1 Answer


I take it that you are dealing with the exponential distribution with $$ f_X(x) = \lambda \mathrm{e}^{-\lambda x} [x > 0] $$ Assuming all elements of the sample $\{x_1,x_2,\ldots,x_n\}$ are positive, the log-likelihood reads: $$ n \log \lambda - \lambda \sum_{k=1}^n x_k $$ which attains its maximum exactly at $\lambda = \frac{n}{\sum_{k=1}^n x_k}$.
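As a quick numerical sanity check (my addition, not part of the original answer), maximizing the log-likelihood above on a grid recovers the closed-form MLE $n/\sum_{k=1}^n x_k = 1/\bar x$; the sample size and true rate below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=0.5, size=1000)  # sample with true rate lambda = 2 (arbitrary choice)

def log_likelihood(lam):
    # n log(lambda) - lambda * sum(x_k), as in the answer above
    return len(x) * np.log(lam) - lam * x.sum()

grid = np.linspace(0.1, 5.0, 50_000)
lam_grid = grid[np.argmax(log_likelihood(grid))]
lam_closed = len(x) / x.sum()  # n / sum(x_k) = 1 / mean(x)

print(lam_grid, lam_closed)  # the two estimates agree up to the grid spacing
```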

Now compute the expectation of the MLE: $$\begin{eqnarray} \mathbb{E}\left(\frac{n}{X_1+X_2+\cdots+X_n}\right) &=& n \mathbb{E}\left(\frac{1}{X_1+X_2+\cdots+X_n}\right) \\ &=& n \mathbb{E}\left( \int_0^\infty \exp\left(-t(X_1+\cdots+X_n)\right) \mathrm{d}t \right) \\ &=& n \int_0^\infty \mathbb{E}\left( \exp\left(-t(X_1+\cdots+X_n)\right) \right) \mathrm{d}t \\ &\stackrel{\text{indep.}}{=}& n \int_0^\infty \left(\mathbb{E}\left( \exp\left(-t X_1\right) \right)\right)^n \mathrm{d}t \\ &=& n \int_0^\infty \left(\frac{\lambda}{t+\lambda}\right)^n \mathrm{d}t = \left. -\frac{n}{n-1} \frac{\lambda^n}{(t+\lambda)^{n-1}} \right|_{0}^\infty \\ &=& \frac{n}{n-1} \lambda \end{eqnarray} $$ Since $\frac{n}{n-1}\lambda \neq \lambda$ for finite $n$, the MLE is biased (though the bias vanishes as $n \to \infty$).
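The factor $\frac{n}{n-1}$ can also be verified by simulation (my addition; the sample size, rate, and trial count below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n, trials = 2.0, 5, 200_000  # true rate, sample size, Monte Carlo trials (all arbitrary)

# Each row is one sample of size n; the MLE for that sample is n / sum(row).
samples = rng.exponential(scale=1 / lam, size=(trials, n))
mle = n / samples.sum(axis=1)

print(mle.mean())         # empirical mean of the MLE
print(n / (n - 1) * lam)  # theoretical E[MLE] = 2.5, not the true lambda = 2
```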

  • 0
    Do you mind showing me how you got $$ \stackrel{\text{indep.}}{=}\; n \int_0^\infty \left(\mathbb{E}\left( \exp\left(-t X_1\right) \right)\right)^n \mathrm{d}t $$ (2012-11-11)
  • 0
    plus from there to $$ n \int_0^\infty \left(\frac{\lambda}{t+\lambda}\right)^n \mathrm{d}t = \left. -\frac{n}{n-1} \frac{\lambda^n}{(t+\lambda)^{n-1}} \right|_{0}^\infty = \frac{n}{n-1} \lambda $$ (2012-11-11)
  • 1
    The biasedness alone stems from the convexity inequality saying that the mean of the inverse is (strictly) greater than the inverse of the mean. Of course, in the present case, the exact computation is a bonus. (2012-11-11)
  • 0
    @cheeseman123 Independence of the $X_k$ was used here: $$ \mathbb{E}\left(\exp(-t(X_1+\cdots+X_n))\right) = \mathbb{E}\left( \mathrm{e}^{-t X_1} \cdots \mathrm{e}^{-t X_n} \right) \stackrel{\text{indep.}}{=} \mathbb{E}\left( \mathrm{e}^{-t X_1} \right) \cdots \mathbb{E}\left( \mathrm{e}^{-t X_n} \right) \stackrel{\text{i.d.}}{=} \left(\mathbb{E}\left( \mathrm{e}^{-t X_1} \right)\right)^n$$ where the last equality follows because the $X_k$ are identically distributed, so the means are equal. (2012-11-11)
  • 0
    The expectation $\mathbb{E}\left(\exp(-t X)\right)$ is easily integrated to be $\frac{\lambda}{t+\lambda}$. Now $$ n \int_0^\infty \left(\frac{\lambda}{t+\lambda}\right)^n \mathrm{d}t = n \lambda^n \int_0^\infty \frac{\mathrm{d}t}{(t+\lambda)^n} \stackrel{u=t+\lambda}{=} n \lambda^n \int_{\lambda}^\infty \frac{\mathrm{d}u}{u^n} $$ The latter is a table integral. (2012-11-11)
  • 0
    @did This is Jensen's inequality you are talking about, right? Doesn't it establish only a weak inequality, i.e. $$\mathbb{E}\left(\bar{X}^{-1} \right) \geqslant \left(\mathbb{E}\left(\bar{X}\right) \right)^{-1}$$ (2012-11-11)
  • 1
    My previous comment was referring (perhaps too cryptically, sorry) to the fact that the inequality you mention is strict unless the random variable is almost surely constant. (2012-11-11)
  • 0
    @did Thanks for the clarification. (2012-11-11)
  • 0
    One final point I am confused about, which at first I thought I had understood: how does one get $$ n \mathbb{E}\left(\frac{1}{X_1+X_2+\cdots+X_n}\right) = n \mathbb{E}\left( \int_0^\infty \exp\left(-t(X_1+\cdots+X_n)\right) \mathrm{d}t \right) $$ and what exactly is the $t$? (2012-11-11)
  • 0
    This is the simple result that, for $a>0$, $$ \int_0^\infty \exp(- a t) \, \mathrm{d} t = \frac{1}{a} $$ applied to $a = X_1 + \cdots + X_n$. (2012-11-11)
  • 0
    Oh right, so you just used the known result. I should have worked that one out backwards. Thank you greatly for the help. (2012-11-12)
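Tying the thread together, the integral identity used in the answer can be checked symbolically with SymPy for a concrete sample size (my addition; $n = 5$ is an arbitrary choice):

```python
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
n = 5  # concrete sample size (arbitrary choice); a symbolic n would need the assumption n > 1

# n * integral_0^inf (lambda / (t + lambda))^n dt should equal n/(n-1) * lambda
expr = n * sp.integrate((lam / (t + lam))**n, (t, 0, sp.oo))
print(sp.simplify(expr))  # 5*lambda/4, i.e. n/(n-1) * lambda for n = 5
```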