
This is my first post, so I apologize if the formatting is a little rocky.

I'm currently going through "Probability and Statistics" 4th ed by DeGroot/Schervish, and I was wondering if somebody could help me out on two related problems (7.5.10, 7.6.1).

The first question is as follows: Suppose that $ X_1, \dots, X_n $ form a random sample from a distribution for which the p.d.f. $ f(x|\theta) $ is as follows: $$ f(x|\theta) = \frac{1}{2} e^{-|x-\theta|} \text{ for } -\infty < x < \infty $$ Also, suppose that the value of $ \theta $ is unknown, for $ -\infty < \theta < \infty $. We will find the M.L.E. of $ \theta $.

The likelihood function is given by $$ f_n(\mathbf{x}|\theta) = \frac{1}{2^n} e^{-\sum_{i=1}^n |x_i - \theta|}$$ and will be maximized when $ \sum_{i=1}^n |x_i - \theta| $ is minimized. By choosing $ \theta $ to be a median value of $ x_1, \dots, x_n $, we accomplish the minimization task.

More specifically, note that the log likelihood is \begin{align*} \log f_n(\mathbf{x}|\theta) &= -n \log 2 - \sum_{i=1}^n |x_i - \theta| \\ &= n \left ( - \log 2 - \frac{1}{n} \sum_{i=1}^n |x_i - \theta| \right ) \end{align*} so the M.L.E. is the value of $\theta$ that minimizes the sum $\sum_{i=1}^n |x_i - \theta|$. We can also see that $ \frac{1}{n} \sum_{i=1}^n |x_i - \theta| = E(|X-\theta|)$ for $ X $ having the discrete distribution assigning probability $ \frac{1}{n} $ to each of $ x_1, \dots, x_n $ and 0 everywhere else. Since the median of a distribution minimizes the mean absolute error, choosing $ \hat{\theta} $ to be a median of $ x_1, \dots, x_n $ minimizes this sum and hence maximizes the log likelihood.
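The median-minimizes-absolute-error step is easy to check numerically. Here is a minimal sketch (the sample, seed, and grid are my own choices, not from the book) that draws a Laplace sample and confirms the sample median minimizes $\sum_{i=1}^n |x_i - \theta|$:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical sample from the double exponential (Laplace) distribution
x = rng.laplace(loc=2.0, scale=1.0, size=101)

def sum_abs_dev(theta, x):
    """Objective from the log likelihood: sum of |x_i - theta|."""
    return np.sum(np.abs(x - theta))

# Compare the sample median against a grid of candidate theta values
median = np.median(x)
grid = np.linspace(x.min(), x.max(), 2001)
best = grid[np.argmin([sum_abs_dev(t, x) for t in grid])]

print(median, best)  # the grid minimizer lands next to the median
# For odd n the median is the exact minimizer of this convex objective
assert sum_abs_dev(median, x) <= sum_abs_dev(best, x)
```

Since the objective is convex and piecewise linear, the grid minimizer can only differ from the true minimizer (the median) by at most one grid step.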

The second question is to find the MLE of $e^{-\frac{1}{\theta}}$, which, by the invariance property of MLEs, should just be $e^{-\frac{1}{\hat{\theta}}}$.

My problem is that the answer in the back of the book (for the second question) is given as $\left ( \prod_{i=1}^n x_i \right)^{\frac{1}{n}}$, and I'm having trouble reconciling that with my answer. You'd think it'd be pretty straightforward, but...

Any help would be appreciated!

  • For the second problem, what is the MLE for $\theta$? It looks like it is not the same $\theta$ as in the previous problem. Working backwards, you would get this answer if the MLE for $\theta$ were $\hat{\theta} = -n/\sum_i \ln x_i$. – 2012-09-08
  • What exactly is your first question? For the second question, I would set $\alpha = e^{1/\theta}$, solve for $\theta$, plug that into $f(x|\theta)$, and find the MLE. – 2012-09-08
  • @Michael I also noticed that -- does that mean I calculated the first MLE wrong? – 2012-09-08
  • @echoone Err, I didn't really have a question about the first part; I was just showing my work to see whether I was messing up somewhere. I'll try your method and see if that works. – 2012-09-08
  • Here is a hint: in the first problem, $X_i \in \mathbb{R}$. Now take $n = 2$ and suppose $X_1 < 0$ and $X_2 > 0$, each of which happens with positive probability. Now consider the "estimator" from the second question. Conclusion? :-) – 2012-09-08
  • For the first distribution, the MLE is the median. For the second, you didn't give us the likelihood. I got the answer by setting the given expression equal to $e^{-1/\theta}$ and inverting to get the MLE for $\theta$. – 2012-09-08
  • @cardinal: ... you get a complex number?? (that is, if you use $\sqrt{X_1 \cdot X_2}$) – 2012-09-08
  • @ChaseUyeda: Yes. In other words, there is a typo in the book (or some other logical disconnect) afoot. :-) – 2012-09-08
  • @Michael In the second part, you just use the same likelihood, but now writing $\theta$ as a function of $\psi$, where $\psi = e^{-1/\theta}$, right? And then maximize that likelihood with respect to $\psi$? – 2012-09-08
  • @cardinal Hmm, okay. Thanks for pointing that out! The given answer, as Michael said, would tell us that $\hat{\theta} = -\frac{n}{\sum_{i=1}^n \ln x_i}$, which is the MLE for a $\mathrm{Beta}(\theta, 1)$ distribution with $\theta$ unknown, so I wonder if that's where this is coming from? Not sure. – 2012-09-08
  • @ChaseUyeda: Here is the connection: if $X$ is exponential with rate $\theta$, then $Y = \exp(X)$ is *Pareto* with density $\theta y^{-(\theta + 1)}$ on $y \geq 1$, and $Z = 1/Y = \exp(-X)$ is $\mathrm{Beta}(\theta, 1)$. :-) – 2012-09-08
  • @cardinal Interesting! Thanks again for all your help :) – 2012-09-09
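cardinal's point in the comments can be made concrete with a two-observation example (the numbers below are mine, chosen only so that $x_1 < 0 < x_2$): the book's expression $\left(\prod_{i=1}^n x_i\right)^{1/n}$ need not even be real under the Laplace model, whereas it does fall out of the $\mathrm{Beta}(\theta, 1)$ model via invariance.

```python
import numpy as np

# Hypothetical two-point sample with x1 < 0 < x2, which the Laplace
# model produces with positive probability.
x = np.array([-1.5, 2.0])

prod = np.prod(x)                    # -3.0
geo_mean = complex(prod) ** (1 / 2)  # principal root of a negative number
print(geo_mean)                      # complex, so not a usable estimate

# By contrast, under the Beta(theta, 1) model on (0, 1), the MLE is
# theta_hat = -n / sum(log y_i), and the invariance step gives
# exp(-1/theta_hat) = (prod y_i)^(1/n), i.e. the book's answer.
y = np.array([0.3, 0.8])             # hypothetical data in (0, 1)
theta_hat = -len(y) / np.log(y).sum()
print(np.exp(-1 / theta_hat), np.prod(y) ** (1 / len(y)))  # these agree
```

This supports the typo theory: the printed answer matches the $\mathrm{Beta}(\theta, 1)$ problem, not the Laplace one.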
