3

Let $X_1, X_2,\ldots, X_n$ be i.i.d. uniform on $[0, \theta ]$.

a. Find the method of moments estimate of $\theta$ and its mean and variance.

b. Find the MLE of $\theta$ and its mean and variance.

Thank you for answering, I really appreciate it.

My answers were:

a. $\hat{\theta} = 2 \bar{X}$

b. $\hat{\theta} = X_n$

I'm not sure my solutions are correct, and I also don't know how to start finding the mean and variance of the MME and the MLE.

  • 0
    no one has answered yet... :'(2012-04-06
  • 1
    Welcome to math.SE. No worry, surely someone will help you, but before you have to say what you have tried.2012-04-06
  • 0
    Hi Kaji. If you are looking for help, this is a great site. If you are looking for other people to make your homework, it's not.2012-04-06
  • 0
hi leonbloy, im not asking someone to do my homework. if i did, i would have posted all the questions. furthermore, i have answers to some sub-questions above, im just not too sure about them.2012-04-06
  • 0
    hi david, thanks for the tip.2012-04-06
  • 1
    Your MOM estimate is correct. Your MLE may be, although it's hard to tell with your notation. As for finding the mean and variance: what is the distribution of $\bar{x}$?2012-04-06
  • 1
    Your MME of $\theta$ is OK. But the MLE is not $X_n$. It should be $\max \{X_1,\cdots,X_n\}$ (maybe you meant this by $X_n$ but recall that when you sort a random sample it's no longer a random sample :)2012-04-06

2 Answers

6

OK, so I'll drop a few hints.

First of all: Check that your MLE estimator of $\theta$ is indeed the maximum of the likelihood function. Note that this maximum is not detected by the derivative!

Now, mean and variance of $\hat \theta=2\overline{X}$ can be deduced from those of $\overline{X}.$ The distribution of $\overline{X}$ is difficult to write down, but you don't need the whole pdf, you only need $E(\overline{X})$ and Var $(\overline{X}).$ If you have been given this problem, you probably already know what the mean and the variance of the sample mean is, but just in case, here you are: $E(\overline{X})=E(X),$ Var$(\overline{X})=$Var$(X)/n.$
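(A quick Monte Carlo sketch, not part of the derivation, with illustrative values $\theta=5$, $n=10$ of my own choosing: simulating $2\overline{X}$ should show its mean near $\theta$ and its variance near $4\,\mathrm{Var}(X)/n=\theta^2/(3n)$.)

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, trials = 5.0, 10, 200_000  # illustrative values, not from the problem

# draw `trials` samples of size n from Uniform[0, theta] and form 2 * sample mean
samples = rng.uniform(0.0, theta, size=(trials, n))
mme = 2 * samples.mean(axis=1)

print(mme.mean())   # close to theta          (E[2 X-bar] = theta)
print(mme.var())    # close to theta^2/(3 n)  (Var[2 X-bar] = 4 Var(X)/n)
```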

Concerning the MLE, you probably will have to work out first the pdf of $\hat \theta=\max\{X_1,\cdots,X_n\}$. Fix $x\in [0,\theta].$ From the definition of a maximum, $P[\hat \theta \le x]=P[X_1\le x, \;X_2\le x,\;\cdots, X_n\le x];$ now we use that the $X_i$ are independent copies of the uniform distribution in $[0,\theta]$, hence $P[\hat \theta \le x]=P[X\le x]^n$ where $X$ is a uniform distribution in $[0,\theta].$ Since $P[X\le x]=x/\theta$ (cdf of a uniform variable in $[0,\theta]$), we deduce that the cdf of $\hat\theta$ is $x^n/\theta^n$ in $[0,\theta]$ and thus the pdf is $nx^{n-1}/\theta^n$, also in $[0,\theta]$. Now use this pdf to compute $E(\hat \theta)$ and Var$(\hat \theta)$ in the usual way. That is, $E(\hat \theta)=\int_0^{\theta}x (nx^{n-1}/\theta^n)\,dx$ and Var$(\hat \theta)=\int_0^{\theta}x^2 (nx^{n-1}/\theta^n)\,dx - E(\hat\theta)^2.$
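(Again a simulation sketch with the same illustrative $\theta$ and $n$: evaluating the two integrals above gives $E(\hat\theta)=n\theta/(n+1)$ and Var$(\hat\theta)=n\theta^2/\big((n+1)^2(n+2)\big)$, and the sample maximum should match both.)

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, trials = 5.0, 10, 200_000  # illustrative values, not from the problem

samples = rng.uniform(0.0, theta, size=(trials, n))
mle = samples.max(axis=1)            # hat(theta) = max(X_1, ..., X_n)

# closed forms obtained by evaluating the integrals of the pdf n x^(n-1)/theta^n
mean_exact = n * theta / (n + 1)
var_exact = n * theta**2 / ((n + 1) ** 2 * (n + 2))

print(mle.mean(), mean_exact)   # the two numbers should be close
print(mle.var(), var_exact)
```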

  • 0
    thanks Xabier! You guys are helpful... I don't know why you guys keep on participating but thank you. I still am unable to catch up, prolly because I lack idea on this subject. (I am always absent due to work, If I won't work, I won't be able to pay my tuition fee). I'll try my best to get the answers. Thanks so much. :)2012-04-07
  • 0
xabier, am I correct that the mean and variance of the latter would be: mean $=\theta/2$ and variance $=\theta^2/12$?? is this correct? ive used the pdf of the uniform distribution...2012-04-07
  • 0
    @Kaji: You have computed mean and variance of $X$ (the starting uniform variable). You need to compute mean and variance of the new variable $\hat \theta=\max\{X_1,\cdots, X_n\}$. Think a little: If you want $\hat \theta$ to be an estimator of $\theta,$ is it reasonable that its expected value be $\theta/2?$2012-04-07
  • 0
    yeah. im confused. sorry, i feel im dumb. i was not able to attend my classes due to work, sorry if i am unable to catch up easily. :( but thank you for keeping up on the lessons, i'll study it further. :)2012-04-07
  • 1
    Don't say you're dumb, we all have gone through this. I've updated my answer. Let me know if you can see the end now :)2012-04-07
  • 0
    yes.. i can see it now. hold on, i'll answer it. xabier and michael, i owe you big time. really.2012-04-07
  • 1
yey! i think i am right. we had the same formula as above, anyway, the mean would be $n\theta/(n+1)$ is this correct?2012-04-07
  • 1
    Yes. And this is way more natural because as $n\to \infty$ it converges to $\theta,$ which is the parameter you want to estimate.2012-04-07
  • 0
    wow. thanks. im almost done with the variance.2012-04-07
  • 1
and the variance is $n\theta^2/\big((n+1)^2(n+2)\big)$ is this correct? just tell me if it's right or wrong. thanks so much xabier.2012-04-07
  • 1
    I think it is correct. Congratulations!2012-04-07
  • 0
    wow!!! Thank you very much xabier. and i think that ends this question. LOL. THANK YOU MAN! I owe you bigtime. :D2012-04-07
  • 1
    You're welcome. You don't owe me anything. Just keep participating in the site (asking and answering and doing a bit of housekeeping every once in a while...)2012-04-07
  • 0
    yeah i will. my forte is mathematics but not statistics. LOL. yes I owe you big time. I dont know how to thank you aside from saying it. Godbless. :)2012-04-07
4

Your MLE is wrong. You said $X_1,\ldots,X_n$ are i.i.d. That implies $X_1$ or $X_2$, etc., is just as likely to be the maximum observed value as is $X_n$ or any other. The MLE is actually $\max\{X_1,\ldots,X_n\}$.

If you use the conventional notation for the order statistics, with parentheses enclosing the subscripts, so that $X_{(1)}\le X_{(2)} \le \cdots\le X_{(n)}$, then the MLE is $X_{(n)}$.

The density of the uniform distribution on $[0,\theta]$ is $\dfrac 1 \theta$ for $0<x<\theta$, so the joint density is $\dfrac{1}{\theta^n}$ for $0< x_1,\ldots,x_n<\theta$. Look at this as a function of $\theta$: it's $\dfrac{1}{\theta^n}$ for $\theta>\text{all }x\text{s}$. Thus the likelihood function is $$ L(\theta) = \frac{1}{\theta^n}\text{ for }\theta \ge \max\{x_1,\ldots,x_n\}. $$ This is a decreasing function on the whole interval $[\max\{x_1,\ldots,x_n\},\infty)$. Thus it attains its maximum value at the left endpoint of the interval, which is $\max\{x_1,\ldots,x_n\}$.
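(A numeric sketch of the argument, with illustrative $\theta=5$, $n=8$ of my own choosing: evaluating $L(\theta)=\theta^{-n}$ for $\theta\ge\max\{x_i\}$, and $0$ otherwise, on a grid shows the maximizer sitting at the sample maximum.)

```python
import numpy as np

rng = np.random.default_rng(2)
theta_true, n = 5.0, 8                  # illustrative values, not from the problem
x = rng.uniform(0.0, theta_true, size=n)

def likelihood(theta, data):
    """L(theta) = theta^(-n) if theta >= max(data), else 0."""
    return theta ** -len(data) if theta >= data.max() else 0.0

# evaluate L on a grid of candidate thetas; L is 0 below max(x) and decreasing above
grid = np.linspace(0.01, 2 * theta_true, 2000)
vals = [likelihood(t, x) for t in grid]
mle_on_grid = grid[int(np.argmax(vals))]

print(x.max(), mle_on_grid)   # grid maximizer lands at (just above) max(x)
```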

  • 0
ooops! just as I thought. Im having a hard time with this subject since my schedule for work coincides with my schedule for studies. anyway, Thank you very much Michael for the tip, now im confused about how to arrive at that answer...2012-04-07
  • 1
    @Kaji : I've expanded the answer to explain how the MLE is found.2012-04-07
  • 0
    wow. thanks a lot michael, i was able to prove it though. we have the same answer now... thanks... seems like i couldnt accept both your answers...2012-04-07
  • 1
    @Kaji : You can still up-vote both answers.2012-04-07