
A random variable is uniformly distributed over $(0,\theta)$. The maximum of a random sample of size $n$, call it $y_n$, is sufficient for $\theta$ and is also the maximum likelihood estimator. Show also that a $100\gamma\%$ confidence interval for $\theta$ is $\left(y_n,\ y_n/(1-\gamma)^{1/n}\right)$.

Could anyone tell me how to deal with this problem? Do I have to use the central limit theorem?

  • In general, asymptotic theory doesn't help here because the question requires an exact result. There is, however, asymptotic theory for the maximum: it is called Gnedenko's theorem, and it can be applied to the uniform distribution. (2012-09-03)

1 Answer


You need to show that $$ \Pr\left(y_n<\theta<\frac{y_n}{(1-\gamma)^{1/n}}\right) = \gamma. $$ Since $y_n$ is always less than $\theta$, this probability is the same as $$ \Pr\left(\theta<\frac{y_n}{(1-\gamma)^{1/n}}\right)=\Pr\left( \theta(1-\gamma)^{1/n} < y_n\right) = 1-\Pr\left( y_n < \theta(1-\gamma)^{1/n} \right). $$

Notice that for $0 \le c \le \theta$, $$ \Pr(y_n < c) = \Pr(\text{all $n$ observations} < c) = \left(\frac{c}{\theta}\right)^n. $$ Apply this last sequence of equalities with $\theta(1-\gamma)^{1/n}$ in place of $c$: it gives $\Pr\left(y_n < \theta(1-\gamma)^{1/n}\right) = 1-\gamma$, so the probability above is $1-(1-\gamma)=\gamma$.
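As a quick sanity check (my addition, not part of the original answer), here is a minimal Monte Carlo sketch in Python/NumPy that estimates the coverage probability of $\left(y_n,\ y_n/(1-\gamma)^{1/n}\right)$; the values of $\theta$, $n$, and $\gamma$ are arbitrary illustrative choices.

```python
# Minimal Monte Carlo sketch (not from the original post): estimate how often
# the interval (y_n, y_n / (1 - gamma)**(1/n)) covers theta.
# theta = 2.0, n = 5, gamma = 0.90 are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
theta, n, gamma = 2.0, 5, 0.90
reps = 100_000

samples = rng.uniform(0.0, theta, size=(reps, n))
y_n = samples.max(axis=1)                    # sample maximum (the MLE)
upper = y_n / (1.0 - gamma) ** (1.0 / n)     # upper confidence limit

coverage = np.mean((y_n < theta) & (theta < upper))
print(f"empirical coverage: {coverage:.3f} (target {gamma})")
```

The empirical coverage should land near $0.90$, matching the exact calculation above up to Monte Carlo error.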

  • This confidence interval has $100\gamma\%$ within the interval, and $100(1-\gamma)\%$ to the right of the interval. One could also construct a confidence interval with $100(1-\gamma)/2\%$ in each tail (see the sketch below), or even one with $100(1-\gamma)\%$ to the left of the interval and the right endpoint equal to $\infty$. The problem so far doesn't completely specify these things. But one could speak of a unique "confidence distribution", and find that. The second line of displayed $\TeX$ in my answer above is the first step in doing that. (2012-09-03)
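For the equal-tailed alternative mentioned in the comment above, here is a sketch of my own (not from the original thread), using the pivot $y_n/\theta$, whose CDF is $t^n$ on $(0,1)$; each tail gets probability $(1-\gamma)/2$, and the parameter values are again arbitrary.

```python
# Sketch of an equal-tailed interval (my construction, not from the original
# thread), based on the pivot y_n / theta with CDF t**n on (0, 1).
# Each tail receives (1 - gamma)/2 of the probability.
import numpy as np

rng = np.random.default_rng(1)
theta, n, gamma = 2.0, 5, 0.90
alpha = 1.0 - gamma
reps = 100_000

samples = rng.uniform(0.0, theta, size=(reps, n))
y_n = samples.max(axis=1)

# Pr(y_n/theta < (alpha/2)^(1/n)) = alpha/2  and  Pr(y_n/theta > (1 - alpha/2)^(1/n)) = alpha/2
lower = y_n / (1.0 - alpha / 2.0) ** (1.0 / n)
upper = y_n / (alpha / 2.0) ** (1.0 / n)

coverage = np.mean((lower < theta) & (theta < upper))
print(f"empirical coverage: {coverage:.3f} (target {gamma})")
```

Both constructions achieve exact $100\gamma\%$ coverage; they differ only in how the remaining $100(1-\gamma)\%$ is split between the two tails.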