
I'm working on the following problem:

$X_1, \dots, X_n$ are i.i.d. random variables, uniformly distributed on the interval $(0,\theta)$. Show that for all $\lambda_1, \lambda_2$ with $0<\lambda_1<\lambda_2<1$ and $\lambda_2^n - \lambda_1^n = 1-\alpha$, the interval $\left[\frac{M}{\lambda_2}, \frac{M}{\lambda_1}\right]$ is a $1-\alpha$ confidence interval for $\theta$ (with $M = \max(X_1, \dots, X_n)$).

I've only done this kind of problem with explicit values, and applying the definition directly is a bit much for me at the moment. What I've done so far is use the CLT to compute an interval based on the mean and variance; since a uniform distribution is given here, that part is not too hard. My main problem is showing that, for the given interval $I$, $P(\theta \in I) \geq 1-\alpha$.

Thanks in advance

  • I suspect the Central Limit Theorem may not be the best approach here. Instead, consider the probability that $M \le \lambda\theta$, and then the probability $P(\lambda_1\theta \le M \le \lambda_2\theta)$, treating $M$ as a random variable on $(0, \theta)$. (2017-01-23)
  • (a) Find the CDF of $M$. (b) Note that if $P(\lambda_1\theta \le M \le \lambda_2\theta)=.95$, then $P(M/\lambda_2 \le \theta \le M/\lambda_1)=.95$. (c) Connect @Henry's dots. (2017-01-24)
  • Thanks guys, that really helped. (2017-01-24)
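Following the hints above: by independence, $P(M \le t) = (t/\theta)^n$ for $0 \le t \le \theta$, so $P(\lambda_1\theta \le M \le \lambda_2\theta) = \lambda_2^n - \lambda_1^n = 1-\alpha$, and the event $\lambda_1\theta \le M \le \lambda_2\theta$ is exactly $M/\lambda_2 \le \theta \le M/\lambda_1$. A quick Monte Carlo sketch can sanity-check this coverage; the particular choice of $n$, $\theta$, and the equal-tail split of $\alpha$ below are my own arbitrary picks, not from the problem:

```python
import random

def coverage(n=5, theta=3.0, alpha=0.05, trials=200_000, seed=0):
    """Estimate P(M/lambda2 <= theta <= M/lambda1) by simulation."""
    rng = random.Random(seed)
    # Pick 0 < lambda1 < lambda2 < 1 with lambda2^n - lambda1^n = 1 - alpha
    # (splitting the tail mass alpha equally is one arbitrary choice):
    lam2 = (1 - alpha / 2) ** (1 / n)
    lam1 = (alpha / 2) ** (1 / n)
    hits = 0
    for _ in range(trials):
        m = max(rng.uniform(0, theta) for _ in range(n))
        # Equivalent to lambda1*theta <= M <= lambda2*theta:
        if m / lam2 <= theta <= m / lam1:
            hits += 1
    return hits / trials

print(coverage())  # should be close to 1 - alpha = 0.95
```

The empirical coverage should land near $1-\alpha$ regardless of $\theta$, which is the defining property of the confidence interval.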
