
Although the setting of this question is statistics, the question actually asks for a fact from real analysis (about monotone functions).

Karlin-Rubin's theorem states conditions under which we can find a uniformly most powerful test (UMPT) for a statistical hypothesis:

Suppose we have a family of density or mass functions $\{f(\vec{x}|\theta):\,\theta\in\Theta\}$ and we want to test $$\begin{cases} H_0:\,\theta\leq\theta_0 \\ H_A:\,\theta>\theta_0.\end{cases}$$ If the likelihood ratio is monotone in a statistic $T(\vec{x})$ (that is, for every fixed $\theta_1<\theta_2$ in $\Theta$, the ratio $\frac{f(\vec{x}|\theta_2)}{f(\vec{x}|\theta_1)}$ is nondecreasing as a function of $T(\vec{x})$ on $\{\vec{x}:\,f(\vec{x}|\theta_2)>0\text{ or }f(\vec{x}|\theta_1)>0\}$), then the test with critical region $\text{CR}=\{\vec{x}:\,T(\vec{x})\geq k\}$, where $k$ is chosen so that $\alpha=P(\text{CR}|\theta=\theta_0)$, is the UMPT of size $\alpha$.
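As a concrete illustration of the MLR property (not part of the original question), one can check it numerically for a family known to satisfy it, e.g. $\text{Binomial}(n,\theta)$ with $T(x)=x$; the names below are illustrative:

```python
import math

# pmf of Binomial(n, theta)
def binom_pmf(x, n, th):
    return math.comb(n, x) * th**x * (1 - th)**(n - x)

# Likelihood ratio f(x | th2) / f(x | th1); the binomial coefficients
# cancel, leaving (th2/th1)^x * ((1-th2)/(1-th1))^(n-x), which is
# increasing in x whenever th1 < th2.
def lr(x, n, th1, th2):
    return binom_pmf(x, n, th2) / binom_pmf(x, n, th1)

n, th1, th2 = 10, 0.3, 0.6
ratios = [lr(x, n, th1, th2) for x in range(n + 1)]

# Monotone likelihood ratio in T(x) = x: the ratio is nondecreasing.
assert all(a <= b for a, b in zip(ratios, ratios[1:]))
```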

In all the proofs I have read (for instance, on page 22 here, or in "Statistical Inference" by Casella-Berger, 2nd edition, page 391), it is (more or less) said: "we can find $k_1$ such that, if $T(\vec{x})\geq k$, then $\frac{f(\vec{x}|\theta_2)}{f(\vec{x}|\theta_1)}\geq k_1$, and if $T(\vec{x})<k$, then $\frac{f(\vec{x}|\theta_2)}{f(\vec{x}|\theta_1)}\leq k_1$".

EDIT: My questions are:

  1. Is the assertion between quotation marks true for every density or mass function with (not strictly) monotone likelihood ratio on $T$?

  2. And what about in the case of the uniform distribution?

The second question has an answer below. I would like an answer to the first question, with claims grounded in real analysis.

  • Now asked also on MO: [Proof of Karlin-Rubin's theorem](https://mathoverflow.net/q/259598). (2017-07-19)
  • [This answer](https://math.meta.stackexchange.com/questions/5085/moderator-supported-official-guidelines-for-legitimate-crossposting/5088#5088) gives, in my opinion, very reasonable advice about cross-posting. Among other things, Willie Wong recommends there to include links to the other copies in the question. (2017-07-19)

1 Answer


Karlin-Rubin assumes that the monotone likelihood ratio exists, and one of the "regularity" conditions for that is that the support does not depend on the parameter. Let's examine the uniform case with $X\sim U[0,\theta]$. If $H_0: \theta \le \theta_0$ and $H_A: \theta > \theta_0$, then $$ \frac{f_1 }{ f_0} = \frac{1/\theta_1^n\, I\{0\le X_{(n)} \le \theta_1\}}{1/\theta_0^n\, I\{0\le X_{(n)} \le \theta_0\}}. $$ So, for $X_{(n)} \le \theta_0 <\theta_1$, $\forall \theta_1 \in \Theta_A$, you have that $$ \frac{f_1}{f_0} = (\theta_0/\theta_1)^{n}. $$ But if $\theta_0 < X_{(n)}$, then the ratio $f_1/ f_0$ is undefined, since $f_0 = 0$ while $f_1 = 1/\theta_1^n$.
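A small sketch of the two cases above (names are illustrative, not from the original answer): the joint density of an i.i.d. $U[0,\theta]$ sample, viewed through $X_{(n)}$, gives a constant ratio where both densities are positive and an undefined ratio where $f_0=0$:

```python
# Joint density of an i.i.d. U[0, theta] sample of size n, as a
# function of the sufficient statistic x_max = X_(n).
def unif_density(x_max, n, theta):
    return theta**-n if 0 <= x_max <= theta else 0.0

n, th0, th1 = 5, 1.0, 2.0

# Case 1: X_(n) <= theta0 -- the ratio equals the constant (th0/th1)^n.
r = unif_density(0.8, n, th1) / unif_density(0.8, n, th0)
assert abs(r - (th0 / th1)**n) < 1e-12

# Case 2: theta0 < X_(n) <= theta1 -- the denominator f_0 is 0 while
# f_1 > 0, so the likelihood ratio is undefined.
assert unif_density(1.5, n, th0) == 0.0
assert unif_density(1.5, n, th1) > 0.0
```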

As such, for a good test it is useful to base it on the sufficient statistic $X_{(n)}$ and build it more intuitively, i.e., $$ E_{H_0} I\{c \le X_{(n)} \le \theta_0\} = \alpha, $$ thus $$ E_{H_0} I\{ (c/\theta_0)^n \le (X_{(n)}/\theta_0)^{n} \le 1\} = 1 - (c/\theta_0)^n = \alpha, $$ hence $$ c = \theta_0 (1- \alpha)^{1/n}. $$ As such, the UMPT will be $$ \Psi = I\{ \theta_0(1-\alpha)^{1/n} \le X_{(n)} \}, $$ where the Type I error is at most $\alpha$ for every $\theta \le \theta_0$, and the event $X_{(n)} > \theta_0$ has probability $0$ under $H_0$.
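A quick Monte Carlo sanity check (an illustration, not part of the original answer) that the cutoff $c=\theta_0(1-\alpha)^{1/n}$ indeed gives a size-$\alpha$ test when $\theta=\theta_0$:

```python
import random

# Simulate the test "reject iff X_(n) >= c" under theta = theta0,
# with c = theta0 * (1 - alpha)**(1/n); the rejection frequency
# should be close to alpha.
random.seed(0)
n, theta0, alpha = 5, 1.0, 0.05
c = theta0 * (1 - alpha)**(1 / n)

reps = 200_000
rejections = sum(
    max(random.uniform(0, theta0) for _ in range(n)) >= c
    for _ in range(reps)
)
print(rejections / reps)  # should be close to alpha = 0.05
```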

  • Thank you for your answer! I have two doubts: when you write the critical region as $\{c\leq x_{(n)}\leq\theta_0\}$, why do you assume $x_{(n)}\leq\theta_0$? And how do you know that this test is the UMPT? (2017-01-09)
  • And concerning this assertion: "we can find $k_1$ such that, if $T(\vec{x})\geq k$, then $\frac{f(\vec{x}|\theta_2)}{f(\vec{x}|\theta_1)}\geq k_1$, and if $T(\vec{x})<k$, then $\frac{f(\vec{x}|\theta_2)}{f(\vec{x}|\theta_1)}\leq k_1$"? (2017-01-10)
  • 1. I'm not. I'm dividing it into two possible cases: in the first one $x_{(n)} \le \theta_0$, and in the second one $x_{(n)} > \theta_0$. Unlike in the regular cases, $P_{\theta_0}(X_{(n)}\ge \theta_0)=0$; however, if $H_A$ is true, then $P(X_{(n)} > \theta_0) = 1 - (\theta_0/\theta)^n >0$. 2. UMPT: I will check later in Casella's "Statistical Inference"; I don't remember the exact argument. 3. This ratio is not well defined. And for the case where it is ($x_{(n)} \le \theta_0$), the dependence on the sufficient statistic is only through the indicator function. (2017-01-10)
  • For the last comment: "we can find $k_1$ such that, if $T(\vec{x})\geq k$, then $\frac{f(\vec{x}|\theta_2)}{f(\vec{x}|\theta_1)}\geq k_1$, and if $T(\vec{x})<k$, then $\frac{f(\vec{x}|\theta_2)}{f(\vec{x}|\theta_1)}\leq k_1$". (2017-01-11)