
I was trying to solve the following question but I am unable to make any headway.

Denote the lognormal density $f(w; \mu, \sigma) = (w\sigma\sqrt{2\pi})^{-1}\, e^{-[\log(w) - \mu]^2/(2\sigma^2)}$ with $w > 0$, $\mu \in (0,\infty)$.

Let $X_1,\ldots,X_m$ be i.i.d. random variables with common density $f(x; \mu, 2)$ and $Y_1,\ldots,Y_n$ be i.i.d. random variables with common density $f(y; 2\mu, 3)$, distributed independently of $X_1,\ldots,X_m$. Here $\mu$ is unknown and $m \neq n$.

Examine whether there exists a uniformly most powerful (UMP) level $\alpha$ test for testing $H_0 : \mu = 1$ against $H_1: \mu \neq 1$.

  • I tried using the Neyman–Pearson lemma. Intuitively it feels that there will be no uniformly most powerful test: the lemma gives a test for either $<$ or $>$, but I don't think it will work both ways. However, I am unable to demonstrate this completely. (2017-01-06)

1 Answer


First, note that the likelihood function satisfies $$ L(\mu; \vec x, \vec y) \propto \exp \left[ - \frac{1}{8} \sum_{i=1}^m [\log x_i - \mu]^2 - \frac{1}{18} \sum_{i=1}^n [ \log y_i - 2\mu]^2 \right], $$ where the omitted factor does not depend on $\mu$ and so cancels in any likelihood ratio. Let $\mu_1 \neq 1$ and consider testing $H_0: \mu=1$ versus $H_1 : \mu = \mu_1$. The Neyman–Pearson lemma tells us that the likelihood ratio test is the most powerful test for this simple-versus-simple problem.
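For completeness, here is the full likelihood under the stated densities (plugging $\sigma = 2$ into $f$ for the $x$-sample and $\sigma = 3$ for the $y$-sample); the bracketed product in front does not involve $\mu$, which justifies dropping it:

$$ L(\mu; \vec x, \vec y) = \left[\prod_{i=1}^m \frac{1}{2 x_i \sqrt{2\pi}}\right]\left[\prod_{j=1}^n \frac{1}{3 y_j \sqrt{2\pi}}\right] \exp\left[-\frac{1}{8}\sum_{i=1}^m (\log x_i - \mu)^2 - \frac{1}{18}\sum_{j=1}^n (\log y_j - 2\mu)^2\right]. $$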

For the LRT, we form $\Lambda(\vec x, \vec y)= \frac{L(1; \vec x, \vec y)}{L(\mu_1; \vec x, \vec y)}$ and reject when $\Lambda(\vec x, \vec y) \le k$, where $k$ is chosen so that $P( \Lambda(\vec X, \vec Y) \le k;H_0) = \alpha$. Equivalently, you can examine $\lambda = \log \Lambda$ and reject similarly.

In this case, we have (for some generic constant $k$ at each step) $$ \log \Lambda \le k \iff \log L(1) - \log L(\mu_1) \le k $$ which occurs if and only if \begin{align*} &{- \frac{1}{8} \sum_{i=1}^m [\log x_i - 1]^2 - \frac{1}{18} \sum_{i=1}^n [ \log y_i - 2]^2 + \frac{1}{8} \sum_{i=1}^m [\log x_i - \mu_1]^2 + \frac{1}{18} \sum_{i=1}^n [ \log y_i - 2\mu_1]^2 \le k} \\ &\iff \frac{1}{4} \sum_{i=1}^m \log x_i -\frac{1}{4} \left( \mu_1 \sum_{i=1}^m \log x_i \right) + \frac{4}{18} \sum_{i=1}^n \log y_i - \frac{4}{18} \left( \mu_1 \sum_{i=1}^n \log y_i \right) \le k \\ &\iff \frac{1}{4} \left( \sum_{i=1}^m \log x_i\right) \left[ 1 - \mu_1 \right] + \frac{4}{18} \left( \sum_{i=1}^n \log y_i \right) [ 1- \mu_1] \le k \\ &\iff \left( 1- \mu_1 \right) \cdot \left[ \frac{1}{4} \left( \sum_{i=1}^m \log x_i\right) + \frac{4}{18} \left( \sum_{i=1}^n \log y_i \right) \right] \le k. \end{align*} From here, we see that $\mu_1 > 1$ and $\mu_1 < 1$ give different forms of the test: dividing through by $(1 - \mu_1)$ flips the direction of the inequality when $\mu_1 > 1$. In particular,
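As an aside, the critical value $k$ is available in closed form here: under $H_0$ ($\mu = 1$), $\log X_i \sim N(1, 4)$ and $\log Y_j \sim N(2, 9)$, so the statistic $T = \frac{1}{4}\sum_i \log X_i + \frac{4}{18}\sum_j \log Y_j$ is normal with mean $\frac{m}{4} + \frac{4n}{9}$ and variance $\frac{1}{16}\cdot 4m + \frac{4}{81}\cdot 9n = \frac{m}{4} + \frac{4n}{9}$. A minimal sketch using only the standard library (the sample sizes and $\alpha$ below are illustrative choices, not from the question):

```python
# Under H0 (mu = 1): log X_i ~ N(1, 4), log Y_j ~ N(2, 9), so
#   T = (1/4) * sum(log X_i) + (2/9) * sum(log Y_j)
# is normal with mean m/4 + 4n/9 and, coincidentally, equal variance.
from statistics import NormalDist

def critical_value(m, n, alpha, tail="upper"):
    """Critical value k for the one-sided tests based on T under H0."""
    mean = m / 4 + 4 * n / 9
    sd = mean ** 0.5              # variance equals the mean in this model
    null = NormalDist(mean, sd)
    if tail == "upper":           # case mu_1 > 1: reject when T >= k
        return null.inv_cdf(1 - alpha)
    return null.inv_cdf(alpha)    # case mu_1 < 1: reject when T <= k

k_hi = critical_value(m=10, n=15, alpha=0.05, tail="upper")
k_lo = critical_value(m=10, n=15, alpha=0.05, tail="lower")
```

By symmetry of the normal null distribution, the two critical values sit equidistant from the null mean $\frac{m}{4} + \frac{4n}{9}$.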

  1. If $\mu_1 > 1$, we reject $H_0$ if $\frac{1}{4} \left( \sum_{i=1}^m \log x_i\right) + \frac{4}{18} \left( \sum_{i=1}^n \log y_i \right) \ge k$, where $P(\frac{1}{4} \left( \sum_{i=1}^m \log X_i\right) + \frac{4}{18} \left( \sum_{i=1}^n \log Y_i \right) \ge k; H_0) = \alpha$. This is UMP for $H_0:\mu=1$ versus $H_1: \mu>1$ since the computations showed that the explicit $\mu_1$ value didn't matter.

  2. If $\mu_1 < 1$, we reject $H_0$ if $\frac{1}{4} \left( \sum_{i=1}^m \log x_i\right) + \frac{4}{18} \left( \sum_{i=1}^n \log y_i \right) \le k$, where $P(\frac{1}{4} \left( \sum_{i=1}^m \log X_i\right) + \frac{4}{18} \left( \sum_{i=1}^n \log Y_i \right) \le k; H_0) = \alpha$. This is UMP for $H_0:\mu=1$ versus $H_1: \mu<1$ since the computations showed that the explicit $\mu_1$ value didn't matter.

The tests from 1 and 2 are clearly different: their rejection regions lie in opposite tails, so no single level-$\alpha$ test can agree with both, and hence with the most powerful test against every $\mu_1 \neq 1$. This shows that there is no UMP test for the two-sided problem $H_0 : \mu=1$ versus $H_1: \mu \neq 1$.