I have $X_1, \ldots, X_n$ and $Y_1, \ldots, Y_n$ as random samples from two normal distributions with means $0$ and variances $\theta_1$ and $\theta_2$ respectively. The null hypothesis is $\theta_1 = \theta_2$ and the alternative is $\theta_1 \neq \theta_2$. I calculated the likelihood ratio (shown below) and now I am trying to figure out what this likelihood ratio is a function of. I believe it is a function of $F$, but I am unsure how to show that it is $F$-distributed with $v_1 = n$ and $v_2 = m$. Thanks for the help. $$ \lambda = \frac{\left\{\dfrac{1}{2\pi\left[\left(\sum x_i^2+\sum y_i^2\right)/(n+m)\right]}\right\}^{(n+m)/2}}{\left[\dfrac{1}{2\pi\left(\sum x_i^2/n\right)}\right]^{n/2}\left[\dfrac{1}{2\pi\left(\sum y_i^2/m\right)}\right]^{m/2}} $$
Likelihood ratio interpretation
-
*I believe it is a function of $F$*... What is $F$? – 2011-12-11
-
I think you should get a monotone function of a statistic with an $F$-distribution. But you have both $n$ and $m$ in your definition of $\lambda$, whereas no $m$ appears in your statement of the problem. Could it be that you meant $Y_m$ rather than $Y_n$? – 2011-12-11
-
@DidierPiau: Given that the means are known to be $0$ (which might make this essentially a toy problem), I would think "$F$" would be $((X_1^2+\cdots+X_n^2)/n)/((Y_1^2+\cdots+Y_m^2)/m)$. That's what I would expect to get as a likelihood-ratio test statistic with an F-distribution in this scenario. – 2011-12-11
-
Here's an oddity: Wikipedia's article titled [F-test of equality of variances](http://en.wikipedia.org/wiki/F-test_of_equality_of_variances) doesn't mention that it's a [likelihood-ratio test](http://en.wikipedia.org/wiki/Likelihood_ratio_test). – 2011-12-11
-
In this test you would have $m$ and $n$ degrees of freedom. In the more usual F-test, where you have to estimate the population means, you'd have $m-1$ and $n-1$. – 2011-12-11
-
My calculation of $\lambda$ agrees with what is given here. – 2011-12-11
-
So the problem is to show that $\lambda$ is a monotone function of the sum of the squares of the $X$s over the sum of the squares of the $Y$s, and which function it is doesn't depend on the values of the $X$s or the $Y$s or the $\theta$s. – 2011-12-11
-
How would I use this likelihood ratio I derived to show this then? – 2011-12-11
-
And by a function of $F$, I meant a function of an $F$-statistic. – 2011-12-11
1 Answer
It is just algebra.
OK: the factors of $2\pi$ cancel between the numerator and denominator of your expression, so you have
$$\lambda = \frac{\left[\frac{\sum x_i^2}{n}\right]^{n/2} \left[\frac{\sum y_i^2}{m}\right]^{m/2}} {\left[\frac{\sum x_i^2 + \sum y_i^2}{m+n}\right]^{\frac{m+n}{2}}}$$
We can easily factor this
$$\lambda = \left[\frac{\frac{\sum x_i^2}{n}} {\frac{\sum x_i^2 + \sum y_i^2}{m+n}}\right]^{n/2} \left[\frac{\frac{\sum y_i^2}{m}} {\frac{\sum x_i^2 + \sum y_i^2}{m+n}}\right]^{m/2}$$
Now multiply by the appropriate power of $\frac{\sum y_i^2}{\sum y_i^2}$ to get
$$\lambda = \left[\frac{(m+n)\frac{\sum x_i^2}{\sum y_i^2}}{n\left(1+\frac{\sum x_i^2}{\sum y_i^2}\right)}\right]^{n/2} \left[\frac{(m+n)}{m\left(1+\frac{\sum x_i^2}{\sum y_i^2}\right)}\right]^{m/2}$$
Now we know that $\frac{m}{n}\frac{\sum x_i^2}{\sum y_i^2}$ has an $F_{n,m}$ distribution (it is the ratio of two independent $\chi^2$ random variables, each divided by its degrees of freedom), so we can write $\lambda$ as $$\lambda = \left[\frac{\frac{(n+m)n}{m}F_{n,m}}{ \frac{n^2}{m}\left(F_{n,m}+\frac{m}{n}\right) }\right]^{n/2}\left[\frac{m+n}{n\left(F_{n,m}+\frac{m}{n}\right)}\right]^{m/2}$$
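If I have not slipped, the common factors of $n$ and $m$ cancel and this tidies up to $$\lambda = \left[\frac{(n+m)F_{n,m}}{nF_{n,m}+m}\right]^{n/2}\left[\frac{n+m}{nF_{n,m}+m}\right]^{m/2} = \frac{(n+m)^{(n+m)/2}\,F_{n,m}^{\,n/2}}{\bigl(nF_{n,m}+m\bigr)^{(n+m)/2}}.$$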
Now this has $\lambda$ as a function of $F$. Check my algebra.
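If it helps, here is a quick numerical check of the algebra (just a sketch: the sample sizes, seed, and scales below are arbitrary, and it only verifies that the two expressions for $\lambda$ agree on one made-up data set):

```python
import numpy as np

# Arbitrary sample sizes and made-up data, purely to exercise the formulas.
rng = np.random.default_rng(0)
n, m = 7, 11
x = rng.normal(0.0, 2.0, size=n)
y = rng.normal(0.0, 3.0, size=m)

sx = np.sum(x**2)   # sum of x_i^2
sy = np.sum(y**2)   # sum of y_i^2

# lambda computed directly from the sums of squares (the form at the top of the answer)
lam_direct = (sx / n) ** (n / 2) * (sy / m) ** (m / 2) \
             / ((sx + sy) / (n + m)) ** ((n + m) / 2)

# the same quantity written through F = (sum x_i^2 / n) / (sum y_i^2 / m)
F = (sx / n) / (sy / m)
lam_via_F = ((n + m) * F / (n * F + m)) ** (n / 2) \
            * ((n + m) / (n * F + m)) ** (m / 2)

print(lam_direct, lam_via_F)             # both should print the same number, and be <= 1
assert np.isclose(lam_direct, lam_via_F)
```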
To be useful it should be a monotone function of $F$. Whether it is monotone is not immediately clear to me.
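One way to check, using the tidied form above (again, modulo my algebra): taking logs and differentiating in $F$ gives $$\frac{d}{dF}\log\lambda=\frac{n}{2F}-\frac{n(n+m)}{2(nF+m)}=\frac{nm\,(1-F)}{2F\,(nF+m)},$$ which is positive for $F<1$ and negative for $F>1$. So $\lambda$ is not monotone in $F$: it rises to its maximum value $\lambda=1$ at $F=1$ and falls off on both sides, and the rejection region $\lambda\le c$ corresponds to $F$ lying outside an interval around $1$, i.e. the usual two-sided $F$-test.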
-
+1. But just to pick a couple of nits with "Now we know that ...... has a $F_{n,m}$ distribution (it is a ratio of $\chi^2$ distributions)": it should be a ratio of random variables with chi-square distributions, not a "ratio of chi-square distributions". Also, they should be independent. (In this case they are, but merely saying it's a ratio of r.v.s with chi-square distributions is not enough to justify the conclusion.) – 2011-12-12
-
@MichaelHardy Thanks. You are of course correct. – 2011-12-12