
Let $f(x)$ denote the pdf of a $\chi^2$-distribution with $n\in\mathbb{N}$ degrees of freedom, given by $f(x) = \frac{2^{-n/2}}{\Gamma(n/2)}\cdot x^{n/2-1}\cdot\mathrm e^{-x/2}\cdot\textbf{1}_{[0,\infty)}(x),$ where $\textbf{1}_A(x)=\begin{cases}1,&x\in A,\\0,&\text{else.}\end{cases}$

Furthermore, we may use $\Gamma(1/2)=\sqrt{\pi},\;\Gamma(1)=1$ and the recursion $\Gamma(r+1)=r\cdot\Gamma(r)$.
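As a quick sanity check (my own sketch, not part of the exercise; the helper names `gamma_half` and `chi2_pdf` are mine), the pdf above together with the $\Gamma$ recursion can be verified numerically against `scipy.stats.chi2`:

```python
# Sketch: implement the chi^2_n pdf exactly as defined above, computing
# Gamma(n/2) only from Gamma(1/2) = sqrt(pi), Gamma(1) = 1 and the recursion
# Gamma(r+1) = r * Gamma(r), then compare against scipy's reference pdf.
import math
import numpy as np
from scipy import stats

def gamma_half(r):
    """Gamma(r) for positive half-integer r, via the recursion above."""
    if r == 0.5:
        return math.sqrt(math.pi)
    if r == 1.0:
        return 1.0
    return (r - 1.0) * gamma_half(r - 1.0)

def chi2_pdf(x, n):
    """pdf of the chi^2 distribution with n degrees of freedom."""
    x = np.asarray(x, dtype=float)
    return (2.0 ** (-n / 2) / gamma_half(n / 2)) * x ** (n / 2 - 1) * np.exp(-x / 2)

xs = np.linspace(0.1, 10.0, 50)  # x > 0, so the indicator is identically 1
for n in (1, 2, 3, 4):
    assert np.allclose(chi2_pdf(xs, n), stats.chi2.pdf(xs, df=n))
```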

Assume we have two independent random variables $X_1,X_2\sim\mathcal{N}(\mu,\sigma^2)$ with unknown $\mu$ and unknown $\sigma$, and their sample variance $S_X^2=\frac{1}{n-1}\sum\limits_{i=1}^n(X_i-\overline{X})^2$ with $\overline{X}=\frac{1}{n}\sum\limits_{i=1}^nX_i$ (so here $n=2$).

Show that $\frac{1}{\sigma^2}S_X^2=\frac{(X_1-\overline{X})^2+(X_2-\overline{X})^2}{\sigma^2}$ is $\chi^2$ distributed with one degree of freedom.
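Before proving it, the claim can also be checked by simulation (a sketch with arbitrary choices of $\mu$ and $\sigma$; nothing here is part of the exercise):

```python
# Sketch: draw many pairs (X_1, X_2), form S_X^2 / sigma^2 with n = 2, and
# test the result against chi^2_1 with a Kolmogorov-Smirnov test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mu, sigma, reps = 3.0, 2.0, 100_000          # arbitrary parameter choices

x = rng.normal(mu, sigma, size=(reps, 2))    # X_1, X_2 i.i.d. N(mu, sigma^2)
s2 = np.var(x, axis=1, ddof=1)               # S_X^2 with the 1/(n-1) factor
res = stats.kstest(s2 / sigma**2, "chi2", args=(1,))
print(res.statistic, res.pvalue)             # large p-value: consistent with chi^2_1
```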

To be honest, I have no idea at all how to start, because this huge amount of information intimidates me. Can anyone explain to me an appropriate ansatz to prove this?

  • In your case ($n = 2$) the two terms in the numerator are the same. (2012-07-10)

1 Answer


First note that $X' = \frac{X_1 - \overline{X}}{\sigma}$ and $Y' = \frac{X_2 - \overline{X}}{\sigma}$ are $\mathcal{N}(0,\tfrac{1}{2})$-distributed - just take expectations and variances, using $X_1-\overline{X} = \frac{X_1-X_2}{2}$; moreover $Y' = -X'$, so the two summands in your expression are equal. Write $\psi_{\mu,\sigma^2}$ for the Gaussian density. Since $x \mapsto \sqrt{x}$ is invertible and differentiable on $(0,\infty)$, by symmetry $\mathbb{P} (X'^2 \le t ) = 2\displaystyle\int_{0}^{\sqrt{t}} \psi_{0,1/2}(x)\,dx$ for $t > 0$. Hence the density of $X'^2$ can be computed using the chain rule: $2\psi_{0,1/2}(\sqrt{x}) \cdot \frac{1}{2\sqrt{x}} = \frac{1}{\sqrt{\pi}} e^{-x} x^{-1/2} = \gamma_{1, 1/2} (x),$ where $\gamma_{\beta,\alpha}(x) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} x^{\alpha-1} e^{-\beta x}$ is the gamma density with rate $\beta$ and shape $\alpha$. Now, because $Y'^2 = X'^2$, your random variable equals $2X'^2$, and rescaling a gamma variable by $2$ halves its rate - so $2X'^2 \sim \gamma_{1/2, 1/2} = \chi_1^2.$ (The convolution fact that the sum of independent $\Gamma_{\beta,\alpha_1}$- and $\Gamma_{\beta,\alpha_2}$-distributed random variables is $\Gamma_{\beta,\alpha_1+\alpha_2}$-distributed is a good exercise, but it does not apply here: the two summands coincide, so they are certainly not independent.)
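Each step can be confirmed numerically (my own sketch, with arbitrary parameter choices): $X'$ should have variance $1/2$, $X'^2$ should follow the gamma density with rate $1$ and shape $1/2$, and $2X'^2$ should follow $\chi_1^2$:

```python
# Sketch: check the three steps of the argument above by simulation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mu, sigma, reps = 3.0, 2.0, 100_000            # arbitrary parameter choices

x = rng.normal(mu, sigma, size=(reps, 2))
xprime = (x[:, 0] - x.mean(axis=1)) / sigma    # X' = (X_1 - Xbar) / sigma

print(xprime.var())                            # approx 1/2
# scipy's gamma takes shape a and scale = 1/rate (defaults loc=0, scale=1):
print(stats.kstest(xprime**2, "gamma", args=(0.5,)).pvalue)       # rate 1, shape 1/2
print(stats.kstest(2 * xprime**2, "chi2", args=(1,)).pvalue)      # chi^2 with 1 df
```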

  • Yikes - bad things happen when you carelessly evaluate variances. Thanks for the catch. (2012-07-10)