
I've been trying to establish that the sample mean and the sample variance are independent. One approach is to write the sample variance $S^{2}$ as a function of $A = \left\{ X_{2}-\bar{X},X_{3}-\bar{X},\cdots,X_{n}-\bar{X}\right\}$ alone. Then we proceed by showing that $A$ and $\bar{X}$ are independent (which I'm unable to do), which in turn implies the independence of $S^{2}$ and $\bar{X}.$

I would appreciate it if the good people of M.SE would help guide me in the right direction.
Thanks.

Edit: The random sample $X_1,\cdots,X_n$ is drawn from an $N(\mu, \sigma^2)$ distribution.

  • @Gortaur: No, I do not. Could you please show me how? Thanks. – 2011-06-24

2 Answers


I guess this is probably a little late, but this result is immediate from Basu's Theorem, provided that you are willing to accept that the family of normal distributions with known variance is complete. To apply Basu, fix $\sigma^2$ and consider the family of $N(\mu, \sigma^2)$ for $\mu \in \mathbb R$. Then $\frac{(n - 1)S^2}{\sigma^2} \sim \chi^2_{n - 1}$ so $S^2$ is ancillary, while $\bar X$ is complete sufficient, and hence they are independent for all $\mu$ and our fixed $\sigma^2$. Since $\sigma^2$ was arbitrary, this completes the proof.
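As a quick numerical illustration of the ancillarity claim (a sketch, not part of the proof — the sample size, number of replications, and seed below are arbitrary choices), one can simulate $S^2$ under two different values of $\mu$ and observe that its distribution does not move:

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma2, reps = 10, 1.0, 200_000

def sample_s2(mu):
    # Draw `reps` samples of size n from N(mu, sigma2); return S^2 for each.
    x = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
    return x.var(axis=1, ddof=1)

s2_a = sample_s2(mu=0.0)
s2_b = sample_s2(mu=5.0)

# E[S^2] = sigma^2 regardless of mu, consistent with S^2 being ancillary.
print(s2_a.mean(), s2_b.mean())  # both close to 1.0
```

Both empirical means sit near $\sigma^2 = 1$ even though $\mu$ differs by 5, as the ancillarity of $S^2$ predicts.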

This can also be shown directly without too much hassle. One can find the joint pdf of $(A, \bar X)$ directly by making a suitable transformation to the joint pdf of $(X_1,\cdots, X_n)$. The joint pdf of $(A, \bar X)$ factors as required, which gives independence. To see this quickly, without actually doing the transformation, skipping some algebra we may write

$f(x_1, x_2, \ldots, x_n) = (2\pi \sigma^2)^{-n/2} \exp\left\{-\frac{\sum_{i=1}^n(x_i - \bar x)^2}{2\sigma^2}\right\} \exp\left\{-\frac{n(\bar x - \mu)^2}{2\sigma^2}\right\}$

and we can see that everything except the last term depends only on $(x_2 - \bar x, x_3 - \bar x, \ldots, x_n - \bar x)$ (note that $x_1 - \bar x$ can be recovered from the other $n - 1$ deviations, since all $n$ deviations sum to zero), while the last term depends only on $\bar x$. The transformation is linear, so the Jacobian term won't screw this factorization up when we actually pass to the joint pdf of $(A, \bar X)$.
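One way to see this factorization in action numerically (this is only a sanity check of the necessary condition that $\bar X$ and $S^2$ are uncorrelated, not a proof of independence; the parameters below are arbitrary) is a short Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 5, 200_000

# Samples from N(mu, sigma^2); mu = 2, sigma = 3 are arbitrary here.
x = rng.normal(2.0, 3.0, size=(reps, n))
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)

# Independence implies zero correlation between xbar and s2.
corr = np.corrcoef(xbar, s2)[0, 1]
print(abs(corr) < 0.02)  # True (up to Monte Carlo error)
```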

  • @guy: No, it's never late. Thanks for your nice answer. I wasn't aware of Basu's theorem at that time. :) – 2011-07-10

Hint:

  1. Show that $A' = (\bar X, A)$ can be written as $A' = CX$, where $C$ is some square matrix. Deduce from this that $A'$ is jointly Gaussian.

  2. Recall that independence is equivalent to zero correlation for jointly Gaussian variables. Show that $E(A_i \bar X) = 0$. Conclude that $A$ and $\bar X$ are independent.
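The two steps of the hint can be sketched concretely (taking $\sigma^2 = 1$ and an arbitrary $n$; the matrix construction below is one natural choice, not the only one): build $C$ explicitly so that $CX = (\bar X, X_2 - \bar X, \ldots, X_n - \bar X)$, then check that the covariances of $\bar X$ with each $A_i$ vanish.

```python
import numpy as np

n = 6
I = np.eye(n)
J = np.ones((n, n)) / n  # averaging matrix: J @ x has every entry xbar

# C maps X to (xbar, X_2 - xbar, ..., X_n - xbar): the first row averages,
# the remaining rows are rows 2..n of the centering matrix I - J.
C = np.vstack([np.full(n, 1.0 / n), (I - J)[1:]])

# If Cov(X) = sigma^2 * I, then Cov(CX) = sigma^2 * C @ C.T; take sigma^2 = 1.
cov = C @ C.T

# The first row (covariances of xbar with each A_i) is zero off the diagonal,
# so xbar is uncorrelated with, hence independent of, each A_i.
print(np.allclose(cov[0, 1:], 0.0))  # True
```

The same cancellation can be read off by hand: $\operatorname{Cov}(X_i - \bar X, \bar X) = \operatorname{Cov}(X_i, \bar X) - \operatorname{Var}(\bar X) = \sigma^2/n - \sigma^2/n = 0$.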