3

I have seen the textbook derivation where independence is established by factoring the joint distribution, but has anyone tried to prove that the covariance is zero?

Let $Z_{1},\ldots,Z_{n}$ be i.i.d. from a standard normal distribution.

Let $X = \bar{Z}$ be the sample mean and let $Y = \frac{1}{n-1}\sum_{i=1}^{n} (Z_{i} -\bar{Z})^2$ be the sample variance.

Prove that $\operatorname{Cov}(X,Y) = 0$.

I am getting terms like $\operatorname{E}(\bar{Z}^3)$ in the expansion, which makes it very cumbersome to handle...
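As a numerical sanity check (not a proof; a minimal sketch assuming `numpy` is available), here is a quick Monte Carlo estimate of $\operatorname{Cov}(X,Y)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 10, 200_000

# reps independent samples of size n from the standard normal
Z = rng.standard_normal((reps, n))
X = Z.mean(axis=1)          # sample mean of each row
Y = Z.var(axis=1, ddof=1)   # sample variance of each row (divides by n - 1)

# Monte Carlo estimate of Cov(X, Y); should sit near 0 up to simulation noise
print(np.cov(X, Y)[0, 1])
```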

  • 0
    For the more general case (without assuming normality), see: http://math.stackexchange.com/questions/942810/ (2014-09-30)

2 Answers

1

First show that $ \operatorname{cov} (\bar{Z}, Z_i - \bar{Z}) = 0. $ These two random variables are not just normally distributed, but jointly normally distributed, since both are linear combinations of the same sequence of independent normals. Jointly normal random variables are independent if and only if they are uncorrelated, so these two are independent.
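For the first step, the computation is short: using independence of the $Z_j$ and $\operatorname{Var}(Z_j) = 1$, we get $\operatorname{cov}(\bar Z, Z_i) = \frac{1}{n}\sum_{j=1}^n \operatorname{cov}(Z_j, Z_i) = \frac{1}{n}$ and $\operatorname{Var}(\bar Z) = \frac{1}{n}$, hence
$$\operatorname{cov}(\bar Z, Z_i - \bar Z) = \operatorname{cov}(\bar Z, Z_i) - \operatorname{Var}(\bar Z) = \frac{1}{n} - \frac{1}{n} = 0.$$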

Hence the sample variance, which is a function of $Z_i-\bar{Z}$ for $i=1,\ldots,n$, is independent of the sample mean.

Later addendum: I should add that the whole vector $(Z_1- \bar Z,\ldots,Z_n-\bar Z)$ is independent of $\bar Z$, since the covariance between $\bar Z$ and that vector is a matrix whose every entry is $0$ and we have joint normality. If $(Z_1- \bar Z,\ldots,Z_n-\bar Z)$ is independent of $\bar Z$, then $(Z_1- \bar Z)^2+\cdots+(Z_n-\bar Z)^2$ is independent of $\bar Z$.
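In matrix form the same point reads: with $\mathbf 1 = (1,\ldots,1)^\top$ and $Z = (Z_1,\ldots,Z_n)^\top$,
$$\begin{pmatrix} \bar Z \\ Z_1 - \bar Z \\ \vdots \\ Z_n - \bar Z \end{pmatrix} = \begin{pmatrix} \tfrac{1}{n} \mathbf 1^\top \\ I_n - \tfrac{1}{n} \mathbf 1 \mathbf 1^\top \end{pmatrix} Z,$$
a fixed linear map applied to the Gaussian vector $Z$, so the joint normality used above is automatic.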

  • 0
    @openspace : You have $Z_1-\overline Z = \left( 1 - \frac 1 n\right) Z_1 - \frac 1 n Z_2 - \cdots - \frac 1 n Z_n. $ Thus this is a linear combination of $Z_1,\ldots, Z_n$ with constant (i.e. non-random) coefficients, and the same is true of $Z_i-\overline Z$ for $i=1,\ldots,n.$ Every tuple of linear combinations, with constant coefficients, of independent normally distributed random variables $Z_1,\ldots,Z_n$ has a multivariate normal distribution, i.e. its components are jointly normally distributed. (2018-09-18)
5

Consider the random vectors $Z=(Z_i)_{1\leqslant i\leqslant n}$ and $-Z=(-Z_i)_{1\leqslant i\leqslant n}$. Then $X=\xi(Z)$ for an odd function $\xi$ and $Y=\eta(Z)$ for an even function $\eta$. The function $\zeta=\xi\cdot\eta$ is odd as well; that is, $\xi(-Z)=-\xi(Z)$ and $\zeta(-Z)=-\zeta(Z)$. Now, $Z$ and $-Z$ follow the same distribution, hence $\xi$ and $\zeta$ being odd functions yields $\mathrm E(\xi(Z))=\mathrm E(\zeta(Z))=0$. In particular, the covariance of $X$ and $Y$ is $\mathrm E(\zeta(Z))-\mathrm E(\xi(Z))\,\mathrm E(\eta(Z))=0$.
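To spell out the symmetry step: since $Z$ and $-Z$ have the same distribution and $\xi$ is odd,
$$\mathrm E(\xi(Z)) = \mathrm E(\xi(-Z)) = \mathrm E(-\xi(Z)) = -\mathrm E(\xi(Z)),$$
which forces $\mathrm E(\xi(Z)) = 0$; the same argument applies to $\zeta$. (Both expectations exist here because $Z$ is Gaussian.)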

To sum up, the result you are asking for a simple proof of (that the empirical mean and the empirical variance are uncorrelated) has nothing to do with Gaussianity: it holds for every symmetric distribution.
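A quick simulation supports this (a sketch, assuming `numpy`; the Laplace distribution here stands in for any symmetric, non-Gaussian law):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 10, 200_000

# Laplace is symmetric about 0 but not normal
Z = rng.laplace(size=(reps, n))
X = Z.mean(axis=1)          # empirical means
Y = Z.var(axis=1, ddof=1)   # empirical variances

# The covariance still vanishes (up to noise), even though X and Y
# are no longer independent outside the Gaussian case
print(np.cov(X, Y)[0, 1])
```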

  • 2
    +1, especially because it is not necessary to assume the $Z_i$ are independent or uncorrelated, but I have a suggestion and a minor question. Suggestion: it is not obvious why $Z$ and $-Z$ follow the same distribution, so perhaps the statement about symmetric distributions should be moved to the beginning. Minor question: the expectation of an odd function of a random variable might be undefined rather than $0$. So, should the proof be expanded to show that for _these_ particular odd functions the expectations are zero, and hence that the empirical mean and variance are uncorrelated? (2012-02-20)