3

I have seen the textbook derivation where independence is established by factoring the joint distribution. But has anyone tried to prove directly that the covariance is zero?

Let $Z_{1}, \ldots, Z_{n}$ be independent draws from a standard normal distribution.

Let $X = \bar{Z}$ be the sample mean and let $$Y = \frac{1}{n-1}\sum_i (Z_{i} -\bar{Z})^2$$ be the sample variance.

Prove that $\operatorname{Cov}(X,Y) = 0$.

I am getting terms like $E(\bar{Z}^3)$ in the expansion, which makes it very cumbersome to handle...
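
Concretely, since $E(\bar{Z}) = 0$, the expansion looks like
$$ \operatorname{Cov}(X,Y) = E(\bar{Z}\,Y) = \frac{1}{n-1}\sum_i E\left(\bar{Z} Z_i^2 - 2\bar{Z}^2 Z_i + \bar{Z}^3\right), $$
which is where third-moment terms such as $E(\bar{Z}^3)$ come from.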

  • 0
    By symmetry of an odd function, $E\left[\bar{Z}^3\right] = 0$ for a standard normal distribution.2012-02-19
  • 0
    It happens that the simplest (and most illuminating) way to prove that $X$ and $Y$ are uncorrelated is to show that they are independent...2012-02-19
  • 0
^ Yes, I understand... but it turned out to be very convenient that the joint density/MGF could be obtained and factored for the normal distribution.2012-02-19
  • 2
    The title of your question "Independence of sample mean and variance through covariance" concerns me a bit. Are you trying to establish that $X$ and $Y$ are independent by showing that their covariance is zero?2012-02-20
  • 0
^ I was... but I realize that's the wrong approach.2012-02-20
  • 0
For the more general case (without assuming Normality), see: http://math.stackexchange.com/questions/942810/ 2014-09-30

2 Answers

1

First show that $$ \operatorname{cov} (\bar{Z}, Z_i - \bar{Z}) = 0. $$ These two random variables are not just normally distributed, but jointly normally distributed, since both are linear combinations of the same sequence of independent normals. Being jointly normal and uncorrelated, they are therefore independent.
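
One quick way to see that this covariance vanishes is bilinearity, using the independence of the $Z_j$ and $\operatorname{Var}(Z_j)=1$:
$$ \operatorname{cov}(\bar{Z}, Z_i - \bar{Z}) = \operatorname{cov}(\bar{Z}, Z_i) - \operatorname{Var}(\bar{Z}) = \frac{1}{n} - \frac{1}{n} = 0. $$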

Hence the sample variance, which is a function of $Z_i-\bar{Z}$ for $i=1,\ldots,n$, is independent of the sample mean.

Later addendum: I should add that the whole vector $(Z_1- \bar Z,\ldots,Z_n-\bar Z)$ is independent of $\bar Z$, since the covariance between $\bar Z$ and that vector is a matrix whose every entry is $0$ and we have joint normality. If $(Z_1- \bar Z,\ldots,Z_n-\bar Z)$ is independent of $\bar Z$, then $(Z_1- \bar Z)^2+\cdots+(Z_n-\bar Z)^2$ is independent of $\bar Z$.
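
In matrix form (writing $\mathbf{1}$ for the all-ones column vector and $I_n$ for the identity, notation introduced only for this remark), the joint normality is immediate because the whole tuple is a single linear image of $(Z_1,\ldots,Z_n)$:
$$ \begin{pmatrix} Z_1-\bar Z \\ \vdots \\ Z_n-\bar Z \\ \bar Z \end{pmatrix} = \begin{pmatrix} I_n - \tfrac{1}{n}\mathbf{1}\mathbf{1}^{\mathsf T} \\ \tfrac{1}{n}\mathbf{1}^{\mathsf T} \end{pmatrix} \begin{pmatrix} Z_1 \\ \vdots \\ Z_n \end{pmatrix}. $$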

  • 0
Sorry, but $\bar Z$ and $Z_i-\bar Z$ being independent for every $i$ (as you recall) does NOT imply that a function of $(Z_i-\bar Z)_i$ (such as $Y$) is independent of $\bar Z$.2012-02-21
  • 0
    I've added a comment that it's the whole vector whose entries are $Z_i - \bar Z$ that is independent of $\bar Z$. That is enough to justify the conclusion.2012-02-23
  • 0
Why are the vector $(Z_{1}-\bar{Z}, \dots, Z_{n} - \bar{Z})$ and $\bar{Z}$ independent?2018-09-18
  • 0
    @openspace : First note that the covariance between $\overline Z$ and $Z_i-\overline Z$ is $0.$ Then observe that $\big(Z_1-\overline Z, \ldots, Z_n-\overline Z, \overline Z\,\big)$ is a tuple of JOINTLY normally distributed random variables. $\qquad$2018-09-18
  • 0
@MichaelHardy Jointly normal because of their independence?2018-09-18
  • 0
    @openspace : You have $$Z_1-\overline Z = \left( 1 - \frac 1 n\right) Z_1 - \frac 1 n Z_2 - \cdots - \frac 1 n Z_n. $$ Thus this is a linear combination of $Z_1,\ldots, Z_n$ with constant (i.e. non-random) coefficients. And the same is true of $Z_i-\overline Z$ for $i=1,\ldots,n.$ Every tuple of linear combinations, with constant coefficients, of independent normally distributed random variables $Z_1,\ldots,Z_n$ has a multivariate normal distribution, i.e. its components are jointly normally distributed.2018-09-18
5

Consider the random vectors $Z=(Z_i)_{1\leqslant i\leqslant n}$ and $-Z=(-Z_i)_{1\leqslant i\leqslant n}$. Then $X=\xi(Z)$ for a certain odd function $\xi$ and $Y=\eta(Z)$ for a certain even function $\eta$, hence the product $\zeta=\xi\cdot\eta$ is odd as well; that is, $\xi(-Z)=-\xi(Z)$ and $\zeta(-Z)=-\zeta(Z)$. Now, $Z$ and $-Z$ follow the same distribution (here because the $Z_i$ are independent with a distribution symmetric about $0$), hence $\xi$ and $\zeta$ being odd functions yields $\mathrm E(\xi(Z))=\mathrm E(\zeta(Z))=0$. In particular, the covariance of $X$ and $Y$ is $\mathrm E(\zeta(Z))-\mathrm E(\xi(Z))\mathrm E(\eta(Z))=0$.
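
To be explicit about the two functions (writing $\bar z=\frac{1}{n}\sum_i z_i$), one can take
$$ \xi(z_1,\ldots,z_n) = \bar z, \qquad \eta(z_1,\ldots,z_n) = \frac{1}{n-1}\sum_i (z_i-\bar z)^2; $$
flipping the sign of every $z_i$ sends $\bar z$ to $-\bar z$ and each $z_i-\bar z$ to $-(z_i-\bar z)$, so $\xi$ is indeed odd and $\eta$ is indeed even.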

To sum up, the result you are asking for a simple proof of (that the empirical mean and empirical variance are uncorrelated) has nothing to do with Gaussianity, since it holds for every symmetric distribution (provided the relevant moments are finite, so that the covariance exists).

  • 0
    Thanks for the detailed explanation... I understand what you say now but I accepted Michael's answer as it was more direct2012-02-20
  • 2
    +1, especially because it is not necessary to assume the $Z_i$ are independent or uncorrelated, but I have a suggestion and a minor question. Suggestion: it is not obvious why $Z$ and $-Z$ follow the same distribution, and so perhaps the statement about symmetric distribution should be moved to the beginning. Minor question: The expectation of an odd function of a random variable might be undefined rather than being $0$. So, should the proof be expanded to show that for _these_ particular odd functions the expectations are zero and so the empirical mean and variance are uncorrelated?2012-02-20