
Possible Duplicate:
Finding $E\Bigl(\overline{Y^2}\Bigm|\overline{Y\vphantom{Y^2}}\Bigr)$ by Basu's theorem?

Suppose $X_1,\ldots,X_n$ is a random sample from $N(\theta,1)$. If $\overline{X^2}=\displaystyle\frac{1}{n}\sum_{i=1}^n X_i^2$, how can I find $E\bigl(\overline{X^2}\bigm|\overline{X}\bigr)$?


1 Answer


It's late, so I'll just do the case $\theta = 0$. Then $X = (X_1,\ldots,X_n)^T$ has a multivariate normal distribution with mean $0$ and covariance matrix $I$. Let $U$ be an $n \times n$ orthogonal matrix whose first row is $(1,\ldots,1)/\sqrt{n}$. Then $W = UX$ also has a multivariate normal distribution with mean $0$ and covariance matrix $I$. Note that $W_1 = \frac{1}{\sqrt{n}} \sum_{i=1}^n X_i = \sqrt{n}\,\overline{X}$, while
$$\overline{X^2} = \frac{1}{n}\sum_{i=1}^n X_i^2 = \frac{1}{n} X^T X = \frac{1}{n} W^T W = \frac{1}{n}\sum_{i=1}^n W_i^2.$$
Since the $W_i$ are independent and $E[W_i^2] = 1$ for $i \ge 2$, conditioning on $\overline{X}$ is the same as conditioning on $W_1$, so
$$E\bigl[\overline{X^2} \bigm| \overline{X}\bigr] = \frac{1}{n} E\left[\sum_{i=1}^n W_i^2 \Bigm| W_1\right] = \frac{1}{n} W_1^2 + \frac{n-1}{n}\,E[W_i^2] = \overline{X}^2 + \frac{n-1}{n}.$$
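As a quick sanity check of the formula (a sketch, not part of the derivation): since $E\bigl[\overline{X^2}\bigm|\overline{X}\bigr] = \overline{X}^2 + \frac{n-1}{n}$, the residual $\overline{X^2} - \overline{X}^2$ should average $\frac{n-1}{n}$ over many simulated samples. A minimal Monte Carlo check for $\theta = 0$, with an arbitrarily chosen $n = 5$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 200_000

# reps independent samples of size n from N(0, 1)
X = rng.standard_normal((reps, n))
xbar = X.mean(axis=1)          # sample mean of each sample
x2bar = (X ** 2).mean(axis=1)  # mean of squares of each sample

# By the answer, E[x2bar | xbar] = xbar^2 + (n-1)/n,
# so the residual should average (n-1)/n = 0.8 for n = 5.
print((x2bar - xbar ** 2).mean())  # ≈ 0.8
```

Note that the residual $\overline{X^2} - \overline{X}^2 = \frac{1}{n}\sum_i (X_i - \overline{X})^2$ doesn't depend on $\theta$, which is why the same formula holds for general $\theta$ (as the linked Basu's-theorem question suggests).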