
So given that the least squares estimator of $\beta$ is:

$$ \mathbf{\hat{\beta}} = (\mathbf{X}^T \mathbf{X})^{-1}\mathbf{X}^T \mathbf{Y} $$

And $\mathbf{Y} = \mathbf{X} \mathbf{\beta} + \epsilon$, where $\epsilon$ is a vector of independent zero-mean normals all with the same variance $\sigma^2$.
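As a quick sanity check on the estimator formula (a sketch with made-up data, not part of the original question), one can compute $\hat\beta = (X^TX)^{-1}X^TY$ directly in NumPy and confirm it matches NumPy's built-in least-squares solver:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3                       # illustrative sizes
X = rng.normal(size=(n, p))         # design matrix
beta = np.array([1.0, -2.0, 0.5])   # true coefficients (arbitrary)
sigma = 0.3
y = X @ beta + rng.normal(scale=sigma, size=n)

# Least squares estimator: solve (X^T X) beta_hat = X^T y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Should agree with NumPy's built-in least-squares routine
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Using `np.linalg.solve` rather than explicitly inverting $X^TX$ is the numerically preferred way to evaluate the formula.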

What is the covariance matrix? I've done this before, but I decided to attempt this differently this time.

Here is my attempt. According to the solutions, the answer should be $(X^T X)^{-1}\sigma^2$, but I am not getting that:

[Image of the poster's attempted derivation]

Does anyone know where my mistake is?

2 Answers


Seems like you have some stuff backwards.

Remember that $\mathrm{Cov}(\hat\beta)$ should be $p\times p,$ so if you're using the convention where $\beta$ is $p\times 1$ and $Y$ is $n\times 1,$ you want to take $$ \mathrm{Cov}(\hat\beta)=E(\hat\beta\hat\beta^T) -E(\hat\beta)E(\hat\beta)^T.$$

Then you have $$ \hat\beta\hat\beta^T = ((X^TX)^{-1}X^T)YY^T(X(X^TX)^{-1}),$$ and after expanding $YY^T=(X\beta+\epsilon)(X\beta+\epsilon)^T$, the cross terms vanish in expectation (since $E(\epsilon)=0$), while the $X\beta\beta^TX^T$ term cancels against $E(\hat\beta)E(\hat\beta)^T = \beta\beta^T$. The remaining term is $$((X^TX)^{-1}X^T)E(\epsilon\epsilon^T)(X(X^TX)^{-1}) = ((X^TX)^{-1}X^T)\sigma^2I(X(X^TX)^{-1}) = \sigma^2(X^TX)^{-1}.$$


Direct approach: Recall that for a random vector $y$ and a constant matrix $A$, $$ \operatorname{Var}(Ay)=A\operatorname{Var}(y)A', $$ thus \begin{align} \operatorname{Var}(\hat{\beta}\mid X)&=\operatorname{Var}\big((X'X)^{-1}X'y\mid X\big)\\ &=(X'X)^{-1}X'\operatorname{Var}(y\mid X)\,X(X'X)^{-1}\\ &=(X'X)^{-1}X'\,\sigma^2 I\,X(X'X)^{-1}=\sigma^2(X'X)^{-1}. \end{align} Or \begin{align} \operatorname{Var}(\hat{\beta}\mid X)&=E\big((\hat{\beta}-\beta)(\hat{\beta}-\beta)'\mid X\big)\\ &=E\big[(X'X)^{-1}X'\epsilon\,\big((X'X)^{-1}X'\epsilon\big)'\mid X\big]\\ &=(X'X)^{-1}X'E(\epsilon\epsilon'\mid X)\,X(X'X)^{-1}\\ &=\sigma^2(X'X)^{-1}. \end{align}
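The result $\operatorname{Var}(\hat\beta\mid X)=\sigma^2(X'X)^{-1}$ can also be checked numerically. The following sketch (with an arbitrary fixed design and assumed parameter values) refits $\hat\beta$ over many independent noise draws and compares the empirical covariance of the estimates with the theoretical formula:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, sigma = 50, 2, 1.0
X = rng.normal(size=(n, p))          # fixed design, held constant across replications
beta = np.array([2.0, -1.0])         # true coefficients (arbitrary)
XtX_inv = np.linalg.inv(X.T @ X)

# Draw many independent noise vectors and recompute beta_hat each time
reps = 20000
betas = np.empty((reps, p))
for i in range(reps):
    y = X @ beta + rng.normal(scale=sigma, size=n)
    betas[i] = XtX_inv @ X.T @ y

emp_cov = np.cov(betas, rowvar=False)   # empirical Cov(beta_hat)
theory = sigma**2 * XtX_inv             # sigma^2 (X'X)^{-1}
```

With 20,000 replications the empirical covariance matrix should agree with $\sigma^2(X'X)^{-1}$ entrywise to within Monte Carlo error.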