
"Consider the standard model:

$ Y = X \beta + \epsilon $

where $X$ is the $n \times p$ design matrix of rank $p$, $\beta$ is the vector of $p$ unknown parameters, and $\epsilon$ is a random vector whose components are independent normal random variables, each with mean $0$ and variance $\sigma^2$. For the least squares estimator $\hat{\beta} = (X^T X)^{-1} X^T Y$ of $\beta$, denote the vector of residuals by $r = Y - X \hat{\beta}$. Show that the residual sum of squares satisfies:

$ r^Tr = Y^TY - Y^TX \hat{\beta}. $"

How do I go about doing this? I managed to prove that $X^T r = 0$, but I'm not sure what to do from there.

1 Answer


We have \begin{align*} r^\top r &= Y^\top Y - Y^\top X\hat \beta - \hat\beta^\top X^\top r. \end{align*} Now \begin{align*} X^\top r &= X^\top Y - X^\top X \hat \beta\\ &= X^\top Y - X^\top X (X^\top X)^{-1} X^\top Y\\ &= X^\top Y - X^\top Y\\ &= 0. \end{align*} Hence from the above \begin{align*} r^\top r &= Y^\top Y - Y^\top X\hat \beta - \hat\beta^\top X^\top r = Y^\top Y - Y^\top X\hat \beta. \end{align*}
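As a sanity check, here is a small numerical sketch (using NumPy on made-up data; the dimensions and seed are arbitrary) confirming that $X^\top r = 0$ and hence that $r^\top r = Y^\top Y - Y^\top X\hat\beta$:

```python
import numpy as np

# Hypothetical small example: n = 6 observations, p = 2 predictors.
rng = np.random.default_rng(0)
n, p = 6, 2
X = rng.normal(size=(n, p))          # almost surely full column rank
beta = np.array([1.5, -2.0])
Y = X @ beta + rng.normal(size=n)    # Y = X beta + eps

# Least squares estimator: solve the normal equations X^T X beta_hat = X^T Y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
r = Y - X @ beta_hat                 # residual vector

# The normal equations say X^T r = 0, so the cross term vanishes:
print(np.allclose(X.T @ r, 0))                       # True
print(np.allclose(r @ r, Y @ Y - Y @ X @ beta_hat))  # True
```

The check holds for any $X$ of full column rank, since $X^\top r = 0$ is exactly the normal equations.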

  • We have $r = Y - X\hat \beta$, so $r^\top = Y^\top -\hat\beta^\top X^\top$ and hence $r^\top r = Y^\top (Y - X\hat\beta) - \hat\beta^\top X^\top r$. (2012-10-28)