"Consider the standard model:
$ Y = X \beta + \epsilon $
where $X$ is an $n \times p$ matrix of rank $p$, $\beta$ is a vector of $p$ unknown parameters, and $\epsilon$ is a random vector whose components are independent normal random variables, each with mean $0$ and variance $\sigma^2$. For the least squares estimator $\hat{\beta} = (X^T X)^{-1} X^T Y$ of $\beta$, denote the vector of residuals by $r = Y - X \hat{\beta}$. Show that the residual sum of squares satisfies:
$ r^Tr = Y^TY - Y^TX \hat{\beta}. $""
How do I go about proving this? I managed to show that $X^T r = 0$, but I'm not really sure how to proceed from there.
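
For what it's worth, here is a minimal numerical sanity check (assuming numpy, with randomly generated data) that the identity does hold, though of course this isn't a proof:

```python
import numpy as np

# Sanity check of r^T r = Y^T Y - Y^T X beta_hat on random data.
rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.standard_normal((n, p))            # design matrix (rank p with prob. 1)
beta = rng.standard_normal(p)              # true parameter vector
Y = X @ beta + rng.standard_normal(n)      # response with N(0, 1) noise

beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)  # least squares estimator
r = Y - X @ beta_hat                          # residual vector

print(r @ r)                     # residual sum of squares
print(Y @ Y - Y @ (X @ beta_hat))  # right-hand side of the identity
# The two values agree up to floating-point error.
```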