I just discovered today that $\mathbf{1}=(1,\ldots,1)$ is an eigenvector with eigenvalue $1$ of $X(X^TX)^{-1}X^T$ for matrices $X \in \mathbb{R}^{m\times n}$ with $n \leq m$ and full column rank (so that $X^TX$ is invertible), whose first column is $\mathbf{1}$. Why is this?
The motivation for this is a statement that says the residuals of a linear least squares fit sum to $0$. In other words, given a standard linear model $y = X\beta + \epsilon$, where $\epsilon$ is a random error, the least squares estimate of $\beta$ is $\hat \beta = (X^TX)^{-1}X^Ty$ and the fitted value is $\hat y = X\hat \beta = X(X^TX)^{-1}X^Ty$. Importantly, $X$ is a design matrix and therefore has an intercept column in which every element is $1$. The statement is then $\mathbf{1}^T(y - \hat y) = 0$.
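To spell out the connection between the two statements (this step is my own rephrasing): writing $H = X(X^TX)^{-1}X^T$,

$$\mathbf{1}^T(y - \hat y) = \mathbf{1}^T(I - H)\,y \quad \text{for all } y,$$

so the residuals summing to zero for every $y$ is equivalent to $\mathbf{1}^TH = \mathbf{1}^T$, which, since $H$ is symmetric, is the same as $H\mathbf{1} = \mathbf{1}$.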
Through MATLAB, I inferred that this is because $X(X^TX)^{-1}X^T$ has $\mathbf{1}$ as an eigenvector with eigenvalue $1$, but I could not prove why this is the case.
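For reference, a minimal MATLAB sketch of the kind of check I mean (the dimensions and random Gaussian design are illustrative choices; any full-column-rank $X$ with an intercept column should behave the same way):

```matlab
% Random design matrix with an intercept (all-ones) first column;
% the Gaussian columns make X full column rank almost surely.
m = 20; n = 4;
X = [ones(m,1), randn(m, n-1)];

% Hat matrix H = X (X'X)^{-1} X', using backslash instead of inv()
H = X * ((X'*X) \ X');

% H*1 returns 1 itself: eigenvector with eigenvalue 1
disp(norm(H*ones(m,1) - ones(m,1)))   % ~1e-15, i.e. zero up to roundoff

% Consequently the residuals of any fit sum to (numerically) zero
y = randn(m,1);
disp(sum(y - H*y))                    % ~1e-15
```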