
In $\mathbf X\mathbf X^\prime$, $\mathbf X$ is a matrix whose columns are the data points and $\mathbf X^\prime$ is its transpose. This looks like a covariance matrix, except that the mean has not been subtracted. What is the difference in meaning between subtracting the mean and not subtracting it?

2 Answers


If you subtract the mean, you get the covariance matrix (the sample covariance matrix, to be precise, if you do not take expectations). If you do not subtract the mean, you get only part of the covariance matrix, since, as @PEV wrote, to obtain the covariance matrix you still need to subtract the term $\mu\mu^T$.
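A quick numerical check of this (a minimal NumPy sketch on made-up data; the array sizes and variable names are illustrative, not from the question): the uncentred matrix $\frac{1}{n}\mathbf X\mathbf X^\prime$ and the sample covariance differ exactly by the outer product of the means.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 3 variables, 500 observations, stored column-wise
# (each column of X is one data point), as in the question.
X = rng.normal(loc=2.0, scale=1.5, size=(3, 500))
n = X.shape[1]

mu = X.mean(axis=1, keepdims=True)          # column vector of variable means

second_moment = X @ X.T / n                 # (1/n) * X X'  -- no mean subtracted
sample_cov    = (X - mu) @ (X - mu).T / n   # (1/n) sample covariance

# The two differ exactly by the outer product of the means, mu mu'
print(np.allclose(second_moment - mu @ mu.T, sample_cov))   # True
```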

Confusion may arise because, in the regression setting, if you subtract the mean from both the dependent variable and the independent variables, the least squares estimates of the coefficients do not change, provided the original regression includes an intercept.
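A similar sketch (again with hypothetical data and a helper function of my own naming) illustrates the regression point: with an intercept in the model, centring the variables leaves the slope estimates unchanged.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical regression: y = 1.0 + 2.0*z1 - 3.0*z2 + noise
n = 200
Z = rng.normal(size=(n, 2))                                 # independent variables
y = 1.0 + Z @ np.array([2.0, -3.0]) + rng.normal(scale=0.5, size=n)

def ols_with_intercept(Z, y):
    """Least squares fit of y on [1, Z]; returns (intercept, slopes)."""
    D = np.column_stack([np.ones(len(y)), Z])
    beta, *_ = np.linalg.lstsq(D, y, rcond=None)
    return beta[0], beta[1:]

# Fit on the original data, and on the mean-centred data.
_, slopes_raw     = ols_with_intercept(Z, y)
_, slopes_centred = ols_with_intercept(Z - Z.mean(axis=0), y - y.mean())

# With an intercept in the model, the slope estimates are identical.
print(np.allclose(slopes_raw, slopes_centred))              # True
```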


The covariance matrix is defined as
$$\Sigma = E\left[(\mathbf{X}-E[\mathbf{X}])(\mathbf{X}-E[\mathbf{X}])^{T}\right] = E\left[\mathbf{X}\mathbf{X}^{T}\right] - \mu\mu^{T},$$
where $\mu = E[\mathbf{X}]$.
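Expanding the product and using linearity of expectation shows where the $-\mu\mu^{T}$ term comes from:
$$
\begin{aligned}
E\left[(\mathbf{X}-\mu)(\mathbf{X}-\mu)^{T}\right]
&= E\left[\mathbf{X}\mathbf{X}^{T}\right] - E[\mathbf{X}]\mu^{T} - \mu E[\mathbf{X}]^{T} + \mu\mu^{T} \\
&= E\left[\mathbf{X}\mathbf{X}^{T}\right] - \mu\mu^{T}.
\end{aligned}
$$
So $E\left[\mathbf{X}\mathbf{X}^{T}\right]$ alone (the second-moment matrix, with no mean subtracted) equals the covariance matrix only when $\mu = 0$.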