I was looking at a post regarding the orthogonality of the eigenvectors of a symmetric matrix, and wanted to see if the following statement is true, and why?
A matrix is symmetric if and only if its eigenspaces are orthogonal.
Is this true and why?
In this form it is not exactly true. For example, we can take a real matrix that has no real eigenvalues, and hence no eigenspaces over $\mathbb{R}$ at all, like $$ A=\left(\begin{array}{ll}0&1\\-1&0\end{array}\right). $$ It is not symmetric, and yet, vacuously, all of its eigenspaces are orthogonal.
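As a quick sanity check (not part of the original answer), one can verify both claims about this matrix numerically with NumPy: it is not symmetric, and its eigenvalues are purely imaginary, so there are no real eigenspaces.

```python
import numpy as np

# The rotation-by-90-degrees matrix from the answer above.
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

# A is not equal to its transpose, so it is not symmetric.
print(np.allclose(A, A.T))

# Its characteristic polynomial is x^2 + 1, whose roots are +-i:
# every eigenvalue has nonzero imaginary part, so A has no real
# eigenvalues and therefore no eigenspaces over the reals.
eigvals = np.linalg.eigvals(A)
print(np.all(np.abs(eigvals.imag) > 0))
```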
However, the following is true: a real $n\times n$ matrix $A$ is symmetric if and only if all of its eigenspaces are pairwise orthogonal and the sum of these eigenspaces is the whole $\mathbb{R}^n$. This condition is equivalent to saying that there is an orthonormal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$, and this is the statement from the post that you mentioned.
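The forward direction of this equivalence can be illustrated numerically (a sketch, using NumPy's `eigh` routine for symmetric matrices): for a random real symmetric matrix, the eigenvectors form an orthonormal basis, i.e. the eigenspaces are orthogonal and span $\mathbb{R}^n$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random real symmetric matrix by symmetrizing a random one.
M = rng.standard_normal((4, 4))
S = (M + M.T) / 2

# For a symmetric matrix, eigh returns real eigenvalues w and a
# matrix Q whose columns are orthonormal eigenvectors, so the
# eigenspaces are mutually orthogonal and together span R^n.
w, Q = np.linalg.eigh(S)
print(np.allclose(Q.T @ Q, np.eye(4)))       # columns are orthonormal
print(np.allclose(Q @ np.diag(w) @ Q.T, S))  # S = Q diag(w) Q^T
```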
UPDATE: If you're interested in the same question for matrices over $\mathbb{C}$, with orthogonality taken with respect to the standard inner product $(a,b) = \sum_i a_i \overline{b}_i$, then the statement isn't true. The counterexample is the same matrix $A$ as above. It is neither symmetric nor Hermitian, but it has two eigenspaces, spanned by the vectors $(1, i)^T$ and $(1, -i)^T$, which are orthogonal and together span $\mathbb{C}^2$.
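This complex counterexample can also be checked directly (again a NumPy sanity check, not part of the original answer): the two vectors are eigenvectors for $i$ and $-i$, they are orthogonal under the Hermitian inner product, and $A$ is neither symmetric nor Hermitian.

```python
import numpy as np

A = np.array([[0, 1],
              [-1, 0]], dtype=complex)

# Candidate eigenvectors for the eigenvalues i and -i.
u = np.array([1, 1j])
v = np.array([1, -1j])
print(np.allclose(A @ u, 1j * u))    # A u = i u
print(np.allclose(A @ v, -1j * v))   # A v = -i v

# Orthogonal under the standard Hermitian inner product
# (a, b) = sum_i a_i * conj(b_i); np.vdot conjugates its first argument.
print(np.isclose(np.vdot(v, u), 0))

# Yet A is neither symmetric nor Hermitian.
print(np.allclose(A, A.T), np.allclose(A, A.conj().T))
```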
A definition of "symmetric" which should prove useful here is: if $\langle \cdot, \cdot \rangle$ is a canonically chosen inner product on $V$, we say $T$ is symmetric if and only if for all vectors $u, v \in V$ we have $\langle u, Tv \rangle = \langle Tu, v \rangle$.
Assume that the dimension of $V$ is equal to the sum of the dimensions of the eigenspaces $E_1, \dots, E_k$ associated to the eigenvalues $\lambda_1, \dots, \lambda_k$. Then we can write any vectors $u, v \in V$ as $u = \sum_{i=1}^k u_i$ and $v = \sum_{i=1}^k v_i$ with $u_i, v_i \in E_i$. Then $\langle u, Tv \rangle$ can be rewritten as $$\Big\langle \sum_i u_i,\ T \sum_i v_i \Big\rangle = \Big\langle \sum_i u_i,\ \sum_i T v_i \Big\rangle = \Big\langle \sum_i u_i,\ \sum_i \lambda_i v_i \Big\rangle.$$
Now we use orthogonality. If the eigenspaces are orthogonal to each other, then the cross terms vanish and this expression becomes $$\sum_i \langle u_i, \lambda_i v_i \rangle = \sum_i \lambda_i \langle u_i, v_i \rangle = \sum_i \langle \lambda_i u_i, v_i \rangle = \Big\langle \sum_i \lambda_i u_i,\ \sum_i v_i \Big\rangle = \langle Tu, v \rangle.$$
The reverse implication isn't hard: simply consider the equality $\langle u, Tv \rangle = \langle Tu, v \rangle$ when $u$ and $v$ are eigenvectors corresponding to two different eigenvalues.
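Spelled out, that computation looks like this: if $Tu = \lambda u$ and $Tv = \mu v$ with $\lambda \neq \mu$, then symmetry of $T$ gives

```latex
\[
  \lambda \langle u, v \rangle
    = \langle Tu, v \rangle
    = \langle u, Tv \rangle
    = \mu \langle u, v \rangle,
\]
```

so $(\lambda - \mu)\langle u, v \rangle = 0$, and since $\lambda \neq \mu$ we conclude $\langle u, v \rangle = 0$, i.e. the two eigenspaces are orthogonal.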