11

I was trying to show that orthogonal matrices have eigenvalues $1$ or $-1$.

Let $u$ be an eigenvector of an orthogonal matrix $A$ corresponding to the eigenvalue $\lambda$. Since orthogonal matrices preserve length, $\|Au\|=|\lambda|\cdot\|u\|=\|u\|$. Since $\|u\|\ne0$, $|\lambda|=1$.

Now I am stuck on showing that $\lambda$ must be a real number. Can anyone help with this?
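
For a concrete sanity check (assuming NumPy; this is only an illustration, not a proof), here is what the eigenvalues of a random orthogonal matrix look like. The matrix is built from a QR factorization purely as a convenient example:

```python
# Sanity check: eigenvalues of an orthogonal matrix have modulus 1,
# but they are generally complex rather than just 1 or -1.
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # Q is orthogonal: Q.T @ Q = I

eigvals = np.linalg.eigvals(Q)
print(eigvals)                          # generally complex numbers
print(np.abs(eigvals))                  # each modulus is (numerically) 1
print(np.allclose(np.abs(eigvals), 1))  # True
```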

  • 3
    Real symmetric matrices do, but otherwise not necessarily. (2011-09-24)
  • 0
    You need to specify over what field you are working; in some contexts, "orthogonal matrices" is reserved for matrices operating on *real* vector spaces, and as such they can have only real eigenvalues (though their characteristic polynomial may have complex roots). In other contexts, "orthogonal matrices" does not restrict the field of the underlying vector space. (2011-09-25)
  • 0
    This isn't true, as shown below. Orthogonal matrices do fall into two classes, but according to whether their determinant is $1$ or $-1$, not according to their eigenvalues being $1$ or $-1$. The ones with determinant $1$ form the Special Orthogonal Group... (2013-09-10)

3 Answers

29

The eigenvalues of $$ \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} $$ are $\cos\theta \pm i\sin\theta= e^{\pm i\theta}$. This is an orthogonal matrix.
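
If it helps, here is a small numerical check of this claim (a sketch assuming NumPy; the angle is an arbitrary sample value):

```python
# Check that the 2x2 rotation matrix has eigenvalues e^{+i theta} and e^{-i theta}.
import numpy as np

theta = 0.7                                   # arbitrary sample angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigvals = np.linalg.eigvals(R)
eigvals = eigvals[np.argsort(eigvals.imag)]   # order as e^{-i theta}, e^{+i theta}
expected = np.array([np.exp(-1j * theta), np.exp(1j * theta)])
print(eigvals)                                # complex, each with modulus 1
print(np.allclose(eigvals, expected))         # True
```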

If a matrix with real entries is symmetric (equal to its own transpose), then its eigenvalues are real (and its eigenvectors are orthogonal). Every $n\times n$ matrix whose entries are real has at least one real eigenvalue if $n$ is odd. That is because the characteristic polynomial has real coefficients, so the complex conjugate of a root is another root; the non-real roots therefore come in conjugate pairs, and a polynomial of odd degree cannot have all of its roots in such pairs.

But generally, the eigenvalues of matrices with real entries need not be real.
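
A quick numerical illustration of the two facts above (a sketch assuming NumPy; random example matrices, not a proof):

```python
# A real symmetric matrix has real eigenvalues; a real 3x3 matrix has at
# least one real eigenvalue, since non-real roots come in conjugate pairs.
import numpy as np

rng = np.random.default_rng(1)

S = rng.standard_normal((4, 4))
S = S + S.T                                  # real symmetric matrix
print(np.linalg.eigvals(S))                  # all real

M = rng.standard_normal((3, 3))              # generic real 3x3 matrix
lams = np.linalg.eigvals(M)
print(lams)                                  # typically one real value plus a conjugate pair
print(np.any(np.abs(lams.imag) < 1e-10))     # True: at least one real eigenvalue
```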

  • 2
    To reconcile this with SBlade's answer: in this one you are silently taking the matrix to be over the field of complex numbers, as is quite standard. On the other hand, it is true that a matrix over a real vector space can only have real eigenvalues when viewed as giving a transformation from that space to itself, since the transformation can only have real eigenvectors. (2011-09-25)
  • 0
    @CarlMummert: But in that case, this matrix would still not have real eigenvalues. (2016-11-01)
8

No, a real matrix does not necessarily have real eigenvalues; an example is $\begin{pmatrix}0&1\\-1&0\end{pmatrix}$.

On the other hand, since this matrix happens to be orthogonal and has the eigenvalues $\pm i$ -- for eigenvectors $(1\mp i, 1\pm i)$ -- I think you're supposed to consider only real eigenvalues in the first place.
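
As a quick check of those eigenpairs (a sketch assuming NumPy; not needed for the argument):

```python
# Verify A v = lambda v for the claimed eigenvalue/eigenvector pairs.
import numpy as np

A = np.array([[ 0.0, 1.0],
              [-1.0, 0.0]])

for lam, v in [( 1j, np.array([1 - 1j, 1 + 1j])),
               (-1j, np.array([1 + 1j, 1 - 1j]))]:
    print(np.allclose(A @ v, lam * v))       # True, True
```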

  • 0
    @anon, otherwise what he's trying to prove is not true, as shown by my counterexample. (2011-09-24)
3

I guess it depends on whether you are working with vector spaces over the real numbers or vector spaces over the complex numbers. In the latter case the answer is no; in the former, the answer has to be yes. Is it not?

  • 2
    No, it doesn't. Finding the eigenvalues of a matrix is identical to finding the zeroes of a polynomial, and polynomials with real coefficients can have complex zeroes. Consider $(a - \lambda)(b - \lambda) - c d = 0$, which has complex roots if $(a+b)^2 < 4 (ab - cd)$; see the sketch after this list. (2011-09-25)
  • 7
    @rcollyer: wait a minute, *what* is an eigenvalue? A scalar $\lambda$ for which there exists some $v \neq 0$ such that $Av = \lambda v$. If you only allow real scalars then there are only real eigenvalues, tautologically. Your "identical" is simply wrong. (2011-09-25)
  • 0
    @commenter, no. The equation $A v = \lambda v$ has a non-trivial solution iff $\det(A - \lambda I) = 0$, which is identical to finding the roots of a polynomial in $\lambda$. We know that a polynomial with real coefficients may have complex roots. If we try to restrict ourselves to only real roots, we're not guaranteed to find all of them. (2011-09-25)
  • 0
    @commenter: No, the "iff" is absolutely correct. Please review the [fundamental theorem of algebra](http://en.wikipedia.org/wiki/Fundamental_theorem_of_algebra) and the definition of the [characteristic polynomial](http://en.wikipedia.org/wiki/Characteristic_polynomial). (2011-09-25)
  • 11
    FWIW I think this answer by SBlade is completely correct. If $i$ is not a scalar for my vector space then I cannot have $i$ as an eigenvalue. Typically, people take a real-valued matrix and view it as a complex-valued matrix for the purpose of finding eigenvalues, since then the field of scalars is algebraically closed. But the definition of an eigenvalue is geometric, after all, so if a matrix has no eigenvectors in our space it cannot have any eigenvalues either. (2011-09-25)
  • 0
    This answer is technically true, but it is not at all helpful and only confuses things, given the discussion following the answer. (2017-11-03)
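
For completeness, here is the sketch referred to in the comment above (assuming NumPy; the values of $a, b, c, d$ are just one choice satisfying the stated inequality):

```python
# Roots of the characteristic polynomial (a - lam)(b - lam) - c*d = 0,
# i.e. lam^2 - (a + b) lam + (a*b - c*d) = 0, for a real 2x2 matrix.
import numpy as np

a, b, c, d = 0.0, 0.0, 1.0, -1.0        # (a + b)^2 = 0 < 4 = 4*(a*b - c*d)
A = np.array([[a, c],
              [d, b]])                  # det(A - lam*I) = (a - lam)(b - lam) - c*d

poly_roots = np.roots([1.0, -(a + b), a * b - c * d])
print(poly_roots)                       # [0.+1.j  0.-1.j]: non-real, as predicted
print(np.linalg.eigvals(A))             # the same two values (possibly reordered)
```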