
Suppose the $n \times n$ matrix $A$ has eigenvalues $\lambda_1, \ldots, \lambda_n$ and singular values $\sigma_1, \ldots, \sigma_n$. It seems plausible that by comparing the singular values and eigenvalues we get some sort of information about the eigenvectors. Consider:

a. The singular values are equal to the absolute values of the eigenvalues if and only if the matrix is normal, i.e., it has an orthonormal basis of eigenvectors (see http://en.wikipedia.org/wiki/Normal_matrix, item 11 of the "Equivalent definitions" section).
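A minimal numerical sketch of point a (not part of the original question, assuming numpy): compare the singular values with the absolute values of the eigenvalues for a normal but non-symmetric matrix, and for a generic matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Normal but not symmetric: a real skew-symmetric matrix satisfies A A^T = A^T A.
B = rng.standard_normal((4, 4))
A_normal = B - B.T

# A generic (non-normal) matrix for comparison.
A_generic = rng.standard_normal((4, 4))

for name, A in [("normal", A_normal), ("generic", A_generic)]:
    sing = np.sort(np.linalg.svd(A, compute_uv=False))
    eig = np.sort(np.abs(np.linalg.eigvals(A)))
    print(name, np.allclose(sing, eig))
# expected: "normal True" and (almost surely) "generic False"
```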

b. Suppose we have two distinct eigenvalues $\lambda_1, \lambda_2$ with eigenvectors $v_1, v_2$. Suppose, hypothetically, we let $v_1$ approach $v_2$, while keeping all the other eigenvalues and eigenvectors the same. Then the largest singular value approaches infinity. This follows since $\sigma_{\rm max} = \|A\|_2$ and $A$ maps the vector $v_1 - v_2$, which approaches $0$, to $\lambda_1 v_1 - \lambda_2 v_2$, which does not approach $0$ (it approaches $(\lambda_1 - \lambda_2) v_2 \neq 0$). A numerical sketch of this is below.
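A small sketch of point b (again assuming numpy, with eigenvalues and eigenvectors chosen for illustration): build $A = V \operatorname{diag}(\lambda) V^{-1}$ with $v_1 \to v_2$ and watch $\sigma_{\rm max}$ blow up while the eigenvalues stay fixed.

```python
import numpy as np

lam = np.array([1.0, 2.0])               # two distinct eigenvalues, held fixed
v2 = np.array([1.0, 0.0])

for eps in [1e-1, 1e-3, 1e-6]:
    v1 = v2 + eps * np.array([0.0, 1.0])  # v1 approaches v2 as eps -> 0
    V = np.column_stack([v1, v2])
    A = V @ np.diag(lam) @ np.linalg.inv(V)
    sigma_max = np.linalg.norm(A, 2)      # spectral norm = largest singular value
    print(eps, sigma_max)
# sigma_max grows roughly like 1/eps, even though the eigenvalues remain 1 and 2
```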

It seems reasonable to guess that the "more equal" $|\lambda_1|, \ldots, |\lambda_n|$ and $\sigma_1, \ldots, \sigma_n$ are, the more the eigenvectors look like an orthogonal collection. So naturally my question is whether there is a formal statement to this effect.

  • As written, your point a is wrong: the singular values are always $\ge 0$, while the eigenvalues can be complex, even for orthogonal eigenvectors. (2012-08-04)
  • To restate @celtschk's comment: the singular values and eigenvalues are the same iff the matrix is symmetric positive semidefinite. (2012-08-04)
  • ...I forgot to take the absolute value of the eigenvalues in the first version of this question. Fixed now, I hope. (2012-08-04)
  • Can we assume that the eigenvalues are separated from each other by some amount, say $|\lambda_i - \lambda_j| > \delta$ for $\delta$ independent of $i, j$? (2012-08-05)

1 Answer