
a.) How many orthonormal eigenvector bases does a symmetric $n \times n$ matrix have? Now let $A=\pmatrix{a&b\\c&d}$; write down necessary and sufficient conditions on the entries $a$, $b$, $c$, $d$ that ensure that $A$ has only real eigenvalues.
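For the second half of part a), a standard approach (added here as a sketch; it is not part of the original question) is to look at the discriminant of the characteristic polynomial:

```latex
\det(A - \lambda I) = \lambda^2 - (a+d)\lambda + (ad - bc) = 0
\quad\Longrightarrow\quad
\lambda = \frac{(a+d) \pm \sqrt{(a+d)^2 - 4(ad - bc)}}{2}.
```

Both roots are real if and only if the discriminant is nonnegative, i.e. $(a+d)^2 - 4(ad-bc) = (a-d)^2 + 4bc \ge 0$.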

b.) Let $A^T = -A$ be a real, skew-symmetric $n \times n$ matrix. Prove that the only possible real eigenvalue of $A$ is $\lambda = 0$.

Answer for a:

If all eigenvalues are distinct, there are $2^n$ different bases. If the eigenvalues are repeated, there are infinitely many.

How did they get that? Let's say I have a $2 \times 2$ matrix with distinct eigenvalues (say $1$ and $2$). Wouldn't the number of eigenvectors equal the number of eigenvalues, so in this case $2$? But the answer says it equals $4$?

  • You can multiply an eigenvector by $-1$ to obtain a new eigenvector, while preserving orthonormality. (If our field of scalars is $\mathbb C$, you can multiply by any unit.) 2012-11-16
  • @littleO So the eigenvector for every distinct eigenvalue can be taken with either sign ($\pm$)? And how do repeated eigenvalues give infinitely many bases? 2012-11-16
  • Well, think about the identity matrix, a symmetric matrix with repeated eigenvalue 1. *Any* pair of orthogonal unit vectors will be an orthonormal basis consisting of eigenvectors, right? 2012-11-16
  • @GerryMyerson Can you elaborate more on your second sentence? I am really trying to understand this but need more help, if you don't mind. 2012-11-16
  • Start with this: what are the eigenvectors of the identity matrix? 2012-11-16
  • And don't edit in an entirely new question --- post a new question instead. But first make sure you understand the answer to this question. 2012-11-16
  • @GerryMyerson Sorry for that. As for your question: the eigenvalue $1$ is repeated, therefore the eigenvector can be any vector? Maybe I am not entirely understanding the eigenvector concept. 2012-11-16
  • Yes, every (nonzero) vector is an eigenvector for the identity matrix. So every pair of orthonormal vectors is a basis of orthonormal eigenvectors for the identity matrix. And there are infinitely many pairs of orthonormal vectors --- just take any one such pair and rotate them, together, through any angle. 2012-11-16
  • @GerryMyerson Thank you very much for clearing that up for me. 2012-11-16
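The rotation idea in the comments above can be checked numerically. This is a small sketch (my addition, using numpy): rotate the standard basis by an arbitrary angle and confirm the rotated pair is still an orthonormal basis of eigenvectors of the identity matrix.

```python
import numpy as np

# Any rotation angle works here; 0.7 radians is an arbitrary choice.
theta = 0.7
c, s = np.cos(theta), np.sin(theta)
v1 = np.array([c, s])
v2 = np.array([-s, c])
I = np.eye(2)

# The rotated vectors are still unit length and mutually orthogonal...
assert np.isclose(np.linalg.norm(v1), 1.0)
assert np.isclose(np.linalg.norm(v2), 1.0)
assert np.isclose(v1 @ v2, 0.0)

# ...and still eigenvectors of the identity matrix with eigenvalue 1.
assert np.allclose(I @ v1, v1)
assert np.allclose(I @ v2, v2)
```

Since this works for every angle $\theta$, the identity matrix has infinitely many orthonormal eigenbases.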

1 Answer


Let's say a symmetric matrix $A \in \mathbb R^{2 \times 2}$ has distinct eigenvalues $\lambda_1$ and $\lambda_2$, and assume $\{v_1,v_2\}$ is a corresponding orthonormal basis of eigenvectors for $\mathbb R^2$. Then the following are also orthonormal eigenbases of $\mathbb R^2$: $\{ v_1,-v_2 \},\{ -v_1,v_2\},\{-v_1,-v_2\}$.

For part b): suppose $A \in \mathbb R^{n \times n}$ is skew-symmetric and $\lambda \in \mathbb R$ is an eigenvalue of $A$ with corresponding (nonzero) eigenvector $x$. Then

\begin{align*} \langle Ax,x \rangle &= \langle \lambda x, x \rangle \\ &= \lambda \|x\|_2^2. \end{align*}

On the other hand, \begin{align*} \langle Ax,x \rangle &= \langle x, A^T x \rangle \\ &= \langle x, -Ax \rangle \\ &= \langle x, -\lambda x \rangle \\ &= -\lambda \|x\|_2^2. \end{align*} It follows that $\lambda = -\lambda$, which implies that $\lambda = 0$.
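The two computations above can be sanity-checked numerically. This sketch (my addition) builds a random real skew-symmetric matrix, verifies that $\langle Ax, x \rangle = 0$ for a real vector $x$, and confirms that every eigenvalue is purely imaginary:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B - B.T                      # A^T = -A by construction
assert np.allclose(A.T, -A)

# For real x, <Ax, x> = -<x, Ax> forces the quadratic form to vanish.
x = rng.standard_normal(4)
assert np.isclose(x @ A @ x, 0.0)

# All eigenvalues of a real skew-symmetric matrix are purely imaginary,
# so the only possible real eigenvalue is 0.
eigvals = np.linalg.eigvals(A)
assert np.allclose(eigvals.real, 0.0)
```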

  • Thank you very much! For part b): does a skew-symmetric matrix always have $0$ on its diagonal? If not, why does the negative make it zero? I can't see that in your proof. 2012-11-16
  • Yes, REAL skew-symmetric matrices have zeros on the diagonal. That's because $a_{ii} = -a_{ii}$. 2012-11-16
  • @diimension Yes, as bartgol explained; I don't think I was directly using that fact, though. I was just using the fact that if $\lambda$ is a real number and $\lambda = -\lambda$, then $\lambda = 0$. 2012-11-16
  • @bartgol Thanks, but why does the negative make the diagonal zero? What does the negative do? 2012-11-16
  • @littleO Thank you very much, and can you also explain what I asked bartgol? 2012-11-16
  • Sure. Suppose $A = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22}\end{bmatrix}$ is skew-symmetric, so that $A^T = \begin{bmatrix} a_{11} & a_{21} \\ a_{12} & a_{22}\end{bmatrix} = \begin{bmatrix} -a_{11} & -a_{12} \\ - a_{21} & -a_{22} \end{bmatrix}$. Then we see $a_{11} = -a_{11}$. And how can you have a real number which is equal to its own negative? The only way is if that number is equal to $0$. So we conclude $a_{11} = 0$. Similarly $a_{22} = 0$. 2012-11-16
  • @littleO Beautiful. Thank you very much! I understand it now with your help. 2012-11-16
  • @littleO Sorry to bother you on this question again, but for part b) I was looking at your proof again and I am wondering: will all the eigenvalues of $A$ be imaginary then? 2012-11-26
  • @diimension Yes, that's correct. 2012-11-26