2

I've been trying to complete a proof for a while now, but I can't. It would be great if someone could finish it for me so that I can at least learn from the solution.

The problem is asked like this:

Show that $\lambda$ is an eigenvalue of $A$ if and only if $\lambda$ is an eigenvalue of $A^T$.

There's a hint and it says:

For any $\lambda$, $(A-\lambda I )^T = A^T-\lambda I$. By a theorem (which one?), $A^T - \lambda I$ is invertible iff $A-\lambda I$ is invertible.

I know there is a theorem in this book, called the Invertible Matrix Theorem, which says that if $A$ is invertible then so is its transpose. But I don't see what that has to do with anything. If $A$ has an eigenvalue $\lambda$, then $A-\lambda I$ is linearly dependent and is not even invertible! Is that what the hint is all about? Well, I still don't see what that says about the solution set. To complete this proof, I think we need to show that the homogeneous equation $(A-\lambda I)\mathbf x = \mathbf 0$ has a nontrivial solution if and only if the corresponding equation for the transpose does.

Many thanks.

  • 0
    "If A has an eigenvalue, then A is linearly dependent and is not even invertible!" - you meant "...then $\mathbf A-\lambda\mathbf I$ is linearly dependent..." I suppose. 2011-04-28
  • 0
    Yes, that's what I meant. Thanks for pointing it out. 2011-04-28
  • 0
    "I still have the problem with =>" - Better now? 2011-04-28
  • 0
    Yes, you fixed it, thank you! 2011-04-28
  • 1
    I do not know what *$A$ is linearly dependent* nor what *$A-\lambda I$ is linearly dependent* means. 2011-04-28
  • 0
    Possible duplicate of http://math.stackexchange.com/questions/24924/if-c-is-an-eigenvalue-of-a-is-it-an-eigenvalue-of-a-mathbft/24925#24925 2011-04-28
  • 0
    @Didier: neither do I; I was merely guessing Calle's intent. "$(\mathbf A-\lambda\mathbf I)$'s rows/columns are linearly dependent" would be the proper way of putting it, though. 2011-04-28
  • 0
    Damn, I'm not good at this. All right, formally, I think I should write it like this: "the rows make up a linearly dependent set." Is that correct? That is how I infer that $A-\lambda I$ cannot be invertible: if $\lambda$ is an eigenvalue, then one of the rows must be a combination of the others. If you infer it differently, I would like to know how. 2011-04-28

2 Answers

7

Use the definition of being an eigenvalue and the invertible matrix theorem mentioned in your post: $\lambda$ is an eigenvalue of $A$ if and only if $A-\lambda I$ is not invertible if and only if $A^T-\lambda I$ is not invertible if and only if $\lambda$ is an eigenvalue of $A^T$.
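A quick numerical illustration of this chain of equivalences (my own sketch, not part of the original answer; the matrix `A` is an arbitrary example): for a $2\times 2$ matrix, the eigenvalues are the roots of $\lambda^2 - \operatorname{tr}(A)\lambda + \det(A) = 0$, and both the trace and the determinant are unchanged by transposition, so $A$ and $A^T$ have exactly the same eigenvalues.

```python
import cmath

def transpose(M):
    """Transpose of a square matrix given as a list of rows."""
    n = len(M)
    return [[M[j][i] for j in range(n)] for i in range(n)]

def eigenvalues_2x2(M):
    """Eigenvalues of a 2x2 matrix: roots of the characteristic
    polynomial  t^2 - tr(M)*t + det(M)  via the quadratic formula."""
    (a, b), (c, d) = M
    tr = a + d            # trace, invariant under transpose
    det = a * d - b * c   # determinant, invariant under transpose
    disc = cmath.sqrt(tr * tr - 4 * det)  # may be complex
    roots = [(tr + disc) / 2, (tr - disc) / 2]
    return sorted(roots, key=lambda z: (z.real, z.imag))

A = [[2, 7], [1, 3]]  # an arbitrary non-symmetric example
print(eigenvalues_2x2(A) == eigenvalues_2x2(transpose(A)))  # True
```

Since the characteristic polynomial depends only on the trace and determinant here, the two eigenvalue lists are computed from identical data and agree exactly.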

  • 0
    I get that if $A^T - \lambda I$ is not invertible, there must be an eigenvalue: if the homogeneous equation $(A^T - \lambda I)\mathbf x = \mathbf 0$ had only the trivial solution, the rows would form a linearly independent set and the matrix would in fact be invertible (please tell me if I'm getting this wrong). But I don't know why the fact that $A^T$ HAS an eigenvalue necessarily means that it is the SAME eigenvalue as for $A$. How can I know this? 2011-04-28
  • 0
    @Calle: The determinant is invariant under transpose. In particular, $\det(A - \lambda I) = \det( (A-\lambda I)^T) = \det( A^T - (\lambda I)^T) = \det(A^T - \lambda I^T) = \det(A^T - \lambda I)$. Say you know that $\lambda_0$ is an eigenvalue for $A^T$, so it's a solution of $\det (A^T - \lambda I) = 0$. Can that tell you anything about the solutions of $\det(A-\lambda I) = 0$? 2011-04-28
  • 0
    All right, that will suffice. Thank you, everybody. 2011-04-29
2

$$\det(A-\lambda I)=\det\bigl((A-\lambda I)^T\bigr)=\det(A^T-\lambda I)$$
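To see this identity in action, here is a small numeric sanity check (my own sketch; the $3\times 3$ matrix and the hand-rolled determinant routine are illustrative assumptions): $\det(A - tI)$ agrees with $\det(A^T - tI)$ for every sampled $t$, so the two characteristic polynomials, and hence the eigenvalues, coincide.

```python
def transpose(M):
    """Transpose of a square matrix given as a list of rows."""
    n = len(M)
    return [[M[j][i] for j in range(n)] for i in range(n)]

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def shift(M, t):
    """Return M - t*I for a 3x3 matrix M."""
    return [[M[r][c] - (t if r == c else 0) for c in range(3)] for r in range(3)]

A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]  # arbitrary example matrix
for t in [0, 1, -2, 5]:  # integer samples keep the arithmetic exact
    assert det3(shift(A, t)) == det3(shift(transpose(A), t))
print("det(A - tI) == det(A^T - tI) for all sampled t")
```

Integer test values keep every operation exact, so the assertions compare the two characteristic polynomials without floating-point noise.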