53

Let $ \sigma(A)$ be the set of all eigenvalues of $A$. Show that $ \sigma(A) = \sigma(A^T)$, where $A^T$ is the transpose of $A$.

  • 4
    This is a bit more advanced than what you need, but: [an interesting article](http://projecteuclid.org/euclid.pjm/1103039127).2012-03-24
  • 1
I guess you work in an algebraically closed field. In this case, use the fact that $r$ is an eigenvalue of $A$ if and only if $r$ is an eigenvalue of $A^T$. In fact, it can be shown that $A$ and $A^T$ are similar.2012-03-24
  • 1
Here's one possible simpler problem that will get you started on the right path. If $A$ is an $n\times n$ singular matrix, can you show that $A^T$ is also singular? (See the sketch below these comments.)2012-03-24
  • 4
Please don't post your questions in the imperative; please tell us what your thoughts are about the question, so that people don't tell you things you already know; and please tell us the context in which you encountered the question, so that people can write their answers at an appropriate level.2012-03-24
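A quick numerical illustration of the simpler problem in the hint above, offered only as a sketch: it assumes NumPy, and the particular rank-deficient matrix is an arbitrary example.

```python
import numpy as np

# A singular 3x3 matrix: the second row is twice the first.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [0.0, 1.0, 1.0]])

# A and A^T have the same rank and the same determinant,
# so if A is singular then A^T is singular too.
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A.T))  # 2 2
print(np.linalg.det(A), np.linalg.det(A.T))                  # both (numerically) 0
```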

3 Answers

75

The matrix $(A - \lambda I)^{T}$ is the same as the matrix $(A^{T} - \lambda I)$, since the identity matrix is symmetric.

Thus:

$$\det(A^{T} - \lambda I) = \det((A - \lambda I)^{T}) = \det (A - \lambda I)$$

Since $A$ and $A^{T}$ have the same characteristic polynomial, they have the same eigenvalues.
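As a quick numerical sanity check of this (a sketch assuming NumPy; the random $5\times 5$ matrix and the tolerance of `np.allclose` are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))

# Sort both spectra so they can be compared entrywise.
eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_AT = np.sort_complex(np.linalg.eigvals(A.T))

# The two spectra agree up to floating-point error.
assert np.allclose(eig_A, eig_AT)
print(eig_A)
```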

  • 0
Do they have the same minimal polynomial?2015-10-06
  • 3
    Any polynomial satisfied by $A$ is also satisfied by $A^T$ so yeah.2015-10-06
20

I'm going to work a little bit more generally.

Let $V$ be a finite dimensional vector space over some field $K$, and let $\langle\cdot,\cdot\rangle$ be a nondegenerate bilinear form on $V$.

We then have for every linear endomorphism $A$ of $V$, that there is a unique endomorphism $A^*$ of $V$ such that $$\langle Ax,y\rangle=\langle x,A^*y\rangle$$ for all $x$ and $y\in V$.

The existence and uniqueness of such an $A^*$ requires some explanation, but I will take it for granted.

Proposition: Given an endomorphism $A$ of a finite dimensional vector space $V$ equipped with a nondegenerate bilinear form $\langle\cdot,\cdot\rangle$, the endomorphisms $A$ and $A^*$ have the same set of eigenvalues.

Proof: Let $\lambda$ be an eigenvalue of $A$, and let $v$ be an eigenvector of $A$ corresponding to $\lambda$ (in particular, $v$ is nonzero). Let $w$ be an arbitrary vector. We then have: $$\langle v,\lambda w\rangle=\langle\lambda v,w\rangle=\langle Av,w\rangle=\langle v,A^*w\rangle$$ This implies that $\langle v,\lambda w-A^*w\rangle =0$ for all $w\in V$. Now either $\lambda$ is an eigenvalue of $A^*$ or not. If it isn't, the operator $\lambda I -A^*$ is an automorphism of $V$, since $\lambda I-A^*$ being singular is equivalent to $\lambda$ being an eigenvalue of $A^*$. In particular, $\lambda I-A^*$ is surjective, so every $z\in V$ can be written as $z=\lambda w-A^*w$ for some $w$, and hence $\langle v, z\rangle = 0$ for all $z\in V$. But since $\langle\cdot,\cdot\rangle$ is nondegenerate, this implies that $v=0$, a contradiction. So $\lambda$ must have been an eigenvalue of $A^*$ to begin with. Thus every eigenvalue of $A$ is an eigenvalue of $A^*$. The other inclusion can be derived similarly.

How can we use this in your case? I believe you're working over a real vector space and considering the dot product as your bilinear form. Now consider an endomorphism $T$ of $\Bbb R^n$ which is given by $T(x)=Ax$ for some $n\times n$ matrix $A$. It just so happens that for all $y\in\Bbb R^n$ we have $T^*(y)=A^t y$. Since $T$ and $T^*$ have the same eigenvalues, so do $A$ and $A^t$.
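A small numerical illustration of this last step, as a sketch assuming NumPy (the dimension and the random matrix and vectors are arbitrary): with the standard dot product on $\Bbb R^n$, the transpose plays the role of $T^*$.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
x = rng.standard_normal(4)
y = rng.standard_normal(4)

lhs = np.dot(A @ x, y)    # <T(x), y>
rhs = np.dot(x, A.T @ y)  # <x, T*(y)>, with T* realized by the transpose
assert np.isclose(lhs, rhs)
```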

  • 1
    For an explanation on the things I took for granted, I suggest you read these excellent lecture notes: https://www.dpmms.cam.ac.uk/study/IB/LinearAlgebra/2008-2009/bilinear-08.pdf2014-07-24
  • 0
I have a question: first, why did you assume surjectivity of $A$?2017-10-05
  • 0
Also, where did we use the fact that $V$ is finite dimensional? (I'm only interested in the fact that $A$ and $A^*$ have the same eigenvalues.)2017-10-05
  • 0
@onurcanbektas Where have I used surjectivity? The finite dimension is used in my reference to construct $A^*$. If you have $A^*$ to start with, the construction isn't required. I would have to double-check whether $\lambda I-A^*$ is still an automorphism (bounded) in the infinite-dimensional case, because I don't have those facts at my fingertips. But you might.2017-10-05
12

$$ \operatorname{det}(A-tI) = \operatorname{det}\big((A-tI)^T\big) = \operatorname{det}(A^T-tI)$$ A matrix and its transpose have the same determinant, and $(A-tI)^T = A^T - tI$. Applying these properties of transposition shows that $A$ and $A^T$ have the same characteristic polynomial, and hence the same eigenvalues.
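A brief numerical check of the equality of the characteristic polynomials (a sketch assuming NumPy; the $3\times 3$ matrix is an arbitrary example, and `np.poly` returns the coefficients of the monic characteristic polynomial computed from the eigenvalues):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 1.0]])

# Coefficients of det(tI - A), highest degree first.
p_A = np.poly(A)
p_AT = np.poly(A.T)

assert np.allclose(p_A, p_AT)
print(p_A)
```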