
I have the following problem, which I don't know how to solve; any help is appreciated.

Let $A,B$ and $C$ be $n \times n$ matrices. Suppose that $B$ and $C$ are symmetric.

Consider the matrix $$ M = \begin{bmatrix} A & B \\ C & -A^T \end{bmatrix} $$

Show that if $\lambda$ is an eigenvalue of $M$ then so is $-\lambda$.

My idea:

I know that for $\lambda$ to be an eigenvalue of $A$, $\det(A-\lambda I)$ has to be zero; you can then work out this determinant to get the eigenvalues. However, I don't know whether that approach applies here, since the entries of $M$ are written as blocks, i.e. matrices rather than numbers. Besides that, I'm fairly sure this isn't the way to tackle the problem, since you don't need the eigenvalues themselves; you only have to show that if $\lambda$ is an eigenvalue of $M$ then so is $-\lambda$.

Thanks in advance :)

2 Answers


Let $u$ be an eigenvector: $u\ne0$ and $Mu=\lambda u$. Then, writing $$ u=\begin{bmatrix}v\\w\end{bmatrix} $$ with $v$ and $w$ being $n\times 1$ column vectors, we have $$ \begin{bmatrix}A & B \\ C & -A^T\end{bmatrix} \begin{bmatrix}v\\w\end{bmatrix}= \begin{bmatrix}Av+Bw\\Cv-A^Tw\end{bmatrix}= \begin{bmatrix}\lambda v\\\lambda w\end{bmatrix} $$ Now consider $$ u'=\begin{bmatrix}w\\-v\end{bmatrix} $$ Since $B$ and $C$ are symmetric, $M^T=\begin{bmatrix} A^T & C \\ B & -A\end{bmatrix}$, so $$ M^Tu'= \begin{bmatrix} A^T & C \\ B & -A\end{bmatrix} \begin{bmatrix}w\\-v\end{bmatrix} = \begin{bmatrix} A^Tw-Cv \\ Bw+Av \end{bmatrix} =\begin{bmatrix} -\lambda w \\ \lambda v \end{bmatrix}= -\lambda u' $$ where the last equality uses $Av+Bw=\lambda v$ and $Cv-A^Tw=\lambda w$ from above. Since $u\ne0$, also $u'\ne0$, so $-\lambda$ is an eigenvalue of $M^T$; and a matrix and its transpose have the same eigenvalues, so $-\lambda$ is an eigenvalue of $M$.

  • I'm sorry, but I don't really get what you do after "so". Do you take the transpose of the eigenvector? Is it just a different notation for $[w\ {-v}]^T$? And why do you do eigenvector times matrix instead of the other way around? Thanks in advance. (2017-01-19)
  • @Amaluena I changed the argument to use column vectors. It's essentially the same, but perhaps clearer. (2017-01-19)
  • This indeed makes it clearer to me; however, I'm wondering what the $'$ means. You don't seem to use it as a transpose, do you? (2017-01-19)
  • @Amaluena It's just a new vector; call it $u_1$, if you prefer. (2017-01-19)
  • Another question, a bit late, but I hope you don't mind: can you just switch to another vector halfway through and still have a solid proof? (2017-01-22)
  • @Amaluena Why not? You want to find an eigenvector, don't you? (2017-01-22)
  • Yeah, okay. Now that I've worked it out once more, it seems more logical. (2017-01-22)

$M^T = \begin{bmatrix} A^T &C\\B&-A\end{bmatrix}$, using that $B$ and $C$ are symmetric.

Let $J = \begin{bmatrix} 0 & I \\ -I & 0\end{bmatrix}$. A direct block computation gives $JMJ^{-1} = -M^T$, and since $J(\lambda I)J^{-1} = \lambda I$, $$ J(M - \lambda I)J^{-1} = -(M^T + \lambda I). $$

So if $\lambda$ is an eigenvalue of $M$, then $M - \lambda I$ is singular; by the similarity above, $M^T + \lambda I$ is singular as well, and therefore $-\lambda$ is an eigenvalue of $M^T$.

If $-\lambda$ is an eigenvalue of $M^T$ it is also an eigenvalue of $M$, since a matrix and its transpose have the same characteristic polynomial.
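Since $B$ and $C$ are symmetric, $M$ satisfies $JMJ^{-1} = -M^T$ for $J=\begin{bmatrix}0&I\\-I&0\end{bmatrix}$, hence $J(M-\lambda I)J^{-1} = -(M^T+\lambda I)$ and $\det(M-\lambda I)=\det(M^T+\lambda I)$. A quick NumPy check of this identity on random matrices (a sketch, not part of the proof):

```python
# Check: J (M - lam*I) J^{-1} = -(M^T + lam*I) for J = [[0, I], [-I, 0]],
# given that B and C are symmetric.
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n)); B = B + B.T  # symmetric
C = rng.standard_normal((n, n)); C = C + C.T  # symmetric
M = np.block([[A, B], [C, -A.T]])

I = np.eye(n)
Z = np.zeros((n, n))
J = np.block([[Z, I], [-I, Z]])
lam = 0.7  # arbitrary test scalar

lhs = J @ (M - lam * np.eye(2 * n)) @ np.linalg.inv(J)
rhs = -(M.T + lam * np.eye(2 * n))
assert np.allclose(lhs, rhs)

# Consequently the two determinants agree, so M - lam*I and
# M^T + lam*I are singular together.
assert np.isclose(np.linalg.det(M - lam * np.eye(2 * n)),
                  np.linalg.det(M.T + lam * np.eye(2 * n)))
```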