5

I'm working on a problem where I am trying to find the eigenvectors of a pretty complicated matrix, and I am in need of some assistance. The matrix in question is:

$$A =\begin{bmatrix} \sin(x) & \cos(x)\cos(y) - i\cos(x)\sin(y)\\ \cos(x)\cos(y) + i\cos(x)\sin(y) & -\sin(x)\\ \end{bmatrix}$$

I know that the matrix is Hermitian, i.e. equal to its own conjugate transpose. Moreover, the eigenvalues are $\lambda = \pm 1$, since $A^2 = I$. However, I'm not sure how to use these properties to find the eigenvectors (if that would even help), and I would like to avoid brute force if possible, as it seems unwieldy.

Thus far, I have tried separating the matrix into real and imaginary parts, but that didn't seem to help. I also thought of assuming diagonalizability in an attempt to find the diagonalizing unitary matrix (and, in turn, the eigenvectors), but I don't see that making things much nicer either. Any help would be greatly appreciated.
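As a sanity check on the stated properties, here is a small numerical sketch (the test angles are arbitrary choices of mine, not from the problem) verifying that $A$ is Hermitian and that $A^2 = I$:

```python
import numpy as np

# Arbitrary test angles (my own choice, for illustration only).
x, y = 0.7, 1.3

A = np.array([
    [np.sin(x), np.cos(x)*np.cos(y) - 1j*np.cos(x)*np.sin(y)],
    [np.cos(x)*np.cos(y) + 1j*np.cos(x)*np.sin(y), -np.sin(x)],
])

assert np.allclose(A, A.conj().T)     # Hermitian: A equals its conjugate transpose
assert np.allclose(A @ A, np.eye(2))  # A^2 = I, so eigenvalues satisfy lambda^2 = 1
```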

2 Answers

4

The determinant is $$ -\sin^2x-(\cos^2x\cos^2y+\cos^2x\sin^2y)=-1 $$ so the characteristic polynomial is $X^2-1$, because the trace is $0$.

An eigenvector relative to $1$ is a nonzero solution of $$ \begin{bmatrix} \sin x - 1 & \cos x\cos y-i\cos x\sin y \end{bmatrix} \begin{bmatrix} \alpha \\ \beta \end{bmatrix}=0 $$ so an eigenvector is $$ \begin{bmatrix} \cos x\cos y-i\cos x\sin y \\ 1-\sin x \end{bmatrix} $$ unless $\sin x=1$, when $A$ is diagonal and an eigenvector is $$ \begin{bmatrix}1 \\ 0\end{bmatrix} $$
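A quick numerical check of this eigenvector (the test angles are arbitrary values of mine, chosen with $\sin x \ne 1$):

```python
import numpy as np

# Arbitrary test angles with sin(x) != 1 (my own choice, for illustration).
x, y = 0.7, 1.3

A = np.array([
    [np.sin(x), np.cos(x)*np.cos(y) - 1j*np.cos(x)*np.sin(y)],
    [np.cos(x)*np.cos(y) + 1j*np.cos(x)*np.sin(y), -np.sin(x)],
])

# Eigenvector read off from the first row of (A - I)v = 0.
v = np.array([np.cos(x)*np.cos(y) - 1j*np.cos(x)*np.sin(y),
              1 - np.sin(x)])

assert np.allclose(A @ v, v)  # A v = 1 * v
```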

For the $-1$ eigenvector, do similarly.

  • 0
    I understand how that works for making the top row of the matrix be $0$, but why would that same vector also make the bottom row become $0$? (2017-01-13)
  • 0
    Update: I was able to verify this method by showing that it worked for the more general matrix $A \mp I =\begin{bmatrix} a \mp 1 & b - ic\\ b+ic & -a \mp 1\\ \end{bmatrix}$ given the property that $A^2 = I$. That was quite an insight. Where did you get the motivation for that idea? (2017-01-13)
  • 0
    @infinitylord It looks like egreg took the first component of the standard equation for finding eigenvectors $(A-I)v=0$. The second component of this equation turns out to be redundant with the first because of the structure of $A$. (2017-01-13)
  • 0
    @infinitylord We know that the matrix $A-I$ has rank $1$, because $1$ is an eigenvalue; so if a row is nonzero, the other one is its scalar multiple. (2017-01-13)
  • 0
    @egreg: Is that a general fact that a matrix with $1$ as an eigenvalue will have $A-I$ of rank $1$? I understand how the matrix being rank $1$ leads to a row being a scalar multiple of the other (if it's nonzero), but I've never heard the other fact. Moreover, I can see that if I take the dot product of the two row vectors, it results in $0$, so that they are orthogonal. I thought this tells you that they can't be scalar multiples of each other? (2017-01-13)
  • 1
    @infinitylord No, not general; but we know the matrix is $2\times2$ and that there are two distinct eigenvalues, so each one has geometric multiplicity $1$; the rank of $A-\lambda I$, where $A$ is $n\times n$ and $\lambda$ is an eigenvalue of geometric multiplicity $d$, is $n-d$. (2017-01-13)
  • 0
    @egreg: Ahh okay, that makes sense. So in general, $A- \lambda I$ will have rank $n-1$ if the $n\times n$ matrix $A$ has as many distinct eigenvalues as its dimension. That's a useful trick for calculating eigenvectors, thank you. (2017-01-13)
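The rank argument in the comments above can be checked numerically as well; this sketch (test angles are mine) confirms that $A - I$ and $A + I$ each have rank $1$, which is why a single row of $(A - \lambda I)v = 0$ determines the eigenvector:

```python
import numpy as np

# Arbitrary test angles (my own choice, for illustration only).
x, y = 0.7, 1.3

A = np.array([
    [np.sin(x), np.cos(x)*np.cos(y) - 1j*np.cos(x)*np.sin(y)],
    [np.cos(x)*np.cos(y) + 1j*np.cos(x)*np.sin(y), -np.sin(x)],
])

# Each eigenvalue +-1 has geometric multiplicity 1, so A -+ I has rank 2 - 1 = 1.
assert np.linalg.matrix_rank(A - np.eye(2)) == 1
assert np.linalg.matrix_rank(A + np.eye(2)) == 1
```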
1

Note that $$ A=DQD^{-1},\ D=\pmatrix{1\\ &e^{iy}},\ Q=\pmatrix{\sin x&\cos x\\ \cos x&-\sin x}. $$ It follows that if $v$ is an eigenvector of $Q$, then $Dv$ is an eigenvector of $A$.
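This decomposition can also be verified numerically; the sketch below (test angles are my own choice) checks that $A = DQD^{-1}$ and that $Dv$ is indeed an eigenvector of $A$ for each eigenvector $v$ of the real symmetric matrix $Q$:

```python
import numpy as np

# Arbitrary test angles (my own choice, for illustration only).
x, y = 0.7, 1.3

A = np.array([
    [np.sin(x), np.cos(x)*np.cos(y) - 1j*np.cos(x)*np.sin(y)],
    [np.cos(x)*np.cos(y) + 1j*np.cos(x)*np.sin(y), -np.sin(x)],
])
D = np.diag([1, np.exp(1j*y)])
Q = np.array([[np.sin(x),  np.cos(x)],
              [np.cos(x), -np.sin(x)]])

assert np.allclose(D @ Q @ np.linalg.inv(D), A)  # A = D Q D^{-1}

# Q is real symmetric, so eigh applies; push its eigenvectors through D.
w, V = np.linalg.eigh(Q)
for lam, v in zip(w, V.T):
    assert np.allclose(A @ (D @ v), lam * (D @ v))  # Dv is an eigenvector of A
```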