
I am a little apprehensive to ask this question because I have a feeling it's a "duh" question but I guess that's the beauty of sites like this (anonymity):

I need to find an orthonormal eigenbasis for the $2 \times 2$ matrix $\left(\begin{array}{cc}1&1\\ 1&1\end{array}\right)$. I calculated that the eigenvalues were $x=0$ and $x=2$ and the corresponding eigenvectors were $E(0) = \mathrm{span}\left(\begin{array}{r}-1\\1\end{array}\right)$ and $E(2) = \mathrm{span}\left(\begin{array}{c}1\\1\end{array}\right)$. Therefore, an orthonormal eigenbasis would be: $\frac{1}{\sqrt{2}}\left(\begin{array}{r}-1\\1\end{array}\right), \frac{1}{\sqrt{2}}\left(\begin{array}{c}1\\1\end{array}\right).$

Here is my question: could the eigenspace $E(0)$ instead have been written as $\mathrm{span}\left(\begin{array}{r}1\\-1\end{array}\right)$? This would make the final answer $\frac{1}{\sqrt{2}}\left(\begin{array}{r}1\\-1\end{array}\right), \frac{1}{\sqrt{2}}\left(\begin{array}{c}1\\1\end{array}\right)$. Is one answer more correct than the other (or are they both wrong)?
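The computation above can be spot-checked numerically. A minimal sketch in plain Python (the helper name `matvec` is illustrative, not from any library):

```python
# Check the claimed eigenpairs of A = [[1, 1], [1, 1]]:
# A v should equal lambda * v for each (lambda, v) pair.
A = [[1, 1], [1, 1]]

def matvec(M, v):
    """Multiply a 2x2 matrix M by a length-2 vector v."""
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

pairs = [(0, [-1, 1]), (2, [1, 1])]
for lam, v in pairs:
    assert matvec(A, v) == [lam * x for x in v]  # A v = lambda v
```

Both asserts pass, confirming the eigenvalues $0$ and $2$ with the eigenvectors stated.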

Thanks!


2 Answers

5

0 and 2 are the correct eigenvalues of your matrix; $(1, -1)$ is one eigenvector and $(1, 1)$ is the other. Your solution is correct.

The span is the set of all linear combinations of the given vectors, so if you consider a vector space over $\mathbb{R}$, it does not matter which nonzero scalar in $\mathbb{R}$ you multiply a spanning vector by: the set $\mathrm{span}$ is unchanged. So both solutions describe the same eigenspace.
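The point above can be verified concretely: both sign choices satisfy every requirement of an orthonormal eigenbasis. A small sketch in plain Python (the helpers `matvec`, `dot`, and `checks` are illustrative names):

```python
import math

A = [[1, 1], [1, 1]]
s = 1 / math.sqrt(2)

def matvec(M, v):
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

def dot(u, v):
    return u[0]*v[0] + u[1]*v[1]

def checks(b0, b2):
    """True if (b0, b2) is an orthonormal eigenbasis for eigenvalues 0 and 2."""
    eig0 = all(math.isclose(a, 0, abs_tol=1e-12) for a in matvec(A, b0))
    eig2 = all(math.isclose(a, 2 * b) for a, b in zip(matvec(A, b2), b2))
    ortho = math.isclose(dot(b0, b2), 0, abs_tol=1e-12)
    unit = math.isclose(dot(b0, b0), 1) and math.isclose(dot(b2, b2), 1)
    return eig0 and eig2 and ortho and unit

# Both sign choices for the eigenvector of 0 pass every check:
assert checks([-s, s], [s, s])
assert checks([s, -s], [s, s])
```

So neither answer is "more correct": they are two valid bases for the same pair of eigenspaces.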

4

There is no such thing as the eigenvector of a matrix, or the orthonormal basis of eigenvectors. There are usually many choices.

Remember that an eigenvector $\mathbf{v}$ of eigenvalue $\lambda$ is a nonzero vector $\mathbf{v}$ such that $T\mathbf{v}=\lambda\mathbf{v}$. That means that if you take any nonzero multiple of $\mathbf{v}$, say $\alpha\mathbf{v}$, then we will have $T(\alpha\mathbf{v}) = \alpha T\mathbf{v} = \alpha(\lambda\mathbf{v}) = \alpha\lambda\mathbf{v}=\lambda(\alpha\mathbf{v}),$ so $\alpha\mathbf{v}$ is also an eigenvector corresponding to $\lambda$. More generally, if $\mathbf{v}_1,\ldots,\mathbf{v}_k$ are all eigenvectors of $\lambda$, then any nonzero linear combination $\alpha_1\mathbf{v}_1+\cdots+\alpha_k\mathbf{v}_k\neq \mathbf{0}$ is also an eigenvector corresponding to $\lambda$.

So, of course, since $\left(\begin{array}{r}-1\\1\end{array}\right)$ is an eigenvector (corresponding to $x=0$), then so is $\alpha\left(\begin{array}{r}-1\\1\end{array}\right)$ for any $\alpha\neq 0$, in particular, for $\alpha=-1$ as you take.
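The argument above can be illustrated numerically: any nonzero multiple of $(-1, 1)$ is still killed by the matrix. A minimal sketch (the helper `matvec` is an illustrative name):

```python
A = [[1, 1], [1, 1]]

def matvec(M, v):
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

v = [-1, 1]                    # eigenvector for eigenvalue 0
for alpha in [-1, 0.5, 3, -2.25]:
    w = [alpha * x for x in v]
    assert matvec(A, w) == [0.0, 0.0]   # A w = 0 * w for every nonzero alpha
```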

Now, a set of vectors $\mathbf{w}_1,\ldots,\mathbf{w}_k$ is orthogonal if and only if $\langle \mathbf{w}_i,\mathbf{w}_j\rangle = 0$ whenever $i\neq j$. If you have an orthogonal set, and you replace, say, $\mathbf{w}_i$ by $\alpha\mathbf{w}_i$ with $\alpha$ any scalar, then the result is still an orthogonal set: $\langle\mathbf{w}_\ell,\mathbf{w}_j\rangle=0$ still holds for $\ell\neq j$ with both $\ell$ and $j$ different from $i$, and for $j\neq i$ we have $\langle \alpha\mathbf{w}_i,\mathbf{w}_j\rangle = \alpha\langle\mathbf{w}_i,\mathbf{w}_j\rangle = \alpha\cdot 0 = 0$ by the properties of the inner product. As a consequence, if you take an orthogonal set, and you take any scalars $\alpha_1,\ldots,\alpha_k$, then $\alpha_1\mathbf{w}_1,\ldots,\alpha_k\mathbf{w}_k$ is also an orthogonal set.
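A quick numerical illustration of this scaling property, using the two eigenvectors from the question (the helper `dot` is an illustrative name):

```python
def dot(u, v):
    """Standard inner product on R^2."""
    return sum(a * b for a, b in zip(u, v))

w1, w2 = [-1, 1], [1, 1]       # an orthogonal pair
assert dot(w1, w2) == 0

# Rescaling each vector by arbitrary scalars keeps the pair orthogonal:
for a1 in [2, -3, 0.5]:
    for a2 in [1, -1, 7]:
        assert dot([a1 * x for x in w1], [a2 * x for x in w2]) == 0
```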

A vector $\mathbf{n}$ is normal if $||\mathbf{n}||=1$. If $\alpha$ is any scalar, then $||\alpha\mathbf{n}|| = |\alpha|\,||\mathbf{n}|| = |\alpha|$. So if you multiply any normal vector $\mathbf{n}$ by a scalar $\alpha$ of absolute value $1$ (or of complex norm $1$), then the vector $\alpha\mathbf{n}$ is also a normal vector.

A set of vectors is orthonormal if it is both orthogonal, and every vector is normal. By the above, if you have a set of orthonormal vectors, and you multiply each vector by a scalar of absolute value $1$, then the resulting set is also orthonormal.
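The norm computation above is easy to confirm for the vectors at hand; a minimal sketch in plain Python:

```python
import math

s = 1 / math.sqrt(2)
n = [s, s]                     # a unit vector: ||n|| = 1
assert math.isclose(math.hypot(n[0], n[1]), 1)

# Multiplying by a scalar of absolute value 1 preserves the norm:
for alpha in [1, -1]:
    m = [alpha * x for x in n]
    assert math.isclose(math.hypot(m[0], m[1]), 1)
```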

In summary: you have an orthonormal set of two eigenvectors. You multiply one of them by $-1$; this does not affect the fact that the two are eigenvectors. The set was orthogonal, so multiplying one of them by a scalar does not affect the fact that the set is orthogonal. And the vectors were normal, and you multiplied one by a scalar of absolute value $1$, so the resulting vectors are still normal. So you still have an orthonormal set of two eigenvectors. I leave it to you to verify that if you have a linearly independent set, and you multiply each vector by a nonzero scalar, the result is still linearly independent.