
I have this matrix:

$ \begin{pmatrix} 3 & \sqrt{2} \\ \sqrt{2} & 2 \\ \end{pmatrix} $

I found the eigenvalues (they are $1$ and $4$), but I find it really difficult to find the bases of the eigenspaces, i.e. to solve $(cI-A)x=0$. I think maybe one eigenvector is $(1,1)$, but I'm not sure...

3 Answers


The eigenvalues are correct.

To find the eigenvectors, you need to solve systems of linear equations. Namely, to find the eigenvectors associated to $1$, you need to find all vectors $\mathbf{x}$ for which $(A-(1)I)\mathbf{x}=\mathbf{0}$ (this is equivalent to $(I-A)\mathbf{x}=\mathbf{0}$, but computing $A-I$ involves changing fewer entries, so it may be less prone to errors).

$A-I = \left(\begin{array}{cc} 3&\sqrt{2}\\ \sqrt{2}&2\end{array}\right) - \left(\begin{array}{cc}1&0\\0&1\end{array}\right) = \left(\begin{array}{cc}2 & \sqrt{2}\\ \sqrt{2} & 1 \end{array}\right).$

When is $(A-I)\mathbf{x}=\mathbf{0}$? We need to solve the system $\left(\begin{array}{cc} 2 & \sqrt{2}\\ \sqrt{2}&1\end{array}\right)\left(\begin{array}{c}x_1\\x_2\end{array}\right) = \left(\begin{array}{c}0\\0\end{array}\right).$

At this point, you probably know how to solve systems of linear equations. For example, we can use Gaussian elimination on the matrix:

$\begin{align*} \left(\begin{array}{cc} 2 & \sqrt{2}\\ \sqrt{2} & 1 \end{array}\right) &\to \left(\begin{array}{cc} 1 & \frac{\sqrt{2}}{2}\\ \sqrt{2} & 1 \end{array}\right) &\text{(divide first row by }2\text{)}\\ &\to \left(\begin{array}{cc} 1 & \frac{\sqrt{2}}{2}\\ 0 & 1 - \sqrt{2}\left(\frac{\sqrt{2}}{2}\right)\end{array}\right) &\text{(subtract }\sqrt{2}\text{ times the first row from row 2)}\\ &=\left(\begin{array}{cc} 1 & \frac{\sqrt{2}}{2}\\ 0 & 0 \end{array}\right). \end{align*}$

The system has infinitely many solutions (as it should, since there have to be eigenvectors associated to the eigenvalue). They are all vectors $(x_1,x_2)^t$ such that $x_1+\frac{\sqrt{2}}{2}x_2 = 0$. That means that we must have $x_1 = -\frac{\sqrt{2}}{2}x_2$. If we set $x_2=\sqrt{2}$, then $x_1=-1$. So one eigenvector is $(-1,\sqrt{2})^t$.
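If you want to double-check that computation, here is a small symbolic sketch (not part of the original answer) using sympy; `A` and `v` are just the matrix and the candidate eigenvector from above:

```python
# Symbolic sanity check with sympy (a sketch, assuming sympy is available):
# verifies that (-1, sqrt(2))^t is an eigenvector of A for the eigenvalue 1.
from sympy import Matrix, eye, sqrt

A = Matrix([[3, sqrt(2)], [sqrt(2), 2]])
v = Matrix([-1, sqrt(2)])

print((A - eye(2)).rref())  # rref of A - I (plus pivot columns): rows (1, sqrt(2)/2) and (0, 0)
print(A * v)                # equals 1*v, i.e. Matrix([[-1], [sqrt(2)]])
```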

Now you need to do the same thing with $A-4I$, that is, with the matrix $\left(\begin{array}{rr} -1 & \sqrt{2}\\ \sqrt{2} & -2 \end{array}\right).$
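For reference (a sketch, not part of the original answer), the analogous reduction should go roughly like this: $\begin{align*} \left(\begin{array}{rr} -1 & \sqrt{2}\\ \sqrt{2} & -2 \end{array}\right) &\to \left(\begin{array}{rr} 1 & -\sqrt{2}\\ \sqrt{2} & -2 \end{array}\right) &\text{(multiply first row by }-1\text{)}\\ &\to \left(\begin{array}{rr} 1 & -\sqrt{2}\\ 0 & 0 \end{array}\right) &\text{(subtract }\sqrt{2}\text{ times row 1 from row 2)} \end{align*}$ so the solutions satisfy $x_1 = \sqrt{2}\,x_2$, and $x_2=1$ gives the eigenvector $(\sqrt{2},1)^t$, consistent with the other answers below.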

(No, $(1,1)$ is not an eigenvector; note that $A(1,1)^t = (3+\sqrt{2},2+\sqrt{2})^t$, which is not a scalar multiple of $(1,1)$.)

  • @Nusha: First: it's not $|cI-A|x=0$. The absolute value bars are just **wrong**. Second: By definition, an eigenvector of $A$ associated to $c$ is a vector $x$ such that $Ax=cx$. This is the same as $Ax-cx=0$, which is the same as $Ax-cIx = 0$, which is the same as $(A-cI)x = 0$, which is the same as $-(A-cI)x = -0 = 0$, which is the same as $(cI-A)x=0$. So finding eigenvectors of $A$ associated to $c$ **is the same thing** as finding solutions to the system of equations $(cI-A)x=0$. The advantage of the latter is that we are supposed to *know* how to solve homogeneous systems of linear equations.

Indeed, if you solve $(A-I)x = 0$, i.e. $ \begin{pmatrix} 2 & \sqrt{2} \\ \sqrt{2} & 1 \end{pmatrix} x = 0$, you find that $x = (1, -\sqrt{2})^{T}$ is an eigenvector.

For $(A - 4I)x = 0$, i.e. $ \begin{pmatrix} -1 & \sqrt{2} \\ \sqrt{2} & -2 \end{pmatrix} x = 0$, you find that $x = (\sqrt{2}, 1)^{T}$ is another eigenvector.
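As a quick numerical cross-check (a sketch, not part of the original answer), numpy's `eigh` routine for symmetric matrices reproduces both results up to normalization:

```python
# Numerical check with numpy (sketch): A is symmetric, so eigh returns real
# eigenvalues in ascending order and orthonormal eigenvectors as columns.
import numpy as np

A = np.array([[3.0, np.sqrt(2)], [np.sqrt(2), 2.0]])
eigenvalues, eigenvectors = np.linalg.eigh(A)
print(eigenvalues)   # approximately [1. 4.]
print(eigenvectors)  # columns are unit-length multiples of (1, -sqrt(2)) and (sqrt(2), 1)
```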


Just form the matrix $\lambda I -A$ and look for any nonzero element in its null space.

For example, with $\lambda = 1$, you have $\begin{pmatrix} -2 & -\sqrt{2} \\ -\sqrt{2} & -1 \\ \end{pmatrix}$. You might notice that the first row is $\sqrt{2}$ times the second, so there is an eigenvector $(1,-\sqrt{2})^T$.

Repeat the process for the other eigenvalue. Since the eigenvalues are different, the eigenvectors are linearly independent and so span $\mathbb{R}^2$.
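If you want to automate the null-space step, here is one way it might look with sympy's `nullspace` method (a sketch, not part of the original answer):

```python
# Sketch: for each eigenvalue, compute a basis of the null space of
# (lambda*I - A); any nonzero element of that space is an eigenvector.
from sympy import Matrix, eye, sqrt

A = Matrix([[3, sqrt(2)], [sqrt(2), 2]])
for lam in (1, 4):
    print(lam, (lam * eye(2) - A).nullspace())
```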