
I think I have a basic understanding of the intuition behind what eigenvalues and eigenvectors are themselves, but I'm struggling to understand why the process I've found online to solve for them actually works. I feel like I'm just going through the motions and have no idea why this process actually works.

Would it be possible for someone to provide me with some intuition behind the process in plain english? If there isn't any easy-to-explain intuition behind this process and as a linear algebra novice I'm better off just memorizing the steps, just tell me that. This is the process I'm referring to:

[Image: the step-by-step process for finding eigenvalues and eigenvectors]

  • Actually, steps 1, 2, and 3 together are just the single step 4. (2017-01-20)

2 Answers


By definition, $\lambda$ is an eigenvalue of $A$ iff there exists nonzero $x$ with $Ax=\lambda x$. The latter means that $x$ is a non-trivial solution of $(A-\lambda I)x=0$. Such a non-trivial solution exists iff the matrix $A-\lambda I$ is singular, i.e., iff the determinant $\det(A-\lambda I)$ is zero. No intuition required.
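This chain of equivalences can be checked numerically. A minimal sketch with NumPy, using a made-up symmetric $2\times 2$ matrix (not part of the original answer):

```python
import numpy as np

# Made-up example matrix; its eigenvalues are 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals = np.linalg.eigvals(A)

# For each eigenvalue lam, the matrix A - lam*I is singular,
# so its determinant vanishes (up to floating-point error).
for lam in eigvals:
    print(lam, np.linalg.det(A - lam * np.eye(2)))
```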

  • When you move $\lambda x$ to the left side, why is the identity matrix added in? I figure it has something to do with matching dimensions, but why the identity matrix in particular? (2017-01-20)
  • @Jake Do you agree that $\lambda x = \lambda I x$? It would certainly be an error to write $(A - \lambda)x = 0$, because you can't subtract a scalar from a matrix. (2017-01-20)
  • Ah, ok. I didn't realize those two terms were equal until I wrote it out. (2017-01-20)
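The identity $\lambda x = \lambda I x$ discussed in the comments is easy to verify numerically (a throwaway NumPy check with arbitrary values):

```python
import numpy as np

lam = 5.0                      # an arbitrary scalar
x = np.array([1.0, 2.0, 3.0])  # an arbitrary vector
I = np.eye(3)                  # identity matrix of matching size

# Multiplying by the identity changes nothing, so lam*x == lam*(I @ x).
assert np.allclose(lam * x, lam * (I @ x))
```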

An eigenvector for an endomorphism of a finite-dimensional vector space $V$, represented by a square matrix $A$, is a non-zero vector $v$ such that $Av$ is collinear to $v$, i.e. a vector $v\ne 0$ such that $Av=\lambda v$ for some scalar $\lambda$. This translates into the homogeneous linear system: $$(A-\lambda I)v=0.$$

As $v$ has to be non-zero, this implies the matrix $A-\lambda I$ is not invertible. This is characterised by the condition $$\det(A-\lambda I)=0.$$ This determinant is a polynomial in $\lambda$, of degree equal to the dimension of $V$: the characteristic polynomial of the endomorphism (or of the matrix $A$).
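As a sketch of this in NumPy (the matrix is a made-up example; when given a square matrix, `np.poly` returns the coefficients of its characteristic polynomial):

```python
import numpy as np

# Made-up 2x2 example: det(A - lam*I) = lam^2 - 4*lam + 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of the characteristic polynomial, highest degree first.
coeffs = np.poly(A)         # [1., -4., 3.]

# Its roots are the eigenvalues of A.
eigvals = np.roots(coeffs)  # 3 and 1
```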

Once you've solved for the roots of the characteristic polynomial (the eigenvalues of $A$), you have to find the non-trivial solutions of each linear system $(A-\lambda_iI)v=0$ (the eigenspace $V_{\lambda_i}$ associated to the eigenvalue $\lambda_i$), and more precisely, a basis of these eigenspaces.
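One minimal way to carry out this last step numerically is to read a null-space basis of $A-\lambda_i I$ off its SVD (a NumPy sketch on an assumed example matrix, not a prescribed method):

```python
import numpy as np

def eigenspace_basis(A, lam, tol=1e-10):
    """Basis of the null space of A - lam*I: the rows of V^T in the SVD
    whose singular value is ~0 span the null space."""
    _, s, vt = np.linalg.svd(A - lam * np.eye(A.shape[0]))
    return vt[s < tol].T

# Assumed example matrix with eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

for lam in np.linalg.eigvals(A):
    basis = eigenspace_basis(A, lam)
    for v in basis.T:  # every basis vector satisfies A v = lam v
        assert np.allclose(A @ v, lam * v)
```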

  • I'm sure you're correct, but the terminology here is way above my head. I'll try to look up the individual terms later. (2017-01-20)
  • Regarding "endomorphism": it's simply a linear map from a vector space into itself, so it's represented by a square matrix. (2017-01-20)