
I really don't understand why this would be true just from looking at it, and my book offers a proof I find confusing: http://calcchat.com/book/Elementary-Linear-Algebra-7e/7/1/61/

I don't understand how they found A', and overall it still isn't getting across to me why this is true on an intuitive level.

Can someone explain this concept to me in a more intuitive manner?

(I saw some people mention diagonalization in similar questions on here, but my book doesn't get to that until the next section)

  • The number of Jordan blocks corresponding to an eigenvalue in the Jordan normal form is less than or equal to the sum of the dimensions of the blocks. (2017-02-18)

2 Answers


Here's an intuitive overview:

What is a matrix? A matrix is a representation of a linear transformation between two vector spaces. The way we get this representation is by considering the linear transformation of basis vectors. If we know the linear transformation of all the basis vectors, we know the transformation of any vector by expressing it as a combination of basis vectors.

So we define a matrix as the list of transformed basis vectors (the columns of the matrix), and we define matrix multiplication as finding the appropriate combination of transformed basis vectors (tell me if this needs clarification).

Consider an eigenspace $E_\lambda$ of a linear transformation $T$. We know that there is a basis for $E_\lambda$ which has $\dim E_\lambda$ vectors. A basis is linearly independent, and a standard fact says that any linearly independent set can be extended, by adding more vectors, to a basis for the entire vector space.

Now, we use our basis for $E_\lambda$, together with the vectors we added in, to create a matrix representation of $T$. This will generally be a different, but equally valid, representation of $T$ than the one we originally had (this is the matrix $A'$ in your book's proof). When we transform a basis vector of $E_\lambda$, we know we get $\lambda$ times that vector. This is why the first $\dim E_\lambda$ columns (the transformed basis vectors) of the matrix they give are all 0s except for a single entry $\lambda$ on the diagonal. When we compute the characteristic polynomial from this matrix representation, the factor $(x-\lambda)$ therefore shows up at least $\dim E_\lambda$ times.
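In symbols (a sketch, writing $k = \dim E_\lambda$ for an $n$-dimensional space, with $B$ and $C$ standing for whatever the remaining basis vectors contribute), the matrix in this basis has the block form

```latex
A' = \begin{pmatrix} \lambda I_k & B \\ 0 & C \end{pmatrix},
\qquad
\det(xI_n - A') = \det\!\big((x-\lambda)I_k\big)\,\det(xI_{n-k} - C)
                = (x-\lambda)^k \det(xI_{n-k} - C),
```

using the fact that the determinant of a block upper-triangular matrix is the product of the determinants of its diagonal blocks. So $(x-\lambda)^k$ divides the characteristic polynomial, which is exactly the statement that the algebraic multiplicity is at least $k = \dim E_\lambda$.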

But it might show up more times. Therefore the algebraic multiplicity (the number of times $(x-\lambda)$ appears) is greater than or equal to the dimension of the eigenspace ($\dim E_\lambda$).
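To see the inequality numerically, here is a small sketch (assuming NumPy is available) using a hypothetical $3\times 3$ matrix where the two multiplicities actually differ:

```python
import numpy as np

# Hypothetical example: lambda = 2 is a double root of the
# characteristic polynomial (algebraic multiplicity 2), but the
# eigenspace E_2 turns out to be only one-dimensional.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

lam = 2.0
n = A.shape[0]

# Algebraic multiplicity: count how often lam appears as an eigenvalue.
eigenvalues = np.linalg.eigvals(A)
alg_mult = int(np.sum(np.isclose(eigenvalues, lam)))

# Geometric multiplicity: dim E_lam = dim null(A - lam*I),
# which equals n - rank(A - lam*I) by the rank-nullity theorem.
geo_mult = n - np.linalg.matrix_rank(A - lam * np.eye(n))

print(alg_mult, geo_mult)  # 2 1 -- geometric <= algebraic
```

The rank computation is just a numerical stand-in for finding how many independent solutions $(A - \lambda I)v = 0$ has.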

Did that help?


Note that from the theorem we know that eigenvectors corresponding to distinct eigenvalues must be linearly independent. NOT THE OTHER WAY AROUND. That is, two vectors can be linearly independent while having the same eigenvalue.
So, suppose the multiplicity of an eigenvalue is 2. Then either there are two linearly independent eigenvectors for it, or every pair of its eigenvectors is linearly dependent. If they are linearly dependent, the dimension of their span is obviously one. If not, the dimension is at most two. And this generalizes to more than two vectors. Hence, the dimension of the eigenspace cannot exceed the multiplicity.
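A minimal sketch of the two cases above (assuming NumPy, with two hypothetical $2\times 2$ matrices that both have the double eigenvalue $2$):

```python
import numpy as np

def geo_mult(A, lam):
    """Dimension of the eigenspace E_lam, via rank-nullity."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

# Both matrices have eigenvalue 2 with multiplicity 2.
diag  = np.array([[2.0, 0.0], [0.0, 2.0]])  # two independent eigenvectors
shear = np.array([[2.0, 1.0], [0.0, 2.0]])  # every eigenvector is a
                                            # multiple of (1, 0)

print(geo_mult(diag, 2.0), geo_mult(shear, 2.0))  # 2 1
```

In both cases the eigenspace dimension stays at or below the multiplicity 2, but only the diagonal matrix attains it.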
Was a little late, but I hope it helps future readers.