
The eigenspace corresponding to the eigenvalue zero is the same as the null space of the original matrix. All vectors in the null space are linearly independent, so the eigenvectors of zero are also independent.

Is this conclusion right?

1 Answer


You're right that the eigenspace for 0 is the kernel of the transformation.

The last sentence is not correct, however. Certainly not all vectors in the nullspace are linearly independent. If $Ax=0$, so that $x$ is in the nullspace, then $2x$ is another vector in the nullspace, and $x$ and $2x$ are linearly dependent.
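A quick way to see this concretely is a minimal sketch with sympy (assumed available); the matrix $A$ below is made up purely for illustration, not taken from the question.

```python
# Minimal sketch with sympy; the matrix A here is an arbitrary example.
from sympy import Matrix

A = Matrix([[1, 2],
            [2, 4]])                     # rank 1, so the kernel is nontrivial
x = A.nullspace()[0]                     # one vector x with A*x = 0

print(A * x)                             # the zero vector: x is in the kernel
print(A * (2 * x))                       # also zero: 2x is in the kernel too
print(Matrix.hstack(x, 2 * x).rank())    # 1, not 2: x and 2x are dependent
```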

I think you have some thinking to do about what linear independence means. One way to think about it: a collection of vectors is linearly independent when none of them can be expressed as a linear combination of the others (equivalently, the only linear combination of them that equals the zero vector is the trivial one).
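If it helps, here is one hedged sketch of that "one vector is a combination of the others" test, again with sympy; the vectors $v_1, v_2, v_3$ are arbitrary examples chosen only for illustration.

```python
# Test dependence by solving for one vector as a combination of the others.
from sympy import Matrix, linsolve, symbols

v1 = Matrix([1, 0, 1])
v2 = Matrix([0, 1, 1])
v3 = Matrix([2, 3, 5])       # equals 2*v1 + 3*v2, so {v1, v2, v3} is dependent

a, b = symbols('a b')
# Solve a*v1 + b*v2 = v3; a nonempty solution set exhibits the dependence.
print(linsolve((Matrix.hstack(v1, v2), v3), a, b))   # {(2, 3)}
```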

  • I thought the dimension of the kernel was the number of columns minus the rank. The dimension is the number of vectors in a basis, and a basis consists solely of linearly independent vectors, so the null space was a linearly independent set. (2012-12-28)
  • @Jef The kernel is a subspace, and *no subspace is a linearly independent set*, since it contains the zero vector. Reading your last comment, it kind of looks like you are confusing bases with subspaces, somehow. The first two claims of the second sentence are correct, but the last claim does not follow from the first two. (2012-12-28)
  • @Jef Here's another try at breaking your bridge between bases and subspaces. For finite-dimensional vector spaces, a basis is going to have only finitely many elements. However, if your field is infinite, then nonzero subspaces are always going to have infinitely many elements; certainly they can't all be in the basis. If the field is finite, then a subspace of dimension $k$ is going to have $q^k$ elements (definitely more than $k$), where $q$ is the size of the field. (2012-12-28)
  • Hmmm. Now it's getting really abstract. Are the following sentences correct? By solving $Ax=0$, I found a basis for the kernel. It's also a basis for the eigenspace of zero. If I construct a vector from this basis, it will be an eigenvector and it will be linearly dependent on the ones in the basis. This new vector satisfies $Ax=0x$. (2012-12-28)
  • If you solve $Ax=0$ for $x$, you do not have a basis; you have a single element $x$ of the kernel. You can work to find several linearly independent $x_i$ such that $Ax_i=0$. When you find sufficiently many, you will have a basis for the kernel, $\{x_1, \dots, x_k\}$. Any linear combination of those $x_i$ will *also* be in the kernel (see the sketch after these comments). (2012-12-28)