
If I am given a matrix and told to find a basis for its eigenspace, does that just mean find the eigenvectors of the matrix? In my understanding, an eigenspace of an eigenvalue $\lambda$ is the set of degenerate eigenvectors associated with it, but what about the eigenspace of a matrix? Thanks.

  • Does this answer your question: http://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors#Eigenspace

2 Answers


I wonder if you were told to find a basis of each of the eigenspaces, or a basis of the eigenspace associated with one of its eigenvalues? A matrix doesn't have a single eigenspace; each of its eigenvalues has one. The question must have been mangled somehow between when you read it and when you put it here.

If you have one 2-dimensional eigenspace and one 3-dimensional eigenspace, then finding a basis would mean finding two particular linearly independent vectors in the former, and three in the latter.
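For what it's worth, here is a minimal sketch of how one might compute such bases with SymPy; the matrix `A` below is just a made-up example with one 2-dimensional and one 1-dimensional eigenspace:

```python
from sympy import Matrix

# Made-up 3x3 example: the eigenvalue 2 has a 2-dimensional eigenspace,
# the eigenvalue 5 a 1-dimensional one.
A = Matrix([[2, 0, 0],
            [0, 2, 0],
            [0, 0, 5]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, basis) triples.
for eigval, mult, basis in A.eigenvects():
    print(f"eigenvalue {eigval}: eigenspace basis {[list(v) for v in basis]}")
```

Running this prints a basis of two linearly independent vectors for the eigenvalue 2 and a single basis vector for the eigenvalue 5.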

I don't know why you use the word "degenerate"; it doesn't belong there. (An eigenvalue is sometimes called degenerate when its eigenspace has dimension greater than one, but that is a property of the eigenvalue, not of the eigenvectors.)


Assume everything is over the complex numbers.

Given a transformation $A:\ V\to V$, where $V$ is a finite-dimensional vector space over ${\mathbb C}$, and an arbitrary complex number $\lambda$, the set $E(\lambda):=\{x\in V\ \mid\ Ax=\lambda x\}$ is a subspace of $V$, which for most $\lambda\in{\mathbb C}$ is zero-dimensional, i.e., consists only of the vector $0$. If $\dim\bigl(E(\lambda)\bigr)>0$ (and this happens for only finitely many $\lambda$'s), then that particular $\lambda$ is called an *eigenvalue* of $A$, and $E(\lambda)$ is called the corresponding *eigenspace*. The vectors $x\in E(\lambda)$ different from $0$ are the *eigenvectors* of $A$ belonging to the eigenvalue $\lambda$.

These definitions apply in particular if $V={\mathbb C}^n$ and $A$ is given by a matrix with respect to the standard basis in the usual way.
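As a concrete illustration of the definition (a sketch, with a made-up matrix): in coordinates, $E(\lambda)$ is the null space of $A-\lambda I$, so a basis of it can be read off with SymPy's `nullspace`:

```python
from sympy import Matrix, eye

# Made-up example: 4 is the only eigenvalue; E(4) is 1-dimensional.
A = Matrix([[4, 1],
            [0, 4]])

# E(lambda) = ker(A - lambda*I); it is nonzero only when lambda is an eigenvalue.
for lam in [4, 3]:
    basis = (A - lam * eye(2)).nullspace()
    print(f"lambda = {lam}: dim E = {len(basis)}, basis = {[list(v) for v in basis]}")
```

For $\lambda=4$ this prints a one-element basis, while for $\lambda=3$ the null space is trivial, matching the remark above that $E(\lambda)$ is zero-dimensional for all but finitely many $\lambda$.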