
I think the answer is no, but to be precise: if a matrix has a repeated eigenvalue (two eigenvalues that are the same), do the corresponding eigenvectors have to be the same too?

For example,

$$\begin{gathered} T(1,0,0) = (0, - 2,0) \hfill \\ T(0,1,0) = (0,0.5,0) \hfill \\ T(0,0,1) = (0,1.5,0) \hfill \\ \end{gathered}$$

The transformation matrix $T$ has two eigenvalues equal to zero, but surely they cannot share the same eigenvector? The remaining eigenvalue is $0.5$.
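As a sanity check on the eigenvalues, here is a small Python sketch (plain lists, no linear-algebra library) that builds the matrix of $T$ from the images of the standard basis vectors and verifies the eigenvalue $0.5$ directly:

```python
# Columns of T are the images of the standard basis vectors:
#   T(e1) = (0, -2, 0), T(e2) = (0, 0.5, 0), T(e3) = (0, 1.5, 0)
T = [[0.0, 0.0, 0.0],
     [-2.0, 0.5, 1.5],
     [0.0, 0.0, 0.0]]

def apply(M, v):
    """Matrix-vector product for a 3x3 matrix given as nested lists."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

# det(T - t I) = t^2 (0.5 - t), so the eigenvalues are 0 (twice) and 0.5.
# Check that e2 = (0, 1, 0) is an eigenvector for eigenvalue 0.5:
print(apply(T, [0.0, 1.0, 0.0]))  # -> [0.0, 0.5, 0.0]
```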

  • 2
    If $v$ is an eigenvector, then $\lambda v$ is also an eigenvector. (2017-02-08)
  • 4
    Well, *any* scalar multiple of an eigenvector is again an eigenvector (with the same eigenvalue), but more is true: a single eigenvalue can have several **linearly independent** eigenvectors. Take, for example, the unit matrix (operator) ... (2017-02-08)
  • 3
    _Every_ vector is an eigenvector of eigenvalue $1$ for the identity matrix. (2017-02-08)

1 Answer


The same eigenvalue can have multiple eigenvectors that are not only unequal but even mutually orthogonal. In general, the eigenvectors for a particular eigenvalue (together with the zero vector) form an eigenspace, which is a vector space and can have any dimension up to the dimension of the matrix.

For a counterexample to your theory, consider the identity matrix. Every nonzero vector is an eigenvector of the identity matrix and has an eigenvalue of 1.
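The identity-matrix counterexample can be checked in a couple of lines of Python (a minimal sketch with plain lists; the test vector is an arbitrary choice):

```python
# 3x3 identity matrix: I v = 1 * v for EVERY vector v,
# so every nonzero vector is an eigenvector with eigenvalue 1.
I = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]

def apply(M, v):
    """Matrix-vector product for a 3x3 matrix given as nested lists."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

v = [3.0, -1.0, 7.0]      # an arbitrary nonzero vector
assert apply(I, v) == v   # I v = 1 * v, so v is an eigenvector for eigenvalue 1
```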

  • 0
    In my example, how many eigenvectors does $T$ have: 2 or 3? (2017-02-08)
  • 0
    @Artem $T$ has two different eigenvalues, $0$ and $0.5$. The eigenspace (the vector space consisting of all eigenvectors) corresponding to eigenvalue $0$ has dimension $2$, while the one corresponding to eigenvalue $0.5$ has dimension $1$; both eigenspaces (like all vector spaces over infinite fields) contain infinitely many vectors. (2017-02-08)
  • 0
    @Qudit In the first sentence, did you mean "independent" when you wrote "orthogonal"? (2017-02-08)
  • 0
    I meant orthogonal (I'm assuming an inner product). (2017-02-08)
  • 0
    @ZoranLoncarevic: Just nitpicking: "both eigenspaces (like all vector spaces over infinite fields) contain infinitely many vectors." — Well, *almost* all vector spaces over infinite fields: the zero-dimensional vector space contains only one vector. (2017-02-08)
  • 0
    @celtschk Yes, you are right :) (2017-02-08)
  • 0
    @Artem What are $e_1,e_2,e_3$? Is $e_1=(1,0,0)$, $e_2=(0,1,0)$, $e_3=(0,0,1)$? In that case, of those 3 vectors, only $e_2$ is an eigenvector, corresponding to eigenvalue $0.5$. A basis of the other eigenspace, corresponding to eigenvalue zero, is given, for example, by $$\{e_1+4e_2, 3e_2-e_3\}.$$ (2017-02-08)
  • 0
    @Artem Do you know the definitions of eigenvalue and eigenvector? Do you know what the dimension of a vector space is? (2017-02-08)
  • 0
    Sorry, I was a bit confused by the question I was working on; apparently it used different definitions of the basis vectors, which confused me. (2017-02-09)
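The basis $\{e_1+4e_2,\; 3e_2-e_3\}$ for the zero eigenspace proposed in the comments can also be verified numerically (a small Python sketch with plain lists; both vectors should be mapped to zero by $T$):

```python
# Matrix of T from the question (columns = images of e1, e2, e3).
T = [[0.0, 0.0, 0.0],
     [-2.0, 0.5, 1.5],
     [0.0, 0.0, 0.0]]

def apply(M, v):
    """Matrix-vector product for a 3x3 matrix given as nested lists."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

# e1 + 4 e2 = (1, 4, 0) and 3 e2 - e3 = (0, 3, -1) both land on the
# zero vector, so they lie in (and span) the eigenspace for eigenvalue 0.
for v in ([1.0, 4.0, 0.0], [0.0, 3.0, -1.0]):
    assert apply(T, v) == [0.0, 0.0, 0.0]
```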