  • 1.) Show that any nonzero linear combination of two eigenvectors $v,w$ corresponding to the same eigenvalue is also an eigenvector.

  • 2.) Prove that a linear combination $cv+dw$, with $c,d \ne 0$, of two eigenvectors corresponding to different eigenvalues is never an eigenvector.

  • 3.) Let $\lambda$ be a real eigenvalue of the real $n \times n$ matrix $A$, and $v_1,\dots,v_k$ a basis for the associated eigenspace $V_{\lambda}$. Suppose $w \in \mathbb{C}^n$ is a complex eigenvector, so $Aw = \lambda w$. Prove that $w = c_1v_1 + \dots + c_kv_k$ is a complex linear combination of the real eigenspace basis.

For 1 and 2 I know that if two eigenvectors $\vec{v}_1$ and $\vec{v}_2$ are associated with the same eigenvalue, then any linear combination of those two is also an eigenvector associated with that same eigenvalue. But if two eigenvectors $\vec{v}_1$ and $\vec{v}_2$ are associated with different eigenvalues, then the sum $\vec{v}_1+\vec{v}_2$ need not be related to the eigenvalue of either one. In fact, just the opposite: if the eigenvalues are different, then the eigenvectors are linearly independent. But how can I show this with a proof?
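As a sanity check (not a proof), both claims can be tested numerically. The matrix below is a made-up diagonal example with eigenvalue $2$ of multiplicity two and eigenvalue $5$; the vectors are its standard-basis eigenvectors.

```python
import numpy as np

# Made-up example: eigenvalue 2 (multiplicity 2) and eigenvalue 5.
A = np.diag([2.0, 2.0, 5.0])
v = np.array([1.0, 0.0, 0.0])   # eigenvector for eigenvalue 2
w = np.array([0.0, 1.0, 0.0])   # another eigenvector for eigenvalue 2
u = np.array([0.0, 0.0, 1.0])   # eigenvector for eigenvalue 5

# Part 1: a combination of eigenvectors sharing an eigenvalue stays in that eigenspace.
x = 3.0 * v + 4.0 * w
print(np.allclose(A @ x, 2.0 * x))        # True

# Part 2: mixing different eigenvalues breaks the proportionality A y = mu * y.
y = v + u
Ay = A @ y                                # equals 2*v + 5*u, not a multiple of y
print(np.allclose(np.cross(Ay, y), 0.0))  # False: Ay is not parallel to y
```

The cross product is zero exactly when $Ay$ is parallel to $y$, which is what "$y$ is an eigenvector" means here.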

For 3. I am not too sure.

2 Answers


For 2. let $A$ be the matrix; calculate $A(cv+dw)$ under the assumption that $v$ and $w$ are eigenvectors belonging to different eigenvalues; see whether the result is consistent with $cv+dw$ being an eigenvector.

For 3. write $w=w_1+iw_2$ where $w_1$ and $w_2$ are real, and then see what $Aw=\lambda w$ tells you about $w_1$ and $w_2$.

EDIT: In view of the length of the discussion in the comments, I'll expand on this last part.

We get $$A(w_1+iw_2)=\lambda(w_1+iw_2)$$ which is $$Aw_1+iAw_2=\lambda w_1+i\lambda w_2$$ Now if $a+bi=c+di$ where $a,b,c,d$ are real (real numbers, or vectors with real entries, or matrices with real entries), then necessarily $a=c$ and $b=d$ --- that's what equality means in the complex realm. So we deduce $$Aw_1=\lambda w_1,\qquad Aw_2=\lambda w_2$$ So $w_1$ and $w_2$ are in the eigenspace $V_{\lambda}$. We are told $v_1,\dots,v_k$ is a basis for $V_{\lambda}$, so $$w_1=r_1v_1+\cdots+r_kv_k,\qquad w_2=s_1v_1+\cdots+s_kv_k$$ for some real numbers $r_1,\dots,r_k,s_1,\dots,s_k$. Then $$w=w_1+iw_2=c_1v_1+\cdots+c_kv_k{\rm\ with\ }c_j=r_j+is_j,\ j=1,\dots,k$$
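The decomposition above can be checked numerically. The matrix, eigenvalue, basis, and complex eigenvector below are all made up for illustration; the script splits $w$ into real and imaginary parts, verifies each lies in $V_{\lambda}$, and recovers the coefficients $c_j = r_j + is_j$.

```python
import numpy as np

# Made-up real matrix with real eigenvalue 2 whose eigenspace is span{e1, e2}.
A = np.diag([2.0, 2.0, 5.0])
lam = 2.0
V = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]]).T          # columns v1, v2: a basis for V_lambda

# A complex eigenvector for lambda = 2 (made up for illustration).
w = np.array([1 + 2j, 3 - 1j, 0])
assert np.allclose(A @ w, lam * w)

# Split into real and imaginary parts; each lands in the real eigenspace.
w1, w2 = w.real, w.imag
assert np.allclose(A @ w1, lam * w1) and np.allclose(A @ w2, lam * w2)

# Recover the real coefficients r_j, s_j by least squares, then combine.
r, *_ = np.linalg.lstsq(V, w1, rcond=None)
s, *_ = np.linalg.lstsq(V, w2, rcond=None)
c = r + 1j * s
print(np.allclose(V @ c, w))               # True: w = c1*v1 + c2*v2
```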

  • Thanks a lot! Will I use the same method you used for part 2 for part 1, but instead showing that it is consistent rather than inconsistent? (2012-11-20)
  • Yes (I didn't mention part 1 in my answer because I misunderstood you to be saying you had already done that part). (2012-11-20)
  • I am having trouble proving part 3. Do you think you can add more detail, please? What do they mean when they say that $v_1,\dots,v_k$ are a basis for the eigenspace? (2012-11-20)
  • The eigenspace $V_{\lambda}$ is the set of all vectors $v$ such that $Av=\lambda v$. You can prove that $V_{\lambda}$ is a vector space, a subspace of ${\bf R}^n$. Being a (finite-dimensional) vector space, it has a basis. They are saying: let $v_1,\dots,v_k$ be a basis for the vector space $V_{\lambda}$. (2012-11-21)
  • So far what I understand is that $v_1,\dots,v_k$ is a basis of the eigenspace for the real eigenvalue. So I have to prove that a linear combination of that basis equals $w$, which is a complex eigenvector. Is that right? If it is, how can linear combinations of a real basis equal a complex eigenvector? I am having trouble understanding that. (2012-11-21)
  • Substitute $w=w_1+iw_2$ into $Aw=\lambda w$; multiply out; set the real parts equal, and set the imaginary parts equal; notice what the equations you get tell you about $w_1$ and $w_2$. You have to actually do these things if you're going to understand what's happening, so, do them. (2012-11-21)
  • I got $A(w_1 + iw_2)= \lambda (w_1 +iw_2)$, i.e. $Aw_1 + iAw_2 =\lambda w_1 + i\lambda w_2$, then $(A- \lambda)w_1 = - i(A - \lambda)w_2$, thus $w_1 = -iw_2$. That is what I got, but that is exactly telling us that $w_1 + iw_2 = w$, and I can't see how it relates to the proof. (2012-11-21)
  • Remember, $w_1,w_2,A$ and $\lambda$ are all real. So the equation you reached, $(A-\lambda)w_1=-i(A-\lambda)w_2$, actually tells you something very different from the conclusion you drew from it. (2012-11-21)
  • Yup, I am still not following. My biggest problem is: if $w_1,w_2,A,\lambda$ are all real, then how does that make a complex linear combination? I thought only complex eigenvalues give complex eigenvectors. (2012-11-21)
  • No, your biggest problem is looking at an equation that has a real vector on one side and a pure imaginary vector on the other, and not realizing what that is telling you about both of those vectors. (2012-11-21)
  • I know that is my problem. Is it telling me that the pure imaginary part is just a scalar multiple of the vector? (2012-11-21)
  • I have added some details to my answer. (2012-11-21)
  • Thanks a lot for clearing that up!! (2012-11-22)
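Following the eigenspace definition given in the comments ($V_{\lambda}$ = all $v$ with $Av=\lambda v$), a basis for $V_{\lambda}$ can be computed numerically as the null space of $A-\lambda I$. The matrix and tolerance below are made up for illustration; the null-space basis comes from the right singular vectors with (near-)zero singular value.

```python
import numpy as np

# Eigenspace V_lambda = null space of (A - lambda*I).
A = np.diag([2.0, 2.0, 5.0])                 # made-up matrix for illustration
lam = 2.0

# Null-space basis from the SVD: right singular vectors with ~zero singular value.
U, sing, Vt = np.linalg.svd(A - lam * np.eye(3))
basis = Vt[sing < 1e-10]                     # rows of `basis` span V_lambda

print(basis.shape[0])                        # 2: the eigenvalue has a 2-dim eigenspace
for b in basis:
    print(np.allclose(A @ b, lam * b))       # True for each basis vector
```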

All eigenvectors of a matrix are of the form $\lambda v$, where $\lambda$ is a scalar and $v$ is any one of the eigenvectors.

$v+w=\lambda v \implies w=(\lambda-1)v$

The eigenspace is the space of all eigenvectors with a given eigenvalue, so I guess that the question meant to ask for the eigensystem. The definition of a basis is a set of vectors that all the eigenvectors are linear combinations of, so there doesn't seem to be anything to prove for part 3.

  • Thanks, and I am not sure either. Does that also follow from questions 1 and 2? (2012-11-20)