  • 1.) Show that any nonzero linear combination of two eigenvectors $v,w$ corresponding to the same eigenvalue is also an eigenvector.

  • 2.) Prove that a linear combination $cv+dw$, with $c,d \ne 0$, of two eigenvectors corresponding to different eigenvalues is never an eigenvector.

  • 3.) Let $\lambda$ be a real eigenvalue of the real $n \times n$ matrix $A$, and $v_1,\dots,v_k$ a basis for the associated eigenspace $V_{\lambda}$. Suppose $w \in \mathbb{C}^n$ is a complex eigenvector, so $Aw = \lambda w$. Prove that $w$ is a complex linear combination of the real eigenspace basis: $w = c_1v_1 + \cdots + c_kv_k$.

For 1 and 2 I know that if two eigenvectors $\vec{v}_1$ and $\vec{v}_2$ are associated with the same eigenvalue then any linear combination of those two is also an eigenvector associated with that same eigenvalue. But, if two eigenvectors $\vec{v}_1$ and $\vec{v}_2$ are associated with different eigenvalues then the sum $\vec{v}_1+\vec{v}_2$ need not be related to the eigenvalue of either one. In fact, just the opposite: if the eigenvalues are different then the eigenvectors are not linearly related. But how can I show this with a proof?

For 3. I am not too sure.
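Not a proof, but a quick numerical sanity check of the claims in parts 1 and 2; the specific diagonal matrix and vectors below are my own illustrative choice:

```python
import numpy as np

# A small diagonal matrix with the eigenvalue 2 repeated and a
# distinct eigenvalue 5 (illustrative example only).
A = np.diag([2.0, 2.0, 5.0])
v = np.array([1.0, 0.0, 0.0])   # eigenvector for eigenvalue 2
w = np.array([0.0, 1.0, 0.0])   # eigenvector for eigenvalue 2
u = np.array([0.0, 0.0, 1.0])   # eigenvector for eigenvalue 5

# Part 1: a nonzero combination of same-eigenvalue eigenvectors is
# again an eigenvector for that eigenvalue.
x = 3 * v + 4 * w
print(np.allclose(A @ x, 2 * x))        # True

# Part 2: a combination across different eigenvalues is not an
# eigenvector: A @ y = 2*v + 5*u is not a scalar multiple of y = v + u.
y = v + u
print(np.allclose(A @ y, 2 * y), np.allclose(A @ y, 5 * y))  # False False
```

Of course the numerical check only probes one matrix; the proofs below show why it holds in general.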

2 Answers


For 2. let $A$ be the matrix; calculate $A(cv+dw)$ under the assumption that $v$ and $w$ are eigenvectors belonging to different eigenvalues; see whether the result is consistent with $cv+dw$ being an eigenvector.
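Sketching that calculation (the eigenvalue names $\lambda,\mu,\nu$ are my own notation): if $Av=\lambda v$ and $Aw=\mu w$ with $\lambda \ne \mu$, then $A(cv+dw)=c\lambda v+d\mu w$. If $cv+dw$ were an eigenvector with eigenvalue $\nu$, we would also have $A(cv+dw)=\nu cv+\nu dw$, and subtracting gives $c(\lambda-\nu)v+d(\mu-\nu)w=0$. Since eigenvectors for distinct eigenvalues are linearly independent and $c,d \ne 0$, this would force $\lambda=\nu=\mu$, contradicting $\lambda \ne \mu$.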

For 3. write $w=w_1+iw_2$ where $w_1$ and $w_2$ are real, and then see what $Aw=\lambda w$ tells you about $w_1$ and $w_2$.

EDIT: In view of the length of the discussion in the comments, I'll expand on this last part.

We get $A(w_1+iw_2)=\lambda(w_1+iw_2)$, which is $Aw_1+iAw_2=\lambda w_1+i\lambda w_2$. Now if $a+bi=c+di$ where $a,b,c,d$ are real (real numbers, vectors with real entries, or matrices with real entries), then necessarily $a=c$ and $b=d$ --- that's what equality means in the complex realm. So we deduce $Aw_1=\lambda w_1$ and $Aw_2=\lambda w_2$, so $w_1$ and $w_2$ are in the eigenspace $V_{\lambda}$. We are told $v_1,\dots,v_k$ is a basis for $V_{\lambda}$, so $w_1=r_1v_1+\cdots+r_kv_k$ and $w_2=s_1v_1+\cdots+s_kv_k$ for some real numbers $r_1,\dots,r_k,s_1,\dots,s_k$. Then $w=w_1+iw_2=c_1v_1+\cdots+c_kv_k$ with $c_j=r_j+is_j$, $j=1,\dots,k$.
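A small numerical illustration of this real/imaginary-part argument; the matrix, eigenvalue, and complex eigenvector below are my own example, not from the question:

```python
import numpy as np

# Real matrix with real eigenvalue lam = 3, whose real eigenspace is
# spanned by the first standard basis vector (illustrative choice).
A = np.diag([3.0, 7.0])
lam = 3.0

# A complex eigenvector for lam: w = (1 + 2i) e1, so A w = lam w.
w = (1 + 2j) * np.array([1.0, 0.0])
print(np.allclose(A @ w, lam * w))      # True

# Matching real and imaginary parts of A w = lam w:
w1, w2 = w.real, w.imag
print(np.allclose(A @ w1, lam * w1))    # True: w1 is in the real eigenspace
print(np.allclose(A @ w2, lam * w2))    # True: w2 is in the real eigenspace
```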

  • Thanks a lot for clearing that up!! (2012-11-22)

All eigenvectors of a matrix for a given eigenvalue are of the form $\lambda v$ where $\lambda$ is a scalar and $v$ is any one of the eigenvectors.

$v+w=\lambda v \implies w=(\lambda-1)v$

The eigenspace is the space of all eigenvectors with a given eigenvalue, so I guess that the question meant to ask for the eigensystem. The definition of a basis is a set of vectors that all the eigenvectors are linear combinations of, so there doesn't seem to be anything to prove for part 3.

  • Thanks and I am not sure either? Does that also follow from questions 1 and 2? (2012-11-20)