
Let $V$ be a vector space and $f:V\rightarrow V$ a linear map. Let $λ_1,\dots,λ_k$ be distinct eigenvalues of $f$. Then these properties hold:

  1. If $υ_1+υ_2+\dots+υ_k=0$, where $υ_i\in V_f(λ_i)$, $i=1,\dots,k$, then $υ_i=0$ for all $i$.
  2. (...)

Proof of 1. : By applying $f$ we get : $λ_1υ_1+λ_2υ_2+\dots+λ_kυ_k=f(υ_1)+\dots+f(υ_k)=f(υ_1+\dots+υ_k)=0$ (so far so good!)

By iterating this argument, we get : $λ_1^mυ_1+λ_2^mυ_2+\dots+λ_k^mυ_k=0, \forall m\in \Bbb N$, and thus: $φ(λ_1)υ_1+\dots+φ(λ_k)υ_k=0, \forall φ(x) \in \Bbb F[x]$
(...)

It is not very clear to me how these two last results hold.

  • What is $V_f(\lambda_i)$? Is that the eigenspace of the $\lambda_i$ eigenvalue? (2017-01-02)
  • Yes, the eigenspace which corresponds to $λ_i$. (2017-01-02)
  • The definition of $\phi$ is unclear, and those last steps seem like a jump to conclusions. I think the easiest way to prove that eigenvectors corresponding to distinct eigenvalues are linearly independent is by induction: assume it holds for $n$ such vectors, add vector $(n+1)$, and show it cannot be in the span of the others. (2017-01-02)

2 Answers


So far you have $\lambda_1v_1+\lambda_2v_2+\dots+\lambda_kv_k=0$.

Repeating the technique one more time, you will get \begin{align*} 0&=f(0)\\ &=f(\lambda_1v_1+\lambda_2v_2+\dots+\lambda_kv_k)\\ &=\lambda_1f(v_1)+\lambda_2f(v_2)+\dots+\lambda_kf(v_k)\\ &=\lambda_1^2v_1+\lambda_2^2v_2+\dots+\lambda_k^2v_k \end{align*}

Repeating it gives $\lambda_1^mv_1+\dots+\lambda_k^mv_k=0$ for every $m\in\mathbb{N}$.
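As a quick sanity check, this is a minimal numeric sketch (with a small illustrative matrix of my choosing, not from the question) verifying that applying a matrix $m$ times to an eigenvector multiplies it by $\lambda^m$:

```python
import numpy as np

# Illustrative example: A is lower triangular, so its eigenvalues
# are the diagonal entries 5 and -1 (distinct).
A = np.array([[5.0, 0.0],
              [2.0, -1.0]])
lam, V = np.linalg.eig(A)          # columns of V are eigenvectors

m = 4
Am = np.linalg.matrix_power(A, m)  # A applied m times

# f^m(v_i) = lambda_i^m * v_i for each eigenpair
for i in range(2):
    assert np.allclose(Am @ V[:, i], lam[i] ** m * V[:, i])
```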

For the second statement, write $\varphi(x)\in F[x]$ as $\varphi(x)=\sum_{i=0}^ma_ix^i$, where $a_i\in F$ and $m\in\mathbb{N}$.

So $\varphi(\lambda_j)=\sum_{i=0}^ma_i\lambda_j^i$ for each $j$.

Hence, \begin{align*} \varphi(\lambda_1)v_1+\dots+\varphi(\lambda_k)v_k&=\sum_{i=0}^ma_i\lambda_1^iv_1+\dots+\sum_{i=0}^ma_i\lambda_k^iv_k\\ &=a_0(v_1+\dots+v_k)+a_1(\lambda_1v_1+\dots+\lambda_kv_k)\\ &\quad+\dots+a_m(\lambda_1^mv_1+\dots+\lambda_k^mv_k)\\ &=0+0+\dots+0\\ &=0, \end{align*} where the $i=0$ term vanishes by the hypothesis $v_1+\dots+v_k=0$ and each remaining term vanishes by the identity above.


Here's an alternative proof. It does not directly answer your question, but it is too long for a comment and may help you.
Define polynomials $P_j(x)=\prod_{i\neq j} \frac{x-\lambda_i}{\lambda_j-\lambda_i}$ for $j=1,\dots,k$. Note that $P_j(\lambda_i) = \delta_{ij}$.
Now, let $v_1+v_2+\cdots+v_k=0$, where $v_i \in V_f(\lambda_i)$; that is, $v_i = -\sum_{j\neq i} v_j$. Applying $P_i(f)$ to this equation and using its linearity,
$P_i(f)v_i = -\sum_{j\neq i}P_i(f)v_j$. As each $v_j$ is an eigenvector (or zero), $P_i(f)v_j = P_i(\lambda_j)v_j = \delta_{ij}v_j$.
So $v_i = P_i(\lambda_i)v_i = -\sum_{j\neq i}\delta_{ij}v_j = 0$, and therefore $v_i = 0 \ \forall\, 1\leq i \leq k$.
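The key fact, $P_i(f)v_j = \delta_{ij}v_j$, can be checked numerically. A minimal sketch, with an example matrix of my choosing:

```python
import numpy as np

# Illustrative example: lower triangular, distinct eigenvalues 1, 2, 4.
A = np.array([[1.0, 0.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 4.0]])
lam, V = np.linalg.eig(A)          # columns of V are eigenvectors
k = len(lam)

def P(j, M):
    """Evaluate the Lagrange polynomial P_j at the matrix M:
    P_j(M) = prod_{i != j} (M - lam_i I) / (lam_j - lam_i)."""
    n = M.shape[0]
    out = np.eye(n)
    for i in range(k):
        if i != j:
            out = out @ (M - lam[i] * np.eye(n)) / (lam[j] - lam[i])
    return out

# P_i(f) v_j = delta_ij v_j: kills every eigenvector except the i-th
for i in range(k):
    for j in range(k):
        expected = V[:, j] if i == j else np.zeros(k)
        assert np.allclose(P(i, A) @ V[:, j], expected)
```

The matrix factors $(M - \lambda_i I)$ commute, so the order of the product does not matter.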