How can I prove that if I have $n$ eigenvectors from different eigenvalues, they are all linearly independent?
How to prove that eigenvectors from different eigenvalues are linearly independent
4 Answers
I'll do it with two vectors and leave it to you to do it in general.
Suppose $\mathbf{v}_1$ and $\mathbf{v}_2$ correspond to distinct eigenvalues $\lambda_1$ and $\lambda_2$, respectively.
Take a linear combination that is equal to $0$, $\alpha_1\mathbf{v}_1+\alpha_2\mathbf{v}_2 = \mathbf{0}$. We need to show that $\alpha_1=\alpha_2=0$.
Applying $T$ to both sides, we get $$\mathbf{0} = T(\mathbf{0}) = T(\alpha_1\mathbf{v}_1+\alpha_2\mathbf{v}_2) = \alpha_1\lambda_1\mathbf{v}_1 + \alpha_2\lambda_2\mathbf{v}_2.$$ Now, instead, multiply the original equation by $\lambda_1$: $$\mathbf{0} = \lambda_1\alpha_1\mathbf{v}_1 + \lambda_1\alpha_2\mathbf{v}_2.$$ Now take the two equations, $$\begin{align*} \mathbf{0} &= \alpha_1\lambda_1\mathbf{v}_1 + \alpha_2\lambda_2\mathbf{v}_2\\ \mathbf{0} &= \alpha_1\lambda_1\mathbf{v}_1 + \alpha_2\lambda_1\mathbf{v}_2, \end{align*}$$ and take their difference to get: $$\mathbf{0} = 0\mathbf{v}_1 + \alpha_2(\lambda_2-\lambda_1)\mathbf{v}_2 = \alpha_2(\lambda_2-\lambda_1)\mathbf{v}_2.$$
Since $\lambda_2-\lambda_1\neq 0$, and since $\mathbf{v}_2\neq\mathbf{0}$ (because $\mathbf{v}_2$ is an eigenvector), then $\alpha_2=0$. Using this on the original linear combination $\mathbf{0} = \alpha_1\mathbf{v}_1 + \alpha_2\mathbf{v}_2$, we conclude that $\alpha_1=0$ as well (since $\mathbf{v}_1\neq\mathbf{0}$).
So $\mathbf{v}_1$ and $\mathbf{v}_2$ are linearly independent.
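For concreteness, here is the same computation on a small example of my own choosing: take $T$ to be multiplication by $A = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}$, with $\mathbf{v}_1 = (1,0)^T$, $\lambda_1 = 2$, and $\mathbf{v}_2 = (0,1)^T$, $\lambda_2 = 3$. If $\alpha_1\mathbf{v}_1+\alpha_2\mathbf{v}_2=\mathbf{0}$, applying $A$ gives $$2\alpha_1\mathbf{v}_1 + 3\alpha_2\mathbf{v}_2 = \mathbf{0},$$ while multiplying the original equation by $\lambda_1=2$ gives $$2\alpha_1\mathbf{v}_1 + 2\alpha_2\mathbf{v}_2 = \mathbf{0};$$ subtracting leaves $\alpha_2\mathbf{v}_2 = \mathbf{0}$, so $\alpha_2 = 0$, and then $\alpha_1 = 0$ as well.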
Now try using induction on $n$ for the general case.
- I believe that you wrote $\lambda_2$ instead of $\lambda_1$ in the row before "Now take" – 2014-03-14
- Is there any intuition behind this? Any pictorial way of thinking? – 2016-05-07
- Maybe this will be of use: starting from $0 = \alpha_1 v_1 + \dots + \alpha_n v_n$, and assuming the eigenvalues are ordered so that $|\lambda_1|$ is strictly the largest, consider $$\lim_{j\to\infty} \frac{1}{\lambda_1^j} A^j (\alpha_1 v_1 + \dots + \alpha_n v_n).$$ From the definition of an eigenvalue (that $Av = \lambda v$), the $v_1$ component grows much faster than the others, so that limit equals $\alpha_1 v_1$; since the vector being iterated is $0$, the limit is $0$, forcing $\alpha_1 = 0$. (A concrete instance is sketched after this thread.) – 2017-04-29
- @Arturo Very nice. – 2017-05-27
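To make the limit argument in the thread above concrete (an illustration with numbers I picked, under the assumption that $|\lambda_1|$ is strictly the largest): with $A = \begin{pmatrix} 3 & 0 \\ 0 & 2 \end{pmatrix}$, $v_1 = (1,0)^T$, $v_2 = (0,1)^T$, one has $$\frac{1}{3^j} A^j (\alpha_1 v_1 + \alpha_2 v_2) = \alpha_1 v_1 + \left(\tfrac{2}{3}\right)^j \alpha_2 v_2 \;\longrightarrow\; \alpha_1 v_1,$$ so if $\alpha_1 v_1 + \alpha_2 v_2 = 0$, the limit is $\mathbf{0}$ and hence $\alpha_1 = 0$.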
Alternative:
Suppose, for contradiction, that $v_1,\dots,v_n$ are not all independent, and let $j$ be the maximal $j$ such that $v_1,\dots,v_j$ are independent (note $j\geq 1$ since $v_1\neq 0$, and $j<n$ by assumption). Then there exist $c_i$, $1\leq i\leq j$, so that $\sum_{i=1}^j c_iv_i=v_{j+1}$. But by applying $T$ we also have that
$$\sum_{i=1}^j c_i\lambda_iv_i=\lambda_{j+1}v_{j+1}=\lambda_{j+1}\sum_{i=1}^j c_i v_i.$$ Hence $$\sum_{i=1}^j \left(\lambda_i-\lambda_{j+1}\right) c_iv_i=0.$$ Since $v_1,\dots,v_j$ are independent, every coefficient $(\lambda_i-\lambda_{j+1})c_i$ must vanish, and since $\lambda_i\neq \lambda_{j+1}$ for $1\leq i\leq j$, this forces every $c_i=0$. But then $v_{j+1}=\sum_{i=1}^j c_iv_i=0$, contradicting that $v_{j+1}$ is an eigenvector.
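To see the mechanism with concrete numbers (an illustration of mine, not part of the argument above): say $j=2$ with $\lambda_1=1$, $\lambda_2=2$, $\lambda_3=3$, and suppose $v_3 = c_1v_1 + c_2v_2$. Applying $T$ gives $$c_1v_1 + 2c_2v_2 = 3v_3 = 3c_1v_1 + 3c_2v_2,$$ hence $(1-3)c_1v_1 + (2-3)c_2v_2 = 0$. Independence of $v_1,v_2$ forces $c_1=c_2=0$, making $v_3=0$, which is impossible for an eigenvector.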
Hope that helps,
- P.S. The argument uses the well-ordering principle on the naturals (by looking at the least $j$ such that $v_1,\ldots,v_{j+1}$ is dependent). Well-ordering for the naturals is equivalent to induction. – 2011-03-28
- @Arturo: Thanks! Should be fixed now. (I didn't realize I was using well-ordering in that way.) – 2011-03-28
- No problem; I'll delete the other two comments since it's been fixed. In any case, many people prefer arguments along these lines to an explicit induction, even if they are logically equivalent (you can think of the argument you give as a proof by contradiction of the inductive step, with the base being taken for granted [or as trivial, since $v_1$ is nonzero]; cast that way, it may be clearer why the two arguments are closely connected). – 2011-03-28
- @Eric Naslund This proof seems very similar to the one given in Axler... – 2011-08-31
- @D B Lim: What is Axler? – 2011-08-31
- @Eric Naslund Sheldon Axler's *Linear Algebra Done Right*. – 2011-09-03
Hey, I think there's a slick way to do this without induction. Suppose that $T$ is a linear transformation of a vector space $V$ and that $v_1,\ldots,v_n \in V$ are eigenvectors of $T$ with corresponding pairwise distinct eigenvalues $\lambda_1,\ldots,\lambda_n \in F$ ($F$ being the field of scalars). We want to show that, if $\sum_{i=1}^n c_i v_i = 0$, where the coefficients $c_i$ are in $F$, then necessarily each $c_i$ is zero.
For simplicity, I will just explain why $c_1 = 0$. Consider the polynomial $p_1(x) \in F[x]$ given by $p_1(x) = (x-\lambda_2) \cdots (x-\lambda_n)$. Note that the factor $x-\lambda_1$ is "missing" here. Now, since each $v_i$ is an eigenvector of $T$, we have \begin{align*} p_1(T) v_i = p_1(\lambda_i) v_i && \text{ where} && p_1(\lambda_i) = \begin{cases} 0 & \text{ if } i \neq 1 \\ (\lambda_1-\lambda_2)\cdots(\lambda_1-\lambda_n) \neq 0 & \text{ if } i = 1 \end{cases}, \end{align*} the $i=1$ value being nonzero because the eigenvalues are distinct.
Thus, applying $p_1(T)$ to the sum $\sum_{i=1}^n c_i v_i = 0$, we get $$ p_1(\lambda_1) c_1 v_1 = 0 $$ which implies $c_1 = 0$, since $p_1(\lambda_1) \neq 0$ and $v_1 \neq 0$.
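As a quick instance (with eigenvalues of my own choosing): for $n=3$ and $\lambda_1=1$, $\lambda_2=2$, $\lambda_3=3$, the polynomial is $p_1(x)=(x-2)(x-3)$. Applying $p_1(T)$ to $c_1v_1+c_2v_2+c_3v_3=0$ kills $v_2$ and $v_3$ while scaling $v_1$ by $p_1(1)=(1-2)(1-3)=2$, leaving $2c_1v_1=0$ and hence $c_1=0$.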
Here is a matrix formulation. Let $\vec{v^1},\vec{v^2},\dots,\vec{v^n}$ be eigenvectors of an $n\times n$ matrix $A$ with pairwise distinct eigenvalues $\lambda_1,\lambda_2,\dots,\lambda_n$ (that is, $\lambda_i\neq\lambda_j$ whenever $i\neq j$).
Let $P$ be the $n\times n$ matrix whose columns are the eigenvectors: $$P=\Big[\vec{v^1},\vec{v^2},\dots,\vec{v^n}\Big]$$
Let $\Lambda$ be the $n\times n$ matrix with the eigenvalues on the diagonal (zeros elsewhere): $$\Lambda = \begin{bmatrix} \lambda_1 & 0 & \dots & 0 \\ 0 & \lambda_2 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & \lambda_n \end{bmatrix} $$ Let $\vec{c}=(c_1,c_2,\dots,c_n)^T$.
We need to show that only $c_1=c_2=\dots=c_n=0$ can satisfy the following: $$c_1\vec{v^1}+c_2\vec{v^2}+\dots+c_n\vec{v^n}= \vec{0}$$ Applying $A$ to this equation gives: $$c_1\lambda_1\vec{v^1}+c_2\lambda_2\vec{v^2}+\dots+c_n\lambda_n\vec{v^n}= \vec{0}$$ We can write this equation in matrix form:
$$P\Lambda \vec{c}=\vec{0}$$
But $P\Lambda=AP$ (column by column, this is just $A\vec{v^i}=\lambda_i\vec{v^i}$), so $$AP\vec{c}=\vec{0},$$ and since $AP\neq 0$, we have $\vec{c}=0$.
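The identity $P\Lambda=AP$ can be checked on a small example (numbers of my own choosing): for $A = \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}$ the eigenpairs are $\lambda_1=2$, $\vec{v^1}=(1,0)^T$ and $\lambda_2=3$, $\vec{v^2}=(1,1)^T$, so $$AP = \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 2 & 3 \\ 0 & 3 \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix} = P\Lambda.$$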
- Let me add a comment to the last step. In order for $\vec{c}=0$ to be the only solution, $AP$ must be invertible; but since $A$ is already invertible (because $\lambda_i \neq \lambda_j$ for all $i\neq j$), the only demand is for $P$ to be invertible $\Rightarrow$ independent eigenvectors. – 2016-09-13