
For an $n\times n$ matrix $A$ with $n$ independent eigenvectors, I want to apply $A$ recursively, as in $A^{1}\vec{u_{0}}=\vec{u_{1}}$; then, to find $\vec{u_{k}}$, I can use $A^{k}\vec{u_{0}}=\vec{u_{k}}$.

And if I expand $A^{k}\vec{u_{0}}=S\Lambda ^{k}\vec{c}$, then $S$ is the matrix whose columns are the eigenvectors, $\vec{c}$ holds the coefficients of $\vec{u_{0}}$ in that eigenvector basis, so that $\vec{u_{0}}=c_{1}\vec{s_{1}}+\cdots+c_{n}\vec{s_{n}}$, and $\Lambda$ is the diagonal matrix of eigenvalues.

Can I do it this way instead: $A^{k}\vec{u_{0}}=\Lambda ^{k}S\vec{c}$, and then, since $S\vec{c}=\vec{u_{0}}$, conclude $A^{k}\vec{u_{0}}=\Lambda ^{k}\vec{u_{0}}$?

Would everything still be the same? I have a hard time proving it to myself. It doesn't look like they are the same, since the order of matrix multiplication matters. But both seem to carry the same meaning as $A^{k}\vec{u_{0}}=c_{1}\lambda _{1}^{k}\vec{s_{1}}+c_{2}\lambda _{2}^{k}\vec{s_{2}}+\cdots+c_{n}\lambda _{n}^{k}\vec{s_{n}}$.

Thanks for any help.

  • Take a look at this [link](http://en.wikipedia.org/wiki/Power_iteration). (2014-04-02)

2 Answers


It is true that if $A$ has a complete set of eigenvectors, you can expand $\vec{u_{0}}=\sum c_n \vec{s_n}$ with the $\vec{s_n}$ being eigenvectors of $A$. Then, if $\lambda_n$ is the eigenvalue corresponding to $\vec{s_n}$, you have $\vec{u_1}=A\vec{u_0}=\sum c_n \lambda_n \vec{s_n}$ and $\vec{u_k}=A^k\vec{u_0}=\sum c_n \lambda_n^k \vec{s_n}$. You can show this by using the linearity of the operations.
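The expansion above is easy to check numerically. Here is a minimal NumPy sketch (the matrix and vector are my own illustrative choices, not from the question): it computes the eigendecomposition, solves for the coefficients $\vec{c}$, and confirms that $A^{k}\vec{u_{0}}=\sum_n c_n \lambda_n^k \vec{s_n}$.

```python
# Verify u_k = A^k u_0 = sum_n c_n * lambda_n^k * s_n for a small example.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # symmetric, so it has independent eigenvectors
u0 = np.array([1.0, -3.0])
k = 5

lam, S = np.linalg.eig(A)           # columns of S are the eigenvectors s_n
c = np.linalg.solve(S, u0)          # coefficients such that u0 = S c

uk_direct = np.linalg.matrix_power(A, k) @ u0
uk_expand = sum(c[n] * lam[n]**k * S[:, n] for n in range(len(lam)))

print(np.allclose(uk_direct, uk_expand))  # True
```

Because each power of $A$ acts on each eigenvector independently, the sum form and the direct matrix power agree to floating-point precision.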


As J.M. suggests, matrix multiplication is not commutative, so $A^{k}\vec{u_{0}}=\Lambda ^{k}\vec{u_{0}}$ is not valid in general. The point is that $A^{k}\vec{u_{0}}=S\Lambda ^{k}\vec{c}\neq\Lambda ^{k}S\vec{c}$: you cannot move $\Lambda^{k}$ past $S$.
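You can see the failure concretely with NumPy (my own small example, assuming a matrix whose eigenvector matrix does not commute with $\Lambda$): $S\Lambda^{k}\vec{c}$ reproduces $A^{k}\vec{u_{0}}$, while $\Lambda^{k}S\vec{c}=\Lambda^{k}\vec{u_{0}}$ gives a different vector.

```python
# Show that S @ Lam^k @ c and Lam^k @ S @ c differ in general.
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])          # eigenvalues 2 and 3, non-orthogonal eigenvectors
u0 = np.array([1.0, 2.0])
k = 3

lam, S = np.linalg.eig(A)
Lam_k = np.diag(lam**k)
c = np.linalg.solve(S, u0)          # u0 = S c

correct = S @ Lam_k @ c             # equals A^k u0
wrong = Lam_k @ S @ c               # equals Lam^k u0, a different vector

print(np.allclose(correct, np.linalg.matrix_power(A, k) @ u0))  # True
print(np.allclose(correct, wrong))                              # False
```

The orders agree only in special cases, e.g. when $A$ is already diagonal, so $S$ can be taken as the identity.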