
This theorem is stated in my book without proof. I looked for a reference online, but I couldn't find a complete proof.

Let $M \in \mathbb R ^{n \times n}$ be a diagonalisable matrix whose eigenvalues satisfy $|\lambda_1|> |\lambda_2| \ge \dots \ge |\lambda_n|$, and let $v_1,\dots,v_n$ be corresponding eigenvectors. Let $x_0 = \sum_{i=1}^n c_i v_i$ for some scalars $c_i$ with $c_1\neq0$, let $x_k = M^k x_0$ for all positive $k \in \mathbb Z$, and let $w\in \mathbb R^n \smallsetminus v_1^\perp$. Then $$\frac{w^T x_{k}}{w^T x_{k-1}} = \lambda_1 + O\left(\left|\frac{\lambda_2}{\lambda_1}\right|^{k}\right)$$

First of all, is this statement correct? And then, does it mean that there is a constant $C$ such that $$\left|\frac{w^T x_{k}}{w^T x_{k-1}} - \lambda_1 \right| \le C\left|\frac{\lambda_2}{\lambda_1}\right|^{k}$$ holds for all $k$? I know that $\lim_{k\to\infty} \frac{w^T x_{k}}{w^T x_{k-1}} = \lambda_1$ and that there is a constant $D$ such that $\left\|\frac{1}{\lambda_1^k}x_k-c_1 v_1\right\|_2 \le D\left|\frac{\lambda_2}{\lambda_1}\right|^{k}$ for all $k$, but I am not particularly at ease with big-O notation... Do you know any proof (a reference is fine as well) that doesn't rely on "obvious" and "immediate" big-O steps?


Since $x_k= \sum_{i=1}^n c_i \lambda_i^k v_i$, we have \begin{align}\left|\frac{w^T x_{k}}{w^T x_{k-1}} - \lambda_1 \right| &= \left| \dfrac{ \sum_{i=1}^n c_i \lambda_i^kw^Tv_i - \lambda_1 \sum_{i=1}^n c_i \lambda_i^{k-1}w^Tv_i }{\sum_{i=1}^n c_i \lambda_i^{k-1}w^Tv_i} \right|\\ &=\left| \frac{\sum_{i=2}^n (\lambda_i-\lambda_1) c_i\lambda_i^{k-1}w^Tv_i} {\sum_{i=1}^n c_i \lambda_i^{k-1}w^Tv_i} \right|\\ &=\left| \frac{\lambda_2}{\lambda_1} \right|^{k-1} \left| \frac{\sum_{i=2}^n (\lambda_i-\lambda_1) c_i(\lambda_i/\lambda_2)^{k-1}w^Tv_i} {\sum_{i=1}^n c_i (\lambda_i/\lambda_1)^{k-1}w^Tv_i} \right| \\ &\le \left| \frac{\lambda_2}{\lambda_1} \right|^{k-1} \frac{\sum_{i=2}^n \left|(\lambda_i-\lambda_1) c_i w^Tv_i\right|} {\left|\left|c_1 w^Tv_1\right|- \sum_{i=2}^n \left| c_i w^Tv_i\right| \right|} . \end{align} Is this correct?
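As a concrete sanity check, the claimed rate can be probed numerically. This is only a sketch under my own assumptions (the eigenvalues, seed, and all variable names below are illustrative): it builds a diagonalisable $M$ with a strict dominant eigenvalue, iterates $x_k = M x_{k-1}$, and checks that the error of the ratio $w^T x_k / w^T x_{k-1}$ shrinks by roughly the factor $|\lambda_2/\lambda_1|$ per step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Diagonalisable M with |lambda_1| > |lambda_2| >= ... >= |lambda_n|.
eigvals = np.array([3.0, 1.5, 1.0, 0.5])
V = rng.standard_normal((4, 4))          # columns play the role of v_1, ..., v_n
M = V @ np.diag(eigvals) @ np.linalg.inv(V)

x = rng.standard_normal(4)               # x_0; generically c_1 != 0
w = rng.standard_normal(4)               # generically w is not in v_1^perp

prev = w @ x
ratio_err = []
for k in range(1, 21):
    x = M @ x                            # x_k = M x_{k-1}
    cur = w @ x
    ratio_err.append(abs(cur / prev - eigvals[0]))
    prev = cur

# The error should shrink by roughly |lambda_2/lambda_1| = 0.5 per iteration.
decay = ratio_err[-1] / ratio_err[-2]
print(decay)
```

The printed decay factor should be close to $|\lambda_2/\lambda_1| = 0.5$, consistent with the $O(|\lambda_2/\lambda_1|^k)$ claim.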

  • Now cancel the fastest growing term $c_1\lambda_1^{k-1}w^Tv_1$. (2017-01-30)
  • @LutzL, thanks. I'm not sure I understand, though... Please see my edit. (2017-01-30)
  • Yes, in principle that is correct. However, you will need a lower bound for the denominator to get an upper bound for the fraction. There $|a+b+c|\ge|a|-|b|-|c|$ applies, and $|b|,|c|,\dots$ can be made arbitrarily small. But $|a|$ itself is not a lower bound; $|a|/2$ is, for sufficiently large $k$. (2017-01-30)

1 Answer


Write $w=\sum_i a_i v_i$. Notice that $x_k=\sum_i c_i \lambda_i^k v_i$, so $w^T x_k = \sum_{i,j} a_i c_j \lambda_j^k v_i^T v_j$. The point is that the $\lambda_1^k$ term does not vanish (this is what the technical assumptions $c_1 \neq 0$ and $w \notin v_1^\perp$ are for) and that it dominates the others once $k$ is large enough. Precisely speaking, $w^T x_k=C \lambda_1^k + O(|\lambda_2|^k)$, where $C$ does not depend on $k$. Thus when you take the ratio you are looking at $\frac{C \lambda_1^k+O(|\lambda_2|^k)}{C \lambda_1^{k-1}+O(|\lambda_2|^{k-1})}$. Eventually the first term in the denominator is bigger than the second, at which point you may recover the desired estimate by dividing top and bottom by $C \lambda_1^{k-1}$ and then applying the formula for the geometric series. (There are other ways to do the analysis, of course.)
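The decomposition $w^T x_k = C \lambda_1^k + O(|\lambda_2|^k)$ with $C = c_1\, w^T v_1$ can also be checked numerically. The sketch below uses my own illustrative matrix and names: it recovers the coefficients $c_i$ from the eigenbasis, forms $C$, and verifies that $w^T x_k / \lambda_1^k$ approaches $C$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Diagonalisable M with a strictly dominant eigenvalue lambda_1 = 2.
eigvals = np.array([2.0, 0.8, 0.3])
V = rng.standard_normal((3, 3))          # columns are the eigenvectors v_i
M = V @ np.diag(eigvals) @ np.linalg.inv(V)

x0 = rng.standard_normal(3)
w = rng.standard_normal(3)

# Coefficients of x_0 in the eigenbasis: x0 = sum_i c_i v_i.
c = np.linalg.solve(V, x0)
C = c[0] * (w @ V[:, 0])                 # the non-vanishing leading coefficient

x = x0.copy()
for k in range(1, 25):
    x = M @ x                            # x_k = M x_{k-1}
    # w^T x_k / lambda_1^k should approach C at rate |lambda_2/lambda_1|^k.
    residual = abs((w @ x) / eigvals[0] ** k - C)

print(residual)
```

The final residual is tiny: it is exactly the $O(|\lambda_2|^k)$ remainder divided by $\lambda_1^k$, i.e. $O(|\lambda_2/\lambda_1|^k)$.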