Let $\rho(A)$ be the spectral radius of $A$, that is, the maximum of $|\lambda|$ over the eigenvalues $\lambda$ of $A$. I want to show that for any $\eta$ with $\rho(A) < \eta < 1$ there is $c>0$ such that $\|A^k\| \leq c \eta^k$ for $k = 0,1, \dots$ Can somebody show me how to do this? Thanks!
Inequality from a matrix norm
2
linear-algebra
real-analysis
matrices
-
@Inquest L'Hôpital's rule would do the trick. By the way, how could you have $\|A^t\| = \|P\|\|D\|\|P^{-1}\|$? Isn't it $\|A^t\| \leq \|P\|\|D\|\|P^{-1}\|$? If that's the case, I don't think the next step follows. – 2012-12-18
1 Answer
3
Let $B=\eta^{-1}A$. Then $\rho(B)=\rho(A)/\eta<1$, hence $B^k\rightarrow0$ as $k\rightarrow\infty$. In particular the entries of $B^k$ are bounded, so the Frobenius norms of the $B^k$ are bounded, and in turn $\|B^k\|$ is bounded because all matrix norms are equivalent. Therefore $\|B^k\|\le c$ for some $c>0$, i.e. $\|A^k\|=\eta^k\|B^k\|\le c\eta^k$.
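As a quick numerical sanity check (not part of the proof), here is a sketch in Python with NumPy; the matrix $A$ and the value of $\eta$ are illustrative choices of mine, and $c$ is approximated as the maximum of $\|B^k\|$ over finitely many $k$:

```python
import numpy as np

# Illustrative example: A with spectral radius 0.5, and eta = 0.8
# chosen so that rho(A) < eta < 1 (both values are my assumptions).
A = np.array([[0.5, 1.0],
              [0.0, 0.4]])
rho = max(abs(np.linalg.eigvals(A)))  # spectral radius of A

eta = 0.8
B = A / eta  # B = eta^{-1} A, so rho(B) = rho / eta < 1

# Approximate c = sup_k ||B^k|| over a finite range of k.
c = max(np.linalg.norm(np.linalg.matrix_power(B, k), 2)
        for k in range(200))

# Check ||A^k|| <= c * eta^k (equality ||A^k|| = eta^k ||B^k|| holds exactly;
# the small tolerance absorbs floating-point rounding).
for k in range(200):
    assert np.linalg.norm(np.linalg.matrix_power(A, k), 2) <= c * eta**k + 1e-12
print("bound holds, c =", c)
```

Note that the bound $\|A^k\| \le c\,\eta^k$ holds with this $c$ by construction, since $\|A^k\| = \eta^k \|B^k\|$; the loop merely confirms it numerically.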
-
That's what I was talking about. Nice proof. – 2012-12-18