
Can anybody show how to prove, or point to a proof of, the following statement:

Given the lower bidiagonal matrix $$T = \begin{pmatrix} t_{11} & 0 & 0 & \dotsm & 0 & 0 & \dotsm & 0 & 0\\ t_{21} & t_{22} & 0 & \dotsm & 0 & 0 & \dotsm & 0 & 0 \\ 0 & t_{32} & t_{33} & \dotsm & 0 & 0 & \dotsm & 0 & 0 \\ \vdots & & & \ddots \\ 0 & 0 & 0 & \dotsm & t_{i,i-1} & t_{ii} & \dotsm & 0 & 0 \\ \vdots & & & & & & \ddots \\ 0 & 0 & 0 & \dotsm & 0 & 0 & \dotsm & t_{m,m-1} & t_{mm} \end{pmatrix}$$ where $0 \leqslant t_{ij} \leqslant 1$ for $1 \leqslant i, j \leqslant m$, $i \neq j$, $0 < t_{ii} < 1$ for $1 \leqslant i \leqslant m$, and $\sum_{i=1}^m t_{ij} \leqslant 1$ for each $j$. Then \begin{equation} \label{trans_mat_geom} (I-T)^{-1} = \sum\limits_{k=0}^{\infty}{T^k}. \end{equation} Thanks in advance.
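The claimed identity is easy to check numerically. Below is a minimal sketch (using NumPy, with a hypothetical $3 \times 3$ matrix chosen to satisfy the stated hypotheses) comparing $(I-T)^{-1}$ with the partial sums of $\sum_k T^k$:

```python
import numpy as np

# Hypothetical 3x3 lower-bidiagonal example satisfying the hypotheses:
# 0 < t_ii < 1 and each column sum at most 1.
T = np.array([
    [0.5, 0.0, 0.0],
    [0.3, 0.4, 0.0],
    [0.0, 0.2, 0.6],
])

direct = np.linalg.inv(np.eye(3) - T)

# Partial sums of the Neumann series sum_k T^k.
partial = np.zeros((3, 3))
power = np.eye(3)
for _ in range(200):
    partial += power
    power = power @ T

print(np.allclose(direct, partial))  # True: the partial sums converge to (I - T)^{-1}
```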

  • 2
    One way of proceeding is to reduce to the case where $\mathbf T$ is either a scalar or a Jordan block, since any other matrix can be treated as a block diagonal matrix with scalar or Jordan blocks after a similarity transformation. The scalar case is familiar. The Jordan case requires a bit more work. (2011-12-01)
  • 0
    Dear Pierre, thank you too :) :). @J.M. I already have Gantmacher's book! I'm going through it, and I'm downloading the other book. I will let you know about the progress. Thank you so much. :) (2011-12-01)
  • 1
    Dear Suso: I removed the "$\geqslant 0$". I hope it's ok. Please feel free to tell me if it's not. (2011-12-01)
  • 0
    Sure, Pierre. Thanks. I've reformatted the question; I hope it makes sense. Thanks. (2011-12-01)
  • 2
    You have made drastic changes to your question. If $T$ is triangular and some $t_{ii}=1$, the series $\sum_k T^k$ simply diverges because $\sum_{k=0}^\infty (T^k)_{ii} = \sum_{k=0}^\infty t_{ii}^k = \sum_{k=0}^\infty 1=\infty$. (2011-12-01)
  • 0
    OK, I was not aware of this. Thanks. I'm learning. If $t_{ii} \neq 1$, is there a direct way of getting the maximal eigenvalue of $T$? (2011-12-01)
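Regarding the last comment: since $T$ is triangular, its eigenvalues are exactly the diagonal entries $t_{ii}$, so the maximal eigenvalue is simply $\max_i t_{ii}$. A quick numerical check (hypothetical NumPy example):

```python
import numpy as np

# For a triangular matrix the eigenvalues are exactly the diagonal entries,
# so the maximal eigenvalue is max_i t_ii.  A hypothetical example:
T = np.array([
    [0.5, 0.0, 0.0],
    [0.3, 0.4, 0.0],
    [0.0, 0.2, 0.6],
])

# Compare the computed eigenvalues with the diagonal entries.
print(np.allclose(np.sort(np.linalg.eigvals(T).real), np.sort(np.diag(T))))  # True
```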

3 Answers

1

One way to prove the result is to note that (a) since $T$ is triangular with $0 < t_{ii} < 1$, its spectral radius is less than $1$, and we can always find a matrix norm $\|\cdot\|$ such that $\|T\|$ is arbitrarily close to the spectral radius of $T$; so we may assume that $\|T\|<1$. (b) As $\|T\|<1$, the infinite series $\sum\limits_{k=0}^{\infty}{T^k}$ converges.

For both (a) and (b), see sec. 5.6 (Matrix Norm) of Horn and Johnson's Matrix Analysis.
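Once the series is known to converge, the identity itself follows from the telescoping product $$ (I-T)\sum_{k=0}^{n} T^k = I - T^{n+1}, $$ since $\|T^{n+1}\| \leqslant \|T\|^{n+1} \to 0$ when $\|T\| < 1$; letting $n \to \infty$ gives $(I-T)\sum_{k=0}^{\infty} T^k = I$.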

  • 0
    How did you rule out the identity matrix? (2011-12-01)
  • 1
    The OP has modified his question significantly after I answered the question. He/she initially asked how to show that $\sum T^k=(I-T)^{-1}$ given that the spectral radius of $T$ is smaller than $1$. (2011-12-01)
  • 0
    Thanks, I wasn't aware of my mistake! In fact $0 < t_{ii} < 1$. I'm downloading the book you suggested. Thanks. (2011-12-01)
  • 0
    I'm reading section 5.6 now. (2011-12-01)
1

Try Gantmacher, Matrix Theory or Lancaster-Tismenetsky, The theory of matrices.

I don't have any of the references with me right now, so I'm relying on my memory. But, if I remember correctly, the point, in a nutshell, is this: if $p(t) = (t-\lambda_1)^{\alpha_1} \dotsm (t-\lambda_r)^{\alpha_r}$ is the minimal polynomial of $A$, then in order to compute $f(A)$ for some function $f$, you just need the values $f(\lambda_i), f'(\lambda_i), \dots, f^{(\alpha_i -1)}(\lambda_i)$ to be defined for all $i = 1, \dots, r$ (these values are called the values of $f$ on the spectrum of $A$). Or, for that matter, for an identity concerning $A$ to hold, it suffices that it holds on the spectrum of $A$. So, in your case, it is necessary and sufficient that the maximal eigenvalue of $A$ (in modulus) be less than $1$.
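As a concrete (hypothetical) illustration of this principle with $f(t) = 1/(1-t)$: when $T$ is diagonalizable and every eigenvalue differs from $1$, evaluating $f$ on the eigenvalues alone recovers $f(T) = (I-T)^{-1}$.

```python
import numpy as np

# Hypothetical diagonalizable example: since f(t) = 1/(1 - t) is defined at
# every eigenvalue (all |lambda| < 1 here), f(T) can be evaluated on the
# spectrum alone, and it equals (I - T)^{-1}.
T = np.array([
    [0.5, 0.0],
    [0.3, 0.4],
])

eigvals, V = np.linalg.eig(T)
f_T = V @ np.diag(1.0 / (1.0 - eigvals)) @ np.linalg.inv(V)

print(np.allclose(f_T, np.linalg.inv(np.eye(2) - T)))  # True
```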

1

It suffices to check that the series converges.

We can assume that $S := T - \lambda I$ satisfies $S^{k+1}=0$ for some $\lambda$ with $|\lambda| < 1$ and some $k > 0$ (after reducing to a single Jordan block). Then we have $$ T^n=(\lambda I+S)^n=\sum_{j=0}^k\ \binom{n}{j}\ \lambda^{n-j}\ S^j, $$ and the convergence is clear, since $\binom{n}{j}\,|\lambda|^{n-j} \to 0$ as $n \to \infty$ for each fixed $j \leqslant k$.
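As a sanity check, for a hypothetical $2\times 2$ Jordan block (so $k=1$) the binomial expansion above gives a closed form for $T^n$ whose entries visibly tend to zero when $|\lambda|<1$:

```python
import numpy as np

# Hypothetical 2x2 Jordan block T = lambda*I + S with S^2 = 0 and |lambda| < 1.
lam = 0.9
S = np.array([[0.0, 1.0], [0.0, 0.0]])
T = lam * np.eye(2) + S

n = 500
# Closed form from the binomial expansion: T^n = lam^n I + n lam^(n-1) S.
Tn = lam**n * np.eye(2) + n * lam**(n - 1) * S

print(np.allclose(Tn, np.linalg.matrix_power(T, n)))  # True: the expansion matches
print(np.abs(Tn).max() < 1e-10)  # True: the terms tend to zero
```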