Let $A$ be an $n\times n$ matrix with nonnegative entries such that $A_{i1}+A_{i2}+\cdots+A_{in}=1$ for all $i\in\{1,2,\ldots,n\}$ (i.e., $A$ is row-stochastic). What conditions must $A$ satisfy so that the sequence
$$M_m=\frac1m\left(I+A+A^2+\cdots+A^{m-1}\right)$$
converges?
I believe this converges for almost all $A$. The limit should be a matrix $M$ satisfying $MA=M$ and $M_{i1}+M_{i2}+\cdots+M_{in}=1$ for all $i\in\{1,2,\ldots,n\}$. I am fairly sure that all of the rows of $M$ are equal for most choices of $A$, but I haven't found a rigorous proof. For my application I would also like a quantitative estimate: how large must $m$ be so that $\|M_m-M\|<\epsilon\ll 1$ (for some matrix norm $\|\cdot\|$; the taxicab/entrywise $\ell_1$ norm is preferred)? Thanks in advance!
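
For concreteness, here is a small numerical sketch (Python/NumPy; the particular $3\times3$ matrix, the values of $m$, and the helper name `cesaro_average` are just arbitrary choices for illustration) that computes $M_m$ and checks the properties conjectured above:

```python
import numpy as np

def cesaro_average(A, m):
    """Return M_m = (1/m) * (I + A + A^2 + ... + A^{m-1})."""
    n = A.shape[0]
    power = np.eye(n)          # current term A^k, starting at A^0 = I
    total = np.zeros((n, n))   # running sum I + A + ... + A^k
    for _ in range(m):
        total += power
        power = power @ A
    return total / m

# An arbitrary 3x3 row-stochastic matrix (nonnegative entries, rows summing to 1).
A = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

M_1000 = cesaro_average(A, 1000)
M_2000 = cesaro_average(A, 2000)

print(M_2000)
print("row sums:", M_2000.sum(axis=1))                                  # rows should still sum to 1
print("||M A - M||_1:", np.abs(M_2000 @ A - M_2000).sum())              # fixed-point property M A = M
print("||M_2000 - M_1000||_1:", np.abs(M_2000 - M_1000).sum())          # taxicab gap between successive averages
print("max row-to-row l1 gap:", np.abs(M_2000 - M_2000[0]).sum(axis=1).max())  # are the rows (nearly) equal?
```

The powers are accumulated iteratively (one matrix multiplication per term) rather than recomputing each $A^k$ from scratch, and the taxicab distance between two successive averages is only a rough proxy for $\|M_m-M\|$, not a proof of convergence.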