If $A$ is a Markov matrix, every column of $A^k$ approaches the steady-state vector $x_\textrm{ss}$ as $k \to \infty$. How can you prove this?
So far I have been trying to understand this via the eigendecomposition:
$A^k = S \Lambda^k S^{-1} \to [x_1, 0, \dots, 0]\,S^{-1} = c\,[x_1, x_1, \dots, x_1]$ as $k \to \infty$ (assuming $\lambda_1 = 1$ and $|\lambda_i| < 1$ for $i > 1$), where $x_1$ is the first column of $S$, i.e. an eigenvector of $A$ associated with $\lambda = 1$; normalized, this is the steady-state vector $x_\textrm{ss}$. Here $c$ is just a constant chosen so that the components of $c x_1$ sum to $1$.
I can't figure out the last equality.
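To make the claim concrete, here is a small numerical sanity check with NumPy (the $3\times 3$ Markov matrix below is just an example I made up, not from the question): it computes $x_\textrm{ss}$ as the eigenvector for $\lambda = 1$, normalized to sum to $1$, and checks that every column of $A^{50}$ matches it.

```python
import numpy as np

# Example column-stochastic (Markov) matrix: each column sums to 1.
A = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.1],
              [0.2, 0.2, 0.6]])

# Steady-state vector: eigenvector of A for eigenvalue 1,
# rescaled so its components sum to 1 (this is the constant c).
vals, vecs = np.linalg.eig(A)
i = np.argmin(np.abs(vals - 1))
x_ss = np.real(vecs[:, i])
x_ss = x_ss / x_ss.sum()

# Every column of A^k approaches x_ss.
Ak = np.linalg.matrix_power(A, 50)
for j in range(A.shape[1]):
    assert np.allclose(Ak[:, j], x_ss)
print(x_ss)
```

Of course this only demonstrates the convergence for one example; the question is about why it holds in general.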
Although the suggested question is very similar to mine, I think my question focuses more on why the other columns of $A^k$ converge as well, even though they need not have anything to do with the "probability vector" mentioned in the other question. So I don't think this is an exact duplicate.