That all eigenvalues $\lambda_i$ of a matrix $A$ are strictly less than $1$ in absolute value is both necessary and sufficient for the series $I+A+A^2+\cdots$ to converge.
- Necessary: Suppose $R=I+A+A^2+\cdots$ converges and $(\lambda_i,v_i)$ is an eigenpair of $A$. Then $Rv_i$ must be well defined, and moreover $S_nv_i\to Rv_i$, where $S_n=I+A+\cdots+A^{n-1}$ are the partial sums, because multiplication by the fixed vector $v_i$ is continuous. But $S_nv_i=(1+\lambda_i+\lambda_i^2+\cdots+\lambda_i^{n-1})v_i$, and the scalar sum $1+\lambda_i+\lambda_i^2+\cdots$ is the usual geometric series, which converges only if $|\lambda_i|<1$.
- Sufficient: Suppose $|\lambda_i|<1$ for each $i$. Then $\det(I-A)\ne0$, since $\lambda=1$ is not a root of the characteristic polynomial $\det(\lambda I-A)$, hence $(I-A)^{-1}$ is well defined, and the telescoping identity $(I-A)S_n=I-A^n$ gives $S_n=(I-A)^{-1}(I-A^n)$. All eigenvalues $\lambda_i^n$ of the powers $A^n$ tend to zero as $n\to\infty$, hence $A^n\to0$ and thus $S_n\to(I-A)^{-1}$. (A numerical sanity check of both directions is sketched just below.)
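As a quick numerical sanity check of both directions (not part of the proof; the matrices `A_conv` and `A_div` below are arbitrary illustrative choices), here is a small NumPy sketch of the partial sums $S_n$:

```python
import numpy as np

def neumann_partial_sum(A, n):
    """Return the partial sum S_n = I + A + A^2 + ... + A^(n-1)."""
    S = np.zeros_like(A, dtype=float)
    term = np.eye(A.shape[0])
    for _ in range(n):
        S += term
        term = term @ A
    return S

# Spectral radius < 1: the partial sums approach (I - A)^{-1}.
A_conv = np.array([[0.5, 0.2],
                   [0.1, 0.3]])
print(np.max(np.abs(np.linalg.eigvals(A_conv))))       # ~0.57, i.e. < 1
print(np.allclose(neumann_partial_sum(A_conv, 60),
                  np.linalg.inv(np.eye(2) - A_conv)))   # True

# An eigenvalue of modulus >= 1: applying S_n to its eigenvector gives the
# divergent geometric sum 1 + lambda + lambda^2 + ... in that direction.
A_div = np.array([[1.5, 0.0],
                  [0.0, 0.2]])
v = np.array([1.0, 0.0])                                # eigenvector for lambda = 1.5
print(neumann_partial_sum(A_div, 60) @ v)               # first entry blows up
```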
A couple of items require some further justification using the underlying topology (defined e.g. by the metric induced by the Frobenius norm): continuity of multiplication by a fixed vector (in the example this should be clear because $S\mapsto Sv_i$ is a linear map between finite-dimensional spaces, hence continuous), and the fact that $A^n\to0$ whenever all $|\lambda_i|<1$ (eigenvalues tending to zero is not by itself enough for an arbitrary sequence of matrices, but for powers of a fixed $A$ we can consider the action of $A^n$ on the subspaces in the Jordan, a.k.a. generalized eigenvector, decomposition; a sketch of that estimate follows).
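For the second point, here is one way to make that precise (a sketch filling in the Jordan-block argument suggested above): write $A=PJP^{-1}$ with $J$ block diagonal. On a single $k\times k$ Jordan block $J_i=\lambda_i I+N$, where $N$ is the nilpotent shift with $N^k=0$, the binomial theorem gives, for $n\ge k$,
$$J_i^n=(\lambda_i I+N)^n=\sum_{j=0}^{k-1}\binom{n}{j}\lambda_i^{\,n-j}N^j,$$
and each coefficient satisfies $\left|\binom{n}{j}\lambda_i^{\,n-j}\right|\le n^{j}\,|\lambda_i|^{\,n-j}\to0$ as $n\to\infty$ when $|\lambda_i|<1$, since a polynomial factor in $n$ cannot beat geometric decay. Hence every $J_i^n\to0$, so $J^n\to0$ and $A^n=PJ^nP^{-1}\to0$ as well.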
(There are probably some standard arguments for these things, but I may not have hit them all because I'm pulling them out of my hat instead of going on experience here.)