Any polynomial in a matrix $A$, i.e. any scalar linear combination of powers of $A$ (including the identity matrix and, when $A$ is invertible, negative integer powers of $A$), acts directly on the eigenvalues of $A$: if $\lambda$ is an eigenvalue of $A$, then $p(\lambda)$ is an eigenvalue of $p(A)$.
For example, if $2 \in \sigma(A)$, so that $\det(A - 2I) = 0$ (2 is an eigenvalue of $A$), then $M = A^4 + I$ has $2^4 + 1 = 17$ as an eigenvalue. Thus, if you find that $(I - A)^{-1}$ has a more convenient form for finding an eigenvalue, call it $x$, then the eigenvalue $\lambda$ of $A$ can be recovered by solving $x = (1 - \lambda)^{-1}$, i.e. $\lambda = 1 - 1/x$.
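Here is a quick numerical check of both claims, using a hypothetical upper-triangular matrix (chosen so its eigenvalues are the diagonal entries 2 and 3; the matrix itself is not from the original discussion):

```python
import numpy as np

# Hypothetical 2x2 example: upper triangular, so its
# eigenvalues are just the diagonal entries 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# M = A^4 + I should have eigenvalues 2^4 + 1 = 17 and 3^4 + 1 = 82.
M = np.linalg.matrix_power(A, 4) + np.eye(2)
eigs_M = np.sort(np.linalg.eigvals(M).real)   # approximately [17., 82.]

# Recovering lambda from an eigenvalue x of (I - A)^{-1}:
# x = 1/(1 - lambda)  =>  lambda = 1 - 1/x.
B = np.linalg.inv(np.eye(2) - A)
recovered = np.sort((1 - 1 / np.linalg.eigvals(B)).real)  # approximately [2., 3.]
```

The same mapping works for any polynomial or rational function defined on the spectrum, which is why transforming $A$ first and inverting the transformation on the eigenvalues afterwards is legitimate.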
To estimate the eigenvalues of a given matrix, use Gershgorin circles, which bound the eigenvalues by comparing each diagonal element to its row sum of absolute values: every eigenvalue lies in at least one disc centered at a diagonal element $a_{ii}$, no further from it than the sum of the absolute values of the other elements in row $i$.
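A minimal sketch of the Gershgorin bound, using a hypothetical $3 \times 3$ matrix (the matrix is an illustration, not from the original discussion):

```python
import numpy as np

# Hypothetical example matrix.
A = np.array([[ 4.0, 1.0, 0.5],
              [ 0.2, 3.0, 0.3],
              [-0.1, 0.4, 1.0]])

centers = np.diag(A)                              # disc centers a_ii
radii = np.abs(A).sum(axis=1) - np.abs(centers)   # off-diagonal row sums

# Gershgorin's theorem: every eigenvalue lies in at least one disc
# |lambda - a_ii| <= r_i (a small tolerance absorbs rounding error).
in_some_disc = [
    any(abs(lam - c) <= r + 1e-9 for c, r in zip(centers, radii))
    for lam in np.linalg.eigvals(A)
]
```

Note that a given eigenvalue need not lie in the disc of "its" row; the theorem only guarantees that the union of all the discs contains the whole spectrum. The same bound applies column-wise, since $A$ and $A^T$ have the same eigenvalues.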
I am thinking though that for your purposes you may want to use a matrix norm: the spectral radius $\rho(A)$ is bounded above by any submultiplicative matrix norm, $\rho(A) \le \|A\|$.
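As a sketch of that norm bound, reusing the hypothetical matrix from above, each standard norm gives an upper bound on the magnitude of every eigenvalue:

```python
import numpy as np

# Hypothetical example matrix (same as in the Gershgorin sketch).
A = np.array([[ 4.0, 1.0, 0.5],
              [ 0.2, 3.0, 0.3],
              [-0.1, 0.4, 1.0]])

rho = max(abs(np.linalg.eigvals(A)))    # spectral radius

norm_1   = np.linalg.norm(A, 1)         # max absolute column sum
norm_inf = np.linalg.norm(A, np.inf)    # max absolute row sum
norm_2   = np.linalg.norm(A, 2)         # largest singular value
norm_fro = np.linalg.norm(A, 'fro')     # Frobenius norm

# rho(A) <= ||A|| for every submultiplicative norm.
bounds_hold = all(rho <= n + 1e-9
                  for n in (norm_1, norm_inf, norm_2, norm_fro))
```

The $\infty$-norm bound is essentially Gershgorin applied crudely: the largest disc reaches at most $\|A\|_\infty$ from the origin, so $|\lambda| \le \|A\|_\infty$ for every eigenvalue.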