You can use the Cayley-Hamilton theorem to evaluate powers of a matrix (in fact, any analytic function of a matrix). The basic idea is that by taking the remainder after division by the characteristic polynomial, you can reduce any polynomial in an $n\times n$ matrix to one of degree at most $n-1$.
Let $\Delta(x)$ be the characteristic polynomial of the matrix $A$ and divide the polynomial $P(x)$ by $\Delta(x)$ to obtain a quotient and remainder: $P(x)=Q(x)\Delta(x)+R(x)$, where $\deg R<n$. By the Cayley-Hamilton theorem, $\Delta(A)=0$, so $P(A)=R(A)$. Moreover, since $\Delta(\lambda)=0$ whenever $\lambda$ is an eigenvalue of $A$, we also have $P(\lambda)=R(\lambda)$ for each eigenvalue.
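As a quick sanity check of $P(A)=R(A)$, here is a sketch on a hypothetical example (the matrix $M$ and polynomial below are my own illustration, not part of the original problem): $M=\begin{pmatrix}1&1\\0&2\end{pmatrix}$ has characteristic polynomial $x^2-3x+2$, and we verify that $M^5$ equals $R(M)$, where $R$ is the remainder of $x^5$ modulo the characteristic polynomial.

```python
# Hypothetical example: M = [[1, 1], [0, 2]], characteristic polynomial
# d(x) = x^2 - 3x + 2. Polynomials are lists of coefficients, lowest degree first.

def poly_rem(p, d):
    """Remainder of p modulo a monic polynomial d, by long division."""
    p = p[:]
    while len(p) >= len(d):
        c = p[-1]  # leading coefficient; d is monic, so the quotient term is c
        for i in range(len(d)):
            p[len(p) - len(d) + i] -= c * d[i]
        p.pop()  # the leading term is now zero
    return p

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

M = [[1, 1], [0, 2]]
d = [2, -3, 1]  # x^2 - 3x + 2

# P(x) = x^5; its remainder modulo d(x) is R(x) = a0 + a1*x
a0, a1 = poly_rem([0, 0, 0, 0, 0, 1], d)

# P(M) = M^5 by repeated multiplication
PM = [[1, 0], [0, 1]]
for _ in range(5):
    PM = mat_mul(PM, M)

# R(M) = a0*I + a1*M
RM = [[a0 * (i == j) + a1 * M[i][j] for j in range(2)] for i in range(2)]

assert PM == RM  # P(M) == R(M), as Cayley-Hamilton predicts
```

Note that $R(x)=31x-30$ here, and indeed $R(1)=1=1^5$ and $R(2)=32=2^5$ at the eigenvalues $1$ and $2$, matching the eigenvalue property above.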
The characteristic polynomial of $A$ is $\Delta(x)=x^2-7x+6$, so one way to compute $A^{105}$ is to find the remainder of $x^{105}$ after division by $\Delta(x)$ and plug in $A$. Another way is to use the eigenvalue property mentioned above. Since $\Delta(x)$ has degree $2$, the remainder has the form $R(x)=a_0+a_1x$, so for each eigenvalue $\lambda$ we have $\lambda^{105}=a_0+a_1\lambda$ for the same unknown coefficients $a_0$ and $a_1$. Plugging in the two eigenvalues $1$ and $6$ produces a system of linear equations that can be solved for the unknown coefficients: $$\begin{align}a_0+a_1&=1 \\ a_0+6a_1&=6^{105}\end{align}$$ Solving gives $a_0=\frac15(6-6^{105})$ and $a_1=\frac15(6^{105}-1)$, hence $A^{105}=\frac15(6-6^{105})I+\frac15(6^{105}-1)A$.
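The computation above can be checked numerically. The original matrix $A$ is not shown here, so the sketch below uses a hypothetical stand-in $A=\begin{pmatrix}5&2\\2&2\end{pmatrix}$, chosen only because its trace is $7$ and determinant is $6$, giving the same characteristic polynomial $x^2-7x+6$:

```python
# Sketch, assuming the hypothetical matrix A = [[5, 2], [2, 2]] with
# characteristic polynomial x^2 - 7x + 6 (eigenvalues 6 and 1).

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_pow(X, n):
    # exponentiation by squaring, exact integer arithmetic
    result = [[1, 0], [0, 1]]
    while n:
        if n & 1:
            result = mat_mul(result, X)
        X = mat_mul(X, X)
        n >>= 1
    return result

A = [[5, 2], [2, 2]]

# Solve a0 + a1 = 1 and a0 + 6*a1 = 6**105 for R(x) = a0 + a1*x
a1 = (6**105 - 1) // 5   # 6**105 - 1 is divisible by 5, so this is exact
a0 = 1 - a1              # equals (6 - 6**105) / 5

# A**105 = a0*I + a1*A by Cayley-Hamilton
via_cayley_hamilton = [[a0 * (i == j) + a1 * A[i][j] for j in range(2)]
                       for i in range(2)]

assert via_cayley_hamilton == mat_pow(A, 105)
```

The point of the method is visible here: the Cayley-Hamilton route needs only two big-integer scalars and one matrix scaling, rather than 105 matrix multiplications (or even the $\log_2 105$ of repeated squaring).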
This method works for square matrices of any size, although you need to modify the equations for the unknown coefficients when there are repeated eigenvalues: an eigenvalue of multiplicity $m$ supplies only one equation $P(\lambda)=R(\lambda)$, so you recover the missing ones by differentiating $P(x)=Q(x)\Delta(x)+R(x)$, which gives $P^{(k)}(\lambda)=R^{(k)}(\lambda)$ for $k=1,\dots,m-1$.
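To illustrate the repeated-eigenvalue case, here is a sketch on a hypothetical Jordan block $A=\begin{pmatrix}2&1\\0&2\end{pmatrix}$ (again my own example, not from the original problem), whose characteristic polynomial $(x-2)^2$ has the double root $2$:

```python
# Hypothetical example: A = [[2, 1], [0, 2]] with characteristic
# polynomial (x - 2)^2. The double eigenvalue 2 gives only one equation
# P(2) = R(2), so we also use the derivative condition P'(2) = R'(2).

n = 10  # compute A**n

# P(x) = x**n, R(x) = a0 + a1*x:
#   P(2)  = R(2)   ->  2**n         = a0 + 2*a1
#   P'(2) = R'(2)  ->  n * 2**(n-1) = a1
a1 = n * 2 ** (n - 1)
a0 = 2 ** n - 2 * a1

A = [[2, 1], [0, 2]]
via_ch = [[a0 * (i == j) + a1 * A[i][j] for j in range(2)] for i in range(2)]

# direct check by repeated multiplication
def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

direct = [[1, 0], [0, 1]]
for _ in range(n):
    direct = mat_mul(direct, A)

assert via_ch == direct
```

Both routes produce $A^{10}=\begin{pmatrix}2^{10}&10\cdot 2^9\\0&2^{10}\end{pmatrix}$, the familiar closed form for powers of a Jordan block.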