Actually, it's not harder to compute the $n$th power of any $2$ by $2$ matrix:
Let $a,b,c,d$ be complex numbers and consider the matrix $A=\begin{pmatrix}a&b\\ c&d\end{pmatrix}.$ Let $n\ge2$ be an integer.
How do we compute $A^n$?
Here is a recipe:
Assume first that the roots $u$ and $v$ of the polynomial $f:=X^2-(a+d)\,X+ad-bc$ are distinct. The equation of the secant line to the curve $y=x^n$ through the points $(u,u^n)$ and $(v,v^n)$ is $y=\frac{u^n-v^n}{u-v}\ \ x-uv\ \ \frac{u^{n-1}-v^{n-1}}{u-v}\quad,$ and we have $A^n=\frac{u^n-v^n}{u-v}\ \ A-uv\ \ \frac{u^{n-1}-v^{n-1}}{u-v}\ \ I,$ where $I$ is the identity matrix.
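For instance, to see the recipe in action, take $A=\begin{pmatrix}1&1\\ 1&0\end{pmatrix}$. Here $f=X^2-X-1$, with roots $u=\frac{1+\sqrt5}{2}$ and $v=\frac{1-\sqrt5}{2}$, so that $uv=-1$ and $\frac{u^n-v^n}{u-v}=F_n$, the $n$th Fibonacci number (Binet's formula). The formula above then gives $A^n=F_n\,A+F_{n-1}\,I=\begin{pmatrix}F_{n+1}&F_n\\ F_n&F_{n-1}\end{pmatrix}.$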
For every nonnegative integer $k$ put $s_k:=u^k+u^{k-1}v+u^{k-2}v^2+\cdots+v^k.$ [In particular $s_0=1$.] As $s_k=\frac{u^{k+1}-v^{k+1}}{u-v}\quad,$ we have $A^n=s_{n-1}\,A-u\,v\,s_{n-2}\,I.$ This formula still makes sense, and is still true, if $u=v$, in which case it reads $A^n=n\,u^{n-1}\,A-(n-1)\,u^n\,I.$
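If you want to check this numerically, here is a quick Python sketch (my own illustration, not part of the recipe itself; the function name, the use of NumPy, and the `np.isclose` tolerance for deciding $u=v$ are arbitrary choices):

```python
import numpy as np

def power_2x2(A, n):
    """n-th power of a 2x2 matrix via A^n = s_{n-1} A - u v s_{n-2} I (illustrative sketch)."""
    A = np.asarray(A, dtype=complex)
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = np.sqrt(tr * tr - 4 * det)                  # complex square root of the discriminant
    u, v = (tr + disc) / 2, (tr - disc) / 2            # roots of f = X^2 - (a+d)X + (ad-bc)
    I = np.eye(2)
    if np.isclose(u, v):                               # coincident roots: tangent-line formula
        return n * u**(n - 1) * A - (n - 1) * u**n * I
    s = lambda k: (u**(k + 1) - v**(k + 1)) / (u - v)  # s_k = (u^{k+1} - v^{k+1}) / (u - v)
    return s(n - 1) * A - u * v * s(n - 2) * I

A = np.array([[1, 1], [1, 0]])                         # the Fibonacci matrix from the example above
print(np.allclose(power_2x2(A, 10), np.linalg.matrix_power(A, 10)))  # True
```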
Why does this recipe work?
The key point is the easily checked equality $A^2-(a+d)\,A+(ad-bc)\,I=0,\quad(1)$ which enables us to compute $g(A):=a_n\,A^n+\cdots+a_2\,A^2+a_1\,A+a_0\,I$ for every polynomial $g=a_n\,X^n+\cdots+a_2\,X^2+a_1\,X+a_0\in\mathbb C[X]$ as follows.
Assume again $u\not=v$, and let $h\in\mathbb C[X]$ be the unique polynomial of degree $\le1$ which agrees with $g$ at $u$ and $v$: $h=g(u)\ \frac{X-v}{u-v}+g(v)\ \frac{X-u}{v-u}\quad.$ [In particular, the secant line to $y=g(x)$ through $(u,g(u))$ and $(v,g(v))$ is $y=h(x)$.] Then our polynomial $f$, which can be written as $f=(X-u)(X-v),$ divides $g-h$, because $g-h$ vanishes at the two distinct points $u$ and $v$. That is, we have $g(X)-h(X)=f(X)\,q(X)$ for some polynomial $q$. On substituting $A$ for $X$, and remembering that $f(A)=0$ by (1), we get $g(A)=h(A)$, that is $g(A)=g(u)\ \frac{A-vI}{u-v}+g(v)\ \frac{A-uI}{v-u}\quad.$ If $u=v$, we use the tangent line instead of the secant line: $h:=g(u)+g'(u)\,(X-u),$ and we get $g(A)=g(u)\,I+g'(u)\,(A-u\,I).$
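Concretely, and only as an illustrative sketch of mine (it assumes distinct eigenvalues and uses NumPy's `eigvals` and `Polynomial`, which are not part of the argument), this says that $g(A)$ is determined by the two values $g(u)$ and $g(v)$:

```python
import numpy as np
from numpy.polynomial import Polynomial

def g_of_A(g, A):
    """g(A) = g(u)(A - vI)/(u - v) + g(v)(A - uI)/(v - u), assuming u != v (sketch)."""
    A = np.asarray(A, dtype=complex)
    u, v = np.linalg.eigvals(A)        # the roots of f are the eigenvalues of A
    I = np.eye(2)
    return g(u) * (A - v * I) / (u - v) + g(v) * (A - u * I) / (v - u)

g = Polynomial([2, 0, 1, 3])           # g = 2 + X^2 + 3 X^3
A = np.array([[1, 2], [3, 4]])
direct = 2 * np.eye(2) + np.linalg.matrix_power(A, 2) + 3 * np.linalg.matrix_power(A, 3)
print(np.allclose(g_of_A(g, A), direct))  # True
```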
This settles the case of $2$ by $2$ matrices. For an arbitrary square matrix, see for instance the last part of this answer.
EDIT. The $n$th power of an $r$ by $r$ diagonalizable matrix $A$ is given by the beautiful Lagrange Interpolation Formula: $A^n=\sum_{i=1}^k\ u_i^n\ \prod_{j\not=i}\ \frac{A-u_j\,I}{u_i-u_j}\quad,$ where $u_1,\dots,u_k$ are the distinct eigenvalues of $A$.
Here is the justification (which is essentially the same as above). The polynomial $f:=(X-u_1)\cdots(X-u_k)$ satisfies $f(A)=0$, because $A$ is diagonalizable and every eigenvalue of $A$ is a root of $f$. Note that $g:=\sum_{i=1}^k\ u_i^n\ \prod_{j\not=i}\ \frac{X-u_j}{u_i-u_j}$ is the unique polynomial of degree less than $k$ which agrees with $X^n$ at all the $u_i$. Hence $X^n-g$ is divisible by $f$, and this implies, as above, $A^n=g(A)$.
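Here is a corresponding numerical sketch (again my own illustration; the helper name, the tolerance used to detect repeated eigenvalues, and the test matrix are arbitrary choices, and the matrix is assumed diagonalizable as above):

```python
import numpy as np

def power_via_lagrange(A, n, tol=1e-9):
    """A^n via Lagrange interpolation on the distinct eigenvalues (diagonalizable A; sketch)."""
    A = np.asarray(A, dtype=complex)
    r = A.shape[0]
    us = []                                            # one representative per distinct eigenvalue
    for e in np.linalg.eigvals(A):
        if all(abs(e - u) > tol for u in us):
            us.append(e)
    I = np.eye(r)
    result = np.zeros((r, r), dtype=complex)
    for i, ui in enumerate(us):
        term = ui**n * I
        for j, uj in enumerate(us):
            if j != i:
                term = term @ (A - uj * I) / (ui - uj)  # factors commute, so the order is irrelevant
        result += term
    return result

A = np.array([[2, 1, 0], [0, 3, 1], [0, 0, 5]])        # distinct eigenvalues 2, 3, 5
print(np.allclose(power_via_lagrange(A, 4), np.linalg.matrix_power(A, 4)))  # True
```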