Qiaochu's answer gives a general method, and that method ought to be in your bag of tools.
There are some 'gotchas' with this problem (especially if taking an exam) though, and you need to be careful...
Consider the function
$g(x) = \begin{cases} \cos^{-1}(\cos^2 x) & x \ge 0 \\ -\cos^{-1}(\cos^2 x) & x \lt 0 \end{cases} $
This function has the derivative $\displaystyle g'(x) = \frac{2\cos x}{\sqrt{1 + \cos^2 x}} \quad \forall x \in (-\epsilon, \epsilon)$
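For completeness, here is the chain-rule computation behind that derivative, for $0 \lt x \lt \epsilon$ (the sign flip in the definition of $g$ makes the same formula hold on $(-\epsilon, 0)$):

$$\frac{d}{dx}\cos^{-1}(\cos^2 x) = \frac{2\cos x \sin x}{\sqrt{1-\cos^4 x}} = \frac{2\cos x \sin x}{\sin x\,\sqrt{1+\cos^2 x}} = \frac{2\cos x}{\sqrt{1+\cos^2 x}},$$

using $1 - \cos^4 x = (1-\cos^2 x)(1+\cos^2 x) = \sin^2 x\,(1+\cos^2 x)$ and $\sin x \gt 0$ on that interval. In particular $g'(0) = 2/\sqrt{2} = \sqrt{2}$.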
Notice that $g(x)$ is an odd function.
Hence the power series will only have terms of the form $\displaystyle x^{2n+1}$ and so
$g(x) = \sqrt{2}x + \mathcal{O}(x^3)$
We cannot apply the same reasoning to $\cos^{-1}(\cos^2 x)$ itself, which is an even function, since its derivative at $0$ does not exist.
So your question, as stated, is incomplete around $x = 0$. Technically speaking, the derivative at $0$ does not exist, and so the Taylor series does not exist either.
i.e., on the interval $(0,\epsilon)$ we have
$\cos^{-1}(\cos^2 x) = \sqrt{2} x + \mathcal{O}(x^3)$
while on the interval $(-\epsilon, 0)$ we have
$\cos^{-1}(\cos^2 x) = -\sqrt{2} x + \mathcal{O}(x^3)$
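If you want to see this one-sided behaviour numerically, here is a quick sanity check (a sketch using only the standard library, not part of the argument above) comparing $\cos^{-1}(\cos^2 x)$ with $\sqrt{2}\,|x|$ for small $x$ of both signs:

```python
import math

# arccos(cos^2 x) should agree with sqrt(2)*|x| up to O(x^3) near 0,
# with slope +sqrt(2) on the right of 0 and -sqrt(2) on the left.
for x in [0.1, 0.01, -0.01, -0.1]:
    exact = math.acos(math.cos(x) ** 2)
    approx = math.sqrt(2) * abs(x)
    print(f"x={x:+.2f}  acos(cos^2 x)={exact:.6f}  sqrt(2)|x|={approx:.6f}")
```

The printed values agree in the first several decimal places, and the function clearly depends only on $|x|$, which is exactly the even-function kink at $0$.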