Normally, I would see that called a first-order approximation, not a second-order one. It means that as $x \to a$, the error in this approximation goes to $0$ faster than $x - a$ does. More precisely:
$ \lim_{x \to a} \frac{f(x) - \left( f(a) + f'(a) (x-a) \right)}{x - a} = 0 $
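For a concrete illustration (the function and point here are just my own choice of example), take $f(x) = x^2$ and $a = 1$. The approximation is
$ f(x) \approx f(1) + f'(1)(x-1) = 1 + 2(x-1), $
and the error is
$ x^2 - \left( 1 + 2(x-1) \right) = (x-1)^2, $
so the ratio in the limit above is $(x-1)^2/(x-1) = x-1$, which indeed goes to $0$ as $x \to 1$.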
Understanding the error is an important thing to learn in calculus! When you learned Taylor series, you should have also learned formulas for the remainder of a truncated Taylor series.
If $f$ is twice differentiable on an interval containing $a$ and $x$, one way to describe the error is reminiscent of the mean value theorem:
$ f(x) = f(a) + f'(a) (x-a) + R(x, a) $
where
$ R(x,a) = \frac{1}{2} f''(\xi) (x-a)^2 $
and $\xi$ is some value between $a$ and $x$. (The precise value of $\xi$, of course, depends on both $x$ and $a$.) One usually cannot pin down $\xi$ exactly, but one can often find an upper bound on $f''(\xi)$: for example, if $f''$ is an increasing function and $x > a$, you know $f''(a) \le f''(\xi) \le f''(x)$.
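To see how this plays out, here is one possible worked example (again, the function and points are chosen just for illustration): estimate $\sqrt{4.1}$ with $f(x) = \sqrt{x}$ and $a = 4$. Then $f'(x) = \frac{1}{2\sqrt{x}}$ and $f''(x) = -\frac{1}{4} x^{-3/2}$, so the linear approximation is
$ \sqrt{4.1} \approx f(4) + f'(4)\,(0.1) = 2 + \frac{0.1}{4} = 2.025. $
For $\xi$ between $4$ and $4.1$, $|f''|$ is decreasing, so $|f''(\xi)| \le |f''(4)| = \frac{1}{32}$, which gives
$ |R(4.1, 4)| \le \frac{1}{2} \cdot \frac{1}{32} \cdot (0.1)^2 \approx 1.6 \times 10^{-4}. $
And indeed $\sqrt{4.1} \approx 2.02485$, so the actual error is about $1.5 \times 10^{-4}$, within the bound.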
Being able to work with the remainder term is an important skill whenever you need to know how good your Taylor-series approximations actually are!