Let $f : \mathbb{R} \to \mathbb{R}$ be defined by $f (x) = x^2$. Then, we have that
$f (x + \Delta x ) = (x + \Delta x)^2 = x^2 + 2 x \Delta x + (\Delta x)^2$
Think of $f$ as a black box that takes an input $x$ and spits out $f (x)$. What happens if we perturb the input? If the input is $x + \Delta x$, then the output will be $f (x + \Delta x)$. The perturbation in the output is thus
$f (x + \Delta x ) - f (x) = 2 x \Delta x + (\Delta x)^2$
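For instance, with the illustrative values $x = 3$ and $\Delta x = 0.1$,
$f (3.1) - f (3) = 9.61 - 9 = 0.61 = 2 \cdot 3 \cdot 0.1 + (0.1)^2$
so both terms of the expansion appear explicitly in the output perturbation.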
Note that the magnitude of the perturbation in the output depends on the input value $x$. If $\Delta x$ is "small enough", then the perturbation in the output is well described by its first-order approximation
$f (x + \Delta x ) - f (x) \approx 2 x \Delta x$
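Continuing with the same illustrative values ($x = 3$, $\Delta x = 0.1$), the first-order approximation gives
$2 x \Delta x = 2 \cdot 3 \cdot 0.1 = 0.6$
which differs from the exact perturbation $0.61$ by exactly the neglected term $(\Delta x)^2 = 0.01$.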
However, if $\Delta x$ is not "small enough", the $(\Delta x)^2$ term must be retained.
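To make "small enough" quantitative, compare the neglected term to the retained one (assuming $x \neq 0$):
$\dfrac{(\Delta x)^2}{|2 x \Delta x|} = \dfrac{|\Delta x|}{2 |x|}$
so the first-order approximation is accurate precisely when $|\Delta x| \ll 2 |x|$. For $x = 0$ the first-order term vanishes entirely, and $(\Delta x)^2$ is the whole perturbation.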