Recall what you know about polynomials. If $f(0) = 0$, then you can factor a copy of $x$ out of the polynomial to get $f(x) = x g(x)$; we also have a notion of the "multiplicity" of a root, which corresponds to how many times $x$ divides $f(x)$.
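For a concrete instance, take $f(x) = x^3 - 2x^2$, which vanishes at zero:
$ f(x) = x^3 - 2x^2 = x^2 (x - 2), $
so $x$ divides $f(x)$ twice and zero is a root of multiplicity $2$.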
If $f(x,y,z)$ is a polynomial in three variables with $f(0,0,0) = 0$, then when you expand it, every term has to contain at least one copy of $x$, $y$, or $z$. Collecting first the terms containing $x$, then the remaining terms containing $y$, and finally the terms in $z$ alone, you can group them as $f(x,y,z) = x g(x,y,z) + y h(y,z) + z k(z)$.
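As a small example of this grouping, take $f(x,y,z) = xy + y^2 + z^3$:
$ f(x,y,z) = x \cdot y + y \cdot y + z \cdot z^2, $
so here $g(x,y,z) = y$, $h(y,z) = y$, and $k(z) = z^2$.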
This doesn't carry over to arbitrary continuous functions; consider $\sqrt{|x|}$, or even just $|x|$: writing $|x| = x g(x)$ would force $g(x) = \operatorname{sgn}(x)$ for $x \neq 0$, which has no continuous extension to zero.
However, differentiable functions do behave well to "first order": many of the facts about roots of polynomials (as long as you only consider roots of multiplicity 1) still apply to differentiable functions. For twice-differentiable functions, you can consider the sorts of things that happen to "second order", and so forth.
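To see why the multiplicity-1 caveat matters, consider $f(x) = x\,|x|^{2/3}$, which is differentiable with $f(0) = f'(0) = 0$ but not twice differentiable at zero. You can factor out one copy of $x$, since $f(x) = x \cdot |x|^{2/3}$, but not two: any $h$ with $f(x) = x^2 h(x)$ would have to equal $\operatorname{sgn}(x)\,|x|^{-1/3}$ for $x \neq 0$, which is unbounded near zero.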
One of the most useful general tools here is the Taylor expansion. To first order, it says that you can write any differentiable function as a linear polynomial plus a remainder term that goes to zero more rapidly than linearly.
(In fact, Taylor series are so useful that they are often used to prove things about polynomials; indeed, the usual way of writing a polynomial can itself be viewed as a Taylor series.)
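Concretely, if $p$ is a polynomial of degree $n$, its Taylor series at zero terminates and reproduces $p$ exactly:
$ p(x) = \sum_{k=0}^{n} \frac{p^{(k)}(0)}{k!} x^k, $
so the usual coefficients of $p$ are just its scaled derivatives at zero.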
I'll do the one-dimensional case for you. If $f(x)$ is a univariate function differentiable at zero, and $f(0) = 0$, then its first-order Taylor series (also known as "differential approximation") at zero shows we can write it as
$ f(x) = f(0) + x f'(0) + x r(x) $
where $r(x)$ is continuous at zero, and $r(0) = 0$. In particular, since $f(0) = 0$, we can set $g(x) = f'(0) + r(x)$ and obtain the identity
$ f(x) = x g(x). $
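To spell out where $r$ comes from (this is just unpacking the definitions above): for $x \neq 0$ set
$ r(x) = \frac{f(x) - f(0) - x f'(0)}{x}, \qquad r(0) = 0; $
differentiability of $f$ at zero says exactly that $r(x) \to 0 = r(0)$ as $x \to 0$, i.e. that $r$ is continuous at zero. As a concrete check, $f(x) = \sin x$ gives $g(x) = \frac{\sin x}{x}$ for $x \neq 0$ and $g(0) = f'(0) = 1$, which is indeed continuous at zero.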