Adding to Dennis Gulko's answer.
For any fixed $m \in \mathbb{N}$, it holds that
$$
\frac{1}{{(1 + mx)^m }} = \sum\limits_{n = 0}^\infty {\frac{{m \cdots (m + n - 1)}}{{n!}}( - m)^n x^n },
$$
provided that $|x| < 1/m$.
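This identity is easy to confirm numerically; here is a minimal sketch (the values of $m$ and $x$ and the truncation level are arbitrary choices, with $|x| < 1/m$ so the series converges):

```python
# Numerical sanity check of the series identity (a sketch; m, x, and the
# truncation level are arbitrary choices, with |x| < 1/m for convergence).
def lhs(m, x):
    return 1.0 / (1.0 + m * x) ** m

def rhs(m, x, terms=200):
    # Partial sum of sum_{n>=0} [m(m+1)...(m+n-1)/n!] (-m)^n x^n.
    total, term = 0.0, 1.0  # the n = 0 term is 1 (empty product, 0! = 1)
    for n in range(terms):
        total += term
        # Ratio of consecutive terms: (m+n)/(n+1) * (-m) * x.
        term *= (m + n) / (n + 1) * (-m) * x
    return total

print(lhs(3, 0.05), rhs(3, 0.05))  # the two values should agree
```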
This fact, which is easy to confirm numerically, can be proved in a probabilistic setting as follows. Let $X$ be a gamma$(m,1)$ random variable, so that $X$ has density function $f(x)=e^{ - x} x^{m - 1} /\Gamma (m)$, $x > 0$. The moment-generating function (MGF) of $X$ is given by
$$
{\rm E}[e^{tX} ] = \frac{1}{{(1 - t)^m }}
$$
for $t < 1$ (indeed, note that $1/(1-t)$ is the MGF of the exponential$(1)$ distribution, and $X$ can be written as a sum of $m$ independent exponential$(1)$ variables). Further, the $n$th moment ($n=0,1,2,\ldots$) of $X$ is given by
$$
\mu_n' = \int_0^\infty {x^n f(x)\,{\rm d}x} = \frac{1}{{\Gamma (m)}}\int_0^\infty {x^{n + m - 1} e^{ - x} \,{\rm d}x} = \frac{{\Gamma (n + m)}}{{\Gamma (m)}} = m \cdots (m + n - 1)
$$
(note that $\mu_0' = 1$). Hence, for all $|t| < 1$, it holds that
$$
{\rm E}[e^{tX} ] = \sum\limits_{n = 0}^\infty {\frac{{\mu _n' }}{{n!}}t^n } = \sum\limits_{n = 0}^\infty {\frac{{m \cdots (m + n - 1)}}{{n!}}t^n } .
$$
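The moment formula and this series expansion can both be checked with a short script (a sketch; the values of $m$ and $t$ and the truncation level are arbitrary choices, with $|t| < 1$):

```python
# Sketch checking mu_n' = m(m+1)...(m+n-1) = Gamma(n+m)/Gamma(m) and that the
# moment series sums to 1/(1-t)^m; m, t, and the truncation are arbitrary.
from math import gamma, factorial

def rising(m, n):
    # The rising factorial m(m+1)...(m+n-1); equals 1 when n = 0.
    out = 1
    for k in range(n):
        out *= m + k
    return out

m = 4
# Moments of gamma(m,1): mu_n' = Gamma(n+m)/Gamma(m).
for n in range(6):
    assert abs(rising(m, n) - gamma(n + m) / gamma(m)) < 1e-6

# Partial sums of sum_n mu_n'/n! t^n approach 1/(1-t)^m for |t| < 1.
t = 0.3
s = sum(rising(m, n) / factorial(n) * t ** n for n in range(200))
print(s, 1.0 / (1.0 - t) ** m)  # the two values should agree
```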
Finally, putting $t = -mx$ (with $|x| < 1/m$, so that $|t| < 1$), we get
$$
\frac{1}{{(1 + mx)^m }} = \sum\limits_{n = 0}^\infty {\frac{{m \cdots (m + n - 1)}}{{n!}}( - m)^n x^n } .
$$
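As a further sanity check, the MGF identity ${\rm E}[e^{tX}] = 1/(1-t)^m$ itself can be verified by direct numerical integration of $e^{tx} f(x)$ (a rough trapezoidal-rule sketch; the integration grid and the values of $m \ge 2$ and $t$ are arbitrary choices):

```python
# Crude trapezoidal check that E[e^{tX}] = 1/(1-t)^m for X ~ gamma(m,1).
from math import exp, gamma

def mgf_numeric(m, t, upper=80.0, steps=200_000):
    # Integrate e^{tx} * x^{m-1} e^{-x} / Gamma(m) over (0, upper].
    # The endpoint x = 0 is omitted: the integrand vanishes there for m >= 2.
    h = upper / steps
    total = 0.0
    for i in range(1, steps + 1):
        x = i * h
        fx = exp(t * x) * x ** (m - 1) * exp(-x) / gamma(m)
        weight = 0.5 if i == steps else 1.0  # half weight at the right endpoint
        total += weight * fx * h
    return total

print(mgf_numeric(3, 0.2), 1.0 / (1.0 - 0.2) ** 3)  # the two values should agree
```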