Consider a function $f(r,R)$ of two variables, and suppose that $R = R_0 + x$ for some constant $R_0$.
I'm looking for a series expansion of $1 / |r - R_0 - x|$ for small $x$. In other words, I want the expansion of $1/|r - R|$ in $R$ around the point $R = R_0$.
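To make this concrete, I'm after coefficients $c_n(r)$ such that
$$\frac{1}{|r - R_0 - x|} = \sum_{n=0}^{\infty} c_n(r)\, x^n$$
for sufficiently small $x$ (the $c_n$ are just my notation for whatever the expansion produces).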
Here, $r$ is yet another variable and not just a parameter. So, if it weren't for the absolute value, I could just compute the derivatives and set up the Taylor series.
However, I am a bit unsure about how to handle the absolute value. Since $r$ is a variable, I cannot simply assert that, e.g., "$r - R_0$ is always larger (or smaller) than $x$". While the resulting function of $r$ will typically be evaluated at values of $r$ close to $R_0$, the expression $r - R_0 - x$ can still take either sign.
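Concretely, whenever $|x| < |r - R_0|$, the quantity $r - R_0 - x$ has the same sign as $r - R_0$, so the absolute value resolves to
$$|r - R_0 - x| = \mathrm{sgn}(r - R_0)\,(r - R_0 - x),$$
but for $r$ between $R_0$ and $R_0 + x$ this fails, so no single choice of sign works for all $r$.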
Now, I guess I can use the fact that $\frac{d}{dx} | x | = \mathrm{sgn}(x)$, but then higher-order terms in the series will involve derivatives of the signum, and I don't really want to deal with them...
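To illustrate, the first derivative is still harmless: by the chain rule,
$$\frac{\partial}{\partial x} \frac{1}{|r - R_0 - x|} = \frac{\mathrm{sgn}(r - R_0 - x)}{(r - R_0 - x)^2},$$
but differentiating once more formally produces $\mathrm{sgn}'(r - R_0 - x) = 2\,\delta(r - R_0 - x)$ terms, which is exactly what I'd like to avoid.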
Is there a nicer way to handle these things? If the variables were vectors in 3D, I could use the multipole expansion, but that doesn't really apply here, does it?
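(For reference, by the multipole expansion I mean
$$\frac{1}{|\mathbf{r} - \mathbf{R}_0 - \mathbf{x}|} = \sum_{l=0}^{\infty} \frac{|\mathbf{x}|^{l}}{|\mathbf{r} - \mathbf{R}_0|^{l+1}}\, P_l(\cos\gamma), \qquad |\mathbf{x}| < |\mathbf{r} - \mathbf{R}_0|,$$
where $\gamma$ is the angle between $\mathbf{x}$ and $\mathbf{r} - \mathbf{R}_0$ and $P_l$ are the Legendre polynomials.)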