
Consider a function $f(r,R)$ of two variables, and consider that $R = R_0 + x$ for some constant $R_0$.

I'm looking for a series expansion of $1 / |r - R_0 - x|$ for small $x$. In other words, the expansion of $1/|r - R|$ around a specific $R_0$.

Here, $r$ is yet another variable, not just a parameter. If it weren't for the absolute value, I could simply compute the derivatives and set up the Taylor series.

However, I am unsure how to handle the absolute value. Since $r$ is a variable, I cannot simply assert that, e.g., "$r - R_0$ is always larger (or smaller) than $x$". While the resulting function of $r$ will typically be evaluated at values of $r$ close to $R_0$, the expression $r - R_0 - x$ can still be either positive or negative.

Now, I guess I can use the fact that $\frac{d}{dx} |x| = \operatorname{sgn}(x)$, but then higher terms in the series will involve derivatives of the signum function, and I don't really want to deal with those...
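(As a quick numerical sanity check, not part of the original question: the identity $\frac{d}{dx}|x| = \operatorname{sgn}(x)$ holds everywhere away from the kink at $x = 0$, which a central finite difference confirms. The helper name `central_diff` is mine.)

```python
import math

def central_diff(f, x, h=1e-6):
    """Symmetric finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Away from x = 0, the derivative of |x| matches sgn(x).
for x in (-2.0, -0.3, 0.3, 2.0):
    assert abs(central_diff(abs, x) - math.copysign(1.0, x)) < 1e-8
```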

Is there a nicer way to handle these things? If the variables were vectors in 3D, I could use the multipole expansion, but that doesn't really apply here, does it?

  • Hope I clarified it. (2012-06-03)

2 Answers

4

I don't think the question has received a satisfactory answer yet (mostly because of how it was worded).

The first important point is that a Taylor series cannot get around a singularity: it only represents the original function in some open set where the function is smooth. So the distributional derivatives of $\mathrm{sgn}$ are irrelevant: we can only work in the region where the denominator does not vanish.

Second, the talk about two variables is confusing. Let's introduce $t = r - R$. Now we want to expand $1/|t|$ in a series around $t_0 = r - R_0$. Here $t_0$ can be positive or negative, but it cannot be zero. Since $t$ and $t_0$ have the same sign throughout the region of convergence, $|t| = |t_0|\left(1 + t_0^{-1}(t - t_0)\right)$, and the Taylor series $$\frac{1}{|t|}=\frac{1}{|t_0|}\cdot\frac{1}{1+t_0^{-1}(t-t_0)}=\frac{1}{|t_0|}\sum_{n=0}^\infty (-1)^n\,t_0^{-n}\,(t-t_0)^n$$ converges in the neighborhood $\{t : |t-t_0| < |t_0|\}$.
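The convergence claim is easy to check numerically. A minimal sketch (the function name `inv_abs_series` is mine, not from the answer): inside the disk $|t - t_0| < |t_0|$, the partial sums of the series reproduce $1/|t|$.

```python
def inv_abs_series(t, t0, n_terms):
    """Partial sum (1/|t0|) * sum_{n=0}^{n_terms-1} (-1)^n t0^(-n) (t - t0)^n."""
    return (1.0 / abs(t0)) * sum(
        (-1) ** n * (t - t0) ** n / t0 ** n for n in range(n_terms)
    )

t0 = -2.0   # expansion point; its sign is irrelevant, only t0 != 0 matters
t = -2.5    # |t - t0| = 0.5 < |t0| = 2, so we are inside the region of convergence
print(inv_abs_series(t, t0, 40), 1.0 / abs(t))  # both approximately 0.4
```

Outside that region (e.g. $t = -5$ with the same $t_0$), the partial sums grow without bound, in line with the stated radius of convergence.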

0

I guess one source of confusion is that you are trying to expand a multivariable function near a point without actually having fixed a point. You need to consider a pair $(R_0, r_0)$ and then proceed: $f(R_0+x,r_0+y)=f(R_0, r_0)+\dots$ The function must possess derivatives of all orders in order to be expanded in a Taylor series, so you need to decide on the sign first. As for the higher-order terms in the "unpleasant" case: $\operatorname{sgn}{x}=2e(x)-1$, where $e(x)$ is the step function $e\left(x\right)=\begin{cases} 1 & x\ge0\\ 0 & x<0 \end{cases}$; therefore $\frac{d}{dx}\left(\operatorname{sgn}{x}\right)=2\delta(x)$, where $\delta(x)$ is the Dirac delta function, so such an expansion is of questionable value for applications.
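As a quick sanity check of the step-function bookkeeping (a sketch with my own helper names, using the convention $e(0)=1$): the relation $\operatorname{sgn}x = 2e(x) - 1$ reproduces the sign of $x$ for every nonzero input.

```python
import math

def step(x):
    """e(x): Heaviside step function with the convention e(0) = 1."""
    return 1.0 if x >= 0 else 0.0

def sign_via_step(x):
    """sgn x written through the step function: sgn x = 2 e(x) - 1."""
    return 2.0 * step(x) - 1.0

for x in (-3.0, -0.5, 0.7, 4.0):
    assert sign_via_step(x) == math.copysign(1.0, x)
```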