Let $\chi(t)$ be the Heaviside function, i.e. $\chi(t) = 1$ for $t > 0$ and $\chi(t) = 0$ for $t \leq 0$. While reading a paper I came across the statement that $$ \frac{t^{p-1}}{\Gamma(p)}\chi(t) \to \delta^{(k)} $$ as $p \to -k$, where $k = 1,2,\ldots$ What does this mean? I can't integrate the function on the left over intervals containing $0$ when $\Re p < 0$.
How to understand this limit?
-
Do you have a link to the paper? – 2012-11-01
-
@DavideGiraudo http://people.tuke.sk/igor.podlubny/USU/05_fd_continued.pdf 6th (last) slide on the first page – 2012-11-01
-
Probably this limit is in the sense of distributions. – 2012-11-01
-
@Tomás But the distribution on the left is defined only on functions that vanish near $t = 0$ – 2012-11-01
1 Answer
As Tomás observes, this limit is not meant in an elementary sense: it is a limit in the sense of the meromorphic continuation of a family of distributions. That is, for complex $s$, at first with real part $>0$, let $u_s(f)=\int_0^\infty x^s\,f(x)\,{dx\over x}$. This defines a tempered distribution $u_s$ for each $s$ in that half-plane. The family has a (pretty well-known) meromorphic continuation, certifiable in several ways. One is to integrate by parts repeatedly, obtaining $u_s(f)={-1\over s}\int_0^\infty x^{s+1}\,f'(x)\,{dx\over x}$ and so on. In particular, as this single integration by parts already shows, $u_s$ continues meromorphically to $\Re(s)>-1$, with residue $-\int_0^\infty f'(x)\,dx$ at $s=0$. Since $f$ is Schwartz, this is $f(0)$. Higher derivatives are obtained similarly.
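This continuation can be checked numerically (a sketch of my own, not from the paper or the answer): by the once-integrated-by-parts formula, $s\,u_s(f)$ should approach the residue $f(0)$ as $s \to 0^+$. The test function $f(x)=e^{-x}$ and the crude trapezoid quadrature here are my assumptions.

```python
import math

def u_cont(s, fprime, tmax=40.0, n=100_000):
    """Meromorphic continuation of u_s(f) = int_0^inf x^(s-1) f(x) dx
    to Re(s) > -1, via one integration by parts:
        u_s(f) = -(1/s) * int_0^inf x^s f'(x) dx."""
    h = tmax / n
    total = 0.0
    for i in range(n + 1):
        x = i * h
        w = 0.5 if i in (0, n) else 1.0  # composite trapezoid weights
        total += w * x**s * fprime(x)
    return -(total * h) / s

# f(x) = exp(-x), so f'(x) = -exp(-x); the residue at s = 0 should be f(0) = 1
residue_approx = [s * u_cont(s, lambda x: -math.exp(-x)) for s in (0.1, 0.01, 0.001)]
print(residue_approx)  # should approach f(0) = 1 as s -> 0
```

For this particular $f$ the exact value is $s\,u_s(f)=\Gamma(s+1)$, so the printed numbers converge to $\Gamma(1)=1$.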
About the sign: the original question is a bit misleading, insofar as the residue of $\Gamma$ at the non-positive integers carries a sign, namely $(-1)^k/k!$ at $s=-k$.
Trying to be careful about the sign: I think we find $$ u_s(f)\;=\;{-1\over s}{-1\over s+1}\cdots {-1\over s+k}\int_0^\infty x^{s+k}\,f^{(k+1)}(x)\;dx $$ The residue at $s=-k$ is $$ {-1\over -k}{-1\over -k+1}\cdots{-1\over -k+(k-1)}\cdot(-1)\int_0^\infty f^{(k+1)}(x)\,dx \;=\; {1\over k!} f^{(k)}(0) $$ That is, (maybe!) the signs go away upon taking the residue. On the other hand, as posed above, $\Gamma(s)$ has residue $(-1)^k/k!$ at $s=-k$, by a similar integration by parts: $$ \Gamma(s) \;=\; \int_0^\infty x^s\,e^{-x}\,{dx\over x} \;=\; {1\over s} \int_0^\infty x^{s+1}\,e^{-x}\,{dx\over x} \;=\; \ldots \;=\; {1\over s}\cdots{1\over s+k}\int_0^\infty x^{s+k+1}\,e^{-x}\,{dx\over x} $$ The residue at $s=-k$ is (maybe!) $$ {1\over -k}{1\over -k+1}\cdots {1\over -k+(k-1)}\int_0^\infty e^{-x}\,dx \;=\; {(-1)^k\over k!} $$ Maybe there is some further rationalization about "sign" in the source? ... Edit-edit: the "convolution" with $t^k$ may include something like $(x-t)^k$, thus producing the sign.
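The residue of $\Gamma$ at $s=-k$ can likewise be probed numerically (my own quick check, not part of the answer): $\epsilon\,\Gamma(-k+\epsilon)$ should approach $(-1)^k/k!$ as $\epsilon \to 0$.

```python
import math

# probe Res_{s = -k} Gamma(s) numerically: eps * Gamma(-k + eps) -> (-1)^k / k!
eps = 1e-6
residues = [eps * math.gamma(-k + eps) for k in range(5)]
exact = [(-1) ** k / math.factorial(k) for k in range(5)]
print(residues)
print(exact)  # [1.0, -1.0, 0.5, -0.1666..., 0.04166...]
```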
-
That's very beautiful! Thank you very much! – 2012-11-01
-
Following your approach, I obtained $\frac{u_{p}(f)}{\Gamma(p)} = \frac{(-1)^{k+1}}{\Gamma(k+p+1)} \int_{0}^{\infty} f^{(k+1)}(t)\, t^{p+k}\, dt$. When $p \to -k$ this expression tends to $(-1)^{k} f^{(k)}(0)$. Is there a problem with the sign? – 2012-11-01
-
I think your sign is correct. At least up to $\pm$! :) ... in the original the denominator differs from the actual residue of Gamma by that sign. – 2012-11-01
-
That's bad, because the author defines the $k$-th derivative of $f$ as $f(t)*\frac{t^{-k-1}}{\Gamma(-k)}\chi(t)$, and this definition contradicts the classical one if the sign depends on $k$ – 2012-11-01
-
Maybe one has to think carefully about this sign... Give me a moment. – 2012-11-01
-
Oh, you're absolutely right! $\frac{t^{p-1}}{\Gamma(p)}\chi(t) * f(t) = \int_{0}^{\infty} \frac{\tau^{p-1}}{\Gamma(p)} f(t-\tau)\, d\tau$, and this tends to $\langle \delta^{(k)}(\tau), f(t-\tau) \rangle = (-1)^k (-1)^k f^{(k)}(t) = f^{(k)}(t)$: one $(-1)^k$ from the derivative of $\delta$, one from the chain rule. $\aleph_0$ times thank you! – 2012-11-01
-
Whew! :) – 2012-11-01
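The thread's conclusion can also be put to a numerical test (my own sketch; the test function $f(t)=e^{-2t}$ and the quadrature are assumptions, not from the thread). Using the formula from the comments, $\frac{u_p(f)}{\Gamma(p)} = \frac{(-1)^{k+1}}{\Gamma(p+k+1)}\int_0^\infty t^{p+k} f^{(k+1)}(t)\,dt$, which should tend to $(-1)^k f^{(k)}(0) = \langle \delta^{(k)}, f\rangle$ as $p \to -k$. For $f(t)=e^{-2t}$ and $k=2$ the regularized value is exactly $2^{-p}$, so the limit should be $4 = f''(0)$.

```python
import math

def regularized_pairing(p, k, f_deriv_k1, tmax=40.0, n=200_000):
    """Continued value of u_p(f)/Gamma(p) for Re(p) > -k-1, via k+1
    integrations by parts (the formula from the comment thread):
        u_p(f)/Gamma(p) = (-1)^(k+1)/Gamma(p+k+1) * int_0^inf t^(p+k) f^(k+1)(t) dt."""
    h = tmax / n
    total = 0.0
    for i in range(n + 1):
        t = i * h
        w = 0.5 if i in (0, n) else 1.0  # composite trapezoid weights
        total += w * t ** (p + k) * f_deriv_k1(t)
    integral = total * h
    return (-1) ** (k + 1) / math.gamma(p + k + 1) * integral

# f(t) = exp(-2t), k = 2: f'''(t) = -8 exp(-2t), and (-1)^k f^(k)(0) = f''(0) = 4
k = 2
f3 = lambda t: -8.0 * math.exp(-2.0 * t)
vals = [regularized_pairing(-k + eps, k, f3) for eps in (0.1, 0.01, 0.001)]
print(vals)  # should approach 4 = <delta^(2), f> as eps -> 0
```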