51

Of course, it is easy to see that the integral (or rather the antiderivative) of $f(x) = 1/x$ is $\log(|x|)$, and of course for $\alpha\neq - 1$ the antiderivative of $f(x) = x^\alpha$ is $x^{\alpha+1}/(\alpha+1)$.

I was wondering if there is an intuitive (probably geometric) explanation why the case $\alpha=-1$ is so different and why the logarithm appears?

Some answers which I thought of but which are not convincing:

  1. Taking the limit $\alpha\to -1$ either from above or from below leads to diverging antiderivatives.

  2. One speciality of the case $\alpha=-1$ is that both asymptotes are non-integrable. However, the antiderivative is a local thing and hence shouldn't care about the behavior at infinity.

  • 0
    I would accept anon's comment as an answer if it were given as an answer...2011-09-20

7 Answers

22

Assume you know that for every $\beta$ the derivative of the function $x\mapsto x^\beta$ is the function $x\mapsto\beta x^{\beta-1}$, and that you want to choose $\beta$ such that this derivative is a multiple of the function $x\mapsto x^{\alpha}$. You are led to solve the equation $\beta-1=\alpha$, which yields $\beta=\alpha+1$. If $\alpha=-1$, this gives $\beta=0$, but then the derivative you obtain is the function $x\mapsto 0\cdot x^{-1}=0$, which is not a nonzero multiple of $x\mapsto x^{-1}$.

For every other $\alpha$, this procedure produces an antiderivative, but for $\alpha=-1$ it fails. Or rather, it proves that no power function is an antiderivative of $x\mapsto x^{-1}$. Your next step might be (as mathematicians often do when they want to transform one of their failures into a success) to introduce a new function defined as the antiderivative of $x\mapsto x^{-1}$ which is zero at $x=1$, maybe to give it a cute name like logarithm, and then, who knows, to start studying its properties...

Edit (Second version, maybe more geometric.)

Fix $s>t>0$ and $c>1$ and consider the area under the curve $x\mapsto x^\alpha$ between the abscissæ $x=t$ and $x=s$ on the one hand and between the abscissæ $x=ct$ and $x=cs$ on the other hand. Replacing $x$ by $cx$ multiplies the function by a factor $c^\alpha$. The length of the interval of integration is multiplied by $c$ hence the integral itself is multiplied by $c^{\alpha+1}$.

On the other hand, if an antiderivative $F$ of $x\mapsto x^\alpha$ is a multiple of $x\mapsto x^\beta$ for a given $\beta$, then $F(ct)=c^\beta F(t)$ and $F(cs)=c^\beta F(s)$ hence $F(ct)-F(cs)=c^\beta (F(t)-F(s))$. Note that this last relation holds even if one assumes only that $F$ is the sum of a constant and a multiple of $x\mapsto x^\beta$.

Putting the two parts together yields $c^{\alpha+1}=c^\beta$. Once again, if $\alpha=-1$, this would yield $\beta=0$, hence $F$ would be constant and the area $F(t)-F(s)$ under the curve $x\mapsto x^\alpha$ from $s$ to $t\ge s$ would be zero for every such $s$ and $t$, which is impossible since the function $x\mapsto x^\alpha$ is not zero. (And for every $\alpha\ne-1$, this scaling argument yields the correct exponent $\beta$.)
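For the skeptical, the scaling relation can be checked numerically. Here is a quick sketch (the helper `area` is mine, a plain midpoint rule, not part of the argument):

```python
def area(alpha, a, b, n=100_000):
    """Midpoint-rule approximation of the area under x**alpha from a to b."""
    h = (b - a) / n
    return sum((a + (k + 0.5) * h) ** alpha for k in range(n)) * h

t, s, c = 1.0, 3.0, 2.5

# Replacing x by c*x multiplies the area by c**(alpha + 1) ...
for alpha in (2.0, -0.5, -1.0):
    ratio = area(alpha, c * t, c * s) / area(alpha, t, s)
    print(alpha, ratio, c ** (alpha + 1))

# ... so for alpha = -1 the factor is c**0 = 1: the areas over [t, s] and
# [c*t, c*s] are equal, which is exactly the logarithm's defining property
# log(c*s) - log(c*t) = log(s) - log(t).
```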

  • 1
    Dirk: The geometric argument does not fail; rather, it shows rigorously that no constant-plus-a-power-of-$x$ can be an antiderivative of $x\mapsto x^{-1}$. // Re a geometric way to see that log solves the case $\alpha=-1$, here is a *partly* geometric one. By the geometric proof in my post, by scaling, $F(ct)-F(cs)=F(t)-F(s)$ for every positive $c$, $t$ and $s$. The rest is analysis: $F(xy)=F(x)+F(y)-F(1)$, hence $G=F(\exp(\ ))-F(1)$ is such that $G(t+s)=G(t)+G(s)$. Adding the continuity of $G$, $G(t)=tG(1)$, hence $F$ is an affine function of the inverse of $\exp$.2011-09-20
13

The algebra of all polynomials is closed under differentiation and integration; however, as soon as one wants to include negative powers of $x$, it is no longer closed under integration. As the following paper discusses,

Roman, Steven. "The Logarithmic Binomial Formula." Amer. Math. Monthly, Vol. 99, No. 7 (Aug.–Sept. 1992),

the smallest algebra of functions containing both $x$ and $x^{-1}$ that is closed under both differentiation and antidifferentiation is generated by functions of the form $x^i (\log x)^j$, for $i, j \in \mathbb{Z}$.

As a (very loose) analogy, $\mathbb{R}$ is not closed under taking square roots, but by adjoining $i$ we obtain $\mathbb{C}$, which is closed under taking arbitrary roots.

Hope this helps!

  • 2
    +1 for a nice reference, even if it does not answer the original question...2011-09-20
4

For $\alpha \neq -1$, we have $$\int x^{\alpha}\, dx = \dfrac{x^{\alpha+1}}{\alpha+1} + \text{constant} = \dfrac{x^{\alpha+1}-1}{\alpha+1} + \text{another constant}.$$ Now letting $\alpha \to -1$, and assuming some nice swapping of limit and integral, we get $$\int \dfrac{dx}{x} = \int \lim_{\alpha \to -1}x^{\alpha}\, dx = \lim_{\alpha \to -1}\int x^{\alpha}\, dx = \lim_{\alpha \to -1}\dfrac{x^{\alpha+1}-1}{\alpha+1} + \text{constant} = \log(x) + \text{constant}.$$
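The pointwise convergence in the last step is easy to observe numerically; a quick sketch (the helper `F` is just for illustration):

```python
import math

def F(alpha, x):
    """The normalized antiderivative (x**(alpha+1) - 1) / (alpha + 1)."""
    return (x ** (alpha + 1) - 1) / (alpha + 1)

# As alpha -> -1, F(alpha, x) approaches log(x) pointwise.
x = 5.0
for alpha in (-0.9, -0.99, -0.999999):
    print(alpha, F(alpha, x), math.log(x))
```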

1

Imagine the following:

$ \int \frac{dx}{x} = \frac{x^{0}}{0}$

Subtract the formal quantity $ \frac{1}{0} $, so we get $ \frac{x^{0}-1}{0} = \log(x)$, by using the Taylor expansion of $x^{s}$ in $s$ near $ s=0 $.

  • 0
    The "quantity" $\frac{1}{0}$?2013-12-08
1

Let $b>a>0$. Neither $\lim_{b\to\infty}\int_a^b x^{\alpha}\,dx$ nor $\lim_{a\to 0}\int_a^b x^{\alpha}\,dx$ converges for $\alpha=-1$. For any other $\alpha$, at least one of the two limits exists.

$\log(x)$ is also unbounded both as $x$ goes to infinity and as $x$ goes to zero. So it's a candidate to consider for the antiderivative. For a power function of the form $cx^\beta$, however, at least one of these limits is finite. So a $cx^\beta$ cannot be "the" antiderivative on the entire half-line $(0,\infty)$.
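This dichotomy can be checked with a few numbers. A quick sketch (`integral` uses the closed-form antiderivative for $\alpha\neq-1$, and a crude midpoint rule stands in for $\int dx/x$; both helpers are mine):

```python
def integral(alpha, a, b):
    """Integral of x**alpha from a to b via the closed form (alpha != -1)."""
    return (b ** (alpha + 1) - a ** (alpha + 1)) / (alpha + 1)

def integral_inverse(a, b, n=100_000):
    """Midpoint-rule approximation of the integral of 1/x from a to b."""
    h = (b - a) / n
    return sum(1.0 / (a + (k + 0.5) * h) for k in range(n)) * h

# alpha = -2: finite as b -> infinity, divergent as a -> 0.
print([integral(-2, 1, b) for b in (10, 100, 1000)])       # approaches 1
# alpha = -0.5: divergent as b -> infinity, finite as a -> 0.
print([integral(-0.5, a, 1) for a in (0.1, 0.01, 0.001)])  # approaches 2
# alpha = -1: both limits diverge, though only logarithmically slowly.
print([integral_inverse(1, b) for b in (10, 100, 1000)])
```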

1

OK, suppose you know some calculus but you fall asleep as your professor talks about the exponential function. Your true love is algebra and you've noticed the following:

$x \mapsto x^{-1}$ is a homomorphism of multiplication. You look at different forms of the function on algebraic objects,
$\Bbb R^* \to \Bbb R^*$
$(0, +\infty) \to (0, +\infty)$
$(0, 1] \to [1, +\infty)$
taking the inverse of a product to the product of the inverses.

As you study the mapping $(0, +\infty) \to (0, +\infty)$ you find it interesting to look at how the set $\{\dots,6^{-1},5^{-1},4^{-1},3^{-1},2^{-1},1^0,2^1,3^1,4^1,5^1,6^1,\dots\}$
is mapped under inversion. It looks like multiplying by $-1$ on $\Bbb Z$, and as you explore it further, you wonder if you can create a homomorphism that takes multiplication to addition,
$(0, +\infty) \to \Bbb R$
that somehow uses the inversion function $1/x$. You are looking for a function that dilates and is negative when operating on arguments less than $1$, is equal to $0$ at $1$, and is a contraction and positive when operating on arguments greater than $1$.

But $1/x$ is greater than $1$ and explodes near 0 when $x \lt 1$, and is less than $1$ and shrinks to zero near $+\infty$ when $x \gt 1$. With a flash of insight you think, just maybe, the function

$F(x) =\int_1^x u^{-1}du$

will do the trick. It feels like it will stretch $(0, +\infty) \to \Bbb R$ in the right way, and it takes $1$ to $0$ with $F'(1) = 1$.

OK, time to see if integrating a multiplicative homomorphism with the lower limit the identity $1$ will give you a new homomorphism that takes multiplication to addition.
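Before proving it, one can test the claimed homomorphism property numerically (a sketch; the helper `F` approximates the integral with a midpoint rule):

```python
def F(x, n=100_000):
    """Midpoint-rule approximation of the integral of 1/u from 1 to x (x > 0)."""
    h = (x - 1) / n
    return sum(1.0 / (1 + (k + 0.5) * h) for k in range(n)) * h

# Integrating the multiplicative homomorphism u -> 1/u, starting from the
# identity element 1, takes multiplication to addition: F(x*y) == F(x) + F(y).
for x, y in ((2.0, 3.0), (0.5, 8.0)):
    print(F(x * y), F(x) + F(y))
```

(Note that for $x<1$ the step $h$ is negative, so the same formula correctly produces a negative value, as the sought function should.)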

0

In the expression $\dfrac{x^{\alpha+1}}{\alpha+1}$, when $\alpha=-1$, then you're dividing by $0$. If you understand why you can divide by any other number but not by $0$, then that immediately gives you a reason to expect the answer to be quite different in that case.

(I'm surprised to see this comment absent from the other answers.)

  • 9
    It is in the question, listed as the first of two answers considered "not convincing".2011-09-14