290

I've come across statements in the past along the lines of "function $f(x)$ has no closed form integral", which I assume means that there is no combination of the operations:

  • addition/subtraction
  • multiplication/division
  • raising to powers and roots
  • trigonometric functions
  • exponential functions
  • logarithmic functions

that, when differentiated, gives the function $f(x)$. I've heard this said about the function $f(x) = x^x$, for example.

What sort of techniques are used to prove statements like this? What is this branch of mathematics called?


Merged with "How to prove that some functions don't have a primitive" by Ismael:

Sometimes we are told that some functions like $\dfrac{\sin(x)}{x}$ don't have an indefinite integral, or that it can't be expressed in terms of other simple functions.

I wonder how we can prove that kind of assertion?

7 Answers

126

It is a theorem of Liouville, reproven later with purely algebraic methods, that for rational functions $f$ and $g$, $g$ non-constant, the antiderivative

$\int f(x)\exp(g(x)) \, \mathrm dx$

can be expressed in terms of elementary functions if and only if there exists some rational function $h$ satisfying the differential equation

$f = h' + hg'$

(in which case the antiderivative is $h(x)\exp(g(x))$, up to an additive constant).

$e^{x^2}$ is another classic example of such a function with no elementary antiderivative.
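As a rough numerical sanity check (not part of the original answer), one can verify a positive instance of Liouville's criterion: with $g(x) = x^2$ and the rational function $h(x) = x$, the criterion gives $f = h' + hg' = 1 + 2x^2$, so $\int (1 + 2x^2)e^{x^2}\,dx = x\,e^{x^2}$. The sketch below confirms this by comparing a finite-difference derivative of $x\,e^{x^2}$ against $f(x)e^{x^2}$:

```python
import math

# Instance of Liouville's criterion: with g(x) = x^2 and rational h(x) = x,
# f = h' + h*g' = 1 + 2x^2, so an antiderivative of f(x)*exp(g(x))
# should be h(x)*exp(g(x)) = x*exp(x^2).

def f(x):
    return 1 + 2 * x * x

def antiderivative(x):
    return x * math.exp(x * x)

# Check d/dx [x*exp(x^2)] == f(x)*exp(x^2) by central finite differences.
eps = 1e-6
for x in [0.0, 0.5, 1.0, -0.7]:
    numeric = (antiderivative(x + eps) - antiderivative(x - eps)) / (2 * eps)
    exact = f(x) * math.exp(x * x)
    assert abs(numeric - exact) < 1e-4 * max(1.0, abs(exact))
```

For $e^{x^2}$ itself ($f = 1$, $g = x^2$) no rational $h$ satisfies $1 = h' + 2xh$, which is exactly why it has no elementary antiderivative.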

I don't know how much math you've had, but some of this paper might be comprehensible in its broad strokes: http://www.sci.ccny.cuny.edu/~ksda/PostedPapers/liouv06.pdf

Liouville's original paper:

Liouville, J. "Suite du Mémoire sur la classification des Transcendantes, et sur l'impossibilité d'exprimer les racines de certaines équations en fonction finie explicite des coefficients." J. Math. Pure Appl. 3, 523-546, 1838.

Michael Spivak's book *Calculus* also has a section discussing this.

68

Have you ever heard of Galois theory? It studies field extensions and, classically, the solvability of polynomial equations over fields.

As it turns out, there is a special type of Galois theory called Differential Galois theory, which studies fields with a differential operator on them:

http://en.wikipedia.org/wiki/Differential_galois_theory

Using this theory, one can prove that functions like $\frac{\sin(x)}{x}$ and $x^x$ don't have an elementary antiderivative.
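Having no *elementary* antiderivative does not mean the integral fails to exist: $\int_0^x \frac{\sin t}{t}\,dt$ defines the perfectly good (non-elementary) sine integral $\mathrm{Si}(x)$. As an illustrative sketch (not from the answer), it can be evaluated by integrating the Taylor series of $\sin(t)/t$ term by term:

```python
import math

# sin(t)/t has no elementary antiderivative, but its integral from 0 to x
# (the sine integral Si) is well defined; integrate the Taylor series of
# sin(t)/t term by term: Si(x) = sum_{k>=0} (-1)^k x^(2k+1) / ((2k+1)*(2k+1)!)

def si(x, terms=20):
    total = 0.0
    for k in range(terms):
        n = 2 * k + 1
        total += (-1) ** k * x ** n / (n * math.factorial(n))
    return total

print(si(1.0))  # about 0.9460830703671830
```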

  • 0
    @BillDubuque is correct. The Galois group of an integral is just the additive group of the constant subfield. Why? Well, the Galois group is the group of automorphisms that fix the base field and permute the solutions amongst themselves. For an integral, the solutions differ only by the constant of integration $C$, so the integral's Galois group is the additive group of the real (or complex) numbers. Actually, that's one of two cases: the case when we need to extend the original field to construct the integral. If we don't need to extend, then the Galois group is trivial. 2016-12-13
40

The techniques used for indefinite integration of elementary functions are actually quite simple in the transcendental (vs. algebraic) case, i.e. the case where the integrand lies in a purely transcendental extension of the field of rational functions $\rm\mathbb C(x)$. Informally this means that the integrand lies in some tower of fields $\rm\mathbb C(x) = F_0 < \cdots < F_n = \mathbb C(x,t_1,\ldots,t_n)$ which is built by adjoining an exponential or logarithm of an element of the prior field, i.e. $\rm\ t_{i+1} =\: exp(f_i)\ $ or $\rm\ t_{i+1} =\: log(f_i)\ $ for $\rm\ f_i \in F_i$, where $\rm t_{i+1}$ is transcendental over $\rm F_i\:.\ $ For example, $\rm\ exp(x),\ log(x)\ $ are transcendental over $\rm\mathbb C(x)$, but $\rm\ exp(2\ log(x)) = x^2\ $ is not.

Because $\rm\ F_{i} = F_{i-1}(t_{i})$ is a transcendental extension, it has particularly simple structure: it is isomorphic to the field of rational functions in one indeterminate $\rm\:t_i\:$ over $\rm\ F_{i-1}\ $. In particular, this means that one may employ well-known rational function integration techniques, such as expansion into partial fractions. This, combined with a simple analysis of the effect of differentiation on the degree of polynomials $\rm\ p(t_i)$, quickly leads to Liouville's fundamental result on the structure of antiderivatives: they must lie in the same field $\rm F$ as the integrand, except possibly for the addition of constant multiples of logs over $\rm F$. With this structure theorem in hand, the transcendental case reduces to elementary computations in rational function fields. This case of the algorithm is so simple that it may be easily comprehended by anyone who has mastered a first course in abstract algebra.
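To make the "well-known rational function integration techniques" concrete, here is a small stdlib-only sketch (my own illustration, not from the answer) of the simplest partial-fraction step: for a proper rational function $p/q$ whose denominator has distinct roots $a_i$, one has $p/q = \sum_i c_i/(x-a_i)$ with $c_i = p(a_i)/q'(a_i)$, so the antiderivative is $\sum_i c_i \log(x-a_i)$. Root-finding is assumed already done; the roots are passed in.

```python
# Partial-fraction step of rational function integration:
# for proper p/q with distinct simple roots a_i of q,
# p/q = sum_i c_i/(x - a_i) where c_i = p(a_i)/q'(a_i),
# hence the antiderivative is sum_i c_i * log(x - a_i).

def poly_eval(coeffs, x):
    """Evaluate a polynomial given as [c0, c1, ...] meaning c0 + c1*x + ..."""
    return sum(c * x ** i for i, c in enumerate(coeffs))

def poly_deriv(coeffs):
    return [i * c for i, c in enumerate(coeffs)][1:]

def residues(p, q, roots):
    """Partial-fraction coefficients c_i = p(a_i)/q'(a_i) at simple roots."""
    dq = poly_deriv(q)
    return [poly_eval(p, a) / poly_eval(dq, a) for a in roots]

# Example: 1/(x^2 - 1) = (1/2)/(x-1) - (1/2)/(x+1), so its antiderivative
# is (1/2) log(x-1) - (1/2) log(x+1).
p = [1]            # numerator 1
q = [-1, 0, 1]     # denominator x^2 - 1
print(residues(p, q, [1, -1]))   # [0.5, -0.5]
```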

On the other hand, the full-blown algebraic case of the algorithm requires nontrivial results from the theory of algebraic functions. Although there are some simple special-case algorithms for square roots and cube roots (Davenport, Trager), the general algorithm requires deep results about points of finite order on abelian varieties over finitely generated ground fields. This algebraic case of the integration algorithm was discovered by Robert Risch in 1969, who did his Berkeley Ph.D. on this topic (under Max Rosenlicht).

For a very nice introduction to the theory see Max Rosenlicht's Monthly paper, available from JSTOR. This exposition includes a complete proof of the Liouville structure theorem along with a derivation of Liouville's classic criterion for $\rm\int f(z)\: e^{g(z)}\: dz\ $ to be elementary, for $\rm\: f(z),\: g(z)\in \mathbb C(x)$. For algorithms see Barry Trager's 1984 MIT thesis and Manuel Bronstein: Symbolic Integration I: Transcendental Functions.

Disclaimer: I implemented the integration algorithm in Macsyma (not the much older Maxima) so, perhaps due to this experience, my judgment of simplicity might be slightly biased. However, the fact that the basic results are derived in a handful of pages in Rosenlicht's paper yields independent evidence for such claims.

32

A key result in this area is a theorem by Liouville. You'll find easy expositions of this result in the references below:

  • Antoine Chambert-Loir. A Field Guide to Algebra, last chapter, Springer, 2005.
  • M. Rosenlicht. Integration in finite terms, Amer. Math. Monthly, (1972), pp 963-972.
  • Toni Kasper. Integration in finite terms: the Liouville theory. Math. Mag., 53(4):195–201, 1980.
  • D. H. Potts. Elementary integrals. Amer. Math. Monthly, 63:545–554, 1956.
  • A. D. Fitt and G. T. Q. Hoare. The closed-form integration of arbitrary functions, The Mathematical Gazette, Vol. 77, No. 479 (Jul., 1993), pp. 227-236
  • Elena Anne Marchisotto and Gholam-Ali Zakeri. An invitation to integration in finite terms, The College Mathematics Journal, Vol. 25, No. 4 (Sep., 1994), pp. 295-308
  • D. G. Mead. Integration, The American Mathematical Monthly, Vol. 68, No. 2 (Feb., 1961), pp. 152-156
  • 0
    See also [Impossibility theorems on integration in elementary terms](http://www.claymath.org/2005-academy-colloquium-series). 2014-06-16
31

Surprisingly, there is a procedure for determining this, called the Risch algorithm. However, it is very complicated to implement, and furthermore it is not a true algorithm: if the input does have an elementary antiderivative, it can find one in a finite amount of time, but demonstrating that there is no elementary antiderivative requires solving the constant problem (deciding whether a given constant expression is identically zero), which is undecidable in general. (In that case, probabilistic methods can be used.)

  • 3
    @Bill Dubuque: That's exactly correct. I've implemented the transcendental version of the Risch algorithm in SymPy, and you can trick it into thinking something like `exp((sin(y)**2 + cos(y)**2 - 1)*x**2)` has no elementary integral with respect to `x` (when in fact it does, because it reduces to just 1). 2011-06-26
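The probabilistic workaround for the constant problem mentioned in the answer can be sketched very simply (my own illustration, not from the thread): evaluate the suspect expression at random points and declare it zero if it always vanishes. Applied to the exponent $\sin^2(y) + \cos^2(y) - 1$ from the comment above:

```python
import math
import random

# Probabilistic zero-equivalence test for the "constant problem":
# evaluate at random points; declare the expression identically zero
# if it vanishes (up to floating-point error) at all of them.

def probably_zero(f, trials=50, tol=1e-9):
    random.seed(0)  # fixed seed so this sketch is reproducible
    return all(abs(f(random.uniform(-10, 10))) < tol for _ in range(trials))

exponent = lambda y: math.sin(y) ** 2 + math.cos(y) ** 2 - 1
print(probably_zero(exponent))               # True: identically zero
print(probably_zero(lambda y: math.sin(y)))  # False: not identically zero
```

This can give a false positive only if every random sample happens to land on a zero of a not-identically-zero expression, which is why such methods are "probabilistic" rather than decisive.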
20

Brian Conrad explains this in the following:

Impossibility theorems on integration in elementary terms - pdf

-1

The general pattern is always the same: (1) study properties of derivatives of closed-form functions; (2) if your function $f(x)$ fails one of those properties, then it has no closed-form antiderivative.

Here is how to get started: first, consider derivatives of rational functions. Notice that their pole orders at finite points are never 1 (more generally: the residue is 0). If $f(x)$ fails this property, then you know that its antiderivative cannot be a rational function. Sounds simple, but the surprising part is that this generalizes quite easily to logarithmic extensions (handling exponential extensions in the Risch algorithm is more involved, and handling algebraic extensions more involved still).

Take for example a function $f(x) \in \mathbb{C}(x, \ln(x))$. For each irreducible factor $p \in \mathbb{C}(x)[ \ln(x) ]$ of the denominator of $f(x)$, you can define a corresponding residue (which will be algebraic over $\mathbb{C}(x)$). A necessary condition for $f(x)$ to have an elementary antiderivative is for these residues to be constants (in the more basic case $f(x) \in \mathbb{C}(x)$, residues are always constants, and thus rational functions always have an elementary antiderivative). See "Liouville's theorem (differential algebra)" on Wikipedia for more.
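The residue obstruction above can be seen numerically (a stdlib-only sketch of my own, not from the answer): the derivative of a rational function has residue 0 at every pole, so $1/x$, with residue 1 at the origin, cannot be such a derivative; its antiderivative needs a logarithm. The residue at 0 is $\frac{1}{2\pi i}\oint f(z)\,dz$ over a small circle, approximated by a Riemann sum:

```python
import cmath
import math

# Residue at z = 0 via a contour integral over a circle of given radius:
# with z = r*e^{i*theta}, dz = i*z dtheta, so
# (1/(2*pi*i)) * integral f(z) dz  ~=  (1/n) * sum_k f(z_k) * z_k.

def residue_at_zero(f, radius=0.5, n=2000):
    total = 0j
    for k in range(n):
        z = radius * cmath.exp(2j * math.pi * k / n)
        total += f(z) * z
    return total / n

# 1/x has residue 1 at 0, so it is NOT the derivative of a rational function.
print(abs(residue_at_zero(lambda z: 1 / z) - 1) < 1e-9)    # True
# -1/x^2 = d/dx (1/x) has residue 0 at 0, consistent with the criterion.
print(abs(residue_at_zero(lambda z: -1 / z ** 2)) < 1e-9)  # True
```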