How can you prove that a function has no closed form integral?
In the past, I've come across statements along the lines of "function $f(x)$ has no closed form integral", which I assume means that there is no combination of the operations:
- addition/subtraction
- multiplication/division
- raising to powers and roots
- trigonometric functions
- exponential functions
- logarithmic functions
which when differentiated gives the function $f(x)$. I've heard this said about the function $f(x) = x^x$, for example.
What sort of techniques are used to prove statements like this? What is this branch of mathematics called?
Merged with "How to prove that some functions don't have a primitive" by Ismael:
Sometimes we are told that functions like $\dfrac{\sin(x)}{x}$ don't have an elementary antiderivative, i.e. that their indefinite integrals can't be expressed in terms of other simple functions.
I wonder how we can prove that kind of assertion?
Solution 1:
It is a theorem of Liouville, reproven later with purely algebraic methods, that for rational functions $f$ and $g$, with $g$ non-constant, the antiderivative
$$\int f(x)\exp(g(x)) \, \mathrm dx$$
can be expressed in terms of elementary functions if and only if there exists some rational function $h$ satisfying
$$f = h' + hg'$$
$e^{x^2}$ is another classic example of such a function with no elementary antiderivative.
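To see how this criterion rules out an elementary antiderivative for $e^{x^2}$, take $f = 1$ and $g = x^2$; here is a sketch of the degree argument (details filled in here, not part of the theorem statement). We would need a rational function $h$ with
$$1 = h' + 2xh.$$
If $h = p/q$ in lowest terms with $q$ non-constant, then at a root of $q$ of multiplicity $n$ the derivative $h'$ has a pole of order $n+1$, which cannot be cancelled by the pole of order at most $n$ coming from $2xh$; so $h' + 2xh$ would have a pole and could not equal $1$. Hence $h$ must be a polynomial (and $h = 0$ fails trivially), say of degree $n$. But then $2xh$ has degree $n+1$ while $h'$ has degree $n-1$, so $h' + 2xh$ has degree $n+1 \geq 1$ and again cannot equal the constant $1$. No such rational $h$ exists, so $\int e^{x^2}\,dx$ is not elementary.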
I don't know how much math you've had, but some of this paper might be comprehensible in its broad strokes: https://ksda.ccny.cuny.edu/PostedPapers/liouv06.pdf
Liouville's original paper:
Liouville, J. "Suite du Mémoire sur la classification des Transcendantes, et sur l'impossibilité d'exprimer les racines de certaines équations en fonction finie explicite des coefficients." J. Math. Pure Appl. 3, 523-546, 1838.
Michael Spivak's book Calculus also has a section discussing this.
Solution 2:
Have you ever heard of Galois theory? It studies field extensions, and was originally developed to understand the solutions of polynomial equations over fields.
As it turns out, there is a special type of Galois theory called Differential Galois theory, which studies fields with a differential operator on them:
http://en.wikipedia.org/wiki/Differential_galois_theory
Using this theory, one can prove that functions like $\frac{\sin(x)}{x}$ and $x^x$ don't have an elementary antiderivative.
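As a sketch of how such a proof can go for $\frac{\sin(x)}{x}$ (this argument actually uses Liouville's criterion from Solution 1 rather than the full differential Galois machinery): writing $\sin(x) = \frac{e^{ix} - e^{-ix}}{2i}$ reduces the problem to integrals like $\int \frac{e^{ix}}{x}\,dx$, i.e. $f = \frac1x$ and $g = ix$ in the criterion, which would require a rational $h$ with
$$\frac{1}{x} = h' + ih.$$
If $h$ is regular at $x = 0$, then so is $h' + ih$, which therefore cannot have the simple pole $\frac1x$ there. If instead $h$ has a pole of order $n \geq 1$ at $0$, then $h'$ has a pole of order $n+1 \geq 2$ while $ih$ has one of order $n$, so $h' + ih$ has a pole of order $n+1 \neq 1$. Either way there is no solution, so $\int \frac{e^{ix}}{x}\,dx$ is not elementary. (A slightly stronger form of Liouville's theorem is needed to conclude that the two exponential pieces cannot combine into something elementary.)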
Solution 3:
The techniques used for indefinite integration of elementary functions are actually quite simple in the transcendental (vs. algebraic) case, i.e. the case where the integrand lies in a purely transcendental extension of the field of rational functions $\rm\mathbb C(x)$. Informally, this means that the integrand lies in some tower of fields $\rm\mathbb C(x) = F_0 < \cdots < F_n = \mathbb C(x,t_1,\ldots,t_n)$ built by adjoining an exponential or logarithm of an element of the prior field, i.e. $\rm\ t_{i+1} =\: exp(f_i)\ $ or $\rm\ t_{i+1} =\: log(f_i)\ $ for $\rm\ f_i \in F_i$, where $\rm t_{i+1}$ is transcendental over $\rm F_i\:.\ $ For example, $\rm\ exp(x),\ log(x)\ $ are transcendental over $\rm\mathbb C(x)$, but $\rm\ exp(2\ log(x)) = x^2\ $ is not.

Now, because $\rm\ F_{i} = F_{i-1}(t_{i})$ is a transcendental extension, it has particularly simple structure, viz. it is isomorphic to the field of rational functions in one indeterminate $\rm\:t_i\:$ over $\rm\ F_{i-1}\ $. In particular, this means that one may employ well-known rational function integration techniques, such as expansion into partial fractions. This, combined with a simple analysis of the effect of differentiation on the degree of polynomials $\rm\ p(t_i)$, quickly leads to Liouville's fundamental result on the structure of antiderivatives: they must lie in the same field $\rm F$ as the integrand, except possibly for the addition of constant multiples of logarithms of elements of $\rm F$.

With this structure theorem in hand, the transcendental case reduces to elementary computations in rational function fields. This case of the algorithm is so simple that it may be easily comprehended by anyone who has mastered a first course in abstract algebra.
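As a small worked illustration of these rational-function techniques (my example, with $\rm t = log(x)$, so $\rm\ t' = 1/x$): to integrate $\rm t \in \mathbb C(x,t)$, the degree analysis shows that a polynomial antiderivative can have degree at most one higher, with constant leading coefficient, so try
$$q = a(x)\,t^2 + b(x)\,t + c(x), \qquad q' = a'\,t^2 + \left(\frac{2a}{x} + b'\right) t + \left(\frac{b}{x} + c'\right) = t.$$
Comparing coefficients of powers of $t$: $\ a' = 0$ forces $a \in \mathbb C$; then $\frac{2a}{x} + b' = 1$ forces $a = 0$ (otherwise $b$ would involve $\log x$, leaving $\mathbb C(x)$) and $b = x$; finally $\frac{b}{x} + c' = 0$ gives $c = -x$. Hence $\int \log(x)\,dx = x\log(x) - x$, recovering the familiar formula purely by algebra in the rational function field.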
On the other hand, the full-blown algebraic case of the algorithm requires nontrivial results from the theory of algebraic functions. Although there are some simple special-case algorithms for square roots and cube roots (Davenport, Trager), the general algorithm requires deep results about points of finite order on abelian varieties over finitely generated ground fields. This algebraic case of the integration algorithm was discovered in 1969 by Robert Risch, who did his Berkeley Ph.D. on the topic (under Max Rosenlicht).
For a very nice introduction to the theory, see Max Rosenlicht's Monthly paper ("Integration in finite terms," Amer. Math. Monthly 79 (1972), 963-972), available from JSTOR. This exposition includes a complete proof of the Liouville structure theorem, along with a derivation of Liouville's classic criterion for $\rm\int f(z)\: e^{g(z)}\: dz\ $ to be elementary, for $\rm\: f(z),\: g(z)\in \mathbb C(z)$. For algorithms, see Barry Trager's 1984 MIT thesis and Manuel Bronstein's Symbolic Integration I: Transcendental Functions.
Disclaimer: I implemented the integration algorithm in Macsyma (not its open-source descendant Maxima), so, perhaps due to this experience, my judgment of simplicity may be slightly biased. However, the fact that the basic results are derived in a handful of pages in Rosenlicht's paper provides independent evidence for such claims.