I am about to finish my first year of studying mathematics at university and have completed the basic linear algebra/calculus sequence. I have started to look at some real analysis and have really enjoyed it so far.

One thing I feel I am lacking is motivation. That is, the jump in rigour between the usual introduction-to-calculus class and real analysis is quite stark. While I appreciate rigour for aesthetic reasons, I have trouble understanding why the transition from 18th-century, Euler-style calculus to the rigorous "delta-epsilon" formulation of calculus was necessary.

Is there a book that provides some historical motivation for the rigorous development of calculus? Perhaps something that gives several counterexamples that arise when one is equipped only with a non-rigorous (i.e. first-year undergraduate) formulation of calculus. For example, were there results that were thought to be true but turned out to be false once the foundations of calculus were strengthened? I suppose if anyone knows good counterexamples themselves, they could list them here as well.


Solution 1:

In general, the push for rigor usually comes in response to a failure to demonstrate the kinds of results one wishes to. It's usually relatively easy to demonstrate that an object with certain properties exists, but you need precise definitions to prove that no such object exists. The classic example of this is non-computable problems and Turing machines. Until you sit down and say "this, precisely, and nothing else is what it means to be solved by computation", it is impossible to prove that something isn't a computation; so when people start asking "is there an algorithm that does $\ldots$?" for questions where the answer "should be" no, you suddenly need a precise definition. Something similar happened with real analysis.

In real analysis, as mentioned in an excellent comment, there was a shift in people's conception of what a function is. This broadened conception of a function suddenly allows a number of famous "counterexample" functions to be constructed, and these often require a reasonably rigorous understanding of the topic to construct or to analyze. The most famous is the everywhere continuous, nowhere differentiable Weierstrass function. If you don't have a very precise definition of continuity and differentiability, demonstrating that this function is one and not the other is extremely hard. The quest for weird functions with unexpected properties and combinations of properties was one of the driving forces in developing precise conceptions of those properties.
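For concreteness, one standard form of such a function (the constants are not spelled out in the answer above; these are Weierstrass's original conditions) is
$$W(x)=\sum_{n=0}^{\infty} a^{n}\cos\!\left(b^{n}\pi x\right), \qquad 0<a<1,\quad b \text{ an odd positive integer},\quad ab>1+\tfrac{3\pi}{2}.$$
Continuity is the easy half: since $\sum a^n$ converges, the series converges uniformly (Weierstrass $M$-test), and a uniform limit of continuous functions is continuous. Ruling out differentiability at every single point is where precise definitions become unavoidable.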

Another topic that people were very interested in was infinite series. Lots of weird results can crop up if you're not careful with infinite series, as shown by what is now a famous cautionary theorem:

Theorem (Riemann Rearrangement Theorem): Let $a_n$ be a sequence such that $\sum a_n$ converges conditionally. Then for every $x$ there is a rearrangement $b_n$ of $a_n$ such that $\sum b_n=x$.

This theorem means you have to be very careful when dealing with infinite sums; for a long time people weren't, and so they started deriving results that made no sense. Suddenly the usual free-wheeling algebraic manipulation of infinite sums was no longer okay, because sometimes it changed the value of the sum. Instead, a more rigorous theory of manipulating series, together with concepts such as uniform and absolute convergence, had to be developed.
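A concrete instance (a standard textbook example, not given in the answer above) is the alternating harmonic series,
$$1-\tfrac12+\tfrac13-\tfrac14+\cdots=\ln 2,$$
which converges conditionally. Rearranging it so that each positive term is followed by two negative terms gives
$$1-\tfrac12-\tfrac14+\tfrac13-\tfrac16-\tfrac18+\tfrac15-\tfrac1{10}-\tfrac1{12}+\cdots=\tfrac12\ln 2,$$
so merely reordering the same terms halves the sum.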

Here's an example of a problem surrounding an infinite product created by Euler:

Consider the following formula: $$x\prod_{n=1}^\infty \left(1-\frac{x^2}{n^2\pi^2}\right)$$ Does this expression even make sense? Assuming it does, does it equal $\sin(x)$ or $\sin(x)e^x$? How can you tell (notice that both functions have the same zeros as this product, and the same relationship to their derivative)? If it doesn't equal $\sin(x)e^x$ (which it doesn't; it really does equal $\sin(x)$), how can we modify it so that it does?
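A sketch of why the product at least makes sense (a standard argument, not spelled out above): for fixed $x$,
$$\sum_{n=1}^{\infty}\frac{x^2}{n^2\pi^2}=\frac{x^2}{\pi^2}\sum_{n=1}^{\infty}\frac{1}{n^2}<\infty,$$
and a product $\prod(1+a_n)$ converges whenever $\sum|a_n|$ converges, so the expression does define a function of $x$. Which function it defines is the subtle part: prescribing the zeros pins an entire function down only up to a nowhere-vanishing factor such as $e^x$, which is exactly the ambiguity between $\sin(x)$ and $\sin(x)e^x$ that Weierstrass's factorization theorem later clarified.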

Questions like this were very popular in the 1800s, as mathematicians were notably obsessed with infinite products and summations. However, most questions of this form require a very sophisticated understanding of analysis to handle (and weren't handled particularly well by the tools of the previous century).

Solution 2:

One good motivating example I have is the Weierstrass function, which is continuous everywhere but differentiable nowhere. Throughout the 18th and 19th centuries (until this counterexample was discovered) it was thought that every continuous function was also (almost everywhere) differentiable, and a large number of "proofs" of this assertion were attempted. Without rigorous definitions of concepts like "continuity" and "differentiability", there is no way to analyze this sort of pathological case.

In integration, a number of functions which are not Riemann integrable were discovered, paving the way for the Stieltjes and, more importantly, the Lebesgue theories of integration. Today, the majority of integrals considered in pure mathematics are Lebesgue integrals.
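A standard example of such a function (not named in the answer above) is the Dirichlet function on $[0,1]$,
$$f(x)=\begin{cases}1 & x\in\mathbb{Q}\\ 0 & x\notin\mathbb{Q},\end{cases}$$
for which every upper Riemann sum is $1$ and every lower Riemann sum is $0$, so $f$ is not Riemann integrable; but $\mathbb{Q}\cap[0,1]$ has Lebesgue measure zero, so $f$ is Lebesgue integrable with $\int_{[0,1]} f\,d\mu = 0$.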

A large number of these cases, especially those pertaining to differentiation, integration, and continuity, were motivating factors in establishing analysis on a rigorous footing.

Lastly, the development of rigorous mathematics in the late 19th and early 20th centuries changed the focus of mathematical research. Before this revolution, mathematics, and especially analysis, was extremely concrete: one did research into a specific function or class of functions (e.g. Bessel functions, elliptic functions, etc.). But once rigorous methods exposed the underlying structure shared by many different classes and types of functions, research began to focus on the abstract nature of these structures. As a result, virtually all research in pure mathematics these days is abstract, and the major tool of abstract research is rigor.

Solution 3:

Some other answers have already provided excellent insights. But let's look at the problem this way: where does the need for rigor originate? I think the answer lies in one word: counter-intuition.

When someone is developing or creating mathematics, they mostly need an intuition about what they are talking about. I don't know much about the history, but, for example, I would bet the notion of the derivative was first introduced because people needed something to express "speed" or "acceleration" in motion. I mean that first there was a natural phenomenon, and a mathematical concept was developed to describe it. This mathematics could perfectly describe the thing they were dealing with, and the results matched expectation and intuition. But as time passed, new problems popped up that led to unexpected, counter-intuitive results, so people felt the need for more rigorous (and consequently more abstract) concepts. This is why the further we develop mathematics, the harder its intuition becomes.

A classic example, as mentioned in other answers, is the Weierstrass function. Before learning calculus, we may already have some sense of the notion of continuity as well as of slope, and this helps us understand calculus more thoroughly. But the Weierstrass function is something unexpected and hard to imagine, which leads us to the fact that "sometimes mathematics may not make sense, but it's true!"

Another (somewhat related) example is the Bertrand paradox in probability, where the probability that a "random chord" of a circle is longer than the side of the inscribed equilateral triangle comes out as $1/2$, $1/3$, or $1/4$ depending on how the chord is chosen. In the same manner, we may have some intuition about probability even before studying it. This intuition is helpful in understanding the initial concepts of probability, until we are faced with the Bertrand paradox and are left asking, "Oh... what can we do about that?"

There are some good questions on this site and on MathOverflow about counter-intuitive results in various fields of mathematics, some of which were the initial incentive to develop more rigorous mathematics. I recommend taking a look at them as well.

Solution 4:

You may enjoy these books. The first one is a classic.

  • The History of the Calculus and Its Conceptual Development, by Carl B. Boyer

  • A History of Analysis, edited by Hans Niels Jahnke

  • Analysis by Its History, by Ernst Hairer and Gerhard Wanner

  • A Radical Approach to Real Analysis, by David Bressoud