Rigour in mathematics

Finding the roots of a linear polynomial is trivial. Already the Babylonians could find roots of quadratic polynomials. Methods for solving cubic and quartic polynomials were discovered in the sixteenth century, all using radicals (i.e. $n$th roots for some $n$). Isn't it obvious that finding the roots of higher-degree polynomials must also be possible using radicals, and that the only reason we have not found the formulas yet is that they become more and more complicated as the degree grows?

Galois theory shattered this belief: the general polynomial of degree five or higher is not solvable by radicals (the Abel–Ruffini theorem).
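
As a concrete check, here is a minimal sketch using SymPy (the quintic $x^5-x-1$ is a standard example whose Galois group is the full symmetric group $S_5$): a computer algebra system can only return its roots as abstract root objects, never as radical expressions.

```python
from sympy import symbols, solve

x = symbols('x')

# x**5 - x - 1 is irreducible over Q with Galois group S5, so by
# Galois theory its roots cannot be written in radicals.  SymPy
# therefore returns them as abstract CRootOf objects rather than
# as radical expressions.
for r in solve(x**5 - x - 1, x):
    print(r, r.evalf(5))
```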


Berry's phase was discovered after a lack of rigor in the proof of the adiabatic theorem was noticed around 1980. It now appears in standard quantum mechanics texts and has produced at least 3000 papers since 1980 (actually, that count is about ten years old; I'm not sure how many there are by now).

While this appears in physics, the mistake was mathematical, in particular topological. The proof contains an integral over parameter space that was assumed to vanish. That would be fine if the parameter space were one-dimensional, but in higher-dimensional parameter spaces there can be a singularity in the domain that obstructs the vanishing of the integral. Once this mistake was uncovered, physicists gained insight from the corrected theorem and learned to manipulate the phase with ease. It would be exciting to see similar developments in other physical arenas where ad hoc mathematics is used. However, this seems to be an aberration from the norm, much as in mathematics itself: I think it's fair to say that most often the heuristic proof has turned out to be correct once the details are fleshed out. This is why this thread is interesting.
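
For reference, the standard form of the corrected result is Berry's geometric phase: when the parameters $\mathbf R$ are carried around a closed loop $C$, an eigenstate $|n(\mathbf R)\rangle$ acquires, beyond the usual dynamical phase, the extra phase $$\gamma_n(C) = i\oint_C \langle n(\mathbf R)\,|\,\nabla_{\mathbf R}\,n(\mathbf R)\rangle\cdot d\mathbf R.$$ This line integral need not vanish precisely when $C$ encloses a degeneracy, the singularity alluded to above.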


First thing that comes to mind: not every smooth function is equal to its Taylor series over the series' region of convergence. As a counterexample, consider the function $$ f(x) = \begin{cases} 0 & x=0\\ \exp\left(-\frac1{x^2}\right) & x\neq0 \end{cases} $$ Every derivative of $f$ vanishes at $x=0$, so its Taylor series centered at $x=0$ is simply $0$, with an infinite radius of convergence; yet $f(x)\neq0$ for every $x\neq0$, so $f$ does not equal its Taylor series anywhere except at the center.
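
A quick sanity check of this claim, sketched with SymPy (the bound of five derivatives is arbitrary), shows the derivatives of the $x\neq0$ branch all tending to $0$ at the origin:

```python
from sympy import symbols, exp, diff, limit

x = symbols('x')
f = exp(-1/x**2)  # the x != 0 branch of the counterexample

# Each derivative of f is a rational function times exp(-1/x**2),
# and the exponential decay dominates: every limit printed below
# is 0, so every Taylor coefficient of f at the origin vanishes.
for n in range(5):
    print(n, limit(diff(f, x, n), x, 0))
```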


I think the simplest example is the answer to the question:

Are there more rational numbers or natural numbers, or are there equally many of each?

Intuition says that "of course there are many, many more rationals". However, a rigorous mathematical proof shows that there are exactly as many of each: the rationals can be put in one-to-one correspondence with the naturals.
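
One way to make that correspondence concrete is to list every positive rational exactly once; here is a small Python sketch using the Calkin–Wilf sequence (one of several standard enumerations, not the only one):

```python
from fractions import Fraction

def rationals():
    """Yield every positive rational exactly once, in Calkin-Wilf
    order, pairing each with the natural number giving its position."""
    q = Fraction(1, 1)
    while True:
        yield q
        # Calkin-Wilf recurrence: q -> 1 / (2*floor(q) - q + 1)
        q = 1 / (2 * (q.numerator // q.denominator) - q + 1)

gen = rationals()
print([str(next(gen)) for _ in range(10)])
# ['1', '1/2', '2', '1/3', '3/2', '2/3', '3', '1/4', '4/3', '3/5']
```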


The one that currently bugs me is that exponentiation is Diophantine.

This means that there exists an integer polynomial (so, no variables in the exponents) $P(x,y,z,w_1,\dots,w_n)$ such that:

$$\forall x,y,z\in\mathbb N\,\left(z=x^y \iff \exists w_1,\dots,w_n\in\mathbb N\,\left(0=P(x,y,z,w_1,\dots,w_n)\right)\right)$$
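
(To unpack the definition with a toy example: the relation $x\le y$ is Diophantine, witnessed by the polynomial $P(x,y,w)=x+w-y$, since $x\le y$ holds exactly when some $w\in\mathbb N$ satisfies $0=x+w-y$. The astonishing part is that the exponential relation admits such a polynomial too, though the known constructions, typically via Pell equations, are vastly more involved.)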

I've read the proof. I believe the proof is correct. I still don't instinctively believe the result.

One of the surprising consequences is that first-order number theory only needs multiplication and addition: you can still answer questions about exponentiation using the above polynomial.