Mind-blowing mathematics experiments

We've all heard of some mind-blowing phenomena involving the sciences, such as the double-slit experiment. I was wondering: are there similar experiments or phenomena which seem very counter-intuitive but can be explained using mathematics? I mean things such as the Monty Hall problem. I know it is not exactly an experiment or phenomenon (you could call it a thought experiment), but I'm after things along those lines, i.e. with connections to real life. I have stumbled across this interesting question, and these are the type of phenomena I have in mind. That question, however, only discusses differential geometry.


Solution 1:

Off the top of my head, I can think of two astonishing phenomena which were observed experimentally before they were explained mathematically.

The first is Benford's law. This was noticed in the 19th century by the astronomer Simon Newcomb, who observed that the early pages of his book of logarithmic tables were more worn than the later ones, suggesting that numbers with first digit 1 or 2 appear more frequently in nature than numbers with first digit 8 or 9. In fact, the first digits in most real-world data sets (including numbers lifted from the New York Times!) tend to follow a consistent logarithmic distribution: the leading digit is $d$ with frequency close to $\log_{10}(1 + 1/d)$. This pattern has a variety of different explanations depending on the context - it is known to happen for power law data or data coming from a variety of different distributions, for instance. Some explanations involve serious mathematics, such as the ergodic theorem.
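You can test this yourself in a few lines. The powers of $2$ are a classical sequence that provably obeys Benford's law (since $\log_{10} 2$ is irrational, $n \log_{10} 2$ is equidistributed mod $1$); here is a minimal sketch comparing their first-digit frequencies with the predicted $\log_{10}(1 + 1/d)$:

```python
import math
from collections import Counter

# First digits of 2^1, ..., 2^10000: a sequence known to obey Benford's law.
counts = Counter(int(str(2**n)[0]) for n in range(1, 10001))

for d in range(1, 10):
    observed = counts[d] / 10000
    predicted = math.log10(1 + 1 / d)  # Benford's prediction for digit d
    print(d, f"{observed:.4f}", f"{predicted:.4f}")
```

The two columns should agree closely, with digit 1 leading about 30% of the time and digit 9 under 5%.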

The second is Feigenbaum's constant. Begin with a simple one-parameter dynamical system, such as $f(x) = ax(1-x)$. To say that this is a dynamical system means that we are going to pick a value of $x$ (between $0$ and $1$), plug it into $f$, plug the output back into $f$ again, and so on. The question is: what will happen as we keep iterating this procedure? The answer depends on the constant $a$. If you start off with $a$ between $3$ and $1 + \sqrt{6}$ then you get an attracting $2$-cycle, meaning the system bounces back and forth between two points. For a range of values slightly larger than $1 + \sqrt{6}$, you get an attracting $4$-cycle, meaning the system oscillates between four points. As you keep sliding $a$ upward, you get $8$-cycles, $16$-cycles, $32$-cycles... until finally, for values of $a$ beyond roughly $3.57$, you get chaos. The question is: what is the period doubling rate, i.e. the rate at which you go from a $2^k$-cycle to a $2^{k+1}$-cycle? The period keeps doubling faster and faster, but at an asymptotically constant rate: if $a_k$ is the parameter value at which the $2^k$-cycle first appears, then the ratio $(a_k - a_{k-1})/(a_{k+1} - a_k)$ tends to a constant, about $4.6692$. The crazy thing is that you can repeat this analysis for a variety of similar dynamical systems, such as $f(x) = a - x^2$, and you get the same phenomenon: accelerating period doubling followed by chaos. The ranges of $a$ for which you see a given cycle vary from system to system, but the period doubling rate of $4.6692$ appears over and over again. Feigenbaum discovered this experimentally, and it was later proved rigorously that the same constant appears for any one-parameter dynamical system $f(x)$ with a single quadratic maximum.
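You can reproduce the constant numerically. One standard trick (sketched here with my own choice of seeds and tolerances) is to locate not the bifurcation points themselves but the superstable parameters $A_k$, at which the critical point $x = 1/2$ lies exactly on a $2^k$-cycle; there is one per period-doubling window, and their gaps shrink at the same asymptotic rate $\delta$:

```python
def iterate_from_half(a, n):
    """Apply the logistic map x -> a*x*(1-x) n times, starting at the
    critical point x = 1/2."""
    x = 0.5
    for _ in range(n):
        x = a * x * (1 - x)
    return x

def superstable(a_guess, k):
    """Newton's method (with a finite-difference derivative) for the
    parameter A_k at which x = 1/2 lies on a 2^k-cycle, i.e. a root of
    g(a) = f_a^(2^k)(1/2) - 1/2 near the initial guess."""
    n, a, h = 2 ** k, a_guess, 1e-9
    for _ in range(100):
        g = iterate_from_half(a, n) - 0.5
        dg = (iterate_from_half(a + h, n) - iterate_from_half(a - h, n)) / (2 * h)
        a -= g / dg
        if abs(g) < 1e-13:
            break
    return a

# A_0 = 2 exactly (there f(1/2) = 1/2). Seed each later search by
# extrapolating the previous gap with the expected ratio, since the
# windows shrink so quickly that Newton needs a close starting point.
A = [2.0, superstable(3.2, 1)]
for k in range(2, 11):
    A.append(superstable(A[-1] + (A[-1] - A[-2]) / 4.669, k))

for k in range(1, len(A) - 1):
    print(k, (A[k] - A[k - 1]) / (A[k + 1] - A[k]))  # -> 4.6692...
```

The printed ratios settle down toward $4.66920\ldots$ within the first several values of $k$.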

Solution 2:

If you let $a_1=a_2=a$, and $a_{n+1}=20a_n-19a_{n-1}$ for $n=2,3,\dots$, then it's obvious that you just get the constant sequence $a,a,a,\dots$. But if you try this on a calculator with, say, $a=\pi$, you find that after a few iterations you start getting very far away from $\pi$. The reason: the general solution of the recurrence is $a_n = c_1 + c_2 \cdot 19^n$, and while exact initial values force $c_2 = 0$, the slightest round-off error injects a nonzero multiple of $19^n$, which then explodes. It's a good experiment/demonstration of the accumulation of round-off error.
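A sketch of the experiment in floating point (any language will do; this uses Python doubles):

```python
import math

# Exact arithmetic gives pi, pi, pi, ...; in floating point, the tiny
# rounding error committed at each step is amplified by a factor of
# roughly 19 per iteration (the 19^n mode of the recurrence).
a_prev = a_curr = math.pi   # a_1 = a_2 = pi
for n in range(3, 21):
    a_prev, a_curr = a_curr, 20 * a_curr - 19 * a_prev
    print(n, a_curr)
```

By around $n = 15$ the printed values are visibly wrong, and a few steps later they are nowhere near $\pi$.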

Solution 3:

There's the famous Buffon Needle problem: if you drop a needle of length $a$ at random onto a floor made of parallel wooden strips of identical width $l \gt a$, what is the probability that the needle will lie across a line separating two strips? The answer, $\frac{2a}{\pi l}$, involves $\pi$, and you can actually use this process to experimentally approximate $\pi$.
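Here is a minimal Monte Carlo sketch (the parameter names and defaults are mine). By symmetry it is enough to track the distance from the needle's center to the nearest line and the acute angle the needle makes with the lines:

```python
import math
import random

def buffon_pi_estimate(n_drops, a=1.0, l=2.0):
    """Drop a needle of length a on strips of width l >= a.
    P(crossing) = 2a / (pi*l), so pi is estimated by 2a*n / (l*crossings)."""
    crossings = 0
    for _ in range(n_drops):
        y = random.uniform(0, l / 2)             # center-to-nearest-line distance
        theta = random.uniform(0, math.pi / 2)   # acute angle with the lines
        if y <= (a / 2) * math.sin(theta):       # needle reaches the line
            crossings += 1
    return 2 * a * n_drops / (l * crossings)

print(buffon_pi_estimate(1_000_000))  # typically within ~0.01 of pi
```

(Sampling $\theta$ uses $\pi$ itself, which is a bit circular; a pickier simulation would sample a random direction by rejection instead, but the estimate works the same way.)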

One that definitely blew my mind when I first heard about it was Khinchin's constant: if you take just about any real number (excluding the rationals and a few others) and write down its (unique) continued fraction expansion,

$$x = a_0 + \cfrac{1}{a_1 + \cfrac{1}{a_2 + \cfrac{1}{a_3 + \cdots}}} = [a_0; a_1, a_2, a_3, \dots]$$

and then form the geometric mean of the coefficients $a_1, \dots, a_n$,

$$\left(\,\prod_{k=1}^n a_k\right)^{1/n}$$

you can ask what happens as you take larger and larger $n$, i.e. in the limit as $n \rightarrow \infty$. The intuitive guess is that the limit depends on the $a_k$ (meaning, on the $x$ you picked). Surprisingly, it turns out that this isn't true: no matter what $x$ you pick (more precisely, for almost every $x$), this geometric mean converges to the same value, Khinchin's constant $K_0 \approx 2.685452$.

Weirdest of all, while it's really easy to check this numerically (just pick a real number at random and start computing), and while there are several different proofs that it happens almost always, it hasn't been proven for any one particular number: for example, $\pi$, $\gamma$, and even Khinchin's constant itself all seem to have this property, but no one has a proof for any of them.
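Checking it numerically is a nice exercise in itself. A sketch (the 1000-digit sample size is my own choice): model a "random real" by a rational with many random digits and run the Gauss map in exact arithmetic, so round-off never corrupts the coefficients:

```python
import math
import random
from fractions import Fraction

# A rational with 1000 random digits shares its first several hundred
# continued-fraction coefficients with the underlying "random real".
random.seed(0)
x = Fraction(random.randrange(10**1000), 10**1000)

coeffs = []
for _ in range(500):
    a = math.floor(x)      # next coefficient a_k
    coeffs.append(a)
    if x == a:             # expansion terminated (essentially never here)
        break
    x = 1 / (x - a)        # Gauss map: keep the fractional part, invert

# Geometric mean of a_1, ..., a_n (a_0 = 0 since x is in (0,1), so skip it),
# computed stably via the average of the logs.
terms = coeffs[1:]
print(math.exp(sum(math.log(a) for a in terms) / len(terms)))
```

With $500$ coefficients the output usually lands in the vicinity of $2.685$; the convergence is slow, so don't expect many correct digits.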