Examples of famous problems resolved easily

Have there been examples of seemingly long-standing hard problems that were answered quite easily, possibly with tools that already existed when the problems were posed? More modern examples would be nice. One example could be Hilbert's basis theorem. Another could be Dwork's $p$-adic technique for proving the rationality of zeta functions of algebraic varieties.


Solution 1:

I'm sure there are loads. One that comes to mind is the Seven Bridges of Königsberg problem.

http://en.wikipedia.org/wiki/Seven_Bridges_of_K%C3%B6nigsberg

[Illustration of the Seven Bridges of Königsberg, from Wikipedia]

Basically, the problem asked: is there a walk through the city that crosses every bridge once and only once? (You can't do things like walk halfway onto a bridge and turn back.)

It was suspected, or at least conjectured, that the problem had no solution, basically because nobody had found such a walk and nobody could prove whether one existed or not.

Until Euler, of course. As usual, it is Euler who saves the day. He showed that there is no answer to this problem: it is impossible to construct a walk that crosses every bridge exactly once.

The justification ended up being really simple. First, you have to put the problem in terms of the land areas of the town rather than the bridges. From the illustration, you can see that the town splits into four regions: the top, the bottom, the right, and the center. Next, you have to realize that in order not to reuse a bridge, every region must have an even number of bridges (each entry bridge paired with an exit bridge). If a region has an odd number of bridges, you eventually get stuck in it: with just one bridge you use it and are stuck; with three bridges you go in through one, come out through the second, and are stuck as soon as you use the third. But if you look closely, every region of Königsberg has an odd number of bridges: 3 connecting to the top region, 5 to the center, 3 to the right, and 3 to the bottom.

From this you get the idea of Eulerian circuits, from there a host of properties of graphs, and so graph theory was born. And if it were not for graph theory we'd be lost when it comes to telephone networks and the internet. So, yeah, a long-standing problem with a simple solution and with even longer-standing benefits.

Acknowledgment: About the problem, another possibility is that exactly two regions have an odd number of bridges while all the rest have an even number. The logic is that one of those two regions would be your starting point and the other your finishing point. In that case you get an Eulerian path instead of a circuit.
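To make the degree count concrete, here is a small Python sketch of my own (the region names are just labels for the four land masses described above, not anything official):

```python
from collections import Counter

# The Königsberg multigraph: four land masses joined by seven bridges.
bridges = [
    ("Top", "Center"), ("Top", "Center"),        # two bridges from the top bank to the island
    ("Bottom", "Center"), ("Bottom", "Center"),  # two from the bottom bank
    ("Top", "Right"),
    ("Bottom", "Right"),
    ("Center", "Right"),
]

degree = Counter()
for u, v in bridges:
    degree[u] += 1
    degree[v] += 1

odd_regions = [r for r, d in degree.items() if d % 2 == 1]
print(dict(degree))          # {'Top': 3, 'Center': 5, 'Bottom': 3, 'Right': 3}

# Euler's criterion: a connected multigraph has a walk using every edge exactly
# once iff it has zero vertices of odd degree (a circuit) or exactly two (a path).
print("Walk possible:", len(odd_regions) in (0, 2))   # False: all four regions are odd
```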

Solution 2:

I don't know how old or how famous Tarski's Unique Square Root Problem was when the young László Lovász solved it. Certainly Tarski was very famous, and I'm sure he thought the problem was quite hard. Lovász's solution, while brilliant, was surely much shorter and simpler than Tarski or anyone else had expected.

Tarski's Problem. If two finite structures (groups, rings, lattices, graphs, whatever) have isomorphic squares, must they be isomorphic? (The square of a structure $A$ is the direct product $A\times A$.)

Lovász solved Tarski's problem in the affirmative with a clever counting argument, which I will try to reproduce since I'm having trouble finding a good reference. I assume that all structures are finite structures of a fixed type, e.g., finite groups. Let $A,B$ be two structures such that $A\times A\cong B\times B$.

For any structures $X$ and $Y$, let $h(X,Y)$ be the number of homomorphisms, and $i(X,Y)$ the number of injective homomorphisms, from $X$ to $Y$.

Claim 1. For any structure $X$, we have $h(X,A)=h(X,B)$.

Proof. A homomorphism into a direct product is the same thing as a pair of homomorphisms into the factors, so $h(X,A)^2=h(X,A^2)=h(X,B^2)=h(X,B)^2$, and since these are nonnegative integers, $h(X,A)=h(X,B)$.

Claim 2. For any structure $X$, we have $i(X,A)=i(X,B)$.

Waving of Hands. We can use the inclusion-exclusion principle to express $i(X,A)$ in terms of values of $h(Y,A)$ where $Y$ ranges over quotients of $X$. (I think I could write a correct proof of this, but it would take some laborious typing. Anyway, it's obvious, isn't it?) Since $h(Y,A)=h(Y,B)$ by Claim 1, it follows that $i(X,A)=i(X,B)$.
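For what it's worth, here is one way the hand-waving is usually made precise, at least for algebras such as groups or rings; this is a sketch under that assumption, not necessarily the formulation in Lovász's paper. Every homomorphism $f:X\to A$ factors uniquely as the quotient map $X\to X/\theta$, where $\theta=\ker f$ is a congruence on $X$, followed by an injective homomorphism $X/\theta\to A$. Hence
$$h(X,A)=\sum_{\theta\in\operatorname{Con}(X)} i(X/\theta,\,A),$$
the sum running over the finitely many congruences of $X$. Isolating the term for the trivial congruence $\Delta$ (where $X/\Delta\cong X$) gives $i(X,A)=h(X,A)-\sum_{\theta\neq\Delta} i(X/\theta,A)$, and every $X/\theta$ with $\theta\neq\Delta$ is strictly smaller than $X$, so induction on $|X|$ together with Claim 1 yields $i(X,A)=i(X,B)$.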

Now we have $i(A,B)=i(B,B)\ge1$ and $i(B,A)=i(A,A)\ge1$, i.e., we have injective homomorphisms $f:A\to B$ and $g:B\to A$. Since the structures are finite, both maps are bijections; some power of the bijective endomorphism $g\circ f$ is the identity, so $f^{-1}=(g\circ f)^{\,n-1}\circ g$ is itself a homomorphism, and $f$ is an isomorphism. Hence $A\cong B$.
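The product identity driving Claim 1 is easy to check by brute force on small examples. Here is a Python sketch of my own (counting homomorphisms of finite directed graphs, chosen only because they are easy to encode); it is an illustration, not part of Lovász's argument:

```python
from itertools import product

# A finite directed graph is (number_of_vertices, set_of_arcs).

def is_hom(f, X, A):
    """f assigns a vertex of A to each vertex of X; check arcs map to arcs."""
    _, arcs_X = X
    _, arcs_A = A
    return all((f[u], f[v]) in arcs_A for (u, v) in arcs_X)

def hom_count(X, A):
    """h(X, A): count all homomorphisms from X to A by exhaustive search."""
    n_X, _ = X
    n_A, _ = A
    return sum(is_hom(f, X, A) for f in product(range(n_A), repeat=n_X))

def direct_product(A, B):
    """Direct (categorical) product: arcs go componentwise."""
    n_A, arcs_A = A
    n_B, arcs_B = B
    index = {(a, b): i for i, (a, b) in
             enumerate((a, b) for a in range(n_A) for b in range(n_B))}
    arcs = {(index[(a1, b1)], index[(a2, b2)])
            for (a1, a2) in arcs_A for (b1, b2) in arcs_B}
    return (n_A * n_B, arcs)

X = (2, {(0, 1)})                      # a single arc
A = (3, {(0, 1), (1, 2), (2, 0)})      # directed 3-cycle
B = (2, {(0, 0), (0, 1)})              # a loop plus an arc

# A homomorphism into a product is exactly a pair of homomorphisms into the factors:
assert hom_count(X, direct_product(A, B)) == hom_count(X, A) * hom_count(X, B)
print(hom_count(X, A), hom_count(X, B), hom_count(X, direct_product(A, B)))  # 3 2 6
```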

I learned this argument by word of mouth, attributed to Lovász, several decades ago. This paper (which I haven't read) seems to be the source: L. Lovász, Operations with structures, Acta Math. Acad. Sci. Hungar. 18 (1967) 321-328; see Theorem 4.1 on p. 327, which appears to be a generalized version of the Unique Square Root Theorem.

Solution 3:

I like the establishment of the independence of Euclid's 5th postulate from the others as the winner in this category. It's arguably the first long-standing problem of pure mathematics, and maybe the oldest example we have of a mathematical community posing a problem to itself and investigating it collaboratively. Perhaps some leniency is required here as to how accessible the "methods" by which it was eventually resolved really would have been to the Greeks, but in the spirit of positive historico-imaginationism I'm going to say they could have done it. I have no doubt Archimedes could have accomplished much of the necessary editing and improvement in rigor of Euclid's Elements, had he been of a mindset and motivation to do so. The harder ingredient is a brilliant mathematician with enough childish instinct to go the distance and say "I don't care if Euclid's axioms are self-evident truths of your world. This is MY world, where dragons are real, my sword is magic, and parallel lines can intersect each other." Who knows. Had the Fates granted Plato a star student interested in mathematics (instead of the metaphysician he got in Aristotle)...

Solution 4:

The Steiner-Lehmus theorem is a good one: any triangle that has two equal angle bisectors (each measured from a vertex to the opposite side) is isosceles.

To quote from "Geometry Revisited", by Coxeter and Greitzer: This theorem was sent to the great Swiss geometer Jacob Steiner in 1840 by C.L. Lehmus (whose name would otherwise have been forgotten long ago) with a request for a pure geometrical proof. Steiner gave a fairly complicated proof which inspired many other people to look for simpler methods. Papers on the Steiner-Lehmus theorem appeared in various journals in 1842, 1844, 1848, almost every year from 1854 till 1864, and with a good deal of regularity during the next hundred years.

It turns out that there is a straightforward proof when the statement is converted to its contrapositive form, devised by two English engineers, Gilbert and Macdonnell, in 1963: if in $\Delta ABC$ we have $B \neq C$, then $BM \neq CN$ (see the first diagram below). The proof is given in Geometry Revisited; this is just an outline, but it rests on an easy lemma: if a triangle has two different angles, then the smaller angle has the longer internal bisector.

[Diagram: triangle $ABC$ with the angle bisectors $BM$ and $CN$]

One direct proof involves solving a complicated algebraic equation based on the formula for the length of a triangle's internal angle bisector; see below.

The equation $BM = CN$ implies
$$ca\left[1-\left(\frac b{c+a}\right)^2\right]=ab\left[1-\left(\frac c{a+b}\right)^2\right].$$
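As a quick sanity check (my own, not from Coxeter and Greitzer), one can scan triangles numerically using the standard bisector-length formula $t_b^2 = ca\bigl[1-\bigl(\tfrac b{c+a}\bigr)^2\bigr]$ and confirm both the theorem and the lemma above:

```python
import itertools, math

def bisector_sq(adj1, adj2, opp):
    """Squared length of the internal bisector drawn from the vertex whose
    adjacent sides are adj1, adj2 and whose opposite side is opp."""
    return adj1 * adj2 * (1 - (opp / (adj1 + adj2)) ** 2)

for a, b, c in itertools.product(range(1, 13), repeat=3):
    if a + b <= c or b + c <= a or c + a <= b:
        continue                       # skip side lengths that form no triangle
    t_b_sq = bisector_sq(c, a, b)      # bisector from B (opposite side b)
    t_c_sq = bisector_sq(a, b, c)      # bisector from C (opposite side c)
    # Steiner-Lehmus: equal bisectors occur only in the isosceles case b = c.
    assert math.isclose(t_b_sq, t_c_sq) == (b == c)
    # The lemma: the smaller angle (i.e. smaller opposite side) has the longer bisector.
    if b != c:
        assert (t_b_sq > t_c_sq) == (b < c)
print("checked all integer-sided triangles with sides up to 12")
```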

To summarize, the theorem had a simple proof available using the methods of its time (the 19th century), yet it was still regarded as difficult well into the 20th century!


Solution 5:

From least likely to fit your criteria to most, in my opinion:

Halting problem
Special cases of Fermat's last theorem ($n = 3$, $n = 4$, ...)
These problems weren't around for very long, but their solutions are fairly easy to understand after you've seen them; the diagonal argument behind the halting problem is sketched below.
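A minimal sketch of that diagonal argument in Python, assuming a hypothetical oracle `halts(program, argument)`; the point is precisely that no such total, correct function can exist:

```python
def halts(program, argument):
    """Hypothetical oracle: True if program(argument) eventually halts.
    Assumed for the sake of contradiction; it cannot actually be implemented."""
    raise NotImplementedError

def diagonal(program):
    """Do the opposite of whatever the oracle predicts about a program run on itself."""
    if halts(program, program):
        while True:      # oracle says it halts, so loop forever
            pass
    return               # oracle says it loops, so halt immediately

# Now ask: does diagonal(diagonal) halt?
#   If halts(diagonal, diagonal) is True,  diagonal(diagonal) loops forever.
#   If halts(diagonal, diagonal) is False, diagonal(diagonal) halts.
# Either way the oracle is wrong about diagonal, so no such oracle exists.
```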

Polynomial time primality
This problem has been around forever, but the solution is only "relatively" easy, in that it's simple for those who study number-theoretic algorithms; it isn't introductory material. It is, however, fairly modern.

Sum of inverse squares
$\sum_{k = 1}^{\infty} {\frac 1 {k^2}} = \text{???}$

Definitely a winner here. It had been known for a long time that the sum converged to something finite, but no one had any clue what. The solution evaded the best mathematicians for generations until Euler came down from heaven and told us.

Edit: From "A History of PI" by Petr Beckmann
$\begin{align} \sin(x) &= x - \frac {x^3} {3!} + \frac {x^5} {5!} - \frac {x^7} {7!} \cdots \\ &= x \cdot (1 - \frac {x^2} {3!} + \frac {x^4} {5!} - \frac {x^6} {7!} \cdots)\end{align}$

Let $y = x^2$ and, for $x \ne 0$, divide $\sin(x) = 0$ through by $x$:
$0 = 1 - \frac y {3!} + \frac {y^2} {5!} - \frac {y^3} {7!} \cdots \tag{P}$

We know roots of $\sin(x)$ are $0, \pm \pi, \pm 2 \pi, \cdots$, so the roots of $(P)$ are $(\pi)^2, (2 \pi)^2, (3 \pi)^2 \cdots$.

By the theory of equations, the sum of the reciprocals of the roots of $(P)$ is the negative of the coefficient of the linear term (the constant term being $1$). So $$\frac 1 {\pi^2} + \frac 1 {(2 \pi)^2} + \frac 1 {(3 \pi)^2} + \cdots = \frac 1 {3!}$$

Multiplying through by $\pi^2$:
$$\frac 1 {1^2} + \frac 1 {2^2} + \frac 1 {3^2} + \cdots = \frac {\pi ^2} 6$$
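A quick numerical check of Euler's value (just an illustration, nothing from Beckmann's book):

```python
import math

target = math.pi ** 2 / 6
for n in (10, 1_000, 100_000):
    partial = sum(1 / k ** 2 for k in range(1, n + 1))
    # the tail of the series behaves like 1/n, so the error shrinks accordingly
    print(n, partial, target - partial)
```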


There is a property of polynomials of finite degree: if

$$0 = a_0 + a_1x + a_2x^2 + \cdots + a_nx^n = a_n(x - r_1)(x - r_2)\cdots(x - r_n)$$

with $a_0 \ne 0$ (so that no root is zero), then

$$-\frac {a_1} {a_0} = \frac 1 {r_1} + \frac 1 {r_2} + \cdots + \frac 1 {r_n}$$

It is a major leap in reasoning to assume that this remains true for infinite series, but I think this is what the author was getting at.
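A small Python check of the finite identity on a polynomial with known roots (roots $1, 2, 4$, my own example):

```python
roots = [1, 2, 4]

# Expand (x - r_1)(x - r_2)(x - r_3), keeping coefficients lowest degree first.
coeffs = [1]
for r in roots:
    new = [0] * (len(coeffs) + 1)
    for i, c in enumerate(coeffs):
        new[i] += -r * c     # contribution of the (-r) factor
        new[i + 1] += c      # contribution of the x factor
    coeffs = new

a0, a1 = coeffs[0], coeffs[1]
print(coeffs)                        # [-8, 14, -7, 1], i.e. x^3 - 7x^2 + 14x - 8
print(-a1 / a0)                      # 1.75
print(sum(1 / r for r in roots))     # 1.75 = 1 + 1/2 + 1/4
```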