Why do statements which appear elementary have complicated proofs?

The motivation for this question is a problem I posed earlier (rationals of the form $\frac{p}{q}$ where $p,q$ are primes in $[a,b]$), along with some other problems in mathematics that look elementary but whose proofs are very sophisticated.

I would like to consider two famous questions: first, Fermat's Last Theorem, and second, the unproven Goldbach conjecture. These questions appear elementary in nature, but even comprehending a solution (where one exists) requires a lot of mathematics. Even the problem I posed in the link above is elementary, yet I have not seen anyone give a proof that avoids the prime number theorem.

Now the question is: why is this happening? If I am able to understand the question, should I not be able to comprehend the solution as well? A mathematician once said: mathematics is the understanding of how nature works. Is nature so complicated that a common person can't understand how it works, or are we the ones making it complicated?

At the same time, I appreciate the beauty of mathematics: Paul Erdős' proof of Bertrand's postulate is something I admire greatly because of its elementary nature. But I still have my skepticism about FLT and other such theorems.

I have stated two examples of questions which appear elementary but whose proofs are intricate. I know some other problems in number theory of this type. Are there any other problems of this type which are not number-theoretical? If so, I would like to see some of them.


Simple questions often have complex answers, or no answers at all.

If you were to ask me why the sky was blue, to give a complete answer I would have to describe the heliocentric model, the earth's atmosphere, and the electromagnetic spectrum.

If you asked me how my computer connected to the internet, the answer would take a considerable amount of time to explain.

If you asked me whether there was life on other planets, I wouldn't be able to give an answer; we just don't know.

Often, it is impossible to give a simple answer to a simple question. This is true in any field you might encounter, from the sciences to the humanities. And it is true in mathematics.

Mathematics allows us to formalize our questions within an axiomatic structure. It lets us ask our questions more precisely. But it in no way guarantees that the simplicity of a question will translate into simplicity of its answer.

Some other simple problems which cannot be answered simply:

The Four color theorem states that any map made up of contiguous regions can be colored with four colors such that each region gets one color and no two adjacent regions get the same color. The proof is quite complex and required the use of a computer.

Scheinerman's conjecture is the conjecture that "every planar graph is the intersection graph of a set of line segments in the plane" (sourced from http://en.wikipedia.org/wiki/Scheinerman%27s_conjecture). However, the proof was only completed in 2009, and is fairly difficult.


I suppose there is a way to formalize your question into a relatively precise one. Let $P$ be the set of all propositions in the language of Zermelo–Fraenkel set theory for which there exist proofs.

Given a natural number $k$, we want an upper bound $l(k)$ on the minimal length of a proof of any proposition $p$ from $P$ whose length is at most $k$; that is, every provable proposition of length at most $k$ should have a proof of length at most $l(k)$. Here I'm using "length" in the sense of the number of ASCII characters it would take to write the proposition (or the proof of the proposition, respectively).

Presumably $l(k)$ is a non-computable function. For if $l(k)$ were computable, then given a proposition $p$ of length $k$ you'd have a "simple" procedure for deciding whether $p$ is provable in Zermelo–Fraenkel set theory. The idea would be to iterate through all strings of length at most $l(k)$ and check whether any of them is a proof of $p$; if none is, then $p$ has no proof at all. Since provability is undecidable, no such procedure can exist. :)
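The bounded-search idea can be made concrete in a toy setting. Everything below is invented for illustration and has nothing to do with ZF: a one-axiom, one-rule string-rewriting system in which a bound $l(k)$ actually *is* computable, so brute-force search over candidate proofs genuinely decides theoremhood. The point of the argument above is precisely that no such computable bound can exist for ZF.

```python
from itertools import product

# Toy formal system over the alphabet {"I", "0"} (invented for illustration):
#   axiom:  "I"
#   rule:   from s, infer s + "0"
# Its theorems are exactly "I", "I0", "I00", ...
# A "proof" is a sequence of lines, each of which is the axiom or
# follows from the immediately preceding line by the rule.

def is_proof(lines, p):
    """Check that `lines` is a valid proof whose last line is p."""
    if not lines or lines[-1] != p:
        return False
    for i, line in enumerate(lines):
        if line == "I":
            continue  # axiom
        if i > 0 and line == lines[i - 1] + "0":
            continue  # rule application
        return False
    return True

def l(k):
    """Computable length bound for this toy system: a theorem of
    length k has a proof with at most k lines, since each rule
    application adds exactly one character."""
    return k

def provable(p):
    """Decide theoremhood by brute-force search over all candidate
    proofs with at most l(len(p)) lines, mirroring the argument
    in the text."""
    bound = l(len(p))
    # Only lines of the form "I", "I0", ..., up to length len(p),
    # can appear in a minimal proof of p.
    pool = ["I" + "0" * n for n in range(len(p))]
    for n in range(1, bound + 1):
        for candidate in product(pool, repeat=n):
            if is_proof(list(candidate), p):
                return True
    return False  # no proof within the bound, hence no proof at all
```

For example, `provable("I00")` searches all line sequences of length at most 3 and finds the proof `["I", "I0", "I00"]`, while `provable("0I")` exhausts the bound and correctly reports that no proof exists. The contradiction sketched above amounts to saying that ZF admits no analogue of this `l`.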

I imagine such a statement is known to logicians. If it is, I would call it the "rationalization is hard" theorem.