Why some people don't like proofs by contradiction [duplicate]

Solution 1:

I find it harder to read and proofread proofs by contradiction for the following reason: In an ordinary proof, one is trying to show $P \implies Q$. When I read it, I will have in my head a few examples of different things that obey $P$, and I'll check that each step in the proof is consistent with my examples. Hopefully, the last line of the proof will be "$Q$", and everything will work.

In a proof by contradiction, we start out assuming $P \wedge NOT(Q)$. If the theorem is true, there are no examples of things which obey $P$ and not $Q$, so I can't think of any examples.

Note that this problem does not arise in proofs which start out by assuming $NOT(Q)$ and deduce $NOT(P)$ at the end, since I can think of examples of things which obey $NOT(Q)$. I have taken to starting these proofs by writing "We establish the contrapositive, that $NOT(Q)$ implies $NOT(P)$" rather than my standard "This proof proceeds by contradiction. Assume...". My hope is that this will help my readers understand that there are still examples available for them to think about.

Solution 2:

Philosophical issues aside, a point that's been left out: what's better when proving need not be what's better when writing down the proof.

You say "For me it seemed so natural to argue by contradiction…", and you're right: yes, it is often easier when proving to argue by contradiction, but the version without contradiction is often easier to read (and therefore better to write).

When you're trying to prove $P \implies Q$, if you prove by contradiction, then you get to assume both $P$ and $\lnot Q$ before exploring their consequences, so in principle it's never worse than assuming $P$ alone: proving by contradiction never hurts. But after you've finished the proof, it's worth going over your proof and seeing if it can be written directly.

That is, the issue (stylistically speaking) is not so much with proof by contradiction per se, but with writing a proof in terms of contradiction when it's not necessary or helpful. ("Proof by unnecessary contradiction".) See e.g. Mathematical Writing (by Donald Knuth, Tracy Larrabee, and Paul M. Roberts) for this (emphasis added):

The proof above actually commits another sin against mathematical exposition, namely the unnecessary use of proof by contradiction. It would have been better to use a direct proof…

Sometimes (often) the direct proof may be easier to read and more illuminating, and sometimes it may not. It is always worth considering both versions and choosing the one that's easier to read. You owe it to your readers to think carefully about how best to organise your proof.

Solution 3:

To take a simple example, suppose you wanted to prove that $9$ is a composite number.

You know from Fermat's little theorem that $a^p-a$ is divisible by $p$ if $p$ is prime; but $2^9-2 = 510$ is not divisible by $9$, so by contradiction $9$ is not prime.

This is a fully logical proof which (combined with the fact that $9$ is not a unit) answers the question, but I personally would regard showing directly that $9 = 3 \times 3$ as more informative.
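The arithmetic above can be checked mechanically: the contrapositive of Fermat's little theorem says that if $a^p - a$ leaves a nonzero remainder modulo $p$, then $p$ cannot be prime. A minimal sketch in Python (the function name `fermat_witness` is mine, not from the answer):

```python
def fermat_witness(a: int, p: int) -> bool:
    """True if a witnesses that p is not prime.

    By Fermat's little theorem, a^p is congruent to a (mod p) whenever p is
    prime, so a nonzero remainder of a^p - a proves p is not prime -- note
    that, just as in the answer, this exhibits no factor of p.
    """
    return (pow(a, p, p) - a) % p != 0  # pow(a, p, p) computes a^p mod p

print(2**9 - 2)              # 510
print(fermat_witness(2, 9))  # True: 9 is not prime
print(fermat_witness(2, 7))  # False: no witness, consistent with 7 being prime
```

This mirrors the indirect character of the proof: the test certifies compositeness (for non-units) without ever producing the factorization $9 = 3 \times 3$.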

Solution 4:

I should elaborate on what I mean by weaker logics being more applicable.

Most mathematicians do believe that the world of mathematics is governed by the laws of classical logic. But there are worlds within mathematics itself which are not: although we may study such a mathematical universe using a classical background, the internal laws may not be classical. You might be dismissive and say that non-classical logics do not show up in ‘real’ mathematics, but the fact is that they do, albeit in somewhat disguised form. I give three examples:

  1. The Curry–Howard correspondence between the type theory of computation and intuitionistic logic. In essence, if you can prove a propositional formula in intuitionistic propositional calculus, then the proof can be interpreted as a computer program, and the logical formulae as type declarations for the program; conversely, if you have a computer program, then its type declarations can be interpreted as logical formulae, and the program itself is a proof.

  2. The algebra of open sets in a topological space forms a complete Heyting algebra: in particular, it is a model of intuitionistic propositional logic. A particularly memorable application of this is Kuratowski's closure–complement problem: indeed, if we denote by $\lnot A$ the interior of the complement of an open subset $A$ of a topological space $X$, then the fact that $\lnot \lnot \lnot A = \lnot A$ is nothing more than a special case of the general fact that $\lnot \lnot \lnot p$ and $\lnot p$ are equivalent in intuitionistic logic.

  3. Generalising the previous example, a remarkable discovery of the mid-20th century showed that much of mathematics is interpretable in the categories of sheaves on a topological space. Such categories are known as toposes. The category of sets is, of course, a topos, but it is just one of many. The theory of rings can be interpreted in any topos (or, indeed, any category with a terminal object and finite products), and such an interpretation is known as a ring object. A ring object in the category of sets is just a ring as we know it, a ring object in the category of topological spaces is a topological ring, and so on.

    What about a ring object in a category of sheaves? Well, there, a ring object is the same thing as a sheaf of rings. This means that we can treat, say, the structure sheaf of a scheme as essentially the same thing as an exotic ring, and anything we can prove about rings will also be true of the sheaf, interpreted appropriately. With one catch: the proofs must be intuitionistically valid, because the internal logic of a topos is in general intuitionistic. (The internal logics of more general categories are even less pleasant.) In particular, double negation elimination is invalid, so any theorems proven by contradiction become suspect (but not necessarily untrue).
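The Curry–Howard correspondence in example 1 can be glimpsed even in Python's type hints: a total function inhabiting a function type acts as a proof of the corresponding implication. A rough sketch (the names are mine, and Python does not check these types at runtime, so this only gestures at the correspondence):

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

# A term of type (A -> B) x (B -> C) -> (A -> C) is a proof that implication
# is transitive: from A implies B and B implies C, conclude A implies C.
def transitivity(f: Callable[[A], B], g: Callable[[B], C]) -> Callable[[A], C]:
    return lambda a: g(f(a))

# Instantiating the "proof" at concrete types int -> str -> int:
proof = transitivity(str, len)
print(proof(12345))  # 5, the length of "12345"
```

The body of `transitivity` is forced by its type: function composition is the only way to build the required term, which is exactly the sense in which the program *is* the proof.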
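The Heyting-algebra law in example 2 can be verified exhaustively on a finite topological space, where $\lnot A$ (the interior of the complement) is the union of all open sets disjoint from $A$. A toy sketch using a hypothetical three-point space of my own choosing:

```python
# A tiny topological space: X = {0, 1, 2} with a chain of open sets.
X = frozenset({0, 1, 2})
opens = [frozenset(), frozenset({0}), frozenset({0, 1}), X]

def heyting_not(a: frozenset) -> frozenset:
    # not-A = interior of the complement = union of all opens disjoint from A
    result = frozenset()
    for u in opens:
        if not (u & a):
            result |= u
    return result

# Triple negation collapses to single negation, as in intuitionistic logic:
for a in opens:
    assert heyting_not(heyting_not(heyting_not(a))) == heyting_not(a)

# But double negation elimination fails: not-not-{0} is all of X, not {0}.
print(heyting_not(heyting_not(frozenset({0}))))  # frozenset({0, 1, 2})
```

The failing case is instructive: $\lnot\{0\} = \varnothing$ because no nonempty open set avoids $0$, and $\lnot\varnothing = X$, so $\lnot\lnot\{0\} = X \neq \{0\}$ — a concrete open-set witness that double negation elimination is not intuitionistically valid.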
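The ring-object idea in example 3 has a down-to-earth shadow: lifting a ring's operations pointwise makes the functions on any set into a ring, which is a baby version of how sections of a sheaf of rings over an open set form a ring. A toy illustration (using $\mathbb{Z}/6$ as the ring and a ten-point "open set", both my choices; this is not actual sheaf machinery):

```python
# Lift a binary operation on a ring to functions, pointwise.
def pointwise(op):
    return lambda f, g: (lambda u: op(f(u), g(u)))

add = pointwise(lambda x, y: (x + y) % 6)  # addition in Z/6
mul = pointwise(lambda x, y: (x * y) % 6)  # multiplication in Z/6

# Three "sections" over U = range(10):
f = lambda u: u % 6
g = lambda u: (2 * u) % 6
h = lambda u: (u * u) % 6

# Ring identities hold pointwise because they hold in Z/6, e.g. distributivity:
U = range(10)
assert all(mul(f, add(g, h))(u) == add(mul(f, g), mul(f, h))(u) for u in U)
```

Any identity provable for the base ring transfers automatically to the function ring, which is the (very mild) set-level analogue of theorems about rings holding for sheaves of rings — the subtlety in the answer is that for genuine sheaves the transferred proofs must be intuitionistically valid.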