When to use the contrapositive to prove a statement

My question is about the intuition for, and the situations in which, using the contrapositive to prove a mathematical statement is a promising approach.

Whenever we have a mathematical statement of the form $A \implies B$, we can always try to prove the contrapositive instead, i.e. $\neg B \implies \neg A$. However, what I find interesting to think about is: when should this approach look promising? When is it a good idea to reach for the contrapositive? What intuition suggests that proving $A \implies B$ directly might be harder than proving its contrapositive?

I am looking for a set of guidelines, intuitions, or heuristics that suggest when trying the contrapositive might be a good idea.

Some problems have structures that make it more "obvious" to try induction or contradiction. For example, when a recursion is involved or something iterates, induction is often a natural way to approach the problem. Or when some mathematical object has property X, assuming it doesn't have property X can seem promising because the assumption might lead to a contradiction. So I was wondering whether there is something analogous for proving statements via the contrapositive.

I was wondering if the community had a good set of heuristics for when they thought it could be a good idea to use the contrapositive in a proof.

Also, this question might benefit from some simple but very insightful examples that show why the negation can be easier to prove. I know that this intuition is usually gained from experience, so providing good, solid examples would be a great way to contribute. Just don't forget to say what type of intuition your examples are meant to teach!

Note that I am not expecting a fully general, magical algorithm, since an algorithm like that could probably be used to automate proving, which might imply something big like $P = NP$. (Obviously a proof of P vs NP is always interesting, but I think that asking the community to settle P vs NP is not a realistic thing to ask...)


Solution 1:

Contraposition is often helpful when an implication has multiple hypotheses, or when the hypothesis specifies multiple objects (perhaps infinitely many).

As a simple (and arguably artificial) example, compare, for $x$ a real number:

1(a). If $x^4 - x^3 + x^2 \neq 1$, then $x \neq 1$. (Not easy to see without implicit contraposition?)

1(b). If $x = 1$, then $x^4 - x^3 + x^2 = 1$. (Immediately obvious.)

Here's a classic from elementary real analysis, with $x$ again standing for a real number:

2(a). If $|x| \leq \frac{1}{n}$ for every positive integer $n$, then $x = 0$.

This is true, but awkward to prove directly because there are effectively infinitely many hypotheses (one for each positive $n$), yet no finite number of these hypotheses implies the conclusion.

2(b). If $x \neq 0$, there exists a positive integer $n$ such that $\frac{1}{n} < |x|$.

The contrapositive, which follows immediately from the Archimedean property, requires only a strategy for showing some hypothesis is violated if the conclusion is false.
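Written out, the contrapositive argument takes a single line:

$$x \neq 0 \;\Longrightarrow\; |x| > 0 \;\Longrightarrow\; \text{there is a positive integer } n > \frac{1}{|x|} \;\Longrightarrow\; \frac{1}{n} < |x|,$$

where the existence of such an $n$ is exactly the Archimedean property.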

3(a). If $f$ is a continuous, real-valued function on $[0, 1]$ and if $$ \int_0^1 f(x) g(x)\, dx = 0\quad\text{for every continuous $g$,} $$ then $f \equiv 0$.

3(b). If $f$ is a continuous, real-valued function on $[0, 1]$ that is not identically zero, then $$ \int_0^1 f(x) g(x)\, dx \neq 0\quad\text{for some continuous $g$.} $$

Here contraposition is not especially helpful, because the specific choice $g = f$ works in either direction. That is, 3(a) has infinitely many hypotheses, but a single one of them suffices to deduce the conclusion.
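Concretely, taking $g = f$ in the hypothesis of 3(a) gives

$$\int_0^1 f(x)^2\, dx = 0 \quad\Longrightarrow\quad f \equiv 0,$$

since $f^2$ is continuous and non-negative, so a zero integral forces it to vanish identically.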

Contrast with the superficially similar-looking:

4(a). If $f$ is a continuous, real-valued function on $[0, 1]$ and if $$ \int_0^1 f(x) g(x)\, dx = 0\quad\text{for every non-negative step function $g$,} $$ then $f \equiv 0$.

4(b). If $f$ is a continuous, real-valued function on $[0, 1]$ that is not identically zero, then $$ \int_0^1 f(x) g(x)\, dx \neq 0\quad\text{for some non-negative step function $g$.} $$

(Sketch of 4(b): If $f$ is not identically $0$, there is an $x_0$ such that $f(x_0) \neq 0$, and by continuity we may take $x_0$ in the open interval $(0, 1)$. Again by continuity, there exists a $\delta > 0$ such that $[x_0 - \delta, x_0 + \delta] \subset (0, 1)$ and $|f(x)| > |f(x_0)|/2$ for all $x$ with $|x - x_0| \leq \delta$. Let $g$ be the characteristic function of $[x_0 - \delta, x_0 + \delta]$.)
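Making the final estimate explicit: $f$ is continuous and nonvanishing on $[x_0 - \delta, x_0 + \delta]$, so by the intermediate value theorem it has constant sign there, and hence no cancellation occurs in the integral:

$$\left|\int_0^1 f(x) g(x)\, dx\right| = \left|\int_{x_0 - \delta}^{x_0 + \delta} f(x)\, dx\right| \geq 2\delta \cdot \frac{|f(x_0)|}{2} = \delta\, |f(x_0)| > 0.$$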

As in 2(a), the hypothesis of 4(a) comprises infinitely many conditions, no finite subset of which suffices. Here, the contrapositive 4(b) arises naturally, because in trying to establish 4(a) directly one is all but forced to ask how the conclusion could fail.

Solution 2:

Here are some examples; I hope they help. First, an easy one.

Theorem. Let $n$ be an integer. If $n$ is even then $n^2$ is even.

Proof (outline). Let $n$ be even. Then $n=2k$ for some integer $k$, so $n^2=4k^2=2(2k^2)$, which is even.

Now try the converse by the same method.

Theorem. Let $n$ be an integer. If $n^2$ is even then $n$ is even.

Proof (attempted). Suppose that $n^2$ is even, say $n^2=2k$. Then $n=\sqrt{2k}$, and so...? This seems hopeless: $\sqrt{2k}$ does not look like an integer at all, never mind proving that it's even!

Now try proving the converse by using its contrapositive.

Theorem. Let $n$ be an integer. If $n^2$ is even then $n$ is even.

Proof. We have to prove that if $n$ is odd, then $n^2$ is odd. So suppose $n$ is odd, say $n=2k+1$ for some integer $k$; then $n^2=4k^2+4k+1=2(2k^2+2k)+1$, which is odd. Done!

I think the point here is that the attempted direct proof starts from $n^2=2k$. This implicitly gives us some information about $n$, but that information is indirect and hard to get hold of. The contrapositive begins with $n=2k+1$, which gives us very clear and usable information about $n$.
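Incidentally, this contrapositive step can be carried out mechanically in a proof assistant. Here is a sketch in Lean 4 with Mathlib, where the `contrapose!` tactic performs exactly the move described above (lemma names such as `Int.not_even_iff_odd` may differ between Mathlib versions):

```lean
import Mathlib.Tactic

-- "If n² is even then n is even", proved by passing to the contrapositive.
example (n : ℤ) (h : Even (n ^ 2)) : Even n := by
  contrapose! h
  -- The goal is now the contrapositive: from ¬Even n, show ¬Even (n ^ 2).
  rw [Int.not_even_iff_odd] at h ⊢
  obtain ⟨k, hk⟩ := h              -- n = 2k + 1 for some integer k
  -- n² = (2k+1)² = 2(2k² + 2k) + 1, so n² is odd.
  exact ⟨2 * k ^ 2 + 2 * k, by rw [hk]; ring⟩
```

The tactic turns the goal `Even n` and hypothesis `Even (n ^ 2)` into the contrapositive form in one step; the rest is the same algebra as in the proof above.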

Perhaps you could put the heuristic in the following form: "try both ways, just for a couple of steps, and see if either looks notably easier than the other."