Is this proof that $\sqrt 2$ is irrational correct?

Suppose $\sqrt 2$ were rational. Then we would have integers $a$ and $b$ with $\sqrt 2 = \frac ab$ and $a$ and $b$ relatively prime.

Since $\gcd(a,b)=1$, we have $\gcd(a^2, b^2)=1$, and the fraction $\frac{a^2}{b^2}$ is also in lowest terms.

Squaring both sides, $2 = \frac 21 = \frac{a^2}{b^2}$.

Lowest terms representations of rational numbers are unique, so we have $a^2 = 2$ and $b^2=1$.

But there is no such integer $a$, and therefore we have a contradiction and $\sqrt 2$ is irrational.

I am not interested in the pedagogical value of this purported proof; I am only interested in whether the logic is sound.


Your proof is correct. More quickly: from $\rm\:(a,b)=1,\ 2\,b^2 = a^2\:$ we deduce by Euclid's Lemma (EL) that $\rm\:b\:|\:a^2\:\Rightarrow\:b\:|\:a\:$ (i.e. $\rm\:(a,b)=1,\ b\:|\:ac\:\Rightarrow\:b\:|\:c,\:$ here with $\rm\:c=a$). Therefore $\rm\:\sqrt{2} = a/b\in \mathbb Z,\:$ contradicting $1 < \sqrt{2} < 2$.

In your proof, the only "uniqueness" result one needs about reduced fractions is the following:

$\rm{\bf Lemma}\ \ (a,b)=1,\ \dfrac{a}b = \dfrac{A}B\:\Rightarrow\:b\:|\:B\ \ $ Proof $\rm\ \ Ab=aB\:\Rightarrow\:b\:|\:aB\:\Rightarrow\:b\:|\:B\:$ by EL $\ $ QED
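For concreteness, a small numerical instance of the Lemma (my own illustration, not part of the original answer): $(3,4)=1$ and $\dfrac{3}{4} = \dfrac{9}{12}$, and indeed $4\:|\:12$; any fraction equal to $\frac{3}{4}$ has denominator divisible by $4$.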

$\rm Hence\quad\ \, (a,b)=1,\ \dfrac{a}b = \dfrac{2b}{a}\ \Rightarrow\:b\:|\:a\ \Rightarrow\ \sqrt{2}=\dfrac{a}b\in\mathbb Z.\ $ So you don't need $\rm\:(a^2,b^2)=1.$

The converse of the Lemma is also true, i.e. Euclid's Lemma is equivalent to this uniqueness property of reduced fractions. For more on this topic see my posts on unique fractionization.
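Sketch of the converse direction (my own spelling-out of the equivalence claimed above): assume the Lemma, and suppose $(a,b)=1$ and $b\:|\:ac$, say $ac = bd$. If $c = 0$ the conclusion $b\:|\:c$ is trivial; otherwise cross-multiplication gives
$$\frac{a}{b} = \frac{d}{c},$$
so the Lemma (with $A=d,\ B=c$) yields $b\:|\:c$, which is Euclid's Lemma.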


Hmmm ... Here is a short one.

We all know that $\sqrt{2}$ is a root of $x^2-2 = 0$.

Apply the Rational Root Theorem and you're done.
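For reference, the statement being invoked: if a polynomial $a_n x^n + \cdots + a_1 x + a_0$ with integer coefficients has a rational root $\dfrac{p}{q}$ in lowest terms, then $p \:|\: a_0$ and $q \:|\: a_n$.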

It goes something like this: if $\sqrt{2} = \dfrac{p}{q}$ in lowest terms, then $q$ must divide the leading coefficient $1$, so $q=1$, and $p$ must divide the constant term $2$. Hence the only possible rational roots of $x^2-2=0$ are $-2,-1,1,2$.

A quick check shows that this is not the case. Therefore $\sqrt{2}$ is irrational.
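Spelling that check out: $(\pm 1)^2 - 2 = -1 \neq 0$ and $(\pm 2)^2 - 2 = 2 \neq 0$, so none of the four candidates is a root, and $x^2-2=0$ has no rational root at all.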

A bit further now: using the same theorem, show that $\sqrt{n^2+1}$ is irrational for every integer $n \ge 1$.


Here's a proof that a rational algebraic integer is an integer, using matrices. An algebraic integer occurs as an eigenvalue of a matrix with integer entries.

Let $A$ be an $n \times n$ matrix with integer entries, and suppose that $Av = \lambda v$ for some rational number $\lambda$, where $v$ is a non-zero rational column vector. Multiplying through by a suitable integer, we may suppose that $v$ has integer entries; dividing through if necessary, we may further suppose that the entries of $v$ have no common divisor greater than $1$. Write $\lambda = \frac{r}{s}$, where $r$ and $s$ are relatively prime integers. Then $sAv = rv$. Every entry of the column vector on the left is an integer multiple of $s$. On the other hand, since the entries of $v$ have gcd $1$, the gcd of the entries of the column vector on the right is $|r|$ (more precisely, we may write $r$ as an integer combination of the entries of $rv$). Hence $s$ divides $r$. But ${\rm gcd}(r,s) = 1$, so $s = \pm 1$, and $\lambda$ is an integer.

(Background: an algebraic integer $\alpha$ is a root of a monic polynomial with integer coefficients. There is a unique such polynomial of minimal degree, called the minimum polynomial of $\alpha$. Then $\alpha$ is an eigenvalue of the companion matrix of its minimum polynomial.)
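To connect this back to the question (a worked instance of my own, not part of the original answer): $\sqrt{2}$ is a root of the monic polynomial $x^2 - 2$, so it is an eigenvalue of the integer companion matrix
$$A = \begin{pmatrix} 0 & 2 \\ 1 & 0 \end{pmatrix}, \qquad \det(xI - A) = x^2 - 2 ,$$
hence an algebraic integer. If $\sqrt{2}$ were rational, the argument above would force it to be an integer, which is impossible since $1 < \sqrt{2} < 2$. Therefore $\sqrt{2}$ is irrational.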