Let $ a, b, c, d $ be natural numbers such that $ ab=cd $. Prove that $ a^2+b^2+c^2+d^2 $ is not a prime.

I am clueless on this one. I tried contradiction, but didn't get anywhere.

Can you help?

Edit: I understand natural numbers to be strictly positive, i.e. excluding $0$.


Solution 1:

First, since $c \mid ab$, we can write $c=a'b'$ with $a' \mid a$ and $b' \mid b$: take $a'=\gcd(a,c)$ and $b'=c/a'$; then $b'$ divides $(a/a')\,b$ and is coprime to $a/a'$, so $b' \mid b$.

Since $d=\frac{ab}{c}$, we obtain

$$ a^2+b^2+c^2+d^2=\frac{(a^2+c^2)(b^2+c^2)}{c^2}=\left(\left(\frac{a}{a'}\right)^2+b'^2\right)\left(\left(\frac{b}{b'}\right)^2+a'^2\right). $$

Each of the two factors on the right is a positive integer and is at least $1+1=2$, so $a^2+b^2+c^2+d^2$ is composite.
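Just to sanity-check the identity numerically (this Python snippet is mine, not part of the proof; it uses the choice $a'=\gcd(a,c)$, $b'=c/a'$):

```python
from math import gcd

def factor_pair(a, b, c, d):
    """For ab = cd, return the two factors of N = a^2+b^2+c^2+d^2
    from the identity above, taking a' = gcd(a, c) and b' = c/a'."""
    assert a * b == c * d
    ap = gcd(a, c)               # a' divides both a and c
    bp = c // ap                 # b' = c / a'; one can show b' | b
    assert a % ap == 0 and b % bp == 0
    return (a // ap) ** 2 + bp ** 2, (b // bp) ** 2 + ap ** 2

# Example: (a, b, c, d) = (6, 2, 4, 3) has ab = cd = 12 and N = 65.
f1, f2 = factor_pair(6, 2, 4, 3)
print(f1, f2, f1 * f2)  # both factors exceed 1 and their product is N
```

Running this over all small quadruples with $ab=cd$ confirms that both factors are always at least $2$ and multiply to $N$.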

Solution 2:

Suppose, for contradiction, that $p:=a^2+b^2+c^2+d^2$ is prime. Since $ab=cd$, we have $p=(a+b)^2+(c-d)^2=(a-b)^2+(c+d)^2$. That is, we have expressed $p$ in two ways as a sum of two squares of nonnegative integers (replacing $a-b$ and $c-d$ by their absolute values if necessary).

But a prime can be expressed as a sum of two squares in at most one way, up to interchanging the two summands. This corresponds to the fact that $p$ has a unique prime factorization in the ring of Gaussian integers $\mathbb{Z}[i]$: a representation $p=x^2+y^2$ gives the factorization $p=(x+yi)(x-yi)$ into Gaussian primes.

Therefore either $a+b=|a-b|$ and $|c-d|=c+d$, which is impossible since $a,b,c,d>0$, or $a+b=c+d$ and $|c-d|=|a-b|$, which implies $\{a,b\}=\{c,d\}$ and therefore $p=2(a^2+b^2)$, so $2\mid p$. Contradiction, because $p>2$.
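Again purely as a numerical sanity check (the function name is mine), one can confirm the two representations:

```python
def two_square_representations(a, b, c, d):
    """For ab = cd, return N together with the two representations
    of N as a sum of two squares used above."""
    assert a * b == c * d
    n = a * a + b * b + c * c + d * d
    r1 = ((a + b) ** 2, (c - d) ** 2)   # N = (a+b)^2 + (c-d)^2
    r2 = ((a - b) ** 2, (c + d) ** 2)   # N = (a-b)^2 + (c+d)^2
    assert sum(r1) == n and sum(r2) == n
    return n, r1, r2

# Example: (6, 2, 4, 3) gives 65 = 64 + 1 = 16 + 49, two genuinely
# different representations, consistent with 65 = 5 * 13 being composite.
print(two_square_representations(6, 2, 4, 3))
```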

Solution 3:

Let $N := a^2 + b^2 + c^2 + d^2$.

Note that $bd(a^2 + c^2) = ac(b^2 + d^2)$: using $ab=cd$, both sides equal $a^2bd + ab^2c$. This common value is a multiple of both $a^2+c^2$ and $b^2+d^2$, so, since $ac < a^2 + c^2$, $$\operatorname{lcm}(a^2 + c^2,\, b^2 + d^2) \le ac(b^2+d^2) < (a^2 + c^2)(b^2 + d^2).$$ This tells us that $a^2 + c^2$ and $b^2 + d^2$ are not coprime, and so $$ 1 < \gcd(a^2 + c^2,\, b^2 + d^2) \le a^2+c^2 < N. $$

Finally, this gcd divides both $a^2+c^2$ and $b^2+d^2$, hence it divides their sum $N$. A divisor of $N$ strictly between $1$ and $N$ shows $N$ is not prime, which completes the proof.
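As before, a small numerical check of the divisor this argument produces (the helper name is mine):

```python
from math import gcd

def nontrivial_divisor(a, b, c, d):
    """For ab = cd, return g = gcd(a^2+c^2, b^2+d^2), which the
    argument above shows satisfies 1 < g < N and g | N."""
    assert a * b == c * d
    n = a * a + b * b + c * c + d * d
    g = gcd(a * a + c * c, b * b + d * d)
    assert 1 < g < n and n % g == 0
    return g

# Example: (6, 2, 4, 3) gives gcd(52, 13) = 13, a proper divisor of N = 65.
print(nontrivial_divisor(6, 2, 4, 3))  # 13
```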