Prove that if $\gcd(a,b)=1$, then $\gcd(a+b,ab)=1$
Solution 1:
Below is a proof that is messier but more primitive than Yiorgos' proof. By Bézout's identity, there exist $x,y \in \mathbb{Z}$ such that
$$ax + by = 1.$$
Squaring both sides, we see that
$$a^2 x ^ 2 + 2abxy + b^2 y^2 = 1.$$
But notice that
$$a^2 x ^ 2 + 2abxy + b^2 y^2 = ab(2xy-x^2-y^2) + (a+b)(ax^2+by^2).$$
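To verify this identity, expand the right-hand side; the $abx^2$ and $aby^2$ terms cancel:
$$ab(2xy-x^2-y^2) + (a+b)(ax^2+by^2) = 2abxy - abx^2 - aby^2 + a^2x^2 + abx^2 + aby^2 + b^2y^2 = a^2 x^2 + 2abxy + b^2 y^2.$$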
And hence, those same $x,y$ as above give
$$ ab(2xy-x^2-y^2) + (a+b)(ax^2+by^2) = 1.$$
Since $\gcd(a+b,ab)$ divides every integer linear combination of $a+b$ and $ab$, it divides $1$, so it must be that $\gcd{(a+b,ab)} = 1$.
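As a concrete check (the numbers here are chosen only for illustration), take $a = 2$, $b = 3$ with Bézout pair $x = -1$, $y = 1$, so $ax + by = 1$. Then
$$ab(2xy - x^2 - y^2) + (a+b)(ax^2 + by^2) = 6(-2 - 1 - 1) + 5(2 + 3) = -24 + 25 = 1,$$
and indeed $\gcd(a+b, ab) = \gcd(5, 6) = 1$.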
Solution 2:
Assume this is not true.
Let $\gcd(a+b,ab)=m>1$. Then there exists a prime number $p$ which divides $m$.
If $p\mid a+b$ and $p\mid ab$, then $p$ divides $a$ or $b$.
Assume without loss of generality that $p\mid a$. But then $p\mid a+b$ implies that $p\mid b$, and hence $p\mid\gcd(a,b)$, which is a contradiction.
Note. We have used Euclid's lemma: if $p$ is a prime and $p$ divides $ab$, then $p$ divides $a$ or $b$.
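To trace the argument on a concrete coprime pair, take $a = 4$, $b = 9$, so $a + b = 13$ and $ab = 36$. Any prime dividing $36$ is $2$ or $3$ and so divides $a$ or $b$; but neither $2$ nor $3$ divides $13$, so no prime divides both $a+b$ and $ab$, and
$$\gcd(a+b, ab) = \gcd(13, 36) = 1.$$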
Solution 3:
I like pushing the limits of the Euclidean algorithm. Let's do our best to eliminate the $b$'s:
$$\begin{align} \gcd(a+b, ab) &= \gcd(a+b, ab - a (a+b)) \\&= \gcd(a+b, -a^2) \\&= \gcd(a+b, a^2) \end{align}$$
Since we also have $\gcd(a+b, a) = \gcd(b, a) = 1$ by the same reduction, and any number coprime to $a$ is coprime to $a^2$, we can infer that $\gcd(a+b, a^2) = 1$.
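For example, with $a = 4$ and $b = 9$ the reduction reads
$$\gcd(13, 36) = \gcd(13, 36 - 4\cdot 13) = \gcd(13, -16) = \gcd(13, 4^2) = 1.$$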
Solution 4:
We're given $\gcd(a,b) = 1$, and from Bézout's identity we have $\gcd(a,b) = 1 \iff ax + by = 1$ for some integers $x$ and $y$.
$1 = ax + by = ax + bx - bx + by = (a + b)x + b(y - x)$, so $\gcd(a+b,b) = 1$. Likewise, $\gcd(a+b,a)=1$ because $(a + b)y + a(x - y) = 1$. Multiplying the two expressions for $1$:
$$ \begin{align} 1 &= ((a + b)x + b(y - x))((a + b)y + a(x -y)) \\&= (a+b)(a+b)xy + (a+b)ax(x-y) + (a+b)by(y-x) + ab(y-x)(x-y) \\&= (a+b)[(a+b)xy + ax(x-y) + by(y-x)] + ab[(y-x)(x-y)] \end{align}$$
So $\gcd(a+b,ab) = 1$.
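As a quick numeric check (values chosen only for illustration), take $a = 2$, $b = 3$, $x = -1$, $y = 1$, so $ax + by = 1$. Both factors $(a+b)x + b(y-x)$ and $(a+b)y + a(x-y)$ equal $1$, and the final combination becomes
$$(a+b)\bigl[(a+b)xy + ax(x-y) + by(y-x)\bigr] + ab(y-x)(x-y) = 5\cdot 5 + 6\cdot(-4) = 25 - 24 = 1.$$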