If $\gcd(a,c)=1=\gcd(b,c)$, then $\gcd(ab,c)=1$

Your proof is not correct. You have implicitly assumed that if $d\mid ab$ then either $d\mid a$ or $d\mid b$. But this is not true: for example, $10\mid 4\times 25$, but $10\nmid 4$ and $10\nmid 25$.

It's a little difficult to say what you could do for a proof, since the basic facts of number theory can be proved in various orders and I don't know what you have done in your course. But here is one possibility.

I assume you have seen the following theorem (Bézout's identity).

Let $s,t$ be integers. Then $\gcd(s,t)=1$ if and only if there exist integers $x,y$ such that $sx+ty=1$.
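
As a quick sanity check (my own example, not necessarily one from your course): $\gcd(3,5)=1$, and indeed $$2\cdot3+(-1)\cdot5=1\ .$$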

If you apply this to $a,c$ and also to $b,c$, you have $$ax+cy=1\quad\hbox{and}\quad bu+cv=1$$ for some integers $x,y,u,v$. Can you see how to use these equations to show that $$ab(\cdots)+c(\cdots)=1\ ,$$ where both sets of dots are integers?
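
If you get stuck, here is one way to finish (a sketch; do try it yourself first): multiply the two equations together and sort the terms by whether they contain $ab$ or $c$, $$1=(ax+cy)(bu+cv)=ab(xu)+c(axv+byu+cyv)\ ,$$ and then the theorem above gives $\gcd(ab,c)=1$.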

Good luck!


Suppose, for contradiction, that $\gcd(ab,c)>1$. Then you can pick a prime factor $d$ of $\gcd(ab,c)$, so that $d\mid ab$ and $d\mid c$.

Since $d$ is prime and $d\mid ab$, Euclid's lemma gives $d\mid a$ or $d\mid b$.

Combining this with $d\mid c$, you get $\gcd(a,c)\ge d>1$ or $\gcd(b,c)\ge d>1$, which contradicts the hypothesis.

Therefore $\gcd(ab,c)=1$.
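
For reference, the whole argument in one chain (just a restatement of the above): since $d\mid\gcd(ab,c)$ we have $d\mid ab$ and $d\mid c$, and then $$d\mid ab\ \Longrightarrow\ d\mid a\quad\hbox{or}\quad d\mid b\ \Longrightarrow\ d\mid\gcd(a,c)\quad\hbox{or}\quad d\mid\gcd(b,c)\ ,$$ contradicting $\gcd(a,c)=\gcd(b,c)=1$.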