Proving that $\gcd\left(\frac a {\gcd(a,b)},\frac b {\gcd(a,b)}\right) =1$

I need to show that $$\gcd\left(\frac a {\gcd(a,b)},\frac b {\gcd(a,b)}\right) =1.$$

I'm not sure exactly how to approach this. I tried using the fact that $\gcd(a,b)=d$, so $d=ma+nb$ for some integers $m,n$, but I didn't get far.

Could anyone suggest where to start?


Solution 1:

No need to break apart the abstraction: we can deduce this just from the algebraic properties of the $\gcd$ operator.

If we pull out the common divisor:

$$ \gcd\left( \frac{a}{\gcd(a,b)}, \frac{b}{\gcd(a,b)} \right) = \frac{1}{\gcd(a,b)} \gcd(a, b) = 1$$
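The identity is easy to spot-check numerically. A minimal sketch (the sample pairs are arbitrary illustrative values):

```python
from math import gcd

# Spot-check gcd(a/g, b/g) == 1, where g = gcd(a, b),
# on a few arbitrary pairs.
for a, b in [(12, 18), (100, 75), (7, 13), (42, 56)]:
    g = gcd(a, b)
    assert gcd(a // g, b // g) == 1
print("all pairs check out")
```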

I personally find algebraic calculation, when it can be used, much easier than reasoning about divisors or linear combinations.


In general, when the quotients $a/c$ and $b/c$ are integers, we have

$$\gcd\left( \frac{a}{c}, \frac{b}{c} \right) = \frac{1}{c} \gcd(a, b) $$

If it's not obvious that this rule holds, rewrite it as

$$c \gcd\left( \frac{a}{c}, \frac{b}{c} \right) = \gcd(a, b) $$

and then apply the more familiar rule

$$ x \gcd(y, z) = \gcd(xy, xz) $$

(If signs are relevant to you, then that should be $ |x| \gcd(y,z) = \gcd(xy, xz) $, assuming you always take the nonnegative value for the $\gcd$.)

(In case of degeneracy, I believe the correct convention is $\gcd(0,0) = 0$, so the last identity holds even when $x=0$.)
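A quick numerical sanity check of the scaling rule, including the degenerate $x=0$ case (Python's `math.gcd` already follows the conventions above: it returns the nonnegative gcd, and `gcd(0, 0) == 0`):

```python
from math import gcd

# Check |x| * gcd(y, z) == gcd(x*y, x*z) for a mix of signs,
# zeros, and the fully degenerate case x = 0, y = 0.
for x in [-3, 0, 1, 5]:
    for y, z in [(4, 6), (0, 9), (-8, 12)]:
        assert abs(x) * gcd(y, z) == gcd(x * y, x * z)
```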


In a suitably generalized sense, the identity

$$\gcd\left( \frac{a}{c}, \frac{b}{c} \right) = \frac{1}{c} \gcd(a, b) $$

holds even when $a/c$ and $b/c$ aren't integers. For example, we can extend divisibility to rational numbers by saying that $x$ divides $y$ if $y/x$ is an integer. Under this notion, the greatest common divisor of $1/2$ and $1/3$ is $1/6$, and we can verify

$$ \gcd \left( \frac{1}{2}, \frac{1}{3} \right) = \gcd \left( \frac{3}{6}, \frac{2}{6} \right)= \frac{1}{6} \gcd(3, 2) = \frac{1}{6} $$
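The rational-number computation can be mirrored in code. A sketch using the same common-denominator trick as the displayed calculation (`rational_gcd` is a hypothetical helper, not a standard library function):

```python
from fractions import Fraction
from math import gcd

def rational_gcd(x: Fraction, y: Fraction) -> Fraction:
    """gcd under the extended notion: d divides x iff x/d is an integer.
    Put both fractions over a common denominator, then take the ordinary
    gcd of the resulting numerators."""
    # lcm of the two denominators (common denominator)
    d = x.denominator * y.denominator // gcd(x.denominator, y.denominator)
    return Fraction(gcd(x.numerator * (d // x.denominator),
                        y.numerator * (d // y.denominator)), d)

print(rational_gcd(Fraction(1, 2), Fraction(1, 3)))  # 1/6
```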

Solution 2:

Write $g = \gcd(a,b)$ and set $\alpha = \frac{a}{g}$, $\beta = \frac{b}{g}$, so that $a = g \alpha$ and $b = g \beta$. Now suppose $\gcd(\alpha, \beta) > 1$, so that $t \mid \alpha$ and $t \mid \beta$ for some $t > 1$. Then $gt$ is a common divisor of $a$ and $b$ that is larger than $g$, contradicting the definition of $g$.
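The contradiction argument can be illustrated on a concrete pair. A sketch with the illustrative values $a=24$, $b=36$ (so $g=12$, $\alpha=2$, $\beta=3$): a brute-force search confirms no $t > 1$ divides both $\alpha$ and $\beta$.

```python
from math import gcd

# Concrete instance of the argument: any t > 1 dividing both alpha and
# beta would make g*t > g a common divisor of a and b -- impossible.
a, b = 24, 36
g = gcd(a, b)                      # 12
alpha, beta = a // g, b // g       # 2, 3
common = [t for t in range(2, min(alpha, beta) + 1)
          if alpha % t == 0 and beta % t == 0]
assert common == []                # no such t, so gcd(alpha, beta) == 1
```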