Prove that if $\gcd(a,b)=1$, then $\gcd(a^2,b^2)=1$

Solution 1:

Suppose $\gcd(a,b)=1$. Then there are integers $x$, $y$ with $ax+by=1$. Cubing both sides, we get $(ax+by)^3=1$,

i.e., $a^3x^3+b^3y^3+3a^2x^2by+3axb^2y^2=1$

i.e., $a^2(ax^3+3x^2by)+b^2(by^3+3axy^2)=1$

Does this imply $\gcd(a^2,b^2)=1$? Yes: the last line exhibits integers $u = ax^3+3x^2by$ and $v = by^3+3axy^2$ with $a^2u+b^2v=1$, and any common divisor of $a^2$ and $b^2$ divides $a^2u+b^2v=1$, so $\gcd(a^2,b^2)=1$.
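
The regrouping above is a pure polynomial identity, so it can be machine-checked. Here is a minimal Lean 4 sketch, assuming Mathlib is available, verifying it with the `ring` tactic:

```lean
import Mathlib

-- The identity behind Solution 1: the regrouped expression is exactly
-- (a*x + b*y)^3, so it equals 1 whenever a*x + b*y = 1.
example (a b x y : ℤ) :
    a ^ 2 * (a * x ^ 3 + 3 * x ^ 2 * b * y) +
      b ^ 2 * (b * y ^ 3 + 3 * a * x * y ^ 2) = (a * x + b * y) ^ 3 := by
  ring
```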

Solution 2:

First show $\gcd(a, b^2) = \gcd(a^2, b) = 1$

$\gcd(a,b) = 1$ means that there are integers $x$ and $y$ such that $ax + by = 1$.

Since $1 = ax + by$, replace the factor $1$ in $by \cdot 1$ with $ax + by$:

$ax + by(ax+by) = 1$

i.e., $a(x+bxy) + b^2y^2 = 1$

That means $\gcd(a,b^2)=1$
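
This step can likewise be checked mechanically. A small Lean 4 sketch, again assuming Mathlib: given the hypothesis $ax+by=1$, the `linear_combination` tactic certifies the rearranged identity (the multiplier $1 + by$ records exactly how the hypothesis was substituted):

```lean
import Mathlib

-- Step 1 of Solution 2: from a*x + b*y = 1, the integers a and b^2
-- also admit an integer combination equal to 1.
example (a b x y : ℤ) (h : a * x + b * y = 1) :
    a * (x + b * x * y) + b ^ 2 * y ^ 2 = 1 := by
  linear_combination (1 + b * y) * h
```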

By symmetry (the same computation with $a$ and $b$ interchanged) we also get $\gcd(a^2,b)=1$, so $\gcd(a,b^2) = \gcd(a^2,b) = 1$.

Now pick integers $r_1$, $s_1$ with $ar_1 + b^2s_1 = 1$, and replace the factor $1$ in $ar_1 \cdot 1$ with $ar_1 + b^2s_1$:

$ar_1(ar_1+b^2s_1) + b^2s_1 = 1$

$a^2r_1^2+b^2(ar_1s_1+s_1)=1$

Therefore, since $a^2$ and $b^2$ admit an integer combination equal to $1$,

$\gcd(a^2, b^2) = 1$
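
Both the final substitution and the full statement can be checked the same way. A Lean 4 sketch, once more assuming Mathlib; the closing `example` uses the library lemma `Nat.Coprime.pow` (here `Nat.Coprime a b` unfolds to `Nat.gcd a b = 1`), which generalizes the result to arbitrary powers:

```lean
import Mathlib

-- Step 2 of Solution 2: from a*r + b^2*s = 1 we get a combination of
-- a^2 and b^2 equal to 1.
example (a b r s : ℤ) (h : a * r + b ^ 2 * s = 1) :
    a ^ 2 * r ^ 2 + b ^ 2 * (a * r * s + s) = 1 := by
  linear_combination (a * r + 1) * h

-- The statement itself, from the Mathlib lemma Nat.Coprime.pow.
example (a b : ℕ) (h : Nat.Coprime a b) : Nat.Coprime (a ^ 2) (b ^ 2) :=
  h.pow
```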