If $a,b$ are positive integers such that $\gcd(a,b)=1$, then show that $\gcd(a+b, a-b)=1$ or $2$, and $\gcd(a^2+b^2, a^2-b^2)=1$ or $2$.


Progress

We have $\gcd(a,b)=1\implies \exists\, u,v\in \mathbb Z$ such that $au+bv=1$. Squaring gives $1=(au+bv)^2=a^2u^2+b(2auv+bv^2)$, so $a^2u'+bv'=1$ for some $u',v'\in \mathbb Z$.

Let $\gcd(a+b, a-b)=d$. Then $d \mid (a+b)x+(a-b)y$ for all $x,y\in \mathbb Z$.

How can I show that $\gcd(a+b, a-b)=1$ or $2$, and $\gcd(a^2+b^2, a^2-b^2)=1$ or $2$?


Solution 1:

Let $d = \gcd(a+b,a-b)$. So $d \mid (a-b)$ and $d \mid (a+b)$. Thus:

$d \mid (a-b)+(a+b) = 2a$ and $d \mid (a+b) - (a-b) = 2b$. So $d \mid \gcd(2a,2b) = 2\gcd(a,b) = 2$.

Thus $d = 1$ or $2$.
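
Both values actually occur; for instance (a spot check of my own, not part of the original solution):
$$\gcd(2+1,\,2-1)=\gcd(3,1)=1, \qquad \gcd(3+1,\,3-1)=\gcd(4,2)=2.$$
Indeed, since $\gcd(a,b)=1$ rules out $a,b$ both being even, $d=2$ exactly when $a$ and $b$ are both odd, and $d=1$ when they have opposite parity (then $a+b$ is odd).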

For the other one, observe that $2(a^2 + b^2) = (a+b)^2 + (a-b)^2$ and $a^2 - b^2 = (a-b)(a+b)$. Thus if $d = \gcd(a^2+b^2, a^2-b^2)$, then $d \mid (a^2+b^2)+(a^2-b^2) = 2a^2$ and $d \mid (a^2+b^2)-(a^2-b^2) = 2b^2$. Thus $d \mid \gcd(2a^2,2b^2) = 2\gcd(a^2,b^2) = 2(\gcd(a,b))^2 = 2\cdot 1 = 2$. Thus $d = 1$ or $2$.
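
Here too both values occur; a quick spot check (my own numbers):
$$\gcd(3^2+2^2,\,3^2-2^2)=\gcd(13,5)=1, \qquad \gcd(3^2+1^2,\,3^2-1^2)=\gcd(10,8)=2.$$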

Note: $\gcd(a^2,b^2) = (\gcd(a,b))^2$ is easy to prove.
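
One way to see that note, via prime exponents (a standard argument, sketched here for completeness): if $p^{\alpha}$ is the exact power of a prime $p$ dividing $a$ and $p^{\beta}$ the exact power dividing $b$, then the exact power of $p$ dividing $\gcd(a^2,b^2)$ is $p^{\min(2\alpha,2\beta)}$, and
$$\min(2\alpha,2\beta)=2\min(\alpha,\beta),$$
so every prime appears in $\gcd(a^2,b^2)$ with exactly twice the exponent it has in $\gcd(a,b)$.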

Solution 2:

If $\gcd(a,b)=1$, then there are $x,y\in\mathbb{Z}$ so that $ax+by=1$.

Then, since $$ \begin{align} 2 &=\overbrace{[(a+b)+(a-b)]}^{\large2a}\,x+\overbrace{[(a+b)-(a-b)]}^{\large2b}\,y\\[4pt] &=(a+b)(x+y)+(a-b)(x-y) \end{align} $$ we have that $$ \gcd(a+b,a-b)\mid2 $$

Furthermore $$ \begin{align} 1 &=(ax+by)^3\\ &=a^2(ax^3+3bx^2y)+b^2(3axy^2+by^3) \end{align} $$ implies that $\gcd(a^2,b^2)=1$. Apply the result above to get $$ \gcd(a^2+b^2,a^2-b^2)\mid2 $$
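
Not part of either proof, but here is a quick brute-force sanity check of both claims in Python (the bound $200$ is arbitrary):

```python
from math import gcd  # math.gcd accepts negative arguments, e.g. a - b when a < b

# Exhaustively check both claims for all coprime pairs below an arbitrary bound.
N = 200
for a in range(1, N):
    for b in range(1, N):
        if gcd(a, b) == 1:
            assert gcd(a + b, a - b) in (1, 2)
            assert gcd(a**2 + b**2, a**2 - b**2) in (1, 2)
print(f"both claims verified for all coprime a, b < {N}")
```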