Greatest common divisor in the Gaussian Integers

Let $a$ and $b$ be integers. Prove that their greatest common divisor in the ring of integers is the same as their greatest common divisor in the ring of Gaussian integers.

The ring of Gaussian integers is:

$\mathbb{Z}[i]=\{x+iy: x,y\in \mathbb{Z}\}$

Attempt at proof:

Suppose that the GCD (greatest common divisor) of $a$ and $b$ is $d$. Then for any other common divisor of $a$ and $b$, say $e$, we must have that $e$ divides $d$. This is the definition of greatest common divisor.

Extend this to the Gaussian integers. Suppose that $x+iy$ divides $a$ and $b$. That is:

$a=(x+iy)(n_0+in_1)$

$b=(x+iy)(m_0+im_1)$

Then I need to show that $x+iy$ divides $d$. From here I don't know where to go. I could write $a=Nd=(x+iy)(n_0+in_1)$ for some integer $N$, but then I don't know that $N$ will divide $n_0+in_1$.

Thanks for any help!


Solution 1:

Remember that $d$ is a greatest common divisor of $a$ and $b$ if and only if:

  1. $d|a$ and $d|b$; and
  2. If $c|a$ and $c|b$, then $c|d$.

Let $a$ and $b$ be integers, and let $d$ be their greatest common divisor as integers. Then we know that $d$ satisfies the two properties above, where "divides" means "divides in $\mathbb{Z}$", and the $c$ in point 2 is an arbitrary integer.

The thing that makes this very easy is that in the integers, we can express a gcd as a linear combination: we know that there exist integers $\alpha$ and $\beta$ such that $d=\alpha a + \beta b$.
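
For concreteness, here is a minimal Python sketch (my own illustration, not part of the argument) that exhibits such a pair $\alpha, \beta$ via the extended Euclidean algorithm:

```python
# Extended Euclidean algorithm: returns (d, alpha, beta) with
# d = gcd(a, b) = alpha*a + beta*b.
def extended_gcd(a, b):
    old_r, r = a, b
    old_s, s = 1, 0
    old_t, t = 0, 1
    while r != 0:
        q = old_r // r
        old_r, r = r, old_r - q * r
        old_s, s = s, old_s - q * s
        old_t, t = t, old_t - q * t
    return old_r, old_s, old_t

d, alpha, beta = extended_gcd(12, 18)
print(d, alpha, beta)                 # 6 -1 1, i.e. 6 = (-1)*12 + 1*18
assert d == alpha * 12 + beta * 18
```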

Thus, if $\mathbf{x}$ is a Gaussian integer that divides $a$ and divides $b$ (in the Gaussian integers), then it also divides $\alpha a$ and $\beta b$, and hence divides $\alpha a + \beta b = d$. That is, if $\mathbf{x}$ divides $a$ and $b$ in the Gaussian integers, then it divides $d$ in the Gaussian integers.

Thus, $d$ is a greatest common divisor for $a$ and $b$ in the Gaussian integers, since it satisfies the two requirements to be a greatest common divisor:

  1. $d|a$ and $d|b$ (in the Gaussian integers); and
  2. If $c$ is a Gaussian integer that divides both $a$ and $b$, then it divides $d$ (in the Gaussian integers).

Thus, $d$ is a greatest common divisor for $a$ and $b$ in $\mathbb{Z}[i]$.

No need to muck about with norms or with products.

P.S. Note that in the integers we can talk about "the" greatest common divisor because, even though every pair of numbers (except for $0$ and $0$) has two greatest common divisors ($d$ and $-d$), there is a natural way to "choose" one of them. In the Gaussian integers, each pair of numbers (again, except $0$ and $0$) has four greatest common divisors (if $d$ is one, then so are $-d$, $id$, and $-id$); there is no universally accepted way of deciding which one would be "the" greatest common divisor, so generally you should talk about a greatest common divisor rather than the greatest common divisor.
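
As a numerical sanity check (my own sketch, not part of the answer; the function name and toy inputs are made up), one can compute a gcd in $\mathbb{Z}[i]$ by the Euclidean algorithm with rounded quotients and verify that it agrees with the integer gcd up to one of the four units $1, -1, i, -i$:

```python
from math import gcd

def gaussian_gcd(a, b):
    """Euclidean algorithm in Z[i]; a, b are complex numbers with integer parts.
    Floating-point rounding is fine for small examples like this one."""
    while b != 0:
        q = a / b
        q = complex(round(q.real), round(q.imag))  # nearest Gaussian integer
        a, b = b, a - q * b
    return a

a, b = 12, 18
g = gaussian_gcd(complex(a), complex(b))
units = [1, -1, 1j, -1j]
assert any(g == u * gcd(a, b) for u in units)   # g equals 6 up to a unit
print(g)
```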

Solution 2:

Hint $\:$ gcds in $\mathbb Z$ persist in extension rings because the gcd may be specified by the solvability of (linear) equations over $\mathbb Z$, and such solutions persist in extension rings, i.e. roots in $\mathbb Z$ remain roots in rings $\supset \mathbb Z$. Namely, the Bezout identity yields the following equational specification for the gcd

$$\rm\ gcd(a,b) = c\ \iff\ a\: x = c,\ b\: y = c,\ a\: u + b\: v = c\ \ has\ roots\ \ x,y,u,v\in \mathbb Z$$

Indeed, in any ring $\rm R,\:$ $\rm\:a\: x = c,\ b\: y = c\:$ have roots $\rm\:x,y\in R$ $\iff$ $\rm c\ |\ a,b\:$ in $\rm R.$ Further if $\rm\:c = a\: u + b\: v\:$ has roots $\rm\:u,v\in R\:$ then $\rm\:d\ |\ a,b$ $\:\Rightarrow\:$ $\rm\:d\ |\ a\:u+b\:v = c\:$ in $\rm\: R.\:$ Hence we infer $\rm\:c = gcd(a,b)\:$ in $\rm\: R,\:$ being a common divisor divisible by every common divisor. Conversely, if $\rm\:c = gcd(a,b)\:$ in $\:\mathbb Z,\:$ then the Bezout identity implies the existence of such roots $\rm\:u,v\in \mathbb Z$.
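
As a concrete instance of this specification (the numbers are a toy example of mine), take $a = 2$, $b = 4$, $c = \gcd(a,b) = 2$ with roots $u = 1$, $v = 0$; then any common divisor of $a$ and $b$ in $\mathbb{Z}[i]$, e.g. $1+i$, must divide $c = a\,u + b\,v$ there:

```python
def divides_in_Zi(d, n):
    """True if d divides n in Z[i] (d, n given as complex numbers with integer parts)."""
    q = n / d
    return abs(q.real - round(q.real)) < 1e-9 and abs(q.imag - round(q.imag)) < 1e-9

a, b, c, u, v = 2, 4, 2, 1, 0
d = 1 + 1j
assert divides_in_Zi(d, a) and divides_in_Zi(d, b)   # d is a common divisor in Z[i]
assert divides_in_Zi(d, a * u + b * v)               # hence d divides c in Z[i]
```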

Rings with such linearly representable gcds are known as Bezout rings. As above, gcds in such rings always persist in extension rings. In particular, coprime elements remain coprime in extension rings (with the same $1$). This need not be true without such Bezout linear representations of the gcd. For example, $\rm\:\gcd(2,x) = 1\:$ in $\rm\:\mathbb Z[x]\:$, but the gcd is the nonunit $\:2\:$ in $\rm\:\mathbb Z[x/2]\subset \mathbb Q[x]$, since there $\rm\:x = 2\cdot(x/2)\:$ is divisible by $2$.