Understanding division by a monic polynomial in $R[x]$ where $R$ is an arbitrary ring

I read "Algebra: Chapter 0" by P.Aluffi. I encountered a topic where it says you can divide any polynomial in $R[x]$($R$ is any ring) by a monic polynomial(that is, a polynomial of the form $x^d + \sum\limits_{i=0}^{d-1} a_i x^{i}$).

It says that if $g(x)$ is any polynomial and $f(x)$ is a monic polynomial, then there exist $q(x), r(x) \in R[x]$ with $\deg r(x) < \deg f(x)$ such that $g(x) = f(x)q(x) + r(x)$.

I have two questions regarding this:

1) Why does $f(x)$ need to be monic?

2) How can we prove it? The book talks about induction, noting that if $\deg g(x) = n > d = \deg f(x)$, then for all $a \in R$ we have $ax^n = ax^{n-d}f(x) + h(x)$ for some $h(x)$ with $\deg h(x) < n$. This is true, but how do I take it from here?


The reason $f$ has to be monic is that in $R$ you can always divide by $1$, but not necessarily by other elements. In fact, the leading coefficient of $f$ can be any unit.

The proof is by induction on the degree of $g$; if $g$ is the zero polynomial, there is nothing to prove. If $g$ has degree zero and $f(x)=1$, write $g(x)=f(x)g(x)+0$; if $f$ has degree $>0$, $g(x)=f(x)\cdot 0+g(x)$.

Suppose $g$ has degree $n$ and that the statement holds for polynomials of degree $k<n$. Let's assume $\deg f=m$.

If $n<m$, we are done: $g(x)=f(x)\cdot 0+g(x)$. Otherwise, let $a_n$ be the leading coefficient of $g(x)$ and consider $$ g_1(x)=g(x)-a_nx^{n-m}f(x) $$ By construction, $\deg g_1<n$, so $g_1(x)=f(x)q_1(x)+r(x)$, with $\deg r<m$, by the induction hypothesis. Therefore $$ g(x)=g_1(x)+a_nx^{n-m}f(x)=f(x)(a_nx^{n-m}+q_1(x))+r(x) $$ and we are finished.
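
To see the induction as an actual algorithm, here is a minimal Python sketch (my own, not from the book or from the proof above), representing a polynomial over $\mathbb Z$ for concreteness as a list of coefficients `[a_0, a_1, ...]`; the name `monic_divmod` is just illustrative. It repeats the step $g \mapsto g - a_n x^{n-m} f$ until the degree drops below $\deg f$, exactly as in the proof:

```python
def degree(p):
    """Degree of a coefficient list [a_0, a_1, ...]; -1 for the zero polynomial."""
    d = len(p) - 1
    while d >= 0 and p[d] == 0:
        d -= 1
    return d

def monic_divmod(g, f):
    """Divide g by a monic f, mirroring the inductive proof: repeatedly
    subtract a_n * x^(n-m) * f to lower the degree of the dividend."""
    m = degree(f)
    assert m >= 0 and f[m] == 1, "f must be monic"
    r = list(g)                      # current remainder, initially g
    q = [0] * max(len(g) - m, 1)     # quotient coefficients
    n = degree(r)
    while n >= m:
        a = r[n]                     # leading coefficient of the remainder
        q[n - m] += a
        for j, c in enumerate(f):    # r -= a * x^(n-m) * f
            r[n - m + j] -= a * c
        n = degree(r)
    return q, r                      # g = f*q + r with deg r < deg f

# Example over Z: g = x^4 + 2x + 3, f = x^2 + x + 1
q, r = monic_divmod([3, 2, 0, 0, 1], [1, 1, 1])
print(q, r)   # q = x^2 - x, r = 3x + 3
```

Note that the only coefficient of $f$ one ever divides by is the leading one, and since it is $1$ no division in $R$ is actually needed; that is precisely why monicity suffices.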

If the leading coefficient of $f(x)$ is a unit $u$, then $u^{-1}f(x)$ is monic, so $$ g(x)=u^{-1}f(x)q_0(x)+r(x) $$ by the previous result and now $q(x)=u^{-1}q_0(x)$ solves the problem.
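
As a quick illustration of this last reduction, over $\mathbb Z$ the units are $\pm 1$; reusing the hypothetical `monic_divmod` sketch above, dividing by $f(x) = -x+2$ (leading coefficient $u=-1$) looks like this:

```python
# f = -x + 2 has unit leading coefficient u = -1, and u^{-1} = -1,
# so u^{-1}*f = x - 2 is monic.  Using monic_divmod from the sketch above:
g = [1, 0, 1]                  # g = x^2 + 1
f_monic = [-2, 1]              # u^{-1} * f = x - 2
q0, r = monic_divmod(g, f_monic)
q = [-c for c in q0]           # q = u^{-1} * q0 rescales the quotient
print(q, r)                    # g = (-x + 2)*q + r, here q = -x - 2, r = 5
```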


For polynomials over any commutative coefficient ring, the high-school polynomial long division algorithm works for division with remainder by any monic polynomial, i.e. any polynomial $f$ whose leading coefficient is $c = 1$ (or a unit). Indeed, $f$ monic implies that the leading term of $f$ divides all higher-degree monomials $x^k,\ k\ge n = \deg f$, so the high-school division algorithm can kill all higher-degree terms in the dividend, leaving a remainder of degree $< n = \deg f$ (see here for further detail).
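
For a concrete illustration (my own example) over $\mathbb Z$: dividing $2x^3+3x+1$ by the monic $x^2-x$, each step subtracts a multiple of $x^2-x$ that cancels the current leading term,
$$2x^3+3x+1 \;=\; 2x\,(x^2-x) + (2x^2+3x+1) \;=\; (2x+2)(x^2-x) + (5x+1),$$
and the process stops because the remainder $5x+1$ has degree $<2 = \deg(x^2-x)$.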

The division algorithm generally fails if $f$ is not monic, e.g. $x = 2x\,q + r$ has no solution with $r\in \mathbb Z,\ q\in \mathbb Z[x]$: evaluating at $x=0$ gives $r=0$, and then evaluating at $x=1$ gives $2\mid 1$ in $\mathbb Z$, a contradiction. Notice that the same proof works in any coefficient ring $R$ in which $2$ is not a unit (invertible). Conversely, if $2$ is a unit in $R$, say $2u = 1$ for some $u\in R$, then division is possible: $x = 2x\cdot u + 0$.
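
To make the converse direction concrete, here is a tiny check in a ring of my choosing, $R=\mathbb Z/5\mathbb Z$, where $2$ is a unit:

```python
# Over Z the identity x = 2x*q + r (deg r < 1) is impossible, but over
# R = Z/5Z we have 2*3 = 6 ≡ 1 (mod 5), so q = 3 and r = 0 work:
#     x = (2x)*3 + 0   in (Z/5Z)[x].
q, r = 3, 0
assert (2 * q) % 5 == 1 and r == 0   # coefficient of x matches; remainder is 0
print("over Z/5Z:  x = (2x)*3 + 0")
```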

However, we can generalize the division algorithm to the non-monic case as follows.

Theorem (Nonmonic Polynomial Division Algorithm) $\ $ Let $0\neq F,G\in A[x]$ be polynomials over a commutative ring $A$, let $a$ be the leading coefficient of $F$, and let $i \ge \max\{0,\,1+\deg G-\deg F\}$. Then
$$a^{i} G = Q F + R\ \ \text{for some}\ \ Q,R\in A[x],\ \deg R < \deg F.$$

Proof $\ $ Hint: use induction on $\deg G$. See this answer for a full proof.
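
For what it's worth, here is a minimal self-contained Python sketch of this pseudo-division over $\mathbb Z$ (the coefficient-list representation `[a_0, a_1, ...]` and the name `pseudo_divmod` are my own; this is only an illustration of the statement, not the proof from the linked answer):

```python
def degree(p):
    """Degree of a coefficient list; -1 for the zero polynomial."""
    d = len(p) - 1
    while d >= 0 and p[d] == 0:
        d -= 1
    return d

def pseudo_divmod(G, F):
    """Return (i, Q, R) with a^i * G = Q*F + R and deg R < deg F,
    where a is the leading coefficient of F and i = max(0, 1 + deg G - deg F)."""
    dF = degree(F)
    a = F[dF]
    i = max(0, degree(G) - dF + 1)
    R = [a**i * c for c in G]            # scale G by a^i up front
    Q = [0] * max(degree(G) - dF + 1, 1)
    n = degree(R)
    while n >= dF:
        c = R[n] // a                    # exact: the leading coefficient stays divisible by a
        Q[n - dF] += c
        for j, fj in enumerate(F):       # R -= c * x^(n-dF) * F
            R[n - dF + j] -= c * fj
        n = degree(R)
    return i, Q, R

# Example: G = x^2 + 1, F = 2x + 1, a = 2, i = 2:  4(x^2+1) = (2x-1)(2x+1) + 5
i, Q, R = pseudo_divmod([1, 0, 1], [1, 2])
print(i, Q, R)
```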