Why is commutativity optional in multiplication for rings?

More precisely, why is it that all rings are required by the axioms to have commutativity in addition, but are not held to the same axiom regarding multiplication? I know that we have commutative and non-commutative rings depending on whether or not they are commutative in multiplication, but I am wondering why it is that the axioms were defined that way, providing us with this option.

I am using this list of axioms, from David Sharpe’s Rings and factorization:

Definition 1.3.1. A ring is a non-empty set $R$ which satisfies the following axioms:

(1) $R$ has a binary operation denoted by $+$ defined on it;

(2) addition is associative, i.e. \begin{align} a + \left(b+c\right) = \left(a+b\right) + c \text{ for all } a, b, c \in R \end{align} (so that we can write $a+b+c$ without brackets);

(3) addition is commutative, i.e. \begin{align} a + b = b + a \text{ for all } a, b \in R; \end{align}

(4) there is an element denoted by $0$ in $R$ such that \begin{align} 0 + a = a \text{ for all } a \in R \end{align} (there is only one such element because, if $0_1$ and $0_2$ are two such, then $0_1 = 0_1 + 0_2 = 0_2$ and they are the same -- we call $0$ the zero element of $R$);

(5) for every $a \in R$, there exists an element $-a \in R$ such that \begin{align} \left(-a\right) + a = 0 \end{align} (there is only one such element for each $a$, because if $b + a = 0$ and $c + a = 0$, then \begin{align} b = 0 + b = \left(c + a\right) + b = c + \left(a + b\right) = c + 0 = c; \end{align} we call $-a$ the negative of $a$);

(6) $R$ has a binary operation denoted by multiplication defined on it;

(7) multiplication is associative, i.e. \begin{align} a\left(bc\right) = \left(ab\right)c \text{ for all } a, b, c \in R; \end{align}

(8) multiplication is left and right distributive over addition, i.e. \begin{align} a\left(b+c\right) = ab + ac,\ \left(a+b\right)c = ac + bc \text{ for all } a, b, c \in R; \end{align}

(9) there is an element denoted by $1$ in $R$ such that $1 \neq 0$ and \begin{align} 1 \cdot a = a \cdot 1 = a \text{ for all } a \in R \end{align} (as for the zero element, there is only one such element, and it is called the identity element of $R$).



The first rings to be considered were generally commutative: polynomial rings, and then the rings arising from Dedekind's work on number fields. Their properties were abstracted by Fraenkel and Noether, who still dealt mostly with commutative rings.

However, it soon became apparent that there were too many important instances where commutativity of multiplication did not hold. There were, famously, the quaternions, but also matrices and, more generally, the endomorphism ring of an abelian group (where “multiplication” is composition of functions). So we have two different but related notions, commutative rings and noncommutative rings, just as we have commutative/abelian groups and noncommutative groups.
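To see the failure of commutativity concretely, here is a minimal sketch in plain Python (no matrix library assumed) exhibiting two $2\times 2$ matrices whose products in the two orders differ:

```python
# Two 2x2 matrices (as nested lists) that do not commute; a minimal
# hand-rolled matrix product, no external library assumed.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0, 1], [0, 0]]
B = [[0, 0], [1, 0]]
print(matmul(A, B))   # [[1, 0], [0, 0]]
print(matmul(B, A))   # [[0, 0], [0, 1]]
```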

Now, why do this with multiplication and not with addition? Well, if you take your definition of ring above, which includes a unity, but you drop condition (3) (that is, you require everything except you do not require that addition be commutative), it turns out that the other eight axioms force commutativity of addition.

Indeed, suppose you have a structure $(R,+,\cdot,0,1)$ that satisfies axioms (1), (2), and (4)-(9) above. I claim that one can deduce (3). Let $a,b\in R$. Expanding the product $(1+1)(a+b)$ in two ways, first by right distributivity (treating $a+b$ as a single element) and then by left distributivity, we have $$\begin{align*} (1+1)(a+b) &= 1(a+b) + 1(a+b) = a+b+a+b\\ (1+1)(a+b) &= (1+1)a + (1+1)b = a+a+b+b. \end{align*}$$ From this we get that $a+b+a+b = a+a+b+b$. Now add the inverse of $a$ on the left and the inverse of $b$ on the right on both sides to get $$\begin{align*} (-a) + a + b + a + b + (-b) &= 0+b+a+0 = b+a\\ (-a) + a + a + b + b + (-b) &= 0+a+b+0 = a+b. \end{align*}$$ Thus, we conclude that $a+b=b+a$. That is, commutativity of addition is a consequence of the other eight axioms.

The reason we include it is twofold. First, it is much nicer to say that the first few axioms force $(R,+)$ to be a commutative/abelian group. Second, it is also common to consider rings without unity, and if we do that, then it is no longer true that addition is forced to be commutative. To see this, note that if $(G,\cdot)$ is any group with identity element $e_G$, and we define a multiplication on $G$ by letting $a*b=e_G$ for all $a,b\in G$, then $(G,\cdot,*)$ (with $\cdot$ playing the role of addition and $*$ the role of multiplication) satisfies axioms (1)-(8) given above. But if the original group is not commutative, then the “addition” in this ring is not commutative. So if we want to consider rings without unity, we do want to explicitly require addition to be commutative.
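This construction can be checked directly. Here is a sketch in Python, using permutations of $\{0,1,2\}$ to realize the nonabelian group $S_3$; the names `add` and `mul` are mine, chosen to match the roles the two operations play:

```python
from itertools import permutations

# The nonabelian group S3: permutations of {0, 1, 2} as tuples, where
# p sends i to p[i].  Its group operation plays the role of "addition".
G = list(permutations(range(3)))
e = (0, 1, 2)                      # identity permutation = the "zero"

def add(p, q):
    # "addition": composition p after q (the group operation of S3)
    return tuple(p[q[i]] for i in range(3))

def mul(p, q):
    # "multiplication": the constant map to the zero element
    return e

# Distributivity (axiom 8) holds trivially: every product is e, and e + e = e.
left  = all(mul(a, add(b, c)) == add(mul(a, b), mul(a, c)) for a in G for b in G for c in G)
right = all(mul(add(a, b), c) == add(mul(a, c), mul(b, c)) for a in G for b in G for c in G)
print(left, right)                 # True True

# But "addition" is not commutative:
a, b = (1, 0, 2), (0, 2, 1)        # the transpositions (0 1) and (1 2)
print(add(a, b), add(b, a))        # (1, 2, 0) and (2, 0, 1)
```

Axiom (9) fails, of course: no element $u$ satisfies $u*p = p$ for all $p$, which is exactly why the proof above does not apply.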


I don't know about the history, but I think the right way to motivate rings is via their linear action on some set. Even the semi-ring of natural numbers $\def\nn{\mathbb{N}}$$\nn$ should be motivated by the action of counting numbers on objects, where you want the following for any $a,b ∈ \nn$ and object collections $X,Y$:

$a·X+a·Y = a·(X+Y)$   [$a$ copies of $X$ plus $a$ copies of $Y$ is $a$ copies of ( $X$ plus $Y$ )]

$a·X+b·X = (a+b)·X$   [$a$ copies of $X$ plus $b$ copies of $X$ is $(a+b)$ copies of $X$]

$a·(b·X) = (a·b)·X$   [$a$ copies of $b$ copies of $X$ is $(a·b)$ copies of $X$]

$1·X = X$   [$1$ copy of $X$ is just $X$]

$0·X + Y = Y$   [$0$ copies of $X$ plus $Y$ is just $Y$]

$X + Y = Y + X$   [Combining collections is symmetric]
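These six laws can be sanity-checked with multisets standing in for collections of objects. A sketch using Python's `collections.Counter` (the helper name `act` is my own; `+` on Counters combines multisets):

```python
from collections import Counter

# Collections of objects modeled as multisets; "+" on Counters combines
# them, and act(n, X) forms n copies of the collection X.
def act(n, X):
    return Counter({k: n * v for k, v in X.items()}) if n > 0 else Counter()

X = Counter({"apple": 2, "pear": 1})
Y = Counter({"pear": 3})
a, b = 2, 5

assert act(a, X) + act(a, Y) == act(a, X + Y)      # a·X + a·Y = a·(X+Y)
assert act(a, X) + act(b, X) == act(a + b, X)      # a·X + b·X = (a+b)·X
assert act(a, act(b, X)) == act(a * b, X)          # a·(b·X) = (a·b)·X
assert act(1, X) == X                              # 1·X = X
assert act(0, X) + Y == Y                          # 0·X + Y = Y
assert X + Y == Y + X                              # combining is symmetric
print("all six laws hold")
```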

Here $\nn$ acts via $·$ on the commutative semi-group $C$ of collections of things under combining, and the point is that we can abstract out the counting numbers $\nn$ by simply dropping the semi-group $C$ that $\nn$ acts on.

Note that, since distinct numbers act differently on collections (the action is faithful), associativity and commutativity of $+$ for $\nn$ follow immediately from associativity and commutativity in $C$.

Now observe that for any $a,b,c ∈ \nn$ and object collection $X$ we have:

$(a·(b+c))·X = a·((b+c)·X)$ $= a·(b·X+c·X)$ $= a·(b·X)+a·(c·X)$ $= (a·b)·X+(a·c)·X$ $= (a·b+a·c)·X$.

$((a+b)·c)·X = (a+b)·(c·X)$ $= a·(c·X)+b·(c·X)$ $= (a·c)·X+(b·c)·X$ $= (a·c+b·c)·X$.

So we have obtained distributivity for $\nn$!

But what about commutativity of $·$ for $\nn$? That corresponds to:

$a·(b·X) = b·(a·X)$   [$a$ copies of $b$ copies of $X$ is $b$ copies of $a$ copies of $X$]

Is it obviously true? For "copies" in the real world, sure, and hence we get the familiar semi-ring properties of $\nn$. Similar motivation involving scalings gives us semi-ring properties of $\mathbb{R}_{≥0}$.

If we move to the more abstract notion of collections of assets owned/owed, we can easily get the ring $\mathbb{Z}$, and likewise once we consider inverse scalings we get the ring $\mathbb{R}$.

In general, if a ring $R$ acts faithfully on a group $G$, then $R$ automatically acquires associativity of multiplication (from associativity of composition), and it naturally acquires commutative addition when $G$ is commutative.


But commutative multiplication is different. For copying and scaling, indeed the action is commutative. But it should be obvious that in general actions are not commutative!

For example, the collection $T$ of rigid transformations acts on the set $S$ of locations (vectors), and certainly $A·(B·X)$ may differ from $B·(A·X)$ for general $A,B ∈ T$ and location $X$ (rotations and translations do not generally commute). So if we pass to the ring these transformations generate, with addition being pointwise addition and multiplication being composition, that ring has commutative addition (since vector addition is commutative) but non-commutative multiplication. And these transformations live among the affine operators on the vector space of locations, which can be represented by matrices; after all, matrix multiplication is defined precisely so that it agrees with composition.
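The non-commutativity is easy to witness numerically. A minimal sketch in plain Python (the `Rigid` class below is my own illustration of "rotate, then translate"):

```python
import math

# A rigid transformation of the plane: rotate by theta about the origin,
# then translate by t.  (This Rigid class is only for illustration.)
class Rigid:
    def __init__(self, theta, t):
        self.theta, self.t = theta, t

    def __call__(self, X):
        # act on a location X = (x, y)
        c, s = math.cos(self.theta), math.sin(self.theta)
        x, y = X
        return (c * x - s * y + self.t[0], s * x + c * y + self.t[1])

A = Rigid(math.pi / 2, (0.0, 0.0))   # quarter-turn rotation
B = Rigid(0.0, (1.0, 0.0))           # translation by (1, 0)
X = (1.0, 0.0)

print(A(B(X)))   # A·(B·X) ≈ (0, 2)
print(B(A(X)))   # B·(A·X) ≈ (1, 1)
```

Translating first and then rotating sweeps the point to $(0,2)$, while rotating first and then translating lands at $(1,1)$: the two orders of action genuinely disagree.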