Prove that in a ring with $x^3 = x$ we have $x+x+x+x+x+x=0$.

Solution 1:

Try $x+x = (x+x)^3$ and expand the latter.
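
To spell the hint out: the identity applies to every ring element, in particular to $x+x$, and distributivity gives

$$x + x \,=\, (x+x)^3 \,=\, 8\,x^3 \,=\, 8\,x,$$

so subtracting $2x$ from both sides yields $6x = 0$. Note that no commutativity is needed beyond the fact that $x$ commutes with itself.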

Solution 2:

When examining the consequences of an identity in an algebra, the natural place to start is the ground, i.e. the ground terms generated by evaluating the identity at the constants of the algebra. For rings we have constants $0$ and $1$, so it's natural to look first at what the identity implies in the subring generated by these constants. In our example we have the identity $\rm\ f(x) = x^3\!-x = 0,\ $ so $\rm\:f(0),\ f(1),\ \color{#C00}{f(2) = 6},\ f(3) = 24,\,\ldots$ are all zero. Thus $\rm\:\color{#C00}6 = 0\:\Rightarrow\:6x = 0.\:$
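
In explicit form, the red deduction is just the evaluation of $\rm\,f\,$ at the ground term $2 = 1+1$ (this step uses the unit $1$):

$$0 \,=\, f(2) \,=\, 2^3 - 2 \,=\, 6, \qquad\text{hence}\qquad 6x \,=\, (6\cdot 1)\,x \,=\, 0\ \text{ for every } x.$$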

The proof generalizes to rings without $1$: simply evaluate $\rm\:f\:$ at $\rm\:\color{#C00}{2x} = x\!+\!x\:$ instead of at $\rm\:\color{#C00}2.\:$ In fact we can deduce further identities using polynomial arithmetic on the identities. For example, we easily infer $\rm\: 0 = f(x\!+\!1)-f(x) = 3\,(x^2\!+\!x).\:$ More generally we can compute gcds/resultants of $\rm\:f(x\!+\!n),\,f(x),\:$ and perform more complex eliminations using multivariate generalizations of the Euclidean algorithm (e.g. Gröbner bases), as in the sketch below. These techniques come in handy when attacking more difficult problems, e.g. Jacobson's commutativity theorem: $\rm\: x^{n} = x\:\Rightarrow\:xy = yx$.
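
Carrying out the evaluation at $2x$ explicitly (no unit element needed, and again $x$ commutes with itself):

$$0 \,=\, f(2x) \,=\, (2x)^3 - 2x \,=\, 8x^3 - 2x \,=\, 8x - 2x \,=\, 6x.$$

For the gcd remark, here is a minimal SymPy sketch (the names `f`, `f1` are mine, not part of the original argument): computing $\gcd(f(x), f(x\!+\!1)) = x^2+x$ recovers the common factor behind the identity $3(x^2\!+\!x) = 0$ derived above.

```python
from sympy import symbols, gcd, expand

x = symbols('x')
f  = x**3 - x                   # the ring identity f(x) = 0
f1 = expand(f.subs(x, x + 1))   # f(x+1) = x^3 + 3x^2 + 2x

print(f1 - f)      # 3*x**2 + 3*x  -> the identity 3(x^2 + x) = 0
print(gcd(f, f1))  # x**2 + x, the common factor of f(x) and f(x+1)
```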