Motivation for the ring product rule $(a_1, a_2, a_3) \cdot (b_1, b_2, b_3) = (a_1 \cdot b_1, a_2 \cdot b_2, a_1 \cdot b_3 + a_3 \cdot b_2)$

In a lecture, our professor gave an example of a ring. He took it from another source and mentioned that he does not know the motivation behind the chosen operation.

Of course, it's likely that somebody just invented an arbitrary operation satisfying the ring axioms. Still, I'd like to try my luck and ask whether anyone here can decipher the operation and offer some kind of motivation for this example.

On $\mathbb{R}^3$ define the operations $+$ and $\cdot$ by $$ \begin{aligned} (a_1, a_2, a_3) + (b_1,b_2,b_3) &= (a_1+b_1,a_2+b_2,a_3+b_3) \\ (a_1, a_2, a_3) \cdot (b_1, b_2, b_3) &= (a_1 \cdot b_1, a_2 \cdot b_2, a_1 \cdot b_3 + a_3 \cdot b_2). \end{aligned} $$ (The $+$ and $\cdot$ operations on the right side are the usual addition and multiplication from $\mathbb{R}$.) With those operations, one can confirm that $\left(\mathbb{R}^3, +, \cdot \right)$ is a ring.
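
For anyone who wants to confirm this quickly, here is a small numerical sanity check. This is only a sketch: the helper names `add`, `mul`, `close` and the random-sampling approach are mine, not from the lecture; the check also verifies that $(1,1,0)$ acts as the multiplicative identity.

```python
# Numerical sanity check of the ring axioms for the operations above (a sketch).
import random

def add(a, b):
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def mul(a, b):
    return (a[0] * b[0], a[1] * b[1], a[0] * b[2] + a[2] * b[1])

def close(x, y, tol=1e-9):
    return all(abs(u - v) < tol for u, v in zip(x, y))

random.seed(0)
for _ in range(1000):
    a, b, c = (tuple(random.uniform(-5, 5) for _ in range(3)) for _ in range(3))
    assert close(mul(mul(a, b), c), mul(a, mul(b, c)))          # associativity
    assert close(mul(a, add(b, c)), add(mul(a, b), mul(a, c)))  # left distributivity
    assert close(mul(add(a, b), c), add(mul(a, c), mul(b, c)))  # right distributivity
    assert close(mul(a, (1, 1, 0)), a) and close(mul((1, 1, 0), a), a)  # (1,1,0) is the unit
```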


Solution 1:

This is just matrix multiplication in disguise. Specifically, if you identify $(a_1,a_2,a_3)$ with the matrix $\begin{pmatrix}a_1 & a_3 \\ 0 & a_2\end{pmatrix}$, these operations are the usual matrix operations: $$\begin{pmatrix}a_1 & a_3 \\ 0 & a_2\end{pmatrix}+\begin{pmatrix}b_1 & b_3 \\ 0 & b_2\end{pmatrix}=\begin{pmatrix}a_1+b_1 & a_3+b_3 \\ 0 & a_2+b_2\end{pmatrix}$$ $$\begin{pmatrix}a_1 & a_3 \\ 0 & a_2\end{pmatrix}\begin{pmatrix}b_1 & b_3 \\ 0 & b_2\end{pmatrix}=\begin{pmatrix}a_1b_1 & a_1b_3+a_3b_2 \\ 0 & a_2b_2\end{pmatrix}$$
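
To make the identification concrete, here is a short symbolic check that the coordinate formula from the question is exactly this matrix product. It is just a sketch using SymPy; the helper name `to_matrix` is mine.

```python
# Symbolic check: (a1, a2, a3) -> [[a1, a3], [0, a2]] turns the coordinate
# product rule into ordinary 2x2 matrix multiplication (a sketch).
import sympy as sp

a1, a2, a3, b1, b2, b3 = sp.symbols('a1 a2 a3 b1 b2 b3')

def to_matrix(x1, x2, x3):
    return sp.Matrix([[x1, x3], [0, x2]])

A, B = to_matrix(a1, a2, a3), to_matrix(b1, b2, b3)

# The coordinate formula from the question, written as a matrix.
expected = to_matrix(a1 * b1, a2 * b2, a1 * b3 + a3 * b2)

assert sp.simplify(A * B - expected) == sp.zeros(2, 2)
print(A * B)  # Matrix([[a1*b1, a1*b3 + a3*b2], [0, a2*b2]])
```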

Solution 2:

It is isomorphic to the ring of matrices

$$ \left\{\begin{bmatrix}a_1&a_3\\0&a_2\end{bmatrix}\,\middle|\,a_1, a_2,a_3\in \mathbb R\right\} $$

It's a semiprimary ring whose Jacobson radical $J(R)$ is the subset with $a_1=a_2=0$. The Jacobson radical is nilpotent (in fact $J(R)^2=0$), and $R/J(R)\cong\mathbb R\times\mathbb R$. Triangular matrix rings of this kind have a long list of further nice properties.
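
Here is a small symbolic check of these structural claims. It is only a sketch: the `mul` helper just encodes the product rule from the question, and the checks verify that $J(R)$ squares to zero, absorbs multiplication, and that projecting onto the first two coordinates is multiplicative.

```python
# Sketch: J = {(0, 0, t)} squares to zero, is an ideal, and R/J looks like R x R.
import sympy as sp

a1, a2, a3, b1, b2, b3, s, t = sp.symbols('a1 a2 a3 b1 b2 b3 s t')

def mul(a, b):
    return (a[0] * b[0], a[1] * b[1], a[0] * b[2] + a[2] * b[1])

# J^2 = 0: the product of two radical elements vanishes identically.
assert mul((0, 0, s), (0, 0, t)) == (0, 0, 0)

# Multiplying a radical element by anything lands back in J (so J is an ideal).
assert mul((a1, a2, a3), (0, 0, t))[:2] == (0, 0)
assert mul((0, 0, s), (b1, b2, b3))[:2] == (0, 0)

# The projection (a1, a2, a3) -> (a1, a2) is multiplicative, giving R/J = R x R.
prod = mul((a1, a2, a3), (b1, b2, b3))
assert (prod[0], prod[1]) == (a1 * b1, a2 * b2)
```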

This sort of ring is fairly famous and has nice interpretations. One of them: if you select a chain of subspaces $\{0\}<V<\mathbb R\times \mathbb R$ (with $V$ of dimension $1$), then the ring of linear transformations of $\mathbb R\times\mathbb R$ which stabilize this chain is isomorphic to this triangular matrix ring. Here $\phi$ stabilizes the chain if $\phi(V)\subseteq V$.
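
To see why stabilizing such a chain forces upper triangular matrices, here is a tiny symbolic computation (a sketch, taking $V=\operatorname{span}(e_1)$; the variable names are mine):

```python
# Sketch: a generic 2x2 matrix sends span(e1) into itself exactly when its
# lower-left entry vanishes, i.e. exactly when it is upper triangular.
import sympy as sp

p, q, r, s = sp.symbols('p q r s')
M = sp.Matrix([[p, q], [r, s]])
e1 = sp.Matrix([1, 0])

image = M * e1
# M(V) is contained in V iff M*e1 is a multiple of e1, i.e. iff its second
# coordinate is zero; that second coordinate is precisely the entry r.
assert image[1] == r
print(image)  # Matrix([[p], [r]])
```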

Incidentally, you are always going to be able to extract some sort of matrix presentation for a multiplication like the one you are describing, because it makes $\mathbb R^3$ into a finite-dimensional algebra. If it really is a valid ring multiplication, it distributes over addition, and here it is visibly $\mathbb R$-bilinear, so you can work out what a sensible basis is and then deduce what the multiplication looks like in terms of matrices.
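
For instance, one mechanical way to do this is to take the standard basis of $\mathbb R^3$ and compute the matrix of left multiplication by a generic element (the left regular representation). This is a sketch of that idea, not the $2\times 2$ presentation above; the helper name `mul` is mine.

```python
# Sketch: since the multiplication is bilinear, left multiplication by a fixed
# element a is a linear map R^3 -> R^3; we read off its matrix in the standard
# basis by differentiating a*b with respect to the coordinates of b.
import sympy as sp

a1, a2, a3 = sp.symbols('a1 a2 a3')
b = sp.Matrix(sp.symbols('b1 b2 b3'))

def mul(a, x):
    return sp.Matrix([a[0] * x[0], a[1] * x[1], a[0] * x[2] + a[2] * x[1]])

# Jacobian of a*b with respect to b = matrix of "left multiplication by a".
L_a = mul((a1, a2, a3), b).jacobian(b)
print(L_a)
# Matrix([[a1, 0, 0], [0, a2, 0], [0, a3, a1]])
```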