Proving that $\mathbb R^3$ cannot be made into a real division algebra (and that extending complex multiplication would not work)

I am trying to solve the following exercise:

Prove that complex multiplication does not extend to a multiplication on $\mathbb R^3$ so as to make $\mathbb R^3$ into a real division algebra.

I am aware of Frobenius' theorem that there are only three finite-dimensional real associative division algebras. But the exercise would be pointless if the theorem were assumed.

Let $x,y \in \mathbb R^3$. Since $x y$ has to extend the complex multiplication:

$$xy = (x_1y_1-x_2y_2, x_1y_2 + x_2 y_1, ?)$$

I have no idea how to proceed from here. Could someone tell me how to do this? It should be easy, as the other exercises I have done so far were also easy.


Solution 1:

Assume that $D$ is a 3-dimensional division algebra over $\Bbb{R}$. Let $a\in D\setminus\Bbb{R}$. Consider the linear mapping $\rho_a:z\mapsto az, z\in D,$ from $D$ to itself, and let $M$ be the matrix representing $\rho_a$ (with respect to some basis). The characteristic polynomial $$ \chi_a(x)=\det(xI_3-M)\in\Bbb{R}[x] $$ is monic of degree three, so $\lim_{x\to-\infty}\chi_a(x)=-\infty$ and $\lim_{x\to+\infty}\chi_a(x)=+\infty$. Therefore, by continuity of $\chi_a(x)$, there exists a real number $r$ such that $\chi_a(r)=0$. This means that the mapping $$ L:D\to D,\quad z\mapsto az-rz=(a-r)z $$ has a non-trivial kernel, so $a-r$ kills a nonzero element and is a zero divisor. Because $a\notin\Bbb{R}$ we have $a-r\neq0_D$. This contradicts the assumption that $D$ is a division algebra.
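The eigenvalue step can be sanity-checked numerically. Below is a small sketch (my addition, using numpy); an arbitrary real $3\times3$ matrix stands in for $M$, since the argument uses nothing else about it: the characteristic polynomial has odd degree, so a real eigenvalue $r$ exists, and $M-rI_3$ then has a nonzero kernel vector.

```python
import numpy as np

rng = np.random.default_rng(0)

# Left multiplication by a fixed element a of a 3-dimensional R-algebra is
# R-linear, hence given by some real 3x3 matrix M.  The argument above uses
# only that M is real and 3x3, so any such matrix will do for the check.
M = rng.standard_normal((3, 3))

# det(xI - M) is monic of odd degree 3, so M has a real eigenvalue r.
eigvals = np.linalg.eigvals(M)
real_roots = [ev.real for ev in eigvals if abs(ev.imag) < 1e-9]
assert real_roots, "a real 3x3 matrix always has a real eigenvalue"
r = real_roots[0]

# M - rI is singular: the map z -> (a - r)z has a nontrivial kernel.
# Exhibit a kernel vector via the SVD (right singular vector for the
# smallest singular value).
U, s, Vt = np.linalg.svd(M - r * np.eye(3))
z = Vt[-1]
print(np.linalg.norm((M - r * np.eye(3)) @ z))  # numerically ~ 0
```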


Edit: The above was a bit of overkill for the task at hand (=proving that complex multiplication cannot be extended to a 3D-space). A simpler argument follows.

If the multiplication of $D$ is an extension of the multiplication of $\Bbb{C}$, then $D$ has a subalgebra isomorphic to $\Bbb{C}$. Therefore $D$ has the structure of a (left) vector space over $\Bbb{C}$, and it is finite dimensional as such. But this implies that the dimension of $D$ as a vector space over $\Bbb{R}$ is an even number, contradicting $\dim_{\Bbb R}D=3$.
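The evenness can also be seen with determinants: if the multiplication extended that of $\Bbb C$, left multiplication by $i$ would be a real matrix $J$ with $J^2=-I_n$, forcing $\det(J)^2=\det(-I_n)=(-1)^n$, so $n$ must be even. A small numpy sketch of this obstruction (my addition):

```python
import numpy as np

# If multiplication on R^n extends complex multiplication, then left
# multiplication by i is a real n x n matrix J with J @ J = -I.
# Taking determinants: det(J)^2 = det(-I) = (-1)^n, so n must be even.

# In R^2 such a J exists (rotation by 90 degrees):
J = np.array([[0.0, -1.0], [1.0, 0.0]])
assert np.allclose(J @ J, -np.eye(2))
assert np.isclose(np.linalg.det(J) ** 2, (-1.0) ** 2)

# In R^3 one would need det(J)^2 = (-1)**3 = -1, but det(J) is real and
# det(J)^2 >= 0 -- so no such J, and no such extension, exists.
assert (-1.0) ** 3 < 0 <= np.linalg.det(J) ** 2
```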

Solution 2:

Here's a basic (linear) algebraic argument:

Lemma: Let $A$ be an $\mathbb R$-algebra containing an element $i_A$ such that $i_A^2=-1$. Then $A$ cannot have dimension $3$.

Proof: Assume it does. Since $1, i_A$ are obviously $\mathbb R$-linearly independent, we can extend them to a basis $\{1, i_A, j\}$ of $A$. Now the simple but crucial question is: What is the product $i_A \cdot j$ in this three-dimensional space?

Write $i_A \cdot j= a + bi_A+cj$ with unique $a,b,c \in \mathbb R$. By associativity (and everything commuting with $\mathbb R$),

$$\begin{aligned} -j = i_A \cdot (i_A \cdot j) &= i_A(a+bi_A+cj) = a i_A + b\,i_A^2 + c\,(i_A j)\\ &= -b+ai_A+ c(a + bi_A+cj)\\ &= (ca-b) + (cb+a)\, i_A + c^2 j. \end{aligned}$$

But since coordinates with respect to a basis are unique, this implies $c^2=-1$, which is impossible for $c\in \mathbb R$. QED.
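The computation in the proof can be replayed mechanically. Here is a short Python sketch (my addition) that encodes multiplication by $i_A$ in coordinates with respect to $\{1, i_A, j\}$ and checks that the $j$-coordinate of $i_A\cdot(i_A\cdot j)$ is always $c^2$, which can never equal the required $-1$:

```python
# Coordinates w.r.t. the basis {1, i_A, j}: x0 + x1*i_A + x2*j is the triple
# (x0, x1, x2), and the unknown product is i_A*j = a + b*i_A + c*j.
def mul_i(v, a, b, c):
    x0, x1, x2 = v
    # i_A*(x0 + x1*i_A + x2*j) = -x1 + x0*i_A + x2*(a + b*i_A + c*j)
    return (-x1 + a * x2, x0 + b * x2, c * x2)

# Whatever real values (a, b, c) take, the j-coordinate of i_A*(i_A*j) is c*c:
for (a, b, c) in [(0.0, 0.0, 1.0), (2.0, -3.0, 0.5), (1.0, 1.0, -4.0)]:
    ij = mul_i((0.0, 0.0, 1.0), a, b, c)   # i_A * j
    iij = mul_i(ij, a, b, c)               # i_A * (i_A * j)
    assert iij[2] == c * c
    # Associativity would force i_A*(i_A*j) = (i_A*i_A)*j = -j, whose
    # j-coordinate is -1; but c*c >= 0 for every real c.
    assert iij[2] >= 0.0
```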


Looking at this very easy lemma, maybe you think what I first thought: "Wait a second, where's the mistake? Obviously $A := \mathbb C \times \mathbb R$ will do." No, it doesn't! The unit $1$ of that $A$ is $(1_\mathbb C, 1_\mathbb R)$, and there is no element of this algebra which squares to $-1$: e.g. the element $(i,0)$ squares to $(-1,0) \neq -(1,1)$. (More formally, $\mathbb C$ is not a subalgebra of $\mathbb C \times \mathbb R$, since they do not share the same unit.)
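The failed counterexample can be checked directly; a few lines of Python (my addition), with elements of $\mathbb C \times \mathbb R$ as pairs under componentwise multiplication:

```python
# Elements of C x R as pairs (z, t), multiplied componentwise.
def mul(p, q):
    return (p[0] * q[0], p[1] * q[1])

one = (1 + 0j, 1.0)            # the unit of C x R
minus_one = (-1 + 0j, -1.0)    # -1 = -(1,1) in this algebra

sq = mul((1j, 0.0), (1j, 0.0))  # (i, 0)^2
assert sq == (-1 + 0j, 0.0)     # squares to (-1, 0) ...
assert sq != minus_one          # ... which is NOT -1 of the algebra
# More generally (z, t)^2 = (z^2, t^2), and t^2 = -1 has no real solution.
```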

But notice that with a little more algebra, one also has

Lemma: Let $A$ be a finite dimensional $\mathbb R$-algebra. Then for any $a \in A \setminus \mathbb R$, the subalgebra $\mathbb R[a] \subset A$ generated by $a$ either contains zero divisors, or is isomorphic to $\mathbb C$. In particular, if it contains no zero divisor, it contains an element $i_a$ such that $i_a^2 =-1$.

The proof needs just the Chinese Remainder Theorem, the factorisation of real polynomials into linear and quadratic factors, and the fact that $\mathbb C$ is the only proper finite field extension of $\mathbb R$.
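For the last assertion of the lemma, the square root of $-1$ can be written down explicitly: if $a$ satisfies an irreducible quadratic $x^2+px+q$ (so $p^2-4q<0$), completing the square gives $i_a=(2a+p)/\sqrt{4q-p^2}$ with $i_a^2=-1$. A numpy sketch of this formula (my addition), with $a = 1+2i$ realized as a real $2\times2$ matrix:

```python
import numpy as np

# a = 1 + 2i as a real 2x2 matrix; its minimal polynomial is
# x^2 + p*x + q = x^2 - 2*x + 5, irreducible over R since p^2 - 4*q < 0.
a = np.array([[1.0, -2.0], [2.0, 1.0]])
p, q = -2.0, 5.0
assert np.allclose(a @ a + p * a + q * np.eye(2), 0)   # a satisfies its min poly
assert p * p - 4 * q < 0                               # irreducible quadratic

# Completing the square: (2a + p)^2 = 4a^2 + 4pa + p^2 = p^2 - 4q < 0,
# so i_a = (2a + p) / sqrt(4q - p^2) squares to -1.
i_a = (2 * a + p * np.eye(2)) / np.sqrt(4 * q - p * p)
assert np.allclose(i_a @ i_a, -np.eye(2))
```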

The two lemmata together imply that every $3$-dimensional $\mathbb R$-algebra contains zero divisors.


Note that when Hamilton invented the quaternions, he did not have linear algebra at his disposal, not even something like the first lemma above, which today is an easy exercise for first-year students. (He was delighted later when he read the works of Grassmann, which paved the way for modern linear algebra.) I imagine he knew that whatever he was looking for needed to contain two different square roots of $-1$, which he called $i$ and $j$, and then he fiddled for a long time with the problem of what

$$ij$$

was supposed to be. The first lemma above shows quite easily that whatever this element is, it simply does not "fit" into the three-dimensional space $\mathbb R + \mathbb R i +\mathbb R j$. One can imagine his delight when he realized he needed to make it a fourth basis vector $k := ij$, and suddenly everything works (as long as $ij=-ji$)!
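Hamilton's resolution is easy to verify by machine today; a minimal Python sketch of the quaternion product (my addition), checking $ij=k$ and $ji=-k$:

```python
# Quaternions as 4-tuples (t, x, y, z) = t + x*i + y*j + z*k, with Hamilton's
# rules i^2 = j^2 = k^2 = -1 and ij = k = -ji.
def qmul(p, q):
    t1, x1, y1, z1 = p
    t2, x2, y2, z2 = q
    return (t1*t2 - x1*x2 - y1*y2 - z1*z2,
            t1*x2 + x1*t2 + y1*z2 - z1*y2,
            t1*y2 - x1*z2 + y1*t2 + z1*x2,
            t1*z2 + x1*y2 - y1*x2 + z1*t2)

i = (0.0, 1.0, 0.0, 0.0)
j = (0.0, 0.0, 1.0, 0.0)
k = (0.0, 0.0, 0.0, 1.0)

assert qmul(i, j) == k                       # ij = k: the fourth basis vector
assert qmul(j, i) == (0.0, 0.0, 0.0, -1.0)   # ji = -k: anticommutativity
assert qmul(i, i) == (-1.0, 0.0, 0.0, 0.0)   # i^2 = -1
```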