History of the matrix representation of complex numbers

Solution 1:

This set of lecture notes from Wedderburn explicitly says that a complex scalar $\alpha + i\beta$ can be written as \begin{equation} \left(\begin{array}{lr} \alpha & -\beta \\ \beta & \alpha \end{array}\right) \end{equation} on page 101 of the notes (page 108 when viewed in a PDF viewer). These notes are from 1934, which is obviously only slightly earlier than your example. However, the notes themselves are based on lectures given at Princeton starting in 1920, and it would seem that this notation goes back to 1907, since in that year Wedderburn showed in his thesis that associative hypercomplex systems can be represented by matrices. I've been unable to find his thesis online to check whether this representation is written out explicitly, but I will update this post if I do.
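
For what it's worth, the claim is easy to check by machine. Here is a minimal sketch (Python with NumPy; the helper name `as_matrix` is just my own label, not Wedderburn's notation) verifying that the map from $\alpha + i\beta$ to the matrix above respects both addition and multiplication:

```python
import numpy as np

def as_matrix(z):
    """Send the complex scalar z = a + ib to the 2x2 real matrix [[a, -b], [b, a]]."""
    a, b = z.real, z.imag
    return np.array([[a, -b],
                     [b,  a]])

# Two arbitrary sample values; any complex numbers would do.
z, w = 3 + 4j, -1 + 2j

# The map preserves sums and products, i.e. it is a ring homomorphism
# from the complex numbers into 2x2 real matrices.
assert np.allclose(as_matrix(z) + as_matrix(w), as_matrix(z + w))
assert np.allclose(as_matrix(z) @ as_matrix(w), as_matrix(z * w))

# In particular, i itself maps to a matrix whose square is -I.
J = as_matrix(1j)
assert np.allclose(J @ J, -np.eye(2))
```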

Going back even further, in 1858 Arthur Cayley published "A Memoir on the Theory of Matrices" in which he mentions matrix representations of quaternions. Specifically, in item #45 on page 32 of the memoir (page 17 when viewed in a PDF viewer), he makes a passing mention of the fact that matrices $L$, $M$, and $N$ such that $L^2 = -1$, $M^2 = -1$, and $N = LM = -ML$ satisfy the same system of equations as the quaternions. I didn't see anything in the above paper by Cayley about representing complex numbers with matrices, though I've seen a few passing references to Cayley coming up with the idea in 1858, so it may be the consensus of the mathematical community that the credit should go to Cayley.
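
As a side note, Cayley's observation is also easy to check concretely. Below is a small sketch (same Python/NumPy setting as above; the particular $2 \times 2$ complex matrices are just one standard choice, essentially $i$ times the Pauli matrices, and I'm not claiming they are the ones Cayley wrote down) confirming that such $L$, $M$, $N$ satisfy the quaternion relations:

```python
import numpy as np

I2 = np.eye(2)

# One concrete pair L, M of 2x2 complex matrices with L^2 = M^2 = -I.
L = np.array([[1j, 0],
              [0, -1j]])
M = np.array([[0, 1],
              [-1, 0]], dtype=complex)
N = L @ M  # define N as the product LM

# The relations Cayley alludes to: these are exactly the quaternion relations,
# so (1, L, M, N) multiply like (1, i, j, k).
assert np.allclose(L @ L, -I2)
assert np.allclose(M @ M, -I2)
assert np.allclose(N @ N, -I2)
assert np.allclose(L @ M, -(M @ L))
```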

Solution 2:

Today was the first day of class in my complex analysis course. I sometimes attempt new derivations in real time to keep it fresh. Today, we got to the point of asking what the reciprocal of $z=x+iy$ is. We let $w=a+ib$ and sought solutions of $wz=1$. This gives: $$ wz = (a+ib)(x+iy) = ax-by+i(bx+ay) = 1+i(0).$$ Equating real and imaginary parts reveals $$ ax-by = 1 \qquad \& \qquad bx+ay = 0, $$ a system of linear equations with matrix form $$ \left[ \begin{array}{cc} x & -y \\ y & x \end{array}\right]\left[ \begin{array}{c} a \\ b \end{array}\right] =\left[ \begin{array}{c} 1 \\ 0 \end{array}\right]. $$ We solve for $[a,b]^T$ by multiplying by the inverse of the $2 \times 2$ matrix, for which we have the handy-dandy formula $\displaystyle \left[ \begin{array}{cc} x & -y \\ y & x \end{array}\right]^{-1} = \frac{1}{x^2+y^2}\left[ \begin{array}{cc} x & y \\ -y & x \end{array}\right]$. Thus, $$ \left[ \begin{array}{c} a \\ b \end{array}\right] = \frac{1}{x^2+y^2}\left[ \begin{array}{cc} x & y \\ -y & x \end{array}\right]\left[ \begin{array}{c} 1 \\ 0 \end{array}\right] = \frac{1}{x^2+y^2}\left[ \begin{array}{c} x \\ -y \end{array}\right].$$ Therefore, $a = \frac{x}{x^2+y^2}$ and $b = \frac{-y}{x^2+y^2}$, so $$\frac{1}{z}= \frac{x-iy}{x^2+y^2}.$$

I just found this a nice illustration of JHance's comment: in this routine calculation we stumble upon the $2 \times 2$ representations of both $z=x+iy$ and $1/z$. So perhaps the real question to ask is not when the matrix representation was first given, but simply when the algebra of small matrices was first known. I gather from yoknapatawpha's post of Cayley's 1858 paper that it may go back a few years before that work. Apparently the term "matrix" is Latin for "womb" and is due to Sylvester in 1850, as you may read at history of matrices. This makes me think there may exist some improvement on the Cayley answer.
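
If it helps, the same calculation goes through numerically in a few lines. Here is a sketch (Python with NumPy; the sample point $z = 3 + 4i$ is arbitrary and nothing depends on that choice) that solves the $2 \times 2$ system above and compares with the formula for $1/z$:

```python
import numpy as np

# An arbitrary sample point z = x + iy (any nonzero choice works).
x, y = 3.0, 4.0

# The 2x2 matrix representing z, and the right-hand side [1, 0]^T representing 1.
A = np.array([[x, -y],
              [y,  x]])
a, b = np.linalg.solve(A, np.array([1.0, 0.0]))

# The solution agrees with 1/z = (x - iy)/(x^2 + y^2).
assert np.allclose(a + 1j * b, 1 / (x + 1j * y))
print(a, b)  # approximately 0.12 -0.16 for z = 3 + 4i
```

The point is just that computing the reciprocal is literally a $2 \times 2$ linear solve, which is why the matrix representation falls out of the routine calculation so naturally.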