What does it mean to represent a number in terms of a $2\times2$ matrix?

Solution 1:

Your friend meant that all complex numbers can be represented by such matrices.

$$a+bi = \begin{pmatrix} a & -b \\ b & a \end{pmatrix}$$

Adding complex numbers matches adding such matrices and multiplying complex numbers matches multiplying such matrices.

This means that the collection of matrices:

$$R = \left\{ \begin{pmatrix} a & -b \\ b & a \end{pmatrix} \;\Bigg|\; a,b \in \mathbb{R} \right\}$$

is "isomorphic" to the field of complex numbers.

Specifically,

$$i = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$$

Notice that for this matrix $i^2=-I_2=-1$. :)
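Here is a quick numerical sanity check of this correspondence, sketched in Python with NumPy (the helper name `to_matrix` is just for illustration):

```python
import numpy as np

def to_matrix(z):
    """Encode the complex number a + bi as the real 2x2 matrix [[a, -b], [b, a]]."""
    return np.array([[z.real, -z.imag], [z.imag, z.real]])

i = to_matrix(1j)                      # [[0, -1], [1, 0]]
assert np.allclose(i @ i, -np.eye(2))  # i^2 = -I, the encoding of -1

z1, z2 = 3 + 4j, 1 - 2j
# Matrix addition and multiplication match complex addition and multiplication:
assert np.allclose(to_matrix(z1) + to_matrix(z2), to_matrix(z1 + z2))
assert np.allclose(to_matrix(z1) @ to_matrix(z2), to_matrix(z1 * z2))
```

The two `assert`s at the end are precisely the statement that the map $a+bi \mapsto \begin{pmatrix} a & -b \\ b & a \end{pmatrix}$ is a ring homomorphism.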

How does this help?

It allows you to construct the complex numbers from matrices over the reals. This allows you to get at some properties of the complex numbers via linear algebra.

For example: the squared modulus of a complex number is $|a+bi|^2=a^2+b^2$, which is exactly the determinant of such a matrix. Since the determinant of a product is the product of the determinants, $|z_1z_2|^2=|z_1|^2\cdot|z_2|^2$, and taking square roots gives $|z_1z_2|=|z_1|\cdot |z_2|$ for any two complex numbers $z_1$ and $z_2$.

Another nice tie: transposing matches conjugation. :)
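Both of these ties are easy to check numerically. A small sketch (again assuming the illustrative `to_matrix` encoding):

```python
import numpy as np

def to_matrix(z):
    """Encode a + bi as [[a, -b], [b, a]]."""
    return np.array([[z.real, -z.imag], [z.imag, z.real]])

z = 3 + 4j
M = to_matrix(z)
assert np.isclose(np.linalg.det(M), abs(z) ** 2)    # det = |z|^2 = 25
assert np.allclose(M.T, to_matrix(z.conjugate()))   # transpose = conjugation

# det(AB) = det(A)det(B) translates to |z1 z2| = |z1| |z2|:
z1, z2 = 2 - 1j, -1 + 5j
assert np.isclose(abs(z1 * z2), abs(z1) * abs(z2))
```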

Edit: As per request, a little about Euler's formula.

The exponential function can be defined in a number of ways. One nice way is via its Maclaurin series: $e^x = 1+x+\frac{x^2}{2!}+\cdots$. If you start thinking of $x$ as some sort of indeterminate, you might start to ask, "What can I plug into this series?" It turns out that the series $$e^A = I+A+\frac{A^2}{2!}+\frac{A^3}{3!}+\cdots$$ converges for any square matrix $A$ (you have to make sense of "a convergent series of matrices").

Consider a "real" number, $x$, encoded as one of our matrices: $$x=\begin{pmatrix} x & 0 \\ 0 & x \end{pmatrix} \quad \mbox{then} \quad e^x = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} + \begin{pmatrix} x & 0 \\ 0 & x \end{pmatrix} + \begin{pmatrix} x^2/2 & 0 \\ 0 & x^2/2 \end{pmatrix} + \cdots$$ $$= \begin{pmatrix} 1+x+x^2/2+\cdots & 0 \\ 0 & 1+x+x^2/2+\cdots \end{pmatrix} = \begin{pmatrix} e^x & 0 \\ 0 & e^x \end{pmatrix} = e^x$$

So (no surprise) the matrix exponential and the good old real exponential do the same thing.

Now one can ask, "What does the exponential of a complex number get you?" It turns out that... $$\mbox{Given } a+bi = \begin{pmatrix} a & -b \\ b & a \end{pmatrix} \quad \mbox{then} \quad e^{a+bi} = \begin{pmatrix} e^a\cos(b) & -e^a\sin(b) \\ e^a\sin(b) & e^a\cos(b) \end{pmatrix}$$ ...deriving this takes some intermediate linear algebra.

Anyway accepting that, we have found that $e^{a+bi} = e^a(\cos(b)+i\sin(b))$. In particular, $$e^{i\theta} = \begin{pmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{pmatrix}$$ So that $$e^{i\pi} = \begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix} = -1$$

Seen this way, complex exponentiation with a purely imaginary exponent yields a rotation matrix, which leads us down the path of identifying complex arithmetic with two-dimensional geometric transformations.

Of course, there are many other ways to arrive at these various relationships. The matrix route is not the fastest or easiest, but it is an interesting one to contemplate.

I hope that helps a little bit. :)

Solution 2:

Let $$ I=\begin{bmatrix}1&0\\0&1\end{bmatrix}\tag{1} $$ and $$ J=\begin{bmatrix}0&-1\\1&0\end{bmatrix}\tag{2} $$

Since $I$ is the identity matrix, multiplying by it on either the left or the right leaves every matrix untouched. Thus, $I$ has the properties of $1$.

The key fact here is that $J^2=-I$. If $I$ represents $1$ then $J$ would represent $i$.

As alluded to earlier, $I$ commutes with all matrices; in particular $J$. That is, $IJ=J=JI$.

Scalar and matrix multiplication distribute over matrix addition. Therefore, $$ (xI+yJ)+(uI+vJ)=(x+u)I+(y+v)J\tag{3} $$ and $$ (xI+yJ)(uI+vJ)=(xu-yv)I+(xv+yu)J\tag{4} $$ With $I$ representing $1$ and $J$ representing $i$, $(3)$ and $(4)$ correspond exactly with complex addition and multiplication.
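Formulas $(3)$ and $(4)$ can be checked numerically for any sample values; a minimal NumPy sketch:

```python
import numpy as np

I2 = np.eye(2)
J = np.array([[0.0, -1.0], [1.0, 0.0]])

assert np.allclose(J @ J, -I2)      # the key fact: J^2 = -I

x, y, u, v = 2.0, 3.0, -1.0, 4.0
# Formula (3): addition works componentwise
assert np.allclose((x*I2 + y*J) + (u*I2 + v*J), (x + u)*I2 + (y + v)*J)
# Formula (4): multiplication matches (x+yi)(u+vi) = (xu-yv) + (xv+yu)i
assert np.allclose((x*I2 + y*J) @ (u*I2 + v*J), (x*u - y*v)*I2 + (x*v + y*u)*J)
```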

Since analytic functions can be written as series involving addition and multiplication of complex numbers, those functions can be translated directly to corresponding functions involving $I$ and $J$.

For example $$ \begin{align} \exp(xI+yJ) &=\sum_{n=0}^\infty\frac{(xI+yJ)^n}{n!}\\ &=\sum_{n=0}^\infty\sum_{k=0}^n\frac1{n!}\binom{n}{k}(xI)^{n-k}(yJ)^k\\ &=\sum_{n=0}^\infty\sum_{k=0}^n\frac{(xI)^{n-k}}{(n-k)!}\frac{(yJ)^k}{k!}\\ &=\sum_{n=0}^\infty\frac{(xI)^n}{n!}\sum_{k=0}^\infty\frac{(yJ)^k}{k!}\\ &=\sum_{n=0}^\infty\frac{(xI)^n}{n!}\left(\sum_{k=0}^\infty\frac{(yJ)^{2k}}{(2k)!}+\sum_{k=0}^\infty\frac{(yJ)^{2k+1}}{(2k+1)!}\right)\\ &=\sum_{n=0}^\infty\frac{x^n}{n!}I\left(\sum_{k=0}^\infty(-1)^k\frac{y^{2k}}{(2k)!}I+\sum_{k=0}^\infty(-1)^k\frac{y^{2k+1}}{(2k+1)!}J\right)\\[9pt] &=e^xI(\cos(y)I+\sin(y)J)\\[15pt] &=e^x\cos(y)I+e^x\sin(y)J\tag{5} \end{align} $$ We can rewrite $(5)$ as $$ \exp\left(\begin{bmatrix}x&-y\\y&x\end{bmatrix}\right) =\begin{bmatrix}e^x\cos(y)&-e^x\sin(y)\\e^x\sin(y)&e^x\cos(y)\end{bmatrix}\tag{6} $$
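Formula $(6)$ can likewise be confirmed numerically by summing the exponential series directly (the helper `mat_exp` below is illustrative; only NumPy is assumed):

```python
import numpy as np

def mat_exp(A, terms=40):
    """Matrix exponential via the truncated series I + A + A^2/2! + ..."""
    result, term = np.eye(2), np.eye(2)
    for n in range(1, terms):
        term = term @ A / n
        result = result + term
    return result

x, y = 0.5, 2.0
M = np.array([[x, -y], [y, x]])                    # encodes x + yi
expected = np.exp(x) * np.array([[np.cos(y), -np.sin(y)],
                                 [np.sin(y),  np.cos(y)]])
assert np.allclose(mat_exp(M), expected)           # formula (6)
```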


In this fashion, we can reformulate almost any formula involving complex numbers in terms of matrices using the isomorphism $$ x+yi\leftrightarrow\overbrace{\begin{bmatrix}x&-y\\y&x\end{bmatrix}}^{xI+yJ}\tag{7} $$ For example, the complex conjugate is represented by the transpose $$ \overline{x+yi}=x-yi\leftrightarrow\begin{bmatrix}x&y\\-y&x\end{bmatrix}=\begin{bmatrix}x&-y\\y&x\end{bmatrix}^T\tag{8} $$ and the square of the absolute value is represented by the determinant times $I$ $$ \begin{align} |x+yi|^2 =(x+yi)\overbrace{(x-yi)\vphantom{\begin{bmatrix}x\\y\end{bmatrix}}}^{\text{conjugate}} &\leftrightarrow\begin{bmatrix}x&-y\\y&x\end{bmatrix} \overbrace{\begin{bmatrix}x&y\\-y&x\end{bmatrix}}^{\text{transpose}}\\ &=\begin{bmatrix}x^2+y^2&0\\0&x^2+y^2\end{bmatrix}\\ &=\det\begin{bmatrix}x&-y\\y&x\end{bmatrix}I\tag{9} \end{align} $$ Not surprisingly, the reciprocal is represented by the matrix inverse $$ \begin{array}{c} \dfrac1{x+yi}&=&\dfrac{x-yi}{|x+yi|^2}\\ \updownarrow&&\updownarrow\\ \begin{bmatrix}x&-y\\y&x\end{bmatrix}^{-1} &=&\begin{bmatrix}x&y\\-y&x\end{bmatrix}\left(\det\begin{bmatrix}x&-y\\y&x\end{bmatrix}I\right)^{-1}\tag{10} \end{array} $$
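Formulas $(8)$–$(10)$ are also easy to verify for a sample value; a short sketch using the illustrative `to_matrix` encoding from $(7)$:

```python
import numpy as np

def to_matrix(z):
    """Encode x + yi as [[x, -y], [y, x]], the isomorphism in (7)."""
    return np.array([[z.real, -z.imag], [z.imag, z.real]])

z = 3 + 4j
M = to_matrix(z)
# Formula (10): the matrix inverse represents the complex reciprocal...
assert np.allclose(np.linalg.inv(M), to_matrix(1 / z))
# ...and equals transpose / det, mirroring 1/z = conj(z) / |z|^2
assert np.allclose(np.linalg.inv(M), M.T / np.linalg.det(M))
```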