What allows us to use imaginary numbers?

We can do anything we want!

Specifically, we can define anything we want (as long as our definitions don't contradict each other). So if we want to allow ourselves to use imaginary numbers, all we have to do is write something like the following:

Define a complex number as an ordered pair of the form $(a, b)$, where $a$ and $b$ are real numbers.

Define $i$ as the complex number $(0, 1)$.

If $(a, b)$ and $(c, d)$ are complex numbers, define $(a, b) + (c, d)$ as $(a + c, b + d)$.

If $(a, b)$ and $(c, d)$ are complex numbers, define $(a, b) \cdot (c, d)$ as $(ac - bd, ad + bc)$.

And define subtraction and division in similar ways.
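
To see that these definitions really are just bookkeeping on ordered pairs, here is a minimal sketch in Python. (The class `Complex` is purely illustrative; Python already ships a built-in `complex` type.)

```python
# A minimal sketch: complex numbers as ordered pairs of reals.
from dataclasses import dataclass

@dataclass(frozen=True)
class Complex:
    a: float  # first coordinate ("real part")
    b: float  # second coordinate ("imaginary part")

    def __add__(self, other):
        # (a, b) + (c, d) = (a + c, b + d)
        return Complex(self.a + other.a, self.b + other.b)

    def __mul__(self, other):
        # (a, b) * (c, d) = (ac - bd, ad + bc)
        return Complex(self.a * other.a - self.b * other.b,
                       self.a * other.b + self.b * other.a)

i = Complex(0, 1)
print(i * i)  # Complex(a=-1, b=0): the pair playing the role of -1
```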

Is that it? Are we done? No, there's still more that we want to do. There are a lot of useful theorems about real numbers that also apply to the complex numbers, but we don't know that they apply to the complex numbers until we prove them. For example, one very useful theorem about real numbers is:

Theorem: If $a$ and $b$ are real numbers, then $a + b = b + a$.

The analogous theorem about the complex numbers is:

Theorem (not yet proven): If $a$ and $b$ are complex numbers, then $a + b = b + a$.

This theorem is, in fact, true, but we didn't know that it was true until somebody proved it.
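For this particular theorem the proof is short, because it reduces to the commutativity of real addition:

$$(a, b) + (c, d) = (a + c,\ b + d) = (c + a,\ d + b) = (c, d) + (a, b),$$

where the middle step is exactly where the real-number theorem gets used.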

Once we've proved all the theorems that we want to prove, then we can say that we're "done."

(Do we have to prove these theorems? No, we don't have to if we don't want to. But without these theorems, complex numbers aren't very useful.)


As for your corollary question:

Could I define a new number $z$ which is $1/0$ and simply begin using it? Seems ludicrous.

Yes, you absolutely can! All you have to do is write:

Assume that there is a value $z$. Define $1/0$ as $z$.

And that's perfectly valid; this definition doesn't contradict any other definitions. This is completely legal, acceptable and proper.

Is that it? Are we done? Probably not; there's more we'd like to do. For example, what do you suppose $z \cdot 0$ is? There are a couple of theorems here we might like to use, but we can't. Let's take a look at them:

Theorem: If $x$ is a real number, then $x \cdot 0 = 0$.

Theorem: If $x$ and $y$ are real numbers, and $y \ne 0$, then $(x / y) \cdot y = x$.

Do you see why we can't use these theorems?

Does the first theorem tell us that $z \cdot 0 = 0$? No, because we don't know that $z$ is a real number. So the first theorem doesn't apply.

How about the second theorem? We know that $z = 1/0$. Does the second theorem tell us that $(1 / 0) \cdot 0 = 1$ (and therefore $z \cdot 0 = 1$)? No, because the second theorem is only applicable when the denominator is not $0$, and here, the denominator is $0$. So the second theorem doesn't apply, either.

If we want, we can add more definitions and maybe make some of these theorems "work" for $z = 1/0$, just like we have a lot of theorems that "work" for the complex numbers. But when we do this, we encounter a lot of problems. Rather than dealing with these problems, most mathematical writers simply refuse to define $1/0$. (That's what the sentence "$1/0$ is undefined" means: the expression $1/0$ is an undefined expression, because we have refused to define it.)
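To see one of these problems concretely: suppose we add a definition making $z \cdot 0 = 1$ (so that the second theorem "works" for $z$), while keeping the usual distributive law. Then

$$1 = z \cdot 0 = z \cdot (0 + 0) = z \cdot 0 + z \cdot 0 = 1 + 1 = 2,$$

a contradiction. Any definition of $1/0$ must therefore give up at least one familiar law of arithmetic.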


What axiom or definition says that mathematical operations like +, -, /, and * operate on imaginary numbers?

It is set theory that allows us to give a rigorous foundation for complex numbers. In particular, as explained here, the axiom of pairing plays a crucial role: it allows us to construct the product set $\,\Bbb R^2\,$ and then reduce complex arithmetic to arithmetic on pairs of reals, as Hamilton did when he gave the first rigorous construction of $\,\Bbb C,\,$ representing $\,a + b\,i\,$ by the pair $\,(a,b)\,$ with operations

$$\begin{align} (a\!+\!bi) + (c\!+\!di) &=\ \, a\!+\!c\!+\! (b\!+\!d)i\\[.2em] \rightsquigarrow\, (a,\ \ b)\ + (c,\ \ d)\ &= (a\!+\!c,\ \ \ b+d)\\[.4em] (a\!+\!bi)\times (c\!+\!di) &= \ ac\!-\!bd\!+\!(ad\!+\!bc)i\\[.2em] \rightsquigarrow\, (a,\ \ b) \ \times\ (c,\ d)\, \ &= (ac\!-\!bd,\ \ ad\!+\!bc) \end{align}\qquad\qquad$$

This reduces the consistency of $\,\Bbb C\,$ to the consistency of $\,\Bbb R$: any contradiction derived in $\,\Bbb C\,$ would yield a contradiction on such pairs of reals, and hence a contradiction in $\,\Bbb R.$
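A small check makes the reduction vivid: the pairs with second coordinate $0$ behave exactly like the reals they stand for,

$$(a,\,0) + (c,\,0) = (a+c,\ 0), \qquad (a,\,0) \times (c,\,0) = (ac - 0,\ 0) = (ac,\ 0),$$

so $\,\Bbb R\,$ sits inside $\,\Bbb C\,$ as the pairs $\,(a,0),\,$ and every statement of complex arithmetic literally is a statement about pairs of reals.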

Further, a major accomplishment of the set-theoretic construction of $\,\Bbb C\,$ (and of algebraic structures generally) is that it eliminates the imprecise syntax and semantics of informal approaches. The imprecise term $\,a + b\,i\,$ is replaced by its rigorous set-theoretic representation $\,(a,b),\,$ which eliminates many ambiguities, e.g. doubts about the meaning of the symbols $\,i\,$ and $\,+\,$ and $\,=\,$ in complex arithmetic. Such questions were rampant in the early development of complex numbers, and without set theory or some other rigorous foundation it was difficult to provide convincing, precise answers. For example, below is how Cauchy tried to explain them.

In analysis, we call a symbolic expression any combination of symbols or algebraic signs which means nothing by itself but to which one attributes a value different from the one it should naturally have [...] Similarly, we call symbolic equations those that, taken literally and interpreted according to conventions generally established, are inaccurate or have no meaning, but from which can be deduced accurate results, by changing and altering, according to fixed rules, the equations or symbols within [...] Among the symbolic expressions and equations whose theory is of considerable importance in analysis, one distinguishes especially those that have been called imaginary. -- Cauchy, Cours d'analyse, 1821, S.7.1

It's no surprise that Cauchy's peers were not persuaded by such handwaving, e.g. Hankel replied

If one were to give a critique of this reasoning, one cannot actually see where to start. There must be something "which means nothing," or "which is assigned a different value than it should naturally have," something that has "no sense" or is "incorrect," coupled with another similar kind, producing something real. There must be "algebraic signs" - are these signs for quantities or what? as a sign must designate something - combined with each other in a way that has "a meaning." I do not think I'm exaggerating in calling this an unintelligible play on words, ill-becoming of mathematics, which is proud and rightly proud of the clarity and evidence of its concepts. $\quad$-- Hankel

Hamilton's elimination of such "meaningless" symbols - in favor of pairs of reals - served as a major step forward in placing complex numbers on a foundation more amenable to his contemporaries. Although there was not yet any theory of sets in which to rigorously axiomatize the notion of pairs, they were far easier to accept naively - esp. given the already known closely associated geometric interpretation of complex numbers.

See said answer for further discussion of this and related topics (above is excerpted from there).


At the risk of sounding like a postmodernist: all numbers are imaginary.

Long ago, someone abstracted what this collection of sheep has in common with the number of fingers on my left hand, and called that thing "five." No inconsistencies were introduced, and there were great simplifications to be made.

Someone asked how to divide two pies among three people and the abstraction of fractions was born. Someone thought about debt and the abstraction of negative numbers was born. Someone realized that positive and negative fractions didn't describe the intuitive nature of a continuum and the abstraction of the reals was born.

And eventually someone abstracted solutions to $x^2 + 1 = 0$; they're not any more imaginary than any of the other abstractions, since all are products of human imagination. The name "imaginary numbers" is unfortunate.

You say: why can't I abstract a solution to $0 \cdot z = 1$, i.e. $1/0$? The problem is that your abstraction will be incompatible with your other abstractions, i.e. you will break arithmetic. But there are areas of geometry (e.g. Möbius transformations of the plane) where there is a consistent way to do a little arithmetic with an idea of $1/0 = \infty$ (although one has to be careful to remain consistent).
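One common way to make the Möbius example precise: on the extended complex plane $\Bbb C \cup \{\infty\}$ (the Riemann sphere), one adopts the conventions

$$\frac{a}{0} = \infty \ \ (a \ne 0), \qquad \frac{a}{\infty} = 0, \qquad a + \infty = \infty,$$

while $0 \cdot \infty$, $\infty - \infty$, $0/0$, and $\infty/\infty$ stay undefined. That restricted arithmetic is enough for Möbius transformations $z \mapsto \frac{az + b}{cz + d}$ (with $ad - bc \ne 0$) to act consistently on the whole sphere; the care consists precisely in never forming the undefined combinations.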