Is there a numeric arithmetic with a single operator?
[Added: @MJD's answer and mine are really answers to somewhat different questions, and @MJD's answer is the one that probably addresses the intended question. But I'll leave this answer in place as the issue it notes is interesting anyway!]
It depends what logical apparatus is available.
Dedekind/Peano arithmetic in its original form has just the successor $S$ function built in as the one primitive arithmetical operator, but is strong enough to express all the usual arithmetical functions.
How come? Well, other functions can be defined. Thus addition is the one and only function $f$ such that $f(x, 0) = x$ and $f(x, Sy) = Sf(x, y)$. Having defined addition, multiplication is the one and only function $f$ such that $f(x, 0) = 0$ and $f(x, Sy) = f(x, y) + x$. (Dedekind famously has a proof that such definitions do indeed uniquely pin down well-defined functions.)
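Just to see the recursion clauses in action, here is a quick Python sketch of my own (not part of the original apparatus, of course) that computes addition and multiplication from successor alone; the names `succ`, `add`, `mul` are mine, and `y - 1` is only used to step down the recursion, standing in for pattern-matching on $Sy$:

```python
def succ(n):
    return n + 1  # the single primitive operator S, modelled with Python's +1

def add(x, y):
    # f(x, 0) = x,   f(x, Sy) = S f(x, y)
    return x if y == 0 else succ(add(x, y - 1))

def mul(x, y):
    # f(x, 0) = 0,   f(x, Sy) = f(x, y) + x
    return 0 if y == 0 else add(mul(x, y - 1), x)

assert add(3, 4) == 7 and mul(3, 4) == 12
```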
But note that this sort of definition of a function in effect quantifies over functions (NB that "one and only") -- hence it is only available in a second-order framework which allows quantification over functions (as of course the original Dedekind/Peano axiomatisation did).
In a first-order framework, we can't define functions by quantification in the same way, but have to explicitly postulate at least some additional functions -- canonically, we take addition and multiplication as additional primitive functions (and then show all the other recursive functions can be expressed using these, by Gödel's trick).
This discussion follows the solutions of problems 1–2 given in A Problem Seminar by Donald J. Newman (Springer, 1982), pp. 45–46.
First, note that all four of $+, -, \times, \div$ can be built just from $-$ and the reciprocal operation $x\mapsto \frac1x$: Addition is easy: $$x+y = x-((x-x)-y).$$
By partial fraction decomposition we have $$ \frac1{x-x^2} = \frac1{1-x} + \frac1x$$
So $$x^2 = x-\left(\frac1{1-x} + \frac1x\right)^{-1}.$$
Now we can calculate $(x,y)\mapsto -2xy$ by using $-2xy = (x-y)^2 - x^2 - y^2$. We can get rid of the $-2$ by using $\frac u{-2} = \left(\left(0-\frac1u\right)-\frac1u\right)^{-1}$. And now that we have multiplication, obviously we get $x\div y = x\cdot \frac 1y$. So we have all of $+,-,\times,\div$ from $-$ and reciprocal.
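If it helps, here is a small Python sketch of my own that mechanically follows the identities above, building all four operations out of nothing but subtraction and reciprocal. All the helper names (`sub`, `recip`, `square`, `half_neg`, etc.) are mine, and the edge cases where a reciprocal of $0$ arises (e.g. $x = 0$ or $x = 1$ in the squaring identity) are ignored, just as in the algebra:

```python
def sub(x, y):
    return x - y          # first allowed primitive: subtraction

def recip(x):
    return 1 / x          # second allowed primitive: reciprocal

def add(x, y):
    # x + y = x - ((x - x) - y)
    return sub(x, sub(sub(x, x), y))

def square(x):
    # x^2 = x - (1/(1-x) + 1/x)^(-1), via the partial-fraction identity
    # (note the constant 1 sneaking in here)
    return sub(x, recip(add(recip(sub(1, x)), recip(x))))

def half_neg(u):
    # u / (-2) = ((0 - 1/u) - 1/u)^(-1)
    return recip(sub(sub(0, recip(u)), recip(u)))

def mul(x, y):
    # -2xy = (x - y)^2 - x^2 - y^2, then divide by -2
    return half_neg(sub(sub(square(sub(x, y)), square(x)), square(y)))

def div(x, y):
    # x / y = x * (1/y)
    return mul(x, recip(y))

assert add(2, 5) == 7
assert abs(mul(3, 4) - 12) < 1e-9
assert abs(div(8, 2) - 4) < 1e-9
```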
Second, we need to find a single operation that gives us $-$ and reciprocal. Newman observes that nothing like $x\circ y = x(x+y)$ can work because it can never give us subtraction. So instead, we try something like $$x\circ y = \frac1{x-y}$$ that has subtraction and reciprocal in it already.
Reciprocal is easy: $$x\circ 0 = \frac1{x-0} = \frac1x.$$
And then $$(x\circ y)\circ 0 = \frac1{\frac1{x-y}-0} = x-y.$$
So all four of $+, -, \times, \div$ can be defined in terms of $\circ$.
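Again just as an illustration (my own code, not Newman's), here is the single operation in Python, recovering reciprocal and subtraction exactly as above; with those two in hand, the first part of the argument then yields $+$, $\times$ and $\div$. The helper names are mine, and the undefined case $x = y$ is simply left to raise an error, matching the fact that $\frac1{x-y}$ is undefined there:

```python
def op(x, y):
    # the single operation: x o y = 1 / (x - y)
    return 1 / (x - y)

def recip(x):
    return op(x, 0)             # x o 0 = 1/x

def sub(x, y):
    return op(op(x, y), 0)      # (x o y) o 0 = x - y

assert abs(recip(4) - 0.25) < 1e-9
assert abs(sub(7, 2) - 5) < 1e-9
```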
Looking at this now, I notice that Newman has tacitly assumed that we are allowed to use certain constants: $0$ in the second part, and $1$ in the first part. ($0$ in the first part can be synthesized as $x-x$.) I don't know if these can be avoided, but perhaps you don't care about that detail.
On MathOverflow there is a similar question, concerned only with addition and multiplication, but the accepted answer shows that this is possible for any countable collection of binary operations, so in particular it can be done with the four arithmetic operations:
https://mathoverflow.net/questions/57465/can-we-unify-addition-and-multiplication-into-one-binary-operation-to-what-exten