How is addition different from multiplication?
Solution 1:
The property that distinguishes addition from multiplication is the distributive law. This is part of the convention of calling the operations addition and multiplication: if the distributive law weren't there, we wouldn't call the operations by those names.
The fact that multiplication distributes over addition forces $0\cdot a = 0$ for all $a$, where $0$ is the additive identity. Addition has no analogous absorbing element. This is one property that falls out of imposing the distributive law.
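To spell out the step, expand $0\cdot a$ using $0 = 0+0$ and distributivity, then add the additive inverse of $0\cdot a$ to both sides: $$0\cdot a = (0+0)\cdot a = 0\cdot a + 0\cdot a \quad\Longrightarrow\quad 0 = 0\cdot a.$$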
Another consequence: if every element has an additive inverse, then addition must be commutative. To see this, expand $(1+1)(a+b)$ in two ways using the distributive law: $$\begin{align}(1+1)(a+b) &= 1(a+b) + 1(a+b) = a+b+a+b\\ (1+1)(a+b) &= (1+1)a + (1+1)b = a+a+b+b\end{align}$$ Equating the two expansions gives $a+b+a+b = a+a+b+b$; cancelling $a$ on the left and $b$ on the right (by adding $-a$ and $-b$, which exist by assumption) leaves $$b+a = a+b.$$
Besides this, the addition and multiplication operations on a set $S$ are simply functions: $$\begin{align} +&: S\times S \to S\\ \cdot\,&: S\times S \to S \end{align}$$
They can be any mappings we choose, so long as the distributive law is upheld.
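As an illustration (my own example, not from the answer above): define nonstandard operations on $\mathbb{Z}$ by $a \oplus b = a + b - 1$ and $a \otimes b = a + b - ab$. These are just the usual operations transported along the bijection $x \mapsto 1 - x$, so $\otimes$ distributes over $\oplus$, and the $\oplus$-identity is $1$ rather than $0$. A quick randomized check:

```python
import random

def add(a, b):
    # nonstandard "addition": a (+) b = a + b - 1
    return a + b - 1

def mul(a, b):
    # nonstandard "multiplication": a (*) b = a + b - a*b
    return a + b - a * b

# The additive identity for (+) is 1, since a (+) 1 = a.
ZERO = 1

for _ in range(1000):
    a, b, c = (random.randint(-50, 50) for _ in range(3))
    # distributive law: a (*) (b (+) c) == (a (*) b) (+) (a (*) c)
    assert mul(a, add(b, c)) == add(mul(a, b), mul(a, c))
    # and the (+)-identity is absorbing under (*): 1 (*) a == 1
    assert mul(ZERO, a) == ZERO
```

Note that the element playing the role of $0$ here is the integer $1$; the distributive law, not the symbol, determines which operation deserves to be called addition.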
The commutativity of multiplication can be relaxed, but as explained, commutativity of addition is required if every element has an additive inverse.
Solution 2:
Claude Shannon's master's thesis, a seminal contribution to Boolean algebra and electrical engineering, used the notation of addition and multiplication for the two operations that we now think of as AND (multiplication) and OR (addition), applied to the elements 0 and 1. In this case, not only does multiplication distribute over addition, $x(y+z)=xy+xz$, but addition also distributes over multiplication, $x+yz=(x+y)(x+z)$. (These are Shannon's equations 3a and 3b.)
Boolean algebra is completely symmetric in the two operations -- it doesn't matter which one you call addition and which one you call multiplication.
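This symmetry is easy to verify exhaustively; a small brute-force script (my own illustration, not from Shannon's thesis) checks both distributive laws over $\{0, 1\}$ with OR as addition and AND as multiplication:

```python
from itertools import product

OR = lambda a, b: a | b    # Boolean "addition"
AND = lambda a, b: a & b   # Boolean "multiplication"

for x, y, z in product((0, 1), repeat=3):
    # 3a: multiplication distributes over addition
    assert AND(x, OR(y, z)) == OR(AND(x, y), AND(x, z))
    # 3b: addition distributes over multiplication
    assert OR(x, AND(y, z)) == AND(OR(x, y), OR(x, z))

print("both distributive laws hold on {0, 1}")
```

Swapping the roles of `OR` and `AND` passes the same checks, which is exactly the symmetry described above.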
The usage of the terms "addition" and "multiplication" is like many issues of notation: Authors use whatever seems most natural to them, and as long as it's defined clearly, readers will deal with it.
Solution 3:
Sometimes multiplication isn't commutative (e.g. matrix multiplication), but it's a stretch to call something addition if it isn't commutative.
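A concrete sketch of this (plain Python with $2\times 2$ matrices as nested lists, to stay self-contained):

```python
def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matadd(A, B):
    """Add two 2x2 matrices entrywise."""
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

A = [[0, 1], [0, 0]]
B = [[0, 0], [1, 0]]

print(matmul(A, B))  # [[1, 0], [0, 0]]
print(matmul(B, A))  # [[0, 0], [0, 1]] -- multiplication is not commutative
print(matadd(A, B) == matadd(B, A))  # True -- addition is commutative
```

Matrix addition is entrywise and inherits commutativity from the scalars, while matrix multiplication composes linear maps, and composition has no reason to commute.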
Edit: My comment below, that I don't think ordinal addition should be called addition, seems to have been misinterpreted; my apologies for the confusion. I meant no offense, and in particular I didn't mean to imply that ordinal addition is not interesting or not worthy of study. I just don't think it should be called addition, in the same way that I don't think the free product should be called a product.
One basic intuitive model for where addition comes from is that it abstracts the properties of the coproduct in some category; for example, addition of natural numbers corresponds to the coproduct in $\text{FinSet}$, or even to the coproduct in $\text{FinVect}$. A distinguishing feature of the coproduct is that it treats its arguments symmetrically, and in particular is commutative. But that is just a way of stating a more fundamental property: the coproduct, like any commutative and associative operation, takes as input a multiset of operands rather than an ordered list. Ordinal addition, of course, doesn't have this property; in particular, it is not the coproduct in the category of ordinals, and it treats its inputs asymmetrically.
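The standard illustration of this asymmetry: $$1+\omega = \sup_{n<\omega}\,(1+n) = \omega, \qquad\text{but}\qquad \omega+1 > \omega,$$ so $1+\omega \neq \omega+1$: prepending one element to an $\omega$-sequence changes nothing up to order-isomorphism, while appending one element does.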
There's reason to believe that category theory has a special place in its heart for commutative and associative operations; see, for example, this blog post. I think it's valuable to use additive notation and terminology to refer to this cluster of ideas - e.g. when discussing additive categories and so forth - and that ordinal addition genuinely belongs to a different cluster of ideas which doesn't have a good name that I'm aware of.