How would multiplying money work?

Consider calculating the variance of an array of prices. The result would be some amount of square dollars. Even if this measure is only an intermediate step toward a standard deviation, two currency values are being multiplied together before the square root is taken, so some definition of dollar multiplication must exist. The most natural notion would be the one drowdemon suggested: simply multiply the fixed-point values together like real/rational numbers.
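For instance, here is a minimal sketch in Python of that computation (the price list is made up, and plain floats stand in for a fixed-point money type just to keep it short); the square dollars only ever appear in the intermediate step:

```python
# Population variance of a list of prices: the variance carries units of
# square dollars, while the standard deviation comes back out in dollars.
from statistics import mean

prices = [10.00, 10.50, 9.75, 10.25]              # dollars (made-up values)

mu = mean(prices)                                 # dollars
variance = mean((p - mu) ** 2 for p in prices)    # dollars * dollars = square dollars
std_dev = variance ** 0.5                         # dollars again

print(f"variance = {variance:.4f} $^2")
print(f"std dev  = {std_dev:.4f} $")
```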

The idea of a square dollar may not seem to make much sense, but the idea of a square second doesn't really make much sense either until it is put in the context of acceleration. Just as $m/s^2$ is really just meters per second per second, there is no reason we cannot consider how the rate of shares one gets per dollar changes per dollar invested: i.e. shares per dollar per dollar, or shares per dollar squared.
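To make that concrete with a made-up example: suppose that, because of price impact, the number of shares you can acquire by spending $x$ dollars is not linear in $x$ but something like

$$s(x) = 0.5\,x - 0.001\,x^2$$

Then $s'(x) = 0.5 - 0.002\,x$ carries units of shares per dollar, and $s''(x) = -0.002$ carries units of shares per dollar per dollar: every extra dollar spent lowers your per-dollar yield by $0.002$ shares per dollar.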

More humorous examples abound. Try googling "pound pound dollar rate".


No, it would not make sense. And in any case, there is no standard multiplication of vectors.

The key is that currency has a unit of measurement, e.g. dollars. Just as multiplying one length by another length gives you length squared, multiplying 2 dollars by 2 dollars would give you 4 dollars squared.

Now, we could treat currency as an algebraic object and define multiplication on it, and there's nothing wrong with that.

But just because we can do it mathematically in an abstract sense doesn't mean that it's useful in any way in the real world. Squared-dollars can't buy you lunch.
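For what it's worth, here is a sketch of what "defining multiplication on it" could look like in Python with the pint units library. pint has no built-in currencies, so the cent/dollar definitions below are my own additions to its registry (and the exact definition syntax is my assumption), but the length example uses only its stock units:

```python
# Treating currency as just another unit: multiplication adds exponents,
# so dollars * dollars comes out as dollars squared.
import pint

ureg = pint.UnitRegistry()

# Lengths work out of the box: length times length is length squared.
print((2 * ureg.meter) * (2 * ureg.meter))      # -> 4 meter ** 2

# Register a currency dimension (hypothetical definitions, not built in).
ureg.define("cent = [currency]")
ureg.define("dollar = 100 * cent")

four_square_dollars = (2 * ureg.dollar) * (2 * ureg.dollar)
print(four_square_dollars)                      # -> 4 dollar ** 2, i.e. "4 dollars squared"

# The squared unit even converts consistently: 4 $^2 is 40000 cents^2.
print(four_square_dollars.to(ureg.cent ** 2))   # -> 40000 cents squared
```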


To explain in a bit more depth why your definition does not quite work, let's think about unit conversions. Specifically, let's think about using unit conversions to try to get rich quick.

We can see that $\$2 + 2\text{ cents} = 202\text{ cents}$. So, we might think that $(\$2 + 2\text{ cents}) \times (\$2 + 2\text{ cents}) = (202\text{ cents}) \times (202\text{ cents})$. But, using your definition,

$$(\$2 + 2\text{ cents}) \times (\$2 + 2\text{ cents}) = \$4 + 4\text{ cents}$$ $$(202\text{ cents}) \times (202\text{ cents}) = 40804\text{ cents} = \$408 + 4\text{ cents}$$

This doesn't seem like how things should work! Just by converting to cents before we multiplied our money, we became over $100$ times richer than we would have been had we instead left our currency in bills. There are two problems: first, we should really be multiplying term by term, so

$$(\$2 + 2\text{ cents}) \times (\$2 + 2\text{ cents}) = (\$2)(\$2) + (\$2)(2\text{ cents}) + (2\text{ cents})(\$2) + (2\text{ cents})(2\text{ cents})$$ $$ = \$^24 + \$8\text{ cents} + 4\text{ cents}^2$$

and second, we should be multiplying the units together, so

$$(202\text{ cents}) \times (202\text{ cents}) = 40804\text{ cents}^2$$

Now, these two things look different, but you have to keep in mind that $\$1 = 100\text{ cents}$, so

$$\$^24 + \$8\text{ cents} + 4\text{ cents}^2 = (100\text{ cents})^2(4) + (100 \text{ cents})8\text{ cents} + 4\text{ cents}^2 = 40804\text{ cents}^2$$

which is exactly what we got by converting to cents first.
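If you want to sanity-check that bookkeeping numerically, the arithmetic above fits in a few lines of Python (nothing here beyond the 100-cents-per-dollar factor written out explicitly):

```python
# Verify that the term-by-term product in mixed units equals the product
# computed entirely in cents, once each dollar is counted as 100 cents.
CENTS_PER_DOLLAR = 100

# 4 $^2 + 8 $*cents + 4 cents^2, expressed in cents^2:
mixed_units_route = (4 * CENTS_PER_DOLLAR ** 2   # 4 $^2      -> 40000 cents^2
                     + 8 * CENTS_PER_DOLLAR      # 8 $*cents  ->   800 cents^2
                     + 4)                        # 4 cents^2  ->     4 cents^2

# (202 cents) * (202 cents):
cents_route = 202 * 202

print(mixed_units_route, cents_route)            # 40804 40804
assert mixed_units_route == cents_route
```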