How is division symbol usage currently defined?

I've been in a lengthy discussion today about how to interpret the division symbol. There seem to be two views on what it means.

• Everything on the left-hand side of the symbol is divided by everything on the right-hand side of the symbol.
or
• The operator only applies to the terms immediately to its left and right.
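
For example, the two readings already disagree on something as simple as $a + b \div c$: the first gives $(a + b) \div c$, while the second gives $a + (b \div c)$.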

Which is correct? I tried to find a specific definition of how the division operator is used, but I couldn't find one.


The division slash should bind at the same level as multiplication and before addition and subtraction; within a run of multiplications and divisions, evaluation should go left to right. The computer languages I have used respect this, so $a+b/cd$ would be parsed as $a+((b/c)d)$. I have used calculators that just apply operators left to right, so $a+b/cd$ would be $((a+b)/c)d$, which is clearly wrong compared to the mathematical standard. The scientific calculators I have had respect the order of operations. We get many posts that do not follow it properly, but you can usually guess what is intended. If you see $1/x^2+1$, it is most likely $1/(x^2+1)$ that is wanted.
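
For example, in Python (a minimal sketch; the implicit product $cd$ is written explicitly as `c*d`, since the language has no implicit multiplication):

```python
# Division and multiplication share one precedence level and associate
# left to right; both bind tighter than addition.
a, b, c, d = 1.0, 8.0, 4.0, 2.0

standard = a + b / c * d                # parsed as a + ((b / c) * d)
print(standard)                         # 5.0
print(standard == a + ((b / c) * d))    # True

# A calculator that simply applies operators left to right would instead
# compute ((a + b) / c) * d:
naive = ((a + b) / c) * d
print(naive)                            # 4.5
```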


Everything to the immediate left and right of the division symbol will be affected by it; everything else won't. This is how most calculators handle it and how the PEMDAS order of operations defines the usage.
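
For example, under this reading

$$a \div b \times c \;=\; \frac{a}{b} \times c, \qquad\text{not}\qquad \frac{a}{b \times c}.$$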


The standard is that multiplication and division have the same level of precedence, and precede addition and subtraction, which have the same precedence. Within a precedence level, operators are left-associative, meaning you perform the operations from left to right.

This means that $1 + 2 \div 3 \times 4 - 5$ gets computed as

$$ (1 + ((2 \div 3) \times 4)) - 5 $$

Whether $\div$ or $/$ is used for division doesn't matter. Similarly, both $\times$ and $\cdot$ have the same meaning for representing multiplication, as does leaving the operation implicit, such as the product in $2a$.
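
As a quick sanity check, the same grouping can be reproduced in Python, using `/` and `*` for $\div$ and $\times$:

```python
# 1 + 2 / 3 * 4 - 5 groups as (1 + ((2 / 3) * 4)) - 5
value = 1 + 2 / 3 * 4 - 5
grouped = (1 + ((2 / 3) * 4)) - 5

print(value)             # about -1.3333 (i.e. -4/3)
print(value == grouped)  # True
```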

Handwritten formulas usually have visual cues that indicate how terms should be grouped, such as ${}^2\!/\!{}_3 + 4$.

Unfortunately, mistakes where people accidentally write something they don't mean, or cases where people outright ignore the standard convention and use something else entirely, are common enough that you can't rely on the intended meaning of a formula being exactly what is written.

Because of this, typeset formulas tend to be written in a way that can't be interpreted wrongly, such as $(1/2)x$ rather than $1/2x$, which risks being misread as $1/(2x)$. If you encounter a formula that could be interpreted either way, you're better off trying to infer the intended meaning from context rather than mechanically applying the standard rule.

Ultimately (in my opinion), the driving force that cements this particular convention is programming languages, where the convention described above is nearly universal.

The only exception I have ever seen is that Wolfram Alpha interprets $1/xy$ as $1/(xy)$. It isn't even consistent about this: Wolfram Alpha interprets $1/2x$ as $(1/2)x$ instead.