What are reasons why some symbols in mathematical logic are not standardized?
Solution 1:
I think there is more than one cause of it. My ideas:
- Symbolic logic is still a reasonably new field. (Contrary to what you may think, symbolic logic didn't start with the ancient Greeks but with Frege's Begriffsschrift in 1879, not even 150 years ago, and don't even try to follow his notation.)
- Some philosophers thought they already knew everything about logic, never seriously studied it, and thus were never confronted with the standard notation.
- Some logicians needed other kinds of implication (relevant, strict, material), negation (minimal, subminimal, constructive), or entailment (standard, fuzzy, quasi-, degree) for their own logic and created their own new symbols for it.
- Some logicians were comparing different logics and decided to use a different set of connectives to not get utterly confused.
- A few introduced their own notations because they were not satisfied with the existing ones: Polish notation, dot notation, compressed dot notation, lambda notation...
And maybe some wanted to confuse everybody :)
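As an aside, Polish notation is a good example of why such alternatives arose: prefix connectives remove the need for parentheses entirely. Here is a minimal sketch of an evaluator, assuming Łukasiewicz's letters N (negation), K (conjunction), A (disjunction), and C (implication); the function name and interface are just illustrative choices:

```python
# Evaluate a propositional formula written in Lukasiewicz's Polish
# (prefix) notation.  Connectives: N = not, K = and, A = or,
# C = material implication.  Lowercase letters are propositional
# variables looked up in `valuation`.

def eval_polish(formula, valuation):
    def parse(i):
        ch = formula[i]
        if ch == 'N':                       # unary: negation
            v, j = parse(i + 1)
            return (not v), j
        if ch in 'KAC':                     # binary connectives
            left, j = parse(i + 1)
            right, k = parse(j)
            if ch == 'K':
                return (left and right), k
            if ch == 'A':
                return (left or right), k
            return ((not left) or right), k  # C: material implication
        return valuation[ch], i + 1         # a propositional variable

    value, end = parse(0)
    assert end == len(formula), "trailing symbols in formula"
    return value

# 'CKpqp' is ((p and q) implies p) -- a tautology, no parentheses needed.
print(eval_polish('CKpqp', {'p': True, 'q': False}))  # True
```

Because each connective's arity is fixed, the parse is unambiguous without any brackets, which is exactly what Łukasiewicz was after.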
To add a bit:
Even with truth tables you will find publications where $0$ stands for true and others where $1$ stands for true, and that is just in two-valued logic. If you are lucky, your book uses $T$ and $F$, or $\top$ and $\bot$; in those cases $T$ or $\top$ stands for true and $F$ or $\bot$ for false. But even so, be warned: always check the meaning first.
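To make the point concrete, here is the same truth table for material implication printed under the two common conventions; the helper name `implies` is just an illustrative choice:

```python
from itertools import product

# The same truth table for material implication, printed twice under
# two common conventions: 1/0 and T/F.  Which symbol means "true"
# varies by author, so always check the convention first.

def implies(p, q):
    return (not p) or q

for symbols in ({True: '1', False: '0'}, {True: 'T', False: 'F'}):
    print(f"p q | p -> q   (true = {symbols[True]})")
    for p, q in product([True, False], repeat=2):
        print(f"{symbols[p]} {symbols[q]} |   {symbols[implies(p, q)]}")
    print()
```

The underlying table is identical both times; only the labels change, which is precisely why you have to check the convention before reading a table.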
Solution 2:
Why is it so hard to find a standardisation regarding symbolism ... in Mathematical Logic?
Well, if you compare the situation with (say) fifty to sixty years ago -- the time when the books I was looking at while I was a student were written -- I would have said that there has been quite a considerable standardisation in symbolism, at least in mainstream mathematical logic books/articles. And one reason for that, surely, has been the universal adoption of LaTeX, which makes it so easy to type \land [for 'logical and'] and always get '$\land$' (and not '&', or a dot, etc.) and to type \forall x and get '$\forall x$' (and not e.g. '$(x)$' or '$\Pi x$'). So, let's be duly grateful for all the standardisation there in fact now is!
True, there is the annoying business with $\to$ vs $\implies$. This too is partly down to LaTeX, I guess, as \implies yields the second. Now 'A implies B' gets used in informal talk both as a variant of 'if A then B' and as a variant of 'A logically entails B', i.e. as both what we might regiment as $A \to B$ and as $A \vdash B$ [or $A \vDash B$]. And lo and behold, we find $\implies$ being confusingly used both ways [in the object language and in the metalanguage]. Conservatism in symbolism is a Good Thing, so I think the use of $\implies$ is to be deprecated: I'd say, use $\to$ for an object-language conditional, and the appropriate turnstile in the metalanguage. The exception might be in a sequent calculus. [Indeed, it was only when I started regularly visiting math.se that I really registered how widely the double arrow was used outside sequent calculi, beyond the very narrow logic community I was most familiar with -- though that might just show I wasn't really paying attention!]
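The recommended division of labour can be made explicit in LaTeX; a sketch (the comments mark which reading each symbol is best reserved for):

```latex
% Object-language conditional: a formula of the logic itself.
A \to B       % "if A then B", a single sentence of the object language

% Metalanguage claims ABOUT formulas:
A \vdash B    % syntactic: B is derivable from A
A \vDash B    % semantic: every model of A is a model of B

% \implies produces the double arrow, which on this view is best
% reserved for sequent calculi (or avoided in formal work):
A \implies B
```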
Solution 3:
To paraphrase Bill Thurston, mathematics is just a way to organize human thought. The purpose of mathematical notation is to help us understand each other.
One reason that different notation is used is that the same concept may need to be compared or contrasted to different things in different situations. For instance, I recently discussed the various division symbols with someone:
The fraction notation $\frac{a}{b}$ is most useful when simplifying equations by hand.
The long division notation is most useful when calculating exact values.
The slash / is great for computers as it works with a regular keyboard.
The 'obelus' ÷ is used on calculator keys because, unlike the other notations, it fits in visually alongside $+$, $-$, and $\times$.
I feel that trying to standardize division would be counterproductive.
Now, mathematical logic is different from arithmetic, but the same truths hold. Russell and Whitehead used the notation most helpful for pure symbolic calculation, but this may not be the best notation for writing on the board or writing a computer program.
TL;DR Mathematical notation is designed to express thought as clearly as possible, and strict standardization makes this difficult.