Why do we (mostly) restrict ourselves to Latin and Greek symbols?

99% of the variables, constants, etc. that I run into are denoted by either a Latin character (like $x$) or a Greek character (e.g. $\pi$). Sometimes I twitch a little when I have to keep two separate meanings for a symbol in my head at once.

Back during the Renaissance and earlier, I can see why this would be the case (everybody who set today's conventions for notation spoke Latin or Greek). But it's 2016; thanks to globalization and technological advances, when symbol overloading becomes an issue one could simply consult the Unicode standard (e.g. through a character map or virtual keyboard) and pick a favorite character.

So why do we (mostly) limit our choice of symbols to Latin and Greek characters?


There isn't a single answer, but:

  • Unicode isn't useful when you're writing notes by hand or at a board.
  • We want symbols to be distinguished from running text but still be familiar and play nicely with other symbols. Unicode symbols that are just Latin letters with a decoration or glyphs from languages very few mathematicians are familiar with are not particularly useful.
  • We do use other symbols: Cyrillic Ш for the Tate-Shafarevich group of an elliptic curve, Hebrew $\aleph$ for various cardinals, $\pitchfork$ for transverse intersection, whatever $\wp$ is for the Weierstrass function, $\sharp$ and $\flat$ for the isomorphisms between the tangent and cotangent bundles in Riemannian geometry, etc. But:
  • We don't use other symbols that often because using too many of them makes text hard to read. Math isn't computer programming, and math symbols aren't typed and don't have a rigid syntax; ultimately, the goal of a math paper is to have other humans read it.
  • As such, overloading symbols isn't necessarily a problem. It's hardly confusing to use $e$ to denote both the number $2.7182\cdots$ and the identity element of a group. We could introduce a unique symbol for the latter (not $1$, presumably), but how would that be any easier?
  • In the opposite direction, there are a dozen kinds of integrals denoted by the $\int$ symbol, but they're all more or less the same kind of object, so it's not taxing to the reader. The symbol $+$ denotes addition on a vector space, in a module, in a ring, of matrices, in a field, etc.; but it's all ultimately the same kind of operation.

My high school teacher used something like $f(陳)=陳^{2}+2陳+1$ to illustrate dummy variables (see the short sketch at the end of this answer). It's a matter of compatibility and fluidity of symbol usage, and it all depends on the background of the readers. For instance, some people don't know what $\text{cis}\,\theta$ (i.e. $\cos\theta+i\sin\theta$) means. Chinese texts don't follow the Western terminology for Pythagoras' theorem, Pascal's triangle, or Pythagorean triples; they use "勾股定理" (the gougu theorem), "楊輝三角形" (Yang Hui's triangle) and "勾股數" (gougu numbers) instead, whereas Japanese uses "三平方の定理" ("theorem of three squares") for Pythagoras' theorem.


Japanese literature on tabulating binomial coefficients and Bernoulli numbers appears in "Hatsubi-Sampo" (発微算法), written by Seki Takakazu (関 孝和).
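As an aside on the dummy-variable example above: the choice of glyph really is arbitrary, and even programming languages agree. A minimal illustrative sketch (mine, not the teacher's) of the same function in Python 3, which accepts CJK identifiers:

```python
# Python 3 identifiers may use non-Latin letters (PEP 3131),
# so the classroom dummy variable works verbatim.
def f(陳):
    return 陳 ** 2 + 2 * 陳 + 1

print(f(3))  # 16, since f(x) = (x + 1)^2
```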


Math was written out in words for most of human history. Everything was spelled out in prose, including addition and multiplication. It was clumsy, extremely hard to manipulate, and probably one of the reasons some algorithms progressed more slowly. The first symbols (the equality sign, plus and minus, ...) were introduced in the Renaissance, and within a couple of centuries math was mostly written symbolically (still, vector notation took some time to catch on; Maxwell wrote his famous equations in an incredibly convoluted form). At that time, scholars were largely writing in Latin and were under the strong influence of ancient texts (mostly from Greek mathematicians, although Arabic math was also prominent). So Greek and Latin symbols were a natural choice.

I think nowadays we don't introduce new symbols from other alphabets because the world is pretty Anglo-centric. For instance, even German conventions for physical quantities ($A$ for work, ...), which are taught in schools and used in most of continental Europe, are falling out of favor. It's hard even to get publishers and other authors to spell authors' last names correctly. Even when a non-ASCII symbol would be a nice abbreviation for something in another language, it's rarely used seriously (more like a "joke" in a classroom, especially for dummy variables, when you want to stress that it doesn't matter what you call them). The goal is uniformity: if we all use the same notation, then math transcends language and we can read anything any other scientist wrote (which, besides musical notation, is one of the rare examples of a universally understood "language").

New symbols are introduced when a new theory/derivation/quantity is introduced, and it's up to the original author to set the notation and naming (which is quite a big deal, actually). So if a Russian scientist discovers something, he may as well use Cyrillic letters; there's nothing wrong with that. Of course, Americans will scoff, but that's nothing new. Still, most people nowadays don't even think of that, because the education system gets you used to Latin/Greek-based notation, so there's a strong persistence from generation to generation in the preferred choice of symbols.

One of the "problems" (even with Greek letters) is still text encoding on computers. Even though Unicode should have become a completely universal standard more than a decade ago, most people (again, the USA comes to mind) don't bother changing settings from the default Western encoding, and most web forms still claim other letters are "illegal characters" (even online submission forms for scientific papers). This is one of the reasons we still see u instead of μ for micro- and capital Latin letters instead of lowercase Greek for triangle angles.
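To see the encoding problem concretely, here is a minimal illustrative sketch (the sample string and the particular encodings are just examples): the Greek mu in "μm" survives UTF-8, but the legacy single-byte encodings many systems still default to cannot represent it at all, which is exactly how μ quietly degrades into a plain u.

```python
# Minimal sketch: "μm" encodes fine in UTF-8, but legacy
# single-byte encodings cannot represent the Greek letter mu.
sample = "μm"  # micrometres
for encoding in ("utf-8", "latin-1", "ascii"):
    try:
        print(encoding, "->", sample.encode(encoding))
    except UnicodeEncodeError as err:
        print(encoding, "->", "cannot encode:", err.reason)
```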

However, there are some very disturbing examples in the opposite direction. For instance, the SI system of units and measures defines its symbols not as abbreviations but as pure symbols: they aren't subject to grammar rules or translation. Yet almost universally you see Cyrillic г instead of g for grams on food packaging in Serbia, in clear violation of the rules. It must be national pride or something, but it would lead to very ridiculous ambiguities if this translation were ever applied to milli- and micro- (making them both the same symbol).