How did the notation "ln" for "log base e" become so pervasive?
Wikipedia sez:
The natural logarithm of $x$ is often written "$\ln(x)$" instead of "$\log_e(x)$", especially in disciplines where it isn't simply written "$\log(x)$". However, some mathematicians disapprove of this notation. In his 1985 autobiography, Paul Halmos criticized what he considered the "childish $\ln$ notation," which he said no mathematician had ever used. In fact, the notation was invented by a mathematician, Irving Stringham, professor of mathematics at the University of California, Berkeley, in 1893.
Apparently the notation "$\ln$" first appears in Stringham's book Uniplanar algebra: being part I of a propædeutic to the higher mathematical analysis.
But this doesn't explain why "$\ln$" has become so pervasive. I'm pretty sure that most high schools in the US at least still use the notation "$\ln$" today, since all of the calculus students I come into contact with at Berkeley seem to universally use "$\ln$".
How did this happen?
As noted in the original question, Wikipedia claims that the ln notation was invented by Stringham in 1893. I have seen this claim in other places as well. However, I recently came across an earlier reference. In 1875, in his book Lehrbuch der Mathematik, Anton Steinhauser suggested denoting the natural logarithm of a number a by "log. nat. a (spoken: logarithmus naturalis a) or ln. a" (p. 277). This lends support to the theory that "ln" stands for "logarithmus naturalis."
I suggest that from now on everybody use ln, lg, and lb respectively for the natural, decimal, and binary logarithmic functions, reserving log for when the base is displayed explicitly, or when it is some fixed but arbitrary number and only the scale matters. I do (with an explanation, if necessary, at the first instance). What could be simpler?
This convention is the ISO standard (since 1992); see Wikipedia.
Cajori, in his History of mathematical notations, Vol. II, as far as I can see, mentions the notation «$\operatorname{ln}$» once when he is talking about notations for logarithm. He refers to [Irving Stringham, Uniplanar Algebra, (San Francisco 1893), p. xiii] as someone who used that notation; the way this guy is mentioned makes me doubt he was the first or alone in this, though (I'd love to know what 'uniplanar algebra' is (was?)!)
The mention of «$\operatorname{ln}$» is quite minor, and I would guess that at the time Cajori was writing (the volume was completed in August 1925) the notation was essentially not used, for otherwise he would have been more interested in it.
PS: I think I got the link from someone here, or on MO... http://spikedmath.com/043.html is pretty relevant!
It is probably at least in part an instance of consumer lock-in, the phenomenon where the advantages of the majority of the market using the same or compatible products outweigh sufficiently minor differences between one version of the product and another. (When I learned this term, the canonical example was VHS versus Beta. I am just about old enough to remember that when VCRs first came out, vendors would carry both. By the time the first video stores opened, there was already more product available in VHS, so a lot of stores would have just one rack with all the Beta products. And eventually, of course, Beta died. However, those who remember and care seem to largely agree that Beta was the superior technology. Apologies for giving such an old-fogey example. Can someone suggest something more current?)
In particular, when it comes to electronic calculators, having everyone agree on what's going to happen when you press a certain button is a good thing. (In fact, another lock-in phenomenon is that when I was in high school in the early 90's, most students had Texas Instruments calculators of one kind or another. Among the real geeks it was known that the "cadillac of calculators" was actually the Hewlett-Packard, which used reverse Polish notation. Serious CS people appreciate RPN, but the problem is that if you're a high school kid and pick up such a calculator for the first time, it's very hard to figure out what's going on. I haven't seen an HP calculator for many years.) The notation $\ln$ is simple and unambiguous: you don't have to like it (and I don't, especially), but you know what it means, and it's easier to fit on a small calculator key than $\log_e$. I think if you're first learning about logarithms, then base ten is probably the simplest (having any clue what $e$ is, other than "about $2.71828...$", requires some calculus, and in my experience it is one of the more subtle concepts of first-year calculus), so it's reasonable to have that be the standard base for the logarithm for a general audience.
Also, I'm sure everyone here knows this but I wish my calculus students had a better appreciation of it: exactly what base you take for the logarithm only matters up to a multiplicative constant anyway, since $\log_c x = \frac{\log_a x}{\log_a c}$. So it's no big deal either way.
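For the skeptical, the change-of-base identity is easy to check numerically; here is a quick sketch in Python (the variable names are mine, not anything standard):

```python
import math

# Change of base: log_c(x) = log_a(x) / log_a(c), so logs in two
# different bases differ only by a multiplicative constant.
# For base e vs. base 10, that constant is ln(10).
x = 1000.0
ratio = math.log(x) / math.log10(x)  # same value for any x > 0, x != 1

print(math.isclose(ratio, math.log(10)))  # True
```

Trying other values of `x` gives the same ratio, which is exactly the "only matters up to a multiplicative constant" point.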