What is the origin of "analogue" as a term meaning "non-digital?"

This question came up when having a pun-ridden discussion with some of my colleagues: When and why did we start using the word "analogue" to mean "not using numerical digits?"

Etymonline only has an origin for the sense of "having analogy to something else," but that's not the meaning I'm interested in. It does mention that the "computing sense" is recorded from 1946, but not what the origin of that sense is.

I can guess there may once have been a specific and well-known analogy to which this use of the word was a reference, and that that analogy has since faded from common knowledge... But an uneducated guess is just folk etymology that hasn't yet spread. As this use of the word is apparently fairly young, I'm hoping there's a more reliably accurate origin story than mine out there.


Analogue comes from computing.

"A Chronology of Analogue Computing" article in The Rutherford Journal

The word ‘analogue’ was first used as a technical term during the 1940s, and referred specifically to a class of computing technology. Today, the word enjoys much wider usage, typically conveying continuity. For example, engineers will discuss analogue and digital signals, and musicians decide whether to record their work on analogue (continuous) or digital (discrete) media.

Analogue computing emerged during the nineteenth century and became a mainstream computing technology during the early twentieth.

The word analogue has been used because the electric signal (for example, on an analogue telephone line) is transmitted in such a way that the voice vibrations correspond to fluctuations in the electric signal. In other words, the electric signal 'imitates' the voice.

In digital transmission, the voice is encoded into bytes, which are then decoded according to a specific protocol.

Another example is radio versus Morse code. Radio transmits the voice directly (by analogy), as variations in an electric signal. Morse code transmits only combinations of dots and dashes, which are decoded by a trained person. So we can call a Morse message digital, because the underlying concept is the same: coding and decoding, rather than an electrical analogue of a physical phenomenon.

So the word analogue is used to reflect the idea of a physical phenomenon being converted into its electric-signal analogue.

The word digital is used when a phenomenon's properties are encoded and later decoded.
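The distinction can be sketched in a few lines of Python (a toy illustration only, not any real telephony codec; all names here are invented for the example): an analogue signal is one whose value is directly proportional to the source phenomenon, while a digital version samples it at discrete times and quantizes each sample into an integer code that must be decoded to reconstruct an approximation.

```python
import math

def voice(t):
    """A toy 'voice' waveform: the continuous physical phenomenon."""
    return math.sin(2 * math.pi * 440 * t)

def analogue_signal(t, gain=0.5):
    """Analogue transmission: the line voltage is simply proportional
    to the sound. The signal *is* an analogue of the voice."""
    return gain * voice(t)

def encode(duration=0.01, rate=8000, levels=256):
    """Digital transmission, step 1: sample at discrete times and
    quantize each sample into one of `levels` integer codes."""
    codes = []
    for n in range(int(duration * rate)):
        v = voice(n / rate)                       # sample
        codes.append(round((v + 1) / 2 * (levels - 1)))  # quantize
    return codes

def decode(codes, levels=256):
    """Digital transmission, step 2: map the codes back into an
    approximate waveform (some quantization error remains)."""
    return [c / (levels - 1) * 2 - 1 for c in codes]

reconstructed = decode(encode())
# The decoded value is close to, but not identical with, the original:
print(abs(reconstructed[3] - voice(3 / 8000)) < 1 / 255)
```

The analogue path introduces no coding step at all, while the digital path trades a small, bounded quantization error for a representation that can be copied and decoded exactly.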

Here are a few examples and articles that explain the difference between the analogue and digital concepts:

The basic difference between analog and digital technology on howstuffworks.com

Analog vs. Digital with explanation and comparison chart on diffen.com


I believe the usage of the word comes from analogue electronics.

Analogue electronics (or analog in American English) are electronic systems with a continuously variable signal, in contrast to digital electronics where signals usually take only two levels. The term "analogue" describes the proportional relationship between a signal and a voltage or current that represents the signal. The word analogue is derived from the Greek word ανάλογος (analogos) meaning "proportional". (Wikipedia)


The original electronic computers were "analog". The computations were done by adding/subtracting/integrating/differentiating electronic signals (voltages), so these signals were "analogs" of the real-life values being modeled.
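As a toy illustration of that idea (a numerical sketch under my own assumptions, not a description of any particular machine): an analogue integrator continuously accumulates a voltage, so a circuit wired to solve dx/dt = -x produces a voltage that decays just like the physical quantity it stands for.

```python
import math

def integrate(x0=1.0, dt=1e-4, t_end=1.0):
    """Toy simulation of an analogue integrator wired to solve dx/dt = -x.
    In a real analogue computer the state would be a voltage accumulated
    continuously by an op-amp integrator; here we step time numerically."""
    x = x0
    for _ in range(int(t_end / dt)):
        x += -x * dt  # feed the negated output back into the integrator
    return x

voltage = integrate()
exact = math.exp(-1.0)  # the quantity the voltage is an "analog" of
print(round(voltage, 3), round(exact, 3))  # → 0.368 0.368
```

The point of the sketch is the naming: the voltage is not a coded representation of the modeled quantity, it simply *is* proportional to it at every instant, which is exactly the sense of "analog" the answer describes.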

(There were also various mechanical and electromechanical computers, of course, from Babbage's purely mechanical Difference Engine to the electromechanical code-breaking machines of Turing's era, along with several others in England and the US. Those devices were incredibly slow and unreliable, though, not to mention noisy!)

"Digital" electronic computers (generally considered to start with the Eniac at University of Pennsylvania) were so-named to differentiate from analog ones.


Earlier radio and electronics references often classified signals and/or variables as "continuous" or "discrete" as on page 981 of the Fourth Edition (1956) of "Reference Data for Radio Engineers," published by ITT.