"Digital computer" in the 1940s
I was watching the DVD of The Imitation Game, starring Benedict Cumberbatch as Alan Turing, the British mathematician who helped crack the Nazis' Enigma code in WWII. In one key scene, Turing uses the expression digital computer to explain to Joan Clarke what ‘Christopher’, the machine he is building, does. He describes it as
“... an electrical brain, a digital computer.”
I think the year depicted in the movie must have been 1940. Turing's creation was never actually called ‘Christopher’; that is pure dramatic/artistic license. Its real name was the Bombe, and it was a monumental deciphering machine.
Basically, I have three questions:
1. Was Alan Turing aware of the term digital computer? Did he actually ever use it himself?
2. Could one describe the British Bombe as being a digital computer? Wikipedia calls it an “electromechanical device”.
3. When was the term digital computer actually coined?
YouTube clip: Alan Turing explains 'Christopher'
BOUNTY INFO
I should have left a message with the bounty; it's too late now.
Q1. I am asking whether Turing was familiar with the term “digital computer” in the 1940s, and especially before the end of World War II. Biscuit Boy's answer refers to Turing's paper Computing Machinery and Intelligence, which is dated 1950.
Q2. Because I know nothing about the history of computers or how they work, I am hoping the Stack Exchange community can provide a clear-cut answer.
Q3. According to this source, the term digital computer was coined by George Stibitz in 1942, which would make Cumberbatch's line a lexical anachronism.
Q4 (bounty bonus). What was the name of the first electronic digital computer?
Many cite ENIAC as the holder of this title, but there are sources which claim that the Universal Turing Machine “is the mathematical tool equivalent to a digital computer”, and elsewhere that “The first fully functioning electronic digital computer was Colossus (1943)”.
I was going through some online articles, and I'd like to thank @Josh61 for the right references. I found this detailed write-up on word origins from the OED, by Richard Holden. (I think I now know why top EL&U users stick so strictly to OED definitions.)
The article: http://public.oed.com/aspects-of-english/word-stories/digital/
What distinguishes digital from many other terms associated with high technology is that it’s not a new word. In the newly revised OED entry, the earliest evidence—in the sense ‘designating a whole number less than ten’—dates from the fifteenth century. OED’s original entry, published in 1897, does not record this sense. Instead, it covers senses corresponding to another sense of digit, such as: ‘of or pertaining to a finger, or to the fingers or digits’—evidence for which goes back to the seventeenth century. But for most of its history, digital was a relatively unimportant term: it wasn’t until the early to mid-twentieth century that the word became more significant and widespread.
In the late 1930s and early 1940s, the work of mathematicians and engineers led to the development of a new type of computing machine. As opposed to earlier analogue devices, which used a continuous quantity (such as voltage) to compute the desired quantity by analogy, these new machines operated upon data that was represented as a series of discrete digits. For example, in such a system the letter A might be represented as the binary sequence ‘01000001’ (as it is in the ASCII encoding scheme).
Being composed of such sequences of digits, such data (and so any machine making use of it) was hence said to be "digital". Digital computers were generally considered more adaptable and powerful than their analogue counterparts, and digital computing became dominant: the computer you are reading this article on will certainly be a digital one, as will probably any other computer you have ever used. The sense of digital relating to this was covered in OED2 (1989) by the definition, ‘of, pertaining to, or using digits; spec. applied to a computer which operates on data in the form of digital or similar discrete elements.’
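To make the ASCII example in the quoted passage concrete, here is a minimal sketch of my own (not part of the OED article) showing how the letter A becomes the binary sequence 01000001:

```python
# Show how the letter 'A' is represented as the 8-bit binary
# sequence 01000001 under the ASCII encoding scheme.
ch = 'A'
code = ord(ch)              # ASCII code point: 65
bits = format(code, '08b')  # zero-padded 8-bit binary string
print(ch, code, bits)       # prints: A 65 01000001
```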
Now, to answer your questions...
1. Was Alan Turing aware of the term digital computer? Did he actually ever use it himself?
Yes. Alan Turing developed the Turing test, and he used the term often in his seminal 1950 paper, “Computing Machinery and Intelligence”:
"Are there imaginable digital computers which would do well in the imitation game?
[Wikipedia]
2. Could one describe the British Bombe as being a digital computer? Wikipedia calls it an “electromechanical device”.
I would say no; at least, not a “digital computer”. It is called electromechanical because it ran on electricity and used rotors, wires, and plugboards to carry out the deciphering work. Importantly, it did not represent data as discrete digits (0s and 1s), which is the defining feature of a “digital computer”:
(noun). an electronic computer in which the input is discrete rather than continuous, consisting of combinations of numbers, letters, and other characters written in an appropriate programming language and represented internally in binary notation
[Collins Dictionary]
ENIAC is widely cited as the first general-purpose electronic “digital” computer:
It was Turing-complete, digital, and could solve "a large class of numerical problems" through reprogramming
[Wikipedia]
You could probably say the Bombe, by contrast, was an “analog computer”!
(noun) a mechanical, electrical, or electronic computer that performs arithmetical operations by using some variable physical quantity, such as mechanical movement or voltage, to represent numbers
[Collins Dictionary]
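To illustrate the contrast between the two definitions, here is a rough sketch of my own (not from either dictionary): an analogue device such as a slide rule multiplies by adding continuous lengths proportional to logarithms, while a digital machine manipulates exact, discrete digits:

```python
import math

# Analogue-style multiplication: represent each number by a
# continuous "length" proportional to its logarithm (as a slide
# rule does), add the lengths, then read the result back. The
# answer is only as accurate as the continuous representation.
def analog_multiply(x, y):
    length = math.log10(x) + math.log10(y)  # continuous quantity
    return 10 ** length                     # approximate readout

print(analog_multiply(3, 7))  # ≈ 21 (approximate readout)
print(3 * 7)                  # 21 (exact digit manipulation)
```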
3. When was the term digital computer actually coined?
Exactly when, and by whom, it was coined remains a mystery. Etymonline suggests that both “digital” and “computer” originated in the middle of the 16th century, but the combined term “digital computer” seems to have its origins at the beginning of the 20th century, with a spike in usage by the late 1940s, according to Ngrams.
In “On Computable Numbers” (1936), Turing doesn't use the word “digital” at all. He refers to “computing machines” and in particular the “universal computing machine”, which is what we now call a universal Turing machine. In modern terms, we would not usually refer to something as a computer unless it represented a concrete implementation of a Turing machine (except that, strictly speaking, a Turing machine has an infinite memory “tape”). The Turing machine is an abstract concept, and physical implementations of it don't necessarily have to be “digital” in any particular sense, but that's a complicated subject, and in any case all modern computers are digital.
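For the curious, here is a minimal simulator of my own devising (an illustration, not Turing's formulation) showing what the abstract machine amounts to; this one just flips the bits of its input and halts:

```python
# Minimal Turing machine simulator (illustrative sketch).
# A machine is a transition table:
#   (state, symbol) -> (new_symbol, move, new_state)
# The tape is a dict, so it grows on demand in both directions,
# approximating the infinite tape of the abstract model.
def run(table, tape_input, state='start'):
    tape = dict(enumerate(tape_input))
    head = 0
    while state != 'halt':
        symbol = tape.get(head, '_')  # '_' stands for a blank cell
        new_symbol, move, state = table[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == 'R' else -1
    return ''.join(tape[i] for i in sorted(tape))

# Example machine: flip every bit, halting at the first blank.
flip = {
    ('start', '0'): ('1', 'R', 'start'),
    ('start', '1'): ('0', 'R', 'start'),
    ('start', '_'): ('_', 'R', 'halt'),
}

print(run(flip, '01000001'))  # prints: 10111110_
```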
Obviously, Turing didn't say much publicly during the war, so we don't know whether he was using the term “digital computer” or not. In 1945 he moved to the National Physical Laboratory to develop the Automatic Computing Engine (ACE), a name he chose in homage to Babbage's machines[1]. In The Essential Turing (B. J. Copeland, ed., Oxford, 2004), a 1947 lecture on the ACE is reprinted without further bibliographic detail; Turing describes the ACE as an “electronic digital computing machine” and goes on to emphasise the significance of the word “digital”. He also refers to his earlier work on computability, saying “I was investigating what might now be described as... digital computing machines” (my emphasis). This suggests to me that he wouldn't have used the term in 1936, but he was using it by 1947. His discussion of the advantages of digital computing is essentially about the practical implementation of what had been a purely theoretical idea in “On Computable Numbers”, so a reasonable guess is that he started using the word when he became involved in actually building machines: certainly by 1946, and probably in conversations related to the Colossus Mark 1 in 1943. In any case, I don't think the word would have been unfamiliar to him at any time during the war; there's a good chance he was specifically aware of Atanasoff's work, for example.
The thing that struck me more about that Imitation Game line was not the word "digital" but the word "brain". Turing thought a great deal about the essential nature of thinking, and I don't know whether he'd have casually conflated "brain" and "mind" in that way. It'd be an interesting comment, if he'd actually said it.
The first Turing-complete computer to be designed was Babbage's Analytical Engine, first described in 1837, and the first actually to be built was ENIAC, in 1946. The first machine with a von Neumann architecture, very close to what we think of as a computer now, was EDVAC, in 1949. The ACE wasn't completed until 1950.
The problem with counting Colossus as a computer is that it wasn't Turing-complete, or even particularly general-purpose, so it's then hard to justify withholding the title from much older machines, especially the German Zuse Z3 of 1941. (The Z3 has in fact been shown to be technically Turing-complete, though more as a theoretical stunt than in any practical sense.) Nineteenth-century mechanical calculators could do calculations, for example, and Jacquard looms were programmable.
The Polish bombes, which were later adopted and improved by Bletchley Park, were not computers by most definitions, either now or then. They weren't programmable, and didn't really perform any calculation except that they'd stop when they hit a certain combination of electrical settings; if we consider that computation, then many other machines, such as door locks, would qualify.
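To see why that kind of halting search falls short of computation, here is a toy sketch, entirely my own simplification (the real Bombe electrically tested the logical implications of a guessed plaintext, or “crib”), of a device that merely steps through settings until a consistency test passes:

```python
from itertools import product

# Hypothetical stand-in for the Bombe's electrical consistency
# check; here one arbitrary setting is deemed "consistent".
def consistent(setting):
    return setting == (3, 17, 9)

# Step through all 26^3 rotor positions and stop at the first
# setting that passes the test: no stored program, no arithmetic.
def search():
    for setting in product(range(26), repeat=3):
        if consistent(setting):
            return setting  # the machine "stops" here
    return None

print(search())  # prints: (3, 17, 9)
```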