It Began with Babbage



  3. Kuhn, op cit., p. 12.

  4. Ibid., p. 15.

  5. There was an idea that Max Newman, the principal architect of the Colossus, was influenced by Turing’s work, but this claim was never substantiated. See B. Randell (1980), “The Colossus.” In N. Metropolis, J.S. Howlett, & G.-C. Rota (Eds.), A history of computing in the twentieth century (pp. 47–92). New York: Academic Press.

  7

  A Tangled Web of Inventions

  I

  ON FEBRUARY 15, 1946, a giant of a machine called the ENIAC, an acronym for Electronic Numerical Integrator And Computer, was commissioned at a ceremony at the Moore School of Electrical Engineering at the University of Pennsylvania, Philadelphia.

  The name is noteworthy. We see that the word computer—to mean the machine and not the person—had cautiously entered the emerging vocabulary of computer culture. Bell Laboratories named one of its machines Complex Computer; another, Ballistic Computer (see Chapter 5, Section I). Still, the embryonic world of computing was hesitant; the terms “calculator”, “calculating machine”, “computing machine”, and “computing engine” still prevailed. The ENIAC’s full name (which, of course, would never be used after the acronym was established) seemed, at last, to flaunt the fact that this machine had a definite identity, that it was a computer.

  The tale of the ENIAC is a fascinating tale in its own right, but it is also a very important tale. Computer scientists and engineers of later times may be ignorant about the Bell Laboratories machines, they may be hazy about the Harvard Mark series, they may have only an inkling about Babbage’s dream machines, but they will more than likely have heard about the ENIAC. Why was this so? What was it about the ENIAC that admits its story into the larger story?

  It was not the first electronic computer; the Colossus preceded the ENIAC by 2 years. True, no one outside the Bletchley Park community knew about the Colossus, but from a historical perspective, for historians writing about the state of computing in the 1940s, the Colossus clearly took precedence over the ENIAC. In fact (as we will soon see), there was another electronic computer built in America that preceded the ENIAC. Nor was the ENIAC the first programmable computer. Zuse’s Z3 and Aiken’s Harvard Mark I, as well as the Colossus, well preceded the ENIAC in this realm.

  As for that other Holy Grail, general purposeness, this was, as we have noted, an elusive target (see Chapter 6, Section III). No one would claim that the Colossus was general purpose; it had been described as a “Boolean calculating machine” (see Chapter 6, Section XIII).1 But, the ENIAC provoked more uncertainty. For one person who was intimately involved in its design and construction, the ENIAC was “a general-purpose scientific computer”2—that is, a computer capable of solving, very fast, a wide variety of scientific, mathematical, and engineering problems.3 For another major participant in the ENIAC project, it was “a mathematical instrument.”4 A later writer somewhat extravagantly called it a “universal electronic calculator.”5 A more tempered assessment by a computer scientist and historian of computing spoke of the ENIAC as comparable with the Colossus; both were special-purpose machines—the former specialized for numeric computation; the latter, for Boolean calculations.6

  It seems reasonable, then, to claim that the ENIAC was a general-purpose numeric computer, specialized for solving mathematical and scientific problems using methods of numeric analysis. It was an analytical engine of the kind Babbage had dreamed of.

  However, the ENIAC’s historical significance, its originality, lay in other directions. There was, first, its sheer scale, physically, technologically, and computationally. Physically, the machine was a mammoth, occupying three walls of a 30-foot-by-50-foot room and much of the central space. Technologically, it used 18,000 vacuum tubes of 16 different types.7 Added to that, it used 70,000 resistors, 10,000 capacitors, 1500 relays, and 6000 manual switches.8 This was an order of technological complexity far in excess of anything achieved in computing before. And, computationally, because of its electronic technology, it was vastly faster than any previous computing machine—about 1000 times faster than its nearest competitor, the electromechanical Harvard Mark I.9

  The significance of using 18,000 vacuum tubes from the perspective of reliability is worth noting. The ENIAC was a synchronous machine, pulsed by a clock signal every 10 microseconds. At every pulse, a malfunction in any one of those tubes could introduce an error; even the failure of a single vacuum tube could cause a digit to be erroneous.10 With this many tubes, the reliability of the components was of the essence. By carefully selecting rigidly tested components that were then operated well below their “normal ratings,”11 the reliability of the computer was maintained at an acceptable level. Writing several months after the machine’s commissioning, Arthur Burks (1915–2008)—a mathematician who would later become known as much for his work as a computer theorist and philosopher of science as for being one of the ENIAC’s lead engineers—commented that, after the initial phase of testing, the failure rate was about two or three per week. These failures, however, could be identified and corrected quickly by operators thoroughly conversant with the ENIAC’s design so that, in effect, only a few hours were lost per week as a result of failures.12
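The scale of the reliability problem can be sketched with back-of-envelope arithmetic. The pulse period and tube count below are from the text; the “opportunities per second” figure is simply their product, offered only to make the engineering stakes vivid:

```python
# Back-of-envelope arithmetic for the ENIAC's reliability problem.
PULSE_PERIOD_US = 10      # one clock pulse every 10 microseconds (from the text)
TUBE_COUNT = 18_000       # vacuum tubes in the machine (from the text)

pulses_per_second = 1_000_000 // PULSE_PERIOD_US
# Every pulse is an opportunity for any one of the tubes to corrupt a digit:
tube_pulse_opportunities = pulses_per_second * TUBE_COUNT

print(pulses_per_second)         # 100000 pulses per second
print(tube_pulse_opportunities)  # 1800000000 tube-pulse events per second
```

Seen this way, a failure rate of only two or three tube faults per week testifies to how far below their normal ratings the components were run.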

  This, then, was one of the ENIAC’s major achievements: it demonstrated the viability of large-scale use of electronic components in digital computers. It heralded the viability and the advent of large-scale electronic digital computers.

  The other historically significant factor was that the ENIAC had consequences. Experience with its design, especially its organizational principles, and the resulting dissatisfaction showed the way for a new, crucial concept: the stored-program computer (discussed much more later). This concept was like the crucial missing piece of a jigsaw puzzle. It was instrumental in the formation of a style for the logical and functional organization of computers—for computer architecture (in present-centered language). So compelling was this style, so quickly was it accepted by the fledgling computer culture of its time, that it became the foundation of the first genuine paradigm in computing (in Thomas Kuhn’s sense; see Chapter 6). As we will see, discontent with the ENIAC was a catalyst that led to the birth of computer science.

  For these various reasons—its general-purposeness in the domain of numeric computation; its scale of physical size, technological complexity, and speed of operation; its consequence for the making of the paradigm for a science of computing—the ENIAC has a compelling place in our story. But there is more. The story of the ENIAC, in terms both of the genesis of its principles—that is, the past that fed into it—and of the future it helped to shape, forms a tangled web of ideas, concepts, insights, and personalities. From the story of the ENIAC we learn much about the ontogeny of artifacts (to use a biological term)—their developmental history.13

  II

  The ENIAC was, of course, a child of World War II, although it never saw wartime action. The ENIAC project began in June 1943 and the machine was commissioned in February 1946, exactly 6 months after the Japanese surrender and the end of the war. The project began in the Ballistics Research Laboratory (BRL) of the Aberdeen Proving Ground in Maryland. With the American entry into the war in December 1941, this laboratory established a scientific advisory committee of some of the country’s leading scientists.14 One of them was Hungarian-born John von Neumann (1903–1957), mathematician extraordinaire, a professor at the Institute for Advanced Study, Princeton (along with the likes of Einstein and Gödel), and an influential figure in the corridors of power in Washington, DC.

  Among the scientists—mathematicians, physicists, chemists, astronomers, and astrophysicists—assembled as BRL’s scientific staff for the war effort was the young Herman Goldstine (1913–2004), an assistant professor of mathematics at the University of Michigan until 1942, when he was called into wartime service. His charge at the BRL was ballistic computation.15

  Ballistic computation, solving differential equations to compute ballistic trajectories, demanded computing machines. The aim of these computations was to prepare firing tables; for a particular shell, up to some 3000 trajectories had to be computed over a range of initial firing conditions, such as muzzle velocity and firing angle.16
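To give a rough sense of what a single firing-table entry involved, the trajectory equations can be integrated numerically. This is only an illustrative sketch: the quadratic drag constant, step size, and point-mass model below are arbitrary assumptions, not BRL’s actual ballistic model:

```python
import math

def trajectory_range(v0, angle_deg, k=1e-4, dt=0.01, g=9.81):
    """Euler integration of a point-mass shell with simple quadratic drag.
    The drag constant k and step dt are illustrative values only."""
    theta = math.radians(angle_deg)
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    x, y = 0.0, 0.0
    while True:
        v = math.hypot(vx, vy)
        vx -= k * v * vx * dt          # horizontal drag deceleration
        vy -= (g + k * v * vy) * dt    # gravity plus vertical drag
        x += vx * dt
        y += vy * dt
        if y < 0.0:
            return x                   # approximate range at impact

# One table entry: range for a given muzzle velocity and firing angle.
print(trajectory_range(800, 45))
```

Each such integration must be repeated for every combination of muzzle velocity and firing angle; multiplied across some 3000 trajectories per shell, the appetite for machine computation becomes obvious.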

  As it happened, the Aberdeen Proving Ground had acquired, in 1935 (well before the BRL was founded), a “copy” of a machine called the differential analyzer, which was quite unlike the kind of computers developed in places such as Bell Laboratories or Harvard. The differential analyzer was an analog computer and, as the war came to America, was considered the most powerful device for the solution of differential equations, which was what it had been expressly designed to do.

  In 1931, an engineer named Vannevar Bush (1890–1974), a professor in the department of electrical engineering at MIT in Cambridge, Massachusetts, had invented the differential analyzer, a machine powered by an electric motor but otherwise a purely mechanical device. It was an analog machine because, rather than transform an analytical expression (such as a differential equation) into a digital computational problem (as was the tradition ever since Babbage), the mathematical behavior of the “system” of interest (a ballistic trajectory, for example) would be modeled by another physical system with behavior that corresponded exactly (or as closely as possible) to the system of interest. The model served as an analog to the problem system. By manipulating the model, the desired computation would be performed.

  Just as the numeric “digital style” of computing reached back to Victorian Britain, so also did the “analog style.” The doyen of late Victorian British science, Scottish mathematical physicist William Thomson, Lord Kelvin (1824–1907), conceived, in 1876, the “harmonic analyzer” for solving equations that described tidal behavior.17 Kelvin’s harmonic analyzer was never actually built. In any case, Bush, apparently unaware of Kelvin’s ideas until much later, went his own way to build his differential analyzer.18

  The principal computing unit in the differential analyzer comprised a set of six “integrators” for performing integration, the means by which differential equations would be solved. In addition, systems of gears performed the four arithmetic operations.19 A differential equation to be solved for y as a function of x, say, would be prepared in the form of one or more integrals. An integrator would be supplied with values of x, the variable of integration, and y, the integrand, and would produce the value of ∫ydx. The input values would be translated into amounts of physical movements (linear displacements, angular rotations) of disks, wheels, gears, shafts, and styluses, although the output could also be printed out on a printer. The whole system was driven electrically by the main shaft rotated by an electric motor, with this shaft representing the independent variable x.20 Other than this, the machine’s components and their interconnections were entirely mechanical.
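The integrator chain can be mimicked digitally. The sketch below is a crude stand-in, assuming fixed-step accumulation in place of continuous mechanical integration: two “integrators” connected in a feedback loop solve y″ = −y, a textbook differential-analyzer demonstration whose exact solution is y = sin(x):

```python
import math

def harmonic(x_end, dx=1e-4):
    """Two chained 'integrators' solving y'' = -y with y(0)=0, y'(0)=1,
    whose exact solution is y = sin(x)."""
    y, dy = 0.0, 1.0
    steps = int(round(x_end / dx))
    for _ in range(steps):
        y += dy * dx     # first integrator:  y  = integral of y' dx
        dy += -y * dx    # second integrator: y' = integral of -y dx (feedback)
    return y

print(harmonic(math.pi / 2))   # approximately 1.0 (= sin(pi/2))
```

On the mechanical machine, this feedback was wired by physically connecting the output shaft of one integrator to the input of the other; here the two accumulation lines play the same role.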

  In the words of Bush’s biographer, the differential analyzer was an “imposing contraption” yet “brutish.”21 Its operation would cause metal to “clank” against metal.22 Setting up a problem for solution entailed disassembling the machine and reassembling it so that the linear and angular displacements of its mechanical components represented the input values; various adjustments would be made to the components. More specifically, solving a particular set of differential equations involved the following steps:

  1. Problem preparation: the equation would be transformed into an integral form.

  2. Determination of the interconnections of the units needed to solve the equation.

  3. Manual connection of the physical units: this entailed placing gear wheels, addition units, and shafts into position and tightening them manually.

  4. Problem running: this necessitated setting up the initial conditions corresponding to the initial values of the variables.

  Typically, a problem setup would take 2 or 3 days.23 To obviate some of these problems, in a second version of the MIT differential analyzer built by Bush and his student (and, later, colleague) Samuel Caldwell (1904–1960), and completed in 1942, transmission of the values of variables through shafts and gears was replaced by electrical methods. Very soon thereafter, circumstances would overtake the differential analyzer completely.

  III

  During the 1930s, Bush’s differential analyzer attracted a great deal of attention. It was a “smashing success” according to Bush’s biographer.24 As we have noted, the Aberdeen Proving Ground acquired its own copy of the machine in 1935. Early during the 1940s, the Moore School of Electrical Engineering of the University of Pennsylvania in Philadelphia took charge of this machine on behalf of the BRL and started a program to train people in ballistic computation—especially women who had science degrees.25 In September 1942, Goldstine was put in charge of this operation at the Moore School,26 and thus was established the nexus between the Moore School of Electrical Engineering and the BRL—more fatefully, between the Moore School and computers. Through the exigencies and happenstance of wartime decisions, the Moore School planted for itself a secure place in the history of computing.

  After America entered World War II, the school became a center for computing firing tables.27 The differential analyzer was fully occupied. But then, as Burks recalled, a certain John Mauchly (1903–1980) suggested to Goldstine that these firing tables could be compiled much faster with electronic devices. At Goldstine’s request, Mauchly and two colleagues, Presper Eckert (1919–1995) and John Brainerd (1904–1988), wrote a proposal in April 1943 for an “electronic differential analyzer,” calculated to compute ballistic trajectories 10 times faster than the electromechanical differential analyzer. The proposal was submitted on behalf of the Moore School to the BRL.28

  Thus, two new protagonists enter this story, Mauchly and Eckert, whose names are forever twinned, as are other scientific “couples” in the annals of science, such as Cockcroft and Walton in atomic physics, Watson and Crick in molecular biology, Yang and Lee in theoretical physics, and Hardy and Littlewood in pure mathematics.

  Mauchly was a physicist who, at the time the war broke out, was on the physics faculty at Ursinus College, a liberal arts college just outside Philadelphia, and a person with a long-standing interest in automatic computing. Eckert (no relation to Wallace Eckert [see Chapter 5, Section V]) was a graduate student in electrical engineering, “undoubtedly the best electronic engineer in the Moore School.”29 The third author of the proposal, Brainerd, was a professor of electrical engineering at the Moore School, with a deep interest in using the differential analyzer. Goldstine persuaded the powers that be in the BRL to finance the project, which they did. The electronic differential analyzer later became the ENIAC.30

  So began a new chapter of this story, but a chapter embedded in a tangled web of means and ends, of people and personalities, of insights and ideas, of controversies.

  IV

  This part of the story must begin, in 1939/1940, in Ames, Iowa, with another pair of collaborators: John Vincent Atanasoff (1903–1995) and Clifford Berry (1918–1963). The former was on the faculty of the physics department at Iowa State College (later Iowa State University); the latter, a just-graduated electrical engineer and Atanasoff’s graduate student in physics. An unpublished memorandum authored by Atanasoff in 1940 spoke of the occurrence of systems of linear simultaneous algebraic equations in many fields of physics, technology, and statistics, and the necessity of solving such equations speedily and accurately.31

  The time to solve such equations of even moderate complexity manually—by a (human) computer32—was prohibitively large; equations containing a very large number of “unknowns” (that is, variables) were well nigh unapproachable. Like almost all his predecessors who have appeared in this story, Atanasoff was dissatisfied with the status quo. He sought help from automata. In his 1940 memorandum, Atanasoff wrote that he had begun to investigate the possibility of automating the solution of such problems some 7 years earlier,33 although it was only in 1939 that he began an actual project to build a computer for solving linear simultaneous equations, with Berry as his assistant. This machine was completed in 1942. Later, it would be called the Atanasoff–Berry Computer (ABC).34

  However, Atanasoff’s approach differed in some respects from his contemporaries. He considered, then discarded, punched-card tabulators because of their insufficient computational power. He dwelled on binary computation and concluded that it was superior to other number systems. He then considered the design of the actual mechanisms to use and decided, first, that he would use small electrical condensers (capacitors, in present-centered language) to represent the binary digits 0 and 1—a positive charge representing 1 and a negative charge, 0.35 As for the computing unit, it would be made of vacuum tubes.36

  The ABC became an operational electronic digital computer in 1942, capable of performing the task (up to some limit) it was designed to do. Later, it would be acknowledged as the world’s first operational electronic digital computer, preceding the Colossus. More important, for this story, it embodied a number of noteworthy innovations.

  First, as noted, vacuum tube circuits were used for the computational (arithmetic) units. By the mid 1930s, the use of vacuum tubes, resistors, and capacitors in radios was well established. However, these circuits operated in analog or continuous mode. Digital circuit elements such as flip-flops, counters, and simple switching circuits had also been implemented, but the digital use of vacuum tubes for building such circuits was quite rare.37

  Second, memory was implemented in the form of electrical condensers (capacitors) mounted on two rotating Bakelite drums, 8 inches in diameter and 11 inches long. A positive charge on a condenser represented 1; a negative charge, 0. Each drum could store thirty 50-bit numbers. As the drum rotated, these numbers could be read, processed, and replaced serially. The two drums were mounted on the same axle so that they could operate synchronously.
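The serial read-process-replace cycle the drum imposed can be sketched as follows. The class name and interface are hypothetical, purely illustrative of the access pattern (30 words of 50 bits, visited one at a time per rotation), not of the ABC’s actual bit-serial circuitry:

```python
class ToyDrum:
    """Toy model of one ABC storage drum: 30 words of 50 bits each,
    accessible only serially, one word per position per rotation."""
    WORDS, BITS = 30, 50

    def __init__(self):
        self.words = [0] * self.WORDS   # each int holds one 50-bit word

    def sweep(self, process):
        """One rotation: every word is read, processed, and written back."""
        mask = (1 << self.BITS) - 1
        for i in range(self.WORDS):
            self.words[i] = process(self.words[i]) & mask

drum = ToyDrum()
drum.words[0] = 0b101          # store the binary number 5
drum.sweep(lambda w: w + 1)    # e.g., add 1 to every word in one pass
print(drum.words[0])           # 6
```

The essential constraint the model captures is that computation had to be synchronized with rotation: a word could only be operated on when its capacitors passed the contacts, once per revolution.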