We argued, at the conclusion of the previous chapter, that the failure of Babbage to realise a functioning prototype of the Analytic Engine - arguably the first proposal for a programmable `computer' - was due not to any lack of conceptual insight but was, rather, an inevitable consequence of the inadequacies of contemporary engineering technology. Babbage had sought to construct a programmable calculating tool using, as its basis, the interaction of mechanical gear wheels meshed together. Undoubtedly the most significant technological insight in the genesis of modern digital computers was the realisation that binary switching components provided not only a sufficient basis for the implementation of automated calculators but would, indeed, also be a necessary foundation of any machine as ambitious in concept as Babbage's Analytic Engine, i.e. a machine which could:
Under a sufficiently large `stimulus' the switch is closed and (while the stimulus remains sufficiently `high') a signal can pass from the source to the output. If the stimulus and signal are both electrical, and the concept of a `high enough stimulus' is that of carrying a large enough voltage, then such a configuration gives a convention for representing 0 (the voltage on the output is too small to close a switch) and 1 (the voltage on the output is large enough to close a switch). We can thereby represent arbitrary numbers using a notational system called binary, e.g. the number 20 is represented by 5 switches set (in order) to 10100. We can, however, do much more than merely represent numbers: we can also manipulate such representations by using an appropriately powered (e.g. electrical) system of switches. Thus consider the configurations in Figure 16 below.
In (i) the output equals `1' if both A and B equal 1; in (ii) the output equals `1' if A or B equals 1. The algebra developed by George Boole, introduced at the end of Chapter 4, provides a mechanism for reducing complex arithmetic operations to networks of switching components representing the logical operations AND, OR, and negation, which manipulate binary representations of numbers. Thus, in principle, one can build systems to carry out arithmetic (and other operations) automatically. Using electrically driven switches we can
The first such `switching components' were electro-mechanical relay-contact switches. These comprised two metal contacts connected by a hinge; on receipt of a sufficiently high voltage the two metal contacts closed together, allowing a current to pass through them.
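The behaviour of the two switch configurations can be sketched in Python (purely as an illustration): configuration (i), switches in series, behaves as logical AND, and configuration (ii), switches in parallel, as logical OR. The half adder below is a standard textbook construction, not something described in this chapter, showing how Boole's three primitives suffice for the first step of binary addition:

```python
# Configuration (i): switches in series pass a signal only if both are
# closed - logical AND. Configuration (ii): switches in parallel pass a
# signal if either is closed - logical OR. Negation completes Boole's set.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# A half adder built only from these primitives: the classic first step
# in reducing arithmetic to switching networks (illustrative example,
# not taken from the text).
def half_adder(a, b):
    sum_bit = OR(AND(a, NOT(b)), AND(NOT(a), b))   # XOR from AND/OR/NOT
    carry   = AND(a, b)
    return sum_bit, carry

print(half_adder(1, 1))    # (0, 1): 1 + 1 = binary 10
print(format(20, '05b'))   # 10100, the five-switch representation of 20
```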
Probably the first person to appreciate the importance of electrical components and Boolean algebra, as concepts that could be used to build a general-purpose computing tool, was the German engineer Konrad Zuse. Zuse had studied engineering in Berlin and in the course of his training had to learn about solutions to simultaneous linear equations. In such equations one is given a set of k identities involving k variables, e.g.
3x + 2y + 3z = 16
5x + 3y + 2z = 17
4x + 6y + 7z = 37
and one is required to find (usually unique) values for the k variables which simultaneously satisfy the k identities. In the example, the (unique) solution is x=1, y=2, and z=3. Such systems commonly arise in engineering applications as a model for representing trade-offs between different criteria in building structures, e.g. the weight, strength, rigidity, etc. of the materials used. Such systems are not mathematically difficult to solve; however, considerable amounts of calculation are involved. The traditional approach (known as Gaussian elimination) becomes extremely laborious when more than four identities are to be dealt with. In engineering applications such as architecture, systems involving several hundred equations often arose, and constructing feasible solutions to such systems could require months of work by a number of teams. Zuse conceived the idea of carrying out the required calculations by a machine while he was a student in 1934. He recognised that such a machine would need to meet three basic criteria.
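Gaussian elimination, the traditional method mentioned above, can be sketched in a few lines of Python. This is a minimal illustration of the technique, not Zuse's machine procedure; it solves the example system (taking the first right-hand side as 16, so that x=1, y=2, z=3 is the solution):

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    # build the augmented matrix [A | b]
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # partial pivoting: bring the largest remaining entry to the diagonal
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        # eliminate the column below the pivot
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # back substitution
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

A = [[3, 2, 3], [5, 3, 2], [4, 6, 7]]
b = [16, 17, 37]
print(gauss_solve(A, b))   # approximately [1.0, 2.0, 3.0]
```

The amount of arithmetic grows roughly with the cube of the number of equations, which is why systems of several hundred equations were so laborious by hand.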
1) This design was proposed without any knowledge of Babbage's work, which Zuse did not encounter until 1939.
2) Approximately 20,000 pounds at current prices.
One of the important innovations of the Z3 was its practice of treating data as fixed-length words (22 bits long), so that all input numbers were translated into this form for processing. The practice of designing machines in terms of some fixed word size continues in machines built today, where 32 and 64 bits are common sizes.
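The idea of a fixed word size can be illustrated as follows. Note that the masking and two's-complement scheme below is the convention of modern 32- and 64-bit machines, used here as an illustration; it is not a description of the Z3's actual 22-bit number format:

```python
WORD_BITS = 22                 # word size used for the sketch (the Z3's, per the text)
MASK = (1 << WORD_BITS) - 1    # 22 one-bits

def to_word(n):
    """Encode an integer into a fixed-width word (two's-complement wrap-around)."""
    return n & MASK

def from_word(w):
    """Decode a fixed-width word back to a signed integer."""
    return w - (1 << WORD_BITS) if w >> (WORD_BITS - 1) else w

# every input number is forced into the same 22-bit form
print(format(to_word(20), f'0{WORD_BITS}b'))   # 0000000000000000010100
print(from_word(to_word(-5)))                  # -5
```

A fixed word size means every register, memory cell, and data path can be built to one uniform width, which greatly simplifies the hardware.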
Zuse went on to build a larger machine, the Z4, which, after being relocated a number of times during its construction to avoid bombing attacks, was completed in the late 1940s. It was subsequently installed in a technical institute in Zurich in 1950, where for a number of years it was the only powerful calculating tool available in mainland Europe. The Z3 no longer exists: it was destroyed by an air-raid on Berlin in April 1945.
The work of Zuse and Schreyer was unknown outside Germany at the time. Most of the innovations made by the two were independently developed in the United States (and Britain) in the 1940s.
The history of electronic digital computer development in the United States over this period is extremely complex and still regarded as a matter of some controversy. One result of this confusion was the series of patent disputes involving the company representing two of the significant pioneers of this work - J. Presper Eckert (1919-) and John Mauchly (1907-1980). The disputes started in 1952; Eckert and Mauchly finally lost their patent case (under which they claimed to have `invented' the first `modern electronic computer') in the Federal District Court on October 19th 1973.
In this section we only have space to review briefly the main contributors to the development of the important early machines in the U.S. and to outline the work of the individuals responsible for them. As well as Eckert and Mauchly, two other figures are significant: John Atanasoff (1904-) and John von Neumann (1903-1957). That Eckert and Mauchly were not granted legal recognition of `their' invention may, in some part, be attributed to claims made on behalf of these two.
Independently of these four, however, Howard Aiken at Harvard University had proposed, in 1937, some ideas concerning the structure of what he considered an ideal computer: one that could represent negative as well as positive numbers, perform all standard arithmetic operations, and carry out sequences of operations. Aiken's idea combined elements of Babbage's work with the additional concept of using the punched-card representation for data as delineated by Hollerith. Aiken was able to interest I.B.M. (which Hollerith's original company had become) in supporting his proposals. The machine, subsequently called the Mark 1, was built between 1939 and 1944. As with Zuse's Z machines it employed electromechanical relays, and comprised three-quarters of a million parts. Its speed at individual arithmetic operations was similar to that of the Z3, and when finished it was capable of completing six months' manual calculation in less than a single day's working.
The Mark 1 was already out-of-date by the time it became operational in 1944. It had been superseded by the machine being developed at the Moore School of Electrical Engineering at the University of Pennsylvania. Before this project started, an earlier machine had been designed by John Atanasoff and Clifford Berry. Their machine was to be based on valve devices which, although favoured by Schreyer, had been ignored by Zuse, who preferred to use relays. Atanasoff had succeeded in constructing an electronic calculator that could perform simple arithmetic. He and Berry then set about scaling this simple machine up into a more complex computer. Their design provided for a novel storage medium on magnetic drums, 50-bit binary arithmetic, and input on punched cards. A number of components, including arithmetic units, were built, and later studies of the full design (made in the 1960s) concluded that the proposed machine would have worked as a special-purpose calculating machine. The machine, however, was never completed, as both Berry and Atanasoff abandoned it to work for the U.S. military at the onset of the Second World War. The subsequent recognition of Atanasoff's work (over 20 years after it was done) has led to the Atanasoff-Berry Computer (or ABC) being used to justify claims that Atanasoff invented the first general-purpose electronic computer. These claims are defended by Iowa State University, where the two worked, and were accepted in the court ruling invalidating the patent suit filed on behalf of Eckert and Mauchly. While it is certainly the case that the design of the ABC was technically sound, the machine was never completed, and on this ground Eckert and Mauchly's claim to be the first U.S. developers has some foundation.
Like Schreyer and Atanasoff, Mauchly was interested in employing valves as the basic switching component for constructing an electronic calculating machine. He was interested in the problem of weather forecasting and the possibility of carrying it out by machine. Mauchly met Atanasoff after presenting a paper on his approach to tackling this problem. Mauchly's mechanism, however, performed its analyses on analogue electronic signals. Atanasoff and Mauchly disagreed about the outcome of their discussions. Nevertheless, Mauchly subsequently turned to working on a digital electronic machine based on valves. Eckert and Mauchly came together while the former was studying at the Moore School. Eventually, in May 1943, they succeeded in persuading the U.S. Government to finance the construction of an electronic computer. This was to become known as ENIAC: Electronic Numerical Integrator and Computer. The initial budget for the machine was 61,700 dollars. It was completed in the autumn of 1945 and has a strong claim to be the first ever general-purpose electronic machine. The total cost of the project by the time a demonstration was held in February 1946 was almost half a million dollars. The final machine had two noticeable features: it was extremely large (measuring 100 by 10 by 3 feet, weighing 30 tons, containing over 100,000 components, and occupying an area of 1800 square feet), and it was far faster than anything that had been built previously, being able to multiply in under 3/1000 of a second. The publicity demonstration in 1946 involved calculating tables of trigonometric functions and powers of large numbers; all of these calculations were performed in a matter of seconds.
The final figure of importance in our review of U.S. developments is John von Neumann. Although von Neumann made a number of technical improvements to the initial design of ENIAC and was involved in setting up its successor project - EDVAC, which would be a programmable machine - he is principally of importance for his theoretical work on computer structures (or architecture) and his role in convincing sceptical authorities of the significance of the potential such machines offered. He has (mistakenly) been credited as the inventor of the digital computer as a result of circulating the first draft description of its operation. Among his many important contributions to Computer Science is his development of a structural description of how computers can be organised: this is now known as the von Neumann model and, in effect, describes the relationship between memory, program, and control units. In addition, von Neumann became involved with the design and construction of one machine - the IAS - which could be used to demonstrate his ideas; conceived in 1946, this machine was not completed until 1950. Nevertheless, von Neumann came to be regarded in the U.S. as the principal authority on computer design.
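The essence of the von Neumann model - program and data held in one shared memory, with a control unit repeatedly fetching and executing instructions - can be sketched as a toy simulator. The instruction set (LOAD, ADD, STORE, HALT) and memory layout below are invented for this sketch and do not correspond to any historical machine:

```python
# A toy von Neumann machine: one memory holds both the program (cells 0-3)
# and the data (cells 4-6); the control unit fetches, decodes, and executes.
def run(memory):
    acc, pc = 0, 0               # accumulator and program counter
    while True:
        op, arg = memory[pc]     # fetch the instruction at the program counter
        pc += 1
        if op == 'LOAD':         # decode and execute
            acc = memory[arg]
        elif op == 'ADD':
            acc += memory[arg]
        elif op == 'STORE':
            memory[arg] = acc
        elif op == 'HALT':
            return memory

# program: load cell 4, add cell 5, store the result in cell 6, halt
memory = [('LOAD', 4), ('ADD', 5), ('STORE', 6), ('HALT', None), 20, 22, 0]
print(run(memory)[6])   # 42
```

Because instructions live in the same memory as data, a program can in principle be read and modified like any other data - the key property separating the stored-program machines discussed below from ENIAC's externally wired programs.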
Zuse and Schreyer, in trying to develop their ideas for a general-purpose computer, met with apathy when seeking support from the German High Command. Some military historians have claimed that the failure of German military intelligence to appreciate the potential uses of such a machine, and thereby divert resources from other research, was a major strategic error. Clearly, it is impossible to say for certain whether this was indeed the case. Nevertheless, if neglecting to pursue computer development was a technical error, it was not one that was also made by British and U.S. research strategists. Much of the impetus behind developments in the U.S. arose from the need for the army to be able to calculate trajectories of projectiles quickly and accurately. In the U.K. the main motivating factor in the development of computers for military use came from another application: that of deciphering intercepted coded messages.
In 1938 the British Intelligence service managed to obtain a complete working description of the German ENIGMA machine from Richard Lewinski, a Polish Jew who had been employed at the factory where these machines were built. Lewinski had been dismissed from his post on account of his religion. The ENIGMA machine was a coding device: by setting up a secret key, messages typed on it would appear in an encrypted form; the receiver of the message who knew the key could then decipher its actual content. ENIGMA was the mechanism by which all command orders and strategic decisions were communicated through the High Command to field officers. As such the encryption device (and, of course, the rota for setting keys) was carefully guarded. Lewinski had been able to pass on a precise description of the machine since, possessing a formidable memory, he had remembered exact details of its construction and operation while employed where it was built. In principle, once the British Intelligence service understood the workings of ENIGMA they would have complete knowledge of German military operations.
The machine, however, was not enough in itself. Having intercepted a message it was still necessary to know the keyword that had been used to encode it, and codes were changed three times a day. Knowledge of the internal working of the machine greatly reduced the range of possible keys consistent with a particular intercepted text, but typically there would be an enormous number of possible keys left to test. The frequency of code changes and the labour involved in testing a possible key meant that the task of decoding messages clearly needed some degree of non-human intervention. It was to assist in this that, early in the Second World War, Project ULTRA was started at Bletchley Park. The aim of this project was to construct a machine that would quickly identify code keys so that messages could be decrypted. The most important figure involved with the research at Bletchley Park was the English mathematician Alan Mathison Turing (1912-1954). Turing was to become a significant pioneer in computer development and popularisation. In 1936 he had addressed the question of whether it was possible to specify algorithms (i.e. programs) for every conceivable function and demonstrated that a particular problem - that of deciding whether an algorithm halts on a given input^3 - could not, in general, be solved.
3) Now known as the Halting Problem.
Turing was also one of the first people to consider the question of whether machines are capable of `intelligent' behaviour, and was thus a founder of the Computer Science discipline of Artificial Intelligence. He proposed a method, now called the Turing Test, for determining whether a machine is acting in an `intelligent' way: namely, that a human observer monitoring the responses to questions from the machine and another human participant would be unable to distinguish which set of responses came from the machine and which from the human.
Researchers at Bletchley Park initially built a number of small relay-based machines to process potential keys. While these devices were of considerable help in decoding, it was eventually decided to build a valve-based machine to automate the decryption process fully. Construction of this machine, called COLOSSUS, started in January 1943 and was completed by December of the same year. There is a strong case for regarding COLOSSUS as the first working electronic computer^4, despite the fact that it was limited to carrying out a special-purpose task. COLOSSUS comprised 1800 valves and could read 5000 characters per second via a paper tape reader. By the end of 1944 a larger machine, the Mark II, had also been built.
4) The British work at Bletchley Park was not a factor considered in the Eckert-Mauchly patent dispute in the U.S. since the existence of COLOSSUS and documents relating to it were classified under the Official Secrets Act in the U.K. Details about this work were not publicly released until 1976.
British engineers were among the first to develop and enrich the ideas that had produced the ENIAC computer in the U.S. In July 1945, Douglas Hartree had visited and talked with a number of people who had worked on this machine (despite the fact that its operation was technically classified information owned by the U.S. Military). On his return to Britain, Hartree tried to encourage British development of similar machines. As a consequence a number of institutions started work on such projects. At Cambridge University, a team coordinated by Maurice Wilkes built the EDSAC computer. This was the first practical stored-program computer and, of the machines considered so far, the closest to modern designs. The EDSAC led to two significant innovations: it was the basis of the world's first user Computing Service, and user programs were coded in an assembler language. Since machines operate on binary codes, a description of the program instructions has to be supplied in this form. This, however, is difficult to do directly since it is very easy to enter part of an instruction wrongly. To overcome this problem, simple English-language mnemonics are used to write the program, which is then translated into the equivalent binary patterns understood by the machine.
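The translation from mnemonics to binary patterns can be sketched as a toy assembler. The mnemonics, opcode values, and the 5-bit/11-bit word layout below are invented for this illustration; they are not EDSAC's actual order code:

```python
# A toy assembler: each mnemonic stands for a binary opcode pattern, and
# 'MNEMONIC address' is packed into a 16-bit word (5-bit opcode followed
# by an 11-bit address). All values here are hypothetical.
OPCODES = {'ADD': 0b00001, 'SUB': 0b00010, 'LOAD': 0b00011, 'STORE': 0b00100}

def assemble(line):
    """Translate one 'MNEMONIC address' line into its 16-bit binary word."""
    mnemonic, addr = line.split()
    word = (OPCODES[mnemonic] << 11) | int(addr)
    return format(word, '016b')

print(assemble('ADD 13'))   # 0000100000001101
```

The programmer writes the readable left-hand form; a translation step of this kind produces the bit patterns the machine executes, removing the error-prone task of entering binary by hand.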
Other British computers followed from the work at Cambridge. Wilkes assisted the Lyons Company in designing a machine (LEO, or Lyons Electronic Office) which handled their accounts and office transaction work; LEO was handling clerical tasks by the end of 1951. At Manchester University the first of a long series of machines, the Manchester Mark I, was built between 1947 and 1949 and was followed by the Mark II in 1951. In February 1946 Alan Turing had put forward a complete design for a stored-program computer - the Automatic Computing Engine, or ACE - which it was envisaged would provide a national computing facility. Unable to interest Wilkes at Cambridge or Williams at Manchester (both of whom disliked Turing), Turing developed his design at the National Physical Laboratory in Teddington. When Turing left the NPL in 1947 a new design based on his, but containing a number of fundamental new ideas, was developed under the name Pilot ACE. Although the Pilot ACE was working by the middle of 1950 it did not become fully operational until 1952. Turing died in 1954^5
5) Of cyanide poisoning. The inquest subsequently ruled that Turing had committed suicide, citing as motive Turing's prosecution for homosexual activity. Turing's relatives, however, have always maintained that his death was accidental: given Turing's eccentric habit of mixing lethal chemicals together to see what the outcome would be, there is some degree of justification for their claim.
without seeing all of his ideas for the ACE realised.
The first half of the twentieth century saw the development of computer systems in essentially the form in which they are common today. Dispensing with purely mechanical emulations of arithmetic processes and moving to electrically based switching components, together with the formalisms of binary arithmetic and Boolean algebra, allowed fast and fairly reliable calculating machines to be built. All of the major technological improvements in computer construction since this innovation have come about as a result of the engineering of better (smaller, faster, and more reliable) binary switching devices. By the middle 1950s machines of unprecedented size and speed were operational at many U.K. institutions and several U.S. sites. Despite this progress, however, some significant problems remained concerning the ease of using these machines. Although Wilkes' team had originated the idea of assembly languages, the coding of programs was still a time-consuming and error-prone activity. In the final chapter of these notes we shall examine the developments that took place from the late 1950s onward which would render the task of programming computers far simpler.