C/IL 102 (Computing & Information Literacy)
Highlights of The Machine That Changed the World: Giant Brains (PBS) (with some annotations)

Miscellaneous Observations:

The period 1939-1949 saw the introduction of the world's first few electronic computers. In hindsight, it is amusing to consider that, at the time, some "experts" (including IBM's T.J. Watson) predicted that society's computational needs could be satisfied by about six computers!

The advances in computer technology (in terms of cost and speed of computation) have been astounding. One observer remarked that, had automobile technology advanced as rapidly, today we would have cars that travel at 1,000,000 miles per hour and get 500,000 miles per gallon of gasoline! (Upon hearing a similar statement, an executive at one of the major automobile makers retorted, in an obvious poke at the Microsoft Windows operating system, that such cars would also frequently "crash" or break down for no apparent reason!)

Writing, which is a means for recording and communicating ideas, was invented about 5000 years ago. Its impact on the world was profound. (Indeed, it is the cornerstone of our intellectual and commercial lives.) It may be that the influence of computing on the world will, eventually, be seen to rival that of writing.

Inside an electronic computer, data (which represent thoughts, concepts, sounds, images, etc.) are stored in the form of bits (0's and 1's), which are themselves represented by patterns of (electrical) voltages. (A memory unit with high voltage corresponds to 1, say, and low voltage to 0.) Despite this apparent restriction, the applications of computers are wide-ranging. Unlike any other kind of machine, a computer can seemingly be transformed from one thing into another. Using a computer, an architect can draw, a musician can compose, a scientist can build and test models of complex phenomena, a scholar can perform a literature search, or a child can play a video game. This is in contrast to other kinds of machines, which, in fairness, can be described as "special purpose" in that they are capable of doing one thing, and one thing only, perhaps with a little variation. (E.g., a washing machine can wash clothes, and an oven can get hot. These machines are "programmable" only to a very limited extent: in the former, adjustments can be made to wash- and rinse-cycle times or to agitation speed; in the latter, temperature and cooking time can be set.)
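
To make the idea of bit patterns concrete, here is a small illustrative sketch (in Python, not part of the original notes) showing how the very same pattern of 0's and 1's can be read either as a number or as a character, depending on how a program chooses to interpret it:

    # The "voltage levels" described above are abstracted here as strings of 0's and 1's.
    number = 42
    letter = "A"

    print(format(number, "08b"))        # 00101010 : the integer 42 as eight bits
    print(format(ord(letter), "08b"))   # 01000001 : the letter 'A' (ASCII code 65) as eight bits

    # The same eight bits mean different things depending on how they are interpreted:
    bits = "01000001"
    print(int(bits, 2))                 # 65, if read as a number
    print(chr(int(bits, 2)))            # 'A', if read as a character code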

What makes computers more versatile is their programmability. They are not designed for one specific purpose; rather, they are designed to carry out instructions provided to them. Thus, it is the instructions that determine the application to which the machine is put! In effect, when a particular application program is executed, the computer becomes a virtual machine corresponding to whatever kind of application is expressed by that program. A computer running MS Word becomes a (virtual) word processing machine; running other software can make it into a (virtual) flight simulation machine or a (virtual) chess-playing machine.

Nowadays, tiny computers can be found everywhere—in automobiles, household appliances, etc. They help to control how the world works. Their ability to process large amounts of data has enabled new fields of science (e.g., radio astronomy) to develop in ways that would not have been possible otherwise.

Computers manipulate ideas (actually, representations of ideas). They can conjure up artificial universes and allow people to experience them. (This is called "virtual reality".) In manipulating representations of reality, computers are, in this sense, very similar to the human brain. One might say that a computer is a "mind-like" machine. (However, the analogy should not be taken too far!)

Given the tremendous versatility of computers, it is ironic that the original purpose for building the first computer in the U.S. focused on a single application: to carry out arithmetic calculations for the U.S. military in support of its efforts during World War II. Such calculations can be done electronically much faster (and with less likelihood of errors) than by humans. An illustrative, historical example: In the 1800's, William Shanks (an English schoolteacher) spent 28 years of his life calculating the decimal expansion of the irrational number Pi (the ratio of the circumference of a circle to its diameter, which turns out to be approximately 3.14159) out to 707 digits of accuracy. Unfortunately, he goofed in calculating the 528th digit, making the last few years of his work useless. (The calculation of each digit depends on the values of the previous ones.) A computer of today, given an appropriate program, can compute Pi to 707 digits of accuracy in a small fraction of a second.
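
As a concrete illustration of that last claim, the following short program (a Python sketch, not part of the original notes and not the program any particular historical computer ran) computes Pi to 707 decimal places using Machin's 1706 formula, Pi = 16*arctan(1/5) - 4*arctan(1/239); on an ordinary modern machine it finishes in well under a second:

    from decimal import Decimal, getcontext

    def arctan_recip(x, digits):
        """arctan(1/x) via its Taylor series, to roughly `digits` decimal digits."""
        total = Decimal(0)
        power = Decimal(1) / x                      # 1 / x^(2k+1), starting at k = 0
        x_squared = x * x
        k = 0
        threshold = Decimal(10) ** -(digits + 5)
        while power > threshold:
            term = power / (2 * k + 1)
            total += term if k % 2 == 0 else -term  # the series alternates in sign
            power /= x_squared
            k += 1
        return total

    def pi_digits(digits=707):
        """Pi via Machin's formula: Pi = 16*arctan(1/5) - 4*arctan(1/239)."""
        getcontext().prec = digits + 10             # carry a few guard digits
        pi = 16 * arctan_recip(Decimal(5), digits) - 4 * arctan_recip(Decimal(239), digits)
        return str(pi)[:digits + 2]                 # "3." followed by `digits` decimal places

    print(pi_digits(707))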

Prior to the 1940's, the term "computer" referred to a human whose job was to perform arithmetic calculations: the mental equivalent of "hard labor". Such calculations were necessary to provide various tables for scientists and engineers, including tables of logarithms, called "log tables". Owing to human imperfection, human computers often made errors in doing their calculations.


Significant Events and People in the History of Computing

Babbage's Analytical Engine
Charles Babbage (also see wikipedia entry) (1791-1871) was an English mathematician during the Victorian age. During this period of history (the Industrial Revolution), machines were beginning to be employed to perform physical labor previously carried out by humans. Babbage wondered whether a calculating machine (one "powered by steam", as he put it, alluding to steam-powered machines such as the locomotive) could be employed to relieve people from tedious mental work. Also, much calculation was in demand, due to the rise of fields such as civil and mechanical engineering, whose practitioners relied upon it.

With financial support from the British government, he designed and partially built a calculating device that he called the Difference Engine. (The name was due to the fact that the calculations carried out by the turning of its wheels and shafts were based on a mathematical procedure referred to as the "method of differences".) Construction was never completed, for at least two reasons. One was that Babbage was a poor manager. Perhaps more important, however, was that Babbage got sidetracked by the pursuit of a better idea: to build a much more versatile computing device, which he called the Analytical Engine.

The fundamental advancement in going from the Difference Engine to the Analytical Engine was that the latter was designed to be programmable. That is, rather than performing a fixed, predefined task, the Analytical Engine was designed to be capable of interpreting and executing instructions provided to it in the form of punched cards, similar to those used by the Frenchman Jacquard in his loom (a machine that makes cloth out of thread) in the early 1800's. (The idea is that each card encodes an instruction determined by the pattern of holes punched into it. Such cards were in common use until the 1980's.) The notion of what today we call software was born at the moment when Babbage conceived of separating the machine from its intended application. (That is, the machine itself was not to have any specific pre-defined purpose; rather, the purpose was determined by the instructions given to the machine to execute.)

Unfortunately, Babbage found it difficult to obtain monetary support for his Analytical Engine project (which was probably viewed as far-fetched or grandiose), and it was never completed.

It would be another century before Babbage's ideas would come to fruition. By then, trains, automobiles, and even airplanes had come into common use. Calculating machines, too, were commonplace. But they were merely what today would be called "adding machines", capable only of carrying out arithmetic operations under the direct control of a human user. Little or no progress in the field of (practical) general-purpose computing occurred until the late 1930's. (See Konrad Zuse.) Interestingly, in the early-to-mid 1930's, several mathematicians (including Kurt Gödel, Alonzo Church, and Alan Turing (see below)) had proved some deep results regarding the theoretical limitations of automatic computing.

Another short Babbage biography


Ada, Countess of Lovelace, who was a mathematician (and also the daughter of the poet Lord Byron), became an apprentice to Babbage, carrying on a long correspondence with him. She acted as something of a spokesperson for Babbage, translating his ideas into terms that others could understand. She wrote extensive notes describing how the Analytical Engine operated, as well as what today we would call computer programs that could have been executed by that machine, had it ever been built. (The contemporary programming language Ada is named after her.)


Konrad Zuse (also see wikipedia entry) (1910-1995) was a German engineer who, in the Nazi Germany of the late 1930's, began to build a general-purpose computer (in his parents' house!) because, in his words, he was "too lazy to calculate". (He also noted that "young people have better things to do than to study and calculate", a sentiment that is probably shared by most college students!)

Zuse used electro-magnetic relays (such as were used in the telephone switching system at the time) as the basis of data storage (i.e., memory). He realized that relays (which act as switches, i.e., devices that have two possible states, which are commonly referred to as on and off or as 0 and 1) serve this purpose well, assuming that the computer's calculations are made according to the binary (i.e., base 2) numeral system rather than the more familiar decimal (i.e., base 10) system. To this day, computers are designed to represent data using only 0's and 1's simply because it is easier and cheaper to do it that way.
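
To see why two-state devices suffice, here is a small illustrative sketch (again in Python, not part of the original notes) of the standard repeated-division procedure that re-expresses an ordinary decimal number using only the symbols 0 and 1:

    def to_binary(n):
        """Convert a nonnegative integer to binary by repeated division by 2."""
        if n == 0:
            return "0"
        bits = ""
        while n > 0:
            bits = str(n % 2) + bits   # each remainder supplies the next bit
            n //= 2
        return bits

    print(to_binary(13))    # 1101     (8 + 4 + 1)
    print(to_binary(100))   # 1100100  (64 + 32 + 4)

One relay (or vacuum tube, or transistor) per bit is all the memory such a representation requires.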

Although Zuse's efforts were interrupted for a short time by World War II (WWII), by the end of 1941 he had, with some financial aid from the German government, built a programmable, general-purpose computer. He had realized Babbage's dream, in effect. (Interestingly, he used discarded movie film (with holes punched into it to encode instructions) as a means for storing programs.) After getting some advice from a colleague, Zuse came to realize that, by using vacuum tubes rather than electro-magnetic relays, he could speed up his machine by a factor of 1000 to 2000. (Using relays, performing a multiplication took three to five seconds.) However, the German government refused to fund this project (because Hitler predicted that the war would be over before the project could produce anything useful), and Zuse never built this improved version of his computer. Zuse's great accomplishments did not become widely known until many years after WWII had ended.


The ENIAC
John Mauchly and J. Presper Eckert of the University of Pennsylvania headed the development and construction of ENIAC ("Electronic Numerical Integrator And Calculator" (or "Computer")) during the mid-1940's. During this period (i.e., WWII), the U.S. government employed thousands of "human computers" whose job was to compute "firing" tables for the military. (A firing table helped in determining the proper angle at which to set the barrel of a (large) gun, given its position relative to that of the intended target and taking into account weather conditions such as wind and temperature.) Doing all the calculations to produce one table required about four person-years of work. It was evident that faster calculation was needed; hence the Army's interest in ENIAC. The U.S. Army funded the project, despite the claims of some experts that, given the unreliability of vacuum tubes (of which the proposed machine was to contain 18,000, which was hundreds of times as many tubes as had ever been incorporated in a single system), the machine was likely to break down every five seconds or so. (They were wrong, however. As Mauchly says on the videorecording, the machine was designed to allow individual components to fail without causing the entire machine to break down. Today this approach is sometimes called fault tolerance.)

ENIAC filled a 50 ft. × 30 ft. room. It had tens of thousands of electrical parts and a half million soldered joints. It could perform 5000 additions per second, which is pitifully slow by today's standards, but was impressive at the time. Although it was programmable, to change its programming required technicians to (manually) set 6000 switches and replug hundreds of cables connecting different parts of the machine. This could take hours or days (during which the machine could do no useful work, of course)! This design weakness was corrected in later computers, which incorporated the "stored program" approach (see below).

The U.S. Army perceived (or at least lauded) ENIAC as a big success, despite the fact that it was not completed until after the war had ended.

Eckert and Mauchly, in a dispute with the University of Pennsylvania over the patent rights to ENIAC, left the university to found the world's first computer company. It would be several years before they delivered their first commercial computer (called UNIVAC) to a customer. The company later became Sperry-Univac and is now Unisys.

John von Neumann (1903-1957) was a prominent U.S. mathematician (originally from Hungary) during the 1940's. He joined the ENIAC project as it was nearing completion. He is best remembered (at least in computing circles) for authoring the so-called EDVAC Report (1945) that, in effect, became the basic blueprint for the design of nearly all subsequent electronic computers. Even present-day computers are based on the concepts set forth in that report, and are thus said to possess a von Neumann architecture. Although von Neumann is often credited with developing the stored program concept, others had independently arrived at the same idea. The stored program concept refers to using a computer's primary memory to store not only data, but also the instructions (i.e., the program/software) that are to be carried out in order to manipulate that data. This is in contrast to a machine such as ENIAC, whose "program" was determined by the details of how its (physical) switches and plugs were set/configured, meaning that, to be "re-programmed", it needed to be physically altered.
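
To illustrate the stored-program idea, here is a toy sketch (in Python; the instruction names LOAD, ADD, STORE, and HALT are invented for illustration and do not correspond to any historical machine's instruction set). Instructions and data occupy the same memory, so "re-programming" the machine means writing different instructions into memory rather than resetting switches and replugging cables:

    def run(memory):
        """Fetch, decode, and execute (opcode, operand) pairs stored in `memory`."""
        pc = 0                        # program counter: index of the next instruction
        acc = 0                       # accumulator register
        while True:
            op, arg = memory[pc]
            pc += 1
            if op == "LOAD":          # acc <- memory[arg]
                acc = memory[arg]
            elif op == "ADD":         # acc <- acc + memory[arg]
                acc += memory[arg]
            elif op == "STORE":       # memory[arg] <- acc
                memory[arg] = acc
            elif op == "HALT":
                return

    # Cells 0-3 hold the program; cells 4-6 hold the data it manipulates.
    memory = [
        ("LOAD", 4),                  # 0: acc = memory[4]
        ("ADD", 5),                   # 1: acc = acc + memory[5]
        ("STORE", 6),                 # 2: memory[6] = acc
        ("HALT", 0),                  # 3: stop
        2, 3, 0,                      # 4, 5, 6: data
    ]
    run(memory)
    print(memory[6])                  # 5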


Freddie Williams, a British radar engineer, was the leader of the team that produced the first stored-program electronic computer, which first operated at Manchester University (Great Britain) in June 1948.


Alan Turing (also see wikipedia entry) (1912-1954) was a British mathematician who legitimately could be called the "father" of computer science. He was one of the first people to fully appreciate the general applicability of computers not just to numerical problems but, more generally, to symbol manipulation problems. He predicted that, by the year 2000, an "intelligent" computer would have been created (suggesting that his prognostication skills fell short of his math skills). Such a computer would be capable of passing the Turing Test, which basically means that it could carry on an intelligent conversation with a human being (so that the human could not determine whether (s)he was conversing with another human or with a machine).

Turing is also remembered for the prominent role he played in breaking the German encryption codes during WWII. (This happened at Bletchley Park, a secret location in Great Britain whose existence was not even revealed until 1970, twenty-five years after the war had ended.) In particular, he led the construction of an electro-mechanical device, called the Bombe (also see wikipedia entry), that was employed in the code-breaking effort. (The videorecording incorrectly states that Turing led the development of Colossus, a very different code-breaking machine that was also used at Bletchley Park. Turing was involved with Colossus, but only peripherally.)

Turing died under mysterious circumstances (most believe that he committed suicide) after being convicted of crimes relating to, and being forced to receive medical treatment for, homosexuality.

Summary of Main Points

Among the most important points made in the videotape are those relating to the original motivations for building computing devices and to the idea of programmability, which lies at the foundation of why the computer is perhaps the most general-purpose kind of machine ever built.