Tools for Thought by Howard Rheingold

The idea that people could use computers to amplify thought and communication, as tools for intellectual work and social activity, was not an invention of the mainstream computer industry or orthodox computer science, nor even homebrew computerists; their work was rooted in older, equally eccentric, equally visionary, work. You can't really guess where mind-amplifying technology is going unless you understand where it came from.

- HLR

Chapter One: The Computer Revolution Hasn't Happened Yet
Chapter Two: The First Programmer Was a Lady
Chapter Three: The First Hacker and his Imaginary Machine
Chapter Four: Johnny Builds Bombs and Johnny Builds Brains
Chapter Five: Ex-Prodigies and Antiaircraft Guns
Chapter Six: Inside Information
Chapter Seven: Machines to Think With
Chapter Eight: Witness to History: The Mascot of Project Mac
Chapter Nine: The Loneliness of a Long-Distance Thinker
Chapter Ten: The New Old Boys from the ARPAnet
Chapter Eleven: The Birth of the Fantasy Amplifier
Chapter Twelve: Brenda and the Future Squad
Chapter Thirteen: Knowledge Engineers and Epistemological Entrepreneurs
Chapter Fourteen: Xanadu, Network Culture, and Beyond
Footnotes

Chapter Four:
Johnny Builds Bombs and Johnny Builds Brains

If you asked ten thousand people to name the most influential thinker of the twentieth century, it is likely that not one of them would nominate John von Neumann. Few would even recognize his name. Despite his obscurity outside the communities of mathematicians and computer theorists, his thoughts had an incalculable impact on human destiny. He died in 1957, but the fate of the human race still depends on how we and our descendants decide to use the technologies von Neumann's extraordinary mind made possible.

At the end of his life he was an American, and a power behind the scenes of American scientific policy and foreign policy. But that was only the last of several equally distinguished identities in different countries and fields of thought. Janos Neumann, known as "Jansci," was a prodigious young chemical engineer turned mathematician and logician in Hungary in the early 1920s. Johann von Neumann was one of the elite quantum physics revolutionaries in Göttingen, Germany, in the late twenties. And from 1933 until his death, he was John von Neumann of Princeton, New Jersey; Los Alamos, New Mexico; and Washington, D.C., known to professors and Presidents as "Johnny."

Ada and Babbage could only dream of the day their device could be put to work. Turing was a tragic victim of political events before he could get his hands on a computer worth the name. Johnny, however, not only managed to get his machines built and used them to create the first working principles of software, but also ended up telling his government how to use the new technology. He was responsible for much more than the first boost in the accelerating American effort to develop computer technology.

A combination of many different scientific and political developments led to the invention of ENIAC. Electronic tube technology, Boolean logic, Turing-type computation, Babbage-Lovelace programming, and feedback-control theories were brought together because of the War Department's insatiable hunger for raw calculating power. John von Neumann was the only man who not only knew enough about the scientific issues but moved comfortably enough in the societies of Princeton and Los Alamos and Washington to grasp the threads and weave them together in an elegant and powerful design.

Von Neumann was a very important, probably indispensable, member of the Manhattan Project scientific team. Oppenheimer, Fermi, Teller, Bohr, Lawrence, and the other members of the most gifted scientific gathering of minds in history were as awed by Johnny's intellect as anyone else who ever met him. More impressively, they were as reliant on his mathematical judgment as anyone else. In that galactic cluster of world-class physicists, chemists, mathematicians, and engineers, it was a rare tribute that von Neumann was put in charge of the mathematical calculations upon which all their theories--and the functioning of their "gadget"--would depend.

As if his significant contributions to the development of the first nuclear weapons and the first computers were not enough for one man, he was also one of the original logicians who had posed the questions that Turing and Kurt Gödel answered in the 1930s. He was a cofounder of the modern science of game theory (picking up where Babbage left off), one of the founders of operational research (also, curiously, advancing a field first explored by Babbage), an active participant in the creation of quantum physics, one of the first people to suggest analogies and differences between computer circuits and brain processes, and one of the first scientists since Turing to examine the relationship between the mathematics of code-making and the mystery of biological reproduction.

Von Neumann ended up a key policy-maker in the fields of nuclear power, nuclear weapons, and intercontinental ballistic weaponry: he was a commissioner of the Atomic Energy Commission and an influential member of the ICBM Committee. Generals and senators were lucky to get an appointment with him. Even when he was dying, the most powerful men in the world gathered around for a final consultation. According to Admiral Lewis Strauss, former chairman of the Atomic Energy Commission: "On one dramatic occasion near the end, there was a meeting at Walter Reed Hospital where, gathered around his bedside and attentive to his last words of advice and wisdom, were the Secretary of Defense and his deputies, the Secretaries of the Army, Navy, and Air Force, and all the military Chiefs of Staff."

John von Neumann's political views, undoubtedly rooted in his upper-class Hungarian past, were unequivocal and extreme, according to the public record and his biographers. He not only used his scientific expertise to hasten the development of nuclear weapons and computer-guided missiles, but counseled military and political leaders to think about using these new American inventions against the USSR in a "preventive war." (In an article in Life magazine, published shortly after he died, von Neumann was quoted as saying: "If you say why not bomb them tomorrow, I say, why not today. If you say at five o'clock, I say why not one o'clock.")

In contrast to Turing, whom he knew from Turing's prewar stay at Princeton and from their wartime work, von Neumann was a sophisticated, worldly, and gregarious fellow, famous for the weekly cocktail parties he and his wife hosted during his tenure at Princeton's Institute for Advanced Study and up on the Mesa at Los Alamos. He had a substantial private income and an additional $10,000 a year from the Institute. He was widely known to have a huge repertoire of jokes in several languages, a vast knowledge of risqué limericks, and a casual manner of driving so recklessly that he demolished automobiles at regular intervals, always managing to emerge miraculously unscathed.

Despite his apparently charmed existence, von Neumann, like Ada Lovelace and Alan Turing, died relatively young. Lovelace died of cancer at thirty-six, Turing of cyanide at forty-two, and von Neumann of cancer at fifty-three. Like many other Los Alamos veterans, he may have been a victim of exposure to radiation during the early nuclear bomb tests. His death came as a shock to all who knew him as a vital, lively, peripatetic, seemingly invulnerable individual. Stanislaw Ulam, von Neumann's mathematical colleague and lifelong friend, in a memorial to Johnny published in a mathematical journal shortly after von Neumann's death, described his physical presence in loving detail:

Johnny's friends remember him in his characteristic poses: standing before a blackboard or discussing problems at home. Somehow his gesture, smile, and the expression of the eyes always reflected the thought or the nature of the problem under discussion. He was of middle size, quite slim as a young man, then increasingly corpulent; moving in small steps with considerable random acceleration, but never with great speed. A smile flashed on his face whenever a problem exhibited features of a logical or mathematical paradox. Quite independently of his liking for abstract wit, he had a strong appreciation (one might almost say a hunger) for the more earthy type of comedy and humor.

Everyone who knew him remembers to point out two things about von Neumann--how charming and personable he was, no matter what language he was speaking, and how much more intelligent than other human beings he always seemed to be, even in a crowd of near-geniuses. Among his friends, the standard joke about Johnny was that he wasn't actually human but was as skilled at imitating human beings as he was at everything else.

Born into an upper-class Hungarian Jewish family, Jansci was fluent in five or six languages before the age of ten, and he once told his collaborator Herman Goldstine that at age six he and his father often joked with each other in classical Greek. It was well known that he never forgot anything once he read it, and his ability to perform lightning fast calculations was legendary.

One night in the middle of the summer of 1944, von Neumann encountered by happenstance a mathematician of past acquaintance in the Aberdeen, Maryland, train station. History might have been far different if one of their trains had been scheduled a few minutes earlier. That accidental meeting in Aberdeen presented von Neumann with a nearly completed approach to a problem the strategic significance of which he was uniquely equipped to understand, the details of which were complex and profound enough to attract his intellectual curiosity, the successful completion of which could be hastened by the use of his political clout.

Lieutenant Herman Goldstine, then associated with the U.S. Army Ordnance Ballistic Laboratory at Aberdeen, Maryland, didn't know anything about the other projects von Neumann was juggling at that time. But he knew that von Neumann's security clearance was miles above his and that he was a member of the Scientific Advisory Committee at the Ballistic Research Laboratory. So Goldstine happened to mention that an Army project at the Moore School of Engineering was soon to produce a device capable of performing mathematical calculations at phenomenal speeds.

Years later, Goldstine remembered that he was understandably nervous upon meeting the world-famous mathematician on the platform at the Aberdeen station. Goldstine recalled:

Fortunately for me, von Neumann was a warm friendly person who did his best to make people feel relaxed in his presence. The conversation soon turned to my work. When it became clear to von Neumann that I was concerned with the development of an electronic computer capable of 333 multiplications per second, the whole atmosphere of our conversation changed from one of relaxed good humor to one more like the oral examination for the doctor's degree in mathematics.

Because he had all-important reasons for wanting a fast automatic calculator, von Neumann asked for a demonstration. At the Moore School of Engineering, he met the gadget's inventors, Mauchly and Eckert, and the next years saw Johnny adding Aberdeen as a regular stop on his Princeton-D.C.-Los Alamos shuttle. Like everything else he turned his mind to, von Neumann immediately seemed to see more clearly than anyone else the future potential of what was then only a crude prototype. While the other principal creators of the first electronic computer were either mathematicians or electrical engineers, von Neumann was also a superb logician, which enabled him to understand what few others did--that these gadgets were in a class quite far beyond that of superfast calculating engines.

From those early meetings in 1944 to the eras of ENIAC, EDVAC, UNIVAC, MANIAC, and (yes) JOHNNIAC, the problem of assigning legal and historical credit to the inventors of the first electronic digital computers becomes a tangled affair in which easy explanations are impossible and many conflicts are still unresolved. Goldstine--the other man on the platform with von Neumann--had his own version of the key events in early computer history. Mauchly and Eckert had a distinctly different point of view. There was a tale of Stibitz at Bell Labs. IBM's Thomas Watson, Senior, had yet another story. And a man in Iowa named Atanasoff eventually had the unexpected last laugh in a courtroom in 1973.

Monumental court cases have been fought over the issue of assigning credit for the invention of the modern computer, and even the legal decisions have been somewhat murky. Certainly it was a field in which a few people all over the world, working independently, reached similar conclusions. In the case of the ENIAC team, several determined minds were working together.

It isn't hard to envision von Neumann coming onto the scene after others have worked for years on the considerable engineering problems involved in building ENIAC (Electronic Numerical Integrator and Calculator), then dominating the voice of the group when they articulated their discoveries, not out of self-aggrandizement, but because he undoubtedly had the most elegant way of stating the conclusions that the group had arrived at, working in concert. Because of von Neumann's prominence in other fields, and the way his charm worked on journalists as well as generals, he was often described by the mass media as the sole inventor of key concepts like the all-important "stored program"--a credit he never claimed himself.

Although the matter of assigning credit for the earliest computer hardware is a tricky business, there is no denying von Neumann's central role in the history of software. His contributions to the science of computation in the late forties and early fifties were preceded by even earlier theoretical work that led to the notion of computation. He was one of the principal participants in both of the lines of thought that converged into the construction of ENIAC--mathematical logic and ballistics.

John von Neumann's role in the invention of computation began nearly twenty years before the ENIAC project. In the late 1920s, between his major contributions to quantum physics, logic, and game theory, young Johann von Neumann of Göttingen was one of the principal players in the international game of mathematical riddles that started with Boole seventy years prior and led to Turing's invention of the universal machine a decade later.

The impending collision of philosophy and mathematics that was becoming evident at the end of the nineteenth century made mathematicians extremely uncomfortable. Slippery metaphysical concepts associated with human thought might have appealed to minds like Boole's or Turing's. But to David Hilbert of Göttingen and others of the early 1900s, such vagueness was a grave danger to the future of an enterprise that intended to reduce all scientific laws to mathematical equations.

The logical and metamathematical foundations of more "pure" forms of mathematics, Hilbert insisted, could only be stated clearly in terms of numerical problems and precisely defined symbols and rules and operations. This was the doctrine of formalism that later spurred Turing to make his astonishing discovery about the capabilities of machines. Johann von Neumann, a student of Hilbert's, was one of the stars of the formalists. In itself, von Neumann's metamathematical achievement was remarkable. His work in formalism, however, was only part of what von Neumann achieved in several disparate fields, all in the same dazzling year.

In 1927, at the age of twenty-four, von Neumann published five papers that were instant hits in the academic world, and which still stand as monuments in three separate fields of thought. It was one of the most remarkable interdisciplinary triple plays in history. Three of his 1927 masterpieces were critical to the field of quantum physics. Another paper established the new field of game theory. The paper most directly related to the future of computation was about the relationship between formal logic systems and the limits of mathematics.

In his last 1927 paper, von Neumann demonstrated the necessity of proving that all mathematics was consistent, a critically important step toward establishing the theoretical bases for computation (although nobody yet knew that). This led, one year later, to a paper published by Hilbert that listed three unanswered questions about mathematics that he and von Neumann had determined to be the most important questions facing logicians and mathematicians of the modern era.

The first of these questions asked whether or not mathematics was complete. Completeness, in the technical sense used by mathematicians, means that every true mathematical statement can be proven (i.e., is the last line of a valid proof).

The second question, the one that most concerned von Neumann, asked whether mathematics (or any other formal system) was consistent. Consistency in the technical sense means that there is no valid sequence of allowable steps (or "moves" or "states") that could prove an untrue statement to be true. If arithmetic was a consistent system, there would never be a way to prove that 1 + 1 = 3.

The third question, the one that opened the side door to computation, asked whether or not mathematics was decidable. Decidability means that there is some definite method that is guaranteed to correctly determine whether an assertion is provable.
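
Stated a little more compactly (a modern paraphrase in logical notation, not Hilbert's own formulation, with S standing for a formal system such as arithmetic), the three questions ask:

    % A paraphrase of the three Hilbert-von Neumann questions for a formal
    % system S; "S \vdash \varphi" means "\varphi is provable in S."
    \begin{description}
      \item[Completeness:] for every true statement $\varphi$, is it the case
        that $S \vdash \varphi$?
      \item[Consistency:] is there no derivation in $S$ of an untrue statement
        such as $1 + 1 = 3$ (in the modern textbook formulation: no statement
        $\varphi$ with both $S \vdash \varphi$ and $S \vdash \neg\varphi$)?
      \item[Decidability:] is there a definite method which, given any
        $\varphi$, decides in finitely many steps whether $S \vdash \varphi$?
    \end{description}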

It didn't take long for a shocking answer to emerge in response to the first Hilbert-von Neumann question. In 1930, yet another young mathematician, Kurt Gödel, showed that arithmetic cannot be complete, because there will always be at least one true assertion that cannot be proved. In the course of demonstrating this, Gödel crossed a crucial threshold between logic and mathematics when he showed that any formal system that is as rich as the number system (i.e., contains the mathematical operators + and =) can be expressed in terms of arithmetic. This means that no matter how complicated mathematics (or any other equally powerful formal system) becomes, it can always be expressed in terms of operations to be performed on numbers, and the parts of the system (whether or not they are inherently numerical) can be manipulated by rules of counting and comparing.
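
The encoding trick at the heart of this result is easy to illustrate in present-day code. In the toy sketch below (a simplification for illustration; Gödel's actual scheme was more elaborate, but the principle is the same), each symbol of a small formal alphabet gets a code number, and a whole formula is packed into a single integer by raising successive primes to those codes--so that statements about formulas become statements about ordinary numbers:

    # Toy illustration of Goedel numbering: every formula, i.e. every string
    # of symbols, is mapped to a unique integer, so claims about formulas
    # become claims about numbers. The tiny alphabet here is invented.

    SYMBOLS = {'0': 1, 'S': 2, '+': 3, '=': 4, '(': 5, ')': 6}

    def primes(n):
        """Return the first n prime numbers."""
        found, candidate = [], 2
        while len(found) < n:
            if all(candidate % p for p in found):
                found.append(candidate)
            candidate += 1
        return found

    def goedel_number(formula):
        """Encode a formula as p1^c1 * p2^c2 * ..., where ci is the code of
        the i-th symbol and pi is the i-th prime."""
        codes = [SYMBOLS[ch] for ch in formula]
        number = 1
        for p, c in zip(primes(len(codes)), codes):
            number *= p ** c
        return number

    # "S0 + S0 = SS0" (i.e., 1 + 1 = 2) becomes one very large integer;
    # unique factorization guarantees the formula can be recovered from it.
    print(goedel_number('S0+S0=SS0'))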

Von Neumann's and Hilbert's third question about the decidability of mathematics led Turing to his 1936 breakthrough. The "definite method" (of determining whether a mathematical assertion is provable) that was demanded by the decidability question was formulated by Alan Turing as a machine that could operate in definite steps on statements encoded as symbols on tape. Gödel had shown how numbers could represent the operations of a formal system, and Turing showed how the formal system could be described numerically to a machine equipped to decode such a description (e.g., translate the system's rules into the form "find a number n, such that . . . ", "n" being expressible as a string of ones and zeroes).
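
Turing's imaginary machine is simple enough to simulate in a few lines of present-day code. The sketch below is only an illustration of the idea--the rule table is invented, and machines of real interest are far larger--but it shows the essential cycle: read a symbol, consult a finite table, write, move, change state:

    # A minimal Turing-machine simulator. The rule table maps
    # (state, symbol read) -> (symbol to write, head move, next state).
    # This illustrative table appends one extra '1' to a block of ones.

    def run_turing_machine(rules, tape, state='start', head=0, max_steps=1000):
        tape = dict(enumerate(tape))          # sparse tape; blank cells read '_'
        for _ in range(max_steps):
            if state == 'halt':
                break
            symbol = tape.get(head, '_')
            write, move, state = rules[(state, symbol)]
            tape[head] = write
            head += 1 if move == 'R' else -1
        return ''.join(tape[i] for i in sorted(tape))

    # Hypothetical rule table: scan right over the ones, write a final '1', halt.
    rules = {
        ('start', '1'): ('1', 'R', 'start'),
        ('start', '_'): ('1', 'R', 'halt'),
    }

    print(run_turing_machine(rules, '111'))   # -> '1111'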

All of these questions were terribly important at the time they were formulated--to the few dozen people around the world who were equipped to understand their significance. But in 1930, the rest of the population had more important things to worry about than the hypothetical machines of the metamathematicians. Even those who understood that universal machines could in fact be built were in no position to begin such a task. Making a digital computer was an engineering project that would require the kind of support that only a national government could afford.

John von Neumann was at the Institute for Advanced Study at Princeton by the time young G–del and Turing came along. Although he was keenly aware of the latest developments in the "foundation crisis of mathematics" he had helped initiate in the late 1920s, von Neumann's restless intellect was attacking half a dozen new problems by the early 1930s. To Johnny, still in his twenties, the most important thing in life was to find "interesting problems."

In particular, he was interested in mathematical questions involving the phenomenon of turbulence, and the dynamics of explosions and implosions happened to be one area where such questions could be applied. He was also interested in new mathematical methods for modeling complex phenomena like global weather patterns or the passage of radiation through matter--methods that were powerful but required such enormous numbers of calculations that future progress in the field was severely limited by the human inability to calculate the results of the most interesting equations in a reasonable length of time.

Von Neumann seemed to have a kind of "Midas Touch." The problems he tackled, no matter how abstruse and apparently obscure they might have seemed at the time, had a way of becoming very important a decade or two later. For example, he wrote a paper in the 1920s on the mathematics underlying economic strategies. A quarter of a century later it turned out to be a perfect solution to the problem of how airplanes should search for submarines (as well as one of the first triumphs of "operational research," one of the fields pioneered by Babbage).

By the 1940s, von Neumann's expertise in the mathematics of hydrodynamic turbulence and the management of very large calculations took on unexpected importance because these two specialties were especially applicable to a new kind of explosion that was being cooked up by some of the old gang from Göttingen, now gathered in New Mexico. The designers of the first fission bomb knew that hellish mathematical problems in both areas had to be solved before any of the elegant equations of quantum physics could be transformed into the fireball of a nuclear detonation. As von Neumann already suspected, the mathematical work involved in designing nuclear and thermonuclear weapons created an avalanche of calculations.

The calculations needed in the quest for thermonuclear weaponry ended up being one of the highest-priority uses for ENIAC--top-secret calculations for Los Alamos were the subject of the first official programs run on the device when it became operational--although the reason the electronic calculator had been commissioned in the first place was to generate the mathematical tables needed for properly aiming conventional artillery.

The ENIAC project was started under the auspices of the Army Ballistic Research Laboratory. Herman Goldstine, a historian of computation as well as one of the key participants, took the trouble to point out that the word ballistics is derived from the Latin ballista, the name of a large device for hurling missiles. Ballistics in the modern sense is the mathematical science of predicting the path of a projectile between the time it is launched and the moment it hits the target. Complex equations concerning moving bodies are complicated further by the adjustments necessary for winds of different velocities and for the variations in air resistance encountered by projectiles fired from very large guns as they travel through the atmosphere. The results of all possible distance, altitude, and weather calculations for guns of each specific size and muzzle velocity are given in "firing tables" which artillerymen consult as they set up a shot.
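
At bottom, each entry in a firing table is a numerical-integration problem. The sketch below uses a deliberately simplified model--an invented drag coefficient and muzzle velocity, no wind, no variation of air density with altitude, all of which the real tables had to account for--but it suggests why a single table, with entries for every combination of elevation and charge, meant so many hours of hand arithmetic:

    # Simplified projectile trajectory with air drag, stepped through time by
    # Euler integration. The drag coefficient and muzzle velocity are invented
    # for illustration; real firing tables modeled far more variables.
    import math

    def trajectory_range(muzzle_velocity, elevation_deg, drag_coeff=0.0001, dt=0.01):
        """Return the horizontal distance (meters) at which the shell lands."""
        g = 9.81
        vx = muzzle_velocity * math.cos(math.radians(elevation_deg))
        vy = muzzle_velocity * math.sin(math.radians(elevation_deg))
        x = y = 0.0
        while y >= 0.0:
            speed = math.hypot(vx, vy)
            # drag decelerates the shell along its direction of motion
            vx -= drag_coeff * speed * vx * dt
            vy -= (g + drag_coeff * speed * vy) * dt
            x += vx * dt
            y += vy * dt
        return x

    # One row of a toy "firing table": range for several elevations.
    for elevation in (15, 30, 45, 60):
        print(elevation, round(trajectory_range(600, elevation)))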

The application of mass-production techniques to weapons meant that new types of guns and shells were coming along at an unprecedented pace, making the ongoing production of firing tables no easy task. During World War I, such calculations were done by humans who were called "computers." But even then it was clear that new methods of organizing these large-scale calculations, and new kinds of mechanical calculators to help the work of human computers, would be an increasingly important part of modern warfare.

In 1918 the Ballistics Branch of the Chief of Ordnance set up a special mathematical section at the Aberdeen Proving Ground in Maryland. One of the early recruits was the young Norbert Wiener, who was to feature prominently in another research tributary of the mainstream of ballistic technology--the automatic control of antiaircraft guns--and who was later to become one of the creators of the new computer-related discipline of cybernetics.

In the 1930s, both the Aberdeen laboratory and an associated group at the University of Pennsylvania's Moore School of Engineering obtained models of the automatic analog computer constructed by Vannevar Bush at MIT, a gigantic mechanical device known as the "differential analyzer." It was a marvelous aid to calculation, but it was far from being a digital computer, in either its design or its performance.

With the aid of these machines, the burden of performing ballistic calculations was somewhat eased. Before World War II, the machines were still second to the main resource--mathematics professors emeriti at the Moore School, who performed the calculations by hand, with the aid of hand-cranked mechanical calculators. Shades of Babbage's Cornish clergymen!

When war broke out, it was obvious that the institutions in charge of producing ballistic calculations for several armed services needed expert help. It was for this reason that a mobilized mathematician, Lieutenant Herman Goldstine, reported for duty at Aberdeen in August, 1942, and was assigned the task of streamlining ballistic computations. He soon found the Moore School facilities inadequate, and started to expand the staff of human "computers" by adding a large number of young women recruited from the Women's Army Corps to the small cadre of elderly ex-professors.

Goldstine's wife, Adele, herself a mathematician who was to play a prominent role in the programming of early computers (she and six other women were eventually assigned the task of programming the ENIAC), became involved with recruiting and teaching new staff members. Von Neumann's wife, Klara, performed a similar role at Los Alamos, both before and after electronic computing machines became available. The tradition of using women for such work was widespread--the equivalent roles in Britain's code-breaking efforts were played by hundreds of skilled calculators whom Turing and his colleagues called "girls" as well as "computers."

The expansion of the human computing staff at Aberdeen to nearly two hundred people, mostly WACs, was a stopgap measure. The calculation of firing tables was already out of hand. As soon as a new kind of gun, fuse, or shell became available for combat, a new table had to be calculated. The final product was either printed in a booklet that gunners kept in their pockets, or was mechanically encoded in special aiming apparatus called automata. (An entirely different mathematical research effort by Julian Bigelow, Warren Weaver, and Norbert Wiener was to concentrate on the characteristics of these automatic aiming machines.)

The answer to the firing table dilemma, as Goldstine was one of the first to recognize, was to commission the invention of an entirely new kind of mechanical calculating aid. The Vannevar Bush calculators were no longer the most efficient calculating devices. Faster machines, built on different principles, had been built by Dr. Howard Aiken and an IBM team at Harvard, and by a group led by a man named George Stibitz at Bell Laboratories. But Goldstine knew that what they really needed at Aberdeen and the Moore School was an automatic calculator that was hundreds, even thousands of times faster than the fastest existing machines.

Such dreams would have been akin to an Air Force officer wishing for a ten-thousand-mile-per-hour airplane, except for the fact that another new technology, one that only a few people even thought of applying to mathematical problems, looked as if it might make such a machine possible in theory, if only questionably probable in execution. Research in the young field of electronics had been uncovering all sorts of marvelous properties of the vacuum tube. Over in Great Britain, the whiz kids at Bletchley Park were using such devices in Colossus, their not-quite-computational code-breaking machine.

Until the war, electronic vacuum tubes had been used almost exclusively as amplifiers. But they could also be used as very fast switches. Since the rapid execution of a large number of on/off impulses is the hallmark of digital computation, and vacuum tubes could switch on and off as fast as a million times a second, electronic switching (as opposed to the mechanical switching of Vannevar Bush's machine) was an unbelievably good candidate for the key component of an ultrafast computing machine.

By 1943, unknown to Goldstine and almost all of his superiors, another, much higher-ranking scientist was also searching for an ultrafast computing machine. Goldstine beat the other fellow to it. Goldstine found Mauchly and Eckert in 1942. John von Neumann, and chance, found Goldstine in 1944.

John W. Mauchly and J. Presper Eckert have been properly credited with the invention of ENIAC, but before they implemented the key ideas of electronic digital computing machines, a man named Atanasoff in Iowa, in the 1930s, built small, crude, but functioning prototypes of electronic calculating machines. His name has not been as widely known, and his fortunes turned out differently from those of other pioneers when computers grew from an exotic newborn technology to a powerful infant industry. But in 1973 a United States district court ruled that John Vincent Atanasoff invented the electronic digital computer.

It was a complicated decision, reached after years of litigation, and was not as clear-cut as it might have been if both sides had not had such strong cases. The core of the dispute centered on original work Atanasoff did in the 1930s, and the influence that his work later had on John Mauchly's design of ENIAC. Like the Hollerith-Billings story of the invention of punched-card data processing, simple explanations of where one man's ideas left off and another's began are difficult to reconstruct at best.

Atanasoff was the last of the lone inventors in the field of computation; after him, such projects were too complicated for anything less than a team effort. Like Boole, Atanasoff was the recipient of one of those sudden inspirations that provided the solution to a problem he had been grappling with for years. A theoretical physicist teaching at Iowa State in the early 1930s, he came up against the same obstacle faced by other mathematicians and physicists of his era. The approaches to the most interesting ideas were blocked by the problems of performing large numbers of complex calculations.

By 1935, Atanasoff was in hot pursuit of a scheme to mechanize calculation. He was aware of Babbage's ideas, but he was an electronic hobbyist as well as a physicist, and entire technologies that didn't exist in Babbage's time were now showing great promise. Atanasoff was gradually convinced that an electronic computing machine was a good bet to pursue, but he had no idea how to go about designing one, and he wasn't sure how to design a machine without working out a method of programming it. In the late 1970s, Atanasoff told writer Katherine Fishman:

I commenced to go into torture. For the next two years my life was hard. I thought and thought about this. Every evening I would go into my office in the physics building. One night in the winter of 1937 my whole body was in torment from trying to solve the problems of the machine. I got in my car and drove at high speeds for a long while so I could control my emotions. It was my habit to do this for a few miles: I could gain control of myself by concentrating on driving. But that night I was excessively tormented, and I kept on going until I had crossed the Mississippi River into Illinois and was 189 miles from where I started. I knew I had to quit; I saw a light, which turned out to be a roadhouse, and I went in. It was probably zero outside, and I remember hanging up my heavy coat; I started to drink and commenced to warm up and realized that I had control of myself.

Nearly forty years later, when he testified in the patent case concerning the invention of the electronic computer, Atanasoff recalled that he decided upon several design elements and principles that night in the roadhouse--including a binary system for encoding input and electronic tube technology for switching--that would transform his dream of an electronic calculator into a practical plan.

The state of each inventor's mind at the time of their discussions in 1940 and 1941 was the crux of the legal and historical conflict. There is no dispute that John Mauchly had also devoted years of thought to the idea of automated calculation. Thirty-three years old when he met Atanasoff, Mauchly had worked his way through Johns Hopkins as a research assistant, which gave him extensive experience with procedures that involve detailed measurement and calculation. In 1933, as head of the physics department at Ursinus College near Philadelphia, he began to perform research in atmospheric electricity.

Mauchly was particularly interested in the long-disputed theory about the effect of sunspots on the earth's weather. There was no obvious connection between these huge storms on the sun and terrestrial weather conditions, but that did not prove that such a connection did not exist. In 1936, Mauchly arranged to have many parts of the government's voluminous meteorological records shipped back to his office at Ursinus. He intended to apply modern statistical analysis to the weather data in an attempt to correlate them with records of sunspot activity, hoping that this probe would reveal the previously undetected pattern.

As other mathematical meteorologists like von Neumann were also quickly discovering, Mauchly found that any calculations involving data based on weather quickly grew so complicated that it would take a lifetime to calculate all the equations generated from even the shortest periods of observation. So he found himself doing the same thing that the ballistics experts did--hiring a lot of people with adding machines. A Depression-era agency, the National Youth Administration, helped Mauchly pay students fifty cents an hour to tabulate his weather data with hand calculators. Mauchly planned to obtain punched-card machines, once he got his crew to tackle the first part of the data. But when he watched a demonstration of the world's most advanced punched-card tabulator at the 1939 World's Fair, he realized that even scores of such machines in the hands of trained operators might take another decade to go through the weather data.

In 1939 and 1940, Mauchly read in scientific journals about a new measuring and counting system developed to assist cosmic-ray research. The part of the system that caught his eye was the fact that this new device, using electronic circuits, could count cosmic rays far faster than a dozen of the fastest punched-card tabulators. Cosmic rays can be detected at the rate of thousands per second, but all previous recorders failed to keep pace beyond 500 times a second. Mauchly tried making a few electronic circuits for himself, and he began to see a way that they could be used for computation.

Mauchly took note of one circuit in particular that was developed by the cosmic-ray researchers--the coincidence circuit, in which a switch would be closed only when several signals arrived at exactly the same time, thus, in effect, rendering a decision. Could some variation of this circuit make a machine capable of performing logical operations electronically? Experimenting with his own vacuum-tube circuits, Mauchly speculated that there might also exist circuits used in other kinds of instruments that would enable him to build a machine to add, subtract, multiply, and divide. At this point his speculations were more grandiose than his hand-wired prototypes, but the clues he had obtained from the cosmic-ray researchers were enough to put Mauchly's weather-predicting machines on a collision course with a certain device the U.S. Army had in mind, one that had nothing to do with sunspots or the weather.
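
In modern terms, the logical content of the coincidence circuit described above is an AND operation over pulses that arrive within a narrow time window. The little sketch below is an abstraction of that idea, not a model of any particular instrument; the pulse times and the window width are invented for illustration:

    # A coincidence "circuit" in the abstract: the output fires only when a
    # pulse arrives on every input within the same narrow time window.

    def coincidence(pulse_times_per_input, window=1e-6):
        """Return the times at which all inputs pulse within `window` seconds."""
        hits = []
        for t in pulse_times_per_input[0]:
            if all(any(abs(t - u) <= window for u in channel)
                   for channel in pulse_times_per_input[1:]):
                hits.append(t)
        return hits

    channel_a = [0.000001, 0.000500, 0.000900]
    channel_b = [0.000001, 0.000650, 0.000900]
    print(coincidence([channel_a, channel_b]))   # fires at 1 and 900 microseconds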

Mauchly brought a small analog device to the AAAS meeting where he met Atanasoff, and in June, 1941, he hitched a ride to visit Atanasoff in Ames, Iowa. Atanasoff demonstrated the ABC, Mauchly stayed for five days, and thirty-two years later a court decided that Mauchly's later invention of the ENIAC relied upon key ideas of Atanasoff's that were transferred from mind to mind during those five days in June.

The 1973 legal decision (Honeywell versus Sperry Rand, U.S. District Court, District of Minnesota, Fourth Division) did not state that Mauchly stole anything, but did restore partial credit for the invention of the electronic computer to a man whose name had been nearly forgotten in all the publicity and honors heaped upon Mauchly and Eckert. After the ruling, Mauchly said: "I feel I got nothing out of that visit to Atanasoff except the royal shaft later." On Mauchly's behalf, it must be noted that nobody has disputed the fact that the sheer scale and engineering audacity of ENIAC were far beyond the ABC, and that Mauchly was indeed on the right track at least as early as Atanasoff.

Part of the reason for ENIAC's success and ABC's obscurity must be attributed to the accidents of history. Legal issues aside, the historical momentum shifted to Mauchly later in the summer of 1941, when he signed up for an Army-sponsored electronics course at the Moore School of Engineering. His instructor, J. Presper Eckert, was an exceptionally bright Philadelphia blueblood twelve years younger than Mauchly. When Eckert, the electronics wizard, learned of Mauchly's plan to automate large-scale numerical calculations, a critical mass of idea-power was reached. They were in exactly the right place at the right time to cook up such an ambitious project.

Not long after thirty-four-year-old John Mauchly and twenty-two-year-old Pres Eckert started to sketch out a plan for an electronic computer, they became acquainted with Lieutenant Herman Goldstine, both as a mathematician and as a liaison officer between the Moore School and the Ballistic Research Laboratory. By the time he met them, Goldstine was sufficiently frustrated by the lack of ballistic computing power that he was receptive to even a science-fiction story like the one presented to him by these two whiz kids.

As wild as it sounded as an engineering feat, Goldstine knew that an electronic device such as the one Mauchly and Eckert described to him had the potential to perform ballistic calculations over 1000 times faster than the best existing machine, the Aiken-IBM-Harvard-Navy device called the Mark I. But it would cost a lot of money to find out if they were right. Atanasoff and Berry built their prototype for a total of $6500. These boys would need hundreds of thousands of dollars to lash together something so complicated and delicate that most electrical engineers of the time would swear it could never work.

Goldstine later explained the risks associated with attempting the proposed electronic calculator project:

. . . we should realize that the proposed machine turned out to contain over 17,000 tubes of 16 different types operating at a fundamental clock rate of 100,000 pulses per second. . . . once every 10 microseconds an error would occur if a single one of the 17,000 tubes operated incorrectly; this means that in a single second there were 1.7 billion . . . chances of a failure occurring . . . Man has never made an instrument capable of operating with this degree of fidelity or reliability, and this is why the undertaking was so risky a one and the accomplishment so great.
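
Goldstine's arithmetic is easy to verify, and it hints at just how reliable each tube and each pulse had to be. The second half of the sketch below--the failure rate needed for a fifty-fifty chance of an error-free hour--is an illustrative calculation, not a historical specification:

    # Goldstine's figures: 17,000 tubes, each pulsed 100,000 times a second,
    # gives 1.7 billion chances per second for a single-tube failure.
    tubes = 17_000
    pulses_per_second = 100_000
    chances_per_second = tubes * pulses_per_second
    print(f"{chances_per_second:,}")               # 1,700,000,000

    # Illustrative only: for a fifty-fifty chance of a whole error-free hour,
    # each individual tube-pulse would have to go wrong less often than about
    # once in nine trillion operations.
    total_operations = chances_per_second * 3600
    per_pulse_failure = 1 - 0.5 ** (1 / total_operations)
    print(f"about 1 failure per {1 / per_pulse_failure:,.0f} operations")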

The two young would-be computer inventors at the Moore School, the mathematician-turned-lieutenant who found them, and their audacious plan for cutting through the calculation problem by creating the world's most complicated machine were the subject of a high-level meeting on April 9, 1943. Attending were Oswald Veblen, one of the original founders of the military's mathematical research effort and a professor at the Institute for Advanced Study at Princeton; Colonel Leslie Simon, director of the Ballistic Research Laboratory; and Goldstine.

The moment when the United States War Department entered the age-old quest for a computing machine, and thus made the outcome inevitable, was recalled by Goldstine when he wrote, nearly thirty years later, that Veblen, "after listening for a short while to my presentation and teetering on the back legs of his chair brought the chair down with a crash, arose, and said, 'Simon, give Goldstine the money.'" They got their money--eventually as much as $400,000--and started building their machine.

ENIAC was monstrous--100 feet long, 10 feet high, 3 feet deep, weighing 30 tons--and hot enough to keep the room temperature up toward 120 degrees F while it shunted multivariable differential equations through its more than 17,000 tubes, 70,000 resistors, 10,000 capacitors, and 6,000 hand-set switches. It used an enormous amount of power--the apocryphal story is that the lights of Philadelphia dimmed when it was plugged in.

When it was finally completed, ENIAC was too late to use in the war, but it certainly delivered what its inventors had promised: a ballistic calculation that would have taken twenty hours for a skilled human calculator could be accomplished by the machine in less than thirty seconds. For the first time, the trajectory of a shell could be calculated in less time than it took an actual shell to travel to its target. But the firing tables were no longer the biggest boom on the block by the time ENIAC was completed. The first problem run on the machine, late in the winter of 1945, was a trial calculation for the hydrogen bomb then being designed.

After his first accidental meeting with Goldstine at Aberdeen, and the demonstration of a prototype ENIAC soon afterward, von Neumann joined the Moore School project as a special consultant. Johnny's genius for formal, systematic, logical thinking was applied to the logical properties of this huge maze of electronic circuits. The engineering problems were still formidable, but it was becoming clear that the nonphysical component, the subtleties of setting up the machine's operations--the coding, as they began to call it--was equally difficult and important.

Until the transistor came along a few years later, ENIAC would represent the physical upper limit of what could be done with a large number of high-speed switches. In 1945, the most promising approach to greater computing power was in improving the logical structure of the machine. And von Neumann was probably the one man west of Bletchley Park equipped to understand the logical attributes of the first digital computer.

Part of the reason ENIAC was able to operate so fast was that the routes followed by the electronic impulses were wired into the machine. This electronic routing was the materialization of the machine's instructions for transforming the input data into the solution. Many different kinds of equations could be solved, and the performance of a calculation could be altered by the outcome of subproblems, but ENIAC was nowhere near as flexible as Babbage's Analytical Engine, which could be reprogrammed to solve a different set of equations, not by altering the machine itself, but by altering the sequence of input cards.

What Mauchly and Eckert gained in calculating power and speed, they paid for in overall flexibility. The gargantuan electronic machine had to be set up for solving each separate problem by changing the configuration of a huge telephone-like switchboard, a procedure that could take days. The origins of the device as a ballistics project were partially responsible for this inflexibility. It was not the intention of the Moore School engineers to build a universal machine. Their contract quite clearly specified that they create an altogether new kind of trajectory calculator.

Especially after von Neumann joined the team, they realized that what they were constructing would not only become the ultimate mathematical calculator, but the first, necessarily imperfect prototype of a whole new category of machine. Before ENIAC was completed, its designers were already planning a successor. Von Neumann, especially, began to realize that what they were talking about was a general-purpose machine, one that was by its nature particularly well suited to function as an extension of the human mind.

If one thing was sacred to Johnny, it was the power of human thought to penetrate the mysteries of the universe, and the will of human beings to apply that knowledge to practical ends. He had other things on his own mind at the time--from the secrets of H-bomb design to the structure of logic machines--but he appeared to be most keen on the idea that these devices might evolve into some kind of intellectual extension. How much more might a thinker like himself accomplish with the aid of such a machine? One biographer put it this way:

Von Neumann's enthusiasm in 1944 and 1945 had first been generated by the challenge of improving the general-purpose computer. He had been a proponent of using the latest in computing machines in the atomic bomb project, but he realized that for the impending hydrogen bomb project still better and faster machines were needed. On the theoretical level he was intrigued by the fact that there appeared to be organizational parallels between the brain and computers and that these parallels might lead to formal-logic theories encompassing both computers and brains; moreover, the logical theories would constitute interesting abstract logics in their own right. He was cautious in assuming similarity between a computer and the awesome functioning of the human brain, especially as in 1944 he had little preparation in physiology. Rather he regarded the computer as a technical device functioning as an extension of its user; it would lead to an aggrandizement of the human brain, and von Neumann wanted to push this aggrandizement as far and as fast as possible.

There is no dispute that Mauchly, Eckert, Goldstine, and von Neumann worked together as a team during this crucial gestation period of computer technology. The team split up in 1946, however, so the matter of accrediting specific ideas has become a sticky one. Memoranda were written, as they are on any project, without the least expectation that years later they would be regarded as historical or legal documents. Technology was moving too fast for the traditional process of peer review and publication: the two most important documents from these early days were titled "First Draft . . ." and "Preliminary Report . . ."

By the time they got around to sketching the design for the next electronic computer, the four main ENIAC designers had agreed that the goal was to design a machine that would use the same hardware technology in a more efficient way. The next step, the invention of stored programming, is where the accreditation controversy comes in. At the end of June, 1945, the ENIAC team prepared a proposal in the form of a "First Draft of a Report on the EDVAC" (Electronic Discrete Variable Automatic Computer). It was signed by von Neumann, but reflected the conclusions of the group. Goldstine later said of this: "It has been said by some that von Neumann did not give credits in his First Draft to others. The reason for this was that the document was intended by von Neumann as a working paper for use in clarifying and coordinating the thinking of the group and was not intended for publication." (Mauchly and Eckert, however, took a less benign view of von Neumann's intentions.) The most significant innovations articulated in this paper involved the logical aspects of coding, as well as the engineering of the physical device that was to follow the coded instructions.

Creating the coded instructions for a new computation on ENIAC was nowhere near as time consuming as carrying out the calculation by hand. Once the code for the instructions needed to carry out the calculation had been drawn up, all that had to be done to perform the computation on any set of input data was to properly configure the machine to perform the instructions. The calculation, which formerly took up the most time, had become trivial, but a new bottleneck was created with the resetting of switches, a process that took an unreasonable amount of time compared with the length of time it would take to run the calculation.

Resetting the switches was the most worrisome bottleneck, but not the only one. The amount of time it took for the instructions to make use of the data, although greatly reduced from the era of manual calculation, was also significant--in ballistics, the ultimate goal of automating calculation was to be able to predict the path of a missile before it landed, not days or hours or even just minutes later. If only there were a more direct way for the different sets of instructions--the inflexible, slow-to-change component of the computing system--to interact with the data stored in the electronic memory, the more quickly accessible component of computation. The solution, as von Neumann and colleagues formulated it, was an innovation based upon a logical breakthrough.

The now-famous "First Draft" described the logical properties of a true general-purpose electronic digital computer. In one key passage, the EDVAC draft pointed out something that Babbage, if not Turing, had overlooked: "The device requires a considerable memory. While it appeared that various parts of this memory have to perform functions which differ somewhat in their nature and considerably in their purpose, it is nevertheless tempting to treat the entire memory as one organ." In other words, a general-purpose computer should be able to store instructions in its internal memory, along with data.

What used to be a complex configuration of switchboard settings could be symbolized by the programmer in the form of a number and read by the computer as the location of an instruction stored in memory, an instruction that would automatically be applied to specified data that was also stored in memory. This meant that the program could call up other programs, and even modify other programs, without intervention by the human operator. Suddenly, with this simple change, true information processing became possible.

This is the kernel of the concept of stored programming, and although the ENIAC team was officially the first to describe an electronic computing device in such terms, it should be noted that the abstract version of exactly the same idea was proposed in Alan Turing's 1936 paper in the form of the single tape of the universal Turing machine. And at the same time the Pennsylvania group was putting together the EDVAC report, Turing was thinking again about the concept of stored programs:

So the spring of 1945 saw the ENIAC team on one hand, and Alan Turing on the other, arrive naturally at the idea of constructing a universal machine with a single "tape." . . .

But when Alan Turing spoke of "building a brain," he was working and thinking alone in his spare time, pottering around in a British back garden shed with a few pieces of equipment grudgingly conceded by the secret service. He was not being asked to provide the solution to numerical problems such as those von Neumann was engaged upon; he had been thinking for himself. He had simply put together things that no one had put together before: his one tape universal Turing machine, the knowledge that large scale pulse technology could work, and the experience of turning cryptanalytic thought into "definite methods" and "mechanical processes." Since 1939 he had been concerned with little but symbols, states, and instruction tables--and with the problem of embodying these as effectively as possible in concrete forms.

With the EDVAC design, ballistics calculators took the first step toward general-purpose computers, and it became clear to a few people that such devices would surely evolve into something far more powerful. The kind of uses the inventors envisioned for the future of their technology was a cause for one of several major theoretical disagreements that were to surface soon thereafter among the four ENIAC principals. Von Neumann and Goldstine saw the opportunity to build an incredibly powerful research tool for scientists and mathematicians. Mauchly and Eckert were already thinking of business and government applications outside military or research institutions.

The first calculation run on ENIAC in December, 1945, six months after the "First Draft," was a problem posed by scientists from Los Alamos Laboratories. ENIAC was formally dedicated in February, 1946. By then, the patriotic solidarity enforced upon the research team by wartime conditions had faded away. Von Neumann was enthusiastic about the military and scientific future of the computer-building enterprise, but the two young men who had dreamed up the computer project before the big brass stepped in were getting other ideas about how their brain-child ought to mature. The tensions between institutions, people, and ideas mounted until Mauchly and Eckert left the Moore School on March 31, 1946, over a dispute with the university concerning patent rights to ENIAC. They founded their own group shortly thereafter, eventually naming it the Eckert-Mauchly Computer Corporation.

When Mauchly and Eckert later suggested that they were, in fact, the sole originators of the EDVAC report, they were, in Goldstine's phrase, "strenuously opposed" by Goldstine and von Neumann. The split turned out to be a lifelong feud. Goldstine, writing in 1972 from his admittedly partial perspective, was unequivocal in pointing out von Neumann's contributions:

First, his entire summary as a unit constitutes a major contribution and had a profound impact not only on the EDVAC but also served as a model for virtually all future studies of logical design. Second, in that report he introduced a logical notation adapted from one of McCulloch and Pitts, who used it in a study of the nervous system. This notation became widely used, and is still, in modified form, an important and indeed essential way for describing pictorially how computer circuits behave from a logical point of view.

Third, in the famous report he proposed a repertoire of instructions for the EDVAC, and in a subsequent letter he worked out a detailed programming for a sort and merge routine. This represents a milestone, since it is the first elucidation of the now famous stored program concept together with a completely worked-out illustration.

Fourth, he set forth clearly the serial mode of operation of the modern computer, i.e., one instruction at a time is inspected and then executed. This is in sharp distinction to the parallel operation of the ENIAC in which many things are simultaneously performed.

While Mauchly and Eckert set forth to establish the commercial applications of computer technology, Goldstine, von Neumann, and another mathematician by the name of Arthur Burks put together a proposal and presented it to the Institute for Advanced Study at Princeton, the Radio Corporation of America, and the Army Ordnance Department, requesting one million dollars to build an advanced electronic digital computer. Once again, some of the thinking in this project was an extension of the group creations of the ENIAC project. But this "Preliminary Discussion," unquestionably dominated by von Neumann, also went boldly beyond the EDVAC conception as it was stated in the "First Draft."

Although the latest proposal was aimed at the construction of a machine that would be more sophisticated than EDVAC, the authors went much farther than describing a particular machine. They strongly suggested that their specification should serve as the general plan for the logical structure and fundamental method of operation of all future computers. They were right: it took almost forty years, until the 1980s, before anyone made a serious attempt to build "non-von Neumann machines."

"Preliminary Discussion of the Logical Design of an Electronic Computing Instrument," which has since been recognized as the founding document of the modern science of electronic computer design, was submitted on June 28, 1946, but was available only in the form of mimeographed copies of the original report to the Ordnance Department until 1962, when a condensed version was published in Datamation magazine. The primary contributions of this document were related to the logical use of the memory mechanism and the overall plan of what has been come to be known as the "logical architecture." One aspect of this architecture was the ingenious way data and instructions were made to be changeable during the course of a computation without requiring direct intervention by the human operator.

This changeability was accomplished by treating numerical data as "values" that could be assigned to specific locations in memory. The basic memory component of an EDVAC-type computer used collections of memory elements known as "registers" to store numerical values in the form of a series of on/off impulses. Each of these numbers was assigned an "address" in the memory, and any address could contain either data or an instruction. In this way, specific data and instructions could be located when needed by the control unit. One result of this was that a particular piece of data could be a variable--like the x in algebra--that could be changed independently by having the results of an operation stored at the appropriate address, or by telling the computer to perform an operation on whatever was found at that location.

One of the characteristics of any series of computation instructions is a reference to data: when the instructions tell the machine how to perform a calculation, they have to specify what data to plug into the calculation. By making the reference to data a reference to the contents of a specific memory location, instead of a reference to a specific number, it became possible for the data to change during the course of a computation, according to the results of earlier steps. It is in this way that the numbers stored in the memory can become symbolic of quantities other than just numerical value, in the same way that algebra enables one to manipulate symbols like x and y without specifying the values.

It is easier to visualize the logic of this schema if you think of the memory addresses as something akin to numbered cubbyholes or post-office boxes--each address is nothing but a place to find a message. The addresses serve as easily located containers for the (changeable) values (the "messages") to be found inside them. Box #1, for example, might contain a number; box #2 might contain another number; box #3 might contain instructions for an arithmetic operation to be performed on the numbers found in boxes #1 and #2; box #4 might be the place where the result of the operation specified in box #3 is put. The numbers in the first two boxes might be fixed numbers, or they might be variables, the values of which might depend on the results of other operations.
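
For readers who think more easily in concrete terms, here is a minimal sketch, in modern Python, of the post-office-box picture described above. The box numbers and the "MUL" instruction are invented purely for illustration; they are not taken from the EDVAC design.

```python
# A minimal sketch (invented for illustration, not taken from the EDVAC design)
# of memory addresses as numbered boxes. Each box holds either a number or an
# instruction; nothing about the box itself says which is which.

memory = {
    1: 6,                    # box #1: a number (a later step could overwrite it)
    2: 7,                    # box #2: another number
    3: ("MUL", 1, 2, 4),     # box #3: an instruction: multiply boxes 1 and 2, put the result in box 4
    4: 0,                    # box #4: where the result will be placed
}

op, a, b, dest = memory[3]   # fetch the instruction found at address 3
if op == "MUL":
    memory[dest] = memory[a] * memory[b]   # operate on whatever the boxes hold right now

print(memory[4])             # 42; change box #1 or #2 and the same instruction gives a new result
```

Change the number in box #1 and run the same instruction again, and a different result appears in box #4--which is all that treating data as values assigned to addresses really amounts to.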

By putting both the instructions and the raw data inside the same memory, it became possible to perform computations much faster than with ENIAC, but it also became necessary to devise a way to clearly indicate to the machine that some specific addresses contain instructions and other addresses contain numbers for those instructions to operate on.

In the "First Draft," von Neumann specified that each instruction should be designated in the coding of a program by a number that begins with the digit 1, and each of the numbers (data) should begin with the digit 0. The "Preliminary Report" expanded the means of distinguishing instructions from data by stating that computers would keep these two categories of information separate by operating during two different time cycles, as well.

All the instructions are executed according to a timing scheme based on the ticking of a built-in clock. The "instruction" cycles and "execution" cycles alternate: on "tick," the machine's control unit interprets numbers brought to it as instructions and prepares to execute the operations they specify; on "tock," when the "execution" cycle begins, the control unit interprets incoming numbers as data to operate upon.

The plan for this new category of general-purpose computer not only specified a timing scheme but set down what has become known as the "architecture" of the computer--the division of logical functions among physical components. The scheme had similarities to both Babbage's and Turing's models. All such machines, the authors of the "Preliminary Report" declared, must have a unit where arithmetic and logical operations can be performed (the processing unit where actual calculation takes place, equivalent to Babbage's "mill"), a unit where instructions and data for the current problem can be stored (like Babbage's "store," a kind of temporary memory device), a unit that executes the instructions according to the specified sequential order (like the "read/write head" of Turing's theoretical machine), and a unit where the human operator can enter raw information or see the computed output (what we now call "input-output devices").
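
To make the division of labor concrete, here is a toy machine, again in Python and again purely illustrative: a single memory holding both instructions and data, an arithmetic unit, a control unit that alternates "instruction" and "execution" cycles, and a stand-in for an output device. The instruction names and their meanings are my own shorthand, not the repertoire proposed in the "Preliminary Report."

```python
# A toy machine (my own shorthand, not the instruction set of the "Preliminary
# Report") showing the four units: one memory for instructions and data, an
# arithmetic/logic unit, a control unit alternating fetch ("tick") and execute
# ("tock"), and a stand-in output device.

memory = [
    ("LOAD", 6),    # address 0: put the literal number 6 into the accumulator
    ("ADD", 5),     # address 1: add the contents of address 5 to the accumulator
    ("STORE", 5),   # address 2: store the accumulator back into address 5
    ("PRINT", 5),   # address 3: send the contents of address 5 to the output device
    ("HALT", 0),    # address 4: stop
    10,             # address 5: a data word, living in the same memory as the program
]

def alu(operation, a, b):
    """Arithmetic/logic unit: the only place where calculation actually happens."""
    if operation == "ADD":
        return a + b
    raise ValueError(operation)

accumulator = 0
program_counter = 0
while True:
    # "tick": the control unit fetches the next word and interprets it as an instruction
    op, operand = memory[program_counter]
    program_counter += 1
    # "tock": the instruction is executed; any word it refers to is treated as data
    if op == "LOAD":
        accumulator = operand
    elif op == "ADD":
        accumulator = alu("ADD", accumulator, memory[operand])
    elif op == "STORE":
        memory[operand] = accumulator
    elif op == "PRINT":
        print(memory[operand])   # prints 16: the original 10 plus the loaded 6
    elif op == "HALT":
        break
```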

Any machine that adheres to these principles--no matter what physical technology is used to implement these logical functions--is an example of what has become known as "the von Neumann architecture." It doesn't matter whether you build such a machine out of gears and springs, vacuum tubes, or transistors, as long as its operations follow this logical sequence. This theoretical template was first implemented in the United States at the Institute for Advanced Study. Modified copies of the IAS machine were made for the Rand Corporation, an Air Force spinoff "think tank" that was responsible for keeping track of targets for the nation's new but fast-growing nuclear armory, and for the Los Alamos Laboratory. Against von Neumann's mild objections, the Rand machine was dubbed JOHNNIAC. The Los Alamos machine assigned to nuclear weapons-related calculations was given the strangely uneuphemistic name of MANIAC.

(Neither EDVAC, the IAS machine, the Los Alamos machine, nor the Rand machine was the first operational example of a fully functioning stored-program computer. British computer builders, who had been pursuing parallel research and who were aware of von Neumann's ideas, beat the Americans when it came to constructing a machine based on the logical principles he had enunciated. The first machine that was binary, serial, and used stored-program memory was EDSAC--the Electronic Delay Storage Automatic Calculator, built at the University Mathematical Laboratory, University of Cambridge, England.)

In a von Neumann machine, the arithmetic and logic unit is where the basic operations of the system are wired in. All the other instructions are constructed out of these fundamentals. It is possible, in principle, to build a device of this type with very few, extremely simple, built-in operations. Addition, for example, could be performed over and over again whenever a multiplication operation is requested by a program. In fact, the only two operations that are absolutely necessary are "not" and "and." The problem with using a few very simple hardwired operations and proportionally complex software structures built from them is that it slows down the operation of the computer: Because instructions are executed one at a time ("serially") as the internal clock ticks, the number of basic instructions in a program dictates how long it takes a computer to run that program.
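
The claim that "not" and "and" suffice is easy to check for oneself. The sketch below (an illustration of the point, not anything von Neumann wrote) builds "or," "exclusive or," and a one-bit adder from those two primitives.

```python
# With only "not" and "and" wired into the hardware, everything else can be
# assembled in software. This builds "or", "exclusive or", and a one-bit adder
# from the two primitives, for bits represented as the integers 0 and 1.

def NOT(a):
    return 1 - a            # flips 0 to 1 and 1 to 0

def AND(a, b):
    return a & b

def OR(a, b):               # De Morgan's law: a OR b = NOT(NOT a AND NOT b)
    return NOT(AND(NOT(a), NOT(b)))

def XOR(a, b):              # true when exactly one input is true
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):       # one bit of addition: (sum, carry)
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))     # (0, 1): one plus one is binary 10
```

Counting the calls shows the cost: that single one-bit addition takes eight primitive operations, and a machine that executes one primitive per clock tick pays for its simple hardware in time, which is exactly the trade-off described above.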

The control unit specified by the "Preliminary Report"--the component that supervises the execution of instructions--was the materialization of the formal logic device created by Emil L. Post and Turing, who had proved that it was possible to devise codes in terms of numbers that could cause a machine to solve any problem that was clearly statable. This is where the symbol meets the signal, where sequences of on and off impulses in the circuits, the Xs and Os on the cells of the endless tape, the strings of numbers in the programmer's code, marry the human-created computation to the machine that computes.

The input-output devices were the parts of the system that were to advance the most slowly while the switch-based memory, arithmetic, and control components ascended through orders of magnitude. For over a decade after ENIAC, punched cards were the main input devices, and for over two decades, teletype machines were the most common output devices.

The possibility of future breakthroughs in this area and their implications were not overlooked. In a memorandum written in November, 1945, concerning one of the early proposals for the IAS machine, von Neumann anticipated the possibility of creating a more visually oriented output device:

In many cases the output really desired is not digital (presumably printed) but pictorial (graphed). In such situations the machine should graph it directly, especially because graphing can be done electronically and hence more quickly than printing. The natural output in such a case is an oscilloscope, i.e., a picture on its fluorescent screen. In some cases these pictures are wanted for permanent storage . . . in others only visual inspection is desired. Both alternatives should be provided for.

But a personal interactive computer, helpful as such a device might be to a mind such as von Neumann's, was not an interesting enough problem. After solving interesting problems about the processes that take place in the heart of stars, a scientific-technological tour de force that also became a historical point of no return when the scientists' employers demonstrated their creation at Hiroshima, and then solving another set of problems concerned with the creation of computing machinery, all the while pontificating about the most potent aspects of foreign policy to the leaders of the most powerful nation in history, John von Neumann was aiming for nothing less than the biggest secret of all. In the late 1940s and early 1950s, the most interesting scientific question of the day was "what is life?"

To someone who had been at Alamogordo and the Moore School, it would not have been too farfetched to believe that the next intellectual conquest might bring the secret of physical immortality within reach. Certainly he would never know whether he could truly resolve the most awesome of nature's mysteries until he set his mind to decoding the secret of life. And that he did. Characteristically, von Neumann focused on the aspect of the mystery of life that appealed to his dearest instincts and most powerful capacities--the pure, logical, mathematical underpinnings of nature's code. He was particularly interested in the logical properties of the theoretical devices known as automata, of which Turing's machine was an example.

Von Neumann was especially drawn to the idea of self-reproducing automata--mathematical patterns in space and time that had the property of being able to reproduce themselves. He was able to draw on his knowledge of computers and his growing understanding of neurophysiology and biology, and to make particularly good use of his deep understanding of logic, because he saw self-replicating automata as essentially logical beasts. The way the task was accomplished by living organisms of the type found on earth was only one way it could be done. In principle, the task could be done by a machine that could follow a plan, because the plan, and not the mechanism that carried it out, was the part of the system with the special, heretofore mysterious property that distinguished life from nonliving matter.

Von Neumann approached "cellular automata" on an abstract level, just as Turing did with his first machines. As early as 1948, he showed that any self-replicating system must have raw materials, a program that provides instructions, an automaton that follows the instructions and arranges the symbols in the cells of a Turing-type machine, a system for duplicating instructions, and a supervisory unit--which turned out to be an excellent description of the DNA direction of protein synthesis in living cells.

Another thing that interested Johnny was the gamelike aspect of the world. Accordingly, he thought about the way his self-reproducing automaton was like a game:

Making use of the work done by his colleague Stanislav Ulam, von Neumann was able to refine his calculations and make them more generally applicable. Von Neumann's mental experiment, which we can easily present in the form of a game, makes use of a homogeneous space subdivided by cells. We can think of these cells as squares on a playing board. A finite number of states--e.g., empty, occupied, or occupied by a specific color--is assigned to a square. At the same time, a neighborhood is defined for each cell. This neighborhood can consist of either the four orthogonally bordering cells or the eight orthogonally and diagonally bordering cells. In the space divided up this way, transition rules are applied simultaneously to each cell. The transition any particular cell undergoes will depend on its state and on the states of its neighbors. Von Neumann was able to prove that a configuration of about 200,000 cells, each with 29 different possible states and each placed in a neighborhood of 4 orthogonally adjacent squares, could meet all the requirements of a self-reproducing automaton. The large number of elements was necessary because von Neumann's model was also designed to simulate a Turing machine. Von Neumann's machine can, theoretically, perform any mathematical operation.
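
Von Neumann's 29-state automaton is far too large to reproduce here, but the ingredients in the passage above--a grid of cells, a finite set of states, a neighborhood, and a transition rule applied to every cell at once--can all be seen in a much simpler descendant, John Conway's later "Game of Life," which uses just two states and the eight-cell neighborhood. The Python sketch below is offered only as an illustration of the general idea, not as von Neumann's construction.

```python
# Conway's "Game of Life", a two-state cellular automaton on the eight-cell
# neighborhood: one transition rule applied to every cell at the same instant.
from collections import Counter

def step(live_cells):
    """Apply the transition rule simultaneously to every cell of an unbounded grid."""
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {
        cell
        for cell, n in neighbor_counts.items()
        if n == 3 or (n == 2 and cell in live_cells)   # birth on 3 neighbors, survival on 2 or 3
    }

# A "glider": five cells that rebuild their own shape, one square farther along, every four steps.
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    cells = step(cells)
print(sorted(cells))   # the same shape, shifted one square diagonally
```

A handful of cells reconstructing their own shape in a new place is, of course, a far cry from a self-reproducing automaton, but it shows how much orderly behavior a simple transition rule can generate.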

In 1950, when it was evident to all that the engineering phase of computer technology was accomplishing impressive tasks, von Neumann postulated one such system in terms of a factory that contains within it the machinery and the detailed blueprints for making identical factories (and identical blueprints) from raw materials provided to it. Take that a step up in complexity, and the details can include a specification for subsystems that find raw materials for the factory from the environment, with no human intervention.

If one fantasizes one step farther on the complexity spectrum, the instructions and capabilities could specify factories capable of building spaceships to send more spaceships to other planets, where the raw materials found would be shaped into more factory-spaceship-launchpad systems, and if you could build factories that could build two or more such complexes, you could have a counterforce to the generally disorderly trend of the cosmos, in the form of a (mindless?) horde of factory-building factories, munching outward through the galaxies like an anti-entropic swarm of logical locusts.

While it definitely sounds like a science-fiction story, and many would add that it could be interpreted as an idea of such inhuman coldness as to be termed "fiendish," such scenarios are legitimate topics in the field of automata, and are still known as "von Neumann machines" (as distinguished from "the von Neumann machine," the logical architecture he created for digital computers).

Von Neumann died in 1957, before he could achieve a breakthrough in the field of automata. Like Ada, he died of cancer, and like Ada, he was said to have suffered terribly, as much from the loss of his intellectual faculties as from pain. But the world he left behind him was powerfully rearranged by what he had accomplished before he failed to solve his last, perhaps most interesting, problem.
