That John von Neumann was one of the supreme intellects humanity has produced should be beyond dispute. Both the lightning-fast speed of his mind and the astonishing range of fields to which he made seminal contributions made him a legend in his own lifetime. When he died in 1957 at the young age of 53, it was a huge loss: the loss of a great mathematician, a great polymath and, to many, a great patriotic American who had done much to extend his country's advantage in cutting-edge weaponry.

Starting with pure mathematics in the 1920s and early 30s - set theory, measure theory, rings of operators, the foundations of mathematics - von Neumann moved on to topics closer to physics, like ergodic theory, Hilbert spaces and the foundations of quantum mechanics. He then moved into economics, writing "Theory of Games and Economic Behavior" with Oskar Morgenstern, which laid the foundations of game theory (a first edition in good condition now sells for $12,500). During and after the war von Neumann became an almost completely applied mathematician and physicist. Perhaps the major reason for this transformation was his introduction to computing during a consulting trip to England in 1943. Even as nuclear weapons promised to completely change politics, science and international relations, he wrote to a friend at the end of the war, "I am thinking about something much more important than bombs; I am thinking about computers." In another puckish letter signaling his move away from his traditional domain of pure mathematics, he said he was coming back from England a "better and impurer man".

To the lay public and to engineers, von Neumann is perhaps best known as one of the founders of modern computing, his name made ubiquitous through the von Neumann architecture taught to undergraduates in computer science and engineering. Interestingly, it is this distinction that is somewhat controversial, and also much more interesting than a naive analysis suggests. Extreme opinions are sometimes tossed about on both sides, so it's worth laying some of them to rest right away. Von Neumann did not "invent" the computer or computer science; the history of computing goes back centuries before him. Nor did he "invent" the stored-program concept, or most of the major computing ideas we now take for granted, like random access memory and flow control. He did not invent any important piece of hardware. But as William Aspray argues in his excellent and detailed, albeit somewhat staid and dry book, von Neumann's true influence was far subtler and, ironically, goes even further than what his defenders imply. I am not sure even Aspray does a convincing job of emphasizing how far it went. So rather than embark on a detailed chapter-by-chapter analysis of the book, I want to drive home the two most important concepts that emerge when we analyze von Neumann's role in the history of modern computing -

*the value of generalists and the power of abstraction.*

**An accidental introduction to computing**

Von Neumann was introduced to computers in large part by accident. One part of the introduction came from the "computers" - usually women performing repetitive calculations in an assembly-line system - who did the bomb calculations at Los Alamos. Those calculations particularly drove home to him the importance of the non-linear phenomena involved in the complex radiation flow and hydrodynamics of a nuclear explosion, phenomena that were very hard to model by hand. Another introduction came from meeting scientists in England like Alan Turing, and the engineers who were building some of the first computing machines in Manchester and elsewhere. Von Neumann had also seen the value of computed firing tables in his work on ballistics at the Aberdeen Proving Ground in Maryland. All these experiences impressed on him the importance of computational science in general.

But perhaps the most important event that introduced von Neumann to computing was a chance encounter at a railway station in the summer of 1944 with Herman Goldstine, a mathematician who had been working on the ENIAC at the University of Pennsylvania. Until then von Neumann had not known about this pioneering work, the first important computer project in the country. The ENIAC was not a true stored-program computer - its cables and connections had to be laboriously rewired for every new problem - but by the standards of the time it was quite advanced, and it is now considered the first general-purpose electronic computer, able to tackle a variety of problems. Unlike earlier electromechanical machines built from relays, the ENIAC used vacuum tubes, which made it fully electronic, far faster, and a true forerunner of modern computers. The ENIAC was a labor of love, built by engineers whose names are sadly not as appreciated as von Neumann's but should be; along with Goldstine, J. Presper Eckert and John Mauchly played foundational roles in its design and construction.

**The importance of von Neumann's influence**

At this point it's sensible to say a word about the state of what would become computer science. As a field it was generally looked down upon by mathematicians and physicists and regarded as the domain of drudge work. This is where the first of von Neumann's contributions came into play: his sheer influence, whose role cannot be overestimated. By the 1940s he was already considered one of the world's greatest mathematicians and polymaths, and his work in mathematics, physics and economics commanded the highest respect. In addition, the sheer speed of his thinking, which left even Nobel Laureates feeling stumped, contributed to an almost godlike perception of his abilities; Enrico Fermi once said that von Neumann made him feel like he knew no mathematics at all, and Hans Bethe once mused whether von Neumann's mind indicated a higher species of human being. Von Neumann was also becoming a very valuable asset to the US government. All this meant that when von Neumann said something, you listened. People who question his importance to modern computing sometimes don't appreciate that "importance" in a field is a combination of originality and influence. In influence no one surpassed von Neumann, so whatever he said about computing was taken seriously simply because he had said it.

**Von Neumann the generalist**

The speed with which von Neumann became influential in the ENIAC project attested to one of his signal qualities: his remarkable ability to quickly grasp a new field of inquiry and then leapfrog even the field's founders to divine new results and insights. This was also a source of annoyance to some, since it meant von Neumann could take their ideas and immediately run farther with them than they themselves could. More than anyone else, von Neumann could take the complete measure of a field - a thirty-thousand-foot view, if you will. This is where an even more important quality came into play: the polymath's ability to be a generalist. Most people then working in computing came from narrowly defined fields: the mathematicians didn't know much about engineering, and the engineers who specialized in vacuum tubes and electronics had little idea of the mathematical theory behind computing. Von Neumann was unique in having total command of all of mathematics and a good part of physics, and his work at Aberdeen and Los Alamos had introduced him to key ideas in engineering. The missing link was the engineering work on the ENIAC, and once he understood it, his generalist's mind quickly connected all the dots.

**Von Neumann and the power of abstraction**

Two other very important influences contributed to making von Neumann unique, and both shed light not just on his mind but on the power of abstraction. One was his reading of Alan Turing's famous 1936 paper on Turing machines, which laid the foundations of theoretical computer science. This was, again, a paper that engineers would not have read. When Turing visited Princeton in the late 1930s, von Neumann tried to recruit him as his assistant, but Turing chose instead to return to England, where he became a key part of the government's cryptographic effort to break the German codes. Turing's paper proved very influential, and von Neumann in fact asked all the engineers working on the ENIAC, and later on the Institute for Advanced Study computer, to read it.

The second paper that was a major influence on von Neumann was a 1943 paper by Warren McCulloch and Walter Pitts, the first computational model of a neuron and the forerunner of today's neural networks. Von Neumann immediately grasped the connection between the McCulloch-Pitts model and the basis of computing. Again, this was not work familiar to engineers, or even to other mathematicians interested in computing; it was only von Neumann's unique interests and abilities as a polymath that led him to read and appreciate it, and especially to appreciate the value of treating neurons, and computational elements in general, as generalized black boxes.
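The 1943 neuron model is simple enough to sketch in a few lines of Python; the particular weights and thresholds below are illustrative choices, not taken from the paper:

```python
def mcp_neuron(inputs, weights, threshold):
    """A McCulloch-Pitts neuron: outputs 1 ("fires") exactly when the
    weighted sum of its binary inputs reaches the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With suitable weights and thresholds, single threshold units realize
# logic gates - the bridge between neurons and computing elements:
AND = lambda a, b: mcp_neuron([a, b], [1, 1], threshold=2)
OR = lambda a, b: mcp_neuron([a, b], [1, 1], threshold=1)
NOT = lambda a: mcp_neuron([a], [-1], threshold=0)

print(AND(1, 1), OR(0, 0), NOT(0))  # 1 0 1
```

That threshold units compose into logic gates is precisely what made the analogy between neurons and computation so suggestive.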

Both the Turing and the McCulloch-Pitts papers led von Neumann to achieve something genuinely his own. That something is a signature quality of mathematics and, to some extent, of computer science, and it is what really makes those two fields as powerful as they have become: the power of abstraction. The beauty and strength of mathematics is that it can generalize from specific instances (or instantiations, as computer scientists like to say) to universal abstract frameworks. Physics shares this power to a considerable extent - the equation F = ma, for instance, is independent of its specific instances and can equally describe an apple falling to earth, a planet revolving around the sun, or two black holes colliding. But the language the equation is expressed in is mathematics, and it is mathematics that allows us to generalize in the first place.

Von Neumann's great achievement was to move away from vacuum tubes, wires, punch cards and magnetic-core memory to a high-level view of computing, one that also led him to see parallels with the human brain. This view told him that any computational framework - biological or electronic - must have five basic components: an input, an output, an arithmetic unit that manipulates data, a control unit that directs the sequence of operations, and a memory that stores data. Crucially, it also told him that the instructions for doing something and the thing that is done can be stored in the same place and in the same form. In the words of the historian George Dyson, von Neumann's insights "erased the distinction between numbers that mean something and numbers that do something." The stored program was not invented by von Neumann, but this abstract view of it did come from him, again thanks to his powers as a pure mathematician and generalist. These insights are the basis of today's von Neumann architecture, but the key idea enabling them was an abstracted view that let von Neumann visualize the structure of the computer in its most general form, something his specialized contemporaries could not do.
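Dyson's point can be made concrete with a toy stored-program machine in Python. The instruction encoding here is invented for illustration, but the essential feature is real: code and data share one flat memory, distinguished only by how they are used.

```python
# A toy stored-program machine: one flat memory holds both instructions
# and data, encoded uniformly as integers.
# Invented encoding: each instruction is a (opcode, operand_address) pair.
LOAD, ADD, STORE, HALT = 0, 1, 2, 3

def run(memory):
    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        op, addr = memory[pc], memory[pc + 1]
        pc += 2
        if op == LOAD:
            acc = memory[addr]
        elif op == ADD:
            acc += memory[addr]
        elif op == STORE:
            memory[addr] = acc
        elif op == HALT:
            return memory

# Program occupies cells 0-7; its data lives in cells 8-10 of the SAME memory.
mem = [LOAD, 8, ADD, 9, STORE, 10, HALT, 0,   # code: load, add, store, halt
       2, 3, 0]                               # data: compute 2 + 3 -> cell 10
print(run(mem)[10])  # 5
```

Nothing in the memory itself marks a cell as "number that means something" or "number that does something"; the distinction exists only in how the machine reads it.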

A slight digression on this idea of the power of abstraction, since it's relevant to my own job. I am involved with a company that is trying to enable scientists to run biology and chemistry experiments remotely, in a "cloud lab", from the comfort of their homes and laptops. A key idea is to abstract away the gory details of the hardware and equipment behind a software platform that exposes only high-level functionality to scientists who aren't experts in engineering. An even more ambitious goal is to generalize workflows across biology and chemistry, so that instead of thinking in protocols specific to either field, scientists think only in generic protocols and generic sequences of steps like "move liquid", "stir" and "heat/cool". This is possible because at an abstract level a container holding cells and a container holding a chemical compound are, from the point of view of the software, the same thing: objects on which some operation must be performed. At a still more abstract level they are binary bits of code changing into other binary bits of code, and at this level the words "biology" and "chemistry" become irrelevant.

The ultimate goal is thus to do away with the distinction between specific instantiations of operations in specific fields and abstract them away into generalized operations. I would like to think Johnny would have appreciated this.
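As a hypothetical sketch of what such field-agnostic abstraction might look like in software - the class and function names here are invented for illustration, not any real platform's API:

```python
# Hypothetical sketch: field-agnostic lab operations. A container of cells
# and a container of a chemical compound are the same object type; the
# words "biology" and "chemistry" never appear in the protocol itself.
from dataclasses import dataclass

@dataclass
class Container:
    """A vessel holding something; the software doesn't care what."""
    label: str
    volume_ml: float
    temperature_c: float = 25.0

def move_liquid(src: Container, dst: Container, ml: float) -> None:
    src.volume_ml -= ml
    dst.volume_ml += ml

def heat(c: Container, target_c: float) -> None:
    c.temperature_c = target_c

# The same generic steps serve a cell culture or a chemical reaction:
cells = Container("HeLa culture", volume_ml=10.0)
medium = Container("growth medium", volume_ml=50.0)
move_liquid(medium, cells, ml=5.0)
heat(cells, target_c=37.0)
print(cells.volume_ml, cells.temperature_c)  # 15.0 37.0
```

Only the labels, which the operations ignore, hint at what the containers actually hold.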

**First Draft of a Report on the EDVAC (1945)**

The result of this generalist knowledge was a seminal report, the First Draft of a Report on the EDVAC, which von Neumann wrote and circulated in 1945 and 1946. The EDVAC was to be the ENIAC's successor and a true stored-program computer. The report laid out in detail what we now know as the von Neumann architecture and explained key concepts like flow control, subroutines and memory implementation. Von Neumann was especially keen on subroutines, since keeping frequently used instruction sequences stored and ready for immediate access went a long way toward making stored-program computing practical. He also emphasized the importance of random access memory; the first random access memory hardware, the Williams tube, was invented in 1946.

The EDVAC report became controversial for two reasons. First, while it grew out of many discussions von Neumann had with the ENIAC engineers, especially Eckert and Mauchly, it carried only von Neumann's name. Second, the report led to a bitter patent dispute. Eckert and Mauchly wanted to start their own company building computers based on patents from the ENIAC work, but once von Neumann circulated the report publicly, the knowledge was in the public domain and the patent question became moot. Eckert and Mauchly were understandably bitter, though we can credit von Neumann with being an early proponent of the open dissemination of ideas, a spirit that anticipates today's open-source ethos; he wanted computing concepts to be available to everyone. Appropriately enough, the EDVAC report became widely known to engineers and scientists across the United States and Europe and influenced computer designs in many countries. It cemented von Neumann's reputation as one of the founders of modern computing, but it should always be remembered that while the generalist insights in the report came from von Neumann, they rested on a great deal of specific engineering and design work done by others.

**Two early applications: Non-linear equations and meteorology**

After working on the ENIAC and the EDVAC, von Neumann decided to apply all the knowledge and insight he had gained to building a computer at the Institute for Advanced Study (IAS) in Princeton, where he had been a professor since 1933. This fascinating story is told extremely well by George Dyson in his marvelous book "Turing's Cathedral", so it's not worth repeating here. But it is worth noting the two applications von Neumann considered most important for the first computers. The first was the solution of non-linear equations. Von Neumann had become intimately familiar with non-linear equations while analyzing the complex hydrodynamics and radiation flow of nuclear explosions. He knew that non-linear equations are very hard to solve by traditional methods - analytical solutions are often impossible, and even numerical ones can be challenging - and he realized that the fast, iterative techniques computers made possible would greatly aid their solution. Many of the early papers authored by von Neumann, Goldstine and Bigelow describe mathematical problems like the diagonalization of large matrices and the numerical solution of non-linear partial differential equations. This early work demonstrated the great power of computing across the wide variety of fields where non-linear equations matter.
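As a vastly simplified illustration of the kind of iteration computers excel at, here is Newton's method applied to a single non-linear equation; the equation is an illustrative stand-in, nothing like the coupled partial differential equations von Neumann actually faced:

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton's method: repeatedly replace x by x - f(x)/f'(x).
    Each step is trivial arithmetic, but many steps may be needed -
    exactly the repetitive labor early computers automated."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# An illustrative non-linear equation with no neat closed-form solution:
#   x**3 - x - 2 = 0
root = newton(lambda x: x**3 - x - 2, lambda x: 3 * x**2 - 1, x0=1.5)
print(root)  # approximately 1.5214
```

Done by hand, each iteration of such a scheme is minutes of error-prone arithmetic; done electronically, thousands of them are instantaneous, which is what made computers transformative for non-linear problems.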

Von Neumann also realized that the atmosphere, with its complex movements of air and water, is a perfect example of non-linear phenomena. Wartime events like the Normandy landings had underscored the importance of understanding the weather, and von Neumann now saw the computer as the ideal tool for weather simulation. Most of the work in this area was done by scientists like Jule Charney and Carl-Gustaf Rossby, but von Neumann played a very influential role by co-authoring papers with them, organizing conferences, securing funding and generally spreading the word. His stature and eminence again went far in convincing the scientific community to apply computers to meteorology. Von Neumann also thought that controlling the weather would be easy; this has proved a much harder goal, partly because of the complexity of the phenomena involved (including chaos) and partly for political reasons.

**Von Neumann's role as a founder of modern computer science**

The Institute for Advanced Study computer had a memory of just 5 kilobytes, less than a modern machine uses to display a single icon. And yet it achieved remarkable feats: simulating the workings of a hydrogen bomb (secretly, at night), simulating the weather, and modeling the genetics of evolving populations. It embodied all of von Neumann's salient concepts and was widely emulated around the country. The Navy built a computer based on the IAS machine, and so did IBM and the RAND Corporation, whose machine was playfully named the JOHNNIAC. From these machines the gospel spread far and wide.

In his last few years von Neumann became even more interested in the parallels between the brain and the computer. His last major contribution was a detailed theory of self-reproducing automata, which presaged important later developments in molecular biology and nanotechnology; a lecture he gave at Caltech in 1948 lays out components of self-reproducing organisms, complete with error correction, that are remarkably similar to the DNA, RNA, ribosomes, proofreading enzymes and other genetic machinery discovered later. Once again, what made von Neumann's insights possible was that he thought about these components in the most general, most abstract manner, without waiting for the biologists to catch up. In the 1950s he planned to move from the IAS to either UCLA or MIT, where his interests in computing would find a better home and be encouraged and funded. The history of science and technology could have been very different had this come to pass, but it was not to be. In 1955 von Neumann was diagnosed with cancer, and he passed away after a cruel and protracted illness in February 1957. On his deathbed lay notes for a set of lectures that were later published as a book.

So was von Neumann one of the founders of modern computer science? As complicated, subtle and important as the details are, the overall answer has to be yes. This answer has little to do with his technical contributions and everything to do with his sheer influence and his powers of generalization and abstraction. Von Neumann communicated the promise of computers at a time when they were regarded as little more than turn-the-crank calculators. Because of his enormously wide-ranging interests he demonstrated their potential applications to a vast number of fields in pure and applied mathematics, meteorology, physics and biology. Most importantly, he came up with the general ideas that serve as the foundation of so much of the computing we take for granted today. In other words, von Neumann more than anyone else made computing respectable, widely known and part of the basis of modern life. He is not

*the* founder of computer science, but he is certainly one of its principal founders. And he achieved this status largely because of the advantage generalists enjoy over specialists and because of the power of abstraction, both good lessons for an age in which specialization seems to be the norm.
