Field of Science

Von Neumann In 1955 And 2020: Musings Of A Cheerful Pessimist On Technological Survival




Johnny von Neumann enjoying some of the lighter aspects of technology. The cap lights up when its wearer blows into the tube.

“All experience shows that even smaller technological changes than those now in the cards profoundly transform political and social relationships. Experience also shows that these transformations are not a priori predictable and that most contemporary “first guesses” concerning them are wrong.” – John von Neumann
Is the coronavirus crisis political or technological? Most present analysis would say that this pandemic was a result of gross political incompetence, lack of preparedness and impulsive responses by world leaders and governments. But this view is narrow because it privileges the proximate cause over the ultimate one. The true, deep cause underlying the pandemic is technological. The crisis arose from a hyperconnected world in which global communication and the movement of goods and people across international borders far outpaced human reaction times. For all our skill in creating these technologies, we did not equip ourselves to manage the network effects and sudden failures they produce in social, economic and political systems. And an even older technology, the transfer of genetic information between disparate species, enabled the whole crisis in the first place.
This privileging of political forces over technological ones is typical of the mistakes we often make in seeking the root cause of problems. Political causes, greatly amplified by the twenty-four-hour news cycle and social media, may seem important in the short term, but they are largely illusory; there is little doubt that the slow but sure grind of technological change, penetrating ever deeper into social and individual choices, will be responsible for most of the important transformations we face during our lifetimes and beyond. On scales of a hundred to five hundred years, there is little doubt that science and technology rather than any political or social event cause the biggest changes in the fortunes of nations and individuals: as Richard Feynman once put it, a hundred years from now the American Civil War will pale into provincial insignificance compared to that other development from the 1860s – the crafting of the basic equations of electromagnetism by James Clerk Maxwell. The former led to a new social contract for the United States; the latter underpins all of modern civilization – including politics, war and peace.
The question, therefore, is not whether we can survive this or that political party or president. The question is: can we survive technology? In 1955, John von Neumann wrote a very thought-provoking article titled "Can We Survive Technology?" in Fortune magazine that put this question in the context of the technology of the times. The essay was shaped by historical context – a great, terrible world war had ended just ten years earlier – and by von Neumann's own background and interests. But it also presents original and very general observations that are most interesting to analyze in the context of our own times. By then Johnny, as friends and even casual acquaintances called him, was regarded as the fastest and most wide-ranging thinker alive and had already carved his name in history as a mathematician, polymath, physicist and military advisor of the very highest rank. Sadly, he had only two years left before the cancer that would kill him at the young age of 53. He was also blessed – or cursed – with a remarkably prescient but still cheerful and ironic pessimism that enabled him to look boldly ahead at future world events; already in the 1930s he had predicted the major determinants of a potential world war and its winners and losers. Along with his seminal contributions to game theory, pure and applied mathematics, nuclear weapons design and quantum mechanics, his work on computing and automata had placed him in the front ranks of soothsayers. And like all good soothsayers, he was sometimes wrong.

Copy of the June, 1955 issue of Fortune magazine (from the author’s library)

Perhaps it’s pertinent to quote a paragraph from the last part of Johnny’s article because it lays bare the central thesis of his philosophy in stark terms.
“All experience shows that even smaller technological changes than those now in the cards profoundly transform political and social relationships. Experience also shows that these transformations are not a priori predictable and that most contemporary “first guesses” concerning them are wrong. For all these reasons, one should take neither present difficulties nor presently proposed reforms too seriously.”
Von Neumann starts by pointing to what he saw as the major challenge posed by the accelerating technological revolution of the previous half century – a revolution that saw the rise of radio, television, aviation, submarines, antibiotics, radar and nuclear weapons, among other things. He had already seen what military technology could do to millions of people, incinerating them in a heartbeat and reducing their cities and fields to rubble, so one needs to understand his musings in this context.
“In the first half of this century the accelerating industrial revolution encountered an absolute limitation—not on technological progress as such but on an essential safety factor. This safety factor, which had permitted the industrial revolution to roll on from the mid-eighteenth to the early twentieth century, was essentially a matter of geographical and political Lebensraum: an ever broader geographical scope for technological activities, combined with an ever-broader political integration of the world. Within this expanding framework it was possible to accommodate the major tensions created by technological progress. Now this safety mechanism is being sharply inhibited; literally and figuratively, we are running out of room. At long last, we begin to feel the effects of the finite, actual size of the earth in a critical way.”
Let's contrast this scenario with the last fifty years, which were also a period of extraordinary technological development, mainly in communications technologies and in the nature of work and knowledge engendered by the Internet. Just as in 1955, we are, in Johnny's words, "running out of room" and feeling the effects of the "finite, actual size of the earth in a critical way", albeit in novel incarnations of our own. The Internet has suddenly brought people together and made the sphere of interaction crowded. We were naïve in thinking that this intimacy would engender understanding and empathy; as we realized quite quickly, it instead tore us apart by cloistering us into echo chambers that we hermetically sealed from others through social disapproval and technological means. But as Johnny rightly notes, this crisis is scarcely a result of the specific technology involved; rather, "it is inherent in technology's relation to geography on the one hand and to political organization on the other."

Von Neumann and Oppenheimer in front of the Institute for Advanced Study computer in Princeton

The three major technological arenas of Johnny's time that he discusses were computing, energy production and the weather. That last topic might seem like an odd addition, but it was foremost on Johnny's mind as a major application of computing. The weather was of special interest to him because it is the archetypal complex system, governed by non-linear differential equations and multifactorial interactions that are very hard for human beings to handle with pencil and paper. Scientists during World War II had also become finely attuned to the need for understanding the weather; this need had become apparent during major events like the invasion of Normandy, where the lives of hundreds of thousands of soldiers and civilians depended on day-to-day weather forecasts. It was precisely for understanding complex systems like the weather that Johnny and his associates made such major contributions to building some of the first general-purpose computers employing the stored-program concept, first at the University of Pennsylvania and then at the Institute for Advanced Study in Princeton.
Johnny had a major interest in predicting the weather and then controlling it. He was also one of the first scientists to see that increased production of carbon dioxide would have major effects on the climate. He was well aware of the nature of feedback systems and analyzed, among other things, the interplay of solar radiation and changes in ice cover on the earth's surface. He understood that both these factors are subject to delicate balances, and that human production of carbon dioxide might well upset or override those balances. But Johnny's main interest was not simply in predicting the weather but in controlling it. In his essay he talks about cloud seeding and rain-making and about modulating the reflectivity of ice to raise or lower temperatures. He clearly understood the monumental impact, exceeding the effects of even nuclear war, that weather prediction and control might have on human civilization:
“There is no need to detail what such things would mean to agriculture or, indeed, to all phases of human, animal, and plant ecology. What power over our environment, over all nature, is implied! Such actions would be more directly and truly worldwide than recent or, presumably, future wars, or than the economy at any time. Extensive human intervention would deeply affect the atmosphere’s general circulation, which depends on the earth’s rotation and intensive solar heating of the tropics. Measures in the arctic may control the weather in temperate regions, or measures in one temperate region critically affect another, one quarter around the globe. All this will merge each nation’s affairs with those of every other, more thoroughly than the threat of a nuclear or any other war may already have done.”
Of all the topics that Johnny discusses, this is the only one which at first sight does not seem to have come to pass in terms of major developments. The reasons are twofold. First, Johnny did not know about chaos in dynamical systems, which makes accurate long-range prediction of weather and climate very difficult, as the sketch below illustrates. Of course, you don't always need to understand a system well in order to manipulate it by trial and error. This is where the second reason, involving political and social will, comes into play. Johnny's prediction that carbon dioxide would have a major impact on the climate has been well validated, although the precise effects remain murky. World opinion has generally shied away from climate-control experiments, but given the potentially catastrophic effects that CO2 might have on the food supply, immigration, tree cover and biodiversity in general, it is likely that the governments of the world will be pressed into action by their citizens to at least try to mitigate the impact of climate change using technology. Although Johnny's vision of weather control now seems quaint and outdated, my feeling is that his analysis was actually so far ahead of its time that we will soon see it discussed, debated and put into action, perhaps even during my lifetime. In saying this I remember President Kennedy's words: "Our problems are man-made; therefore, they can be solved by man."
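To see why chaos forecloses accurate long-range prediction, here is a minimal sketch, using the logistic map – a textbook chaotic system, not any real climate model – of sensitive dependence on initial conditions. The function name and parameters are my own, chosen purely for illustration.

```python
# Toy demonstration of sensitive dependence on initial conditions,
# the property of chaotic systems that von Neumann could not have known about in 1955.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r * x * (1 - x), which is chaotic at r = 4."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)
b = logistic_trajectory(0.200001)   # differs only in the sixth decimal place
for step in (0, 10, 25, 50):
    print(step, round(a[step], 4), round(b[step], 4))
# After a few dozen steps the two trajectories bear no resemblance to each other,
# which is exactly the obstacle that long-range forecasting runs into.
```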
Like many scientists of his time, Johnny was optimistic about nuclear power, seeing limitless possibilities for it, perhaps even making it "too cheap to meter". His prediction seems to have failed, along with similar predictions by others, but the failure has less to do with the intrinsic nature of nuclear power and more with the social and political structures that hampered its development: onerous regulatory burdens on plant construction, unrealistic fears about radiation, and a refusal to let entrepreneurs experiment with reactor designs through trial and error the way they did with biotechnology and computing. Just as with weather prediction, I believe that Johnny's vision for the future of nuclear power will become reality once world governments and their citizens realize that nuclear power offers one of the best escapes from the dual trap of low-energy-density alternative fuels and energy-dense but politically and environmentally destructive fossil fuels. Already we are seeing a resurgence of new-generation nuclear reactors.
One of Johnny's fears about nuclear weapons was that our reaction times would be inadequate against even minor developments in the field. He says,
“Today there is every reason to fear that even minor inventions and feints in the field of nuclear weapons can be decisive in less time than would be required to devise specific countermeasures. Soon existing nations will be as unstable in war as a nation the size of Manhattan Island would have been in a contest fought with the weapons of 1900.”
I already mentioned at the beginning how rapid advances in communications and transport systems left us woefully unprepared for the coronavirus. But there is another very important sphere of human activity, perhaps unanticipated by Johnny, that has also left us impoverished in terms of countermeasures against even minor "improvements". This is the field of electronic commerce and financial trading, where differences of nanoseconds in the transmission of price signals can make or break the fortunes of companies. More importantly, they can make or break the fortunes of the millions of ancillary economic units and individuals tied to these institutions through a complex web of models and dependencies whose fault lines we barely understand – a gulf of ignorance with direct causal connections to the global financial crisis of 2008. Sadly, there is no evidence that we understand these dependencies any better now, or that we are better prepared to deploy countermeasures against sundry developments in the layering and modeling of financial instruments that affect millions.
Cybersecurity is another field where even a minor improvement in the ability to control, even momentarily, the complex computer networks of an enemy country can have network effects that surpass the initial perturbation and lead to large-scale impacts on the population. Ironically, the very dependence of developed countries on state-of-the-art computer networks that govern the daily lives of their citizens has made them vulnerable to attack; the capacity of these techno-bureaucratic systems to efficiently and globally ward off foreign and domestic attacks has not kept pace with their growth. Presumably, defense and high-value corporate systems in countries like the United States are resilient enough not to be crippled by such attacks, but as the 2016 election showed, there is little confidence that this is actually the case. Moreover, these systems need to be not just resilient but antifragile, so that they can counteract the vastly amplified effects of small initial jolts with maximum efficiency. As critical medical, transport and financial infrastructure increasingly ties its fate to such technology, the ability to respond with countermeasures in equal or less time than the threat becomes key.
Automation is another field in which Johnny made major contributions through computing. While working on the atomic bomb at Los Alamos, he had observed human “computers” performing repetitive calculations related to the complex hydrodynamics, radiation flow and materials behavior in a nuclear weapon as it blew apart in a millionth of a second. It was apparent to him that not only would computers revolutionize this process of repetitive calculation, but that they would have to employ stored programs if they were not to be crippled in these calculations by the bottleneck of being reconfigured for every task.
“Thanks to simplified forms of automatic or semi-automatic control, the efficiency of some important branches of industry has increased considerably during recent decades. It is therefore to be expected that the considerably elaborated newer forms, now becoming increasingly available, will effect much more along these lines. Fundamentally, improvements in control are really improvements in communicating information within an organization or mechanism. The sum total of improvements in this field is explosive.”
The explosive nature of the improvements in automation again comes from great gains in economies of scale combined with the non-linear effects of chunking together automated protocols, which reach a critical mass when large parts of engineering and commercial processes are suddenly freed from human intervention. Strangely, Johnny did not foresee the seismic effects automation would have in displacing human labor and causing significant political shifts both within and across nations. For insights into this problem, perhaps we should look to a book written by Johnny's friend and contemporary, the MIT mathematician Norbert Wiener. In 1950 Wiener wrote "The Human Use of Human Beings", in which he extolled automation but warned against machines breaking free from the dictates of their human masters and controlling us instead.

“Progress imposes not only new possibilities for the future but new restrictions.” – Norbert Wiener

Wiener’s prediction has already come true, but likely not in the way he meant or foresaw. Self-replicating pieces of code now travel through cyberspace looking for patterns in human behavior which they reinforce through modifying and spreading themselves through the cyber-human interface. There is no better example of this influence than in the ubiquity of social media and the virtual addiction that most of us display for these sources. In this particular case, the self-replicating pieces of code first observe and then hijack the stimulus-response networks in our brains by looking for dopamine rush-inducing reactions and then mutating and fine-tuning themselves to maximize such reactions (the colloquial phrase “maximizing clicks”, while pithy, does not begin to capture such multilayered phenomena).
How do we ward off such behavior-hijacking technology, and more generally technology with destructive effects? Here Johnny is pessimistic, for several reasons. The primary reason is that, as history shows, separating "good" from "bad" technology is usually a fool's errand. Johnny gives the example of classified military technology, which is often impossible to separate from open civilian technology because of its dual-use nature: "Technology – like science – is neutral all through, providing only means of control applicable to any purpose, indifferent to all…A separation into useful and harmful subjects in any technological sphere would probably diffuse into nothing in a decade." Any number of examples, from chemistry developed for both fertilizer and explosives to atomic fission harnessed for both weapons and reactors, underscores the unvarnished truth of this statement.
Technology, and more fundamentally science, is indeed indifferent, mainly because, in Robert Oppenheimer's words, "The deep things in science are not discovered because they are useful; they are discovered because it was possible to discover them." Once prehistoric man found a flint rock, striking it to create fire and using it to smash open the skull of a competitor were both inevitable actions, completely inseparable from each other. It was only our unnatural state of civilization, developed during an eye blink of time as far as geological and biological evolution are concerned, that taught man to use the rock for the former purpose instead of the latter. Those teachings came from the social and political structures that men and women built to ensure harmony; there was exactly zero information in the basic technology of the rock itself that would have allowed us to make the distinction. As Johnny notes, enforcing such a separation strictly could only come from obliterating the technology in the first place – a neat example of having to kill something in order to save it.
However, the bigger and deeper problem that Johnny identified is that technology has an inexorable, Faustian attraction that creates an unholy meld between its utility and volatility. This is because:
“Whatever one feels inclined to do, one decisive trait must be considered: the very techniques that create the dangers and the instabilities are in themselves useful, or closely related to the useful. In fact, the more useful they could be, the more unstabilizing their effects can also be. It is not a particular perverse destructiveness of one particular invention that creates danger. Technological power, technological efficiency as such, is an ambivalent achievement. Its danger is intrinsic… The crisis will not be resolved by inhibiting this or that apparently particularly obnoxious form of technology”
“The more useful they could be, the more unstabilizing their effects can also be.” This statement perfectly captures the Gordian knot with which technologies like social media have bound us today. Their usefulness is intrinsically linked to the instability they cause, whether that instability involves an addictive hollowing out of our personal time or the political echo chambers and biases that evolve on these platforms. Technology is indeed ambivalent, and perhaps the people who will thrive best in an exceedingly technological world are those who can comfortably ride the wave of this ambivalence while at least marginally pushing it in a productive direction. Nor can we harbor the fond hope that, even from a strictly political and social viewpoint, demonstrating a technology such as a social media platform to be toxic and divisive would lead to its decline. When even a war that clearly demonstrated the ability of technology to obliterate millions could do little to stem further development in weaponry, it is scarcely possible to believe that the peacetime problems created by Facebook or Twitter will do anything to starve what fundamentally makes them tick. And yet, as happened with weaponry, there might be a path forward in which we make these destructive technologies more humane and more conditional, with a curious mix of centralized and citizen-enabled controls that curb their worst excesses.
Quite apart from its emotional and technical aspects, separating the useful effects of technology from the destructive ones and trying to isolate one from the other might also be a moral mistake. This becomes apparent when one realizes that almost all technology, with its roots in science, springs from the basic human urge to seek, discover, build, find and share; the word technology itself comes from the Greek 'techne', the skill or craft by which something is gained, and 'logos', the words through which such knowledge is expressed. Opposing this urge would be opposing a very basic human faculty.
“I believe, most importantly, prohibition of technology (invention and development, which are hardly separable from underlying scientific inquiry), is contrary to the whole ethos of the industrial age. It is irreconcilable with a major mode of intellectuality as our age understands it. It is hard to imagine such a restraint successfully imposed in our civilization.”
What safeguards remain then against the rapid progression and unpredictable nature of technologies described above? As mundane as it sounds, course-correction through small, incremental, opportunistic steps might be the only productive path. Just like the infinitesimal steps of thermodynamic work in an idealized Carnot engine, one hopes that small course-corrective steps will allow us to gradually turn the system back to an equilibrium state. As Johnny put it, “Under present conditions, it is unreasonable to expect a novel cure-all.” 

The cotton gin

I think back again to Johnny's central thesis stated at the beginning of this essay – "All experience shows that even smaller technological changes than those now in the cards profoundly transform political and social relationships" – and I think of Eli Whitney's cotton gin. By the end of the 18th century many thought slavery was a dying institution; the efficiency of slaves picking cotton was so low that one could scarcely imagine slavery serving as the foundation of the American economy. Whitney's cotton gin, invented in 1794, changed all that: where previously it took a single slave about ten hours to separate and clean a single pound of cotton, two or three slaves using the machine could turn out fifty pounds of cleaned cotton in a day. Whitney's invention was classic dual use: it led to transformative gains in the production of a staple crop, but it was other human beings, not the machine, who decided that these gains would be built on the backs of enslaved human beings often treated worse than animals. The cotton gin set America on the path to becoming an economic powerhouse and consigned a fair share of its population to not being treated even as citizens. Clearly the reaction time built into the social institutions of the day could not keep pace with the creation of a seemingly mundane, brush-like device that separated cotton fibers from their seeds.
What can we do in the face of such inevitable, unpredictable technological progression that catches us off guard? If the answer were really simple, we would have discovered it long ago, given the metronomic regularity with which new technology appears. But Johnny's musings end with hope – hope provided by the same history that tells us that stopping technology is tantamount to trying to stop the air from flowing.
“Can we produce the required adjustments with the necessary speed? The most hopeful answer is that the human species has been subjected to similar tests before and seems to have a congenital ability to come through, after varying amounts of trouble. To ask in advance for a complete recipe would be unreasonable. We can specify only the human qualities required: patience, flexibility, intelligence.”
From limiting the spread of nuclear weapons to reducing discrimination and human trafficking to curbing the worst of greenhouse gas emissions and deforestation, technology has shown nothing but a ceaseless march into the future, while shared morality has been a powerful if sporadic force for resurrecting the better angels of our nature against our worst instincts. The social institutions supporting slavery did not reform until a cruel and widespread war forced their hand. But I wonder about counterfactual history. I wonder whether, as gains in agricultural production kept increasing – first with other mechanical harvesters and beasts of burden and finally with powerful electric implements – the reliance on humans as a source of forced labor would have been weakened and finally done away with by the moral zeitgeist. The great irony would have been that the injustice one machine (the cotton gin) created might have met its end at the hands of another (the cotton mill created by mass electrification). This counterfactual imagining of history would nonetheless be consistent with the relentless progress of technology that has indeed made life easier and brought dignity to billions whose existence was previously mired in poverty and bondage. Sometimes the existence of something is more important than all the reasons you can think of for justifying its existence. We can continue to hope that the human race will continue to progress as it has before: with patience, flexibility, intelligence.

Book Review: "American Rebels", by Nina Sankovitch

I greatly enjoyed this excellent book on the intertwined lives and fortunes of three families from the little town of Braintree - the Adamses, the Hancocks and the Quincys. Nina Sankovitch has woven a wonderful tale of how these three families and the famous names they gave rise to were both spectators and participants in some of the most famous events in American history. The account is often quite engaging and it kept me glued to the pages. I was aware of the general facts and characters, of course, but the book accomplished the valuable goal of introducing me to Josiah Quincy in particular, a name I had only heard of but did not know much about.
Sankovitch's account begins in the 1740s, when she shows us how John Adams, John Hancock and Josiah Quincy grew up together in Braintree, along with other children like Abigail Quincy. She leads us with flair through the well-known events of roughly 1765 to 1775 – the years of turmoil during which all these men and women found the role that history had created for them – and also sheds light on underappreciated events like the dysentery and smallpox epidemics that swept through Boston. The book portrays Braintree as a small town and a quintessential example of American egalitarianism, one where everyone was equal – from the distinguished Quincys and wealthy Hancocks to the Adamses, who came from yeoman farming stock. Today Braintree is simply the south end of the "T" for most Bostonians.
All the boys and girls played together in the same town square and attended the same church where their fathers were ministers. Abigail Adams came from the prominent Quincy family. Everyone had been drilled from childhood in the values of both self-reliance and community service. The colony had already been enlightened about the evils of slavery, and unlike their Southern brethren many colonists did not own slaves. After John Hancock's father died, his wealthy uncle Thomas and aunt Lydia took him under their wing and spirited him away to Boston. There, in a mansion on Beacon Hill, Hancock grew up and took charge of the family's prosperous trading business. He soon became perhaps the most prominent citizen of Boston, certainly the wealthiest but also the most charitable. All spit and polish, he would throw dinner parties, give to the poor and somehow still avoid "entangling alliances" with the British, especially the much-hated Governor Thomas Hutchinson.
The real star of the story, however, is Josiah Quincy. A brilliant student at Harvard who raided the library while others were drinking and playing cards (he knew almost all of Shakespeare by heart), he became a prominent lawyer who began publishing letters promoting the liberty and property rights of the colonists in the "Boston Gazette" and the "Massachusetts Spy". His brilliance, eloquence and dedication to the cause of liberty and property rights all exceeded those of his compatriots, the two Johns. John Adams really became prominent only after his defense of the British soldiers charged in the Boston Massacre of 1770; before that, the limelight seemed to belong to Hancock, Quincy and John's cousin Sam Adams, who headed the incendiary Sons of Liberty, the group responsible for the Boston Tea Party. Racked with consumption almost all his life, Josiah could be laid low for days and nights, and it is remarkable that he undertook the work he did with such enthusiasm and industry. His friend Dr. Joseph Warren regularly visited him and nursed him back to health each time – Warren himself would later die a martyr's death at Bunker Hill. Josiah had a fraught relationship with his brother Samuel Quincy, who was appointed solicitor general by Hutchinson; even as the other children he grew up with were turning into patriots, Samuel remained a loyalist. He later fled to England, leaving a young wife and three children behind, never to return. In some sense his story is a tragic one because he was never completely won over to the Loyalist cause, but at the very least he should be faulted for not realizing which direction the winds were blowing and especially for abandoning his family.
Josiah took it upon himself to spread the cause of Boston and rally the other colonies. In 1773 he traveled alone to the South to wake up the Southern colonies and press home the oppression then being visited by the British on Boston through the Tea Act and, later, the blockade of the port. His brother Ned had died during a sea voyage and Josiah feared the same fate, but it did not come to pass. In 1774 he undertook an even more serious mission, traveling to England to try to quell misunderstandings between the parent and the child, trying to convince the prime minister, Lord North, and other high officials that Boston wanted to live in peace with England in spite of its rebellious spirit. But back home, his incendiary pamphlets and letters indicated that he was completely won over to the cause of rebellion, if not independence. When he found out that the king and Parliament had decided to tighten the screws even further on the colony (the machinations and misunderstandings in England are brilliantly described in Nick Bunker's "An Empire on the Edge"), he decided to return home in the spring of 1775 to alert his countrymen. Sadly, he fell prey to consumption on the voyage back. Sankovitch's account has convinced me that if Josiah had lived and been in good health, he would likely have surpassed both John Adams and John Hancock in his success, perhaps rising to the stature of Jefferson and certainly occupying high office. Sadly this was not to be. His wife Abigail bore him two children, but the girl died as a baby. The son, Josiah Quincy III, later became a prominent political leader, mayor of Boston and president of Harvard.
John Hancock, meanwhile, was performing a delicate balancing act. As perhaps the wealthiest and most prominent citizen of Boston, he had to associate with the governor and royal officials and was even given a commission as a colonel. But he still stood firm on the principles his friends were fighting for. Admirably, both he and John Adams turned down many tempting offers from the crown to occupy high office. When the colony's leaders signed a non-importation agreement to punish British trade, Hancock – who had made his fortune on trade with Britain – joined in. It was Hancock and the firebrand Sam Adams who later became the most prominent targets of the crown, Hancock commanding the Massachusetts militia and the minutemen who were soon to become famous. By 1775, when the first shots had been fired at Lexington and Concord, there was a price on Hancock's and Sam Adams's heads and they had to abandon Boston.
The last part of the book deals with the momentous summer of 1776 when the Declaration of Independence was signed. Abigail Adams had stood guard over the house in Braintree to protect it and her four children from both marauding British soldiers and the horrors of the epidemics, even as John was away for months at the Continental Congresses in Philadelphia, overseeing logistics and communicating with George Washington, who had immediately made his way to Cambridge as the new commander of the Continental Army. Sankovitch tells us how Abigail made a remarkable and brave effort to convince John to include the cause of women, the poor and black people in the new nation's founding ("Remember the ladies", she wrote); when John flippantly dismissed her admonitions as female ignorance, she wouldn't back down. Later, of course, Abigail became known as "Mrs. President" because of her strong and intelligent opinions as President Adams's wife.
Sadly, as is well known (and superbly documented by Danielle Allen in her book "Our Declaration"), a paragraph condemning slavery and King George's slave trade had been included by Jefferson in the original draft of the Declaration but had to be taken out to gain the Southern colonies' assent. Both John Hancock and John Adams, along with their wives, were utterly opposed to the institution, and it was Josiah Quincy who had first called it a "peculiar curse" (a forerunner of the more famous phrase "the peculiar institution"). John Hancock had his beloved aunt free all their slaves in her will. The summer of 1776 presented a signal opportunity to right the wrongs in both the country's past and its future, but it would not come to pass; the peculiar institution would only be eradicated in a horrifying and destructive war nearly a hundred years later, even as its informal effects persisted for another hundred. But they tried, these residents of small Braintree where all were as equal as was possible in those times, and where ministers and residents alike preached that you cannot succeed in your own estimation and in God's if you do not succeed in the estimation of your community.

Book review: Quantum mechanics and quantum mechanics. David Kaiser's "Quantum Legacies: Dispatches from an Uncertain World"

David Kaiser is a remarkable man. He has two PhDs from Harvard, one in physics and one in the history of science, and is a professor in the Science, Technology and Society department at MIT. He has written excellent books on the history of particle physics and the quirky personalities inhabiting this world. On top of it all he is a genuinely nice guy - he once wrote me a long email out of the blue, complimenting me on a review of his book "How the Hippies Saved Physics". And while his primary focus is the history and philosophy of physics, Kaiser still seems to find time for doing research in quantum entanglement.

What makes Kaiser unique is the attention he gives to what we might call the sociological aspects of physics: the physics job market, portrayals of physicists in the humanities, parallel threads of science and history, and perhaps most distinctively, the publications of physics – both the bread-and-butter textbooks that students use and the popular physics books written for laymen. It's this careful analysis of physics's sociological aspects that makes "Quantum Legacies" a delightful read, treading as it does on some of the under-explored aspects of physics. There are chapters on quantum indeterminacy and entanglement and on the lives of Schrödinger, Einstein and Dirac, a nice chapter on computing and von Neumann's computer, and interesting essays on the Large Hadron Collider, the tragic Superconducting Super Collider that was shelved in 1993, and the Higgs boson. All these are worth reading. But the real gem in the book as far as I am concerned is a set of three chapters on physics publishing; this is the kind of material you won't find in other books on the history and philosophy of physics.

The first chapter is about a book that fascinated me to no end while I was growing up – Fritjof Capra's "The Tao of Physics", which explored parallels between quantum physics and Eastern mysticism. This book, along with the downright dubious "aliens-visited-earth" literature by the Swiss writer Erich von Däniken, dotted my bedroom for a while until I grew up and discovered, in particular, that von Däniken was peddling nonsense. But Capra isn't that easy to dismiss, especially because, as Kaiser tells us, his book hit the market at a perfect time in 1975, when physicists had become disillusioned by the Vietnam War, the public had become disillusioned by physicists, and both groups had become smitten with the countercultural movement, Woodstock and Ravi Shankar. There could be no better time for a book exploring the ins and outs of both the bizarre world of quantum mechanics and the mystical world of Buddhism and the "Dance of Shiva" to become popular. Kaiser describes how Capra's book set the tone for many similar ones, and while most of the parallels described in it are fanciful, it did get the public interested in both quantum physics and Eastern philosophy – no small feat. Capra's own personal story – he comes to the United States from Vienna, has a hard time making ends meet, goes back, and then decides to write first a textbook and then a more unusual popular book based on his experiences in California and advice from the famed physicist Victor Weisskopf – is also quite interesting.

The second interesting chapter is about a textbook, albeit a highly idiosyncratic one, that is a household name to students of general relativity – a 1,200-page doorstop of a tome by Charles Misner, Kip Thorne and John Wheeler, all legendary physicists. "MTW", as the textbook became known, was a landmark event in physics publishing. It was the first major book to introduce advanced undergraduate and graduate students to fascinating concepts like time dilation, spacetime curvature and black holes. The joke about its size was that the book was not only *about* gravity but also *generated* gravity. But everything about it was highly unconventional and quirky: the typeface, the non-linear narrative and, most importantly, serious and advanced mathematical calculations interspersed with boxes containing cartoons, physicist biographies and outrageous speculations about wormholes and time travel. Most people didn't know what to make of it, and perhaps the best review came from the Indian-American astrophysicist Subrahmanyan Chandrasekhar, who said, "The book espouses almost a missionary zeal in preaching its message. I (probably for historical reasons) am allergic to missionaries." Nonetheless, "MTW" occupies pride of place in the history of physics textbooks, and a comparable one on sagging student shelves where it's probably more seen than read.

The last chapter, and perhaps the one I found most interesting, is about the content of the traditional quantum mechanics textbook – really a history of the quantum mechanics textbook in general. The first quantum mechanics textbooks in the United States came out in the 1940s and 50s. Many of them came out of the first modern school of theoretical physics in the country, founded by J. Robert Oppenheimer at the University of California, Berkeley. Two of Oppenheimer's students, David Bohm and Leonard Schiff, set the opposing tones for two different kinds of textbooks (I remember working through a bit of Schiff's book as an undergraduate). After the war Schiff taught at Stanford, Bohm at Princeton.

Bohm was old school and believed in teaching quantum mechanics as a subject fraught with fascinating paradoxes and philosophical speculations. His approach was very close in spirit to the raging debates of the original scientist-philosophers who had founded the revolutionary paradigm – Niels Bohr, Albert Einstein, Erwin Schrödinger and Werner Heisenberg in particular. Bohm, of course, had a very eventful life in which he was accused of being a Communist and hounded out of the country, after which he settled in England and became known for carrying out and publishing a set of philosophical dialogues with the Indian philosopher Jiddu Krishnamurti. His textbook is still in print and is worth reading, but it's worth noting that the Schrödinger equation is not even introduced until several chapters into the volume.

Schiff's book was different: a practical textbook that taught students how to solve problems, mirroring the "shut up and calculate" philosophy then taking root in American physics education. The Schrödinger equation was introduced on page 6. What Kaiser fascinatingly demonstrates, often through analysis of the original lecture notes from Bohm's and Schiff's classes, is that this attitude reflected both a mushrooming of physics students and a higher demand for physicists engendered by the Cold War and the military-industrial complex. Not surprisingly, when you had to turn out large numbers of competent physicists with jobs waiting for them in the nation's laboratories and universities, you had little time or patience to teach them the philosophical intricacies of the field. Shut up, calculate, and get out there and beat the Soviets became the mantra of the American physics establishment.

Fascinatingly, Kaiser finds that the philosophical and practical trends in physics textbook publishing wax and wane with the times: when the job market was good and enrollment was high, the practical school prevailed and textbooks reflected its preferences; when the pickings were slim, the job market tight and enrollment dropping, philosophical questions started making a comeback on tests and in textbooks. Especially after 1970, when the job market tanked, the Vietnam War disillusioned many aspiring physicists and the countercultural movement took off, philosophical speculation took off as well, finding a ready audience in books like Fritjof Capra's "The Tao of Physics". Perhaps the ultimate rejection of philosophy among physicists came during the second job slump in the early 90s, when many physicists left the world of particles and fields for the world of parties and heels on Wall Street.

Physics publishing, the physics market, the lives of physicists and physics theories have a strange and unpredictable entanglement of their own, one which even Einstein and Bohr might not have anticipated. Kaiser's book explores these well and brings a unique perspective to some of the most interesting aspects of a science that has governed men's lives, their education and their wallets.

What John von Neumann really did for modern computing


That John von Neumann was one of the supreme intellects humanity has produced should be a statement beyond dispute. Both the lightning-fast speed of his mind and the astonishing range of fields to which he made seminal contributions made him a legend in his own lifetime. When he died in 1957 at the young age of 53, it was a huge loss: the loss of a great mathematician, a great polymath and, to many, a great patriotic American who had done much to secure his country's advantage in cutting-edge weaponry.

Starting with pure mathematics – set and measure theory, rings of operators and the foundations of mathematics in the 1920s and early 30s – von Neumann moved on to mathematical topics closer to physics, like ergodic theory, Hilbert spaces and the foundations of quantum mechanics. He then moved into economics, writing "The Theory of Games and Economic Behavior" with Oskar Morgenstern, which laid the foundations of game theory (a first edition in good condition now sells for $12,500). During and after the war von Neumann became an almost completely applied mathematician and physicist. Perhaps the major reason for this transformation was his introduction to computing during a consulting stint in England in 1943. Even as nuclear weapons promised to completely change politics, science and international relations, he wrote in a letter to a friend at the end of the war, "I am thinking about something much more important than bombs; I am thinking about computers." In another puckish letter marking his move away from his traditional domain of pure mathematics, he said he was coming back from England a "better and impurer man".

During the war, von Neumann played a key role in developing the idea of implosion used in the plutonium bomb built by the Manhattan Project. He visited Los Alamos as a consultant from 1943 until the end of the war and lent his expertise specifically to the explosive "lenses" in the plutonium bomb, which shaped a converging shock wave that compressed the core and set off the fission reaction. These contributions drew on the valuable experience he had gained consulting on ballistics, shaped charges and shock waves at the Aberdeen Proving Ground in Maryland. During and after the war he turned his powerful mind to all kinds of defense-related research and became a major voice in the development of the hydrogen bomb and ICBMs; at one point he advised every US defense agency except the Coast Guard.

To the lay public and to engineers, von Neumann might be best known as one of the founders of modern computing, his name made ubiquitous through the von Neumann architecture that is taught to undergraduates in computer science and engineering. Interestingly, it is this distinction that is somewhat controversial, and it is also much more interesting than a naive analysis suggests. Extreme opinions are sometimes tossed about on both sides, so it's worth laying some of them to rest right away. Von Neumann did not "invent" the computer or computer science; the history of computing goes back much further, all the way to medieval times. He did not "invent" the stored-program concept, nor did he invent most of the major computing concepts we now take for granted, like RAM and flow control. He did not invent any important piece of hardware. But as William Aspray argues in his excellent and detailed, albeit somewhat staid and dry, book, von Neumann's true influence was far more subtle and, ironically, goes even further than his defenders imply. I am not sure even Aspray does a convincing job of emphasizing how far it went. Therefore, rather than embark on a detailed chapter-by-chapter analysis of the book, I want to drive home the two most important themes that emerge when we analyze von Neumann's role in the history of modern computing: the value of generalists and the power of abstraction.

An accidental introduction to computing

Von Neumann was introduced to computers in large part by accident. One important part of the introduction came from the "computers" – usually women working in an assembly-line system, performing repetitive calculations – who were used for bomb calculations at Los Alamos. The bomb calculations particularly drove home to him the importance of the non-linear phenomena involved in the complex radiation flow and hydrodynamics of a nuclear explosion, phenomena that were very hard to model by hand. Another introduction came from meeting scientists in England like Alan Turing and the engineers who were building some of the first computers in Manchester and elsewhere. Von Neumann had also seen the value of computing tables in his work on ballistics at the Aberdeen Proving Ground in Maryland. All these experiences drove home to him the importance of computational science in general.

But perhaps the most important event introducing von Neumann to computing was a chance encounter at a railway station in the summer of 1944 with Herman Goldstine, a mathematician attached to the ENIAC project at the University of Pennsylvania. Until then von Neumann did not know about this pioneering work, the first important computer project in the country. The ENIAC was not a true stored-program computer – its cables and connections had to be laboriously rewired for every new problem – but by the standards of the time it was quite advanced, and it is now considered the first general-purpose electronic computer, able to tackle a variety of problems. Unlike earlier machines built from electromechanical relays, the ENIAC used vacuum tubes, which made it fully electronic and a true forerunner of modern computers. The ENIAC was a labor of love, built by engineers whose names are sadly not as appreciated as von Neumann's but should be: J. Presper Eckert and John Mauchly, along with Goldstine, played foundational roles in its design and construction, and Julian Bigelow would later serve as chief engineer of von Neumann's Institute for Advanced Study machine.

The importance of von Neumann's influence

At this point it's sensible to say a word about the state of computing as a discipline at the time. It was generally looked down upon by mathematicians and physicists and regarded as the domain of drudge work. This is where the first of von Neumann's contributions came into play: his sheer influence, whose role cannot be overestimated. By the 1940s he was already considered one of the world's greatest mathematicians and polymaths, and his work in mathematics, physics and economics commanded the highest respect. In addition, the sheer speed of his thinking, which left even Nobel laureates feeling stumped, contributed to a kind of godlike perception of his abilities; Enrico Fermi once said that von Neumann made him feel like he knew no mathematics at all, and Hans Bethe mused whether von Neumann's mind indicated a higher species of human being. Von Neumann was also becoming a very valuable asset to the US government. All this meant that when von Neumann said something, you listened. People who question his importance to modern computing sometimes fail to appreciate that "importance" in a field is a combination of originality and influence. In influence there was none who surpassed von Neumann, so whatever he said about computing was taken seriously simply because he had said it.

Von Neumann the generalist

That von Neumann immediately became so influential in the ENIAC project attested to one of his signal qualities: his remarkable ability to quickly grasp a new field of inquiry and then leapfrog over even the field's founders to divine new results and insights. This was also a source of annoyance to some, since it meant that von Neumann could take their ideas and immediately run farther with them than they themselves could. More than anyone else, von Neumann could take the complete measure of a field – a thirty-thousand-foot view, if you will. This is where an even more important quality came into play: the polymath's ability to be a generalist. Most people who worked in computing then came from narrowly defined fields: the mathematicians didn't know much about engineering, and the engineers who specialized in vacuum tubes and electronics had little idea of the mathematical theory behind computing. Von Neumann was unique in having total command of all of mathematics and a good part of physics, and his work at Aberdeen and Los Alamos had also introduced him to key ideas in engineering. The missing link was the engineering work on the ENIAC, and once he understood this work, his generalist's mind quickly connected all the dots.

Von Neumann and the power of abstraction

Two other very important influences made von Neumann unique, and both shed light not just on his mind but on the power of abstraction. One was his reading of Alan Turing's famous 1936 paper on Turing machines, which laid the foundations of theoretical computer science. This, again, was a paper that would not have been read by engineers. When Turing spent time at Princeton before the war, von Neumann tried to recruit him as his assistant, but Turing chose instead to go back to England, where he became a key part of the government's cryptographic effort to break the German codes. Turing's paper proved very influential, and in fact von Neumann asked all the engineers working on the ENIAC, and later on the Institute for Advanced Study computer, to read it.

The second paper that was a major influence on von Neumann was a 1943 paper by Warren McCulloch and Walter Pitts, the first computational model of a neuron and the forerunner of today's neural networks. Von Neumann immediately grasped the connection between the McCulloch-Pitts model and the basis of computing. Again, this was not work familiar to engineers or even to other mathematicians interested in computing; it was only von Neumann's unique interests and abilities as a polymath that led him to read and appreciate it, and especially to appreciate the value of treating neurons, and computational elements in general, as generalized black boxes.

Both the Turing and the McCulloch-Pitts papers led von Neumann to achieve something genuinely his own. That something is a signature quality of mathematics and, to some extent, computer science, and it is what really makes those two fields as powerful as they have become: the power of abstraction. The beauty and strength of mathematics is that it can generalize from specific instances (or instantiations, as computer scientists like to say) to universal abstract frameworks. Physics shares this power to a considerable extent – the equation F = ma is independent of its specific instances and can equally describe an apple falling to earth, a planet revolving around the sun and two black holes colliding. But the language the equation is expressed in is mathematics, and it is mathematics that allows us to generalize in the first place.

Von Neumann's big achievement was to move away from vacuum tubes, wires, punch cards and magnetic core memory to a high-level view of computing, one that also led him to see parallels with the human brain. Basically, this view told him that any computational framework – biological or electronic – must have five basic components: an input, an output, an arithmetic unit, a control unit that orchestrates the execution of instructions, and a memory that stores data. Crucially, it also told him that both the instructions for doing something and the data those instructions operate on can be stored in the same place and in the same form. In the words of the historian George Dyson, von Neumann's insights "erased the distinction between numbers that mean something and numbers that do something." The stored program was not invented by von Neumann, but this abstract view of it did come from him, again thanks to his powers as a pure mathematician and generalist. These insights are the basis of today's von Neumann architecture, but the key enabler was an abstracted view that let von Neumann visualize the structure of the computer in its most general form – something his specialized contemporaries could not do.
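To make the abstraction concrete, here is a minimal toy sketch in Python – my own illustration, not von Neumann's EDVAC design – of a stored-program machine: instructions and data sit side by side in the same memory, and a simple fetch-decode-execute loop plays the role of the control unit.

```python
# A toy stored-program machine: one memory array holds both the program and its data.

def run(memory):
    """Execute a tiny program stored in `memory`; other cells of the same list serve as data."""
    pc = 0                      # program counter (control unit state)
    acc = 0                     # accumulator (a one-register arithmetic unit)
    while True:
        op, arg = memory[pc]    # fetch and decode the instruction the counter points to
        if op == "LOAD":        # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":       # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":     # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return acc
        pc += 1                 # advance to the next instruction

# Cells 0-3 hold instructions, cells 4-6 hold numbers; the program computes 2 + 3.
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
print(run(memory))   # -> 5
```

In a real machine the instructions would themselves be encoded as numbers, erasing even the superficial distinction the tuples preserve here – exactly the point of Dyson's remark about numbers that mean something and numbers that do something.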

A slight digression on this idea of the power of abstraction, since it's relevant to my own job. I am involved with a company that is trying to enable scientists to run experiments in biology and chemistry remotely in a "cloud lab", from the comfort of their homes and laptops. A key idea in doing this is to abstract away the gory details of all the hardware and equipment through a software platform that only exposes high-level functionality to scientists who aren't experts in engineering. But an even more desirable goal is to generalize workflows across biology and chemistry, so that instead of thinking of protocols specific to biology or chemistry, scientists will only think of generic protocols and generic sequences of steps like "move liquid", "stir" and "heat/cool". This is possible because at an abstract level, a container holding cells and a container holding a chemical compound, for instance, are the same from the point of view of software - they are objects on which you need to perform some operation. At an even more abstract level, they are patterns of bits being transformed into other patterns of bits; at this level, the words "biology" and "chemistry" become irrelevant.

The ultimate goal is thus to do away with field-specific instantiations of operations altogether and abstract them into generalized operations. I would like to think Johnny would have appreciated this.
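Here is a hypothetical sketch of what such an abstraction might look like in software; the class and operation names are illustrative inventions of mine, not any real platform's API.

```python
# A hypothetical "generic container" abstraction: the same verbs apply whether
# the vessel holds cells or a chemical compound. Names are invented for
# illustration and do not correspond to a real cloud-lab API.
from dataclasses import dataclass, field

@dataclass
class Container:
    label: str
    contents: str             # "HeLa cells" or "benzaldehyde" - the software doesn't care
    steps: list = field(default_factory=list)

    def move_liquid(self, volume_ml, destination):
        self.steps.append(f"move {volume_ml} mL from {self.label} to {destination}")
    def stir(self, minutes):
        self.steps.append(f"stir {self.label} for {minutes} min")
    def set_temperature(self, celsius):
        self.steps.append(f"hold {self.label} at {celsius} C")

culture = Container("plate-1", "HeLa cells")
reaction = Container("flask-3", "benzaldehyde")
for vessel in (culture, reaction):        # identical protocol, different fields
    vessel.set_temperature(37)
    vessel.stir(10)
print(culture.steps, reaction.steps, sep="\n")
```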

First Draft of a Report on the EDVAC (1945)

The result of this generalist knowledge was a seminal report called First Draft of a Report on the EDVAC that von Neumann wrote and circulated in 1945 and 1946. The EDVAC was supposed to be the ENIAC's successor and a true stored-program computer. The report laid out in detail what we now know as the von Neumann architecture and also explained key concepts like flow control, subroutines and memory implementation. Von Neumann was especially keen on subroutines, since they went a long way toward allowing specific instructions to be accessed on demand, a capability essential to stored-program computing. He also emphasized the importance of random access memory; the first random access memory hardware was the Williams tube, developed in 1946.
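In the same toy spirit as the sketch above (again my own illustration, not EDVAC code), here is why subroutines and flow control depend on instructions being addressable rather than read off in strict sequence: calling a subroutine just means saving a return address and jumping.

```python
# A subroutine is just another stretch of memory; CALL saves where to come
# back to and jumps there. Purely illustrative instruction set.
program = {
    0: ("CALL", 10), 1: ("CALL", 10), 2: ("HALT", None),   # main: call DOUBLE twice
    10: ("DOUBLE", None), 11: ("RETURN", None),            # subroutine at address 10
}
value, pc, return_addr = 3, 0, None
while True:
    op, arg = program[pc]
    if op == "CALL":     return_addr, pc = pc + 1, arg
    elif op == "RETURN": pc = return_addr
    elif op == "DOUBLE": value, pc = value * 2, pc + 1
    elif op == "HALT":   break
print(value)   # 12: the same stored subroutine was reused twice
```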

The EDVAC report has become controversial for two reasons. First, while it came out of many discussions that von Neumann had with the ENIAC engineers, especially Eckert and Mauchly, it carried only von Neumann's name. Second, the report led to a bitter patent dispute. Eckert and Mauchly wanted to start their own company designing computers based on patenting the work on the ENIAC. But once von Neumann circulated the report publicly, the knowledge was in the public domain and the patent issue became moot. Eckert and Mauchly were understandably bitter about this, but we have to credit von Neumann for being an early proponent of what we would now call the open-source ethos: he wanted the concepts of computing to be available to everyone. Appropriately enough, the EDVAC report became widely known to engineers and scientists across the United States and Europe and influenced the design of computers in many countries. It cemented von Neumann's reputation as one of the founders of modern computing, but it should always be remembered that while the generalist insights in that report came from von Neumann, they were built on a great deal of specific engineering and design work done by others.

Two early applications: Non-linear equations and meteorology

After working on the ENIAC and the EDVAC, von Neumann decided to apply all the knowledge and insights he had gained to building a computer at the Institute for Advanced Study (IAS) in Princeton, where he had been a member since 1933. This fascinating story has been told extremely well by George Dyson in his marvelous book "Turing's Cathedral", so it's not worth repeating here. But it is worth noting what von Neumann considered the two most important applications for the first computers. The first was the solution of non-linear equations. Von Neumann had become quite familiar with non-linear equations in the analysis of the complex hydrodynamics and radiation flow associated with nuclear explosions. He knew that non-linear equations are very hard to solve using traditional methods - analytical solutions are often impossible, and even numerical ones can be challenging - and realized that the fast, iterative techniques computers used would greatly aid the solution of these equations. Many of the early papers authored by von Neumann, Goldstine and Bigelow describe mathematical problems like the diagonalization of large matrices and the solution of non-linear partial differential equations. This early work drove home the great advantage and power of computing in a wide variety of fields where non-linear equations are important.
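A standard example of the kind of fast, iterative technique meant here - chosen for illustration, not one of von Neumann's own problems - is Newton's method: no closed-form answer is needed, the machine simply repeats a cheap step until the answer stops changing.

```python
# Newton's method for the non-linear equation x**3 - 2x - 5 = 0.
# A textbook example of iterative numerical solution, not a historical code.
def newton(f, df, x, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        step = f(x) / df(x)   # one cheap correction per iteration
        x -= step
        if abs(step) < tol:
            break
    return x

root = newton(lambda x: x**3 - 2*x - 5, lambda x: 3*x**2 - 2, x=2.0)
print(root)   # about 2.0945514815423265
```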

Von Neumann also realized that the atmosphere, with its complex movements of air and water, is a perfect example of non-linear phenomena. Events during the war like the Normandy landings had emphasized the importance of understanding the weather; von Neumann now thought that the computer would be the ideal tool for weather simulation. Most of the work in this area was done by scientists like Jule Charney and Carl-Gustaf Rossby, but von Neumann played a very influential role by co-authoring papers with them, organizing conferences, securing funding and generally spreading the word. His stature and eminence again went far in convincing the scientific community to work on applying computers to meteorology. Von Neumann also thought that controlling the weather would be easy, but this has proved to be a much harder goal, partly because of the complexity of the phenomena involved (including chaos) and partly for political reasons.
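The difficulty is easy to demonstrate with Edward Lorenz's 1963 convection model - a later distillation of exactly this kind of non-linear atmospheric behavior, not something from von Neumann's era. Two simulations that start a hair's breadth apart drift visibly within a few thousand steps, which is the chaos that frustrates long-range forecasting and weather control.

```python
# Euler integration of the Lorenz (1963) system with two nearly identical
# starting points; the tiny initial difference grows enormously.
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0/3.0):
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a, b = (1.0, 1.0, 1.0), (1.0 + 1e-8, 1.0, 1.0)   # nearly identical starts
for step in range(3000):
    a, b = lorenz_step(*a), lorenz_step(*b)
print(abs(a[0] - b[0]))   # no longer a tiny number
```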

Von Neumann's role as a founder of modern computer science

The Institute for Advanced Study computer had a memory of 5 kilobytes, less than what it takes to display a single icon on a screen today. And yet it achieved remarkable feats, simulating the workings of a hydrogen bomb (secretly, at night), simulating the weather and modeling the genetic growth of populations. It embodied all of von Neumann's salient concepts and was widely emulated around the country. The navy built a computer based on the IAS machine, and so did IBM and the RAND Corporation, whose machine was playfully named the JOHNNIAC. From these machines the gospel spread far and wide.

In his last few years von Neumann became even more interested in the parallels between the brain and the computer. His last major contribution was a detailed theory of self-reproducing automata which presaged important later developments in molecular biology and nanotechnology; a 1948 lecture at Caltech lays out components of self-reproducing organisms with error correction that are remarkably similar to DNA, RNA, ribosomes, proofreading enzymes and other genetic components that were discovered later. Once again, what made von Neumann's insights in this area possible was that he thought about these components in the most general, most abstract manner, without waiting for the biologists to catch up. In the 1950s he planned to move away from the IAS to either UCLA or MIT, where his interests in computing would find a better home and would be encouraged and funded. The history of science and technology could have been very different had this come to pass; unfortunately it did not. In 1955 von Neumann was diagnosed with cancer, and he passed away after a cruel and protracted illness in February 1957. Notes for a set of lectures, later published as the book "The Computer and the Brain", lay on his deathbed.
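A loose modern analogy - my own, and far simpler than von Neumann's actual construction - is a quine: a program split into a passive description and machinery that both interprets the description and copies it verbatim, the same division of labor von Neumann proposed for self-reproduction.

```python
# The string plays the role of the passive description; the final line both
# interprets it (builds the program text) and copies it verbatim. Strip these
# comment lines and the program prints an exact copy of its own source.
genome = 'genome = %r\nprint(genome %% genome)'
print(genome % genome)
```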

So was von Neumann one of the founders of modern computer science? As complicated, subtle and important as the details are, the overall answer has to be yes. This answer has less to do with his specific technical contributions and more to do with his sheer influence and his powers of generalization and abstraction. Von Neumann communicated the power of computers at a time when they were regarded as little more than turn-the-crank calculators. Because of his enormously wide-ranging interests he demonstrated their potential applications to a vast number of fields in pure and applied mathematics, meteorology, physics and biology. Most importantly, he came up with general ideas that serve as the foundation of so much of the computing we take for granted today. In other words, von Neumann more than anyone else made computing respectable, widely known and a basis of the modern life that everyone critically relies on. He was not the founder of computer science or the "inventor of the computer", but he was certainly one of its principal founders. And he achieved this status largely because of the advantage enjoyed by generalists over specialists and the power of abstraction - both good lessons for an age when specialization seems to be the norm.