
Book Review: "American Rebels", by Nina Sankovitch

I greatly enjoyed this excellent book on the intertwined lives and fortunes of three families from the little town of Braintree - the Adamses, the Hancocks and the Quincys. Nina Sankovitch has woven a wonderful tale of how these three families and the famous names they gave rise to were both spectators and participants in some of the most famous events in American history. The account is engaging throughout and kept me glued to the pages. I was aware of the general facts and characters, of course, but the book accomplished the valuable goal of introducing me to Josiah Quincy in particular, a name I had heard but knew little about.
Sankovitch's account begins in the 1740s, when she shows us how John Adams, John Hancock and Josiah Quincy grew up together in Braintree, along with other children like Abigail Quincy. She leads us with flair through well-known events from roughly 1765 to 1775 - years of turmoil during which all these men and women found the roles that history had created for them - and also sheds light on underappreciated events like the dysentery and smallpox epidemics that swept through Boston. The book portrays Braintree as a small town and a quintessential example of American egalitarianism, one where everyone was equal - from the distinguished Quincys and wealthy Hancocks to the Adamses, who came from yeoman farming stock. Today Braintree is simply the southern end of the "T" for most Bostonians.
All the boys and girls played together in the same town square and attended the same church, where their fathers were ministers. Abigail Adams came from the prominent Quincy family. Everyone had been drilled right from childhood in the values of both self-reliance and community service. The colony had already been enlightened about the evils of slavery, and unlike their Southern brethren many colonists did not own slaves. After John Hancock's father died, his wealthy uncle Thomas and aunt Lydia took him under their wing and spirited him away to Boston. There, in a grand mansion on Beacon Hill, Hancock grew up and took charge of the family's prosperous trading business. He soon became perhaps the most prominent citizen of Boston, certainly the wealthiest but also the most charitable. All spit and polish, he would throw dinner parties, give to the poor and somehow still avoid "entangling alliances" with the British, especially the much-hated Governor Thomas Hutchinson.
The real star of the story, however, is Josiah Quincy. A brilliant student at Harvard who raided the library while the others were drinking and playing cards (he knew almost all of Shakespeare by heart), he became a prominent lawyer who published letters promoting the liberty and property rights of the colonists in the "Boston Gazette" and the "Massachusetts Spy". His brilliance, eloquence and dedication to the cause of liberty and property rights all exceeded those of his compatriots, the two Johns. John Adams became truly prominent only after his defense of the British soldiers accused of perpetrating the Boston Massacre of 1770; before that the limelight seemed to belong to Hancock, Quincy and John's cousin Sam Adams, who headed the incendiary Sons of Liberty, the group responsible for the Boston Tea Party. Racked with consumption almost all his life, Josiah could be laid low for days and nights, and it is remarkable that he undertook the work he did with such enthusiasm and industry. His friend Dr. Joseph Warren regularly visited him and nursed him back to health every time; Warren himself was later martyred at Bunker Hill. Josiah had a fraught relationship with his brother Samuel Quincy, who was appointed solicitor general by Hutchinson; even while the other children with whom he had grown up were turning into patriots, Samuel remained a loyalist. Later he fled to England, leaving a young wife and three children behind, never to return. In some sense his story is a tragic one because he was never completely won over to the Loyalist cause, but at the very least he should be faulted for not realizing which way the winds were blowing, and especially for abandoning his family.
Josiah took it upon himself to spread the cause of Boston and rally the other colonies. In 1773 he traveled by himself to the South to wake up the Southern colonies and press home the oppression then being visited on Boston by the British through the Tea Act and, later, the blockade of the port of Boston. His brother Ned had died during a sea voyage and Josiah feared the same fate, but it did not come to pass. In 1774 he undertook an even more serious mission, traveling to England to try to quell misunderstandings between parent and child, hoping to convince the prime minister, Lord North, and other high officials that Boston wanted to live in peace with England in spite of its rebellious spirit. But back at home, his incendiary pamphlets and letters indicated that he was completely won over to the cause of rebellion, if not independence. When he found out that the king and Parliament had decided to tighten the screws even further on the colony (the machinations and misunderstandings in England are brilliantly described in Nick Bunker's "An Empire on the Edge"), he decided to go back home in the spring of 1775 to alert his countrymen. Sadly, he succumbed to consumption on the voyage back. Sankovitch's account has convinced me that if Josiah had lived and been in good health, he would likely have surpassed both John Adams and John Hancock in his success, perhaps rising to the stature of Jefferson and certainly occupying high office. Sadly, this was not to be. His wife Abigail bore him two children, but the girl died as a baby. The son, Josiah Quincy III, later became a prominent political leader, a mayor of Boston and a president of Harvard.
John Hancock, meanwhile, was performing a delicate balancing act. As perhaps the wealthiest and most prominent citizen of Boston, he had to associate with the governor and royal officials and was even given a commission as a colonel. But he still had to stand firm on the principles that his friends were fighting for. Admirably enough, both he and John Adams turned down many very tempting offers from the crown to occupy high office. When the colony's leaders signed a non-importation agreement to punish British trade, Hancock, who had made his fortune on trade with Britain, joined in. It was Hancock and the firebrand Sam Adams who later became the most prominent targets of the crown, Hancock commanding the Massachusetts militia and the minutemen who were soon to become famous. By 1775, when the first shots at Lexington and Concord had been fired, there was a price on Hancock's and Sam Adams's heads and they had to abandon Boston.
The last part of the book deals with the momentous summer of 1776 when the Declaration of Independence was signed. Abigail Adams had stood guard over the house in Braintree to protect it and her four children from both marauding British soldiers and the horrors of epidemic disease, even as John was away for months at the First and Second Continental Congresses in Philadelphia, overseeing logistics and communicating with George Washington, who had immediately made his way to Cambridge as the new commander of the Continental Army. Sankovitch tells us how Abigail made a remarkable and brave effort to convince John to include the cause of women, poor people and black people in the new nation's founding ("Remember the ladies", she wrote); when John flippantly dismissed her admonitions as female ignorance, she wouldn't back down. Later, of course, Abigail became known as "Mrs. President" because of her strong and intelligent opinions as President Adams's wife.
Sadly, as is well known (and superbly documented by Danielle Allen in her book "Our Declaration"), a paragraph condemning slavery and King George's slave trade had been included even by Jefferson, himself a slaveholder, in the original draft of the Declaration, but had to be taken out to keep the Southern colonies on board. Both John Hancock and John Adams, along with their wives, were utterly opposed to the institution, and it was Josiah Quincy who had first called it a "peculiar curse" (a forerunner of the more famous phrase "a peculiar institution"). John Hancock's beloved aunt freed all the family's slaves in her will. The summer of 1776 presented a signal opportunity to right the wrongs in both the country's past and its future, but it was not to be; the peculiar institution would be eradicated only in a horrifying and destructive war nearly a century later, even as its informal effects persisted for another hundred years. But they tried, these residents of small Braintree, where all were as equal as was possible in those times, and where ministers and residents alike preached the message that you cannot succeed in your own estimation and in that of God if you do not succeed in the estimation of your community.

Book review: Quantum mechanics and quantum mechanics. David Kaiser's "Quantum Legacies: Dispatches from an Uncertain World"

David Kaiser is a remarkable man. He has two PhDs from Harvard, one in physics and one in the history of science, and is a professor in the Science, Technology and Society department at MIT. He has written excellent books on the history of particle physics and the quirky personalities inhabiting this world. On top of it all he is a genuinely nice guy - he once wrote me a long email out of the blue, complimenting me on a review of his book "How the Hippies Saved Physics". And while his primary focus is the history and philosophy of physics, Kaiser still seems to find time for doing research in quantum entanglement.

What makes Kaiser unique is the attention he gives to what we might call the sociological aspects of physics: the physics job market, portrayals of physicists in the humanities, parallel threads of science and history, and perhaps most distinctively, the publications of physics - both the bread-and-butter textbooks that students use and the popular physics books written for laymen. It's this careful analysis of physics's sociological aspects that makes "Quantum Legacies" a delightful read, treading as it does on some of the under-explored corners of the field. There are chapters on quantum indeterminacy and entanglement and the lives of Schrödinger, Einstein and Dirac, a nice chapter on computing and von Neumann's computer, and interesting essays on the Higgs boson, the Large Hadron Collider and the tragic Superconducting Supercollider, which was shelved in 1993. All these are worth reading. But the real gem in the book, as far as I am concerned, is a collection of three chapters on physics publishing; this is the kind of material that you won't find in other books on the history and philosophy of physics.

The first chapter is about a book that fascinated me to no end while I was growing up - Fritjof Capra's "The Tao of Physics", which explored parallels between quantum physics and Eastern mysticism. This book, along with the downright dubious "aliens-visited-earth" literature by the Swiss writer Erich von Däniken, dotted my bedroom for a while until I grew up and discovered, in particular, that von Däniken was peddling nonsense. But Capra isn't that easy to dismiss, especially since, as Kaiser tells us, his book hit the market at the perfect time in 1975, when physicists had become disillusioned by the Vietnam War, the public had become disillusioned by physicists, and both groups had become smitten with the countercultural movement, Woodstock and Ravi Shankar. There could be no better time for a book exploring the ins and outs of both the bizarre world of quantum mechanics and the mystical world of Buddhism and the "Dance of Shiva" to become popular. Kaiser describes how Capra's book set the tone for many similar ones, and while most of the parallels described in it are fanciful, it did get the public interested in both quantum physics and Eastern philosophy - no small feat. Capra's own personal story is also quite interesting: he comes to the United States from Vienna, has a hard time making ends meet, goes back, and then decides to write first a textbook and then a singular popular book based on his experiences in California and advice from the famed physicist Victor Weisskopf.

The second interesting chapter is about a textbook, albeit a highly idiosyncratic one, that is a household name to students of general relativity - a 1200-page doorstop of a tome by Charles Misner, Kip Thorne and John Wheeler, all legendary physicists. "MTW", as the textbook became known, was a kind of landmark event in physics publishing. It was the first major book to introduce advanced undergraduate and graduate students to fascinating concepts like time dilation, spacetime curvature and black holes. The joke about its size was that the book was not only *about* gravity but also *generated* gravity. Everything about the book was highly unconventional and quirky, including the typeface, the non-linear narrative and, most importantly, serious and advanced mathematical calculations interspersed with boxes containing cartoons, physicist biographies and outrageous speculations about wormholes and time travel. Most people didn't know what to make of it, and perhaps the best review came from the Indian-American astrophysicist Subrahmanyan Chandrasekhar, who said, "The book espouses almost a missionary zeal in preaching its message. I (probably for historical reasons) am allergic to missionaries." Nonetheless, "MTW" occupies pride of place in the history of physics textbooks, and a comparable one on sagging student shelves, where it's probably more seen than read.

The last chapter, and perhaps the one I found most interesting, is about the content of the traditional quantum mechanics textbook - really a history of the quantum mechanics textbook in general. The first quantum mechanics textbooks in the United States came out in the 1940s and 50s. Many of them came out of the first modern school of theoretical physics in the country, founded by J. Robert Oppenheimer at the University of California, Berkeley. Two of Oppenheimer's students, David Bohm and Leonard Schiff, set the opposing tones for two different kinds of textbooks (I remember working through a bit of Schiff's book as an undergraduate). After the war Schiff taught at Stanford, Bohm at Princeton.

Bohm was old school and believed in teaching quantum mechanics as a subject fraught with fascinating paradoxes and philosophical speculations. His approach was very close in spirit to the raging debates of the original scientist-philosophers who had founded the revolutionary paradigm - Niels Bohr, Albert Einstein, Erwin Schrödinger and Werner Heisenberg in particular. Bohm of course had a very eventful life in which he was accused of being a Communist and hounded out of the country, after which he settled in England and became known for carrying out and publishing a set of philosophical dialogues with the Indian philosopher Jiddu Krishnamurti. His textbook is still in print and worth reading, but notably, the Schrödinger equation is not even introduced until several chapters into the volume.

Schiff's book was different: a practical textbook that taught students how to solve problems, mirroring a philosophy called "shut up and calculate" that was then taking root in American higher physics education. The Schrödinger equation was introduced on page 6. What Kaiser fascinatingly demonstrates, often through analysis of the original lecture notes from Bohm's and Schiff's classes, is that this attitude reflected both a mushrooming in the number of physics students and a higher demand for physicists engendered by the Cold War and the military-industrial complex. Not surprisingly, when you had to turn out large numbers of competent physicists with jobs waiting for them in the nation's laboratories and universities, you had little time or patience to teach them the philosophical intricacies of the field. Shut up, calculate, and get out there and beat the Soviets became the mantra of the American physics establishment.

Fascinatingly, Kaiser finds that the philosophical and practical trends in physics textbook publishing wax and wane with the times. When the job market was good and enrollment was high, the practical school prevailed and textbooks reflected its preferences; when the pickings were slim, the job market tight and enrollment drastically lower, philosophical questions started making a comeback on tests and in textbooks. Especially after 1970, when the job market tanked, the Vietnam War disillusioned many aspiring physicists and the countercultural movement took off, philosophical speculation took off as well, dovetailing with books like Fritjof Capra's "The Tao of Physics". Perhaps the ultimate rejection of philosophy among physicists came during the second job slump in the early 90s, when many physicists left the world of particles and fields for the world of parties and heels on Wall Street.

Physics publishing, the physics job market, the lives of physicists and physics theories have a strange and unpredictable entanglement of their own, one which even Einstein and Bohr might not have anticipated. Kaiser's book explores these connections well and brings a unique perspective to some of the most interesting aspects of a science that has governed men's lives, their education and their wallets.

What John von Neumann really did for modern computing


That John von Neumann was one of the supreme intellects humanity has produced should be a statement beyond dispute. Both the lightning-fast speed of his mind and the astonishing range of fields to which he made seminal contributions made him a legend in his own lifetime. When he died in 1957 at the young age of 53 it was a huge loss: the loss of a great mathematician, a great polymath and, to many, a great patriotic American who had done much to improve his country's advantage in cutting-edge weaponry.

Starting with pure mathematics - set and measure theory, rings of operators, the foundations of mathematics in the 1920s and early 30s - von Neumann moved to mathematical topics closer to physics, like ergodic theory, Hilbert spaces and the foundations of quantum mechanics. He then moved into economics, writing "The Theory of Games and Economic Behavior" with Oskar Morgenstern, which laid the foundations of game theory (a first edition in good condition now sells for $12,500). Von Neumann contributed to many other fields in major and minor ways. During and after the war he turned his powerful mind to defense-related research and became a major voice in the development of the hydrogen bomb and ICBMs; at one point he advised every US defense agency except the Coast Guard. He played a key role in developing the idea of implosion used in the plutonium bomb during the Manhattan Project and consulted widely on ballistics and shock waves. After the war von Neumann turned almost completely to applied mathematics. Perhaps the major reason for this transformation was his introduction to computing during a consulting stint in England in 1943. Even as nuclear weapons promised to completely change politics, science and international relations, he wrote in a letter to a friend at the end of the war, "I am thinking about something much more important than bombs; I am thinking about computers." In another letter that signaled his move away from his traditional domain of pure mathematics, he said he was coming back from England a "better and impurer man".

To the lay public and to engineers, von Neumann might be best known as one of the founders of modern computing, his name made ubiquitous through the von Neumann architecture of computers that is taught to undergraduates in computer science and engineering. Interestingly, it is this distinction that is somewhat controversial, and also much more interesting than it seems from a naive analysis. On both sides one sometimes sees extreme opinions tossed about, so it's worth laying some of them to rest right away. Von Neumann did not "invent" the computer or computer science; the history of computing goes back much further, all the way to medieval times. He did not "invent" the stored-program concept, nor did he invent most of the major computing concepts that we now take for granted, like RAM and flow control. He did not invent any important piece of hardware. But as William Aspray argues in his excellent and detailed, if somewhat staid and dry, book, von Neumann's true influence was far more subtle and in fact, ironically, goes even further than what his defenders imply. I am not sure even Aspray does a convincing job of emphasizing how far it went. Therefore, while I will not embark on a detailed chapter-by-chapter analysis of the book here, what I want to do is drive home the two most important concepts that emerge when we analyze von Neumann's role in the history of modern computing - the value of generalists and the power of abstraction.

An accidental introduction to computing

Von Neumann was introduced to computing in large part by accident. An important part of the introduction came from the "computers" - usually women doing repetitive calculations in an assembly-line kind of system - who were used for bomb calculations at Los Alamos. Another came from meeting scientists in England like Alan Turing and the engineers who were building some of the first computing machinery in Manchester and other places. Von Neumann had also seen the value of computing ballistic firing tables in his work at the Aberdeen Proving Ground in Maryland. All these experiences drove home to him the importance of computational science in general.

But perhaps the most important event that introduced von Neumann to computing was a chance encounter at a railway station in the summer of 1944 with Herman Goldstine, a mathematician who had been working on the ENIAC computer at the University of Pennsylvania. Until then von Neumann did not know about this pioneering work, the first important computer project in the country. The ENIAC was not a stored-program computer, so its cables and connections had to be laboriously rewired to solve every new problem, but by the standards of the times it was quite advanced, and it is now considered the first general-purpose electronic computer, able to tackle a variety of problems. Unlike earlier machines which used electromechanical relays, the ENIAC used vacuum tubes, which importantly made it fully electronic and a forerunner of modern computers. The ENIAC had been a labor of love, built by engineers whose names are sadly not as appreciated as von Neumann's but should be; along with Goldstine, J. Presper Eckert and John Mauchly played foundational roles in its design and construction.

The importance of von Neumann's influence

At this point it's sensible to say a word about the state of computing at the time. As a field it was generally looked down upon by mathematicians and physicists and regarded as the domain of drudge work. This is where the first of von Neumann's contributions came into play: his sheer influence, whose role cannot be overestimated. By the 1940s he was already considered one of the world's greatest mathematicians and polymaths, and his work in mathematics, physics and economics commanded the highest respect. In addition, the sheer speed of his thinking, which left even Nobel laureates feeling stumped, contributed to a kind of godlike perception of his abilities; Enrico Fermi once said that von Neumann made him feel like he knew no mathematics at all, and Hans Bethe once mused whether von Neumann's mind indicated a higher species of human being. Von Neumann was also becoming a very valuable asset to the US government. All this meant that when von Neumann said something, you listened. People who question his importance to modern computing sometimes don't appreciate that "importance" in a field is a combination of originality and influence. In terms of influence there was no one who surpassed von Neumann, so whatever he said about computing was taken seriously simply because he had said it.

Von Neumann the generalist

The reason von Neumann immediately became so influential in the ENIAC project attests to one of his signal qualities - his remarkable ability to quickly grasp a new field of inquiry and then leapfrog over even the field's founders to divine new results and insights. This was also a source of annoyance to some, since it meant that von Neumann could take their ideas and immediately run farther with them than they themselves could. More than anyone else, von Neumann could take the complete measure of a field - a thirty-thousand-foot view, if you will. This is where an even more important quality of his came into play: the polymath's ability to be a generalist. Most people who worked in computing then came from narrowly defined fields: the mathematicians didn't know much about engineering, and the engineers who specialized in vacuum tubes and electronics had little idea of the mathematical theory behind computing. Von Neumann was unique in having total command of all of mathematics and a good part of physics, and his work at Aberdeen and Los Alamos had also introduced him to key ideas in engineering. The missing link was the engineering work on the ENIAC, and when he understood this work, his generalist's mind quickly connected all the dots.

Von Neumann and the power of abstraction

Two other very important facts contributed to making von Neumann unique, and both of them shed light not just on his mind but on the power of abstraction. One was his reading of Alan Turing's famous 1936 paper on Turing machines, which laid the foundations of theoretical computer science. This was, again, a paper that would not have been read by engineers. When Turing was at Princeton just before the war, von Neumann tried to recruit him as his assistant, but Turing instead chose to go back and become a key part of the British government's cryptographic effort to break the German codes. Turing's paper proved very influential, and in fact von Neumann asked all the engineers working on the ENIAC, and later on the Institute for Advanced Study computer, to read it.

The second paper that was a major influence on von Neumann was a 1943 paper by Warren McCulloch and Walter Pitts that presented the first computational model of a neuron and was the forerunner of today's neural networks. Von Neumann immediately grasped the similarity between the McCulloch-Pitts model and the basis of computing. Again, this was not work familiar to engineers or even to other mathematicians interested in computing, and it was only von Neumann's unique interests and abilities as a polymath that led him to read and appreciate it - and especially to appreciate the value of treating neurons, and computational elements in general, as generalized black boxes.

Both the Turing and the McCulloch-Pitts papers led von Neumann to achieve something that was genuinely unique and can be stamped with his name. That something is a signature quality of mathematics and, to some extent, computer science, and it is what really makes those two fields as powerful as they have become: the power of abstraction. The beauty and strength of mathematics is that it can generalize from specific instances (or instantiations, as computer scientists like to say) to universal abstract frameworks. Physics shares this power to a considerable extent - for instance, the equation F=ma is independent of its specific instances and can equally describe an apple falling to the earth, a planet revolving around the sun and two black holes colliding. But the language the equation is expressed in is mathematics, and it is mathematics that allows us to generalize in the first place.

Von Neumann's big achievement was being able to move away from vacuum tubes, wires, punch cards and magnetic-core memory to a high-level view of computing, one that also led him to see parallels with the human brain. Basically, this view told him that any computational framework - biological or electronic - must have five basic components: an input, an output, an arithmetic unit that manipulates data, a control unit that orchestrates operations, and a memory that stores data. Crucially, it also told him that both the instructions for doing something and the thing that is done can be stored in the same place and in the same form. In the words of the historian George Dyson, von Neumann's insights "erased the distinction between numbers that mean something and numbers that do something." The stored program was not invented by von Neumann, but this abstract view of the stored program did come from him, again thanks to his powers as a pure mathematician and generalist. These two signal insights are the basis of today's von Neumann architecture, but the key idea enabling them was an abstracted view that let von Neumann visualize the structure of the computer in its most general form, something his specialized contemporaries could not do.
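To make this concrete, here is a minimal sketch of the stored-program idea in Python. The machine and its four-instruction set are invented purely for illustration - they correspond to no historical design - but they show the essential point: instructions and data sit in the same memory, and only convention tells them apart.

```python
# A toy stored-program machine. Instructions and data live in one memory,
# the defining feature of the von Neumann architecture. The instruction
# set (LOAD/ADD/STORE/HALT) is made up for illustration.

def run(memory):
    acc = 0   # accumulator (the "arithmetic unit")
    pc = 0    # program counter (the "control unit")
    while True:
        op, arg = memory[pc]      # fetch the next instruction from memory
        pc += 1
        if op == "LOAD":          # copy a memory cell into the accumulator
            acc = memory[arg]
        elif op == "ADD":         # add a memory cell to the accumulator
            acc += memory[arg]
        elif op == "STORE":       # write the accumulator back to memory
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Cells 0-3 hold instructions, cells 4-6 hold numbers; nothing but
# convention distinguishes the two kinds of memory contents.
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", None), 2, 3, 0]
print(run(memory)[6])  # prints 5
```

Change the cells holding numbers and the same program computes something else; change the cells holding instructions and it becomes a different program - Dyson's "numbers that mean something" and "numbers that do something" in seven cells of memory.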

A slight digression on this idea of the power of abstraction, since it's relevant to my own job. I am involved with a company which is trying to enable scientists to run experiments in biology and chemistry remotely in a "cloud lab", from the comfort of their homes and laptops. A key idea in doing this is to abstract away the gory details of all the hardware and equipment through a software platform that exposes only high-level functionality to scientists who aren't experts in engineering. But an even more desirable goal is to generalize workflows across biology and chemistry, so that instead of thinking of protocols specific to biology or chemistry, scientists will think only of generic protocols and generic sequences of steps like "move liquid", "stir" and "heat/cool". This is possible because at an abstract level, a container holding cells and a container holding a chemical compound, for instance, are the same from the point of view of software - they are objects on which you need to perform some operation. At an even more abstract level, they are binary bits of code which change into other binary bits of code; at this level, the words "biology" and "chemistry" become irrelevant.
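As a purely hypothetical sketch (my own illustration, not the platform's actual API), the abstraction might look like this in code: a generic container type and generic operations that neither know nor care which science they are serving.

```python
# Hypothetical sketch of field-agnostic lab operations - illustrative only.
from dataclasses import dataclass

@dataclass
class Container:
    label: str
    volume_ml: float
    temperature_c: float = 25.0

def transfer(source: Container, dest: Container, volume_ml: float) -> None:
    """A generic 'move liquid' step, identical for biology and chemistry."""
    source.volume_ml -= volume_ml
    dest.volume_ml += volume_ml

def heat(c: Container, target_c: float) -> None:
    """A generic 'heat/cool' step."""
    c.temperature_c = target_c

# The same protocol runs unchanged on a cell culture or on a reagent:
culture = Container("HeLa cells in medium", volume_ml=10.0)
buffer_ = Container("phosphate buffer", volume_ml=50.0)
transfer(buffer_, culture, 5.0)
heat(culture, 37.0)
```

Nothing in `transfer` or `heat` mentions biology or chemistry; the container is just an object that operations act upon.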

The ultimate goal is thus to do away with the distinction between specific instantiations of operations in specific fields, abstracting them into generalized operations. I would like to think Johnny would have appreciated this.

First Draft of a Report on the EDVAC (1945)

The result of this generalist knowledge was a seminal report called "First Draft of a Report on the EDVAC" that von Neumann wrote and circulated in 1945 and 1946. The EDVAC was to be the ENIAC's successor and a true stored-program computer. The report laid out in detail what we now know as the von Neumann architecture and also explained key concepts like flow control, subroutines and memory implementation. Von Neumann was especially keen on subroutines, since they allowed stored sequences of instructions to be invoked on demand and so went a long way toward making stored-program computing practical. He also emphasized the importance of random-access memory; the first random-access memory hardware was the Williams tube, invented in 1946.

The EDVAC report became controversial for two reasons. First, while it came out of many discussions that von Neumann had with the ENIAC engineers, especially Eckert and Mauchly, it bore only von Neumann's name. Second, the report led to a bitter patent dispute. Eckert and Mauchly wanted to start their own company designing computers based on patents from the ENIAC work, but once von Neumann circulated the report publicly the knowledge was in the public domain and the patent issue became moot. Eckert and Mauchly were understandably bitter about this, but we can credit von Neumann with being an early proponent of what we would now call the open-source ethos; he wanted concepts from computing to be available to everyone. Appropriately enough, the EDVAC report became widely known to engineers and scientists across the United States and Europe and influenced the design of computers in many countries. It cemented von Neumann's reputation as one of the founders of modern computing, but it should always be remembered that while the generalist insights in the report came from von Neumann, they were based on a great deal of specific engineering and design work done by others.

Two early applications: Non-linear equations and meteorology

After working on the ENIAC and the EDVAC, von Neumann decided to apply all the knowledge and insights he had gained to building a computer at the Institute for Advanced Study (IAS) in Princeton, where he had been a professor since 1933. This fascinating story has been told extremely well by George Dyson in his book "Turing's Cathedral", so it's not worth repeating here. But it is worth noting the two most important applications von Neumann envisaged for the first computers. The first was the solution of non-linear equations. Von Neumann had become quite familiar with non-linear equations in the analysis of the complex hydrodynamics and radiation flow associated with nuclear explosions. He knew that non-linear equations are very hard to solve using traditional analytical methods, and he realized that fast, iterative numerical techniques running on computers would greatly aid their solution. Many of the early papers authored by von Neumann, Goldstine and their collaborators describe mathematical problems like the diagonalization of large matrices and the solution of non-linear partial differential equations. This early work drove home the great advantage and power of computing in the wide variety of fields where non-linear equations are important.
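As a small illustration of the kind of iterative technique meant here (my example; neither the book nor those early papers single out this particular method), Newton's method solves a non-linear equation by repeatedly refining a guess - exactly the sort of fast, repetitive arithmetic that is tedious by hand and trivial for a machine.

```python
# Newton's method for a non-linear equation, here x**3 - 2*x - 5 = 0.
# Iterate x <- x - f(x)/f'(x) until the correction becomes negligible.

def newton(f, df, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)   # the Newton correction
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("did not converge")

root = newton(lambda x: x**3 - 2*x - 5, lambda x: 3*x**2 - 2, x0=2.0)
print(root)  # ~2.0945514815423265
```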

Von Neumann also realized that the atmosphere, with its complex movements of air and water, is a perfect example of non-linear phenomena. Events during the war like the Normandy landings had emphasized the importance of understanding the weather; von Neumann now thought that the computer would be the ideal tool for weather simulation. Most of the work in this area was done by scientists like Jule Charney and Carl-Gustaf Rossby, but von Neumann played a very influential role by co-authoring papers with them, organizing conferences, securing funding and generally spreading the word. His stature and eminence again went far in convincing the scientific community to apply computers to meteorology. Von Neumann also thought that controlling the weather would be easy, but this has proved a much harder goal, partly because of the complexity of the phenomena involved (including chaos) and partly for political reasons.

Von Neumann's role as a founder of modern computer science

The Institute for Advanced Study computer had a memory of 5 kilobytes, less than what it takes to display a single icon today. And yet it achieved remarkable feats, simulating the workings of a hydrogen bomb (secretly, at night), simulating the weather and modeling the evolution of populations. It embodied all of von Neumann's salient concepts and was widely emulated around the country. The Navy built a computer based on the IAS machine, and so did IBM and the RAND Corporation, whose machine was playfully named the JOHNNIAC. From these machines the gospel spread far and wide.

In his last few years von Neumann became even more interested in the parallels between the brain and the computer. His last major contribution was a detailed theory of self-reproducing automata which presaged important later developments in molecular biology and nanotechnology; a set of lectures he gave at Caltech in 1948 lays out components of self-reproducing organisms, including error correction, that are remarkably similar to DNA, RNA, ribosomes, proofreading enzymes and other genetic components that were later discovered. Once again, what made von Neumann's insights in this area possible was that he thought about these components in the most general, most abstract manner, without waiting for the biologists to catch up. In the 1950s he planned to move from the IAS to either UCLA or MIT, where his interests in computing would find a better home and would be encouraged and funded. The history of science and technology could have been very different had this come to pass. Unfortunately von Neumann was diagnosed with cancer in 1955, and he passed away after a cruel and protracted illness in February 1957. Notes for a set of lectures, later published as a book, lay on his deathbed.
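A coda on those 1948 lectures: von Neumann's logical core - a description of the machine, kept separate from the machinery that copies the description - survives in miniature in a programming curiosity called a quine. The analogy below is mine, not anything from his lectures.

```python
# A program that reproduces its own source. The string s is the
# "genome" (a description of the program); the print statement is the
# "ribosome" (machinery that copies the description). Comments aside,
# running these two lines prints exactly these two lines.
s = 's = %r\nprint(s %% s)'
print(s % s)
```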

So was von Neumann one of the founders of modern computer science? As complicated, subtle and important as the details are, the overall answer has to be yes. This answer has little to do with his technical contributions and everything to do with his sheer influence and his powers of generalization and abstraction. Von Neumann communicated the power of computers at a time when they were regarded as little more than turn-the-crank calculators. Because of his enormously wide-ranging interests he demonstrated their potential applications to a vast number of fields in pure and applied mathematics, meteorology, physics and biology. Most importantly, he came up with the general ideas that serve as the foundation of so much of the computing we take for granted today. In other words, von Neumann more than anyone else made computing respectable, widely known and the basis of the modern life that everyone critically relies on today. He is not the founder of computer science, but he is certainly one of its principal founders. And he achieved this status largely because of the advantage generalists enjoy over specialists and the power of abstraction - both good lessons for an age in which specialization seems to be the norm.

The last great contrarian?


Freeman Dyson, photographed in 2013 in his office by the author
On February 28th this year, the world lost a remarkable scientist, thinker, writer and humanist, and many of us also lost a beloved, generous mentor and friend. Freeman Dyson was one of the last greats from the age of Einstein and Dirac who shaped our understanding of the physical universe in the language of mathematics. But what truly made him unique was his ability to bridge C. P. Snow’s two cultures with aplomb, with one foot firmly planted in the world of hard science and the other in the world of history, poetry and letters. Men like him come along very rarely indeed, and we are unfathomably poorer for his absence.
The world at large, however, knew Dyson not only as a leading scientist but as a “contrarian”. He didn’t like the word himself; he preferred to think of himself as a rebel. One of his best essays is called “The Scientist as Rebel”. In it he wrote, “Science is an alliance of free spirits in all cultures rebelling against the local tyranny that each culture imposes on its children.” The essay describes pioneers like Kurt Gödel, Albert Einstein, Robert Oppenheimer and Francis Crick who cast aside the chains of conventional wisdom, challenging beliefs and systems that were sometimes age-old, beliefs both scientific and social. Dyson could count himself as a member of this pantheon.
Although Dyson did not like to think of himself as particularly controversial, he was quite certainly a very unconventional thinker and someone who liked to go against the grain. His friend and fellow physicist Steven Weinberg said that when consensus was forming like ice on a surface, Dyson would start chipping away at it. In a roomful of nodding heads, he would be the one who would have his hand raised, asking counterfactual questions and pointing out where the logic was weak, where the evidence was lacking. And he did this without a trace of one-upmanship or wanting to put anyone down, with genuine curiosity, playfulness and warmth. His favorite motto was the founding motto of the Royal Society: “Nullius in verba”, or “Nobody’s word is final”.
Many chapters from Dyson’s own life illustrate this spirit of rebellion; some of it was by choice, some by necessity. First was his rebellion against pure mathematics. Dyson had learnt mathematics from G. H. Hardy, a man for whom the purity of mathematics was so sacred that, in a short and beautifully written book (“A Mathematician’s Apology”), he said he was actually proud of working on things that had absolutely no practical use. Dyson learnt mathematics from Hardy during the cold, dark days of the Second World War, when most of the students at Cambridge University had left to fight and Hardy taught a handful of students, including Dyson, in his rooms. Decades later Dyson could remember the tired, shrunken figure of Hardy lecturing from his chair, with the 17-year-old occasionally feeling like giving the old man a hug. Hardy instilled in Dyson a genuine love for the beauty of mathematics, and while Dyson worked on an astonishing variety of pure and applied problems during the rest of his life, his first love never left him and he kept coming back to it; when I attended his 90th birthday celebrations six years ago, about a third of the talks were about his occasional escapes into mathematics and the flowers they sprouted in other people’s gardens.
Around the end of the war, Dyson was taking a walk with a friend of his, the Indian mathematician Harish-Chandra, who later became his colleague at the Institute for Advanced Study in Princeton. Harish-Chandra had started as a physicist, studying under Paul Dirac. Dyson had already gotten a taste of applied mathematics, working for Bomber Command, where he used statistics to figure out the most effective way to bomb Germany into surrender. “I am planning to leave physics for mathematics”, said Harish-Chandra. “I find physics to be unrigorous, messy, elusive.” “Interesting”, replied Dyson. “I am planning to leave mathematics for physics for exactly the same reasons.” Dyson was farseeing; while mathematics would continue to be a rich source of discoveries, he could see that physics was then becoming the most promising discipline. The atomic bomb, which as far as most people were concerned had won the war, sealed the decision in his mind to become a physicist, and told him that physics and America were where the action was.
But Dyson was not even the most unconventional figure in this regard. Right after the war he ran into Francis Crick, who had spent a depressing time during the war working on magnetic mines for the British admiralty. Crick was just then planning to switch to biology and join the famous Cavendish Laboratory, where researchers under the physicist Lawrence Bragg were working on deciphering the structures of proteins. Bragg himself was one of the pioneers of x-ray crystallography and had decided to change the Cavendish’s direction, from its early pioneering work in physics under J. J. Thomson and Ernest Rutherford to new research in biology. When they met, Crick explained his decision to switch to biology; Dyson told him biology was interesting but that it was too early to be able to make major contributions to it. As Dyson recalled, he was happy that Crick disregarded his advice and went on to become perhaps the most important biologist of the 20th century, not only cracking the structure of DNA but deciphering the genetic code and influencing an entire generation of molecular biologists. I tell these stories to make the point that while Dyson was certainly an unconventional mind himself, it helped quite a bit that he was surrounded by equally unconventional and odd minds during his formative years as a student, people who switched fields with impunity and worked on completely new things. One might in fact argue that Cambridge was then a breeding ground for contrarians; in the next decade or two it would house minds like Thomas Gold, Fred Hoyle, Roger Penrose and Stephen Hawking, all known for the audacity and unconventional nature of their ideas.
This spirit of rebellion stayed with Dyson as he moved to the United States and started his apprenticeship under the tutelage of Hans Bethe, the legendary physicist who won a Nobel Prize for working out how the stars shine and who led the theoretical physics division of the Manhattan Project. At Cornell, where Bethe held court, Dyson met an even freer spirit, Richard Feynman. Dyson’s great mathematical skills were ideally suited to understanding the new realm of quantum electrodynamics which Feynman, the wunderkind Julian Schwinger and others were trying to work out. On a cross-country drive which became the stuff of folklore, Dyson had the unique opportunity to hear Feynman’s version of the theory firsthand; at a summer school soon afterwards he heard Schwinger’s. The two versions seemed irreconcilable. Dyson would show that they were two ways of looking at the same physical reality.
Years later Dyson would write a memorable essay called “Birds and Frogs”: some mathematicians are frogs who like to play in the mud and solve problems, he said, while others are birds who soar and survey the landscape from a great height. Although he put himself squarely in the camp of the frogs, the crucial work he did bridging Feynman’s and Schwinger’s theories and providing a unified view of quantum electrodynamics made him a bird. Throughout his life he was able to expertly navigate between the two domains.
Dyson was unconventional not just in his science but in his view of the social aspects of science, a view that he applied to himself with abandon. He famously never received a PhD; by a certain point it had become clear that he did not need one. Without a PhD the prodigy became a professor at Cornell and a fellow of the Royal Society, both before he turned 30. Over the years, dozens of PhD theses would be written based on Dyson’s work, but he became a sharp critic of the entire system, a view that seems increasingly valid in an age when graduate students and postdoctoral researchers are often regarded as a source of cheap labor to grind out results and papers. Dyson thought that the PhD had become a kind of union card, forcing students - and especially women who might want to start a family - to spend most of their twenties working on single problems so that they would be deemed worthy of joining a hallowed guild. It is also unfair to inventors, who almost never have a PhD. I have worked with both PhDs and non-PhDs during my career, and on balance can say that the non-PhDs have been more approachable, more hardworking and more creative. Unfortunately our research system, both in academia and industry, puts a premium on people having PhDs.
With his lack of a PhD and his disdain for a system that forces both students and professors to spend years working on single problems, Dyson the frog clearly realized that he wasn’t suited to a conventional academic career. But it also made him reassess what his particular strengths were. His great accomplishment had given him an enviable lifetime appointment at the Institute for Advanced Study in Princeton; Robert Oppenheimer, the director, had himself sung Dyson’s praises. For the next seven decades Dyson was one of the most famous residents of the institute, but even his home institution was not spared his contrarian take. He always thought the institute was too ivory-tower and too much like an alien transplant in America; in contrast, Ithaca and Cornell, where he had lived before, were the real American deal. He also disdained the tribalism among the institute’s mathematicians and their successful efforts to shut down John von Neumann’s computer project, a project which, had it been supported, would have kept the IAS on the map in the annals of modern computing.
During the early years Dyson kept working on particle physics, which had been his first proving ground, but the turning point in his career as a particle physicist came in a fortuitous meeting with Enrico Fermi in Chicago in 1953. Dyson and his students at Cornell were working on results on particle scattering which seemed to give very good agreement with experiments done by Fermi. Fermi demolished the agreement in one fell swoop, saying that Dyson had neither a consistent mathematical theory nor a clear physical picture to explain the results. As for the good fit between theory and experiment, he quoted his friend Johnny von Neumann: with four parameters one can fit an elephant, and with five one can make him wiggle his trunk. The conversation with Fermi, lasting barely fifteen minutes, dashed Dyson’s hopes of being a conventional physicist of the highest caliber. But out of the frustration came an astute observation and an epiphany: Fermi was the epitome of what a great physicist was supposed to be, combining great facility at experimentation and visualization of the physical picture with an equally good facility at calculation. Dyson’s forte was calculation.
It was partly the meeting with Fermi that convinced Dyson that his great strength was applying mathematics to diverse problems rather than making great advances in whatever was then most fashionable in physics. This cleared the way for a remarkable career. His contributions to quantum electrodynamics were important enough, but after that Dyson switched to other fields, in particular condensed matter physics. In this realm his most important contribution concerned a fundamental question that can be asked simply: if most of the atom is empty space, what makes ordinary matter stable? Using a laborious proof that only someone skilled in the highest realms of mathematical manipulation could muster, Dyson showed that the Pauli exclusion principle, which prevents two electrons from occupying the same quantum state, essentially keeps electrons apart and makes matter stable. The proof involved pages of hairy mathematics and was later improved by other researchers, but it demonstrated Dyson’s great strength as a calculator without peer. It also cemented his reputation as someone who would regularly toss aside gems that others would pick up and build into great edifices.
Until then, whatever contrarian turns he took, Dyson was still known primarily as a theoretical physicist working in diverse parts of physics. But it was his work in the 1950s on engineering problems that truly demonstrated not only the scope of his intellect but the wondrous ambition of his ideas. The first engineering project he worked on was the design, with the physicist Edward Teller and others, of a nuclear reactor which would be intrinsically safe and “idiot-proof”; the safety of the reactor would have to come not from decisions by the operator but from the laws of physics. The reactor was designed through collaboration between physicists, chemists and engineers. Dyson had to essentially learn an entire field of engineering, but at that time there were no true experts, so the scientists who had gathered in a former red schoolhouse in La Jolla taught each other. The result of their deliberations was the TRIGA, the only nuclear reactor that has made a profit for its company. During TRIGA’s inauguration Dyson got a chance to take a solitary walk on the beach with the great Niels Bohr himself, but Bohr’s famous mumble and soft voice kept Dyson from receiving any enlightenment.
If TRIGA was earth, Project Orion was heaven. Buoyed by the 1950s enthusiasm for space travel, nuclear energy and rocketry, a small group of dreamers decided to design a spaceship powered by nuclear bombs. The idea was to sequentially explode bombs at a distance beneath a pusher plate on a spaceship and let the momentum carry the craft away at great speed. “Saturn by 1970” was their motto, and it would also be a nice way to get rid of all those dangerous nuclear weapons lying around. Project Orion tested Dyson’s faculties and allowed his imagination to soar like nothing else. Once again we see the scientist as rebel, applying his skills to an audacious idea which had never before been explored and which could lead to a brave new world. The engineers on Project Orion remembered being astonished by Dyson’s versatility as they watched a pure theoretical physicist calculate friction coefficients and load ratios, visualizing giant bombs going off under giant spaceships; while the project initially planned to use small, tactical nuclear weapons, Dyson later imagined bigger, fusion-based weapons propelling the craft. A colleague named Brian Dunne captured Dyson’s unique style:
Freeman doesn’t have the Handbuch der Physik, the last-word sort of German precision. He doesn’t have the French quality of slap-dash, a point here well taken but the rest of it wrong. He doesn’t have the stiff British restraint. It’s a style he has developed that’s all his own.
Project Orion came to an end when radioactive fallout became a public concern and the test ban treaty of 1963 banned nuclear explosions in the atmosphere, but it had been a wild ride for this unconventional thinker. It was also an act of rebellion in another regard: ever since Robert Oppenheimer had hired Dyson at the institute in Princeton, he had expected Dyson to continue working on the purest fundamental physics, which Oppenheimer thought was the only thing worth doing. While forays into other areas of physics and nuclear engineering utilized Dyson’s abilities handsomely, Oppenheimer made it clear that he would not welcome Dyson back if he kept working on those areas for too long. Fortunately Dyson the contrarian found other ways to channel his unique abilities.
One of those ways was to think about communicating with aliens, another mainstay of 1950s and 60s interests. In 1960 Dyson wrote a serious paper in Science describing how an advanced extraterrestrial civilization might disassemble a Jupiter-sized planet and surround itself with its pieces, trapping its parent star’s energy inside and utilizing it. Because these star systems would be cloaked by the shell of the planet’s fragments they would not be visible, but the trapped energy would eventually escape as infrared radiation, which could be detected. Unusual infrared signatures could therefore be a way to find extraterrestrials. The original idea had not been Dyson’s - it had been described by the science fiction writer Olaf Stapledon in his novel “Star Maker” - but its serious extension became emblematic of Dyson’s originality: take an audacious idea, even one that belongs to the realm of science fiction, and turn it into a serious scientific paper filled with mathematical calculations. Since the 1960 paper was published, Dyson Spheres have become a staple of modern science fiction, even making it into ‘Star Trek’.
The search for life in space was one of Dyson’s enduring interests, and it gave voice to his creativity like little else. He disdained ideas like the Drake Equation that relied on highly uncertain armchair speculation without suggesting experiments. Along with Dyson Spheres, he also came up with an idea to find life on Europa by searching for freeze-dried fish in its orbit instead of water under its surface (the logic being that the former, while it sounds outlandish, is much easier to look for than the latter, which involves very expensive drilling), and a way to grow giant plants in the reduced gravity of comets. But even assuming that humanity could escape the bonds that have always tied it to planet earth, how long can life keep this up, this constant hopping around between inhospitable environments?
In Dyson’s view - literally until the end of time. In 1979 he wrote “Time Without End: Physics and Biology in an Open Universe”, perhaps his most audacious serious paper, covering thirteen pages in the distinguished journal Reviews of Modern Physics. In it he argued that living creatures could keep feeding off dwindling sources of energy in an expanding universe and even keep communicating with each other, although admittedly not in a form that would be easily recognizable to modern-day human beings. With this paper he hoped to “hasten the arrival of the day when eschatology, the study of the end of the universe, will be a respectable scientific discipline and not merely a branch of theology.” Like all contrarians who deal with speculative ideas, Dyson was prepared to accept that this one might be incorrect. In the 1990s a new era in our understanding of the universe dawned when it was found that the expansion of the universe is accelerating. In such a universe Dyson’s creatures would be doomed, competing against energy sources rapidly going to zero and distances between galaxies going to infinity. Dyson graciously accepted as much in a new edition of his Gifford Lectures, “Infinite in All Directions”, but he still found solace: “If it turns out that we live in a constantly accelerating universe, we may complain that God designed it badly as a home for intelligent creatures, but we can be thankful that he gave us at least a few trillion years to enjoy it before the final darkness falls.”
1979 was also the year when Dyson started a new career, taking a step that is truly contrary for most working scientists: he wrote his autobiography, “Disturbing the Universe”. “Life begins at 55”, he said, because that was how old he was when he wrote his first book. Among top scientists there are very few who can write genuinely well, partly because there are very few who are steeped both in their scientific specialization and in the broader traditions of literature. Fortunately Dyson’s upbringing had given him a unique facility with both science and the humanities – his mother was a lawyer who fought for women’s rights, and his father was Sir George Dyson, one of England’s best-known composers of the time; his parents were highly cultured and socially responsible people, and they endowed Freeman with an unusual sensitivity to human affairs. He himself acknowledged his two strengths as “calculation and English prose”. Transitioning from mostly doing hard science to mostly writing and lecturing served both Dyson and the world exceedingly well, and in doing so he was following a maxim G. H. Hardy once told him: “Young men should prove theorems. Old men should write books.”
“Disturbing the Universe” remains a remarkable book, filled not only with sharp observations on great scientists like Oppenheimer and Feynman but also with sprinklings of T. S. Eliot and Blake, full not just of discussions of physics and genetic engineering but of reflections on world peace and human nature. It’s perhaps the finest memoir I have read of the scientist as citizen. All of Dyson’s ensuing books have mirrored these themes, combining highly original scientific ideas with meditations on war, peace and human affairs. He always saw people’s flaws astutely, but he also saw their greatness, and he saw how both qualities combine to make a complete person. He was as comfortable discussing minor but enlightening matters of family and friends as he was the fate of the world. Most of all he cared about his family, and he was proud of his six children, who among them have given him sixteen grandchildren and are productive citizens.
Even when he was mainly writing books Dyson kept on making strikingly original contributions to science. In a 1999 interview he was asked what he thought would be his most important contribution to science. It would be hard to know until he had been dead for a hundred years, he sensibly said, adding that his son thought his contributions to the origins of life might prove most important. Dyson was referring to a slim volume he had just published in which he made the striking argument that life might have arisen twice, once as pure metabolism and once as pure replication. Dyson’s starting points were Erwin Schrödinger’s equally slim book “What is Life?” and a set of lectures on self-reproducing automata that Johnny von Neumann had given in the 1940s. Just like Dyson, Schrödinger and von Neumann were physicists and mathematicians who had made side forays into biology. Just like Schrödinger’s book, Dyson’s book might stand the test of time and turn out to be an unusually important contribution. In fact recent findings suggesting that life may have arisen through pure metabolism in hot vents under the sea could well support Dyson’s theories. This was Freeman Dyson at his best: starting multiple lines of research during what was, for him, a minor trip off the beaten track into a field he previously knew nothing about.
More original ideas followed, including two from 2012, when he was 88: an argument that it might be impossible even in principle to detect individual gravitons, which would dissolve the wall between quantum theory and general relativity, and a paper with William Press on the iterated Prisoner’s Dilemma that upended conventional wisdom in game theory. He became somewhat controversial for arguing that extrasensory perception might be real but lie just outside the bounds of our standard measuring equipment and experiments; I have similar feelings about traditional Chinese and Indian medicine, much of which deals with small but significant holistic effects that differ from person to person and may therefore escape the design of standard double-blind clinical trials.
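The game theory result can even be made concrete in a few lines of code. Press and Dyson showed that the iterated Prisoner’s Dilemma contains “zero-determinant” strategies with which one player can unilaterally enforce a linear relation between the two players’ scores – for example, extorting the opponent so that one’s own surplus over the mutual-defection payoff is always three times the opponent’s. The sketch below simulates the extortionate strategy commonly quoted from the paper (cooperation probabilities 11/13, 1/2, 7/26 and 0 after the four possible outcomes of the previous round); the opponent’s probabilities are my own arbitrary choices:

```python
import random

# Payoffs (X, Y) for each joint outcome, with the standard values
# R = 3, S = 0, T = 5, P = 1.
PAYOFFS = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
           ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}
P = 1  # mutual-defection payoff, the baseline of the extortion relation

# X's probability of cooperating given last round's (own move, opponent's move):
# the chi = 3 extortionate strategy quoted from Press and Dyson's paper.
p_X = {('C', 'C'): 11/13, ('C', 'D'): 1/2, ('D', 'C'): 7/26, ('D', 'D'): 0.0}

# Y is an arbitrary memory-one opponent; these numbers are made up for the demo.
p_Y = {('C', 'C'): 0.7, ('C', 'D'): 0.2, ('D', 'C'): 0.9, ('D', 'D'): 0.4}

rng = random.Random(0)
x, y = 'C', 'C'     # arbitrary opening moves
sx = sy = 0
ROUNDS = 500_000
for _ in range(ROUNDS):
    nx = 'C' if rng.random() < p_X[(x, y)] else 'D'
    ny = 'C' if rng.random() < p_Y[(y, x)] else 'D'  # Y conditions on its own view
    x, y = nx, ny
    ax, ay = PAYOFFS[(x, y)]
    sx += ax
    sy += ay

# The zero-determinant relation pins X's surplus over P to three times Y's,
# whatever Y plays; the two numbers should agree up to sampling noise.
print(f"X's surplus over P: {sx / ROUNDS - P:.3f}")
print(f"3 x Y's surplus:    {3 * (sy / ROUNDS - P):.3f}")
```

The unsettling part, and the reason the paper upended conventional wisdom, is that Y can do nothing to break the relation: the extortioner sets the terms of the game unilaterally.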
But in the last decade or so of his life, Dyson became most famous in the public eye for his views on global warming. I regard this entire affair as unfortunate and blown out of proportion, more emblematic of how intolerant we have grown in recent times than of anything Dyson actually said or wrote. During my several meetings and email exchanges with him I discussed global warming, and he never denied the basic facts, only some of the projected consequences. I always thought that most of his views were valid and steeped in sound skepticism and humility. He asked whether we know for certain how the good effects of climate change weigh against the bad, and in particular whether increased CO2 might not benefit the growth of certain trees and certain cold regions of the planet. He asked whether, in our zeal to criticize, we are neglecting technological solutions that might mitigate the problem. Some of his ideas, such as genetically engineering trees that would consume more CO2, were speculative, but one could argue that an extraordinary problem like climate change warrants the exploration of extraordinary ideas.
He realized that the climate is a very complex system about which it’s difficult to know everything, especially when computer modeling plays such an important role; some components of the climate, like wind patterns, are better understood than others, like the soil. One reason I empathized with him is that I have spent most of my career doing computer modeling of chemical systems far simpler than the climate. These systems often involve only two molecules interacting with each other, and yet we have found out how complicated modeling the interplay of their different components is; as he and I often discussed, water in particular seems to be a diabolically difficult substance to understand, both in cloud formation and in chemical systems. I also found during our conversations that climate change was a minor interest of his, and that most of the controversy was drummed up by others rather than by Dyson himself. Most of all he lamented the intense politicization of climate change that had made reasoned debate very difficult, and I could not agree more. If certain individuals and groups took his views out of context and used them to shore up their own political agendas, it would be unfair to blame him for it.
When I heard about Dyson’s passing I felt devastated. Devastated not only because I had lost the person I considered the biggest intellectual influence on my thinking after my father, who passed away a few years ago, but because I wonder whether the world is still willing to tolerate the tradition of skepticism and originality that he and the fellow scientists he admired exemplified. When I hear people dismiss him as a contrarian, I think it should be clear that he was always a contrarian, and that we are all the better off for it. Being a contrarian enriched his life, took him in unexpected directions and uniquely shaped his dialogue with the world. If the word “contrarian” means someone who comes up with highly original ideas that challenge the status quo and buck the trend, then Dyson was a contrarian in the best scientific tradition. He displayed these qualities all his life, from his student transition to physics, through his forays into very diverse branches of science and engineering, to his career as a writer of eloquent prose and a commentator on social issues. But if Dyson was the last contrarian, that makes him even more singular, and it means we are in trouble.
The reason I worry is that we increasingly live in an age in which contrarian ideas of the kind Dyson exemplified are not just criticized but criticized through a moral lens. Social media has wildly exacerbated this trend. Today when you express a contrarian view on social media, people don’t merely disagree with you and regard you as mistaken; they are likely to regard you as immoral, even evil. This is a problem, especially when the minority who actually argue in bad faith get mixed up with those arguing in good faith. Moral criticism makes it much harder for individuals to speak their minds than criticism of other kinds. Of course, throughout history moral judgment has been used as a tool of social ostracism, but as long as it did not expose you to the entire world it kept the criticism contained and within bounds: you might feel a little dissuaded, but you could still preach your gospel. Today, when anything you say to a small group can reach thousands of people on social media, and when the ensuing din hounds you out of the chambers of debate, the scope of moral criticism is tremendously magnified – magnified to such an extent that it becomes a whirlwind and silences everyone except those with unusually stout hearts.
We must return to an era when disagreements are just that – disagreements in good faith. There is too much at stake in our world today, and the problems we are trying to solve are too complex, to allow only one set of voices to be heard while opposing ones are silenced. And because the problems are so complex, some of us will inevitably be wrong. But that’s how it should be. The best contrarians always realize that they can be wrong – Dyson once said that he would rather be wrong than be vague, and always admitted the former possibility – and it’s precisely that freedom to be wrong, the freedom to be listened to and disagreed with without moral ostracism and outrage, that illuminates the path to the truth.
An old voice from Dyson’s past, his mentor Robert Oppenheimer, once said,
“There must be no barriers to freedom of inquiry. There is no place for dogma in science. The scientist is free, and must be free to ask any question, to doubt any assertion, to seek for any evidence, to correct any errors…and we know that as long as men are free to ask what they must, free to say what they think, free to think what they will, freedom can never be lost, and science can never regress.”
I hope that even if Freeman Dyson was the great contrarian, he won’t be the last one.