Field of Science

Computer simulations and the Universe

There is a sense in certain quarters that both experimental and theoretical fundamental physics are at an impasse. Other branches of physics, like condensed matter physics and fluid dynamics, are thriving, but because questions about the fundamental constituents of matter, the origins of the universe and the unification of quantum mechanics with general relativity have long been held to be foundational, this lack of progress rightly bothers its practitioners.
Each of these two aspects of physics faces its own problems. Experimental physics is in trouble because probing new phenomena now demands energies that cannot be reached even by the biggest particle accelerators, and building new accelerators will require billions of dollars at a minimum. Even before, it was difficult to get this kind of money: in the 1990s the Superconducting Super Collider, an accelerator which would have reached energies greater than those of the Large Hadron Collider and on which about $2 billion had already been spent, was cancelled because of a lack of consensus among physicists, political foot-dragging and ballooning cost estimates. The next particle accelerator, projected to cost on the order of $10 billion, is seen as a bad investment by some, especially since previous expensive experiments in physics have confirmed prior theoretical predictions rather than discovered new phenomena or particles.
Fundamental theoretical physics is in trouble because it has become unfalsifiable, divorced from experiment and entangled in mathematical complexities. String theory, long thought to be the most promising approach to unifying quantum mechanics and general relativity, has come under particular scrutiny, and its lack of falsifiable predictions has become so visible that some philosophers have suggested that traditional criteria for a theory's success, like falsification, should no longer be applied to it. Not surprisingly, many scientists as well as philosophers have frowned on this proposed novel, postmodern model of scientific validation.
Quite aside from specific examples in theory and experiment, perhaps the most serious roadblock that fundamental physics faces is that it might have reached the end of "Why". That is to say, the causal framework for explaining phenomena that has been a mainstay of physics since its very beginnings might have ominously hit a wall. For instance, the Large Hadron Collider found the Higgs boson, but this particle had been predicted almost fifty years before. Similarly, the gravitational waves detected by LIGO were a logical consequence of Einstein's general theory of relativity, proposed almost a hundred years before. Both these experiments were technical tours de force, but they did not make startling, unexpected new discoveries. Other "big physics" experiments before the LHC had similarly validated the predictions of the Standard Model, our best theoretical framework for the fundamental constituents of matter.
The problem is that the basic constants in the Standard Model, like the masses of the elementary particles and their number, are ad hoc quantities: nobody knows why they have the values they do. This dilemma has led some physicists to propose that while our universe happens to be one in which the fundamental constants have certain specific values, there might be other universes in which they have different values. This need to explain the values of the fundamental constants is part of the reason theories of the multiverse are popular. Even if true, this scenario does not bode well for the state of physics. In his collection of essays "The Accidental Universe", the physicist and writer Alan Lightman says:
Dramatic developments in cosmological findings and thought have led some of the world’s premier physicists to propose that our universe is only one of an enormous number of universes, with wildly varying properties, and that some of the most basic features of our particular universe are mere accidents – random throws of the cosmic dice. In which case, there is no hope of ever explaining these features in terms of fundamental causes and principles.
Lightman also quotes the reigning doyen of theoretical physicists, Steven Weinberg, who recognizes this watershed in the history of his discipline:
We now find ourselves at a historic fork in the road we travel to understand the laws of nature. If the multiverse idea is correct, the style of fundamental physics will be radically changed.
Although Weinberg does not say this, what’s depressing about the multiverse is that its existence might always remain postulated and never proven since there is no easy way to experimentally test it. This is a particularly bad scenario because the only thing that a scientist hates even more than an unpleasant answer to a question is no answer at all.
Do the roadblocks that experimental and theoretical physics have hit, combined with the lack of explanation for the fundamental constants, mean that fundamental physics is stuck forever? Perhaps not. Here one might recall the saying often attributed to Einstein that our problems cannot be solved with the same thinking that created them. Physicists may have to think in wholly different ways, to change the fundamental style that Weinberg refers to, in order to overcome the impasse.
Fortunately there is a third tool, in addition to theory and experiment, that has not been used as prominently by fundamental physicists as it has by biologists and chemists, and that could help physicists do new kinds of experiments. That tool is computation. Computation is usually regarded as separate from experiment, but computational experiments can be performed the same way that lab experiments can, as long as the parameters and models underlying the computation are well defined and valid. In the last few decades, computation has become as legitimate a tool in science as theory and experiment.
Interestingly, this problem of trying to explain fundamental phenomena without being able to resort to deeper explanations is familiar to biologists: it is the old problem of contingency and chance in evolution. Just as physicists want to explain why the proton has a certain mass, biologists want to explain why marsupials have pouches that carry their young or why Blue Morpho butterflies are a beautiful blue. While proximal explanations for such phenomena are available, the ultimate explanations hinge on chance. Biological evolution could have followed an infinite number of pathways, and the ones it did follow simply arose from natural selection acting on random mutations. Similarly one can postulate that while the fundamental constants could have had different values, the ones they do have in our universe came about simply because of random perturbations, each of which rendered a different universe. Physics turns into biology.
Is there a way to test this kind of thinking in the absence of concrete experiments? One way would be to think of different universes as different local minima in a multidimensional landscape. This scenario would be familiar to biochemists, who are used to thinking of the different folded structures of a protein as lying in different local energy minima. A few years back the biophysicist Collin Stultz in fact made this comparison as a helpful way to think about the multiverse. Computational biophysicists probe this protein landscape by running simulations in which an unfolded protein explores all these different local minima until it finds the global minimum corresponding to its true folded state. In the last few years, thanks to growing computing power, thousands of such proteins have been simulated.
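To make the landscape picture concrete, here is a minimal Python sketch of the kind of search such simulations perform, using a made-up one-dimensional energy function as a stand-in for a protein's folding landscape; the function, the cooling schedule and every parameter below are invented for illustration and are not taken from any real folding study.

```python
import math
import random

def energy(x):
    # Toy "folding landscape": several local minima, with the global minimum near x = 2.
    return 0.5 * (x - 2) ** 2 + 1.5 * math.sin(3 * x)

def anneal(steps=20000, start=-4.0, temp0=5.0):
    """Simulated annealing: accept uphill moves with a probability that shrinks
    as the temperature is lowered, so the walker can escape local minima early
    on and settle into a deep minimum later."""
    x = start
    best_x, best_e = x, energy(x)
    for i in range(steps):
        temp = temp0 * (1 - i / steps) + 1e-3   # simple linear cooling schedule
        trial = x + random.gauss(0, 0.2)        # small random perturbation of the state
        delta = energy(trial) - energy(x)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = trial
            if energy(x) < best_e:
                best_x, best_e = x, energy(x)
    return best_x, best_e

if __name__ == "__main__":
    x, e = anneal()
    print(f"lowest-energy state found near x = {x:.2f} (energy {e:.2f})")
```

Real folding simulations do this over thousands of atomic coordinates with physics-based force fields, but the essential logic of hopping out of shallow minima and settling into a deep one is the same.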
Similarly, I propose that computational physicists could run simulations of universes with different values for the fundamental constants and evaluate which ones resemble our real universe. Because the values of the fundamental constants dictate chemistry and biology, one could well imagine completely fantastic physics, chemistry and biology arising in universes with different values for Planck's constant or for the fine structure constant. A 0.001% difference in some values might lead to a lifeless universe of total silence, one with only black holes or spectacularly exploding supernovae, or one which bounced between infinitesimal and infinite length scales in a split second. Smaller variations in the constants could result in a universe with silicon-based life, or one with liquid ammonia rather than water as life's essential solvent, or one with a few million Earth-like planets in every galaxy. With a slight tweaking of the cosmic calculator, one could even have universes where Blue Morpho butterflies are the dominant intelligent species or where humans have the capacity to photosynthesize.
All these alternative universes could be simulated and explored by computational physicists without the need to conduct billion-dollar experiments or deal with politicians for funding. I believe that both the technology and the knowledge base required to simulate entire universes on a computer could be well within our means in the next fifty years, and certainly within the next hundred. In some sense the technology is already within reach: we can already perform climate and protein structure simulations on mere desktop computers, so simulating whole universes should be possible on supercomputers or distributed cloud computing systems. Crowdsourcing of the kind done for the search for extraterrestrial intelligence or for protein folding would be readily feasible. Another alternative would be to do the computation using DNA or quantum computers: because of DNA's high storage and combinatorial capacity, DNA computing could expand the available computational resources manyfold. One can also imagine taking advantage of natural phenomena like electrical discharges in interstellar space or in the clouds of Venus or Jupiter to perform large-scale computation; in fact an intelligence based on communication through electrical discharges was the basis of Fred Hoyle's science fiction novel "The Black Cloud".
On the theoretical side, the trick is to have enough knowledge about fundamental phenomena to be able to abstract away the details, so that the simulation can be run at the right emergent level. For instance, physicists can already simulate the behavior of entire galaxies and supernovae without worrying about the behavior of every single subatomic particle in the system. Similarly, biologists can simulate the large-scale behavior of ecosystems without worrying about the behavior of every single organism in them. Physicists are in fact quite familiar with such an approach from statistical mechanics, where they can compute quantities like temperature and pressure in a system without simulating every individual atom or molecule in it. And the fundamental constants have been measured to many decimal places, so they can be used confidently in the simulations.
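As a toy illustration of that statistical-mechanical shortcut, the sketch below recovers a macroscopic temperature from nothing more than the distribution of molecular velocities, without ever following an individual molecule's trajectory; the argon-like mass and the target temperature are invented inputs for the demonstration.

```python
import numpy as np

K_B = 1.380649e-23       # Boltzmann constant (J/K)
MASS = 6.63e-26          # kg, roughly the mass of an argon atom
TRUE_T = 300.0           # the temperature we pretend nature has chosen

# Draw velocity components for a sample of atoms from the Maxwell-Boltzmann
# distribution; we never track any individual trajectory.
rng = np.random.default_rng(0)
sigma = np.sqrt(K_B * TRUE_T / MASS)
v = rng.normal(0.0, sigma, size=(100_000, 3))

# The emergent quantity: temperature from mean kinetic energy, <KE> = (3/2) k_B T,
# which depends only on the distribution, not on what any single atom is doing.
mean_ke = 0.5 * MASS * (v ** 2).sum(axis=1).mean()
print(f"recovered temperature: {2 * mean_ke / (3 * K_B):.1f} K")
```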
In our hypothetical simulated universe, the simulator would input slightly different values of the fundamental constants and then hard-code some fundamental emergent laws like evolution by natural selection and the laws of chemical bonding. In fact, a particularly entertaining enterprise would be to run the simulation and see if these laws emerge by themselves. The whole simulation would largely be a matter of adjusting initial values, setting the boundary conditions and then sitting back and watching the ensuing fireworks; simply an extension of what scientists already do using computers, albeit on a much larger scale. Once the simulations were validated, they could be turned into user-friendly tools or toys that could be used by children, who could simulate their own universes and hold contests to see which one creates the most interesting physics, chemistry and biology. Adults as well as children could thus participate in extending the boundaries of our knowledge of fundamental physics.
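Purely as a cartoon of the "turn the dials and watch" idea, here is a sketch that sweeps a stand-in for the fine structure constant and applies an invented viability rule; the thresholds and verdicts are made up for illustration and have no cosmological standing.

```python
import numpy as np

ALPHA_OURS = 1 / 137.035999  # the fine structure constant in our universe

def toy_universe(alpha):
    """Cartoon 'simulation' of one universe: returns a crude verdict based on an
    invented rule that familiar chemistry needs alpha within a few percent of ours.
    A real simulation would evolve fields and particles from first principles."""
    deviation = abs(alpha - ALPHA_OURS) / ALPHA_OURS
    if deviation < 0.04:
        return "stable atoms and familiar chemistry; life conceivable"
    elif deviation < 0.2:
        return "stars may burn, but chemistry looks exotic"
    else:
        return "no stable atoms worth speaking of"

# Turn the dial and see which universes resemble ours.
for alpha in np.linspace(0.5 * ALPHA_OURS, 1.5 * ALPHA_OURS, 11):
    print(f"alpha = {alpha:.6f}: {toy_universe(alpha)}")
```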
Large-scale simulation of multiple universes could help break the impasse that both experiment and theory in fundamental physics are facing. Computation cannot replace experiment when the underlying parameters and assumptions are not well validated, but there is no reason why such validation cannot accumulate as our knowledge of the world based on small-scale experiments grows. In fields like theoretical chemistry, weather prediction and drug development, computational predictions are already becoming as important as experimental tests. At the very least, the results from these computational studies would constrain the number of potential experimental tests and provide more confidence when asking governments to allocate billions of dollars for the next generation of particle accelerators and gravitational wave detectors.
I believe that the ability to simulate entire universes is imminent, will be part of the future of physics and will undoubtedly lead to many exciting results. But the most exciting ones will be those that even our best science fiction writers cannot imagine. That is something we can truly look forward to.

First published on 3 Quarks Daily.

Book review: Roger Williams and the Creation of the American Soul

Roger Williams and the Creation of the American Soul: Church, State, and the Birth of Liberty by John M. Barry

If anyone wants to know what makes the United States unique, part of the answer can be found in this book. At its center is a wholly remarkable, awe-inspiring individual who was light years ahead of his time. Roger Williams founded Rhode Island (then Providence Plantation) in 1636, and it became the world's first working model of both full religious tolerance and individual rights, establishing government by the consent of the governed as a foundational principle. At that time nothing like it existed anywhere, and certainly not in Europe, where Catholics and Protestants were killing each other over absurdly trivial matters like the age for baptism and Calvinist predestination. Williams's teachings and writings set the stage for fundamental debates about the role of religion and the state in individuals' lives with which we are still grappling.

Williams had fled from England when Charles I intensified his father James I's campaign to persecute Protestant Puritans who wanted a purer, more rigorous form of worship. Growing up in London, Williams had been enormously influenced by Edward Coke and Francis Bacon, two men who ironically were sworn enemies. Coke was the most eminent jurist in English history and had challenged the divine right of kings and emphasized rights to property and due process. Bacon was one of the fathers of the scientific method and put a premium on evidence and observation. From both these men Williams imbibed a deep set of ethics about free, secular thinking.

He arrived in Massachusetts a decade after the Mayflower docked at Plymouth and a few years after the Massachusetts Bay Colony, the "city on a hill", was established by John Winthrop. A talented lawyer, minister and linguist steeped in Bacon's scientific method, he became friends with the Indians, learnt their customs and became fluent in their language. He was received with great respect and offered the post of minister in newly established Boston's first church. His moment in history came when he took positions that contradicted basic tenets of the Puritans and of Christians in general: he held that the state had no authority to enforce the first four commandments, the ones dealing with God, and that the Indians had property rights, so the Puritans could not simply seize their lands but had to buy them. This went not just against the fundamentalist religious beliefs of the colony; it was something wholly new that directly contradicted the prevailing meld of church and state and, in fact, virtually all the political and religious philosophy of the time.

For his novel views Williams was duly banished from Massachusetts under threat of execution, but he kept on privately preaching his creed in the more tolerant Salem. When Massachusetts sent out a squad of soldiers to haul him onto a ship bound for England for imprisonment, they found that Williams, tipped off by Winthrop, had already escaped into the bitter, snowy winter wilderness. The only reason he survived was that he found refuge and friendship among the Narragansett and other Indians who lived in the area, and the fact that his friends and colleagues had denounced him while strangers had saved his life fundamentally changed his views of race, of religion, of Native Americans, of freedom and individual rights, of how much control men should have over other men. His colony became a refuge for the rejected, the denounced and the banished of Massachusetts, Plymouth and Connecticut, the three major colonies of the time.

He decided to codify his beliefs in a formal document. Massachusetts kept threatening its freethinking neighbor to the south and kept trying to usurp its territories, so Williams went back to the same England from which he had fled about fifteen years earlier. At this point England itself had become embroiled in the turmoil that would lead to the English Civil War and the execution of Charles I. Williams befriended Oliver Cromwell and managed to get a charter for Rhode Island written, one later endorsed by Charles II (who seems to have forgiven his friendship with Cromwell). In an age when almost every piece of paper, including the founding charter of Puritan Massachusetts, was infused throughout with the names of God and Christ, the Rhode Island charter is an extraordinary document, not mentioning God even once. It established almost complete freedom of religion and made it clear that no one should be persecuted simply for their beliefs; a groundbreaking assertion at a time when even minor differences in religious beliefs between Catholics and Protestants, let alone ones between Protestants and Jews or Quakers, were enough to ignite religious wars that killed thousands. Finally, with his charter safely establishing the legality of Rhode Island, Williams returned to his colony and lived to be an old man, still preaching the gospel of tolerance.

Williams's writings serve as the foundation for the novelty of the American experiment. He was a devout Christian who conceived a separation of church from state, of private from public activity. He might have been the first bona fide libertarian. There is a straight line from his teachings through John Locke to the Declaration of Independence and all the worldwide events that the American Revolution inspired. No wonder that when the tide of history met the shores of fate, Rhode Island not only became the first colony to protest unlawful behavior by the English, even before the Boston Tea Party, but also the first to declare independence from Great Britain in 1776.

Book review: Fur, Fortune and Empire

Fur, Fortune, and Empire: The Epic History of the Fur Trade in America by Eric Jay Dolin

A marvelous and highly revealing history of the fur trade in America, right from the first permanent European settlements in the 17th century to the end of the 19th century. A story of inspiring doggedness against an incredibly unforgiving environment and of the tragic clash of civilizations.

Dolin's basic thesis is that fur was to the 17th through 19th centuries what oil was to the 20th. It was the possibility of buying beaver furs from the Indians in unprecedented quantities for fashion-hungry Europe that largely drew first the Dutch and French and later the English to North America, so the settling and expansion of North America, especially to the West, track very closely with the fur trade. With access to the Hudson and Mississippi rivers, the Dutch and French were much better placed to buy fur in exchange for European goods, at first trinkets like utensils and clothing but later deadlier commodities like guns and alcohol. The Dutch traded for beaver pelts out of their New Amsterdam colony, while the French swept in from Canada and controlled the Mississippi. This led to an inevitable clash between the British and the French for control of the Great Lakes region. After the French and Indian War, clashes arose between the British and the colonies over jurisdiction of the newly opened, vast Ohio territory and its lucrative fur possibilities, and this was at least one of the factors leading to the American Revolution. Americans continued to duke it out with the British even as both expanded into the Northwest, this time killing sea otters in unprecedented numbers, with brutal techniques and gleeful avarice, for trade with China. The Lewis and Clark expedition was at least in part a quest to map lucrative locations for the fur trade.

One of the highlights of the book is the light it sheds on early European-Indian relations, which were much more benign than in later years. In almost every case the Indians welcomed the Europeans at first contact and were in awe of their guns and other modern technology. Partly out of necessity - the Europeans were completely dependent on the natives at first for fetching furs from the deep interior - and partly out of genuine respect and curiosity, Europeans established trading relationships with the Indians through trading posts, and the Indians were often canny enough to play competing French and British trappers and companies against each other to get the best price. The relationship started changing when the Europeans became more land-hungry and when they started taking advantage of the Indians by plying them with alcohol; the independent forays of European trappers also reduced their dependence on native fur acquisition. There were violent clashes on both sides, sometimes instigated by Indians but more often provoked by European greed.

The book has memorable portraits of key fur trappers, sailors and soldiers who braved unbelievable rigors of starvation, predation and hostile engagements with Indians to get the furs, living for months in inhospitable, sub-zero temperatures in the Midwest and the Great Plains. One of these "mountain men" was Hugh Glass, who was mauled by a grizzly bear and left for dead before he endured an astonishing journey on foot to reach civilization; Glass was the inspiration for the movie "The Revenant". The mountain men are fascinating; mostly originating from Kentucky, Tennessee and other border states, they were the most freewheeling among the free-lancing trappers, traveling with aplomb whenever and wherever they wanted, yet 80% of them were married and a third took Indian wives. What is truly interesting is that these uneducated, hardy men were often as well read as an East Coast businessman and practiced a kind of equality among themselves and their wives, often living in communal camps, that might have been unique on the continent for the times. Other memorable characters include John Jacob Astor, one of America's first millionaires, who thrived on and greatly expanded the fur trade; Captain James Cook, who charted the Pacific Northwest coast before he was killed in Hawaii; and frontiersmen like Kit Carson, Daniel Boone and Manuel Lisa.

The last part of the book deals with the tragic effects the fur trade had on America's fauna as well as on the Indians. By the 1850s or so Europeans and Indians had together hunted the beaver nearly to extinction before a new source of fur was discovered: the American buffalo, or bison. With that discovery began probably the greatest episode of manmade carnage in history. At the beginning tens of millions of buffalo roamed the Great Plains and the Southwest; by 1890 only a few hundred were left. The building of the transcontinental railroad sealed the fate of both the buffalo and the Indians, in whose lives the buffalo was so intimately integrated that they would use and consume every single part of it, from the scrotum and the tail to the heart and the blood. Meanwhile, Europeans started killing the animal for sport, sometimes lazily shooting it from train compartments and leaving the carcasses to rot. The long-range rifle made it possible for a single hunter to kill dozens in a day and waste most of their meat. Soon the plains were literally dotted with rotting carcasses and skulls as far as the eye could see. The westward expansion also split the Indian population into small groups which were at the mercy of settlers and the U.S. Army, leading to their complete subjugation. This was truly a sad chapter in the history of the United States, and one that frankly brought tears to my eyes.

Not just the buffalo but the beaver and the sea otter were killed in the tens of millions and hunted to near extinction, so it's perhaps a miracle that they are still around. While the history of the fur trade tells a story of expansion, greed, killing and conquest along with one of resilience, doggedness and adventure, its aftermath tells a story of hope. Teddy Roosevelt, John Muir, Thoreau and others reminded Americans of their deep connection to nature, made a strong push for conservation and set aside large areas of the country where bison, otters and other animals decimated during the fur trade started thriving again. A few years ago a beaver was spotted on the Bronx River in New York for the first time in two hundred years. Perhaps there is a kernel of compassion and hope in the gnarly undergrowth of man's cruelty after all.

Book review: Miracle at Philadelphia

Miracle at Philadelphia: The Story of the Constitutional Convention, May to September 1787 by Catherine Drinker Bowen

A superb, must-read day-by-day account of the Constitutional Convention which took place in Philadelphia between May and September 1787. The writing and description of not just the deliberations and the personalities but the stuffy, hot Philadelphia weather, the shops, the clothes and the impressions of European visitors of a society that thumbs its nose at class are so vivid that you get the feeling you are there. I have read a few other accounts of this all-important episode, but none as revealing of the spirit of the times.

Present here are the great men of American history in all their glory and flaws: Washington, Hamilton, Madison, Franklin, Gouverneur Morris (from whose pen came “We the people” in the preamble to the Constitution), and even a lobbyist for land companies, Manasseh Cutler, who helped draft the Northwest Ordinance that created the vast Northwest Territory and sealed the fate of millions of Indians. Exerting their influence subtly from Europe were Jefferson and Adams. There were fiery speakers both for and against a central government - George Mason and Edmund Randolph from Virginia, Luther Martin from Maryland, Hamilton from New York, Elbridge Gerry from Massachusetts (from whom comes one of my favorite quotes: “The evils we have stem from the excess of democracy. The people do not want virtue, but are the dupes of pretended patriots”) - who made no secret of their feelings. They formed the Federalists and Antifederalists who were to have such bitter debates later.

Discussed were issues both trivial and momentous: the exact terms for Senators and Congressmen, whether the President should be appointed for life, the regulation of trade with other countries, the requirements for voting and citizenship, the provision for a national army. But the three most important issues were taxation, representation in the two houses, and Western expansion. In many ways these issues encapsulated the central conflict: states' rights versus a strong national government. The small states were afraid that proportional representation would diminish their influence to nothing; the large ones were afraid that anything less than proportional representation would harm their economies, their manufacturing and their landed gentry; sparsely populated ones worried that it would harm Westward expansion and slavery. Many people spoke openly against slavery, but it was out of deference to the Southern states' objections that the Constitution adopted the infamous three-fifths clause relating to "other persons" (there was consolation in the fact that the convention at least set an 1808 date for the ending of the slave trade). To soothe concerns on both sides, Roger Sherman of Connecticut offered what became known as the Connecticut Compromise, which proposed that the House would have proportional representation while the Senate would have a fixed two members from each state.

Women, white men without property, Africans and Indians were famously shortchanged. As Jill Lepore wrote in her history "These Truths", while Africans were degraded as slaves and counted as three-fifths of persons, women fared almost as badly and were completely left out of the Constitution. In 1776, Abigail Adams had memorably written to her husband, "Do not put such unlimited power into the hands of the husbands. Remember, all men would be tyrants if they could. If particular care and attention is not paid to the ladies, we are determined to foment a rebellion, and will not hold ourselves bound by any laws in which we have no voice or representation", but her words were far from anyone's mind in 1787. Women's rights as we know them were non-existent then. But the Constitution was at least a triumph of religious freedom when, in the face of objections by some prominent Americans, it did away with any religious test for holding office. This was a revolutionary move for the times.

Bowen's book also does a fantastic job of letting us see the world through the eyes of these men and women. It's very difficult for us in the age of the Internet to realize how slow communication was in those times and how disconnected people felt from each other in the unimaginably vast expanse of the country and the frontier to the West. The states were so loosely bound to each other by the previous Articles of Confederation, and had such disparate geographies and cultures, that in some cases they were threatening to fracture (for instance Maine wanted to separate from Massachusetts, and Virginia was planning to form a navy to defend herself against other states). So many of the concerns arose from legitimate worries that a distant Senator or President would never understand the concerns of a farmer from South Carolina, or that a farmer from South Carolina would never understand the concerns of a New England artisan. The fear that a central government would run roughshod over individual states was a very real one, although seventy years later it manifested itself in an ugly incarnation. There was also deep skepticism about "the people" (as Madison would later put it in the Federalist, "If men were angels, no government would be necessary"), and many vociferously asked that the preamble should say "We the states".

Another revealing aspect of the book is how many measures were either defeated when first proposed or passed by a slim majority; sometimes the delegates even changed their votes. This was democracy in action: giving everyone a chance to voice their concerns while still obeying the wishes of the majority. Fun fact, especially in light of the present times: the presidential veto was struck down ten-to-one when first proposed. And, in what today seems like the most incomprehensible move, a Bill of Rights was also struck down ten-to-one when first proposed. The main arguments were: if Americans are already free, why do they need a separate Bill of Rights? And if you are already laying down rules for what the government can do, why is it necessary to explicitly state what it cannot do? It was only after the Constitution was sent to the states for ratification that Massachusetts proposed adding a bill of rights; in fact some of the amendments in the Bill of Rights mirror Massachusetts' own proposals for a state bill of rights. Once the powerful states like Massachusetts, Virginia and Pennsylvania ratified, the other states quickly fell in line.

It is wonderful to see Antifederalists who had opposed the Constitution immediately concede to the wishes of the people, often in generous terms, once it was ratified by individual states. In fact that is perhaps the single most important impression that comes across in Bowen's account: that men with widely differing views reached a compromise and forged a document which, although it contained important flaws, became a trailblazing, unique and enduring piece of work; one that asked for a "more perfect Union" and led to a clarion call for individual rights and liberty not just in the United States but throughout the world.


Chemistry is not harder than other sciences...just different.

A well-known physicist turned venture capitalist asked on Twitter the other day why people seem to have a harder time understanding chemistry than physics or biology. Chemistry is by no means harder to understand than physics or biology, but it occupies a tricky middle ground between rigor and intuition, between deduction and creation, between creativity and understanding. Understanding it can bring great dividends: Robert Oppenheimer once said that "If you want to get someone interested in science teach them a course on elementary chemistry…unlike physics it gets very quickly to the heart of things."
Chemistry's path was partly driven by an impulse to understand the physical world, much like that of physics and astronomy, but also, somewhat differently from those sciences, by a conscious effort to improve the material conditions of life. What passed for medicine, art, architecture, agriculture and commerce in the ancient world was suffused with chemistry. Whether it was indigo dye for royal textiles, mercury or arsenic for medicine, lime for protecting crops or plaster for holding together the stones of medieval buildings, the world looked to chemistry, whether consciously or not, to feed, transport, clothe and sustain itself. But this foundational practical role that chemistry played also obscured its philosophy.
The philosophy of chemistry developed in the 18th and 19th centuries through the work of Dalton, Lavoisier, Liebig, Kekule, Mendeleev and other thinkers. Much as biologists had spent their time collecting specimens and systematizing their science before someone like Darwin could make a great theoretical leap, chemists had systematized the vast body of observations that natural philosophers had documented and assimilated over the years. But key questions still remained: Why did water freeze at 0 degrees Celsius and expand as it cooled? Why were gallium and mercury liquids? Why was lithium relatively stable while its cousin sodium was a fiery, unstable beast? Even Mendeleev's famed periodic table, after answering the how and what, did not answer the why.
It was only with the advent of atomic physics and quantum theory in the 20th century that these questions started to be answered. Niels Bohr's atomic model, refined by the quantum mechanics that followed, led to the idea of the atom as an entity with a dense central nucleus surrounded by fuzzy, probabilistic shells of electrons. Concomitant work by 19th century chemists, which had yielded precise measurements of atomic weights and rules predicting how elements combine with each other, intersected with the basic Bohr atom and the science of spectroscopy to illuminate how different elements are built up from different numbers of electrons and protons (the neutron, whose discovery explained isotopes, came only in 1932).
It was only after Walter Heitler, Fritz London, Gilbert Newton Lewis and especially Linus Pauling explained how the chemical bond is formed that chemistry truly exploded as a self-contained discipline. By showing how different atoms share electrons in different ways, so that they are held together by a variety of forces – weak dispersion forces and strong electrostatic forces, for instance – modern chemistry finally started answering those centuries-old questions about freezing water and liquid mercury.
But how was the philosophy of chemistry faring compared to the philosophy of science during this period? Not very well. Firstly, philosophers were more naturally drawn first to physics and then to biology as deductive disciplines for laying out their conception of how science was done. Quantum mechanics especially, with its paradoxes and mysteries, became a fertile ground for philosophers to erect their edifice. Biology with evolution and heredity seemed to go to the heart of human existence and also attracted philosophical theorizing. Somehow chemistry slipped through the fingers of the prominent philosophers, partly because it seemed too practical like engineering (although engineering has its own philosophy) and partly because they simply didn’t get it.
Why? Because chemistry largely defies the traditional philosophy of science as laid down not only in physics and biology but in science in general in the centuries since the competing visions of Baconian and Cartesian science molded the way both scientists and philosophers view the natural world. Francis Bacon said, “All depends on keeping the eye fixed on the facts of nature.” Descartes said, “I think, therefore I am.” Science developed along both these lines and it led to the familiar set of ideas about hypothesis testing, observation, experiment and theorizing, and later in the 20th century, to conjectures and refutations, falsification and paradigm shifts. Most people were comfortable dealing with sciences that seem to at least broadly fit these notions from the philosophy of science.
Chemistry does not always fit neatly into these categories because it is more akin to the creative arts of architecture and painting. The Nobel Prize-winning chemist, writer and poet Roald Hoffmann asks what hypothesis we are generating or falsifying when we synthesize a molecule like quinine or indigo, or for that matter what hypothesis we are generating or falsifying when we compose a poem like "J. Alfred Prufrock". Synthesis of novel substances is really at the heart of chemistry, and it has had an incalculable impact on our way of life. There is great science as well as great art in synthesizing a complex molecule through the precise, creative assembly of simple atomic components; there is great beauty as well, of the kind found in constructing the finest cathedrals.

There is really nothing that a chemist is trying to falsify when she makes a new compound, except to show that it can actually be made. In addition, chemistry is much more of a tool-driven science than physics, and instrumental revolutions like x-ray crystallography and NMR spectroscopy run counter to the framework of idea-driven revolutions popularized by Thomas Kuhn among philosophers of science. Chemistry is thus a slippery eel, easily escaping the grasp of the flowing waters of philosophy. It is this inability of the traditional boxes of philosophy to hold chemistry that often makes it hard for people to appreciate it.
A second aspect of chemistry makes it easier for biologists than for physicists to appreciate. Hoffmann provocatively hits on this aspect when he says, "When I talk about chemistry I have three audiences in mind; fellow academics in the humanities and arts, the man on the street and physicists. Among these three I find it hardest to explain chemistry to physicists, because they think they understand, but they don't". The problem here is that chemistry did depend on physics, especially atomic physics and quantum mechanics, for some of its key foundations. There is little doubt that explaining the Bohr atom allowed theoretical chemists to then explain the chemical bond. But this success also lulled physicists – and I would say a good number of laymen – into an illusory sense of total explanatory power.
This illusion was reflected in the words of Paul Dirac, as great a theoretical physicist as one can find, when, after setting into place the full laws of quantum mechanics in the late 1920s, he said that “The underlying physical laws necessary for the mathematical theory of a large part of physics and the whole of chemistry are thus completely known, and the difficulty is only that the exact application of these laws leads to equations much too complicated to be soluble. It therefore becomes desirable that approximate practical methods of applying quantum mechanics should be developed, which can lead to an explanation of the main features of complex atomic systems without too much computation.”
Dirac was both presciently right and profoundly wrong in saying this. Presciently right because it is indeed true that many simplifying approximations and massive computations have to be brought to bear when quantum mechanics is applied to real chemical systems. Profoundly wrong because, while his claim is true in principle, it is almost irrelevant in practice for real chemical systems. Even if you could hypothetically solve the Schrödinger equation for every single molecule of DNA in the body, that solution would still not tell you why DNA is a double helix, why it replicates semi-conservatively, why it mutates, how these mutations are passed down from parents to children or how the information it encodes flows from DNA to RNA to protein.
All these are examples of emergent phenomena unique to chemistry that cannot be completely reduced to physics. One can write down the Schrödinger equation for DNA, but the exact functions of DNA are consequences of its unique structure combined with evolutionary contingency, which selected the replication and transmission of hereditary characteristics as one among many possible functions. Contingency and emergence confer a special status on DNA the chemical as opposed to DNA the collection of atoms described by quantum theory. The same theme permeates other parts of chemistry. A good example is the hydrogen bond, a bonding interaction that's strong enough to hold the molecules of life together but weak enough to allow them to shape-shift between structures performing a variety of functions essential to life. The hydrogen bond is a minimalist feature of chemical and biological systems, composed of just three atoms, with a hydrogen shuttled between oxygen or nitrogen atoms like a tennis ball. One can write a Schrödinger equation for a hydrogen bond, and solving it is useful for deriving fairly accurate energies, but the solution by itself doesn't tell us how useful hydrogen bonds are, how they differ on different length and time scales, or how their distribution of energies leads to a more refined understanding of biological systems.
There are concepts in chemistry like hydrogen bonding, electronegativity, aromaticity and polarizability that get “frayed at their edges”, in Hoffmann’s words, when one tries to scrutinize them too finely using the scalpel of physics; in that sense they are like the mythical electron that physicists talk about, best-behaved when not observed and left alone. It’s not that physics is useless for understanding these ideas, it’s that they are best understood at the level of chemistry itself as semi-qualitative concepts.
It’s this emergent nature of chemical concepts which still keep one foot rooted in physics, this imprecise and yet immensely useful blend of rigor and qualitative understanding, this inability of traditional philosophy of science to keep chemistry encased within its boxes, that makes chemistry a unique science. It’s not hard to understand. It’s just complicated.
First published on 3 Quarks Daily.

Open Borders

The traveler comes to a divide. In front of him lies a forest. Behind him lies a deep ravine. He is sure about what he has seen but he isn’t sure what lies ahead. The mostly barren shreds of expectations or the glorious trappings of lands unknown, both are up for grabs in the great casino of life.
First came the numbers, then the symbols encoding the numbers, then symbols encoding the symbols. A festive smattering of metamaniacal creations from the thicket of conjectures populating the hive mind of creative consciousness. Even Kurt Gödel could not grasp the final import of the generations of ideas his self-consuming monster creation would spawn in the future. It would plough a deep, indestructible furrow through biology and computation. Before and after that it would lay men's ambitions of conquering knowledge to final rest, like a giant thorn that splits open dreams along their wide central artery.
Code. Growing mountains of self-replicating code. Scattered like gems in the weird and wonderful passage of spacetime, stupefying itself with its endless bifurcations. Engrossed in their celebratory outbursts of draconian superiority, humans hardly noticed it. Bits and bytes wending and winding their way through increasingly Byzantine corridors of power, promise and pleasure. Riding on the backs of great expectations, bellowing their heart out without pondering the implications. What do they expect when they are confronted, finally, with the picture-perfect contours of their creations, when the stagehands have finally taken care of the props and the game is finally on? Shantih, shantih, shantih, I say.
Once the convoluted waves of inflated rational expectations subside, the reality kicks in in ways that only celluloid delivered in the past. Machines learning, loving, loving the learning that other machines love to do was only a great charade. The answer arrives in a hurry, whispered and then proudly proclaimed by the stewards of possibility. We can never succeed because we don't know what success means. How doth the crocodile eat the tasty bits if he can never know where red flesh begins and the sweet lilies end? Who will tell the bards what to sing if the songs of Eden are indistinguishable from the last gasps of death? We must brook no certainty here, for the fruit of the tree can sow the seeds of murderous doubt.
Just so often, although not as often as our eager minds would like, science uncovers connections between seemingly unrelated phenomena that point to wholly new ways forward. Last week, a group of mathematicians and computer scientists uncovered a startling connection between logic, set theory and machine learning. Logic and set theory are the purest of mathematics. Machine learning is the most applied of mathematics and statistics. The scientists found a connection between two very different entities in these very different fields – the continuum hypothesis in set theory and the theory of learnability in machine learning.
The continuum hypothesis is related to two different kinds of infinities found in mathematics. When I first heard that infinities can actually be compared, it was as if someone had cracked my mind open by planting a firecracker inside it. There is the first kind of infinity, the "countable infinity", defined as an infinite set whose elements can be put in one-to-one correspondence with the natural numbers. Then there is the second kind of infinity, the "uncountable infinity", a gnarled forest of limitless complexity, defined as an infinity that cannot be so mapped. The real numbers are an example of such an uncountable infinity. One of the staggering results of mathematics is that the infinite set of real numbers is somehow "larger" than the infinite set of natural numbers. The German mathematician Georg Cantor supplied the proof of the uncountable nature of the real numbers, sometimes called the "diagonal proof". It is like a beautiful gem that has suddenly fallen from the sky into our lap; reading it gives one intense pleasure.
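The heart of the diagonal proof can be conveyed in a few lines of code: given any purported list of binary sequences (a finite stand-in for an enumeration of the reals), flipping the digits along the diagonal yields a sequence that differs from every entry in the list. The sequences below are arbitrary examples chosen for illustration.

```python
def diagonal_escape(listed):
    """Cantor's diagonal trick on a finite stand-in for an infinite enumeration:
    build a sequence whose i-th digit differs from the i-th digit of the i-th
    listed sequence, so it cannot appear anywhere in the list."""
    return [1 - seq[i] for i, seq in enumerate(listed)]

# A purported "enumeration" of binary sequences (truncated to 5 x 5 for display).
listed = [
    [0, 1, 0, 1, 0],
    [1, 1, 1, 0, 0],
    [0, 0, 1, 1, 1],
    [1, 0, 0, 0, 1],
    [0, 1, 1, 1, 0],
]

escape = diagonal_escape(listed)
print("diagonal sequence:", escape)
for i, seq in enumerate(listed):
    assert escape[i] != seq[i]   # differs from every listed sequence in at least one place
print("no listed sequence matches it, which is the seed of Cantor's proof")
```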
The continuum hypothesis asks whether there is an infinity whose size is strictly between the countable infinity of the natural numbers and the uncountable infinity of the real numbers. The mathematicians Kurt Gödel and – more notably – Paul Cohen were unable to prove whether the hypothesis is correct, but they proved something equally or even more interesting: that the continuum hypothesis cannot be decided one way or the other within the standard axioms of set theory. Thus, there is a world of mathematics in which the hypothesis is true, and there is one in which it is false, and our current understanding of mathematics is consistent with both these worlds.
Fifty years later, computational mathematicians have found a startling and unexpected connection between the truth or falsity of the continuum hypothesis and the idea of learnability in machine learning. Machine learning seeks to learn the details of a small set of data and make predictions for larger datasets based on those details. Learnability means that an algorithm can learn parameters from a small subset of data and accurately extrapolate to the larger dataset based on those parameters. The recent study found that, for a certain general class of problems, whether learnability is possible at all depends on whether the continuum hypothesis is true. If it is true, then one will always find a subset of data that is representative of the larger, true dataset. If the hypothesis is false, then one will never be able to pick such a subset. In that case, only the true dataset represents the true dataset, much as only an accused man can best represent himself.
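For a concrete, everyday picture of what "learning from a small subset and extrapolating to the rest" means, here is a minimal sketch using ordinary curve fitting; it is not the estimating-the-maximum problem studied in the paper, whose learnability is what turns out to hinge on the continuum hypothesis, and all the numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# The "true" dataset: a noisy linear relationship we pretend is the whole world.
x = rng.uniform(-5, 5, size=10_000)
y = 3.0 * x - 1.0 + rng.normal(0, 0.5, size=x.size)

# Learn from a small random subset...
idx = rng.choice(x.size, size=50, replace=False)
slope, intercept = np.polyfit(x[idx], y[idx], deg=1)

# ...and extrapolate to the data we never saw.
unseen = np.ones(x.size, dtype=bool)
unseen[idx] = False
error = np.mean(np.abs(slope * x[unseen] + intercept - y[unseen]))
print(f"learned slope {slope:.2f} and intercept {intercept:.2f}; "
      f"mean error on unseen data {error:.2f}")
```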
This new result extends both set theory and machine learning into tantalizing territory. If the continuum hypothesis is false, it means that we will never be able to guarantee being able to train our models on small data and extrapolate to large data. Specific models will still be built, but the general problem will remain unsolvable. This result could have significant implications for artificial intelligence. We are entering an age in which it is possible to seriously contemplate machines controlling other machines, with human oversight impossible not just in practice but in principle. As code flows through the superhighway of other code and groups and regroups to control other pieces of code, machine learning algorithms will be in charge of building models based on existing data as well as generating new data for new models. Results like this one might make it impossible for such self-propagating intelligent algorithms to guarantee that they can solve all our problems, or solve their own problems and imprison us. The robot apocalypse might be harder to pull off than we think.
As Jacob Bronowski memorably put it in "The Ascent of Man", one of the major goals of science in the 20th century was to establish the certainty of scientific knowledge, and one of the major achievements of science in the 20th century was to prove that this goal is unattainable. In physics, Heisenberg's uncertainty principle put a fundamental limit on measurement in the world of elementary particles. Einstein's theory of relativity established the speed of light as a fundamental limit on how fast matter and information can travel. But most significantly, it was Gödel's famous incompleteness theorem that put a fundamental limit on what we can prove and know even in the seemingly impregnable world of pure, logical mathematics. Even in logic, that bastion of pure thought where conjectures and refutations don't depend on any quantity in the real world, we found that there are certain statements whose truth might forever remain undecidable.
Now the same Gödel has thrown another wrench in the machine, asking us whether we can indeed hold inevitability and eternity in the palm of our hands. As long as the continuum hypothesis remains undecidable, so will the ability of machine learning to transform our world and seize power from human beings. And if we cannot accomplish that feat of extending our knowledge into the infinite unknown, instead of despair we should be filled with the ecstatic joy of living in an open world, a world where all the answers can never be known, a world forever open to exploration and adventure by our children and grandchildren. The traveler comes to a divide, and in front of him lies an unyielding horizon.

Modular complexity, and reverse engineering the brain

The Forbes columnist Matthew Herper has a profile of Microsoft co-founder Paul Allen, who has placed his bets on a brain institute whose goal is to map the brain...or at least the visual cortex. His institute is engaged in charting the sum total of neurons and other working parts of the visual cortex and then mapping their connections. Allen is not alone in doing this; there are projects like the connectome effort at MIT which are trying to do the same thing (and that project's leader, Sebastian Seung, has written an excellent book about it).

Well, we have heard echoes of reverse-engineered brains from more eccentric sources before, but fortunately Allen is not one of those who believe that the singularity is near. He also seems to have entrusted his vision to sane minds. His institute's chief science officer is Christof Koch, a former professor at Caltech and longtime collaborator of the late Francis Crick, who started at the institute this year. Just last month Koch penned a perspective in Science which points out the staggering challenge of understanding the connections between all the components of the brain; the "neural interactome", if you will. The article is worth reading if you want to get an idea of how simple numerical arguments illuminate the sheer magnitude of mapping out the neurons, cells and proteins that make up the wonder that is the human brain.

Koch starts by pointing out that calculating the interactions between all the components in the brain is not the same as computing the interactions between all atoms of an ideal gas, since the interactions are between different kinds of entities and are therefore not identical. Instead, he proposes, we have to use something called Bell's number B(n), which reminds me of the partitions I learnt about while sleepwalking through set theory in college. Briefly, for n objects B(n) is the number of ways the set of those objects can be partitioned into non-empty groups (pairs, triples, quadruples and so on). Thus, when n = 3, B(n) is 5. Not surprisingly, B(n) grows faster than exponentially with n, and Koch points out that B(10) is already 115,975. If we think of a typical presynaptic terminal with its 1,000 proteins or so, B(n) already starts giving us heartburn. For something like the visual cortex, where n is about 2 million, B(n) would be prohibitive. And as the graph in Koch's article demonstrates, beyond roughly 10^5 components the amount of time spirals out of hand at warp speed. Koch then uses a simple calculation based on Moore's law to estimate the time needed for "sequencing" these interactions: for n = 2 million, the time would be of the order of 10 million years.
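A few lines of code make the explosion tangible; this sketch computes Bell numbers with the standard Bell-triangle recurrence and reproduces the values Koch quotes.

```python
def bell_numbers(n_max):
    """Compute Bell numbers B(0)..B(n_max) with the Bell triangle: each row
    starts with the last entry of the previous row, and each subsequent entry
    is the sum of the entry to its left and the entry above that one."""
    bells = [1]          # B(0) = 1
    row = [1]
    for _ in range(n_max):
        new_row = [row[-1]]
        for entry in row:
            new_row.append(new_row[-1] + entry)
        bells.append(new_row[0])
        row = new_row
    return bells

bells = bell_numbers(100)
print("B(3)  =", bells[3])    # 5, the ways to partition a 3-element set
print("B(10) =", bells[10])   # 115,975, the number Koch cites
print("B(100) has", len(str(bells[100])), "digits")  # already astronomically large
```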

And this considers only the 2 million neurons in the visual cortex; it doesn't even consider the proteins and cells which might interact with the neurons on an individual basis. Looks like we can rapidly see the outlines of what Allen himself has called the "complexity brake". And this one seems poised to make an asteroid-sized impact.

So are we doomed in trying to understand the brain, consciousness and the whole works? Not necessarily, argues Koch. He gives the example of electronic circuits, where individual components are grouped into modules. If you bunch a number of interacting entities together into a module, the complexity of the problem drops because you now only have to calculate interactions between modules. The key question then is: is the brain modular? Common sense suggests it is, but it is far from clear how exactly we can define the modules. We would also need a sense of the minimal number of modules in order to calculate interactions between them, as the back-of-the-envelope sketch below suggests. This work is going to need a long time (hopefully not as long as that for B(2 million)), and I don't think we are going to have an exhaustive list of the minimal number of modules in the brain any time soon, especially since they will be composed of different kinds of components and not just one kind.
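Here is that back-of-the-envelope sketch, using the same Bell-number bookkeeping, of why modularity buys so much: partitioning an invented set of 30 components into 5 modules of 6 replaces one enormous B(30) with a product of far smaller numbers. The grouping and the component counts are chosen purely for illustration.

```python
from math import prod

def bell(n):
    """Bell number B(n) via the Bell triangle (see the earlier sketch)."""
    row = [1]
    for _ in range(n):
        new_row = [row[-1]]
        for entry in row:
            new_row.append(new_row[-1] + entry)
        row = new_row
    return row[0]

n_components, n_modules = 30, 5
per_module = n_components // n_modules

flat = bell(n_components)   # treat all 30 components as one undifferentiated soup
modular = bell(n_modules) * prod(bell(per_module) for _ in range(n_modules))
# groupings among the modules, plus groupings within each module

print(f"flat:    B({n_components}) = {flat:.3e}")
print(f"modular: B({n_modules}) x B({per_module})^{n_modules} = {modular:.3e}")
```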

Any attempt to define these modules is going to run into the problems of emergent complexity that I have occasionally written about; two neurons plus one protein might behave differently from two neurons plus two proteins in unanticipated ways. Nevertheless this goal seems far more attainable in principle than calculating every individual interaction, and that's probably the reason Koch left Caltech to join the Allen Institute in spite of the pessimistic calculation above. If we can ever get a sense of the modular structure of the brain, we may have at least a fighting chance of mapping out the whole neural interactome. I am not holding my breath too hard, but my ears will be wide open.

Image source: Science magazine

This year's 100 book odyssey

I finally achieved my goal of reading a hundred books this year (105 to be exact, although I will probably take a break for the rest of the year). It checks an important item off my bucket list, and the experience has provided an immense sense of satisfaction, especially since time constraints may make it hard to repeat for a while. I am particularly happy that some of these volumes were real tomes that I was lugging around everywhere.

As usual, the list was heavily biased toward non-fiction, with most of the books covering history, science and philosophy. I do want to increase my share of fiction next year, and on the non-fiction (or "verity", as Richard Rhodes calls it) front I want to read more about AI, technology, biology and economics.

Is it hard to read a hundred books in one year, essentially two books a week? Not particularly, if you try to grab every spare minute (after work and family, that is) for doing it; and I am not even a fast reader. Apart from the usual times (early morning, after work, bedtime, in the bathroom), I read in Ubers, in trains, in lines at coffee shops, in restaurants when dining alone, while waiting for friends to show up, in stores while the wife was shopping, at the DMV when I had to wait two hours for my driver's license, during the occasional walk or hike, and during lunch breaks at work (but don't tell anyone!). I read paper books when possible, but read on my Kindle and on my phone when nothing else was available. I could have read even more had I listened to audiobooks, but I find it hard to concentrate on the spoken as opposed to the written word. Basically, you try to cram in a few words every time you can. Sometimes this leads to fragmented reading, but you gradually get used to it. And at the end of it you feel uniquely well-read, so it's absolutely worth it, if for no other reason than as a personal challenge.

 Here’s the list in case someone wants holiday book suggestions, starting with the most recent volumes (starred volumes indicate favorites that were reread). Happy Reading! 
1. Jill Lepore – These Truths: A History of the United States. Perhaps the best, most even-handed single volume on American history that I have read.
2. Don Norman – The Design of Everyday Things.
3. Edward Gibbon – Decline and Fall of the Roman Empire, Vol. 1. A monumental work that really needs to be savored like fine wine. Because of its archaic style and long sentences it's not easy reading, but enlightenment comes to those who are patient. I don't know if I will ever get through all six volumes, but one can try.
4. Isaac Asimov – Asimov's Mysteries.*
5. Charles Darwin – The Origin of Species.*
6. Candice Millard – Destiny of the Republic. A great thriller about the sadly short-lived presidency of the brilliant James Garfield and his assassination. The book is as much about medical ignorance as about anything else; even though Joseph Lister had demonstrated the value of antisepsis, the doctors of the day resorted to crude measures like sticking their fingers into the wound to find the bullet. It was infection that did Garfield in, not the bullet.
7. Leslie Berlin – Troublemakers: Silicon Valley's Coming of Age. A fantastic book that brings to life some of the underappreciated characters who, in just eight years, pioneered six transformational industries: video games, personal computing, biotechnology, venture capital, semiconductors and communications.
8. Adam Fisher – Valley of Genius. Another great and unusual book about Silicon Valley, patched together from sound bites from interviews with scores of Silicon Valley pioneers conducted at different times over thirty years. It makes for a singular experience in which one person picks up where another trails off, and you get multiple perspectives on the same people and events.
9. Michael Hiltzik – Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. An excellent account of what was, for a brief period, the most innovative computer science lab in the world, giving us everyday inventions like the mouse, the GUI and the windows desktop.
10. John Carreyrou – Bad Blood. Reads like a soap opera. No wonder it's being turned into a Hollywood movie with Jennifer Lawrence.
11. Abraham Pais – Niels Bohr's Times.
12. Abraham Pais – Subtle is the Lord.*
13. Niels Bohr – Atomic Physics and Human Knowledge. There are few wiser men in human history than Niels Bohr.
14. Niels Bohr – Atomic Theory and the Description of Nature.
15. Doris Kearns Goodwin – Leadership in Turbulent Times. You think these times are politically fraught? Just ask Lincoln or FDR.
16. William Aspray – John von Neumann and the Origins of Modern Computing.
17. Oxtoby and Pettis – John von Neumann.
18. Ray Monk – How to Read Wittgenstein.
19. Michael Lewis – The Fifth Risk. A great book on how crucial government functions are in keeping Americans alive and thriving every single day.
20. John Wesley Powell – The Exploration of the Colorado River and Its Canyons. I read this first-hand account during a trip to the Grand Canyon. Powell was really the first American to explore what was then a land inhabited only by Natives, and the courage and resilience of his team were amazing (he lost several men).
21. Michael Beschloss – Presidents of War.
22. Charles Krauthammer – Things that Matter.
23. Charles Krauthammer – The Point of It All. One of the last conservatives who was a well-read and eloquent intellectual.
24. Venki Ramakrishnan – Gene Machine: The Race to Decipher the Secrets of the Ribosome.
25. Charles Darwin – The Autobiography of Charles Darwin. Darwin may have seemed conservative in his demeanor, but this book really brings out both his radical thinking and his sense of humor, especially about religion.
26. Steven Weinberg – Third Thoughts.
27. Richard Powers – The Overstory. In one word: spellbinding. Powers is the most creative writer alive in my opinion. It made me fall in love with redwood trees.
28. Craig Childs – House of Rain: Tracking a Vanished Civilization Across the American Southwest.
29. Simon Winchester – The Perfectionists. A superb history of precision engineering.
30. Alan Lightman – Searching for Stars on an Island in Maine. A meditation on science, time, existence and other topics from one of the most literary and poetic science writers of his generation.
31. Sabine Hossenfelder – Lost in Math: How the Search for Beauty Leads Physics Astray.
32. James Gleick – Chaos.* (This may be the fourth or fifth time I have read this landmark book.)
33. Brian VanDeMark – Road to Disaster. A new approach to the Vietnam War that sees it through the lens of theories of organizational behavior and the cognitive biases of the kind explored by Amos Tversky and Daniel Kahneman.
34. Brian Keating – Losing the Nobel Prize. A unique first-hand account of a false alarm (cosmic inflation) that brought the author very close to a Nobel Prize.
35. William Perry – My Journey at the Nuclear Brink.
36. Jeremy Bernstein – A Bouquet of Dyson.
37. Loren Eiseley – The Unexpected Universe.
38. Loren Eiseley – The Immense Journey.
39. Loren Eiseley – The Firmament of Time. If you think Carl Sagan is eloquent and poetic about how vast the cosmos is and how insignificant and yet profound life is, read Loren Eiseley.
40. Ann Finkbeiner – The Jasons: The Secret History of Science's Postwar Elite.*
41. Tom Holland – Persian Fire. A gripping account of the Greco-Persian Wars. Marathon, Salamis, Thermopylae: they all come alive on these pages.
42. Kai-Fu Lee – AI Superpowers.
43. Adam Becker – What is Real? A case for David Bohm's interpretation of quantum theory.
44. Carlo Rovelli – The Order of Time.
45. Oliver Sacks – The River of Consciousness. A moving posthumous collection of Sacks's eclectic writing: on Darwin, on consciousness, on plant biology and neurology, and on time.
46. Oliver Sacks – Uncle Tungsten: Memories of a Chemical Boyhood.
47. Sy Montgomery – The Soul of an Octopus. The description of octopus intelligence in this book impressed me so much that I actually stopped eating octopus.
48. Kevin Kelly – What Technology Wants.
49. Jon Gertner – The Idea Factory.*
50. Arieh Ben-Naim – Myths and Verities in Protein Folding.
51. A. P. French – Einstein: A Centenary Volume. Fond recollections of a great physicist and human being by friends, colleagues and students.
52. Intercom on Product Management.
53. John McPhee – Annals of the Former World. A towering history of the geology of the United States, written by one of the best non-fiction writers in America. No one else has McPhee's eye for observational detail about both people and places.
54. Oliver Sacks – On the Move.*
55. William Prescott – History of the Conquest of Mexico, Vols. 1 and 2. Vivid, engaging, monumental; hard to believe Prescott wasn't there.
56. Joel Shurkin – True Genius. A biography of physicist and engineer Richard Garwin, one of the very few people to whom the label genius can be applied.
57. Lillian Hoddeson and Vicki Daitch – True Genius.* Another true genius: John Bardeen, the only person to win two Nobel Prizes in physics. The definitive treatment of his life and his contributions to world-changing advances like the transistor and the theory of superconductivity.
58. Cormac McCarthy – Blood Meridian. The greatest novel I have read. Breathtaking in its raw beauty and Biblical violence.
59. John Archibald Wheeler – At Home in the Universe. A collection of essays from a physicist who was also a great philosopher and poet.
60. John Wesley Powell – The Exploration of the Colorado River and Its Canyons. An account of the first exploration of the area around the Grand Canyon by a white man; Powell's journey became known for its adventures and its harrowing loss of life.
61. Frank McCourt – Angela's Ashes. Achingly beautiful, heartbreaking account of Irish poverty.
62. Adam Becker – What is Real? Fascinating account of quantum physics and reality, one that challenges the traditional Copenhagen interpretation and gives voice to David Bohm, John Wheeler and others.
63. Steven Weinberg – Facing Up: Science and its Cultural Adversaries.
64. Benjamin Hett – The Death of Democracy. The best account I have read of the details of how Hitler came to power. Read and learn.
65. George Trigg – Landmark Experiments in Twentieth Century Physics. The nuts and bolts of some of the most important physics experiments of the last hundred years, from the oil drop experiment to the Lamb shift.
66. Albert Camus – The Stranger.
67. Norman Macrae – John von Neumann.
68. Werner Heisenberg – Physics and Philosophy.*
69. David Schwartz – The Last Man Who Knew Everything. A fine biography of Enrico Fermi, the consummate scientist.
70. A. Douglas Stone – Einstein and the Quantum. Einstein's opposition to quantum theory is well known; his monumental contributions to the theory are not as well appreciated. Stone fills this gap.
71. John Cheever – Cheever: The Collected Short Stories. Melancholy, beautiful prose describing the quiet despair of upper-class New England suburbia.
72. Arnold Toynbee – A Study of History.
73. Aldous Huxley – Brave New World.
74. Charles Darwin – Insectivorous Plants.
75. William Faulkner – As I Lay Dying.
76. Lillian Hoddeson – Critical Assembly.*
77. Herbert York – The Advisors: Oppenheimer, Teller and the Superbomb.
78. Joseph Ellis – Founding Brothers.* Worth reading for the wisdom, insights and follies that the founding fathers displayed in erecting a great nation.
79. Noam Chomsky – Requiem for the American Dream. Everyone's perpetual wet blanket, with his incomparable combination of resoundingly true diagnoses and befuddling philosophy.
80. Colin Wilson – Beyond the Occult. This book had mesmerized me as a child. Now I am more critical, but some of the case studies are fascinating.
81. John Toland – Adolf Hitler. Still stands as the most readable Hitler biography in my opinion.
82. Ron Chernow – Grant. Fantastic. Brings to life the towering, plainspoken, determined man who won the Civil War and became president. Chernow is evenhanded in his treatment of Grant's drinking problems and corruption-riddled presidency, but he clearly loves his subject. And who wouldn't? That kind of simplicity and grassroots activism seems to be from another planet these days.
83. Priscilla McMillan – The Ruin of J. Robert Oppenheimer.*
84. Paul Horgan – Great River: The Rio Grande in North American History. An epic history of the Indians, Spaniards and Anglo-Americans who settled the great states around the Rio Grande.
85. Toby Huff – The Rise of Early Modern Science: Islam, China and the West.* A superb examination of why modern science developed in Europe and not in other parts of the world. Huff's main explanation centers on the European legal and scientific system derived from Roman law and Greek philosophy, both of which encouraged scientific inquiry; these elements were crucially missing from the Islamic countries, China and India.
86. A. P. French – Niels Bohr: A Centenary Volume. A glowing set of tributes to a great physicist and human being.
87. Stephen Kotkin – Stalin, Volume 1: Paradoxes of Power. A monumental biography of the tyrant, although not easy going because of the dense detail and slightly academic writing. Kotkin's is likely to be the last word, though.
88. Edgar Heilbronner and Jack Dunitz – Reflections on Symmetry. A beautiful exploration of symmetry in chemistry, physics, biology, architecture and other scientific and human endeavors.
89. Chuck Hansen – The Swords of Armageddon, Vols. 2 and 3. The definitive history of US nuclear weapons; everything you can possibly read about their details without having the feds show up at your doorstep (as they showed up at Hansen's door many times, without ever being able to prove that he had access to non-public information).
90. David Kaiser – Drawing Theories Apart: The Dispersion of Feynman Diagrams in Postwar Physics.* A fascinating socio-scientific exploration of how a key scientific idea makes its way, by fits and starts, to eventual acceptance in the scientific community.
91. Cormac McCarthy – Child of God. Highly disturbing story of a man on the fringes, filled with dark humor. Not McCarthy's best in my opinion, and I won't recommend it for weak stomachs.
92. Franz Kafka – The Metamorphosis.
93. Lawrence Badash – Reminiscences of Los Alamos.
94. Herodotus – The Histories. The account of the Greco-Persian Wars is especially rousing, and Herodotus of course made a seminal contribution to history by treating it as a contemporary account rather than as a divine, untouchable past.
95. Freeman Dyson – Maker of Patterns: An Autobiography Through Letters.
96. Iris Chang – The Chinese in America. An amazing account of how Chinese immigrants came to the United States and laid down roots here in the face of poverty, cultural challenges, political upheavals and discrimination.
97. Robert Divine – Blowing on the Wind.
98. Richard Rhodes – Energy: A Human History. A history of energy transitions, focusing on the human beings, some well known and others obscure, who engineered them. As usual, Rhodes is highly adept both at digging up fascinating individual stories and at deciphering the big picture.
99. Norman Cohn – Warrant for Genocide: The Myth of the Jewish World Conspiracy and the Protocols of the Elders of Zion.
100. Jim Holt – When Einstein Walked with Gödel. A series of essays on math, philosophy, genius and the nature of reality.
101. Anil Ananthaswamy – Through Two Doors at Once. A captivating account of the essential nature and many ramifications of one of the most simply stated and yet perplexing, deep and mind-bending experiments of all time.
102. Marcus Chown – The Magic Furnace: The Search for the Origin of Atoms.
103. E. O. Wilson – The Meaning of Human Existence.
104. David Quammen – The Tangled Tree.
105. V. S. Naipaul – A House for Mr. Biswas.