
Book Review: Jacob Bronowski's "The Origins of Knowledge and Imagination"

The late Jacob Bronowski was one of a handful of people in the 20th century who were true Renaissance men, with a grasp of all intellectual endeavors, from the science of Newton to the art of Blake. But more importantly, Bronowski was also a great humanist who understood well the dangers of dogma and the importance of ethics in securing the contract between science and society. His TV series and book, “The Ascent of Man”, is an eloquent testament and an essential resource for anyone who wants to understand the history of science and its relationship to politics and society. His plea to all of us – delivered on a rainy, gloomy day in Auschwitz – to regard no knowledge as final, to regard everything as open to doubt, is one of the great statements on science and ethics of our times. Bronowski had an unusual command of the English language; perhaps because English was not his first language, his style acquired a simplicity and a direct, hard-hitting eloquence that often escapes native English speakers. In this sense, I find Bronowski to be the Joseph Conrad of his discipline.

In this book Bronowski takes on a different topic – an inquiry into the meaning of knowledge and our means of acquiring it. The book is based on the Silliman Lectures at Yale and is more academic than “The Ascent of Man”, but it is no less wide-ranging. Bronowski tells us how all objective knowledge is essentially derived, a point illustrated by a description of how the eye and brain conspire together to build a fine-grained picture of coarse-grained reality. He drives home the all-important lesson that every experiment we do on nature is made possible only by making a "cut" that isolates the system under investigation from the rest of the universe. Thus, when we do science we are forced to sacrifice the connectivity of the universe, along with whatever knowledge lies outside the cut. This fact by itself shows us that scientific knowledge is at best going to be an approximation.

If Bronowski were alive today, I feel certain that this discussion would turn into one about models, and especially computer models. All knowledge is essentially a model, and understanding the strengths and limitations of models helps us understand the limitations of objective knowledge. Now, knowledge has little value if it cannot be understood and communicated through language, and a good part of the book is devoted to the differences between human and animal language that make human language special. One difference that I hadn’t quite appreciated is that humans can break sentences down into words that can be rearranged, while animals essentially speak in whole sentences, and even then these sentences communicate instruction rather than information.

The most important part of the book, in my opinion, is the second half. Here Bronowski solidifies the theme of understanding the world through limited models and really drives home the open and uncertain nature of all knowledge. There are a few essential springboards here: Bertrand Russell’s difficulties with paradoxes in mathematics, Alan Turing’s negative solution to the halting problem (the question of whether there is an algorithm that can tell us for certain whether an arbitrary program on a Turing machine will halt) and finally, the twenty-four-year-old Kurt Gödel’s stunning incompleteness theorem. Bronowski ties together the themes explored in his lectures by making the ability of linguistic and mathematical systems to engage in self-reference the centerpiece of his arguments. Creating self-referential systems was one of the powerful tools Gödel used in his seminal work.
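For readers who want to see this self-reference in concrete form, here is a minimal sketch of my own – not something taken from Bronowski's lectures – of the contradiction at the heart of Turing's halting-problem result. The function halts() is the hypothetical oracle that the argument knocks down:

```python
# A sketch of the self-referential argument behind Turing's halting theorem.
# 'halts' is a hypothetical oracle; the contradiction below shows it cannot exist.

def halts(program, argument):
    """Hypothetical: return True iff program(argument) would eventually halt.
    Turing proved that no algorithm can implement this for all programs."""
    raise NotImplementedError("no general halting checker can exist")

def paradox(program):
    """Feed a program to itself -- the self-reference Bronowski highlights."""
    if halts(program, program):
        while True:        # oracle says "halts"? then loop forever
            pass
    else:
        return             # oracle says "loops forever"? then halt at once

# Now ask: does paradox(paradox) halt? Whatever the oracle answers, paradox
# does the opposite, so a correct 'halts' function is impossible.
```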

In Bronowski’s opinion, we discovered the limits of our knowledge when, in our attempts to turn mathematics and logic into closed systems, we ran into fundamental difficulties created by the fact that every such system generates paradoxes through self-reference. Self-reference dooms attempts by human beings to gain complete knowledge of any system. This ensures that our investigations will always remain open, an openness we must recognize as an indelible feature, not a bug. Our political life too is predicated on openness, and not recognizing the open nature of systems of government can lead to real, as opposed to merely intellectual, pain and grief. The history of the twentieth century underscores this point all too well.

There is an anecdote at the end of the book which I thought illustrates Bronowski’s appeal to openness quite well, although perhaps not in the way he intended. It’s also particularly relevant to our present trying times. Bronowski tells a story about emigrating to the United States from England during a politically fraught time in the 1950s, when everyone who dissented from orthodoxy was suspected of being a subversive or a traitor. When he arrived at the port of New York City, an Irish-American policeman insisted on examining the books he was carrying. Bronowski had written a well-regarded book on Blake. The policeman took the book, flipped a few pages and asked, “You write this, bud?” “Yes.” He said, “Psshh, this ain’t never going to be no bestseller!” And here’s Bronowski’s take on the policeman, with which he ends the book: “So long as there are Irish policemen who are more addicted to literary criticism than to legalisms, the intellect will not perish”. Unfortunately, as has become apparent in today’s political environment, the kind of dissent and rugged individualism that has been the hallmark of the American experiment could be said to have swung too far. Perhaps the Irish policeman was exercising his right to individual dissent, but perhaps it also meant that he was simply too ignorant to have understood the book, irrespective of its commercial prospects. Perhaps, ironically enough, the Irish policeman might have benefited from a dose of the spirit of open inquiry that Bronowski extols so well in these lectures.

Is Big Data shackling mankind's sense of creative wonder?

This is my latest monthly column for the site 3 Quarks Daily. 

Primitive science began when mankind looked upward at the sky and downward at the earth and asked why. Modern science began when Galileo and Kepler and Newton answered these questions using the language of mathematics and started codifying the answers into general scientific laws. Since then scientific discovery has been constantly driven by curiosity, and many of the most important answers have come from questions of the kind asked by a child: Why is the sky blue? Why is grass green? Why do monkeys look similar to us? How does a hummingbird flap its wings? With the powerful tool of curiosity came the even more powerful fulcrum of creativity around which all of science hinged. Einstein’s imagining himself on a light beam was a thoroughly creative act; so were Ada Lovelace’s vision of a calculating machine doing something beyond mere calculation, James Watson and Francis Crick’s DNA model-building exercise, and Enrico Fermi’s sudden decision to put a block of paraffin wax in the path of neutrons.

What is common to all these flights of fancy is that they were spontaneous, often spur-of-the-moment, informed at best by meager data and mostly by intuition. If Einstein, Lovelace and Fermi had paused to reconsider their thoughts because of the absence of hard evidence or statistical data, they might at the very least have been discouraged from exploring these creative ideas further. And yet that is what I think the future Einsteins and Lovelaces of our day are in danger of doing. They are in danger of doing this because they increasingly live in a world where statistics and data-driven decisions are becoming the beginning and end of everything, where young minds are constantly cautioned not to speculate before they have enough data.

We live in an age where Big Data, More Data and Still More Data seem to be all-consuming, looming over decisions both big and mundane, from driving to ordering pet food to getting a mammogram. We are told that we should not make any decision without substantiating it through statistics and large-scale data analysis. Now, I will be the first to advocate making decisions based on data and statistics, especially in an era when sloppy thinking and speculation based on incomplete or non-existent data seem to have become the very air that the media and large segments of the population breathe. Statistical thinking in particular has been found to be both paramount in decision-making and sorely lacking in practice, and books like Daniel Kahneman’s “Thinking, Fast and Slow” and Nate Silver’s “The Signal and the Noise” have stressed how humans are intrinsically bad at probabilistic and statistical thinking and how this weakness leads them to consistently make wrong decisions. It seems that a restructuring of our collective thinking process that is grounded in data would be a good thing for everyone.
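To make that weakness concrete, here is a small illustration of my own – using made-up numbers, not figures drawn from Kahneman or Silver – of the base-rate arithmetic behind a screening test like a mammogram, the kind of calculation most of us get badly wrong by intuition:

```python
# Illustrative base-rate calculation with assumed (hypothetical) numbers:
# even a fairly accurate screening test yields mostly false positives
# when the condition being screened for is rare.

prevalence = 0.01            # assumed: 1% of those screened have the disease
sensitivity = 0.90           # assumed: test detects 90% of true cases
false_positive_rate = 0.08   # assumed: 8% of healthy people test positive

# Bayes' theorem: P(disease | positive test)
p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"Chance a positive result means disease: {p_disease_given_positive:.1%}")
# Prints roughly 10% -- far lower than most people's intuitive guess.
```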

But there are inherent problems with implementing this principle, quite apart from the severe limitations on creative speculation that an excess of data-based thinking imposes. Firstly, except in rare cases, we simply don’t have all the data necessary for making a good decision. Data itself is not insight; it’s simply raw material for insight. This problem is seen in the nature of the scientific process itself; in the words of the scientist and humanist Jacob Bronowski, in every scientific investigation we decide where to make a “cut” in nature, a cut that isolates the system of interest from the rest of the universe. Even late into the process, we can never truly know whether the part of the universe we have left out is relevant. Our knowledge of what we have left out is thus not just a “known unknown” but often an “unknown unknown”. Secondly, and equally importantly, the quality of the data often takes a back seat to its quantity; too many companies and research organizations seem to think that more data is always good, even when more data can mean more bad data. Thirdly, even with a vast amount of data, human beings are incapable of digesting this surfeit and making sure that their decisions account for all of it. And fourthly, and most importantly, making decisions based on data is often a self-fulfilling prophecy; the hypotheses we form and the conclusions we reach are inherently constrained by the data. We get obsessed with the data we have, develop tunnel vision, and ignore the importance of the data we don’t have. This means that our results are only ever going to be as good as the existing data.

Consider a seminal basic scientific discovery like the detection of the Higgs boson, nearly fifty years after the prediction was made. There is little doubt that this was a supreme achievement, a technical tour de force that came about only because of the collective intelligence and collaboration of hundreds of scientists, engineers, technicians, bureaucrats and governments. The finding was of course a textbook example of how everyday science works: a theory makes a prediction and a well-designed experiment confirms or refutes the prediction. But how much more novelty might the LHC have found had its parameters been significantly tweaked, had the imaginations of its operators been set loose? Maybe it would not have found the Higgs then, but it might have discovered something wholly different and unexpected. There would certainly have been more noise, but there would also have been more signal that could have led to discoveries which nobody predicted and which might have charted new vistas in physics. One of the major complaints about modern fundamental physics, especially in areas like string theory, is that it is experiment-poor and theory-rich. But experiments can only find something new when they don’t stay too close to the theoretical framework. You cannot always let prevailing theory dictate what experiments should do.

The success of the LHC in finding the Higgs and nothing but the Higgs points to the self-fulfilling prophecy of data that I mentioned: the experiment was set up to find or disprove the Higgs and the data contained within it the existence or absence of the Higgs. True creative science comes from generating hypotheses beyond the domain of the initial hypotheses and the resulting data. These hypotheses have to be confined within the boundaries of the known laws of nature, but there still has to be enough wiggle room to at least push against these boundaries, if not try to break free of them. My contention is that we are gradually becoming so enamored of data that it is clipping and tying down our wings, not allowing us to roam free in the air and explore daring new intellectual landscapes. It’s very much a case of the drunk under the lamppost, looking for his keys there because that’s where the light is.

A related problem with the religion of “dataism” is the tendency to dismiss anything that constitutes anecdotal evidence, even if it can lead to creative exploration. “Yes, but that’s an n of 1” is a refrain you must have heard from many a data-entranced statistics geek. It’s important not to regard anecdotal evidence as sacrosanct, but it’s equally wrong, in my opinion, to simply dismiss it and move on. Isaac Asimov reminded us that great discoveries in science are made when an odd observation or fact makes someone go, “Hmm, that’s interesting”. But if instead the reaction is going to be “Interesting, but that’s just an n of 1, so I am going to move on”, you are potentially giving up on hidden gems of discovery.

With anecdotal data also comes storytelling, which has always been an integral part not just of science but of the human experience. Both arouse our sense of wonder and curiosity; we are left fascinated and free to imagine and explore precisely because of the paucity of data and the lone voice from the deep. Very few scientists and thinkers drove home the importance of taking anecdotal storytelling seriously as well as the late Oliver Sacks. Every one of Sacks’s books is populated with fascinating stories of individual men and women with neurological deficits or abilities that shed valuable light on the workings of the brain. If Sacks had dismissed these anecdotes as insufficiently data-rich, he would have missed discovering the essence of important neurological disorders. Sacks also extolled the value of looking at historical data, another source of wisdom that is all too easily dismissed by hard scientists who regard all historical reports as suspect because they lack large-scale statistical validation. Sacks regarded historical reports as especially neglected and refreshingly valuable sources of novel insights; in his early days, his insistence that his hospital’s weekly journal club discuss the papers of their nineteenth-century forebears was met largely with indifference. But this exploration off the beaten track paid dividends. For instance, he once realized that he had rediscovered a key hallucinatory aspect of severe migraines when he came across a paper on similar self-reported symptoms by the English astronomer John Herschel, written more than a hundred years earlier. A data scientist would surely dismiss Herschel’s report as nothing more than a fluke.

The dismissal of historical data is especially visible in our modern system of medicine, which ignores many medical reports of the kind that people like Sacks found valuable. It does an even better job of ignoring the vast amount of information contained in the medical repositories of ancient systems of medicine, such as the Chinese and Indian pharmacopeias. Now, admittedly there are a lot of inconsistencies in these reports, so they cannot all be taken literally, but neither is ignoring them fruitful. Like all uncertain but potentially useful data, they need to be dug up, investigated and validated so that we can keep the gold and throw out the dross. The great potential value of ancient systems of medicine was made apparent when, two years ago, the Nobel Prize for medicine was awarded to the Chinese medicinal chemist Tu Youyou for her lifesaving discovery of the antimalarial drug artemisinin. Tu was inspired to make the discovery when she found a process for low-temperature chemical extraction of the drug in a 1600-year-old Chinese text titled “Emergency Prescriptions Kept Up One’s Sleeve”. This obscure and low-visibility data point would certainly have been dismissed by statistics-enamored medicinal chemists in the West, even if they had known where to find it. Part of recognizing the importance of Eastern systems of medicine consists in recognizing their very different philosophy; while Western medicine seeks to attack the disease and is highly reductionist, Eastern medicine takes a much more holistic approach in which it seeks to modify the physiology of the individual itself. This kind of philosophy is harder to study in the traditional double-blind, placebo-controlled clinical trial that has been the mainstay of successful Western medicine, but the difficulty of implementing a particular scientific paradigm should not be an argument against its serious study or adoption. As Sacks’s and Tu’s examples demonstrate, gems of discovery still lie hidden in anecdotal and historical reports, especially in medicine, where even today we understand so little about entities like the human brain.

Whether it’s the LHC or medical research, the practice of gathering data and relying only on that data is keeping us close to the ground when we could be soaring high in the air without these constraints. Data is critical for substantiating a scientific idea, but I would argue that an overreliance on it actually makes it harder to explore wild, creative scientific ideas in the first place, ideas that often come from anecdotal evidence, storytelling and speculation. A bigger place for data leaves increasingly smaller room for authentic and spontaneous creativity. Sadly, today’s publishing culture also leaves little room for pure speculation-driven hypothesizing. As just one example of how different things have become, in 1960 the physicist Freeman Dyson wrote a paper in Science speculating on possible ways to detect alien civilizations based on their capture of heat energy from their parent star. Dyson’s paper contained enough calculations to make it at least a mildly serious piece of work, but I feel confident that in 2017 his paper would probably be rejected by major journals like Science and Nature, which have lost their taste for interesting speculation and have become obsessed with data-driven research.

Speculation and curiosity have been mainstays of human thinking since our origins. When our ancestors sat around fires and told stories of gods, demons and spirit animals to their grandchildren, it made the wide-eyed children wonder and want to know more about these mysterious entities that their elders were describing. This feeling of wonder led the children to ask questions. Many of these questions led down wrong alleys, but the ones that survived later scrutiny launched important ideas. Today we would dismiss these undisciplined mental meanderings as superstition, but there is little doubt that they involve the same kind of basic curiosity that drives a scientist. There is perhaps no better example of a civilization that went down this path than ancient Greece. Greece was a civilization full of animated spirits and gods that controlled men’s destinies and the forces of nature. The Greeks certainly found memorable ways to enshrine these beliefs in their plays and literature, but the same cauldron that imagined Zeus and Athena also created Aristotle and Plato. Aristotle and Plato’s universe was a universe of causes and humors, of earth and water, of abstract geometrical entities divorced from real-world substantiation. Both men speculated with fierce abandon. And yet both made seminal contributions to Western science and philosophy even as their ideas were accepted, circulated, refined and refuted over the next two thousand years. Now imagine if Aristotle and Plato had refused to speculate on causes and on human anatomy and physiology because they had insufficient data, if they had turned away from imagining because the evidence wasn’t there.

We need to remember that much of science arose as poetic speculations on the cosmos. Data kills the poetic urge in science, an urge that the humanities have recognized for a long time and which science has had in plenty. Richard Feynman once wrote,

“Poets say that science takes away the beauty of the stars and turns them into mere globs of gas atoms. But nothing is ‘mere’. I too can see the stars on a desert night, but do I see less or more? The vastness of the heavens stretches my imagination; stuck on this carousel my little eye can catch one-million-year-old light…What men are poets who can speak of Jupiter as if he were a man, but if he is an immense spinning sphere of methane and ammonia must be silent?”

Feynman was speaking to the sense of wonder that science should evoke in all of us. Carl Sagan realized this too when he said that not only is science compatible with spirituality, but it’s a profound source of spirituality. To realize that the world is a multilayered, many-splendored thing, to realize that everything around us is connected through particles and forces, to realize that every time we take a breath or fly on a plane we are being held alive and aloft by the wonderful and weird principles of mechanics and electromagnetism and atomic physics, and to realize that these phenomena are actually real as opposed to the fictional revelations of religion, should be as much a spiritual experience as anything else in one’s life. In this sense, knowing about quantum mechanics or molecular biology is no different from listening to the Goldberg Variations or gazing up at the Sistine Chapel. But this spiritual experience can come only when we let our imaginations run free, constraining them in the straitjacket of skepticism only after they have furiously streaked across the sky of wonder. The first woman, when she asked what the stars were made of, did not ask for a p value.

Staying after the party is over: Nobel Prizes and second acts

[Photo caption: Hans Bethe kept on making important contributions to physics for more than thirty years after winning the Nobel Prize.]

Since the frenzy of Nobel season is over, it's worth dwelling a bit on a topic that's not much discussed: scientists who keep on doing good work even after winning the Nobel prize. It's easy to rest on your laurels once you win the prize. Add to this the exponentially higher number of speaking engagements, magazine articles and interviews in which you are supposed to hold forth on the state of the world in all your oracular erudition, and most scientists can be forgiven for simply not having the time to do sustained, major pieces of research after their prizewinning streak. This makes the few examples of post-Nobel scientific dedication even more noteworthy. I decided to look at these second acts in the context of physics Nobel Prizes, starting from 1900, and found interesting examples and trends.

Let's start with the two physicists who are considered the most important of the twentieth century in terms of their scientific accomplishments and philosophical influence - Albert Einstein and Niels Bohr. Einstein got his Nobel Prize in 1921, after he had already done work for which he would go down in history; this included the five groundbreaking papers published in the "annus mirabilis" of 1905 and his work on the foundations of the laser (his collaboration with Satyendranath Bose on Bose-Einstein statistics came a few years later). After 1921 Einstein did not accomplish anything of similar stature - in fact one can argue that he did not accomplish anything of enduring importance to physics after the 1920s - but he did become famous for one controversy, his battle with Niels Bohr over the interpretation of quantum theory that started at the Solvay conference in 1927 and continued until the end of his life. This led to the historic paper on the EPR paradox in 1935 that set the stage for all further discussions of the weird phenomenon known as quantum entanglement. In this argument, as in most arguments on quantum theory, Einstein was mistaken, but his constant poking of the oracles of quantum mechanics led to spirited efforts to rebut his arguments. In general this was good for the field and culminated in John Bell's famous inequality and in Alain Aspect's and others' experiments to confirm quantum entanglement and prove Einstein's faith in "hidden variables" misguided.

Bohr himself was on the cusp of even greater things when he received his prize in 1922. He was already famous for his atomic model of 1913, but he was not yet known as the great teacher of physics - perhaps the greatest of the century - who was to guide not just the philosophical development of quantum theory but the careers of some of the century's foremost theoretical physicists, including Heisenberg, Gamow, Pauli and Wheeler. Apart from the rejoinders to Einstein's objections to quantum mechanics that Bohr published in the 30s, he contributed one other idea of overwhelming importance, both for physics and for world affairs. In 1939, while tramping across the snow from Princeton University to the Institute for Advanced Study, Bohr realized that it was uranium-235 that was responsible for nuclear fission. This paved the path toward the separation of U-235 from its heavier brother U-238 and led directly to the atomic bomb. Along the same lines, Bohr collaborated with his young protege John Wheeler to formulate the so-called liquid drop model of fission, which likened the nucleus to a drop of water; shoot an appropriately energetic neutron into this assembly and it wobbles and finally breaks apart. Otto Hahn, the chief discoverer of nuclear fission, later won the Nobel Prize, and it seems to me that, along with Fritz Strassmann, Lise Meitner and Otto Frisch, Bohr also deserved a share of this award.

Since we are talking about Nobel Prizes, what better second act than one that results in another Nobel Prize? As everyone knows, this singular achievement belongs to John Bardeen, who remains the only person to win two physics Nobels, one for the invention of the transistor and another for the theory of superconductivity. And like his chemistry counterpart Fred Sanger, who also won two prizes in the same discipline, Bardeen may have been the most unassuming physicist of the twentieth century. Along similar lines, Marie Curie won a second prize, in chemistry, after her pathbreaking work on radioactivity with Pierre Curie.

Let's consider other noteworthy second acts. When Hans Bethe won the prize for his explanation of the fusion reactions that fuel the sun, the Nobel committee told him that they had had trouble deciding which one of his accomplishments they should reward. Perhaps no other physicist of the twentieth century contributed to physics so persistently over such a long time. The sheer magnitude of Bethe's body of work is staggering, and he kept working productively well into his nineties. After making several important contributions to nuclear, quantum and solid-state physics in the 1930s and serving as the head of the theoretical division at Los Alamos during the war, Bethe opened the door to the crowning jewel of quantum electrodynamics by making the first decisive calculation of the so-called Lamb shift, which was challenging the minds of the best physicists. This work culminated in the Nobel Prize awarded to Feynman, Schwinger and Tomonaga in 1965. Later, at an age when most physicists are lucky just to be alive, Bethe provided an important solution to the solar neutrino puzzle, in which neutrinos change from one type to another as they travel to the earth from the sun. There's no doubt that Bethe was a supreme example of a second act. Richard Feynman also continued to do serious work in physics; among other contributions, he came up with a novel theory of superfluidity and a model of partons.

Another outstanding example is Enrico Fermi, perhaps the most versatile physicist of the twentieth century, equally accomplished in both theory and experiment. After winning a prize in 1938 for his research on neutron-induced reactions, Fermi was the key force behind the construction of the world's first nuclear reactor. That the same man who designed the first nuclear reactor also formulated Fermi-Dirac statistics and the theory of beta decay is a fact that continues to astonish me. The sheer number of concepts, laws and theories (not to mention schools, buildings and labs) named after him is a testament to his mind. And he achieved all this before his life was cut short at the young age of 53.

Speaking of diversity in physics research, no discussion of second acts can ignore Philip Anderson. Anderson spent most of his career at Bell Labs before moving to Princeton, making seminal contributions to condensed matter physics. The extent of Anderson's influence on physics becomes clear when we realize that most people today talk about his non-Nobel-Prize-winning ideas. These include one of the first descriptions of the Higgs mechanism (Anderson was regarded by some as a possible contender for a Higgs Nobel) and his firing of the first salvo in the "reductionism wars"; this came in the form of a 1972 Science article called "More is Different" which has since become a classic critique of reductionism. Now in his nineties, Anderson continues to write papers and has written a book that nicely showcases his wide-ranging interests and his incisive, acerbic and humorous style.

There are other interesting candidates who show up on the list. Luis Alvarez was an outstanding experimental physicist who made important contributions to particle and nuclear physics. But after his Nobel Prize in 1968 he reinvented himself and contributed to a very different set of fields: planetary science and evolutionary biology. In 1980, along with his son Walter, Alvarez wrote a seminal paper proposing a giant asteroid impact as the cause of the extinction of the dinosaurs. This discovery about the "K-Pg boundary" really changed our understanding of the earth's history and is also one of the finest examples of a father-son collaboration.

There are a few more scientists to consider, including Murray Gell-Mann, Steven Weinberg, Werner Heisenberg, Charles Townes and Patrick Blackett, who continued to make important contributions. It's worth noting that this list focuses on specific achievements after winning the prize; a "lifetime achievement" list would include many more scientists like Lev Landau (who among other deep contributions co-authored a definitive series of physics textbooks), Subrahmanyan Chandrasekhar and Max Born.

It's also important to focus on non-research activities that are still science-related; too often we ignore these other important activities and focus only on technical research. A list of these achievements would include teaching (Feynman, Fermi, Bohr, Born), writing (P. M. S. Blackett, Feynman, Percy Bridgman, Steven Weinberg), science and government policy (Bethe, Arthur Compton, Robert Millikan, Isidor Rabi) and administration (Lawrence Bragg, J. J. Thomson, Pierre de Gennes, Carlo Rubbia). Bona fide research is not the only thing at which great scientists excel.

A book burning in Palo Alto

This is my latest monthly column for 3 Quarks Daily. As a book lover it wasn't easy for me to write this, but it also made me realize how bad it can get when it gets really bad, and how much we should fight tooth and nail to prevent this kind of scenario - or even partial elements of it - from becoming reality. One point I want to get across is how everyone can be complicit in these developments, including people who detest them. The biggest danger is not in government or corporations taking control of our lives; it's in us willingly ceding control to these entities through self-censorship and self-denial, all done in the name of some innocent ideology.

The flames crackled high and mighty, scalping the leaves from the oak trees, embracing bark and beetles in their maw of carbonized glimmer. The remains of what had been lingered at the bottom, burnt to the sticky nothingness of coagulated black blood. The walls of the stores and restaurants shone brightly, reflecting back the etherized memory of letters and words flung at them. Seen from the branches of the trees, filtered through incandescent fire, the people below were mere dots, ants borne of earthly damnation. A paroxysm of a new beginning silently echoed through the cold air. Palo Alto stood tall and brightly lit tonight.

Bell’s Books, a mainstay of the town for a hundred years, projected its ghostly, flickering shell across the square, its walls stripped of everything that had ever dwelt on them, now pale shadows of a dimming past. A few months back they had come to the store, crew cuts and stiff ties, smiles of feigned concern cutting across the room like benevolent razors. As a seller of used and antiquarian books, Bell’s posed a particular problem, riddled through and through as it was with undesirables. The owner, an old woman who looked like she had been there since the beginning of time, was told quietly, and with no small degree of sympathy, how they did not want to do this but how they needed to cart out most of her inventory, especially because of its historical nature.

“We’re sorry, ma’am, but ever since they passed the addendum our directives have grown more urgent. And please don’t take this personally since yours is not the only collection to be cataloged: over the last few weeks we have repeated this exercise at most of the area’s stores and libraries. To be fair, they are offering healthy compensation for your efforts, and you should be hearing back from the grievances office very soon.”

With that, three Ryder trucks filled with most of the books from Bell’s had disappeared into the waning evening, the old woman standing in the door, the wisps of sadness on her face looking like they wanted to waft into the air and latch on to the gleaming skin of the vehicles. What happened to her since then, where she went and what she did was anybody’s guess. But the space where Bell’s stood had already been sold to an exciting new health food store.

Addendum XIV to the First Amendment had passed three months ago with almost unanimous approval from both parties. In an age of fractured and tribal political loyalties, it had been a refreshingly successful bipartisan effort to reach across the aisle. In some sense it was almost a boring development, since large parts of the First dealing with the right to peaceably assemble had been left unaltered. The few new changes added some exceptions to the hallowed Constitutional touchstone; these included an exception for public decency, another for offending group sensibilities, and a third for the protection of citizens from provocative or offensive material. That last modification had been solidly backed by data from a battery of distinguished psychologists and sociologists from multiple academic centers, hospitals and government agencies who had demonstrated in double-blind studies how any number of literary devices, allusions and historical references produced symptoms, especially among the young, that were indistinguishable from those of generalized anxiety disorder. Once the Surgeon General had certified the problem as a public health emergency, the road to determined political action had been smoothed over.

Most importantly, Addendum XIV had been a triumph of the people’s will. Painless and quick, it was being held up as an exemplar of representative democracy. The change had been catalyzed by massive public demonstrations of a magnitude that had not been seen since the last war. These demonstrations had begun in the universities as a response against blatant attacks on the dignity of their students, marshaled through the weaponization of words. The fire had then spread far and wide, raging across cities and plains and finally setting the hearts and minds of senators and congressmen ablaze; whether through fear or through common sense was at this point irrelevant. In what was a model example of the social contract between elected public officials and the people, much of the final language in Addendum XIV had been left almost unchanged from drafts that emerged from spirited and productive town hall meetings. It was grassroots government at its best. After years of being seen as almost a pariah, the country could again expect the world to look at it with renewed admiration as a nation of laws and decent people.

The police had put a perimeter around the fire, cordoning it off and trying their best to prevent spectators from approaching too close. But they were having a hard time of it, since the whole point of the event was to serve as a community-building exercise in which the locals contributed and taught each other. An old cherry picker had been recruited to drop its cargo into the fire from the top, but the real action belonged to the people. Young and old alike were cautiously approaching the bright burning flames and tossing in their quota, and the younger crowd was flinging everything in quite enthusiastically. Parents who were trying to carefully keep their gleeful children from getting too close were simultaneously balancing the delicate act of teaching their kids how to do their part as civic-minded citizens. A mother was gently helping her four-year-old pick a slim volume and toss it into the gradually growing conflagration while the father stood nearby, smiling and returning the child’s eager glances. It was hard to contain the crowd’s enthusiasm as they obeyed the overt guidelines of the government and the silent dictates of their conscience. The police knew that the people were doing the right thing, so they finally resigned themselves to occasionally helping out the crowd rather than trying to prevent them from being singed by the heat. An officer took out his pocketknife and knelt to help a man cut the recalcitrant piece of twine that was tying his sheaf of tomes together.

Based on the official state and federal guidelines, everyone had filled up their boxes and crates and SUVs and driven here. Driven here from Fremont and Berkeley - hallowed ground of the movement’s sacred origins - and some from as far as Livermore and Fresno, even braving the snaking line of cars on the Dumbarton Bridge to the east. They cursed under their breath for not being allowed to organize similar local events in their own cities, but the government wanted to build community spirit and did not want to dilute the wave of enthusiasm that had swept the nation. Rather than have several small events, they wanted to have a few big ones with memorably large attendance. Palo Alto afforded a somewhat central meeting point as well as a particularly convenient one because of its large repository of used bookstores and university libraries. The Ryder Company had helpfully offered generous discounts for the use of their trucks. Stanford and Berkeley had been particularly cooperative and had contributed a large chunk of the evening’s raw material; as torchbearers of the movement, they had had no trouble gathering up enough recruits. Berkeley especially had the White House’s blessing, and federal funding had once again started to flow generously to the once cash-strapped institution. Now University Avenue was backed up with Ryder trucks stretching all the way back to Campus Drive, mute messengers of information overflow relieved to be offloading their tainted cargo.

As with most events like this, the restaurants were working overtime, offering happy hour deals and competing with each other for the attention of the diverse crowd. The $12.99 double slider special at Sliderbar had sold out, and Blue Bottle Café in HanaHaus was going crazy trying to cater to its hyper-caffeinated customers, who especially relished the buzz from the establishment’s famous Death Valley Cold Brew. Groups of students could be seen working in relay teams; as one group helped unload the trucks and consign the contents to the flames, the other brought back coffee and donuts for renewed energy. A family stood outside Palo Alto Creamery, the children squealing with delight as their ice cream melted faster between their fingers in the glare of the heat. The parents watched with familiar exasperation, especially since there were three more bags to take care of. The extra generators at the creamery were having a hard time keeping up, but the huge size of the crowd seemed to please the crews, even though they had been working since 3 AM.

To facilitate the transition, the government had mandated a day of paid vacation so that they could deploy agents who would visit homes and take stock of the inventory. Just like they did for jury duty, they sent out letters to everyone confirming the date and time. I had to postpone once since I had still not finished counting up my collection. I wanted to postpone again, but the second letter made the urgency of the matter a bit more clear. Two boyish-looking agents had stopped by and efficiently noted down everything as they gently took volumes off my shelves and put them back. Once they were sure about the total they had handed me a piece of paper confirming the number, along with information on the date of the event in Palo Alto. “We appreciate your help in this, Sir; you have no idea how some people have offered resistance to even such a simple call to community service. It’s especially absurd since it was their own friends and family members who had gone out of their way to come to all those town hall meetings and demand this! In any case, we’ll see you on the 27th. You have a nice day now.” I nodded wearily.

I had been reluctant to commit myself to that first milestone. There was another day late in November when those who couldn’t make it for some reason the first time around could go. I decided to go to the October event all the same; I had had nothing much to do in the evenings ever since all the bookstores had been either closed or reduced to selling meager, uninteresting fare.

They were offering discount parking in the lot on Waverley Street, so I parked there and took a right on University Avenue. As I turned, a blast of hot air hit me, as if trying to wash away memories of an unwanted past. At the end of the street, flanked by shadows of the moving crowd, was the conflagration. The crowds around me were moving to and fro between the end of the street and the businesses along the sides, although the overall movement seemed to be toward the amorphous, flaring yellow shape-shifter in the distance. I suddenly saw a familiar face at the side. It was Sam from HanaHaus; the establishment had opened an extra counter on the sidewalk to quell the crowds inside. “Hey, how’s it going? Some crowd, huh?” waved Sam. I waved back, but Sam’s hand quickly dissolved in the flurry of hands grabbing coffee cups and placing orders. I kept walking and quickly reached the police perimeter. “Hi, do you have anything to donate?” asked an officer. I told him that I was going to take advantage of the extended deadline. That’s okay, he said; based on the conversations he had had, people had such large collections that many of them were going to be forced to come back anyway. As a family with three young kids approached with their bags, he requested that I stand back so he could help them.

As I stepped back I took in the scene. The fire was gleefully lofting paper and pages up in a whirlwind of nihilistic ecstasy, the frayed, burning edges of pages proclaiming their glowing jaggedness, their silent letter-by-letter obliteration. Nearby, one group of children was dancing in a circle with others, enjoying the momentary dizziness induced by the motion. Their parents were keeping a close watch on them as they went on with their routine. Occasionally a child would quickly run to his or her parents’ side, pick up a volume and toss it laughing and screaming, even as the other children yelled about the interruption. They would then join the circle again and continue the dance, their own movements alternating with the movements of the soot as it went round and round the pyramid of burning paper.

It was then that I saw some of the names; it was odd that I shouldn’t have recognized them before, but it might simply have been because they were so ubiquitous that they had been rendered invisible. There were Lee and Kafka, Baldwin and Joyce, Ovid and Atwood, Plato and Melville, Rushdie and Russell, Twain and Conrad, Rhodes and Faulkner, Pynchon and Sagan, Woolf and Dostoyevsky, McCarthy and Stein. They were there because they were too colonial, too non-colonial, too postcolonial, too offensive, too profane, too sensitive, too traumatic, too objective, too white, too black, too egalitarian, too totalitarian, too maverick, too orthodox, too self-reflective, too existential, too modern, too postmodern, too violent, too bucolic, too crude, too literary, too cis, too trans, too religious, too secular, too nihilistic, too meaningful, too anarchist, too conformist, too feminist, too masculine, too languid, too unsettling, too horrific, too boring, too much ahead of their times, too much relics of the past, too much, simply. They were there because sensibilities had been offended, because words had been weaponized.

Most of them were lined up in bag after bag next to the fire, gagged and bound, silently screaming against the passions of men. The ones that had already made it into the void were gone, ideas becoming null, breath turned into air, but some had stumbled back from the high pile with various parts charred and curled up in half-dead configurations, painfully trying to remain part of this world. Some of the names were partially gone, formless echoes being slowly stuffed back into the grave. The ones which had photos of their authors had these photos metamorphosed into things begging to be obliterated: a woman with only her smile burnt off, looking like a gargoyle without a mouth; a man with his eyes masterfully taken out by well-placed glowing embers; another one where the heat had half-heartedly engraved dimpled plastic bubbles on the face of a female novelist known to have a pleasing countenance, now looking like a smallpox victim with a jaw left hanging.

It was then that I noticed another breed of spectator rapidly moving through the crowd. Photographers hired by both government and private agencies were canvassing the scene like bounty hunters looking for trinkets of a fractured reality which they could take back to their studios and immortalize in its isolated desolation. One of them was the noted photographer Brandon Trammel, from the California Inquirer. I could see him now on the other side of the fire, his body and the shimmering flames appearing to coalesce into one seamless disintegration. At a certain temperature human beings and paper become indistinguishable, guilt-ridden souls shredded apart into their constituent atoms, sons and daughters of the whims of men consumed by fire and fury. Trammel was taking photos of the men and women and children around the fire, etching their cries of glee and solemn duty into permanent oblivion.

Moving around the fire like a possessed man furiously scribbling down the habits of an alien civilization, he came over to my side. I caught sight of a half-burnt title on the ground. It was a familiar volume from another era, an era now looking like the world in a snow globe, eroding now through the obscuring glow of time. “Hey”, I yelled at him, “Here, let me pose for you”. I picked up the book and threw it at the red wall with all my might. I heard a click, but at the last moment the charred remains of its edges had disintegrated in my hands and it fell short by a few feet. I desperately looked around. Another one was within sight. I hastily scampered over, picked it up and looked at Trammel, eager and wild-eyed. “Again!”, I screamed at the top of my voice, and cast it into the fire.

Note: Perspicacious readers may have noticed that I have modeled the ending here on the last part of Raymond Carver's short story "Viewfinder". Very few writers could do quiet desperation as well as Carver.

A terrible year for academic organic chemistry

In the last year, academic chemistry has lost Jack Roberts, Jerome Berson, George Olah, Gilbert Stork and now Ron Breslow - the last two in just one week.

It's been a terrible loss. All these chemists were world-renowned pioneers in their areas who laid the foundations of much of what graduate students now learn in their textbooks and what professional chemists apply in their laboratories. They were the last torchbearers of the golden age of organic chemistry, the age that established the three pillars of the field: structure, reactivity and synthesis.

Jack Roberts pioneered NMR spectroscopy in the United States. He should really have received a Nobel Prize for this contribution in my opinion. But in addition to this, he was also one of the foremost practitioners of molecular orbital theory and made very significant contributions to conformational analysis and carbocation chemistry.

Jerome Berson was another important physical organic chemist who also wrote what I consider a unique contribution to the history of science - a book titled "Chemical Creativity" that traces creativity in the work of leading chemists, from Hückel to Woodward.

George Olah was the father of modern carbocation chemistry and an inventor of superacids that allow us to stabilize carbocations. He contributed massively to work that is used in the petrochemical industry and, along with Martin Saunders, delivered the coup de grace that settled the famous non-classical cation controversy for good.

Gilbert Stork, about whom much has been written since he passed away just a few days ago, was one of the most original synthetic organic chemists of the 20th century. His work on enamine alkylations, radical cyclizations and other key reactions is now part of the textbooks, as are his many elegant natural product syntheses.

And now Ronald Breslow. Primarily known as a physical organic chemist, Breslow was one of the most versatile chemists of the 20th century, whose contributions ranged across the entire chemical landscape. He is famous for many things: for discovering the simplest aromatic system - the cyclopropenium ion - for d-orbital conjugation, for very intriguing work on chemistry in aqueous solvents, for building artificial enzymes, for inventing the marketed drug SAHA (the first histone deacetylase inhibitor) and for exploring the origin of chirality during the origin of life. How many chemists can claim that kind of oeuvre?

Breslow received pretty much every award for science there is out there except the Nobel Prize - the National Medal of Science, membership in the National Academy of Sciences and presidency of the ACS among others. He also saw his share of controversies, although the chemical community always came out wiser for learning from them. 

Most notably, in an age when senior professors are often criticized for using graduate students and postdocs as cheap labor, Breslow was an extraordinary educator. Among his students and postdocs are Robert Grubbs, Robert Bergman and Larry Overman. There is probably not a continent on which some student of his is not doing chemistry. More than once during his talks, he made a pitch for hiring the student or postdoc who had done the work. Breslow belonged to an older, more gentlemanly generation of professors who would make calls to get their students jobs.

I last heard Breslow speak only a year ago at an ACS meeting. Before that I had heard him speak at an ACS meeting about ten years earlier. The remarkable thing is that in those ten years he did not seem to have aged, displaying the same boyish enthusiasm and curiosity for chemistry that was always his hallmark. When he received the Priestley Medal in 1999, one of his students said the same thing: "He just doesn't seem to age, certainly not intellectually. Talking to him now is like talking to him 30 years ago. He's got the same enthusiasm, the same excitement about chemistry."

We are all poorer for the loss of Breslow and these other pioneers, but the best thing is that they will be part of the textbooks as long as there is a science of chemistry.

Interpreting Kafka’s Metamorphosis in the Digital Age

I re-read Kafka’s “Metamorphosis”, wondering if it presented parallels for our age of mute communication enabled by the Internet. The Metamorphosis deals with themes of social alienation (often self-imposed) and existential anxiety. It’s worth understanding the context in which the story was written, a context that involves Kafka’s own life and personality. Kafka held himself in low esteem and thought himself socially and sexually inadequate; he seems to have had several relationships with women but also visited prostitutes. Most importantly in the context of “The Metamorphosis”, he feared that people would find him physically and mentally repulsive, and he seems to have suffered from an eating disorder.
These qualities of self-loathing are inherent in the story. The protagonist Gregor Samsa finds himself transformed into a giant insect whom his parents and sister naturally find repellent. He also starts hating most food, including even the rotten food that his loving sister tearfully puts in his room. It is clear that the insect loathes himself and understands why his own family would loathe him and wish him gone. It’s also significant that while Samsa can perfectly understand what his family is saying about him, his own speech is now grotesquely that of an insect and incomprehensible to them. Torn between an inability to communicate and a perfect ability to understand, the insect naturally feels both alienation and existential angst.
We should not have to look too far to find parallels in the digital age for most of Kafka’s afflictions. Technology, and especially social media, has sequestered us away from human contact in the same way that Samsa’s fundamental transformation shut him out. We spend hours on the Internet in our homes, and yet can legitimately claim to have no real, human connection to the world outside. This is reflected in our “friendship” with hundreds of people on social media, which translates to nihilistic friendlessness outside this medium. We also think that we can perfectly understand what people outside are saying, but just like the insect, we keep banging on the walls of our self-imposed prison because we cannot make ourselves heard above the din outside. We make a lot of noise, but very little sound.
A lot of the existential anxiety we feel results from this dissonance between the clear, one-way mirror of the outside world and the opaque prison of the inside, a prison which nevertheless occasionally gives us the illusion of being able to communicate before the whole façade comes tumbling down. Just like the giant insect, our minds are torn between wanting to communicate and wanting to believe that we can.
Deafest of all are the technology companies, which in our age seem to play the role of Samsa’s family; they claim to know what we are saying and even pretend to love us, but what they are offering us is a force-fed diet of information addiction and distraction. Like Samsa, we find ourselves in a love-hate relationship with these companies; on one hand we want to reject their sustenance, but on the other we find ourselves increasingly unable to survive without it. We hate ourselves for craving the food that the tech companies send our way, and we pity ourselves if we don’t have it.
The role of Samsa’s parents can also be ascribed to the global internet community, which pretends to be our friend but whose main function is to publicly shame, vilify and abandon us the moment we say something it disagrees with. The sense of alienation which Samsa feels comes partly from not being able to communicate with his parents and partly from their anger and disgust at his transformation. Similarly, the global internet community pretends to care about us because we are part of the same digital ecosystem, while being able to turn on us in disgust and indignation in a moment when we undergo our own transformation, a transformation perhaps to an unpopular social or political viewpoint. Veering away from the community’s and the tech companies’ groupthink will be our version of the Metamorphosis. It is therefore not surprising that we find ourselves suffering extreme feelings of alienation, facing censure, ostracism and indifference from a community that from the outside seems to look just like us but which really is so different as to be an actual alien species, again like Samsa’s giant insect.
The end of “The Metamorphosis” sees Samsa wounded, infected and rueful, finally dying from shame, neglect and self-imposed starvation. A similar fate would likely befall the Gregor Samsas of today’s globally connected world, signified perhaps by these modern-day vermin turning into brainwashed internet addicts who have completely surrendered their privacy, creative potential and personal dignity to both Internet companies and the global social media community. The original Gregor Samsa died, but this kind of complete surrender of mind and body would likely be a fate worse than death, perhaps not fully appreciated by the victims because of their delusional state, but real nonetheless.
However it need not be so. Gregor’s mistake was in pretending that his family would want a normal relationship with him even after his transformation. While it would have been difficult, it would not have been impossible for him to be proactive in severing his connections with them, perhaps running away into the sewers or streets and starting an independent existence as a free insect. Such a lifestyle would have been challenging to say the least, but it would have led to a strange and exhilarating kind of freedom from dependence on his parents’ approval and love.
The metaphor for our Internet age would be freedom from both the tech companies’ and the digital community’s feigned love toward us. The more we crave their approval, the more we will become victims of our own self-imposed existential angst. That way lies catastrophe. Severing the bond with these two entities would not be easy, and I don’t know what the best way to do it is. But there have certainly been some opinions offered on how to achieve this goal.
What I do know is that when a Gregor Samsa from this world decides to escape, even into the sewers, he or she would find it much easier if other Gregor Samsas were already waiting there.

Einstein: A Hundred Years of Relativity

Andrew Robinson’s compendium on all things Einstein is a lavishly illustrated treat which I read with great pleasure in one sitting. It consists of contributions from Robinson himself as well as from a variety of writers, scientists and philosophers on various aspects of Einstein’s life, work and the times he lived in. There are scores of photos of Einstein with everyone from Niels Bohr to Charlie Chaplin to Rabindranath Tagore. Robinson himself is a measured and very engaging guide to Einstein’s life, treading methodically and evenly over all its major events. The book consists of introductory essays by Robinson followed by chapters on specific topics. All the chapters on Einstein’s work are drawn from past writings on Einstein by leading thinkers and scientists: Stephen Hawking, Steven Weinberg, Freeman Dyson, Max Jammer, Philip Anderson and Philip Glass. It is especially illuminating to read Glass’s essay, in which he talks about how Einstein inspired his opera, “Einstein on the Beach”. Each of these writers discusses a particular triumph or folly of Einstein’s, or how he influenced their own work. The volume also contains a reprint of a revealing interview with Einstein by Bernard Cohen, conducted only two weeks before Einstein’s death.

Although Einstein is known for relativity – and both general and special relativity receive an extended treatment here – he contributed to many other important parts of physics. He was one of the fathers of quantum theory, a fact of perpetual irony given his vociferous later opposition to the meaning of the theory. In 1905, his annus mirabilis, he published papers on the sizes of atoms, on diffusion through different media and, of course, on special relativity. Even after putting the finishing touches on general relativity in 1915, Einstein made at least two major contributions to science. One was his work with Satyendranath Bose predicting what are called Bose-Einstein condensates; it took until the 1990s for these novel forms of matter to be created in the laboratory. The other was his contribution in explaining the process of stimulated emission, which led to the laser. Another of Einstein’s lesser-known works was a practical one – the Einstein-Szilard refrigerator, a safe refrigerator he invented with his friend Leo Szilard, the same Szilard who encouraged him to write the famous letter to President Roosevelt warning of the discovery of fission.

The book is roughly chronological, following Einstein as rebellious student and trailblazer at the Swiss patent office, as deep thinker and revolutionary professor in Berlin, as pacifist during World War 1, as one of the most famous men in the world after World War 1, as target of anti-Semitic propaganda, as world-famous émigré in Princeton, and as pacifist, tongue-wagging celebrity-sage again after World War 2. One of the themes that constantly emerges through these different periods of Einstein’s life is that of stubbornness and rebellion combined with an unusual tolerance for unorthodox thinking and unconventional people. One of the significant myths about Einstein that the volume demolishes is that of an introverted, lonely, deep thinker. Throughout his life Einstein was surrounded by close friends with whom he kept in touch either in person or through letters; his personal and professional letters to the famous as well as to common folk number in the thousands. During his young days he was a lusty, vivacious and joyful man filled with dry humor and cheekiness, and these qualities endured late into his life.

Robinson’s volume is also very good at exploring the paradoxes of Einstein’s life. Einstein was a wise, avuncular figure to strangers and the world at large, but he was often terribly cruel and indifferent to his family; he was an adulterer who womanized regularly during both of his marriages, and he was estranged from both of his sons. Much of this was hidden from the public and only became apparent with the discovery of Einstein’s letters to his wife in the 1990s. As with most other people, Einstein had trouble being consistent in his morality. His pacifism was perhaps his most consistent quality, although he wisely cast it off once Hitler came to power.

Scientifically, Einstein’s life presents even more interesting paradoxes. Freeman Dyson opens the volume with an essay on what was perhaps Einstein’s biggest scientific failure: his inability to imagine a universe containing black holes. As I described in a post, both Einstein and Oppenheimer played foundational roles in the discovery of black holes; these objects were a logical consequence of Einstein’s field equations of general relativity. Yet both men essentially abandoned their scientific creations, staying utterly indifferent to them for the rest of their lives. Einstein even wrote a paper in 1939 that supposedly refuted black holes, but it was fatally flawed in its assumption of static spacetime around these inherently dynamic objects. Today black holes are recognized as engines that power some of the most energetic phenomena in the universe and help shape the evolution of galaxies. Einstein also made a mistake when he inserted a cosmological constant into his equations to keep the universe from expanding or contracting. However, to his credit, he immediately got rid of this constant once he learnt of Edwin Hubble’s discovery of the expanding universe. Ironically, when the expansion of the universe was later found to be accelerating, the cosmological constant was resurrected.

Einstein’s reliance on beauty and mathematics was also paradoxical. As the physicist Philip Anderson describes, many still think of Einstein as primarily a mathematical physicist. But all his early advances in relativity were fueled not by abstract mathematics but by practical thought experiments in physics. He always stayed close to experiment and let the data guide his thinking. His time in the patent office in Bern had given him a real taste for mechanical contraptions. However, as described by Steven Weinberg, when developing general relativity Einstein did have to take advantage of the novel field of Riemannian geometry, which he learnt from his friend Marcel Grossmann. Weinberg speculates that perhaps Einstein became so enamored of mathematics during this time that it led to his isolation from the mainstream of physics during the last few decades of his life, when he kept trying to develop a unified field theory without paying attention to real advances in physics. Sadly, Weinberg finds that almost everything Einstein did after 1925 was irrelevant in terms of real contributions to physics. The one exception was the debate about quantum entanglement with Niels Bohr and others, which he sparked in 1935, and even in that debate he ultimately ended up on the losing side.

Einstein remains of great interest to a new generation, not because he was a genius but because – as this volume illustrates - he was human. Ultimately when we strip away the trappings of myth and fame from his scientific contributions, what remains is a human being in all his honest clarity. That is what makes him a topic of enduring interest.

Fine-tuning the future: A review of Aldous Huxley's "Brave New World"

Before there was ‘The Matrix’ and ‘Blade Runner’, before there was even ‘1984’, there was ‘Brave New World’. It is astonishing that Aldous Huxley wrote this tale of technological dystopia in 1932. The social elements of the story are similar to those in Orwell, Kafka and others: a society of obedient sheep run by the state and benevolent dictators through brainwashing and groupthink. But what’s striking about the novel is how astutely it anticipates a society taken over by benevolent technocrats rather than politicians, a scenario that appears increasingly likely in the age of AI and genetic engineering.

Huxley came from an illustrious scientific family with social connections. His grandfather was Thomas Henry Huxley, Darwin’s close friend, publicist and “bulldog”, whose famous smackdown of Bishop Samuel Wilberforce has been relished by rationalists fighting against religious faith ever since. His brother was Julian Huxley, a famous biologist who among other accomplishments co-wrote with H. G. Wells a marvelous tome on everything that was then known about biology. Steeped in scientific as well as social discourse, and possessing a deep knowledge of medical and other scientific research, Aldous was in an ideal position to write a far-reaching novel.

This he duly did. The basic premise of the novel sounds eerily prescient. Sometime in the near future, society has been regimented into a caste system in which people are genetically engineered by the state in large state-run reproductive farms. Anticipating ‘The Handmaid’s Tale’, only a select few women and men are capable of providing fertile eggs and sperm for this careful social engineering. The higher castes are strong, intelligent and charismatic. The lower castes are dull, obedient and physically weak. They don’t resent those from the upper castes, because their genetic engineering has largely removed their propensity toward jealousy and violence. Most notably, because reproduction is now the responsibility of the state, there is no longer a concept of a family, of a father or a mother. There is knowledge of these concepts, but it is regarded as archaic history and met with revulsion.

How is this population kept under control? Not shockingly at all: through sex, drugs and rock and roll. Promiscuity is encouraged from childhood onwards and is simply a way of life, and everyone sleeps with everyone else, again without feeling jealousy or resentment (it was this depiction of promiscuity that led to the book being banned in India in the 60s). They flood their bodies with a drug called soma whenever they feel any kind of negative emotion welling up inside, and they party like there’s no end. They are brainwashed into believing in the virtues of these and other interventions by the state through subliminal messages played while they sleep; such unconscious conditioning goes all the way back to their birth. People do die, but out of sight, and while they still look young and attractive. Death is little more than a nuisance, a slight distraction from youth, beauty and fun.

Like Neo from ‘The Matrix’, one particular citizen of this society, Bernard Marx, starts feeling that there is more to the world than would be apparent from this state of induced bliss. On a tryst with a particularly attractive member of his caste at an Indian reservation in New Mexico, he comes across a man referred to as the savage. The savage is the product of an illegitimate encounter (back when there were parents) between a member of a lower caste and the Director of Hatcheries, who oversees all the controlled reproduction. He has grown up without any of the enlightened instruments of the New World, but his mother has kept a copy of Shakespeare with her, so he knows all of Shakespeare by heart and frequently quotes it. Marx brings the savage back to his society. The rest of the book describes the savage’s reaction to this supposed utopia and its ultimately tragic consequences. In the end the savage concludes that it’s better to have free will and feel occasionally unhappy, resentful and angry than to live in a society where free will is squelched and the population is kept bathed in an induced state of artificial happiness.

The vision of technological control in the novel is sweeping and frighteningly prescient. There is the brainwashing and complacent submission to the status quo that everyone undergoes, similar to the conditioning provided in modern times by TV, social media and the 24-hour news cycle. There are the chemical and genetic interventions made by the state at the embryonic stage to make sure that embryos grow up with desired physical or mental advantages or deficiencies; these are exactly the kinds of interventions feared by those wary of CRISPR and other gene-editing technologies. Finally, keeping the population preoccupied, entertained and away from critical thinking through sex and promiscuity is a particularly potent form of societal control, one well appreciated by Victoria’s Secret and one that will only be amplified by developments in virtual reality.

In some sense, Huxley completely anticipates the social problems engendered by the technological takeover of human jobs by robots and AI. Once human beings are left with nothing to do, how does the state keep them from becoming bored, restless and troublesome? In his book “Homo Deus”, Yuval Harari asks the same question and concludes that a technocratic society will come up with distractions like virtual reality video games, new psychoactive drugs and novel forms of sexual entertainment to keep the vast majority of the unemployed from becoming bored and potentially hostile. I do not know whether Harari read Huxley, but I do feel more frightened by Huxley than by Harari. One reason is what Huxley leaves out: the book was published in 1932, so it omits any discussion of nuclear weapons, which were invented more than a decade later. The combination of nuclear weapons with limitless societal control through technology makes for a particularly combustible mix.

The biggest prediction of Huxley’s dystopia, and one distinctly different from Orwell’s or Kafka’s, is that instead of a socialist state, people’s minds are much more likely to be controlled in the near future by the leaders of technology companies like Google and Facebook, who have formed an unholy nexus with the government. With their social media alerts and Fitbits and maps, the tech companies are increasingly telling us how to live our lives and distracting us from free thinking. Instead of communist regimes like the Soviet Union forcibly trampling on individual choice and liberty, we are already gently but willingly ceding our choices, privacy and liberties to machines and algorithms developed by these companies. And just like the state in Huxley’s and Orwell’s works, the leaders of these corporations will tell us why it’s in our best interests to let technology control our lives and freedom, when all the while it is really in their best interests to tell us this. Our capitulation to their inventions will look helpful and voluntary and will feel pleasurable and even noble, but it will be no less complete than the capitulation of every individual in “Brave New World” or “1984”. The only question is: will there be any savages left among us to tell us how foolishly we are behaving?