Some thoughts on "broader impact" statements for scientific papers

I know that conferences like NeurIPS (formerly called NIPS) have asked for statements about ethics and "broader impact" to accompany papers submitted to them. In principle I am all for this, since it's always good for scientists to think about the social implications of their work. I also read the details of the requirements and they aren't draconian, especially for highly theoretical papers whose broader impact is far from clear.

But from a fundamental philosophical viewpoint I still don't think this is useful. My problem is not with the morality of predicting impact but with the lack of utility. The history of science and technology shows that it is impossible to predict broader impact. When Maxwell published his electromagnetic equations he could scarcely have imagined the social and political repercussions electrical power generation would have. When Einstein published his general theory of relativity, he could scarcely have imagined the broader impact it would have on space exploration and GPS, both in war and peace. Perhaps most notably, nobody could have predicted the broader impacts of the discovery of the neutron or the discovery of DNA as the genetic material. I do not see how James Chadwick or Oswald Avery could have submitted broader impact statements with their papers; anything interesting they might have had to say would probably have been untrue a few years later, and anything they might have omitted would probably have turned out to be important.
My biggest problem is not that broader impact statements will put more of a burden on already-overworked researchers, or that they might inflame all kinds of radical social agendas, or that they might bias conferences against very good technical papers whose broader impact is hard to discern, or that they might stifle lines of research considered dangerous or biased. All these problems are real and should be acknowledged. But the real problem is simply that whatever points these statements make would almost certainly turn out to be wrong because of the fundamental unpredictability and rapid progress of technology. They would then only cause confusion by sending people down a rabbit hole, one in which the rabbit not only does not exist but is likely to be a whole other creature. And this will be the case with all new technologies, AI and CRISPR included.

The other problem with broader impact statements is what to do with them even if they are accurate, because accurate and actionable are two different things. Facial recognition software is an obvious example. It can be used to identify terrorists or bad guys, but it can also be used to identify dissidents and put them in jail. So if I submit a broader impact statement with my facial recognition paper and point out these facts, now what? Would this kind of research be banned? That would be throwing the baby out with the bathwater. The fact is that science and technology are always dual use, and it is impossible to separate their good uses from their bad ones except as a matter of social choice after the fact. I am not saying that pointing out this dual use is a bad thing, but I am concerned that doing so might lead to stifling good research for fear that it may be put to bad ends.
So what is the remedy? Except for obvious cases, I would say that science and technology should be allowed to play out the way they have played out since the time of Francis Bacon and the Royal Society, as open areas of inquiry with no moral judgements made beforehand. In that sense science has always put a severe burden on society and asked for a tough bargain in return. It says, "If you want me to be useful, don't put me in a straitjacket and try to predict how I will work out. Instead give me the unfettered freedom of discovery, and then accept both the benefits and the risks that come with this freedom." This is the way science has always worked. Personally I believe that we have done an excellent job maximizing its benefits and minimizing its risks, and I do not see why it should be different with any new technology, machine learning included. Let machine learning run unfettered; while we should be mindful of its broader impact, predicting it will be as futile as damming the ocean.

Book Review: "His Very Best: Jimmy Carter, A Life" by Jonathan Alter

I first saw Jimmy Carter with a few other students during my first year as a graduate student at Emory University, where he remains a visiting professor - I still remember the then 81-year-old briskly striding in with his signature broad, toothy smile and the energy of a man half his age. The amusing thing was that when he opened up the floor to any question we wanted to ask him, the first question someone asked right away was whether LBJ was responsible for JFK's assassination. Without batting an eyelid Carter said no and moved on.

Now I am finishing Jonathan Alter's comprehensive biography of Carter and it's a revelation. The book's main goal is to show that Jimmy Carter is a much more complex human being and president than people believe, and it succeeds exceedingly well in this goal. Carter was a highly intelligent, immensely hardworking and, most importantly, good man in the wrong job. But even then, the things he accomplished were very substantial. Most important were his two signature foreign policy achievements - giving the Panama Canal back to Panama, and brokering a peace between Israel and Egypt which has lasted up to the present day. The Camp David Accords in particular showed a commitment over thirteen days that is without precedent before or since; Carter simply refused to give up, even when Begin and Sadat were on the verge of going home multiple times. The other huge achievement - albeit one that now seems like a mixed blessing - was to normalize relations with China. He also campaigned for human rights at a time when it wasn't fashionable for American presidents to do so, abandoning American presidents' traditional cozy relationship with anti-communist dictatorships.
There are also other, more minor achievements that are now forgotten - appointing more liberal federal judges (even more than Trump), deregulating the airline industry, appointing more African-Americans to important government positions than many of his predecessors, restoring a sense of decency and common sense to the White House after the tumultuous years of Vietnam and the lackluster years of Ford, being the first Democratic president to woo evangelicals (the last before they turned Republican) and popularizing alternative energy and climate change at a time when few people cared about them. There's no doubt that the binary classification of Carter as "failed president, great ex-president" is flawed and that reality is more complex.
The book is also fabulous at exploring Carter's childhood and background in rural Georgia as a farmer and his education in nuclear engineering at Annapolis. Carter grew up as the son of a farmer and general store owner, Earl Carter, who competed with his son in daily tasks and sports and was a fair if harsh father. Carter's mother Lillian, who lived to see her son become president, was quite liberal for her time, and astonishingly joined the Peace Corps and went to India for public service in her sixties. Alter does not shrink from criticizing Carter's poor record on civil rights before he became president (at one point he was friendly with George Wallace). Carter was a product of his time and grew up in the segregated Deep South after all, but this does not excuse his reluctance to take a stand even in matters like school desegregation. Of course, the defining relationship of Carter's life has been with his wife Rosalynn, whom he married when he was twenty-one and she was nineteen; they have been married for more than seventy-five years now. Rosalynn has been a commanding presence in his life, and he often sought advice from her along with his other advisors during the most crucial moments of his presidency.
Carter's main problem was that he was dour, practical and businesslike, and almost completely lacked the warmth, optimism and PR skills that political leaders need to win over not just minds but hearts. He was the strict father who wants to lecture his children about what's best for them. His grim fireside chats about consumption and self-indulgence, while sensible, did not go down well with the American people.
In addition, while he did a good job during his first two years, Carter was completely overtaken by global events during his final two, most notably the Iranian Revolution, Soviet aggression and the oil crisis that sent prices soaring. The book does a good job showing that while Carter was not responsible for these events, he was as clueless in understanding the situation in Iran as anyone else.

There is an excellent account of the Iranian hostage crisis in Alter's biography which includes many details that I did not know. It was a real tragedy, a comedy of errors in some sense, albeit one to which the US had yoked itself by installing the Shah of Iran in a coup in 1953 (there is an excellent account of how Mohammed Mosaddegh's democratic government was toppled in Stephen Kinzer's book "All the Shah's Men"). The hostage crisis was essentially triggered by the Shah being allowed into the US for medical treatment. He had fled Iran shortly before Ayatollah Khomeini returned from exile in France.
The Shah's case was engineered by a lobby led prominently, among others, by Kissinger - the man's villainy continued unabated even after he left the Nixon administration. He and his cabal greatly exaggerated the Shah's medical condition and forced Carter to admit him into the US on humanitarian grounds; the Shah was in Mexico, and the Kissinger faction wrongly made the case that Mexican hospitals weren't equipped to diagnose and treat him. This was the last straw, since it told the Iranians that the US was about to embark on another 1953-like adventure. This wasn't true, but at this point cooler heads weren't prevailing.
Also complicit in the disaster was Carter's hawkish national security advisor Zbigniew Brzezinski, an early neoconservative. A dissenter was secretary of state Cyrus Vance, who strenuously advised against letting the Shah in, having a good idea of how perilous the situation in Iran was. As it turned out, Carter, blindsided and ignorant of the internal situation in Iran, ended up letting the Shah in (before hurriedly getting him out again).
The irony was that the die had been cast by Carter's predecessors, especially Eisenhower and Nixon; Carter himself had very little interest in adventurism abroad, but the Shah was America's burden to bear, and the Iranians conflated Carter's actions with those of his predecessors. Once the hostages were taken, Carter's hands were tied for months, his approval ratings plummeted and the way was paved for Reagan (and for Ben Affleck and his team in 'Argo').
I always feel that the fractured relations between the US and Iran constitute one of the great international tragedies of the 20th and early 21st centuries. Both countries have a rich heritage and have so much to offer each other. If the US had not completely thrown in its lot with Saudi Arabia and Israel and had instead been friends with Iran, we would have had a powerful ally against Islamic fundamentalism in the Middle East. As it happened, the current Iranian regime is certainly nothing to praise and funds terrorist groups like Hezbollah. But it's important to note that it was largely US actions in the 1950s that led to the present state of affairs.
In retrospect, it appears obvious how someone like Reagan, who was just fundamentally better at pleasing people and projected sunny optimism, could defeat Carter. Fortunately Carter's career was just beginning at the end of his presidency, and in the next three decades he did very significant human rights work, including nearly eradicating guinea worm disease in Africa and working on Habitat for Humanity, winning the Nobel Peace Prize in the process and becoming a far more deserving recipient than most others who got the prize. Now 97, he still teaches Sunday School in Plains, GA. What we need today is the pragmatism and intelligence of Jimmy Carter and the optimism of Ronald Reagan.
A fantastic book, well worth its almost 800 pages, and likely the definitive biography of a remarkable man for many years to come.

Book Review: "The Jews of Spain", by Jane Gerber

Jane Gerber’s “The Jews of Spain” is a superb and comprehensive look at the history of the Sephardim - one of the two major branches of Jewry, the other being the Ashkenazim. The Sephardim originated in Spain and today occupy a place of high prominence. While the Ashkenazim are better known, about sixty percent of Israel’s population consists of the Sephardim.

Two main qualities mark the Sephardim. One is common to all Jews: the ability to persevere and thrive against all odds through the centuries. The other is more distinctive: their flourishing under Muslim rule. Except in Germany in the 19th century and the United States in the 20th, Jews have thrived nowhere else so well after they were driven out of the Roman Empire.
The book starts with the miserable fate of the Jews in Spain under the Visigoths; the Jews had fled there after being persecuted under the Roman Empire. It was in 711 AD, when the Arabs under Tariq invaded Spain and defeated the Visigoths, that their fortunes changed. This was largely because of the tolerant, creative and far-ranging Umayyad Caliphate that moved to Spain from Damascus, with Abd-Al-Rahman I founding its branch in Spain. The Umayyad Caliphate marked one of the highlights of the history of Islamic civilization, making its home in Cordoba at first and then in places like Seville and Toledo. Not only did it allow Jews to practice their religion freely but it employed them in almost all important professions. Jews still did not have all the same rights as Muslims, but they could trade, study medicine and the arts and sciences, compose poetry and music and generally take part in the political life of Islamic Spain to the fullest extent. There were many notables among Jewish intellectuals, with perhaps the peak reached by Hasdai Ibn Shaprut, who became vizier and foreign minister.
Politically, the most useful purpose the Jews of Spain served was to mediate between the Byzantine and Christian kingdoms and Muslim lands. In some sense, because they were equally loved and hated by both religions, the Jews could perform this balancing act well as neutrals. The Jews of Spain assimilated Arabic and moved smoothly between Arabic and Hebrew, often translating texts between the two. Even more valuably, they translated important Greek texts on science and medicine into Arabic; later these Arabic texts were translated into Latin by Christian scholars during the translation movement of the 10th and 11th centuries. The Jews of Spain thus served as a critical conduit between the Muslim and Christian worlds, diplomatically trading and interacting with each while performing valuable functions for both. Jews became great and far-flung traders, braving pirates and trading precious pearls, textiles, spices and other goods. The Radhanites were a particularly prominent group of Jewish merchants who went back and forth between Spain and lands as far away as India and China. Some of these Jews even made their homes in India and China, becoming, for instance, the Bene Israel of Maharashtra in India.
This useful and productive existence came to an end with the Reconquista and the Christian invasions of Spain. Under a series of more repressive and less tolerant Islamic regimes that included the Almoravids and the Almohads, the Jews entered a period of decline. The Christians then decisively defeated the Muslims at the Battle of Las Navas de Tolosa in 1212. Within the next few decades both Cordoba and Seville fell to the Christians, and only the Islamic Kingdom of Granada remained. When Granada fell to the Christians, the fate of Spain's Jews was sealed.
The next two hundred years marked a period of severe decline for the Jews of Spain. Many started converting under Christian pressure. But the real blow came when Isabella and Ferdinand of Castile and Aragon unified the country. At first somewhat tolerant of the Jews, in 1478 they approved the dreaded Spanish Inquisition, which started hauling converted Jews before feared inquisitors like Torquemada, torturing them and extracting false confessions. Finally the watershed came in 1492, when Isabella and Ferdinand issued the famous edict of expulsion that gave the 300,000 Jews of Spain the stark choice between converting and fleeing. (Centuries earlier, fleeing the Almohad persecution, Spain's foremost Jewish intellectual, the scholar Maimonides - whose Mishneh Torah and Guide for the Perplexed remain touchstones of Judaism even today - had found refuge in Egypt.) Many of those expelled left Spain for Portugal, where the Portuguese Inquisition was even worse. So pernicious were its methods that many Jews became marranos - underground Jews who practiced their religion so cryptically that nobody knew. These crypto-Jews evolved a form of their religion that would have been almost incomprehensible to their ancestors. From the marranos arose some of the most prominent Jews of later years, including Spinoza.
When Portugal also denied the Jews sanctuary they dispersed to other parts of Europe and the Middle East. By this time the plight of Jews in Europe had become even more dire. The Black Death of 1347 had created an atmosphere of acute paranoia in which Jews were accused of ritual murder and of poisoning wells. Even the Pope cautioned against such unfounded rumors, but it did not stop violent pogroms from erupting in which Jews were massacred wholesale and burnt alive. England had already expelled its Jews long before, and apart from scattered pockets in France like Bayonne, they could find no respite. It was at this point that the Jews saw their second resurgence in the Ottoman Empire - in Turkey.
The remarkable story of the Turkish Jews is a story unto itself, but in Istanbul under Ottoman sultans like Bayezid and Suleiman, the Jews achieved a prominence that they had only known under the Umayyad Caliphate. Interestingly, it was here that they met the Ashkenazim who had come from Europe, and for a long time the much more affluent and educated Sephardim looked down upon their Ashkenazi co-religionists as uncouth and poor. Most importantly, and as a testament to the freedom they enjoyed, they were allowed to establish their own printing presses in the 15th and 16th centuries. The printing presses allowed them to keep not just their religion but many of their religious books alive. Once again they served as mediators with Christians in Europe. One of the most remarkable among them was the Portuguese marrano Dona Gracia, a self-taught woman who became a wealthy trader after fleeing Portugal, ran an underground pipeline for Portuguese Jews targeted by the Inquisition and successfully organized a boycott of an Italian port after Jews there had been persecuted by the papacy.
Unfortunately, once the Ottoman Empire weakened in the 17th century and the Christian kingdoms imposed a series of harsh punitive measures on the Jews there, they had no choice but to flee. The last part of the book describes this flight. The Jews of Turkey went in two different directions. Some went to Southeastern Europe, to Greece and the Balkans. Others went to the Netherlands, which in the 17th century was the most progressive country in Europe. Here the Jews found plenty of opportunities in trading and banking. One of the most important Jews here was Spinoza, who was ironically excommunicated by his own people at the age of twenty-four and had to spend the rest of his life grinding lenses to support himself. Nevertheless, he became a forerunner of the Jewish Enlightenment and inspired many other philosophers in Europe.
From the Netherlands some Jews made it to South America, especially Brazil. But once Brazil was threatened by Portugal, a small group of Jews set out in 1654 for a new land where they would establish the most important Jewish community of modern times - the nascent United States. Over the next one hundred years, Jews became successful traders and professionals in a secular republic, fought in the American Revolution and established thriving communities in many states, even making it as far as the Ohio Valley. The United States was to see two other great waves of European Jewry, one in the mid 19th century and the other in the early 20th century. They were welcomed with the memorable words written by Emma Lazarus, herself a Jew, on the Statue of Liberty. But the Sephardim got there first, way back in 1654.
Sadly, the plight of the other wave of Ottoman Jewry was much worse. Greece was taken over by the Nazis and tens of thousands of Greek and Macedonian Jews were sent to the death camps. Once the war ended, scattered bands of Jews from all over Europe and the Middle East, along with survivors of the concentration camps, started making their way back, looking for family and friends. Shattered to find most of them missing, they made their way to the only place that would give them spiritual solace - Israel. Today the majority of Israeli Jews are Sephardim.
The Sephardim retained a striking love for their ancestral country. After the Bosnian war in the 1990s, many petitioned the King of Spain for refuge in a country their ancestors had left hundreds of years ago. The ties that bound them to Spain were deep and invisible. Today, when the Middle East is a cauldron of ethnic and religious conflict between Israel and the Arab nations, it’s worth remembering that historically, Jews were treated much better by Muslim kings than by Christian ones. Their history in Spain and in the Ottoman Empire is a testament to their doggedness, their resurgent creativity and their sponge-like capacity to absorb critical elements of the surrounding culture while staying true to their roots. It’s a glorious and moving history, and Jane Gerber tells it well.

Book Review: "The Pity Of It All: A Protrait of the German-Jewish Epoch", 1743-1933, by Amos Elon



Amos Elon’s ‘The Pity of It All’ is a poignant and beautiful history of German Jews from 1743 to 1933. Why 1743? Because in 1740, Frederick of Prussia liberalized the state and allowed freedom of worship. The freedom did not extend to Jews, who still had no political or civil rights, but it did make it easier for them to live in Prussia than in the other thirty-six states of what later came to be called Germany.

The book begins with the story of the first prominent modern German-Jewish intellectual, the fourteen-year-old, barefooted Moses Mendelssohn, who entered Berlin through a gate reserved for “Jews and cattle”. Mendelssohn was the first Jew in an enduring tradition that was to signal both the high watermark of European Jewry and its eventual destruction. This was the almost vehement effort of Jews to assimilate, to convert to Christianity, to adapt to German traditions and ways, to become bigger German patriots than most non-Jewish Germans while retaining their culture and identity. In fact the entire history of German Jewry is one of striking a tortuous balance between assimilating into the parent culture and preserving their religion and identity. Mendelssohn became the first great German-Jewish scholar, translating the Pentateuch into German and commanding an unsurpassed knowledge of both German and Jewish philosophy, culture and history. A chance encounter with a Protestant theologian who exhorted him to convert ended up convincing Mendelssohn that he should be more proud of his Jewish roots while still seeking to make himself part and parcel of German society. Mendelssohn’s lessons spread far and wide, not least to his grandson, the famous composer Felix Mendelssohn, who used to go to Goethe’s house to play music for him.

Generally speaking, the history of German Jews tracks political upheavals well. After Prussia became a relatively liberal state and, goaded by Mendelssohn, many Jews openly declared their Judaism while forging alliances with German intellectuals and princes, their condition improved relative to the previous few centuries. A particularly notable example cited by Elon is the string of intellectual salons, rivaling their counterparts in Paris, that were started in Berlin by Jewish women like Rahel Varnhagen and drew Goethe, the Humboldt brothers and the rest of the cream of German intellectual society. The flowering of German Jews as well-to-do intellectuals and respectable members of the elite starkly contrasted with their centuries-old image in the rest of Europe as impoverished caftan-wearers, killers of Christ and targets of the blood libel. Jews had been barred from almost all professions except medicine, and it was in Prussia that they could first enter other professions.

When Napoleon invaded Prussia, his revolutionary code of civil and political rights afforded the German Jews freedom that they had not known for centuries. The Edict of 1812 freed the Jews. They came out of the ghettoes, especially in places like Frankfurt, and the Jewish intelligentsia thrived. Perhaps foremost among them was the poet Heinrich Heine, whose astute, poignant, tortured and incredibly prescient poetry and prose were a kaleidoscope of the sentiments and condition of his fellow Jews. Heine reluctantly converted but was almost tortured by his torn identity. The Edict of 1812 was met with a rising tide of German nationalism from below, and Jews quickly started reverting to their second-class status. Heine, along with Eduard Gans and Leopold Zunz, who started one of the first societies for the scientific study of Judaism in Germany, had trouble finding academic jobs. The Hep! Hep! riots that started in Würzburg and spread throughout Germany were emblematic of the backlash. Significantly, and again in potent portent, this was the first time that German intellectuals took part in violent anti-Semitism; later, when the Nazis took over, the legal basis of their murderous anti-Semitism was undergirded by intellectuals, and it was intellectuals who drew up the Final Solution in 1942 at the Wannsee Conference. Jews started to convert in record numbers to escape discrimination.

For the next few decades, maintaining this delicate and difficult balance between two identities was to become a hallmark of the Jewish condition in Germany, although scores also converted without any compunction. Writing from Paris in 1834, Heine issued a warning:

“A drama will be enacted in Germany compared to which the French Revolution will seem like a harmless idyll. Christianity restrained the martial ardor of the Germans for a time but it did not destroy it; once the restraining talisman is shattered savagery will rise again, the mad berserker rage of which the Nordic poets sing and speak. The old stony gods will rise from the rubble and rub the thousand-year-old dust from their eyes. Thor with his giant hammer will come forth and smash the granite domes.”

Extraordinarily prescient words, especially considering the Nordic reference.

The next upheaval came with the European liberal revolutions of 1848. As is well known, these revolutions overthrew monarchies - temporarily - throughout Europe. For the first time, Germany’s Jews could agitate not just for civil but for political rights. A record number of Jews were appointed to the Prussian parliament under Frederick William IV. Unfortunately even this revolution was not to last. Frederick William reneged on his promises, many Jews were either ejected from parliament or rendered impotent, and another rising tide of nationalism engulfed Germany. The next few decades, while not as bad as the ones before, rolled back many of the strides that had been made.

It’s in this context that the rise of Bismarck is fascinating. Bismarck defies many stereotypes. He was the emblem of Prussian militarism and autocracy, the man who united Germany, but also the pragmatist who kickstarted the German welfare state, pioneering social security and health insurance. When war with France came in 1870, patriotic Jews not only took part in it but funded it. “Bismarck’s Jews” procured the money and helped Bismarck draw up the terms of French capitulation and occupation at Sedan. Among these, Ludwig Bamberger and Gerson Bleichröder were the most prominent - Bleichröder even used stones from Versailles to build a large mansion in Germany. While praising these Jews for their contributions to the war effort, Bismarck stopped short of saying that they should be awarded full rights as citizens of Germany. Nevertheless, in 1871, Bismarck passed an edict that banned discrimination on the basis of religion in all civil and political functions. It seemed that the long-sought goal of complete emancipation was finally in sight for Germany’s Jews.

But even then, as patriotic Jews signed up for the Franco-Prussian War, a dissenting note was struck by another Jew, Leopold Sonnemann, the publisher of a leading Frankfurt newspaper. In editorial after editorial, he issued stark warnings to both Jews and gentiles about the rising militarism and rigid social order in Prussia that was taking over all of Germany. He warned Jews that, ironically, their patriotism might cost them more than they bargained for. Sonnemann was another prescient Jew who saw what his community’s strenuous efforts to conform were costing them. His predictions were confirmed almost right away when a recession hit Germany in 1873 that was among the worst of the previous hundred years. Immediately, as if on cue, anger turned toward the wealthy Jews who had supposedly grown fat and rich during the war while their fellow citizens grew impoverished. In 1879, a prominent Protestant clergyman named Adolf Stoecker started railing against the Jews, calling them a “poison in German blood”, prefiguring paranoia that was leveraged to devastating effect by another Adolf a half century later. The Kaiser and Bismarck both disapproved of Stoecker’s virulent anti-Semitic diatribes, but thought that they might perhaps make the Jews more “modest”. To say that this was unfair retaliation against a patriotic group who had bankrolled and significantly helped the war effort would be an understatement.

Even as Bismarck was propagating religious freedom in Germany, anti-Semitism continued to grow elsewhere. Interestingly, it was in France, where Jews had fared much better after Napoleon, that Arthur Gobineau published a book arguing for Nordic superiority. The fascinating but deadly English-German Houston Stewart Chamberlain, son-in-law of Wagner, published the massive “Foundations of the Nineteenth Century” in 1899, which became a kind of Bible for the 20th-century pan-German Völkisch movement that fused nationalism with racialism. Both Gobineau and Chamberlain were to serve as major ‘philosophers’ for Hitler and the Nazis. In France, the Dreyfus affair had already exposed how fragile the situation of French Jews was.

As sentiment against the Jews grew again, German Jews became disillusioned with conversion and conformity. Mysticism-based theologies rooted in the Kabbalah started to be propounded by the likes of Martin Buber. Rather than keep bending over backward to please an ungrateful nation, some sought other means of reform and escape. Foremost among these was the centuries-old dream of returning to the promised land. Theodor Herzl picked up the mantle of Zionism and started trying to convince Jews to migrate to Palestine. Ironically, the main target of his pleas was the Kaiser. Herzl wanted the Kaiser to fund and officially approve Jewish migration to Palestine. Not only would that reduce the Jewish population in Germany and perhaps ease the pressure on gentiles, but in doing so the Kaiser would be seen as a great benefactor and liberator. In retrospect Herzl’s efforts have a hint of pathos about them, but at the time they made sense. The ironic fact is that very few German Jews signed on to Herzl’s efforts to emigrate, because they felt at home in Germany. This paradox was to prove the German Jews’ most tragic quality. Where Herzl sought emigration, others like Freud and Marx (who had been baptized as a child) sought secular idols like psychoanalysis and communism. This would have been a fascinating theme in itself, and I wish Elon had explored it in more detail.

As the new century arrived and another Great War loomed, the themes of 1870 were repeated. The ‘Kaiserjuden’, or Kaiser’s Jews, most prominently Walther Rathenau, bankrolled and ran Germany’s war with England and France. Many Jews again signed up out of patriotic duty. Without Rathenau, who was in charge of logistics and supplies, Germany would likely have lost the war within a year or two. Yet once again, the strenuous efforts of these patriotic Jews were forgotten. A young Austrian corporal who had been temporarily blinded by gas took it upon himself to proselytize the “stab in the back” theory, the unfounded belief that it was the Jews who secretly orchestrated an underhanded deal that betrayed the army and cost Germany the war. The truth of course was the opposite, but it’s important to note that Hitler did not invent the myth of Jewish betrayal. He only masterfully exploited it.

The tragic post-World War 1 history of Germany is well known. The short-lived republics of 1919 were followed by mayhem, chaos and assassinations. The Jews Kurt Eisner in Bavaria and Walther Rathenau were assassinated. By that time there was one discipline in which Jews had become preeminent - science. Fritz Haber had made a Faustian bargain when he developed poison gas for Germany. Einstein had put the finishing touches on his general theory of relativity by the end of the war and had already become a target of anti-Semitism. Others like Max Born and James Franck were to make revolutionary contributions to science in the turmoil of the 1920s.

Once the Great Depression hit Germany in 1929, the fate of Germany’s Jews was effectively sealed. When Hitler became chancellor in 1933, a group of leading Jewish intellectuals orchestrated a massive but, in retrospect, pitiful attempt to catalog the achievements of German Jews. The catalog included important contributions by artists, writers, scientists, philosophers, playwrights and politicians, in an attempt to convince the Nazis of the foundational contributions that German Jews had made to the fatherland. But it all came to nothing. Intellectuals like Einstein soon dispersed. The first concentration camp went up at Dachau in 1933. By 1938 and Kristallnacht, it was all over. The book ends with Hannah Arendt, protégée of Martin Heidegger (who became a committed Nazi), fleeing Berlin in the opposite direction from which Moses Mendelssohn had entered the city two hundred years earlier. To no other nation had Jews made more enduring contributions and tried so hard to fit in. No other nation punished them so severely.

Book Review: "Against the Grain", by James Scott

James Scott's important and utterly fascinating book questions what we can call the "Whig view" of history, which goes something like this: at the beginning we were all "savages". We then progressed to becoming hunter-gatherers, then at some point we discovered agriculture and domesticated animals. This was a huge deal because it allowed us to become sedentary. Sedentism then became the turning point in the history of civilization because it led to cities, taxation, monarchies, social hierarchies, families, religion, science and the whole shebang of civilizational wherewithal that we take for granted.

Scott tells us that not only is this idea of progress far from being as certain, linear or logical as it sounds, but it's also not as much of a winning strategy as we think. In a nutshell, his basic thesis is that the transition from hunter-gathering to sedentism was messy and mixed, with many societies sporadically existing in one or the other system. The transition from agriculture to city-states was even less certain and obvious, with agriculture emerging about 10,000 years ago and the first true city-state, Uruk in Mesopotamia, emerging some five thousand years later, around 3000 BC. Until then people existed in temporary and fluctuating states of agriculture and hunter-gatherer existence.

Perhaps an even bigger message of the book concerns the very nature of history, which basically tells us only the stories it preserves. Cities form the core of history because they leave traces like large monuments, but life outside cities, which could be far more extensive - as it was until very recently - leaves no traces and is discounted in our narratives. The fact is that even after agriculture and the first city-states came along, cities were often temporary and fragmentary; they often dispersed - because of disease, famine, war, oppressive taxation and rulers, floods and droughts - and re-formed, much like an anthill. The population would live off the land as hunter-gatherers for some time and form city-like complexes again when the time was ripe. As part of his evidence that cities were by no means inevitable, Scott makes the argument that the first civilizations formed around waterways and not on the plains or in the mountains. These civilizations were mixed models of hunter-gatherer and city-like existence at best.

Once we accept that cities were by no means enduring or certain, we can start questioning the wisdom of other narratives associated with them. For instance, take the all-important status of grains (wheat, barley, corn and rice) as the major staples of the world, then as now. Scott makes the brilliant argument that unlike other crops like potatoes and legumes, grains became the staple of city-dwellers not because they were objectively better in terms of nutrition but because they could be easily taxed: they grew above ground, ripened all at once and could be counted, assessed and carted away. But grains often consigned city residents to a monoculture, unlike hunting and gathering, which could take advantage of a variety of food sources on land, in water and in brush.

The same arguments apply to the domestication of animals. As Jared Diamond showed in his book "Guns, Germs and Steel", most of our modern diseases and pandemics can be traced back to diseases of animal, or zoonotic, origin, so domestication was hardly the wholly blissful invention we assume. With domesticated animals also came rats, sparrows, crows and mice - the so-called commensals - which brought other sources of destruction and disease. Finally, taxation, which was a major feature of cities and which contributed massively to critical developments like slavery and writing, could become very oppressive.

All this meant that cities were hardly the nuclei of civilizational progress that we assume them to be. Not surprisingly, especially in a hybrid model, city dwellers often fled the unsanitary, tax-heavy, monoculture-bound environment of cities for a more flexible and open hunter-gatherer existence. In fact the vast majority of the population lived outside cities until very recently. Now, by no means is Scott making the argument, popular among "back to nature" paleo-enthusiasts, that hunting and gathering was fundamentally a better existence. He is saying that hunting and gathering continued to have advantages that made a permanent move to cities, until very recently, far from desirable, let alone inevitable. Unfortunately, because cities leave archeological traces, we fall into the mistaken assumption that the history of civilization is the history of city-states.

In the last part of the book, Scott tackles the topic of "barbarians" versus city dwellers. Given the preceding discussion, it should come as no surprise that Scott is deeply skeptical of the very word, as invented by the Greeks and applied generously by the Romans. Compared to the Greek and Roman city-states, the barbarian countryside was often thriving and more desirable to live in. More importantly, the very distinction between barbarians and "civilized" folk is fluid and fuzzy - as is now well known in the case of Rome, Romans could be barbarians, and barbarians could be Roman citizens (a point popularized recently in the Netflix show "Barbarians"). The fact is that Romans often willingly became "barbarians" because of the oppressive nature of the city-state.

Scott's book is one of the most important books I have read in years; it may well be one of the most important books I will ever read. The best thing about it is that it presents history the way it was, as a series of incidental, messy events whose end outcome was by no means certain. Whatever order we decide to impose on history is of our own making.

Two views of America

The United States is a country settled by Europeans in the 17th and 18th centuries who created a revolutionary form of government and a highly progressive constitution - one that could be amended - guaranteeing freedom of speech, religion and other fundamental human rights. It was a development unprecedented in space and time.

At the same time, this creation of the American republic came at great cost involving the displacement and decimation of millions of Native Americans and the institution of chattel slavery on these lands. The original constitution had grave deficiencies and it took a long time for these to be remedied.

Many people can’t hold these two very different-sounding views of America in their minds simultaneously and choose to emphasize one or the other, and this divide has only grown in recent times. But both of these views are equally valid and equally important, and ultimately in order to understand this country and see it progress, you have to be at peace with both views.

But it’s actually better than that, because there is a third, meta-level view which is even more important, that of progress. The original deficiencies of the constitution were corrected and equal rights extended to a variety of groups who didn’t have them, including women, people of color, Catholics, Jews and immigrants. Chattel slavery was abolished and Native Americans, while not reverting to their previous status, could live in dignity as American citizens. 

This was the idea of constantly striving toward a “more perfect Union” that Lincoln emphasized. There were hiccups along the way, but overall there was undoubtedly great progress. Today American citizens are some of the freest people in the world, even with the challenges they face. If you don’t believe this, then you effectively believe that the country is little different from what it was fifty or a hundred years ago.

It seems that this election and these times are fundamentally about whether you can hold the complex, often contradictory history of this country in your mind without conflict, and more importantly whether you subscribe to the meta-level view of progress. Because if you can’t, you will constantly either believe that the country is steeped in irredeemable sin or sweep its inequities under the rug. Not only would both views be pessimistic but both would do a disservice to reality. But if you can in fact hold this complex reality in mind, you will believe that this is a great country not just in spite of its history but because of it.

A Foray into Jewish History and Judaism

I have always found the similarities between Hinduism and Judaism (and between Hindu Brahmins in particular and Jews) very striking. In order of increasing importance:

1. Both religions are very old, extending back unbroken between 2500 and 3000 years with equally old holy books and rituals.

2. Both religions place a premium on rituals and laws such as dietary restrictions.

3. Hindus and Jews have both endured for a very long time in spite of repeated persecution, exile and oppression, although this is far more true for Jews than for Hindus. Of course, Brahmins carry the burden of caste while Jews have no such thing, but both Hindus and Jews have been persecuted for centuries by Muslims and Christians. At the same time, people of both faiths have also lived in harmony and productive intercourse with these other faiths for almost as long.

4. Both religions place a premium on the acquisition and dissemination of knowledge and learning. Even today, higher education is a priority in Jewish and Hindu families. As a corollary, both religions also place a premium on fierce and incessant argumentation and are often made fun of for this reason.

5. Both religions are unusually pluralistic, secular and open to a variety of interpretations and lifestyles without losing the core of their faith. You can be a secular Jew or an observant one, a zealous supporter or harsh critic of Israel, a Jew who eats pork and still calls himself a Jew. You can even be a Godless Jewish atheist (as Freud called himself). Most importantly, as is prevalent especially in the United States, you can be a “cultural Jew” who enjoys the customs not because of deep faith but because it fosters a sense of community and tradition. Similarly, you can be a highly observant Hindu, a flaming Hindu nationalist, an atheist Hindu who was raised in the tradition but who is now a “cultural Hindu” (like myself), a Hindu who commits all kinds of blasphemies like eating steak and a Hindu who believes that Hinduism can encompass all other faiths and beliefs.

I think that it’s this highly pluralistic and flexible system of belief and tradition that has made both Judaism and Hinduism what Nassim Taleb calls “anti-fragile” - not just resilient but actively energized in the face of bad events. Not surprisingly, Judaism has always been a minor but constant interest of mine, and there is no single group of people I admire more. Previously the interest manifested itself in my study of Jewish scientists like Einstein, Bethe, von Neumann, Chargaff and Ulam, many of whom fled persecution and founded great schools of science and learning. Although I am familiar with the general history, I am planning to do a deeper dive into Jewish history this year. Here is a list of books that I have either read (*), am reading ($) or am planning to read (+). I would be interested in recommendations.

1. Paul Johnson’s “The History of the Jews”. (*)

2. Simon Schama’s “The Story of the Jews”. (*)

3. Jane Gerber’s “The Jews of Spain”. ($)

4. Nathan Katz’s “The Jews of India”. (*)

5. Amos Elon’s “The Pity of It All: A Portrait of the German-Jewish Epoch, 1743-1933”. ($)

6. Norman Lebrecht’s “Genius and Anxiety: How Jews Changed the World: 1847-1947”. ($)

7. Erwin Chargaff’s “Heraclitean Fire”. (*)

8. Stanislaw Ulam’s “Adventures of a Mathematician”. (*)

9. Stefan Zweig’s “The World of Yesterday”. (*)

10. Primo Levi’s “Survival in Auschwitz” and “The Periodic Table”. (*)

11. Robert Wistrich’s “A Lethal Obsession: Anti-Semitism from Antiquity to the Global Jihad”. (*)

12. Jonathan Kaufman’s “The Last Kings of Shanghai”. (This seems quite wild) (+)

13. Istvan Hargittai’s “The Martians of Science”. (*)

14. Bari Weiss’s “How to Fight Anti-Semitism”. (+)

15. Ari Shavit’s “My Promised Land”. (+)

16. Norman Cohn’s “Warrant for Genocide: The Myth of the Jewish World Conspiracy and the Protocols of the Elders of Zion” (*)

17. Irving Howe’s “World of Our Fathers: The Journey of the East European Jews to America and the Life They Found and Made“ (+)

18. Edward Kritzler’s “Jewish Pirates of the Caribbean” (another book that sounds wild) (+)

19. Alfred Kolatch’s “The Jewish Book of Why” (+)

20. Simon Sebag-Montefiore’s “Jerusalem” ($)

Life. Distributed.

One of my favorite science fiction novels is “The Black Cloud” by Fred Hoyle. It describes an alien intelligence in the form of a cloud that approaches the earth and settles close to the sun. Because of its proximity to the sun, the cloud wreaks havoc with the climate and thwarts the attempts of scientists to both study it and attack it. Gradually the scientists come to realize that the cloud is an intelligence unlike any they have encountered. They are finally successful in communicating with the cloud and learn that its intelligence is conveyed by electrical impulses moving inside it. The cloud and the humans finally part on peaceful terms.

There are two particularly interesting aspects of the cloud that warrant further attention. One is that it's surprised to find intelligence on a solid planet; it is used to intelligence being gaseous. The second is that it's surprised to find intelligence concentrated in individual human minds; it is used to intelligence constantly moving around. These aspects of the story are interesting because they show that Hoyle was ahead of his time, already thinking about forms of intelligence and life that we have barely scratched the surface of.

Our intelligence is locked up in a three-pound mass of wet solid matter, the result of the development of the central nervous system. The central nervous system was one of the great innovations in the history of life. It allowed organisms to concentrate their energy and information-processing power in a single mass that sent out tentacles communicating with the rest of the body. The tentacles are important, but the preponderance of the brain's capability resides in the brain itself, a single organ that cannot be detached or disassembled and moved around. From dolphins to tigers and from bonobos to humans, we find the same basic plan, which exists for good reasons. The centralized nervous system is an example of what's called convergent evolution – the tendency of evolution to independently find the same solutions to complex problems. Especially in Homo sapiens, the central nervous system and the consequent development of the neocortex are seen as the crowning glory of human evolution.

And yet it’s the solutions that escaped the general plan that are the most interesting in a sense. Throughout the animal and plant kingdom we find examples not of central but of distributed intelligence, like Hoyle’s cloud. Octopuses are particular fascinating examples. They can smell and touch and understand not just through their conspicuous brains but through their tentacles; they are even thought to “see” color through these appendages. But to find the ultimate examples of distributed intelligence, it might be prudent not to look at earth’s most conspicuous and popular examples of life but its most obscure – fungi. Communicating the wonders of distributed intelligence through the story of fungi is what Merlin Sheldrake accomplishes in his book, “Entangled Life”.

Fungi have always been our silent partners, partners that are much more like us than we imagine. Like bacteria they are involved in an immense number of activities that both aid and harm human beings, but most interestingly, fungi, unlike bacteria, are eukaryotes and are therefore, counterintuitively, evolutionarily closer to us than to their superficially similar counterparts. And they get as close to us as we can imagine. Penicillin is famously produced by a fungus; the antifungal drug fluconazole, in turn, is used to kill fungal infections. Fungal infections can be deadly; Aspergillus forms clumps in the lungs that can rapidly kill patients by spreading through the bloodstream. Fungi of course charm purveyors of gastronomic delights everywhere in the world as mushrooms, and they also charm purveyors of olfactory delights as truffles; a small lump can easily sell for five thousand dollars. Last but not least, fungi have taken millions of humans into other worlds and artistic explosions of color and sight by inducing hallucinations.

With this diverse list of vivid qualities, it may seem odd that perhaps the most interesting quality of fungi lies not in what we can see but in what we can't. Mushrooms may grace dinner plates in restaurants and homes around the world, but they are merely the fruiting bodies of fungi. Fungi may be visible as clear vials of life-saving drugs in hospitals. But as Sheldrake describes in loving detail, the most important parts of fungi are hidden below the ground. These are the vast networks of the fungal mycelium – the sheer, gossamer, thread-like structure snaking its way through forests and hills, sometimes spreading over hundreds of square miles, occasionally as old as the neolithic revolution, all out of sight of most human beings and visible only to the one entity with which it has forged an unbreakable, intimate alliance – trees. Dig a little into a tree root and put it under a microscope and you will find wisps of what seem like even smaller roots, except that these penetrate into the tree's roots. The wisps are fungal mycelium. They are everywhere: around roots, under them, over them and inside them. At first glance the ability of fungal networks to penetrate inside tree roots might evoke pain and invoke images of an unholy, literal physical union of two species. It's certainly a physical union, but it may be one of the holiest meetings of species in biology. In fact it might well be impossible to find a tree whose roots have no interaction with fungal mycelium. The vast network of fibers the mycelium forms is called a mycorrhizal network.

The mycorrhizal networks that wind their way in and out of tree roots are likely as old as trees themselves. The alliance almost certainly exists because of a simple matter of biochemistry. When plants first colonized land they possessed the miraculous ability of photosynthesis, which completely changed the history of life on this planet. But unlike carbon, which they can literally capture out of thin air using sunlight, they still have to find the other essential nutrients of life: metals like magnesium and other life-giving elements like phosphorus and nitrogen. Because of their intrinsic lack of mobility, plants and trees had to find someone who could bring them these essential elements. The answer was fungi. Fungal networks stretching across miles ensured that they could shuttle nutrients back and forth between trees. In return the fungi could consume the precious carbon that the tree sank into its body – as much as twenty tons during a large tree's lifetime. It was a classic example of symbiosis, a term coined by the German botanist Albert Frank, who also coined the term mycorrhiza.

However, the discovery that fungal networks could supply trees with essential nutrients in a symbiotic exchange was only the beginning of the surprises they held. Sheldrake talks in particular about the work of the mycologists Lynne Boddy and Suzanne Simard, who have found qualities in the mycorrhizal networks of trees that can only be described as deliberate intelligence. Here are a few examples: fungi seem to "buy low, sell high", providing trees with important elements when the trees have fallen on hard times and liberally borrowing from them when they are doing well. Mycorrhizal networks also show electrical activity and can discharge a small burst of electrochemical potential when prodded. They can entrap nematodes in a kind of death grip and extract their nutrients; they can do the same with ants. Perhaps most fascinatingly, fungal mycelia display "intelligence at a distance": one part of a huge fungal network seems to know what the other is doing. The most striking experiment demonstrating this shows oyster mushroom mycelium growing on a piece of wood and spreading in all directions. When another piece of wood is kept at a distance, within a few days the fungal fibers spread and latch on to it. This is perhaps unsurprising. What is surprising is that once the fungus discovers this new food source, it almost instantly pares down growth in all other parts of its network and concentrates it in the direction of the new piece of wood. Even more interestingly, scientists have found that the growing tips of fungal hyphae can act not only as sensors but as primitive Boolean logic gates, opening and closing to allow only certain branches of the network to communicate with each other; there are even attempts to use fungi as primitive computers, as sketched below.
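To make the logic-gate idea concrete, here is a toy sketch in Python – purely my own illustration, with invented names, not code from Sheldrake's book or from the actual experiments. It models junctions in a hyphal network as Boolean gates, so that growth is routed toward a new food source only when the right combination of upstream signals fires.

```python
# Toy model (all names invented): hyphal junctions as Boolean logic gates.
# Each junction "opens" (True) or "closes" (False) depending on signals
# arriving from upstream branches, so only certain paths through the
# network end up carrying a growth signal.

def and_junction(a: bool, b: bool) -> bool:
    """Pass a signal only if both upstream branches fire."""
    return a and b

def or_junction(a: bool, b: bool) -> bool:
    """Pass a signal if either upstream branch fires."""
    return a or b

def mycelial_route(food_left: bool, food_right: bool, moisture: bool) -> bool:
    """Route growth toward a target only if some food source is sensed
    AND there is enough moisture to sustain the new hyphae."""
    food_sensed = or_junction(food_left, food_right)
    return and_junction(food_sensed, moisture)

if __name__ == "__main__":
    # In the spirit of the oyster-mushroom experiment: a new wood block
    # appears on the right; growth is routed there only if conditions allow.
    print(mycelial_route(food_left=False, food_right=True, moisture=True))   # True
    print(mycelial_route(food_left=False, food_right=True, moisture=False))  # False
```

The point of the sketch is only that open/close decisions at individual junctions compose into network-wide logic; real hyphae are of course vastly messier.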

This intelligent long-distance relay is mirrored in the behavior of the trees that the fungi form a mind meld with. One of the characters in Richard Powers's marvelous novel "The Overstory" discovers how trees whisper hidden signals to each other, not just through fungal networks but through ordinary chemical communication. The character Patricia Westerford finds out that when insects attack one tree, it can send out a chemical alarm that alerts trees located even dozens of meters away of its plight, causing them to kick their own repellant chemical production into high gear. Meeting the usual fate of scientists with novel ideas, Westerford and her ideas are first ignored, then mocked and ostracized and ultimately grudgingly accepted. But the discovery that trees and their fungal networks communicate with each other, through the agency of both chemicals and other organisms like insects, is now accepted enough to be part of both serious scientific journals and prizewinning novels.

Fungi can also show intelligent behavior by manipulating our minds, and this is where things get speculative. Psilocybin mushrooms have been used by shamans for thousands of years; LSD, a twentieth-century discovery, by hippies and Silicon Valley tech entrepreneurs. When you are familiar with both chemistry and biology it’s natural to ask what the evolutionary utility might be of chemical compounds that bring about changes in perception so profound and seemingly liberating as to lead someone like Aldous Huxley to make sure he was on a psychedelic high at the moment of his death. One interesting clue arises from the discovery of these compounds in the chemical defense responses of certain fungi. Clearly the microorganisms that are engaged in a war with fungi – and these often include other fungi – lack a central nervous system and have no concept of a hallucination. But if these compounds are found as part of the wreckage of fungal wars, maybe this was their original purpose, and the fact that they happen to take humans on a trip is only incidental.

That is the boring and likely explanation. The interesting and unlikely explanation that Sheldrake alludes to is to consider a human – in the medley of definitions that humans lend themselves to – as a vehicle for a fungus to propagate itself. In this Selfish Fungus theory, magic mushrooms and ergot have been able to hijack our minds so that more of us will use them, cultivate and tend them and love them, ensuring their propagation. Even though their effects might be incidental, they can help us in unexpected ways. If acid and psilocybin trips can spark even the occasional discovery of a new mathematical object or a new artistic style, the purposes of both the fungi and the humans are served. I have another interesting theory of psychedelic mushroom-human co-evolution in mind that refers to Julian Jaynes’s idea of the bicameral mind. According to Jaynes, humans may have lacked consciousness until as recently as 3000 years ago because their mind was divided into two parts, one of which “spoke” and the other “listened”. What we call Gods speaking to humans was a result of the speaking side holding forth. Is it possible that at some point in time, humans got hold of psychedelic fungi which hijacked a more primitive version of the speaking mind, allowing it to turn into a full-blown voice inside the other mind’s head, so to speak? Jaynes’s theory has been called “either complete rubbish or a work of consummate genius, nothing in between” by Richard Dawkins; psychedelic co-evolution might offer another way to probe which it is.

It is all too easy to anthropomorphize trees and especially fungi, which only indicates how interestingly they behave. One can say that “trees give and trees receive”, “trees feel” and even “trees know”, but at a biological level is this behavior little more than a series of Darwinian business transactions, purely driven by natural selection and survival? Maybe, but ultimately what matters is not what we call the behavior but the connections it implies. And there is no doubt that fungi, trees, octopuses and a few other assorted creatures display a unique type of intelligence that humans may have merely glimpsed. Distributed intelligence clearly has a few benefits over a central, localized one. Unlike humans, who do not survive when their heads are cut off, flatworms like planarians can regrow their heads when they get detached, so there’s certainly a survival advantage conferred by not having your intelligence organ be one and done. This principle has been exploited by the one form of distributed intelligence that is an extension of human beings and that has taken over the planet – the Internet. Among the many ideas regarded as the origins of the Internet, one was conceived by the defense department, which wanted to build a communications network that would be resilient in the face of nuclear attack. Having a distributed network, with no single node acting as a central hub, was the key. Servers in companies like Google and Facebook are also arranged in such a way that a would-be hacker or terrorist would have to take out a great many of them, not just a few, in order to measurably impair the fidelity of the network. The little simulation below makes the point.
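
Here is a minimal sketch in Python of why hub-less networks survive damage. The graphs and failure pattern are invented for illustration; nothing here models ARPANET or any real data center.

```python
def connected_fraction(adj, removed):
    """Fraction of surviving nodes reachable from one survivor (depth-first search)."""
    alive = [n for n in adj if n not in removed]
    if not alive:
        return 0.0
    seen, stack = {alive[0]}, [alive[0]]
    while stack:
        for nbr in adj[stack.pop()]:
            if nbr not in removed and nbr not in seen:
                seen.add(nbr)
                stack.append(nbr)
    return len(seen) / len(alive)

n = 50
# Centralized: every node talks only to a single hub (node 0).
star = {0: set(range(1, n)), **{i: {0} for i in range(1, n)}}
# Distributed: each node talks to four neighbors around a ring.
mesh = {i: {(i - 1) % n, (i + 1) % n, (i - 5) % n, (i + 5) % n} for i in range(n)}

removed = {0, 7, 13}  # knock out the hub plus two other nodes
print("star:", connected_fraction(star, removed))  # collapses (~0.02)
print("mesh:", connected_fraction(mesh, removed))  # stays whole (1.0)
```

Losing the hub shatters the star into isolated fragments, while the mesh, like a mycorrhizal network, simply routes around the damage.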

I also want to posit the possibility that distributed systems might be more analog than central ones and therefore confer unique advantages. Think of a distributed network of water pipes, arteries, traffic lanes, or tree roots and fungal networks, and one has in mind the image of a network that can almost instantaneously transmit changes in parameters like pressure, temperature and density from one part of the network to another. These are all good examples of analog computation, although in the case of arteries the analog process is built on a substrate of digital neuronal firing. The human body is clearly a system where a combination of analog and digital works well, but looking at distributed intelligence one gets a sense that we could expand our own intelligence significantly by using more analog computing. A toy sketch of this kind of computation follows.
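
As a minimal illustration, assuming nothing more than local neighbor-to-neighbor relaxation, here is a Python sketch of an “analog” network settling into a new equilibrium after a local disturbance – a crude stand-in for pressure equalizing in pipes or nutrients spreading through mycelium.

```python
# Twenty nodes around a ring, each carrying a "pressure" value.
pressure = [1.0] * 20
pressure[0] = 5.0  # a sudden local disturbance at one point

for _ in range(200):
    # Each node relaxes toward the average of itself and its two neighbors;
    # no central controller ever sees the whole network.
    pressure = [
        (pressure[i - 1] + pressure[i] + pressure[(i + 1) % 20]) / 3
        for i in range(20)
    ]

print(f"{min(pressure):.3f} .. {max(pressure):.3f}")  # both settle at the mean, 1.2
```

The new equilibrium is “computed” everywhere at once, by the network itself.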

There is no reason why intelligence may not be predominantly analog and distributed so that it becomes resilient, sensitive and creative like mycorrhizal networks, able to guard itself against existential threats, respond to new food and resource locations, and construct new structures with new form and function. One way to make human intelligence more analog and distributed would be to enable human-to-human connections through high-fidelity electronics that allow a direct flow of information to and from human brains. But a more practical solution might be to enable downloading brain contents, including memory, into computers and then allowing these computers to communicate with each other. I do not know if this advance will take place during my lifetime, but it could certainly bring us closer to being a truly distributed intelligence that, just like mycorrhizal networks, is infinitely responsive, creative, resilient and empathetic. And then perhaps we will know exactly what it feels like to be a tree.

Brains, Computation And Thermodynamics: A View From The Future?

Rolf Landauer
Progress in science often happens when two or more fields productively meet. Astrophysics got a huge boost when the tools of radio and radar met the age-old science of astronomy. From this fruitful marriage came discoveries like the background radiation left over from the big bang. Another example was the union of biology with chemistry and quantum mechanics that gave rise to molecular biology. There is little doubt that some of the most important discoveries in science in the future will similarly arise from the accidental fusion of multiple disciplines.
One such fusion sits on the horizon, largely underappreciated and unseen by the public. It is the fusion between physics, computer science and biology. More specifically, this fusion will likely see its greatest manifestation in the interplay between information theory, thermodynamics and neuroscience. My prediction is that this fusion will be every bit as important as any potential fusion of general relativity with quantum theory, and at least as important as the development of molecular biology in the mid 20th century. I also believe that this development will likely happen during my own lifetime.
The roots of this predicted marriage go back to 1867. In that year the great Scottish physicist James Clerk Maxwell proposed a thought experiment that was later called ‘Maxwell’s Demon’. Maxwell’s Demon was purportedly a way to defy the second law of thermodynamics that had been proposed a few years earlier. The second law of thermodynamics is one of the fundamental laws governing everything in the universe, from the birth of stars to the birth of babies. It basically states that left to itself, an isolated system will tend to go from a state of order to one of disorder. A good example is how perfume from an open bottle wafts throughout a room with time. This order and disorder is quantified by a quantity called entropy.
In technical terms, the order and disorder refers to the number of states a system can exist in; order means fewer states and disorder means more. The second law states that isolated systems will always go from fewer states and lower entropy (order) to more states and higher entropy (disorder). Ludwig Boltzmann quantified this relationship with a simple equation carved on his tombstone in Vienna: S = k ln W, where k is a constant called the Boltzmann constant, ln is the natural logarithm (to the base e) and W is the number of states.
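For the numerically minded, a quick Python illustration of Boltzmann’s formula (the constants are standard; the function name is mine):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W: int) -> float:
    """Boltzmann's tombstone formula: S = k ln W for W accessible states."""
    return k_B * math.log(W)

# Doubling the number of accessible states adds exactly k ln 2 of entropy --
# the same quantity that reappears later in the Landauer bound.
print(boltzmann_entropy(2) - boltzmann_entropy(1))  # ~9.57e-24 J/K
```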
Maxwell’s Demon was a mischievous creature which sat on top of a box with a partition in the middle. The box contains molecules of a gas which are ricocheting in every direction. Maxwell himself had found that these molecules’ velocities follow a particular distribution of fast and slow. The demon observes these velocities, and whenever there is a molecule moving faster than usual in the right side of the box, he opens the partition and lets it into the left side, quickly closing the partition. Similarly he lets slower moving molecules through from left to right. After some time, all the slow molecules will be in the right side and the fast ones will be in the left. Now, velocity is related to temperature, so this means that one side of the box has heated up and the other has cooled down. To put it another way, the box went from a state of random disorder to order. According to the second law this means that the entropy of the system decreased, which is impossible.
Maxwell’s demon seemingly contravenes the second law of thermodynamics (University of Pittsburgh)
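The demon’s trick is easy to mimic in a few lines of Python. This toy version uses an arbitrary speed distribution rather than Maxwell’s, but it captures the essential sleight of hand – and, as we will see, each sorting decision is itself an act of information processing:

```python
import random

random.seed(0)
# Molecular speeds drawn from a simple random distribution (illustrative,
# not a true Maxwell-Boltzmann distribution).
speeds = [random.expovariate(1.0) for _ in range(10_000)]
threshold = 1.0  # the demon's cutoff between "fast" and "slow"

# The demon opens the partition for fast molecules heading left and slow
# molecules heading right. Each decision reads one bit of information.
left = [s for s in speeds if s > threshold]    # ends up hot
right = [s for s in speeds if s <= threshold]  # ends up cold

mean = lambda xs: sum(xs) / len(xs)
print(f"left (hot):   mean speed {mean(left):.2f}")
print(f"right (cold): mean speed {mean(right):.2f}")
```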
For the next few years scientists tried to get around Maxwell’s Demon’s paradox, but it was in the 1920s that the Hungarian physicist Leo Szilard made a dent in it, as a young physicist hobnobbing with Einstein, Planck and other physicists in Berlin. Szilard realized an obvious truth that many others seem to have missed. The work and decision-making that the demon does to determine the velocities of the molecules itself generates entropy. If one takes this work into account, it turns out that the total entropy of the system has indeed increased. The second law is safe. Szilard later went on to a distinguished career as a nuclear physicist, patenting a refrigerator with Einstein and becoming the first person to conceive of a nuclear chain reaction.
Perhaps unknowingly, however, Szilard had also discovered a connection – a fusion of two fields – that was going to revolutionize both science and technology. When the demon does work to determine the velocities of molecules, the entropy that he creates comes not just from the raising and lowering of the partition but from his thinking processes, and these processes involve information processing. Szilard had discovered a crucial and tantalizing link between entropy and information. Two decades later, mathematician Claude Shannon was working at Bell Labs, trying to improve the communication of signals through wires. This was unsurprisingly an important problem for a telephone and communications company. The problem was that when engineers were trying to send a message over a wire, it would lose its quality because of many factors including noise. One of Shannon’s jobs was to figure out how to make this transmission more efficient.
Shannon found out that there is a quantity that relates to the information transmitted over the wire. In crude terms, this quantity measures the uncertainty in a message: the more predictable the message, the lower this quantity, and the more surprising the message, the higher. When Shannon showed his result to the famous mathematician John von Neumann, von Neumann, with his well-known lightning-fast ability to connect disparate ideas, immediately saw what it was: “You should call your function ‘entropy’”, he said, “firstly because that is what it looks like in thermodynamics, and secondly because nobody really knows what entropy is, so in a debate you will always have the upper hand.” Thus was born the connection between information and entropy. Another fortuitous connection was born – between information, entropy and error or uncertainty. The greater the uncertainty in transmitting a message, the greater the entropy, so entropy also provided a way to quantify error. Shannon’s 1948 paper, “A Mathematical Theory of Communication”, was a seminal publication and has been called the Magna Carta of the information age.
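Shannon’s function is simple enough to state in full: H = -Σ p log2 p, summed over the possible messages. A few lines of Python make von Neumann’s point concrete (the function name is mine):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), in bits: the average surprise of a source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: one full bit per toss.
print(shannon_entropy([0.5, 0.5]))    # 1.0
# A heavily biased coin is predictable, so each toss carries little information.
print(shannon_entropy([0.99, 0.01]))  # ~0.08
```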
Even before Shannon, another pioneer had published a paper that laid the foundations of the theory of computing. In 1936 Alan Turing published “On Computable Numbers, with an Application to the Entscheidungsproblem”. This paper introduced the concept of Turing machines which also process information. But neither Turing nor von Neumann really made the connection between computation, entropy and information explicit. Making it explicit would take another few decades. But during those decades, another fascinating connection between thermodynamics and information would be discovered.
Stephen Hawking’s tombstone at Westminster Abbey (Cambridge News)
That connection came from Stephen Hawking getting annoyed. Hawking was one of the pioneers of black hole physics, and along with Roger Penrose he had discovered that at the center of every black hole is a singularity that warps spacetime infinitely. The boundary of the black hole is its event horizon, and within that boundary not even light can escape. But black holes posed some fundamental problems for thermodynamics. Every object contains entropy, so when an object disappears into a black hole, where does its entropy go? If the entropy of the black hole does not increase then the second law of thermodynamics would be violated. Hawking had proven that the area of a black hole’s event horizon never decreases, but he had pushed the thermodynamic question under the rug. In 1972 at a physics summer school, Hawking met a graduate student from Princeton named Jacob Bekenstein who proposed that the increasing area of the black hole was basically a proxy for its increasing entropy. This annoyed Hawking and he did not believe it, because increased entropy is related to heat (heat is the highest-entropy form of energy) and black holes, being black, could not radiate heat. With two colleagues Hawking set out to prove Bekenstein wrong. In the process, he not only proved him right but also made what is considered his greatest breakthrough: he gave black holes a temperature. Hawking found out that black holes do emit thermal radiation. This radiation can be explained when you take quantum mechanics into account. The Hawking-Bekenstein discovery was a spectacular example of another fusion: between information, thermodynamics, quantum mechanics and general relativity. Hawking deemed it so important that he wanted to put it on his tombstone in Westminster Abbey, and so it has been.
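The temperature Hawking derived – the equation now carved on his memorial stone – is T = ħc³/(8πGMk). Plugging in standard constants shows just how cold a stellar black hole is (a back-of-envelope sketch; the constants are rounded):

```python
from math import pi

hbar = 1.0546e-34   # reduced Planck constant, J*s
c = 2.998e8         # speed of light, m/s
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
k_B = 1.381e-23     # Boltzmann constant, J/K
M_sun = 1.989e30    # mass of the Sun, kg

# Hawking temperature of a solar-mass black hole: T = hbar c^3 / (8 pi G M k)
T = (hbar * c**3) / (8 * pi * G * M_sun * k_B)
print(f"{T:.1e} K")  # ~6e-8 K, far colder than the cosmic microwave background
```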
This short digression was to show that more links between information, thermodynamics and other disciplines were being forged in the 1960s and 70s. But nobody saw the connections between computation and thermodynamics until Rolf Landauer and Charles Bennett came along. Bennett and Landauer were both working at IBM. Landauer was an émigré who fled from Nazi Germany before working for the US Navy as an electrician’s mate, getting his PhD at Harvard and joining IBM. IBM was then a pioneer of computing; among other things they had built computers for the Manhattan Project. In 1961, Landauer published a paper titled “Irreversibility and Heat Generation in the Computing Process” that has become a classic of science. In it, Landauer established that the basic act of computation – the change of one bit to another, say a 1 to a 0 – requires a bare minimum amount of entropy. He quantified this amount with another simple equation: S = k ln 2, with k again being the Boltzmann constant and ln the natural logarithm. This has become known as the Landauer bound; it is the absolute minimum amount of entropy that has to be expended in a single bit operation. Landauer died in 1999 and as far as I know the equation is not carved on his tombstone.
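At a given temperature T, the bound translates into a minimum heat dissipation of kT ln 2 per erased bit – a vanishingly small amount at room temperature, as a two-line calculation shows:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300             # room temperature, K

# Landauer's bound: erasing one bit must dissipate at least k T ln 2 of heat.
E_min = k_B * T * math.log(2)
print(f"{E_min:.2e} J per bit")  # ~2.9e-21 J, or roughly 0.018 eV
```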
The Landauer bound applies in principle to all kinds of computation, and biological processes are also a form of information processing and computation, so it’s tantalizing to ask whether Landauer’s calculation applies to them. Enter Charles Bennett. Bennett is one of the most famous scientists whose name you may not have heard of. He is not only one of the fathers of quantum computing and quantum cryptography but also one of the two fathers of the marriage of thermodynamics with computation, Landauer being the other. Working with Landauer in the 1970s and 80s, Bennett applied thermodynamics to both Turing machines and biology. By good fortune he had gotten his PhD in physical chemistry studying the motion of molecules, so his background primed him to apply ideas from computation to biology.
Charles Bennett from IBM has revolutionized our understanding of the thermodynamics of computation (AMSS)
To simplify matters, Bennett considered what he called a Brownian Turing machine. Brownian motion is the random motion of atoms and molecules. A Brownian Turing machine can write and erase characters on a tape using energy extracted from a random environment. This makes the Brownian Turing machine reversible. A reversible process might seem strange, but in fact it’s found in biology all the time. Enzymatic reactions arise from the reversible motion of molecules – at equilibrium there is equal probability that an enzymatic reaction will go forward or backward. What makes these processes irreversible is the addition of starting materials or the removal of chemical products. Even in computation, only a process which erases bits is truly irreversible, because you lose information. Bennett envisaged a biological process like protein translation as a Brownian Turing machine which adds or subtracts a molecule like an amino acid, and he calculated the energy and entropy expenditures involved in running this machine. Visualizing translation as a Turing machine made it easier to do a head-to-head comparison between biological processes and bit operations. Bennett found out that if the process is reversible the Landauer bound does not hold and there is no minimum entropy required. Real life of course is irreversible, so how do real-life processes compare to the Landauer bound?
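The distinction between erasing and not erasing information is easy to see in miniature. An AND gate throws information away; a reversible gate like the Toffoli gate computes the same thing without discarding anything. This is a minimal sketch of the idea, not anything from Bennett’s papers:

```python
def and_gate(a: int, b: int) -> int:
    """Irreversible: an output of 0 could have come from (0,0), (0,1) or (1,0).
    Erasing that distinction is exactly what costs k ln 2 per lost bit."""
    return a & b

def toffoli(a: int, b: int, c: int) -> tuple:
    """Reversible: flips c only when a and b are both 1. The three outputs
    always determine the three inputs, so no information is ever erased."""
    return a, b, c ^ (a & b)

# With c = 0 the third output is simply a AND b, computed reversibly.
print(toffoli(1, 1, 0))  # (1, 1, 1)
# Applying the gate twice undoes it -- the hallmark of reversibility.
assert toffoli(*toffoli(1, 1, 0)) == (1, 1, 0)
```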
In 2017, a group of researchers published a fascinating paper in the Philosophical Transactions of the Royal Society in which they explicitly calculated the thermodynamic efficiency of biological processes. Remarkably, they found that the efficiency of protein translation is several orders of magnitude better than that of the best supercomputers, in some cases as much as a million-fold. More remarkably, they found that this efficiency is only one order of magnitude worse than the theoretical minimum set by the Landauer bound. In other words, evolution has done one hell of a job in optimizing the thermodynamic efficiency of biological processes.
But not all biological processes. Circling back to the thinking processes of Maxwell’s little demon, how does this efficiency compare to the efficiency of the human brain? Surprisingly, it turns out that neural processes like the firing of synapses are estimated to be much less efficient than protein translation and more comparable to supercomputers. At first glance, the human brain thus appears to be thermodynamically worse than other biological processes. However, this seemingly low computational efficiency of the brain must be weighed against its complex structure and function. The brain weighs only about a fiftieth of the weight of an average human but uses up 20% of the body’s energy. It might seem that we are simply not getting the biggest bang for our buck, with an energy-hungry brain providing low computational efficiency. What would explain this inefficiency and this paradox?
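The arithmetic behind the “energy-hungry brain” claim is simple, using standard textbook figures (assumed here for illustration, not taken from the paper above):

```python
body_mass, brain_mass = 70.0, 1.4      # kg; the brain is ~1/50th of body weight
body_power, brain_share = 100.0, 0.20  # basal metabolism ~100 W, ~20% to the brain

brain_power = body_power * brain_share                                # 20 W
brain_density = brain_power / brain_mass                              # ~14 W/kg
rest_density = (body_power - brain_power) / (body_mass - brain_mass)  # ~1.2 W/kg
print(f"brain: {brain_density:.1f} W/kg, rest of body: {rest_density:.1f} W/kg")
```

Kilogram for kilogram, the brain burns energy at roughly twelve times the rate of the rest of the body.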
My guess is that the brain has ended up inefficient through a combination of evolutionary accident and design, and that efficiency is the wrong metric for gauging the performance of the brain. It is the wrong metric because thinking of the brain in purely digital terms is the wrong frame. The brain arose through a series of modular inventions responding to new environments created by both biology and culture. We now know that thriving in these environments needed a combination of analog and digital functions; for instance, the nerve impulses controlling blood pressure are digital while the actual change in pressure is continuous and analog. It is likely that digital neuronal firing is built on an analog substrate of wet matter, and that higher-order analog functions could be emergent forms of digital neuronal firing. As early as the 1950s, von Neumann conjectured that we would need to model the brain as both analog and digital in order to understand it. Around the time that Bennett was working out the thermodynamics of computation, two mathematicians named Marian Pour-El and Ian Richards proved a very interesting theorem which showed that in certain cases there are numbers that are not computable with digital computers but are computable with analog processes; analog computers are thus more powerful in such cases.
If our brains are a combination of digital and analog, it’s very likely that they are this way so that they can span a much bigger range of computation. But this bigger range would come at the expense of inefficiency in the analog computation process. The small price of lower computational efficiency, as measured against the Landauer bound, would be paid in exchange for the much greater evolutionary benefits of performing complex calculations that allow us to farm, build cities, know stranger from kin and develop technology. Essentially, the brain’s distance from the Landauer bound could be evidence for its analog nature. There is another interesting fact about analog computation, which is its greater error rate; digital computers took off precisely because they had low error rates. How does the brain function so well in spite of this relatively high error rate? Is the brain consolidating this error when we dream? And can we reduce this error rate by improving the brain’s efficiency? Would that make our brains better or worse at grasping the world?
From the origins of thermodynamics and Maxwell’s Demon to the fusion of thermodynamics with information processing, black holes, computation and biology, we have come a long way. The fusion of thermodynamics and computation with neuroscience seems to be just beginning, so for a young person starting out in the field the possibilities are exciting and limitless. General questions abound: How does the efficiency of the brain relate to its computational abilities? What might be the evolutionary origins of such abilities? What analogies between the processing of information in our memories and that in computers might we discover through this analysis? And finally, just as Shannon did for information, Hawking and Bekenstein did for black holes, and Landauer and Bennett did for computation and biology, can we find a simple equation describing how the entropy of thought processes relates to simple neural parameters connected to memory, thinking, empathy and emotion? I do not know the answers to these questions, but I am hoping someone who is reading this will, and at the very least they will then be able to immortalize themselves by putting another simple formula describing the secrets of the universe on their tombstone.
Further reading:
  1. Charles Bennett – The Thermodynamics of Computation
  2. Seth Lloyd – Ultimate Physical Limits to Computation
  3. Freeman Dyson – Are brains analog or digital?
  4. George Dyson – Analogia: The Emergence of Technology Beyond Programmable Control (August 2020)
  5. Richard Feynman – The Feynman Lectures on Computation (Chapter 5)
  6. John von Neumann – The General and Logical Theory of Automata
First published on 3 Quarks Daily