Field of Science

"Hawking Hawking" and Michio Kaku

Two items of amusement and interest. One is a new biography of Hawking by Charles Seife, coming out tomorrow, that attempts to close the gap between Hawking’s actual scientific accomplishments and his celebrity status. Here's a good review by top science writer and online friend Philip Ball:


Seife's Hawking is a human being, given to petty disputes of priority and oneupmanship and often pontificating with platitudes on fields beyond his expertise. I used to have similar thoughts about Hawking myself but thought that his pronouncements were largely harmless fun. My copy of Seife's book arrives tomorrow and I am looking forward to his views, especially his take on how much it was the media rather than Hawking himself who fueled the exaggerations and the celebrity status.

The second item is an interview with Michio Kaku which seems to have ruffled a lot of feathers in the physics and science writing communities. 


The critics complain that he distorts the facts and says highly misleading things like string theory directly leading to the standard model. I hear the complaints as legitimate, but my take on Kaku is different. I don’t think of him as a science writer but as a futurist, fantasist and storyteller. I think of him rather like E. T. Bell whose “Men of Mathematics”, while highly romanticized and inaccurate regarding the details, nevertheless served to get future scientists Freeman Dyson and John Nash interested in math as kids. I doubt whether either Kaku himself or his readers take the details in his books very seriously.

I think we should always distinguish between writers who write about the facts and writers who tell stories. While you should be as rigorous as possible while writing about facts, you are allowed considerable leeway and speculation while telling stories. If not for this leeway, there wouldn't be any science writers and certainly no science fiction writers. A personal memory: my father was a big fan of Alvin Toffler's "Future Shock" and other futuristic musings. But he never took Toffler seriously as a writer on technology; rather he thought of him as an "ideas man" whose ideas were raw material for more serious considerations. If Kaku's writings get a few kids excited about science and technology the way "Star Trek" did, his purpose will be served.

Six lessons from the biotech startup world

Having worked for a few biotech startups over the years, while I am not exactly a grizzled warrior, I have been around the block a bit and have drawn some conclusions about what seems to work and not work in the world of small biopharma. I don't have any kind of grand lessons related to financial strategy, funding or IPOs or special insights, just some simple observations about science and people based on a limited slice of the universe. My suspicion is that much of what I am saying will be familiar.

1. It's about the problem, not about the technology: 

Many startups are founded with a particular kind of therapeutic area in mind, perhaps a particular kind of cancer or metabolic disease to address. But some are also founded on the basis of an exciting new platform or technology. This is completely legitimate as long as there is also a concomitant list of problems that can be addressed by that platform. If there isn't, then you are in the proverbial hammer-trying-to-find-a-nail territory, trying to be tool-oriented rather than problem-oriented. The best startups I have seen do what it takes to address a problem, sometimes even pivoting from their original toolkit. The not-so-great ones fall in love with the platform and technology so much that they keep on generating results from it in a frenzy that may or may not be applicable to a real problem. No matter how amazing your platform may be, it's key to find the right problem space as soon as you can. Not surprisingly, this is especially an issue in Silicon Valley where breathless new technology is often the basis for the founding platform for companies. Now I am as optimistic and excited about new technology as anyone else, but with new technological vision must come rigorous scrutiny that allows constant validation of the path that you are on and course-correction if that path looks crooked.

A corollary of this obsession with tools comes from my own field of molecular modeling and structure-based drug design. I have said before that the most important reason computational chemistry stays at the periphery rather than core of drug discovery is because it's not matched to the right problem. And while technical challenges still play a big role in the failure of the field - the complexity of biology usually far overshadows the utility of the tools - the real problem in my view is cultural. In a nutshell, modelers are not paid for saying "no". A modeler constantly has to justify his or her utility by applying the latest and greatest tools to every kind of problem. It doesn't matter if the protein structure is poorly resolved; it doesn't matter if the SAR is sparse; it doesn't matter if you have one static structure for a dynamic protein with many partners - the constant clink of your hammer in that corner office must be heard if your salary is to be justified. It's even more impressive, and correspondingly more futile, if you are using The Cloud or a whole bank of GPUs for your calculations (there are certainly some cases where sheer computing power can make a difference, but these are rare). There are no incentives for you to say, "You know what, computational tools are really not the best approach to this problem given the paucity and quality of data." (as Werner Heisenberg once said, the definition of an expert is someone who knows what doesn't work).

But it goes both ways. Just as management needs to not just allow but reward this kind of judicious selection and rejection of tools, it really helps if modelers know something about assays, synthesis and pharmacology so that they can provide an alternative suggestion to using modeling; otherwise you are just cursing the dark instead of lighting a candle. They don't need to be experts, but having enough knowledge to make general suggestions helps. In my view, having a modeler say, "You know what, I don't think current computational tools are the best way to find inhibitors for this protein, but have you tried biophysical assay X?" can be music to the ears.

2. Assays are everything

In all the startups I have worked at, no scientist has been more important to success in the early stages of a drug discovery project than the assay expert. Having a well designed assay that mirrors the behavior of a protein under realistic conditions is worth a thousand computer models or hundreds of hours spent around the whiteboard. Good assays can both test and validate the target. Conversely, a badly designed assay, one that does not recapitulate the real state of the protein, can not only doom the project but lead you down a rabbit hole of false positives. No matter what therapeutic area or target you are dealing with, there are going to be few more important early hires than people who know the assays. And assays are all about the details - things like salt and protein concentration, length of construct, mutations, things only known by someone who has learnt them the hard way. The devil is always in the details, but he really hides in the assays.

3. Outsourcing works great, except when it doesn't

Most biotechs now outsource key aspects of their processes like compound synthesis, HTS and biophysical assays to CROs. And this works fine in many cases, except when that devil in the details rears his head. The problem with many CROs is that while they may do a good job of executing the task, they then throw the results over the wall. The details are lost, and sometimes you don't even know you are going down a rabbit hole when that happens. I remember one example where the contamination of a chip in an SPR binding assay threw off our results for a long time, and it took a lot of forensic work and back and forth to figure this out. Timelines were set back substantially and confusion reigned. CROs need to be as collaborative and closely involved as internal scientists, and when this doesn't happen you can spend more time fixing that relationship than actually solving your problem - needless to say, the best CROs are very good at doing this kind of collaborative work. And it's important not just to have collaborative CROs but to have access to as many details as possible in case a problem arises, which it inevitably does.

4. Automation works great, except when it doesn't

The same problems that riddle CRO collaborations riddle automation. These days some form of automation is fairly common for tools like HTS, what with banks of liquid handling robots hopping rapidly and merrily over hundreds of wells in plates. And it again works great for pre-programmed protocols. But simple problems of contamination, efficiency and breakdowns like spills and robotic arms getting stuck can afflict these systems, especially in the more cutting-edge areas like synthesis - one thing you constantly discover is that the main problem with automation is not the software but the hardware. I have found that the same caveats apply to automation that Hans Moravec applied to AI - the hard things are easy and the simple things are hard. Getting that multipipetting robot to transfer nanoliters around blazingly fast is beyond the ability of human beings, but that robot won't be able to look at a powder and determine if it's fluffy or crystalline. Theranos is a good example of the catastrophe that can result when the world of well-defined hard robotic grippers and vials meets the messy world of squishy cells, fluffy chemicals and messy fluids like blood (for one thing, stuff behaves very differently at small scale). You know your automation has a problem when you are spending more time babysitting it than doing things manually. It's great to be able to use automation to free up your time, but you need to make sure that it's actually doing so while also generating accurate results.

5. The best managers delegate

Now a human lesson. I have had the extraordinary good fortune of working for some truly outstanding scientists and human beings, some of whom have become good friends. And I have found that the primary function of a good manager is not to extract work from their reports but to help them grow. The best way to encapsulate sound management thinking is Steve Jobs's famous quote - "It doesn't make sense to hire smart people and tell them what to do. We hire smart people so they can tell us what to do." The best managers I have worked with delegate important responsibilities to you, trust that you can get the job done, and then check in occasionally on how things are going, leaving the details and execution to you. Not only does this provide a great learning experience but more importantly it helps you feel empowered. If your manager communicates to you how important the task entrusted to you is for the entire company and how much they trust you to do it well, the sense of empowerment this brings is enormous and you will usually do the job well (if you don't, it's a good sign for both you and your manager that things are not going well and a conversation is to be had).

Bad managers are of course well known - they micromanage, constantly tell you what you should do and are often not on top of things. And while this is an uncomfortable truth to hear, often the best scientists are also the poorest managers (there are exceptions of course - Roy Vagelos, who led Merck during its glory days, excelled at both). One of the best scientists I have ever encountered wisely and deliberately stayed away from the senior managerial positions that repeatedly came his way. There are few managers worse than distracted scientists.

6. Expect trouble and enjoy the journey

I will leave the most obvious observation for last. Biology and drug discovery are devilishly complicated, hard and messy. After a hundred years of examining life at the molecular level, we still haven't figured it out. Almost every strategy you will adopt, every inspired idea you will have, every new million-dollar tranche of funding you will sink into your organization, will fail. No model will be accurate enough to capture the real-life workings of a drug in a cell or a gene that's part of a network of genes, and you will have to approximate, simplify, build model systems and hope for the best. And on the human side, you will have disagreements and friction that should always be handled with considerateness and respect. Be forgiving of both the science and the people since both are hard. But in that sense, getting to the right answer in biotechnology is like building that "more perfect union" that the Constitution's preamble talks about. It's a goal that always seems to be one step beyond where you are, but that's precisely why you should enjoy the journey, because you will find that the gems you uncover on the way make the whole effort worth it.

Some thoughts on "broader impact" statements for scientific papers

I know that conferences like NeurIPS (formerly called NIPS) have asked for statements about ethics and "broader impact" to accompany papers submitted to them. In principle I am all for this since it's always good for scientists to think about the social implications of their work. I also read the details of the requirements and they aren't draconian, especially for highly theoretical papers whose broader impact is far from clear.

But from a fundamental philosophical viewpoint I still don't think this is useful. My problem is not with the morality of predicting impact but with the lack of utility. The history of science and technology shows that it is impossible to predict broader impact. When Maxwell published his electromagnetic equations he could have scarcely imagined the social and political repercussions electrical power generation would have. When Einstein published his general theory of relativity, he could have scarcely imagined the broader impact it would have on space exploration and GPS, both in war and peace. Perhaps most notably, nobody could have predicted the broader impacts of the discovery of the neutron or the discovery of DNA as the genetic material. I do not see how James Chadwick or Oswald Avery could have submitted broader impact statements with their papers; anything interesting they might have had to say would probably have been untrue a few years later, and anything they would have omitted would probably have turned out to be important.
My biggest problem is not that broader impact statements will put more of a burden on already-overworked researchers, or that they might inflame all kinds of radical social agendas, or that they might bias conferences against very good technical papers which struggle to find broad impact, or that they might stifle lines of research which are considered to be dangerous or biased. All these problems are real and should be acknowledged. But the real problem simply is that whatever points these statements would make would almost certainly turn out to be wrong because of the fundamental unpredictability and rapid progress of technology. And they would then only cause confusion by sending people down a rabbit hole, one in which the rabbit not only does not exist but is likely to be a whole other creature. And this will be the case with all new technologies like AI and CRISPR.

The other problem with broader impact statements is what to do with them even if they are accurate, because accurate and actionable are two different things. Facial recognition software is an obvious example. It can be used to identify terrorists or bad guys, but it can also be used to identify dissidents and put them in jail. So if I submit a broader impact statement with my facial recognition paper and point out these facts, now what? Would this kind of research be banned? That would be throwing the baby out with the bathwater. The fact is that science and technology are always dual use, and it is impossible to separate their good from their bad uses except as a matter of social choice after the fact. I am not saying that pointing out this dual use is a bad thing, but I am concerned that doing so might lead to stifling good research for fear that it may be put to bad ends.
So what is the remedy? Except for obvious cases, I would say that science and technology should be allowed to play out the way they have played out since the time of Francis Bacon and the Royal Society, as open areas of inquiry with no moral judgements being made beforehand. In that sense science has always put a severe burden on society and has asked for a tough bargain in return. It says, "If you want me to be useful, don't put me in a straitjacket and try to predict how I will work out. Instead give me the unfettered freedom of discovery, and then accept both the benefits and the risks that come with this freedom." This is the way science has always worked. Personally I believe that we have done an excellent job maximizing its benefits and minimizing its risks, and I do not see why it will be different with any new technology including machine learning. Let machine learning run unfettered, and while we should be mindful of its broader impact, predicting it will be as futile as damming the ocean.

Book Review: "His Very Best: Jimmy Carter, A Life" by Jonathan Alter

I first saw Jimmy Carter with a few other students during my first year as a graduate student at Emory University where he remains a visiting professor - I still remember the then 81-year-old briskly striding in with his signature broad, toothy smile and the energy of a man half his age. The amusing thing was that when he opened up the floor to any question that we wanted to ask him, the first question someone asked right away was whether LBJ was responsible for JFK's assassination. Without batting an eyelid Carter said no and moved on.

Now I am finishing Jonathan Alter's comprehensive biography of Carter and it's a revelation. The book's main goal is to show that Jimmy Carter is a much more complex human being and president than people believe, and it succeeds exceedingly well in this goal. Carter was a highly intelligent, immensely hardworking and, most importantly, good man in the wrong job. But even then, the things he accomplished were very substantial. Most important were his two signature foreign policy achievements - giving the Panama Canal back to Panama, and brokering a peace between Israel and Egypt which has lasted up to the present day. The Camp David Accords in particular showed a commitment over thirteen days that is without precedent before or since; Carter simply refused to give up, even when Begin and Sadat were on the verge of going home multiple times. The other huge achievement - albeit one that now seems like a mixed blessing - was to normalize relations with China. He also campaigned for human rights at a time when it wasn't fashionable for American presidents to do so, abandoning American presidents' traditional cozy relationship with anti-communist dictatorships.
There are also other, more minor achievements that are now forgotten - appointing more liberal federal judges (even more than Trump), deregulating the airline industry, appointing more African-Americans to important government positions than many of his predecessors, restoring a sense of decency and common sense to the White House after the tumultuous years of Vietnam and lackluster years of Ford, being the first Democratic president to woo evangelicals (the last before they turned Republican) and popularizing alternative energy and climate change at a time when few people cared about it. There's no doubt that the binary classification of Carter as "failed president, great ex-president" is flawed and reality is more complex.
The book is also fabulous at exploring Carter's childhood and background in rural Georgia as a farmer and his education in nuclear engineering at Annapolis. Carter grew up as the son of a farmer and general store owner, Earl Carter, who competed with his son in daily tasks and sports and was a fair if harsh father. Carter's mother Lillian, who lived to see her son become president, was quite liberal for her time, and astonishingly joined the Peace Corps in her sixties and went to India for a few months of public service. Alter does not shy away from criticizing Carter's poor record on civil rights before he became president (at one point he was friendly with George Wallace). Carter was a product of his time and grew up in the segregated Deep South after all, but this does not excuse his reluctance to take a stand even in matters like school desegregation. Of course, the defining relationship of Carter's life has been with his wife Rosalynn, whom he married when he was twenty-one and she was nineteen; they have been married for more than seventy-five years now. Rosalynn has been a commanding presence in his life, and he often sought advice from her along with his other advisors during the most crucial moments of his presidency.
Carter's main problem was that he was dour, practical and business-like and almost completely lacked the warmth, optimism and PR skills that are necessary for political leaders to win over not just minds but hearts. He was the strict father who wants to lecture his children about what's best for them. His grim fireside chats about consumption and self-indulgence, while sensible, did not go down well with the American people.
In addition, while he did a good job during his first two years, Carter was completely overtaken by global events during his second two, most notably the Iranian Revolution, Soviet aggression and the oil crisis that hiked up oil prices. The book does a good job showing that while Carter was not responsible for these events, he was as clueless in understanding the situation in Iran as anyone else.

There is an excellent account of the Iranian hostage crisis in Alter's biography which includes many details that I did not know. It seems like a real tragedy since it was a comedy of errors in some sense, albeit one to which the US had yoked itself by installing the Shah of Iran in a coup in 1953 (there is an excellent account of how Mohammed Mosaddegh's democratic government was toppled in Stephen Kinzer's book "All the Shah's Men"). The hostage crisis was essentially triggered by the Shah being allowed into the US for medical treatment. He had fled Iran after the Ayatollah returned from exile in France.
The Shah's case was engineered by a lobby prominently led among others by Kissinger - the man's villainy continued unabated even after leaving the Nixon administration. He and his cabal greatly exaggerated the Shah's medical condition and forced Carter to admit him into the US on humanitarian grounds; he was in Mexico, and the Kissinger faction wrongly made the case that Mexican hospitals weren't equipped to diagnose and treat him. This was the last straw since it told the Iranians that the US was about to embark on another 1953-like adventure. This wasn't true, but at this point cooler heads weren't prevailing.
Also complicit in the disaster was Carter's hawkish national security advisor Zbigniew Brzezinski, an early neoconservative. A dissenter was secretary of state Cyrus Vance who strenuously advised against letting the Shah in, having a good idea of how perilous the situation in Iran was. As it turned out, Carter was blindsided and ignorant of the internal situation in Iran and ended up letting the Shah in (before hurriedly getting him out again).
The irony was that the die had been cast by Carter's predecessors, especially Eisenhower and Nixon, and Carter himself had very little interest in adventurism abroad, but the Shah was America's burden to bear, and Carter's actions were conflated with previous ones by the Iranians. Once the hostages were taken Carter's hands were tied for months, his approval ratings plummeted and the way was paved for Reagan (and Ben Affleck and his team in 'Argo').
I always feel that the fractured relations between the US and Iran constitute one of the great international tragedies of the 20th and early 21st centuries. Both countries have a rich heritage and have so much to offer each other. If the US had not completely thrown in their lot with Saudi Arabia and Israel and instead been friends with Iran, we would have had a powerful ally against Islamic fundamentalism in the Middle East. As it happened, the current Iranian regime is certainly nothing to praise and funds terrorist groups like Hezbollah. But it's important to note that it was largely US actions in the 1950s that led to the present state of affairs.
In retrospect, it appears obvious how someone like Reagan who was just fundamentally better at being a people pleaser and projected sunny optimism could defeat Carter. Fortunately Carter's career was just beginning at the end of his presidency, and in the next three decades he did very significant human rights work, including eradicating guinea worm from Africa and working on Habitat for Humanity, winning the Nobel Peace Prize in the process and becoming a far more deserving recipient than most others who got the prize. Now 97, he still teaches Sunday School in Plains, GA. What we need today is the pragmatism and intelligence of Jimmy Carter and the optimism of Ronald Reagan.
A fantastic book, well worth its almost 800 pages, and likely the definitive biography of a remarkable man for many years to come.

Book Review: "The Jews of Spain", by Jane Gerber

Jane Gerber’s “The Jews of Spain” is a superb and comprehensive look at the history of the Sephardim - one of the two major branches of Jewry, the other being the Ashkenazim. The Sephardim originated in Spain and today occupy a place of high prominence. While the Ashkenazim are better known, about sixty percent of Israel’s population consists of the Sephardim.

Two main qualities mark the Sephardim. One is common to all Jews: the ability to persevere and thrive against all odds through the centuries. The other is more distinctive: the flourishing of their community under Muslim rule. Except for Germany in the 19th century and the United States in the 20th, nowhere else did Jews thrive so well after they were driven out of the Roman Empire.
The book starts with the miserable fate of the Jews in Spain under the Visigoths; the Jews had fled there after being persecuted under the Roman Empire. It was in 711 AD, when the Arabs under Tariq invaded Spain and defeated the Visigoths, that their fortunes changed. This was largely because of the tolerant, creative and far-ranging Umayyad Caliphate that moved to Spain from Damascus, with Abd al-Rahman I founding its branch in Spain. The Umayyad Caliphate marked one of the highlights of the history of Islamic civilization, making its home in Cordoba at first and then in places like Seville and Toledo. Not only did it allow Jews to practice their religion freely but it employed them in almost all important professions. Jews still did not have all the same rights as Muslims, but they could trade, study medicine and the arts and sciences, compose poetry and music and generally take part in the political life of Islamic Spain to the fullest extent. There were many notables among Jewish intellectuals, with perhaps the peak reached by Hasdai ibn Shaprut, who became vizier and foreign minister.
Politically, the most useful purpose the Jews of Spain served was to mediate between the Byzantine and Christian kingdoms and the Muslim lands. In some sense, because they were equally loved and hated by both religions, the Jews could do this balancing act well as neutrals. The Jews of Spain assimilated Arabic and moved smoothly between Arabic and Hebrew, often translating texts between the two. Even more valuably, they translated important Greek texts on science and medicine into Arabic; later these Arabic texts were translated into Latin by Christian scholars during the translation movement of the 12th and 13th centuries. The Jews of Spain thus served as a critical conduit between the Muslim and Christian worlds, diplomatically trading and interacting with each world while performing valuable functions for both. Jews became great and far-flung traders, braving pirates and trading precious pearls, textiles, spices and other goods. The Radhanites were a particularly prominent group of Jewish merchants who went back and forth between Spain and lands as far away as India and China. Some of these Jews even made their homes in India and China, becoming for instance the Bene Israel of Maharashtra in India.
This useful and productive existence came to an end with the Reconquista and the Christian invasions of Spain. Under a series of more repressive and less tolerant Islamic regimes that included the Almoravids and the Almohads, the Jews had already entered a period of decline. The Christians then decisively defeated the Muslims in the Battle of Las Navas de Tolosa in 1212. Within the next few decades both Cordoba and Seville fell to the Christians, and only the Islamic Kingdom of Granada remained. When Granada fell to the Christians in 1492, the fate of Spain's Jews was sealed.
The next two hundred years marked a period of severe decline for the Jews of Spain. Many started converting under Christian pressure. But the real blow came when Isabella of Castile and Ferdinand of Aragon unified the country. At first somewhat tolerant of the Jews, in 1478 they approved the dreaded Spanish Inquisition that started hauling converted Jews before feared inquisitors like Torquemada, torturing and extracting false confessions from them. Finally the watershed came in 1492 when Isabella and Ferdinand issued the famous edict of expulsion that gave the 300,000 Jews of Spain the stark choice between converting or fleeing. (Centuries earlier, the foremost intellectual Spanish Jewry ever produced, Maimonides, whose Mishneh Torah and Guide for the Perplexed remain touchstones of Judaism even today, had himself fled persecution in Spain and found refuge in Egypt.) Many of those expelled went to Portugal, where the Portuguese Inquisition was even worse. So pernicious were its methods that many Jews became marranos - underground Jews who practiced their religion so quietly and cryptically that nobody knew. These crypto-Jews evolved a form of their religion that would have been almost incomprehensible to their ancestors. From these marranos arose some of the most prominent Jews of later years, including Spinoza.
When Portugal also denied the Jews sanctuary they dispersed to other parts of Europe and the Middle East. By this time the plight of Jews in Europe had become even more dire. The Black Death of 1347 had created an atmosphere of acute paranoia in which Jews were made targets of the blood libel and accused of poisoning wells. Even the Pope cautioned against such unfounded rumors, but this did not stop violent pogroms from erupting in which Jews were massacred wholesale and burnt alive. England had already banned the Jews a long time ago, and apart from scattered pockets in France like Bayonne, they could find no respite. It was at this point that the Jews saw their second resurgence in the Ottoman Empire - in Turkey.
The remarkable story of the Turkish Jews is a story unto itself, but in Istanbul under Ottoman sultans like Bayezid and Suleiman, the Jews achieved a prominence that they had only achieved under the Umayyad Caliphate. Interestingly, it was here that they met the Ashkenazim who had come from Europe, and for a long time the much more affluent and educated Sephardim looked down upon their Ashkenazi co-religionists as uncouth and poor. Most importantly, and as a testament to the freedom they enjoyed, they were allowed to establish their own printing presses in the 15th and 16th centuries. The printing presses allowed them to keep not just their religion but many of their religious books alive. Once again they served as mediators with Christians in Europe. One of the most remarkable among them was the Portuguese marrano Dona Gracia, a self-taught Jewish woman who became a wealthy trader after fleeing Portugal, led an underground pipeline for Portuguese Jews who were being targeted by the Inquisition and successfully organized a boycott of an Italian port after Jews there had been targeted by the papacy.
Unfortunately, once the Ottoman Empire weakened in the 17th century and a series of harsh punitive measures was imposed on the Jews there, they had no choice but to flee. The last part of the book describes this flight. The Jews of Turkey went in two different directions. Some went to Southeastern Europe - Greece and the Balkans. Others went to the Netherlands, which in the 17th century was the most progressive country in Europe. Here the Jews found plenty of opportunities in trading and banking. One of the most important Jews here was Spinoza, who was ironically excommunicated by his own people at the age of twenty-four and spent the rest of his life grinding lenses to support himself. Nevertheless, he became a father of the Jewish Enlightenment and inspired many other philosophers in Europe.
From the Netherlands some Jews made it to South America, especially Brazil. But when Dutch Brazil was threatened by Portugal, a small group of Jews set out in 1654 for what would become the site of the most important Jewish community of modern times - the nascent United States. Over the next hundred years, Jews became successful traders and professionals in a secular republic, fought in the American Revolution and established thriving communities in many states, even making it as far as the Ohio Valley. The United States would see two other great waves of European Jewry, one in the mid-19th century and the other in the early 20th. They were welcomed by memorable words on the Statue of Liberty written by Emma Lazarus, herself a Jew. But the Sephardim got here first, way back in 1654.
Sadly, the plight of the other wave of Ottoman Jewry was much worse. Greece was taken over by the Nazis, and tens of thousands of Greek and Macedonian Jews were sent to the death camps. Once the war ended, scattered bands of Jews from all over Europe and the Middle East, along with survivors from the concentration camps, started making their way back, looking for family and friends. Shattered to find most of them gone, they made their way to the only place that would give them spiritual solace - Israel. Today the majority of Israeli Jews are Sephardim.
The Sephardim retained a striking love for their ancestral country. After the Bosnian war of the 1990s, many petitioned the King of Spain for refuge in a country their ancestors had left hundreds of years before. The ties that bound them to Spain were deep and invisible. Today, when the Middle East is a cauldron of ethnic and religious conflict between Israel and the Arab nations, it's worth remembering that historically, Jews were treated much better by Muslim rulers than by Christian ones. Their history in Spain and in the Ottoman Empire is testament to their doggedness, their resurgent creativity and their sponge-like capacity to absorb critical elements of the surrounding culture while staying true to their roots. It's a glorious and moving history, and Jane Gerber tells it well.

Book Review: "The Pity of It All: A Portrait of the German-Jewish Epoch, 1743-1933", by Amos Elon



Amos Elon’s “The Pity of It All” is a poignant and beautiful history of German Jews from 1743 to 1933. Why 1743? Because in 1740 Frederick the Great liberalized Prussia and allowed freedom of worship. The freedom did not extend to Jews, who still had no political or civil rights, but it did make it easier for them to live in Prussia than in the other thirty-six states of what later came to be called Germany.

The book begins with the story of the first prominent modern German-Jewish intellectual, the fourteen-year-old, barefooted Moses Mendelssohn, who entered Berlin through a gate reserved for “Jews and cattle”. Mendelssohn started an enduring tradition that was to signal both the high watermark of European Jewry and its eventual destruction: the almost vehement efforts of Jews to assimilate, to convert to Christianity, to adapt to German traditions and ways, to become bigger German patriots than most non-Jewish Germans, all while retaining their culture and identity. In fact the entire history of German Jewry is one of striking a tortuous balance between assimilating into the parent culture and preserving one's religion and identity. Mendelssohn became the first great German-Jewish scholar, translating the Pentateuch into German and acquiring an unsurpassed command of both German and Jewish philosophy, culture and history. A chance encounter with a Protestant theologian who exhorted him to convert proved decisive: it convinced Mendelssohn that he should be more proud of his Jewish roots while at the same time seeking to make himself part and parcel of German society. Mendelssohn’s lessons spread far and wide, not least to his grandson, the famous composer Felix Mendelssohn, who used to go to Goethe’s house to play music for him.

Generally speaking, the history of German Jews tracks the political upheavals well. After Prussia became a relatively liberal state and, goaded by Mendelssohn’s example, many Jews openly declared their Judaism while forging alliances with German intellectuals and princes, their condition improved markedly over the preceding centuries. A particularly notable example cited by Elon is the string of intellectual salons, rivaling their counterparts in Paris, started in Berlin by Jewish women like Rahel Varnhagen, which drew Goethe, the Humboldt brothers and the rest of the cream of German intellectual society. The flowering of German Jews as well-to-do intellectuals and respectable members of the elite starkly contrasted with their centuries-old image in the rest of Europe as impoverished caftan-wearers, killers of Christ and targets of the blood libel. Jews had been barred from almost all professions except medicine, and it was in Prussia that they could first enter other professions.

When Napoleon invaded Prussia, his revolutionary code of civil and political rights afforded the German Jews freedom they had not known for centuries. The Edict of 1812 freed the Jews. They came out of the ghettoes, especially in places like Frankfurt, and the Jewish intelligentsia thrived. Perhaps foremost among them was the poet Heinrich Heine, whose astute, poignant, tortured and incredibly prescient poetry and prose formed a kaleidoscope of the sentiments and condition of his fellow Jews. Heine reluctantly converted but was almost destroyed by his torn identity. The Edict of 1812 was met by a rising tide of German nationalism from below, and Jews were quickly pushed back toward second-class status. Heine, along with Eduard Gans and Leopold Zunz, who had founded one of the first societies for the scientific study of Judaism in Germany, had trouble finding academic jobs. The Hep! Hep! riots that started in Würzburg and spread throughout Germany were emblematic of the backlash. Significantly, and as a potent portent, this was the first time that German intellectuals took part in the violent anti-Semitism; later, when the Nazis took over, the legal basis of their murderous anti-Semitism was undergirded by intellectuals, and it was intellectuals who drew up the Final Solution at the Wannsee conference in 1942. Jews began converting in record numbers to escape discrimination.

For the next few decades, straddling this delicate and difficult balance between two identities was to become a hallmark of the Jewish condition in Germany, although scores also converted without any compunction. Writing from Paris in 1834, Heine issued a warning:

“A drama will be enacted in Germany compared to which the French revolution will seem like a harmless idyll. Christianity restrained the martial ardor of the Germans for a time but it did not destroy it; once the restraining talisman is shattered, savagery will rise again - the mad fury of the berserker, of which Nordic bards sing and speak. The old stony gods will rise from the rubble and rub the thousand-year-old dust from their eyes. Thor with his giant hammer will come forth and smash the granite domes.”

Extraordinarily prescient words, especially considering the Nordic reference.

The next upheaval came with the European liberal revolutions of 1848. As is well known, these revolutions overthrew monarchies - temporarily - throughout Europe. For the first time, Germany’s Jews could agitate not just for civil but for political rights. A record number of Jews entered the Prussian parliament under Frederick William IV. Unfortunately even this revolution did not last. Frederick William reneged on his promises, many Jews were either ejected from parliament or rendered impotent, and another rising tide of nationalism engulfed Germany. The next few decades, while not as bad as the ones before, sought to roll back the strides that had been made.

It’s in this context that the rise of Bismarck is fascinating. Bismarck dodges many stereotypes. He was the emblem of Prussian militarism and autocracy, the man who united Germany, but also the pragmatist who kickstarted the German welfare state, pioneering social security and health insurance. When he declared war on France in 1870, patriotic Jews not only fought in the war but funded it. “Bismarck’s Jews” procured the money and helped Bismarck draw up the terms of French capitulation and occupation at Sedan. Among them, Ludwig Bamberger and Gerson Bleichröder were the most prominent - Bleichröder even used stones from Versailles to build a large mansion in Germany. While praising these Jews for their contributions to the war effort, Bismarck stopped short of saying that they should be awarded full rights as German citizens. Nevertheless, in 1871 an edict was passed banning discrimination on the basis of religion in all civil and political functions. It seemed that the long-sought goal of complete emancipation was finally in sight for Germany’s Jews.

But even as patriotic Jews signed up for the Franco-Prussian War, a dissenting note was struck by another Jew, Leopold Sonnemann, publisher of a leading Frankfurt newspaper. In editorial after editorial, he issued stark warnings to Jews and gentiles alike about the rising militarism and rigid social order of Prussia that was taking over all of Germany. He warned Jews that, ironically, their patriotism might cost them more than they bargained for - another prescient Jew who saw what his community’s strenuous efforts to conform were costing it. Sonnemann’s predictions were confirmed almost right away when a recession hit Germany in 1873, among the worst of the previous hundred years. Immediately, as if on cue, anger turned toward the wealthy Jews who had supposedly grown fat and rich during the war while their fellow citizens grew impoverished. In 1879, a prominent Protestant clergyman named Adolf Stoecker began railing against the Jews, calling them a “poison in German blood” - paranoia that would be leveraged to devastating effect by another Adolf half a century later. The Kaiser and Bismarck both disapproved of Stoecker’s virulent anti-Semitic diatribes but thought they might make the Jews more “modest”. To say that this was unfair retaliation against a patriotic group that had bankrolled and significantly aided the war effort would be an understatement.

Even as Bismarck was propagating religious freedom in Germany, anti-Semitism continued to grow elsewhere. Interestingly, in France, where Jews had fared much better since Napoleon, Arthur de Gobineau had published a book arguing for Nordic superiority. Later, the fascinating but deadly English-born Houston Stewart Chamberlain, who would become Wagner’s son-in-law, published the massive “Foundations of the Nineteenth Century” in 1899; it became a kind of Bible for the 20th-century pan-German Völkisch movement that fused nationalism with racialism. Both Gobineau and Chamberlain were to serve as major ‘philosophers’ for Hitler and the Nazis. In France, the Dreyfus affair had already exposed how fragile the situation of French Jews was.

As sentiment against the Jews grew again, German Jews became disillusioned with conversion and conformity. Mysticism-based theologies drawing on the Kabbalah began to be propounded by the likes of Martin Buber. Rather than keep bending over backward to please an ungrateful nation, some sought other means of reform and escape. Foremost among these was the centuries-old dream of returning to the promised land. Theodor Herzl picked up the mantle of Zionism and started trying to convince Jews to migrate to Palestine. Ironically, the main target of his pleas was the Kaiser: Herzl wanted the Kaiser to fund and officially approve Jewish migration to Palestine. Not only would that reduce the Jewish population in Germany and perhaps ease the pressure on gentiles, but the Kaiser would be seen as a great benefactor and liberator. In retrospect Herzl’s efforts have a hint of pathos about them, but at the time they made sense. The irony is that very few German Jews signed on to Herzl’s efforts to emigrate, because they felt at home in Germany. This paradox was to prove the German Jews’ most tragic quality. Where Herzl sought emigration, others like Freud and Marx (who had been baptized as a child) sought secular idols like psychoanalysis and communism. This would have been a fascinating theme in itself, and I wish Elon had explored it in more detail.

As the new century approached and another great war loomed, the themes of 1870 were repeated. The ‘Kaiserjuden’, or Kaiser’s Jews, most prominently Walther Rathenau, bankrolled and ran much of Germany’s war with England and France. Many Jews again signed up out of patriotic duty. Without Rathenau, who was in charge of logistics and supplies, Germany would likely have lost the war within a year or two. Yet once again, the strenuous efforts of these patriotic Jews were forgotten. A young Austrian corporal who had been temporarily blinded by gas took it upon himself to proselytize the “stab in the back” theory, the unfounded belief that it was the Jews who had secretly orchestrated an underhanded deal that betrayed the army and cost Germany the war. The truth of course was the opposite, but it’s important to note that Hitler did not invent the myth of Jewish betrayal. He only masterfully exploited it.

The tragic post-World War 1 history of Germany is well known. The short-lived republics of 1919 were followed by mayhem, chaos and assassinations; the Jews Kurt Eisner in Bavaria and Walther Rathenau were among those murdered. By that time there was one discipline in which Jews had become preeminent - science. Fritz Haber had made a Faustian bargain when he developed poison gas for Germany. Einstein had put the finishing touches on his general theory of relativity by the end of the war and had already become a target of anti-Semitism. Others like Max Born and James Franck were to make revolutionary contributions to science in the turmoil of the 1920s.

Once the Great Depression hit Germany in 1929, the fate of Germany’s Jews was effectively sealed. When Hitler became chancellor in 1933, a group of leading Jewish intellectuals orchestrated a massive but, in retrospect, pitiful attempt to catalog the achievements of German Jews. The catalog included important contributions by artists, writers, scientists, philosophers, playwrights and politicians, in an attempt to convince the Nazis of the foundational contributions that German Jews had made to the fatherland. It all came to nothing. Intellectuals like Einstein soon dispersed. The first concentration camp went up at Dachau in 1933. By 1938 and Kristallnacht, it was all over. The book ends with Hannah Arendt - protégée of Martin Heidegger, who became a committed Nazi - fleeing Berlin in the opposite direction from that in which Moses Mendelssohn had entered the city two hundred years earlier. To no other nation had Jews made more enduring contributions or tried so hard to fit in. No other nation punished them so severely.

Book Review: "Against the Grain", by James Scott

James Scott's important and utterly fascinating book questions what we might call the "Whig view" of history, which goes something like this: in the beginning we were all "savages". We then progressed to becoming hunter-gatherers, and at some point we discovered agriculture and domesticated animals. This was a huge deal because it allowed us to become sedentary. Sedentism then became the turning point in the history of civilization because it led to cities, taxation, monarchies, social hierarchies, families, religion, science and the whole shebang of civilizational wherewithal that we take for granted.

Scott tells us that not only is this idea of progress far from being as certain, linear or logical as it sounds, it's also not as much of a winning strategy as we think. In a nutshell, his thesis is that the transition from hunting and gathering to sedentism was messy and mixed, with many societies sporadically existing in one system or the other. The transition from agriculture to city-states was even less certain and obvious: agriculture emerged about 10,000 years ago, but the first true city-state, Uruk in Mesopotamia, emerged some five thousand years later, around 3000 BC. Until then people existed in temporary and fluctuating states of agricultural and hunter-gatherer existence.

Perhaps an even bigger message of the book concerns the very nature of history, which preserves only the stories that leave traces. Cities form the core of history because they leave traces like large monuments, but life outside cities - which could be far more extensive, as it was until very recently - leaves no traces and is discounted in our narratives. In fact, even after agriculture and the first city-states came along, cities were often temporary and fragmentary; they dispersed because of disease, famine, war, taxation, oppressive rulers, floods and droughts, then re-formed, much like an anthill. The population would live off the land as hunter-gatherers for a while and form city-like complexes again when the time was ripe. As part of his evidence that cities were by no means inevitable, Scott points out that the first civilizations formed around waterways, not in the plains and mountains, and even these were at best mixed models of hunter-gatherer and city-like existence.

Once we accept that cities were by no means enduring or certain, we can start questioning the wisdom of other narratives associated with them. Take, for instance, the all-important status of grains (wheat, barley, corn and rice) as the major staples of the world, then as now. Scott makes the brilliant argument that unlike crops such as potatoes and legumes, grains became the staple of city-dwellers not because they were objectively better in terms of nutrition but because they could be easily taxed: they grew above ground, ripened all at once, and could be counted, assessed and carted away. But grains often consigned city residents to a monoculture, unlike hunting and gathering, which could take advantage of a variety of food sources on land, in water and in brush.

The same arguments apply to the domestication of animals. As Jared Diamond showed in "Guns, Germs, and Steel", many of our modern diseases and pandemics can be traced back to zoonotic origins, so domestication was hardly the wholly blissful invention we assume. With domesticated animals also came rats, sparrows, crows and mice - the so-called commensals - which brought further sources of destruction and disease. Finally, taxation, which was a major feature of cities and which contributed massively to critical developments like slavery and writing, could become very oppressive.

All this means that cities were hardly the nuclei of civilizational progress that we assume them to be. Not surprisingly, especially in a hybrid model, city dwellers often fled the unsanitary, tax-heavy, monoculture-bound environment of cities for a more flexible and open hunter-gatherer existence. In fact the vast majority of the population lived outside cities until very recently. Now, by no means is Scott making the argument, popular among "back to nature" paleo-enthusiasts, that hunting and gathering was fundamentally a better existence. He is saying that hunting and gathering continued to have advantages that made a permanent move to cities far from desirable, let alone inevitable, until very recently. Unfortunately, because cities leave archeological traces, we fall into the mistaken assumption that the history of civilization is the history of city-states.

In the last part of the book, Scott tackles the topic of "barbarians" versus city dwellers. Given the preceding discussion, it should come as no surprise that Scott is deeply skeptical of the word itself, invented by the Greeks and applied generously by the Romans. Compared to the Greek and Roman city-states, the barbarian countryside was often thriving and more desirable to live in. More importantly, the distinction between barbarians and "civilized" folk is fluid and fuzzy - as is now well known in the case of Rome, Romans could become barbarians, and barbarians could become Roman citizens (a point popularized recently in the Netflix show "Barbarians"). Romans often willingly became "barbarians" to escape the oppressive nature of the city-state.

Scott's book is one of the most important books I have read in years; it may well be one of the most important books I will ever read. The best thing about it is that it presents history the way it was, as a series of incidental, messy events whose end outcome was by no means certain. Whatever order we decide to impose on history is of our own making.

Two views of America

The United States is a country settled by Europeans in the 17th and 18th centuries who created a revolutionary form of government and a highly progressive constitution guaranteeing freedom of speech, freedom of religion and other fundamental human rights - a constitution that, crucially, could be amended. It was a development unprecedented in space and time.

At the same time, this creation of the American republic came at great cost involving the displacement and decimation of millions of Native Americans and the institution of chattel slavery on these lands. The original constitution had grave deficiencies and it took a long time for these to be remedied.

Many people can’t hold these two very different-sounding views of America in their minds simultaneously and choose to emphasize one or the other, and this divide has only grown in recent times. But both of these views are equally valid and equally important, and ultimately in order to understand this country and see it progress, you have to be at peace with both views.

But it’s actually better than that, because there is a third, meta-level view which is even more important: that of progress. The original deficiencies of the constitution were corrected and equal rights were extended to a variety of groups who didn’t have them, including women, people of color, Catholics, Jews and immigrants. Chattel slavery was abolished, and Native Americans, while never restored to their previous status, could live in dignity as American citizens.

This was the idea of constantly striving toward a “more perfect Union” that Lincoln emphasized. There were hiccups along the way, but overall there was undoubtedly great progress. Today American citizens are some of the freest people in the world, even with the challenges they face. If you don’t believe this, then you effectively believe that the country is little different from what it was fifty or a hundred years ago.

It seems that this election and these times are fundamentally about whether you can hold the complex, often contradictory history of this country in your mind without conflict, and more importantly whether you subscribe to the meta-level view of progress. Because if you can’t, you will constantly either believe that the country is steeped in irredeemable sin or sweep its inequities under the rug. Not only would both views be pessimistic but both would do a disservice to reality. But if you can in fact hold this complex reality in mind, you will believe that this is a great country not just in spite of its history but because of it.

A Foray into Jewish History and Judaism

I have always found the similarities between Hinduism and Judaism (and between Hindu Brahmins in particular and Jews) very striking. In order of increasing importance:

1. Both religions are very old, extending back unbroken between 2500 and 3000 years with equally old holy books and rituals.

2. Both religions place a premium on rituals and laws, such as dietary restrictions.

3. Hindus and Jews have both endured for a very long time in spite of repeated persecution, exile and oppression, although this is far more true of Jews than of Hindus. Of course, the ancestors of Brahmins bear the burden of caste while Jews have no such thing, but both Hindus and Jews have been persecuted for centuries by Muslims and Christians. At the same time, people of both faiths have also lived in harmony and productive intercourse with these other faiths for almost as long.

4. Both religions place a premium on the acquisition and dissemination of knowledge and learning. Even today, higher education is a priority in Jewish and Hindu families. As a corollary, both religions also place a premium on fierce and incessant argumentation and are often made fun of for this reason.

5. Both religions are unusually pluralistic, secular and open to a variety of interpretations and lifestyles without losing the core of their faith. You can be a secular Jew or an observant one, a zealous supporter or harsh critic of Israel, a Jew who eats pork and still calls himself a Jew. You can even be a Godless Jewish atheist (as Freud called himself). Most importantly, as is prevalent especially in the United States, you can be a “cultural Jew” who enjoys the customs not because of deep faith but because it fosters a sense of community and tradition. Similarly, you can be a highly observant Hindu, a flaming Hindu nationalist, an atheist Hindu who was raised in the tradition but who is now a “cultural Hindu” (like myself), a Hindu who commits all kinds of blasphemies like eating steak and a Hindu who believes that Hinduism can encompass all other faiths and beliefs.

I think it’s this highly pluralistic and flexible system of belief and tradition that has made both Judaism and Hinduism what Nassim Taleb calls “anti-fragile” - not just resilient but actively energized by adversity. Not surprisingly, Judaism has always been a minor but constant interest of mine, and there is no single group of people I admire more. Until now that interest has manifested itself mostly in my study of Jewish scientists like Einstein, Bethe, von Neumann, Chargaff and Ulam, many of whom fled persecution and founded great schools of science and learning. Although I am familiar with the general history, I am planning a deeper dive into Jewish history this year. Here is a list of books that I have either read (*), am reading ($) or plan to read (+). I would be interested in recommendations.

1. Paul Johnson’s “The History of the Jews”. (*)

2. Simon Schama’s “The Story of the Jews”. (*)

3. Jane Gerber’s “The Jews of Spain”. ($)

4. Nathan Katz’s “The Jews of India”. (*)

5. Amos Elon’s “The Pity of It All: A Portrait of the German-Jewish Epoch, 1743-1933”. ($)

6. Norman Lebrecht’s “Genius and Anxiety: How Jews Changed the World: 1847-1947”. ($)

7. Erwin Chargaff’s “Heraclitean Fire”. (*)

8. Stanislaw Ulam’s “Adventures of a Mathematician”. (*)

9. Stefan Zweig’s “The World of Yesterday”. (*)

10. Primo Levi’s “Survival in Auschwitz” and “The Periodic Table”. (*)

11. Robert Wistrich’s “A Lethal Obsession: Anti-Semitism from Antiquity to the Global Jihad”. (*)

12. Jonathan Kaufman’s “The Last Kings of Shanghai”. (This seems quite wild) (+)

13. Istvan Hargittai’s “The Martians of Science”. (*)

14. Bari Weiss’s “How to Fight Anti-Semitism”. (+)

15. Ari Shavit’s “My Promised Land”. (+)

16. Norman Cohn’s “Warrant for Genocide: The Myth of the Jewish World Conspiracy and the Protocols of the Elders of Zion” (*)

17. Irving Howe’s “World of Our Fathers: The Journey of the East European Jews to America and the Life They Found and Made“ (+)

18. Edward Kritzler’s “Jewish Pirates of the Caribbean” (another book that sounds wild) (+)

19. Alfred Kolatch’s “The Jewish Book of Why” (+)

20. Simon Sebag-Montefiore’s “Jerusalem” ($)

Life. Distributed.

One of my favorite science fiction novels is “The Black Cloud” by Fred Hoyle. It describes an alien intelligence in the form of a cloud that approaches the earth and settles by the sun. Because of its proximity to the sun the cloud causes havoc with the climate and thwarts the attempts of scientists to both study it and attack it. Gradually the scientists come to realize that the cloud is an intelligence unlike any they have encountered. They are finally successful in communicating with the cloud and realize that its intelligence is conveyed by electrical impulses moving inside it. The cloud and the humans finally part on peaceful terms.

There are two particularly interesting aspects of the cloud that warrant further attention. One is that it’s surprised to find intelligence on a solid planet; it is used to intelligence being gaseous. The second is that it’s surprised to find intelligence concentrated in individual human minds; it is used to intelligence constantly moving around. The reason these aspects of the story are interesting is because they show that Hoyle was ahead of his time and was already thinking about forms of intelligence and life that we have barely scratched the surface of.

Our intelligence is locked up in a three-pound mass of wet solid matter, a result of the development of the central nervous system. The central nervous system was one of the great innovations in the history of life. It allowed organisms to concentrate their energy and information-processing power in a single mass that sent out tentacles communicating with the rest of the body. The tentacles are important, but the preponderance of the brain’s capability resides in the brain itself, a single organ that cannot be detached or disassembled and moved around. From dolphins to tigers and from bonobos to humans, we find the same basic plan, and for good reasons. The central nervous system is an example of convergent evolution - the tendency of evolution to find the same solutions to complex problems independently. Especially in Homo sapiens, the central nervous system and the consequent development of the neocortex are seen as the crowning glory of evolution.

And yet it’s the solutions that escaped the general plan that are in a sense the most interesting. Throughout the animal and plant kingdoms we find examples not of central but of distributed intelligence, like Hoyle’s cloud. Octopuses are particularly fascinating examples. They can smell, touch and understand not just through their conspicuous brains but through their tentacles; they are even thought to “see” color through these appendages. But to find the ultimate examples of distributed intelligence, it might be prudent to look not at earth’s most conspicuous forms of life but at its most obscure - fungi. Communicating the wonders of distributed intelligence through the story of fungi is what Merlin Sheldrake accomplishes in his book, “Entangled Life”.

Fungi have always been our silent partners, partners much more like us than we imagine. Like bacteria they are involved in an immense number of activities that both aid and harm human beings, but most interestingly, fungi, unlike bacteria, are eukaryotes and are therefore, counterintuitively, evolutionarily closer to us than to their superficially similar bacterial counterparts. And they get as close to us as we can imagine. Penicillin is famously produced by a fungus, while antifungal drugs like fluconazole are used to kill fungal infections, which can be deadly; Aspergillus forms clumps in the lungs that can rapidly kill patients by spreading through the bloodstream. Fungi charm purveyors of gastronomic delights everywhere in the world as mushrooms, and they charm purveyors of olfactory delights as truffles - a small lump can easily sell for five thousand dollars. Last but not least, fungi have transported millions of humans into other worlds and artistic explosions of color and sight by inducing hallucinations.

With this diverse list of vivid qualities, it may seem odd that perhaps the most interesting quality of fungi lies not in what we can see but in what we can’t. Mushrooms may grace dinner plates in restaurants and homes around the world, but they are merely the fruiting bodies of fungi. Fungi may be visible as clear vials of life-saving drugs in hospitals. But as Sheldrake describes in loving detail, the most important parts of fungi are hidden below the ground. These are the vast networks of the fungal mycelium – sheer, gossamer, thread-like structures snaking their way through forests and hills, sometimes spreading over hundreds of square miles, occasionally as old as the Neolithic revolution, all out of sight of most human beings and visible only to the one entity with which they have forged an unbreakable, intimate alliance – trees. Dig a little into a tree root and put it under a microscope and you will find wisps of what seem like even smaller roots, except that these roots penetrate into the tree’s roots. The wisps are fungal mycelium. They are everywhere: around roots, under them, over them and inside them. At first glance the ability of fungal networks to penetrate tree roots might evoke pain and images of an unholy, literal physical union of two species. It is certainly a physical union, but it may be one of the holiest meetings of species in biology. In fact it might well be impossible to find a tree whose roots have no interaction with fungal mycelium. The vast network of fibers the mycelium forms is called a mycorrhizal network.

The mycorrhizal networks that wind their way in and out of tree roots are likely as old as trees themselves. The alliance almost certainly exists because of a simple matter of biochemistry. When plants first colonized land they possessed the miraculous ability of photosynthesis, which completely changed the history of life on this planet. But unlike carbon, which they can literally pull out of thin air using sunlight, they still have to find the other essential nutrients of life: metals like magnesium and other life-giving elements like phosphorus and nitrogen. Because of their intrinsic lack of mobility, plants and trees had to find someone who could bring these essential elements to them. The answer was fungi. Fungal networks stretching across miles ensured that nutrients could be shuttled back and forth between trees. In return the fungi could consume the precious carbon that the tree sank into its body – as much as twenty tons during a large tree’s lifetime. It was the classic example of symbiosis, a term coined by the German botanist Albert Frank, who also coined the term mycorrhiza.

However, the discovery that fungal networks could supply trees with essential nutrients in a symbiotic exchange was only the beginning of the surprises they held. Sheldrake talks in particular about the work of the mycologists Lynne Boddy and Suzanne Simard, who have found qualities in the mycorrhizal networks of trees that can only be described as deliberate intelligence. Here are a few examples. Fungi seem to “buy low, sell high”, providing trees with important elements when the trees have fallen on hard times and liberally borrowing from them when they are doing well. Mycorrhizal networks also show electrical activity and can discharge a small burst of electrochemical potential when prodded. They can entrap nematodes in a kind of death grip and extract their nutrients; they can do the same with ants. Perhaps most fascinatingly, fungal mycelia display “intelligence at a distance”: one part of a huge fungal network seems to know what another part is doing. The most striking experiment demonstrating this shows oyster mushroom mycelium growing on a piece of wood and spreading in all directions. When another piece of wood is placed at a distance, within a few days the fungal fibers spread to and latch onto it. This by itself is perhaps unsurprising. What is surprising is that once the fungus discovers the new food source, it almost instantly pares down growth in all other parts of its network and concentrates it in the direction of the new piece of wood. Even more interestingly, scientists have found that the hyphae, or tips, of fungi can act not only as sensors but as primitive Boolean logic gates, opening and closing to allow only certain branches of the network to communicate with each other. There are even attempts to use fungi as primitive computers.
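To make the Boolean-gate idea concrete, here is a toy sketch – purely illustrative, not a model of real hyphal biology – of how junctions that open and close like AND and OR gates could let a branching network compute which signals reach its core. All function and variable names are my own inventions for the illustration.

```python
# Toy sketch: hyphal junctions treated as Boolean gates deciding which
# branches of a network may pass a signal. Names are hypothetical.

def and_gate(a, b):
    """A junction that forwards a signal only if both branches agree."""
    return a and b

def or_gate(a, b):
    """A junction that forwards a signal if either branch carries one."""
    return a or b

def network(tip1, tip2, tip3, tip4):
    """Four sensing tips feed two AND junctions; either junction
    reaching the 'trunk' suffices (OR)."""
    junction1 = and_gate(tip1, tip2)
    junction2 = and_gate(tip3, tip4)
    return or_gate(junction1, junction2)

print(network(True, True, False, False))  # True: first pair agrees
print(network(True, False, False, True))  # False: neither pair agrees
```

The point of the sketch is only that open/close decisions at branch points are enough, in principle, to build arbitrary logic – which is why the idea of fungal computing is not absurd on its face.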

This intelligent long-distance relay is mirrored in the behavior of the trees with which the fungi form a mind meld. One of the characters in Richard Powers’s marvelous novel “The Overstory” discovers how trees whisper hidden signals to each other, not just through fungal networks but through ordinary chemical communication. The character Patricia Westerford finds that when insects attack one tree, it can send out a chemical alarm that alerts trees even dozens of meters away to its plight, causing them to kick their own production of repellent chemicals into high gear. Meeting the usual fate of scientists with novel ideas, Westerford and her ideas are first ignored, then mocked and ostracized, and ultimately grudgingly accepted. But the discovery that trees and their fungal networks communicate with each other, through the agency of both chemicals and other organisms like insects, is now accepted enough to be part of both serious scientific journals and prizewinning novels.

Fungi can also show intelligent behavior by manipulating our minds, and this is where things get speculative. Psilocybin has been used by shamans for thousands of years, and LSD by hippies and Silicon Valley tech entrepreneurs for decades. When you are familiar with both chemistry and biology, it is natural to ask what the evolutionary utility might be of chemical compounds that bring about changes in perception so profound and seemingly liberating that someone like Aldous Huxley made sure he was on a psychedelic high at the moment of his death. One interesting clue arises from the discovery of these compounds in the chemical defense responses of certain fungi. Clearly the microorganisms that are engaged in a war with fungi – and these often include other fungi – lack a central nervous system and have no concept of a hallucination. But if these compounds are found as part of the wreckage of fungal wars, maybe defense was their original purpose, and the fact that they happen to take humans on a trip is only incidental.

That is the boring and likely explanation. The interesting and unlikely explanation that Sheldrake alludes to is to consider a human – in the medley of definitions that humans have lent themselves to – as a vehicle for a fungus to propagate itself. In this Selfish Fungus theory, magic mushrooms and ergot have hijacked our minds so that more of us will use them, cultivate and tend them and love them, ensuring their propagation. Even though their effects on us might be incidental, they can help us in unexpected ways: if acid and psilocybin trips can spark even the occasional discovery of a new mathematical object or a new artistic style, both the fungus’s purpose and the humans’ purpose are served. I have another interesting theory of psychedelic mushroom-human co-evolution in mind, one that draws on Julian Jaynes’s idea of the bicameral mind. According to Jaynes, humans may have lacked consciousness until as recently as 3000 years ago because their mind was divided into two parts, one of which “spoke” and the other of which “listened”. What we call gods speaking to humans was the speaking side holding forth. Is it possible that at some point humans got hold of psychedelic fungi, and the fungi hijacked a more primitive version of the speaking mind, allowing it to turn into a full-blown voice inside the other mind’s head, so to speak? Jaynes’s theory has been called “either complete rubbish or a work of consummate genius, nothing in between” by Richard Dawkins, and psychedelic co-evolution might offer one more way to probe which it is.

It is all too easy to anthropomorphize trees and especially fungi, which only indicates how interestingly they behave. One can say that “trees give and trees receive”, “trees feel” and even “trees know”, but at a biological level is this behavior little more than a series of Darwinian business transactions, purely driven by natural selection and survival? Maybe, but ultimately what matters is not what we call the behavior but the connections it implies. And there is no doubt that fungi, trees, octopuses and a few other assorted creatures display a unique type of intelligence that humans may have merely glimpsed. Distributed intelligence clearly has a few benefits over central, localized intelligence. Unlike humans, who are unlikely to live when their heads are cut off, flatworms like planarians can regrow their heads when they are severed, so there is certainly a survival advantage conferred by not having your intelligence organ be one and done. This principle has been exploited by the one form of distributed intelligence that is an extension of human beings and that has taken over the planet – the Internet. Among the many ideas regarded as origins of the Internet, one was conceived by the defense department, which wanted to build a communications net that would be resilient in the face of nuclear attack. The key was a distributed network in which no single node was central. Servers in companies like Google and Facebook are likewise arranged so that a would-be hacker or terrorist would have to take out many of them, not just a few, to measurably impair the network.

I also want to posit the possibility that distributed systems might be more analog than central ones and might therefore confer unique advantages. Think of a distributed network of water pipes, arteries, traffic lanes or tree roots and fungal networks, and one has in mind the image of a network that can almost instantaneously transmit changes in parameters like pressure, temperature and density from one part of the network to another. These are all good examples of analog computation, although in the case of arteries the analog process is built on a substrate of digital neuronal firing. The human body is clearly a system in which a combination of analog and digital works well, but looking at distributed intelligence one gets the sense that we could optimize our intelligence significantly using more analog computing.
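The analog picture can be sketched numerically. In the toy below (an assumption-laden illustration, not a physical model of any real pipe or root system), each node of a network continuously relaxes toward the average of its neighbors, so a local "pressure" spike at one end spreads through the whole network with no central controller reading it out.

```python
# Toy sketch: an "analog" network where each node relaxes toward the
# average of its neighbors, diffusing a local disturbance everywhere.

def relax(values, adj, steps=50, rate=0.5):
    """Iteratively nudge each node toward its neighborhood average."""
    for _ in range(steps):
        new = values[:]
        for i, nbrs in adj.items():
            avg = sum(values[j] for j in nbrs) / len(nbrs)
            new[i] = values[i] + rate * (avg - values[i])
        values = new
    return values

# A chain of five nodes; a pressure spike at one end...
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
pressure = [1.0, 0.0, 0.0, 0.0, 0.0]
final = relax(pressure, adj)
# ...ends up felt, in equalized form, at every node of the network.
print([round(p, 3) for p in final])
```

After enough relaxation steps the disturbance is shared across all nodes, the continuous analogue of the instantaneous parameter-sharing that pipes, arteries and mycorrhizal networks do physically.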

There is no reason why intelligence may not be predominantly analog and distributed, so that it becomes resilient, sensitive and creative like mycorrhizal networks: able to guard itself against existential threats, respond to new locations of food and resources, and construct new structures with new form and function. One way to make human intelligence more analog and distributed would be to enable human-to-human connections through high-fidelity electronics that allow a direct flow of information to and from human brains. A more practical solution might be to enable downloading brain contents, including memories, into computers and then allowing these computers to communicate with each other. I do not know whether this advance will take place during my lifetime, but it could certainly bring us closer to being a truly distributed intelligence that, just like mycorrhizal networks, is infinitely responsive, creative, resilient and empathetic. And then perhaps we will know exactly what it feels like to be a tree.