
A Science Thanksgiving

It’s Thanksgiving weekend here in the U.S., and there’s an informal tradition on Thanksgiving to give thanks for all kinds of things in our lives. Certainly there’s plenty to be thankful for this year, especially for those of us whose lives and livelihoods haven’t been personally devastated by the coronavirus pandemic. But I thought I would do something different this year. Instead of being thankful for life’s usual blessings, how about being thankful for some specific facts of nature and the universe that are responsible for our very existence and make it wondrous? Being employed and healthy and surrounded by family and friends is excellent, but none of that would be possible without the amazing unity and diversity of life and the universe. So without further ado and in no particular order, I present an entirely personal selection of ten favorites for which I am eternally thankful.

I am thankful for the value of the resonance energy of the excited state of carbon-12: carbon-12, which is the basis of all organic life on earth, is formed in stars through the reaction of beryllium-8 with helium-4. The energy difference between the starting materials (beryllium + helium) and the carbon resonance is only about 4%. If this difference had been even slightly larger, the unstable beryllium-8 would have decayed long before it could be transmuted into carbon-12, making life impossible.
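For the curious, that 4% figure can be checked on the back of an envelope. Here is a minimal sketch in Python, using the standard published values for the carbon-12 resonance (the famous "Hoyle state") and for the beryllium-8 + helium-4 threshold; the specific numbers are my addition, not the post's:

    # Energies in MeV, measured relative to the carbon-12 ground state.
    hoyle_state = 7.654     # excitation energy of the carbon-12 resonance
    be8_plus_he4 = 7.367    # rest energy of beryllium-8 + helium-4

    gap = hoyle_state - be8_plus_he4
    print(f"Resonance sits {gap:.3f} MeV above the reactants "
          f"({100 * gap / be8_plus_he4:.1f}% of the threshold energy)")
    # -> about 0.29 MeV, i.e. roughly 4%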

I am thankful for the phenomenon of horizontal gene transfer (HGT): it allowed bacteria during early evolution to jump over evolutionary barriers by sharing genetic material between themselves instead of just with their progeny. The importance of HGT for evolution may be immense since regular HGT early on might have led to the universality of the genetic code. HGT mixed and matched genetic material in the cauldron of life, eventually leading to the evolution of multicellular organisms including human beings.

I am thankful for the pistol shrimp: an amazing creature that can “clap” its pincers and send out a high-pressure cavitation bubble with lightning speed to kill its prey. This bubble can produce a flash of light when it collapses, and the collapse is so fast that the temperature inside the bubble can briefly approach the surface temperature of the sun. The pistol shrimp shows us that nature hides phenomena that are not dreamt of in our philosophy, leading to an inexhaustible list of natural wonders for us to explore.

I am thankful to the electron: an entire universe within a point particle that performs the subtlest and most profound magic, making possible the chemistry of life; giving rise to the electromagnetic force that holds ordinary matter together; ultimately creating minds that can win prizes for studying electrons.

I am thankful to the cockroach: may humanity have the resilience to survive the long nights of our making the way you have.

I am thankful to the redwoods: majestic observers and guardians of nature who were here before us, who through their long, slow, considered lives have watched us live out our frantic, anxious lives the way we watch ants live out theirs, and whose survival is now consequentially entwined with our own.

I am thankful to the acetyl group, a simple geometric arrangement of two carbon atoms and one oxygen atom whose diverse, myriad forms fueling life and alleviating pain – acetylcholine, acetyl-coenzyme A, acetaminophen – are a tribute to the ingenuity of both human minds and nature.

I am thankful to i, the square root of minus one: who knew that this diabolical creature, initially alien even to the abstract perception of mathematicians, would turn out to be as “real” as the real numbers and, more importantly, underlie our most hallowed descriptions of nature, such as quantum theory.

I am thankful to the black hole: an endless laboratory of the most bizarre and fantastic wonders; trapping light but letting information escape; providing the ultimate playground for spacetime curvature; working relentlessly over billions of years as a clearinghouse and organizing principle for the universe’s wayward children; proving that the freaks of the cosmos are in fact the soul food of its very existence.

I am thankful for time: that elusive entity which, in the physicist John Wheeler’s words, “keeps everything from happening all at once”; which waits for no one and grinds kings and paupers into the same ethereal dust; whose passage magically changes children every day before our very eyes; whose very fleeting nature makes life precious and gives us the most to be thankful for.

Book review: A Divine Language: Learning Algebra, Geometry, and Calculus at the Edge of Old Age, by Alec Wilkinson

A beautifully written account of mathematics lost and found. The author became "estranged" from mathematics in school and now, at the age of 65 and after a distinguished writing career, has taken it upon himself to learn the fundamentals of algebra, geometry and calculus. The book is by turns funny and sad as Wilkinson recounts his struggling attempts to master material that would be child's play for many bright teenagers. He is helped in his efforts by his niece Amie Wilkinson, an accomplished mathematician at the University of Chicago. I could empathize with the author since I too had an estrangement of sorts from the subject in high school because of a cruel, vindictive teacher, and it took me until college, thanks to brilliant and empathetic teachers, to claw my way back to appreciating it.

But while he may struggle even with high school mathematical skills (and he and I share a particular loathing for word problems), Wilkinson brings a poetic, philosophical sensibility, acquired over a long career, to bear on the topic that no 15-year-old whippersnapper genius in math could commit to paper. He ruminates on the platonic beauty of math and wonders whether and how some people's minds might be wired differently for it. He does not always understand how his brilliant mathematical niece Amie always "gets it", and she in turn doesn't always understand why her uncle has trouble with ideas that are second nature to her.

Often quoting from eloquent mathematicians and physicists like Bertrand Russell, G. H. Hardy and Roger Penrose, Wilkinson brings a fresh, beautiful perspective to the utility and beauty of mathematics; to the struggle inherent in mastering it and the rewards that await those who persevere. I would highly recommend the book to those who may have lost faith in mathematics in high school and want to pick up some of the concepts later, or even to young students of math who may be wizards at solving equations but who might want to acquire a broader, more philosophical perspective on this purest of human endeavors.

Temple Grandin vs algebra

There's a rather strange article by Temple Grandin in the Atlantic, parts of which had me vigorously nodding my head and parts of which had my eyebrows crawling straight up. It's a critique of how our school system tries a one-size-fits-all approach that does a lot of students a disservice, but more specifically it takes aim at algebra.

First, let me say how much I admire Temple Grandin. A remarkable woman who had severe autism for most of her childhood (there's a very good profile of her in Oliver Sacks's "An Anthropologist On Mars"), she rose above her circumstances and channeled her unusual abilities into empathy for animals, becoming one of the world's leading experts in the design of humane housing and conditions for livestock. She has without a doubt demonstrated the value of what we can call 'non-standard' modes of thinking, teaching and learning that utilize visual and tactile ability. So she starts off strong enough here:

As a professor of animal science, I have ample opportunity to observe how young people emerge from our education system into further study and the work world. As a visual thinker who has autism, I often think about how education fails to meet the needs of our very diverse minds. We are shunting students into a one-size-fits-all curriculum instead of nurturing the budding builders, engineers, and inventors that our country needs.

So far so good. In fact, let me digress a bit here. When I was in high school I was very good at geometry but terrible at algebra; I still remember one midterm where I got an A in geometry - in fact the highest points-based grade in the class - but almost flunked algebra. It took me a long time to claw back to a position where algebra made sense to me. This appreciation of visual explanations was part of what drew me to chemistry, so I perfectly understand what Grandin is saying about being sympathetic to students who might have more of a visual capacity.

But further down the page she takes a detour into the evils of algebra that doesn't make sense to me. Again, some of what she says is spot on; for instance, the fact that algebra (and math in general) can be taught much better if you relate it to the real world. Too often it's presented simply as abstraction and symbol manipulation. But then there's this:

Cognitive skills may simply not be developed enough to handle abstract reasoning before late adolescence, which suggests that, at the very least, we’re teaching algebra too early and too fast. But abstract reasoning is also developed through experience, which is a good argument for keeping all those extracurriculars.

This part may make more of a case for tying algebra to specific real-world applications than doing away with the abstractions per se. The fact of the matter is that math is abstract; in fact it's precisely this abstraction that makes it a powerful general tool. And there are good and bad ways of teaching that abstraction, but the solution isn't to get rid of it or delay it. In fact, that kind of thinking feeds into the popular belief seen in some quarters these days that algebra and calculus both need to be optional classes.

It's when she gets to the end of the piece, however, that Grandin completely loses me:

"No two people have the same intelligence, not even identical twins. And yet we persist in testing—and teaching—people in the same way. We don’t need Americans to be better at algebra, per se. We need future generations that can build and repair infrastructure, overhaul energy and agriculture, develop robotics and AI. We need kids who grow up with the imagination to invent the solutions to pandemics and climate change. When school fails them, it fails all of us."

Say what? Building and repairing infrastructure, overhauling energy and agriculture and - especially - developing robotics and AI do not need algebra? In fact, most of these professions require a very solid grounding in abstract aspects of algebra and calculus. I think Grandin slides rather too easily from saying that algebra should be taught better to saying that we should get rid of it or make it optional. Two very different things.

My concern, based on this article and others I am reading these days, is that in our drive to reform the system, we are tempted to declare algebra itself unnecessary. That is a grave mistake. Algebra and calculus, and for that matter music and art, are things that, even beyond the practical utility of the first two, help us better appreciate our place in society and the cosmos and in general teach us to be more human. Make them better we certainly should, but let's not burn the building down in our zeal.

David McCullough (1933-2022)

I have been wanting to write about David McCullough, who passed away recently and whose writings I always enjoyed. McCullough was arguably one of the finest popular historians of his generation. His biographical portraits and writings were wide-ranging, covering a variety of eras: from "1776" and "John Adams" about the revolutionary period, through "The Great Bridge" about the building of the Brooklyn Bridge in the 1860s, to "Truman" about Harry Truman's life and presidency. "Truman" is in fact the best presidential biography I have read. In spite of its size it never bogs down, and it paints a fair and balanced portrait of the farmer from Missouri who became an unlikely and successful president.

McCullough's writing style and approach to history warrant some discussion. He was what you would call a gentleman writer: amiable, avuncular, genteel, not one to kick up dust or to engage in hard-hitting journalism; the opposite of Howard Zinn. Although his writing was balanced and he stayed away from hagiography, it was also clear that he was fond of his subjects, and that fondness might sometimes have led him away from a completely objective, critical approach.

That style opened him up to criticism. For instance, his "The Pioneers", which described the opening up of the Ohio country under the Northwest Ordinance of 1787 championed by Manasseh Cutler, came under scrutiny for its omission of the brutal and unfair treatment of Native Americans in the new territory. The ordinance was actually quite revolutionary for its time since it outlawed slavery, and it effectively laid the fuse for the divisions between slave and free states in the 1850s and the ensuing Civil War. McCullough did emphasize this positive aspect of the ordinance, but not the negative repercussions for Native Americans.

That streak is emblematic of his other writings. He never shied away from the evils of slavery, treatment of Native Americans or oppression of women, but his gaze was always upward, toward the better angels of our nature. Most characteristic of this style is his "The American Spirit", written at a fraught time in this country's history. As I mentioned in my review of the book, McCullough's emphasis is on the positive aspects of this country's founding and the founders' emphasis on individual rights and education, even if some of them personally fell short of observing those rights for others.

While I understand that McCullough might have had a bias toward the better parts of this country's history, I think that's the right approach, especially today. That is because a lot of Americans on both sides have acquired a strangely and fundamentally pessimistic attitude toward both our past and our future. They seem to think that the country was born and steeped in sin that cannot be expiated. This is a deeply flawed perspective in my opinion. Perhaps as an immigrant I am more mindful of the freedoms and gifts that this country has bestowed on me, freedoms that are still rare compared to many other countries, but I share McCullough's view that whatever the substantial sins this country was born in and perpetuated, its moral arc, as Martin Luther King would say, has always bent upward and toward justice. In many ways the United States, through its constitution, laid the foundations for democracy and freedom that have been emulated, in big and small ways, by most of the world's successful democracies. The leaders and activists of this country were themselves mindful that their country was not conforming to that more perfect union described in its founding documents.

Progress has not been linear, certainly, but it has been steady. I think it's appropriate to complain that some aspects of progress should have taken much less time than they did - unlike many other countries, the United States still has not had a female president, for instance - but that's different from saying that progress was made only by certain groups of people or that it wasn't made at all. As just one example, while African-Americans took the lead in the civil rights movement, there was no dearth of white Americans - religious activists like Benjamin Lay, firebrand speakers like William Lloyd Garrison, women suffragists - who also wanted to end slavery. In addition, as David Hackett Fischer details in his monumental new study of black Africans' contributions to the country's early years, black and white people often worked hand in hand to win gains big and small for slaves and freedmen alike. Recognizing this unity in diversity - E pluribus unum - is central to recognizing the essence of America.

The United States has been a melting pot of different kinds and dimensions since before its founding, and all elements of this melting pot helped shape progressive views in this country. To privilege only certain elements does a disservice to the diversity that this country has exemplified. David McCullough knew this. He distinguished himself by telling us in his many writings how a constant stream of progressive forces emerged from all quarters of society, including all races and economic classes, to help this country implement its founding ideals of liberty and equality. Even when the sky appeared darkest, as happened often in our history, these forces provided the proverbial silver lining for all of us to aspire to. We need more of that sentiment today. McCullough will be missed, but his writings should provide a sure guide.

Book review: "The Apocalypse Factory: Plutonium and the Making of the Atomic Age", by Steve Olson

In the history of the Manhattan Project, Los Alamos has always been the star, with Hanford and Oak Ridge - where plutonium was produced and uranium enriched, respectively - as supporting actors. Steve Olson's goal is to restore Hanford to what he sees as its rightful place as the project's most important site in retrospect. Its product, plutonium, is now the element of choice in the vast majority of the world's nuclear arsenals. And its production has left behind an environmental catastrophe beyond reason.

Olson has written a lively and thought-provoking book about the "devil's element" and the global catastrophe and promise it has bred. Olson's account especially shines in the first half as he describes Glenn Seaborg, Joseph Kennedy and Arthur Wahl discovering plutonium at Berkeley in early 1941. Very quickly plutonium's promise became clear: unlike uranium, whose rare fissionable isotope (uranium-235) took herculean efforts to separate from its more copious cousin (uranium-238), plutonium, being a different element from uranium, could be separated from its parent uranium by relatively simple chemical means. It was also clear that plutonium could be fissioned more efficiently than uranium, so less of it was needed to build a bomb; if this elementary fact of nature had not been true, enough plutonium would never have been produced in time for the bomb that destroyed Nagasaki, and the world's nuclear arsenals might have looked very different. As it turned out, while the Hiroshima bomb needed about 140 pounds (63 kilograms) of uranium, the Nagasaki bomb needed only about 13 pounds (6 kilograms) of plutonium. It is still stupendous and terrifying to think that an amount of plutonium that can be carried as a cube about 3 inches on a side can destroy an entire city.
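As a rough sanity check on that three-inch figure, here is a small Python sketch using the textbook density of plutonium metal; the numbers are illustrative assumptions of mine, not figures from Olson's book:

    # Rough size of ~6 kg of plutonium treated as a solid cube.
    mass_g = 6000.0                    # ~13 pounds, the Nagasaki core
    density = 19.8                     # g/cm^3, alpha-phase plutonium
    volume = mass_g / density          # ~303 cm^3
    side_cm = volume ** (1.0 / 3.0)    # cube root -> ~6.7 cm
    print(f"Cube side: {side_cm:.1f} cm (~{side_cm / 2.54:.1f} inches)")
    # -> about 2.6 inches, consistent with "roughly 3 inches on a side"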
The first hulking reactor at Hanford (the B Reactor) went up soon after under the watchful eyes of Enrico Fermi, Eugene Wigner and the DuPont company; the first batch of plutonium from Hanford was produced at the beginning of 1945. Olson's book has amusing accounts of the differences in philosophy between the DuPont engineers and the physicists; the engineers thought the physicists considered everything too simple, the physicists thought the engineers made everything too complex. Of special note was Crawford Greenewalt, a bright young engineer who had married into the DuPont family and who orchestrated DuPont's building of the reactor. Somehow peace was brokered, and the warring factions worked well together during the rest of the war. The plutonium in the Nagasaki bomb came from Hanford, the high spontaneous fission rate of reactor-bred plutonium necessitating a revolutionary new design - implosion - used in that bomb and pretty much all its successors.
Olson's account of the Nagasaki mission is gripping. The poor city was the third choice after Hiroshima. Kokura, the second choice, turned out to have significant cloud cover. So did Nagasaki, but at that point the crew of 'Bockscar', the B-29 bomber delivering the bomb, made a last-minute decision to bomb in spite of the visual bombing requirement that had been mandated. After the war, even Manhattan Project chief General Leslie Groves, who never publicly regretted the bombings, said privately that he did not think Nagasaki was necessary.
As the Cold War heated up, the Hanford site became the principal source of plutonium for the tens of thousands of nuclear weapons that were to fill the missiles, bombers and submarines of the United States, a number many times that necessary to bring about the destruction of the entire planet in a nuclear exchange between the two superpowers. The reactors were powered down in the 60s and early 70s, only to be powered up again during the hawkish administration of Ronald Reagan. Another kind of destruction was wrought during their operation. In the haste to make plutonium, billions of gallons of toxic radioactive and chemical sludge and waste were stored in makeshift steel tanks underground; some of this effluent was released into the mighty Columbia River. The scientists and engineers and politicians who made Hanford did not quite understand the profoundly difficult long-term problem that these long-lived radioactive materials would pose for humanity. Even today, the Hanford site is often referred to as the most contaminated site in the world, and it is estimated that it could take up to $640 billion to clean it up.
With plutonium also came jobs and families and hospitals and schools. Olson who grew up in the area talks about the complicated relationship people whose fathers and grandfathers and grandmothers worked on the reactors have with the site. On one hand, they are proud that their work contributed to the end of World War 2 and preserved America's edge and possibly survival during the Cold War; on the other hand, they worry about the bad reputation that the site has gotten as the principal protagonist in creating weapons of mass destruction. Most of all, they worry about the potential cancers that they think the contaminated site might have caused. As Olson documents, studies have found tenuous links at best between the radiation at the site and the rate of cancers, but it's hard to convince people who believe that any amount of radiation must be bad.
Today the Hanford site is part of the Manhattan Project National Historical Park that also encompasses Oak Ridge and Los Alamos (I have been wanting to go on a tour for a long time). The B Reactor no longer produces the devil's element. Instead it stands as a mute testament to humankind's discovery of the means of its own destruction. That nuclear weapons have never been used in anger since August 1945 might in the future elevate it to an importance that we cannot yet gauge.

The root of diverse evil

It wasn’t very long ago that I was rather enamored with the New Atheist movement, of which the most prominent proponent was Richard Dawkins. I remember having marathon debates with a religious roommate of mine in graduate school about religion as the “root of all evil”, as the producers of a documentary by Dawkins called it. Dawkins and his colleagues made the point that no belief system in human history is as all-pervasive in its ability to cause harm as religion.

My attitude toward religion started changing when I realized that what the New Atheists were criticizing wasn’t religion but a caricature of religion that was all about faith. Calling religion the “root of all evil” was also a bad public relations strategy since it opened up the New Atheists to obvious criticism – surely not all evil in history has been caused by religion? But the real criticism of the movement goes deeper. Just like the word ‘God’, the word ‘religion’ is a very broad term, and people who subscribe to various religions do so with different degrees of belief and fervor. For most moderately religious people, faith is a small part of their belonging to a religion; rather, it’s about community and friendship and music and literature and what we can broadly call culture. Many American Jews and American Hindus for instance call themselves cultural Jews or cultural Hindus.

My friend Freeman Dyson made this point especially well, and he strongly disagreed with Dawkins. One of Freeman’s arguments, with which I still agree, was that people like Dawkins set up an antagonistic relationship between science and religion that makes it seem like the two are completely incompatible. Now, irrespective of whether the two are intellectually compatible or not, it’s simply a fact that they have coexisted in practice, as evidenced by scores of scientists throughout history like Newton, Kepler and Faraday who were both undoubtedly great scientists and devoutly religious. These scientists satisfied one of the popular definitions of intelligence – the ability to simultaneously hold two opposing thoughts in one’s mind.

Dyson thought that Dawkins would make it hard for a young religious person to consider a career in science, which would be a loss to the field. My feelings about religion as an atheist are still largely the same: most religion is harmless if it’s practiced privately and moderately, most religious people aren’t out to convert or coerce others, and most of the time science and religion can be kept apart, except when they tread into each other’s territory (in which case, as with young earth creationism, scientists should fight back as vociferously as they can).

But recently my feelings toward religion have soured again. A reference point for this change is a particularly memorable quote by Steven Weinberg, who said, “Without religion good people will do good things and bad people will do bad things. But for good people to do bad things, that takes religion.” Weinberg got a lot of flak for this quote, and I think it’s because of a single word in it that causes confusion. That word is “good”. If we replace that word with “normal” or “regular”, his quote makes a lot of sense: “For normal people to do evil or harm, that takes religion.” What Weinberg is saying is that people who are otherwise reasonable and uncontroversial and boring in their lives will do something exceptionally bad because of religion. This discrepancy is not limited to religious ideology – the Nazis at Auschwitz were also otherwise “normal” people who had families and pets and hobbies – but religious ideology, because of its unreason and reliance on blind faith, seems to pose a particularly all-pervading example. Religion may not be the root of all evil, but it certainly may be the root of the most diverse evil.

I was reminded of Weinberg’s quote when I read about the shocking attack on Salman Rushdie a few weeks ago. Rushdie famously had to go into hiding for a long time and abandon any pretense of a normal life because of an unconscionable death sentence, or fatwa, issued against him by Ayatollah Khomeini of Iran. Rushdie’s attacker is a 24-year-old man named Hadi Matar who was born in the United States but was radicalized after a trip to Lebanon to see his father. By many accounts, Matar was a loner but otherwise a normal person. The single enabling philosophy that motivated him to attack and almost kill Rushdie was religious. As Weinberg would say, without religion he would have just been another disgruntled guy, but it was religion that gave him a hook to hang his toxic hat on. Even now Matar says he is “surprised” that Rushdie survived. He also says that he hasn’t even read the controversial ‘Satanic Verses’ which led to the edict, which just goes to show what intellectually vacuous, mindless sheep the religiously motivated can be.

I had the same feelings, even more strongly, when I looked up the stories of the Boston marathon bombers, the brothers Dzhokhar and Tamerlan Tsarnaev. By any account theirs should have been the quintessential American success story: both were brought to this country from war-torn Chechnya, placed in one of the most enlightened and progressive cities in the United States (Cambridge, MA) and given access to great educational resources. What, if not religious ideology, would lead them to commit such mindless, horrific acts against innocent people? Matar and the marathon bombers are perfect examples of Weinberg’s adage – it was religion that led them down a dark path and made the crucial difference.

The other recent development that has made me feel depressed about the prospects for peace between religion and secularism is the overturning of Roe v. Wade by the United States Supreme Court. In doing so, the Supreme Court overturned a precedent with which a significant majority (often cited as at least 60%) of Americans agree. Whatever the legal merits of the court’s decision, there is little doubt that the buildup to this deeply regressive decision was driven primarily by a religious belief that considers life to begin at conception. It’s a belief without any basis in science; in fact, as Carl Sagan and Ann Druyan wrote many years ago, if you factored in the science, then Roe v. Wade would seem to have drawn the line at the right point, when the fetus develops a nervous system and really distinguishes itself as a human. In fact, one of the tragedies of overturning Roe v. Wade is that the verdict struck a good balance between respecting the wishes of religious moderates and taking rational science into account.

But Evangelical Christians in the United States, of which there has been a dwindling and therefore proportionately bitter and vociferous number in recent years, don’t care about such lowly details as nervous systems (although they do seem to care about heartbeats, which ironically aren’t unique to humans). For them, all there is to know about when life begins has been written in a medieval book. Lest there be any doubt that this consequential decision by the court was religiously motivated, it’s worth reading a recent, detailed analysis by Laurence Tribe, a leading constitutional scholar. Tribe convincingly argues that the Catholic justices’ arguments were in fact rooted in the view that life begins at conception, a view on which the constitution is silent but religion has plenty to say.

The grim fact that we who care about things like due process and equality are dealing with here is that a minority of religious extremists continues to foist extremely regressive views on the majority of us who reject those views to different degrees. For a while it seemed that religiosity was declining in the United States. But now it appears that those of us who found this trend reassuring were too smug; it’s not the numbers of the religious that have mattered but the strength of their convictions, crucially applied over time like water dripping on a stone to wear the system down. And that’s exactly what they have wanted.

The third reason why I am feeling rather bitter about religion is a recent personal experience. I was invited to a religious event at an extremely devout friend’s place. I will not note the friend’s religion or denomination, to keep the story general and to avoid bias; similar stories could be told about any religion. My friend is a smart and kind man, and while I usually avoid religious events, I made an exception this time because I like him and also because I wanted to observe the event, much as an anthropologist would observe the customs of another tribe. What struck me from the beginning was the lack of inclusivity in the event. We were not supposed to go into certain rooms, touch certain objects or food, take photos of them or even point at them. We were supposed to speak in hushed tones. Most tellingly, we weren’t supposed to shake hands with my friend or touch him in any way because he was conducting the event in a kind of priestly capacity. What social or historical contexts in more than one society this behavior evokes, I do not need to spell out.

Now, my friend is well-meaning and was otherwise very friendly and generous, but all these actions struck me as emblematic of the worst features of religion, features meant to draw boundaries and divide the world into “us” and “them”. And the experience was again emblematic of Weinberg’s quote – an otherwise intelligent, kind and honest person was practicing strange, exclusionary customs because his holy book told him to do so, customs that otherwise would have been regarded as odd and even offensive. For normal people to do strange things, that takes religion.

Fortunately, these depressing thoughts about religion have, as their counterpart, hopeful thoughts about science. Everything about science makes it a different system. Nobody will issue a fatwa in science because a scientist says something that others disagree with or even find offensive, because if the scientist is wrong, the facts will decide one way or another. Nobody will carry out a decades-long vendetta to overturn a decision that the data support and the majority accepts. And certainly nobody will try to exclude anyone from doing a scientific experiment or proposing a theory just because they don’t belong to a particular tribe. All this is true even though science has its own priesthoods and has historically practiced forms of exclusion at one time or another. Scientists have their biases as much as anyone else – witness the right’s denial of climate science and the left’s opposition to parts of genetics research – but the great thing about science is that slowly but surely, it’s the facts about the world that decide truths, not authority or majority or minority opinion. Science is the greatest self-correcting system discovered by human beings, while religion allows errors to propagate for generations and centuries by invoking authority and faith.

Sadly, these recent developments have shown us that the destructive passions unleashed by religious faith continue to proliferate. Again and again, when those of us who value rationality and science think we have reached some kind of understanding with the religious or think that the most corrosive effects of religion are waning, along comes a Hadi Matar to try to end the life of a Salman Rushdie, and along comes a cohort of religious extremists to end the will of the majority. Religion may not be the root of all evil, but it’s the root of a lot of evil, and undoubtedly of the most diverse evil. That’s reason enough to oppose it with all our hearts and minds. It’s time to loudly sound the trumpets of rationalism and the scientific worldview again.

First published on 3 Quarks Daily.

Book review: "Unraveling the Double Helix: The Lost Heroes of DNA", by Gareth Williams.

Newton rightly declared that science progresses by standing on the shoulders of giants. But his often-quoted statement applies even more broadly than he thought. A case in point: when it comes to the discovery of DNA, how many have heard of Friedrich Miescher, Fred Griffith or Lionel Alloway? Miescher was the first person to isolate DNA, from the pus-soaked bandages of patients. Fred Griffith performed the crucial experiment proving that a ‘transforming principle’ was somehow passing from a virulent dead bacterium to a non-virulent live bacterium, magically rendering the non-virulent strain virulent. Lionel Alloway came up with the first expedient method to isolate DNA, by adding alcohol to a concentrated solution.

In this thoroughly engaging book, Gareth Williams brings these and other lost heroes of DNA to life. The book spans the first 85 years of DNA research and ends with Watson and Crick's discovery of the structure. There are figures both well-known and obscure here. Along with those mentioned above, there are excellent capsule histories of Gregor Mendel, Thomas Hunt Morgan, Oswald Avery, Rosalind Franklin, Maurice Wilkins and, of course, James Watson and Francis Crick. The book traces a journey through a variety of disciplines, most notably biochemistry and genetics, that were key in deciphering the structure of DNA and its role in transmitting hereditary characteristics.
Williams’s account begins with Miescher’s isolation of DNA from pus bandages in 1869. At that point in time, proteins were well-recognized, and all proteins contained a handful of elements like carbon, nitrogen, oxygen and sulfur. The one element they did not contain was phosphorus. It was Miescher’s discovery of phosphorus in his extracts that led him and others to propose the existence of a substance they called ‘nuclein’ that seemed ubiquitous in living organisms. The two other towering figures in the biochemical history of DNA are the German chemist Albrecht Kossel and the Russian-born American chemist Phoebus Levene. They figured out the exact composition of DNA and identified its three key components: the sugar, the phosphate and most importantly, the four bases (adenine, cytosine, thymine and guanine). Kossel was such a revered figure that his students led a torchlight procession through the streets from the train station to his lab when he came back to Heidelberg with the Nobel Prize.
Levene’s case is especially interesting since his identification of the four bases set DNA research back by years, perhaps decades. Because there were only four bases, he became convinced that DNA could never be the hereditary material because it was too simple. His ‘tetra-nucleotide hypothesis’ which said that DNA could only have a repeating structure of four bases doomed its candidacy as a viable genetic material for a long time. Most scientists kept on believing that only proteins could be complex enough to be the stuff of heredity.
Meanwhile, as the biochemists were unraveling the nature of DNA in their own way, the geneticists were paving the way in theirs. Williams has a brisk but vivid description of the lone monk Gregor Mendel toiling away at thousands of meticulous experiments on pea plants in his monastery in the Moravian town of Brünn. As we now know, Mendel was fortunate in picking the pea plant since it breeds true. Mendel’s faith in his own work was shaken toward the end of his life when he tried to duplicate his experiments using the hawkweed plant, whose genetics are more complex. Tragically, Mendel’s notebooks and letters were burnt after his death, and his work was forgotten for over thirty years before it was resurrected independently by three scientists, all of whom tried to claim credit for the discovery. The other major figure in genetics during the first half of the 20th century was Thomas Hunt Morgan, whose famous ‘fly room’ at Columbia University carried out experiments showing the presence of hundreds of genes at precise locations on chromosomes. In his lab, there was a large pillar on which Morgan and his students drew the locations of new genes.
From the work of Mendel, Morgan, Levene and Kossel we move on to New York City, where Oswald Avery, Colin MacLeod and Maclyn McCarty at the Rockefeller Institute and the sharp-tongued, erudite Erwin Chargaff at Columbia made two seminal discoveries about DNA. Avery and his colleagues showed that DNA is in fact the ‘transforming principle’ that Fred Griffith had identified. Chargaff showed that the proportions of A and T in DNA were the same, as were those of G and C. Williams says in the epilogue that of all the people who were potentially robbed of Nobel Prizes for DNA, the two most consequential were Avery and Griffith.
By this time, along with biochemistry and genetics, x-ray crystallography had become prominent in the study of molecules: by shining x-rays on a crystal and interpreting the resulting diffraction pattern, scientists could potentially figure out the structure of a molecule at the atomic level. Williams provides an excellent history of this development, starting with the Nobel Prize-winning father-son duo of William Henry and William Lawrence Bragg (who, at 25, remains the youngest science Nobel laureate) and continuing with other pioneering figures like J. D. Bernal, William Astbury, Dorothy Hodgkin and Linus Pauling.
Science is done by scientists, but it’s made possible by science administrators. Two major characters star in the DNA drama as science administrators par excellence. Both had their flaws, but without the institutions they set up to fund and encourage biological work, it is doubtful whether the men and women who discovered DNA and its structure would have made their discoveries when and where they did. William Lawrence Bragg repurposed the famed Cavendish Laboratory at Cambridge University – where Ernest Rutherford had reigned supreme – for crystallographic work on biological molecules. A parallel effort was started by John Randall, a physicist who had played a critical role in Britain’s efforts to develop radar during World War 2, at King’s College in London. While Bragg recruited Max Perutz, Francis Crick and James Watson for his group, Randall recruited Maurice Wilkins, Ray Gosling and Rosalind Franklin.
One of the strengths of Williams’s book is that it resurrects the role of Maurice Wilkins, who is often regarded as the least important of the Nobel Prize-winning trio of Watson, Crick and Wilkins. In fact, it was Wilkins and Gosling who took the first x-ray photographs of DNA that seemed to indicate a helical structure. Wilkins was also convinced that DNA and not protein was the genetic material when that view was still unfashionable; he passed on his infectious enthusiasm to Crick and Watson. But even before his work, the Norwegian crystallographer Sven Furberg had been the first to propose a helix – although a single one – as the structure of DNA, based on its density and other important features. A key feature of Furberg’s model was that the sugar and the base were perpendicular, which is in fact the case with DNA.
The last third of the book deals with the race to discover the precise structure of DNA. This story has been told many times, but Williams tells it exceptionally well and especially drives home how Watson and Crick were able to stand on the shoulders of many others. Rosalind Franklin comes across as a fascinating, complex, brilliant and flawed character. There was no doubt that she was an exceptional scientist who was struggling to make herself heard in a male-dominated establishment, but it’s also true that her prickly and defensive personality made her hard to work with. Unlike Watson, she was especially reluctant to build models, perhaps because she had identified a fatal flaw in one of the pair’s earlier models. It’s not clear how close Franklin came to identifying DNA as a helix; experimentally she came close, but psychologically she seemed reluctant and bounced back and forth between helical and non-helical structures.
So what did Watson and Crick have that the others did not? As I have described in a post written a few years ago on the 70th anniversary of the DNA structure, many others were in possession of key parts of the evidence, but only Watson and Crick put it all together and compulsively built models. In this sense it was very much like the story of the blind men and the elephant; only Watson and Crick moved around the entire animal and saw how it was put together. Watson’s key achievement was recognizing the precise base pairing: adenine with thymine and guanine with cytosine. Even here he was helped by the chemist Jerry Donohue, who corrected a key chemical feature of the bases (organic chemists will recognize it as keto-enol tautomerism). Also instrumental were Alec Stokes and John Griffith. Stokes was a first-rate mathematician who, using the theory of Bessel functions, figured out the diffraction pattern that would correspond to a helix; Crick, a physicist well-versed in the mathematics of diffraction, instantly understood Stokes’s work. Griffith was a first-rate quantum chemist who figured out, independently of Donohue, that A would pair with T and G with C. Before the advent of computers and what are called ab initio quantum chemical techniques, this was a remarkable achievement.
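For readers curious about what Stokes worked out, here is a minimal Python sketch of the textbook result (published by Cochran, Crick and Vand, and derived independently by Stokes): for a continuous helix of radius r, the diffracted amplitude on the n-th layer line varies as the Bessel function J_n(2πRr), where R is the radial coordinate in reciprocal space. Because the first peak of J_n moves outward as n grows, the layer-line maxima trace the famous "X" shape. The parameter values below are arbitrary illustrations of mine, not numbers from Williams's book:

    import numpy as np
    from scipy.special import jv  # Bessel function of the first kind

    r = 1.0                          # helix radius (arbitrary units)
    R = np.linspace(0.0, 1.0, 2000)  # reciprocal-space radial coordinate
    for n in range(5):               # the first few layer lines
        intensity = jv(n, 2 * np.pi * R * r) ** 2
        peak = R[np.argmax(intensity)]
        print(f"layer line {n}: intensity maximum near R = {peak:.2f}")
    # The maxima march outward with n, tracing the 'X' of a helix.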
With Chargaff’s knowledge of the constancy of base ratios, Donohue’s precise base structures, Franklin and Gosling’s x-ray measurements and Stokes’s mathematics of helix diffraction patterns, Watson and Crick had all the information they needed to try out different models and cross the finish line. No one else had this entire map of information at their disposal. The rest, as they say, is history.
I greatly enjoyed reading Williams’s book. It is perhaps the best book on the DNA story that I have read since Horace Freeland Judson’s “The Eighth Day of Creation”. Even characters I was familiar with come newly to life as flawed, brilliant human beings with colorful lives. The account shows that many major and minor figures made important discoveries about DNA. Some came close to figuring out the structure but never made the leap, either because they lacked data or because of personal prejudices. Taken as a whole, the book showcases well the intrinsically human story and the group effort, playing out over 85 years, at the heart of one of the greatest discoveries that humanity has made. I highly recommend it.

Brian Greene and John Preskill on Steven Weinberg


There's a very nice tribute to Steven Weinberg by Brian Greene and John Preskill that I came across recently and that is worth watching. Weinberg was of course one of the greatest theoretical physicists of the latter half of the 20th century, winning the Nobel Prize for one of the great unifications of modern physics: the unification of the electromagnetic and weak forces. He was also a prolific author of rigorous, magisterial textbooks on quantum field theory, gravitation and other aspects of modern physics. And on top of it all, he was a true scholar and a gifted communicator of complex ideas to the general public through popular books and essays; not just ideas in physics but ones in pretty much any field that caught his fancy. I had the great pleasure and good fortune to interact with him twice.

The conversation between Greene and Preskill is illuminating because it sheds light on many underappreciated qualities that enabled Weinberg to become a great physicist and writer, qualities that are worth emulating. Greene starts out by recalling his first interaction with Weinberg, when as a graduate student he gave a talk at the physics department of the University of Texas at Austin, where Weinberg taught. He recalls how he packed the talk with equations and formal derivations, only to hear Weinberg explain the same concepts more clearly later. As physicists appreciate, while mathematics remains the key that unlocks the secrets of the universe, being able to grasp the physical picture is just as important. Weinberg was a master at both.

Preskill was a graduate student of Weinberg's at Harvard, and he shares many memories of Weinberg. One of the more endearing and instructive ones is from when he introduced Weinberg to his parents at their house. They were making ice cream for dinner, and Weinberg wondered aloud why we add salt while making ice cream. By that time Weinberg had already won the Nobel Prize, so Preskill's father wondered if he genuinely didn't understand that you add the salt to lower the melting point of the ice, so that the ice-salt mixture gets colder and freezes the cream. When Preskill's father explained this, Weinberg went, "Of course, that makes sense!". Now, both Preskill and Greene think that Weinberg might have been playing it up a bit to impress Preskill's family, but I wouldn't be surprised if he genuinely did not know; top-tier scientists who work in the most rarefied heights of their fields are sometimes not as connected to basic facts as graduate students might be.
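Incidentally, the physics Weinberg was being quizzed on fits in a few lines. Here is a minimal Python sketch using the ideal freezing-point-depression formula dT = i*Kf*m; the brine recipe below is a hypothetical example of mine, and real ice-cream brines are more concentrated and less ideal than this:

    # Freezing-point depression of salt water: dT = i * Kf * m
    # Kf (water) = 1.86 K*kg/mol; i = 2 because NaCl dissociates into two ions.
    molar_mass_nacl = 58.44          # g/mol
    salt_g, water_kg = 100.0, 1.0    # hypothetical brine recipe
    molality = (salt_g / molar_mass_nacl) / water_kg
    dT = 2 * 1.86 * molality
    print(f"Brine freezes roughly {dT:.1f} K below 0 C")  # ~6.4 K
    # The ice-salt bath therefore sits well below 0 C: cold enough
    # to freeze the cream mixture instead of merely chilling it.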

More importantly, to my mind the anecdote illustrates an important quality that Weinberg had and that any true scientist should have: never hesitating to ask even simple questions. If, as a Nobel Prize-winning scientist, you think you are beyond asking simple questions, especially when you don't know the answers, you aren't being a very good scientist. The anecdote points to a bigger quality that Preskill and Greene discuss, which was Weinberg's lifelong curiosity about things he didn't know. He never hesitated to pump people for information about aspects of physics he wasn't familiar with, not to mention other disciplines. Freeman Dyson, whom I knew well, had the same quality: both Weinberg and Dyson were excellent listeners. In fact, asking the right question, whether it was about salt and ice cream or about electroweak unification, seems to have been a signature Weinberg quality that students should take to heart.

Weinberg became famous for a seminal 1967 paper that unified the electromagnetic and weak forces (and used ideas developed by Peter Higgs to postulate what we now call the Higgs boson). The title of the paper was "A Model of Leptons", but interestingly, Weinberg wasn't much of a model builder. As Preskill says, he was much more interested in developing general, overarching theories than in building models, partly because models have limited applicability to a specific domain while theories are much more general. This is a good point, but of course, in fields like my own field of computational chemistry, the problem isn't that there are no general theoretical frameworks - there are, most notably quantum mechanics and statistical mechanics - but that applying them to practical problems is too complicated unless we build specific models. Nevertheless, Weinberg's attitude of shunning specific models for generality is emblematic of the greatest scientists, including Newton, Pauling, Darwin and Einstein.

Weinberg was also a rather solitary researcher; as Preskill points out, of his 50 most highly cited papers, 42 were written alone. He himself admitted in a talk that he wasn't the best collaborator. This did not make him the best graduate advisor either, since, while he was supportive, his main contribution was more along the lines of inspiration than guidance and day-to-day conversations. He would often point students to papers and ask them to study the material themselves, which works fine if you are Brian Greene or John Preskill but perhaps not so much if you are someone else. In this sense Weinberg seems to have been a bit like Richard Feynman, who was a great physicist but also wasn't the best graduate advisor.

Finally, both Preskill and Greene touch upon Weinberg's gifts as a science writer and communicator. Unlike many other scientists, he never talked down to his readers, because he understood that many of them were as smart as he was even if they weren't physicists. Read any one of his books and you see him explaining even simple ideas carefully, but never in a way that assumes his audience are dunces. This is a lesson that every scientist and science writer should take to heart.

Greene knew Weinberg especially well because he often invited him to the World Science Festival, which he and his wife have organized in New York over the years. The tribute includes snippets of Weinberg talking about the current and future state of particle physics. At the end, an interviewer asks him about what is arguably the most famous sentence from his popular writings. Toward the end of his first book, "The First Three Minutes", he wrote, "The more the universe seems comprehensible, the more it seems pointless." Weinberg's eloquent response when asked what this means sums up his life's philosophy and tells us why he was so unique, as a scientist and as a human being:

"Oh, I think everything's pointless, in the sense that there's no point out there to be discovered by the methods of science. That's not to say that we don't create points for our lives. For many people it's their loved ones; living a life of helping people you love, that's all the point that's needed for many people. That's probably the main point for me. And for some of us there's a point in scientific discovery. But these points are all invented by humans and there's nothing out there that supports them. And it's better that we not look for it. In a way, we are freer, in a way it's more noble and admirable to give points to our lives ourselves rather than to accept them from some external force."

A long time ago, in a galaxy far, far away

For a brief period earlier this week, social media and the world at large seemed to stop squabbling about politics and culture and united in a moment of wonder as the James Webb Space Telescope (JWST) released its first stunning images of the cosmos. These "extreme deep field" images represent the farthest and the oldest that we have been able to see in the universe, surpassing even the amazing images captured by the Hubble Space Telescope that we have become so familiar with. We will soon see these photographs decorating the walls of classrooms and hospitals everywhere.

The scale of the JWST images is breathtaking. Each dot represents a galaxy or nebula from far, far away. Each galaxy or nebula is home to billions of stars in various stages of life and death. The curved light in the image comes from a classic prediction of Einstein's general theory of relativity called gravitational lensing - the bending of light by gravity that makes spacetime curvature act like a lens. 
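As a small illustration of the physics at play, here is a minimal Python sketch of the classic general-relativistic light-bending formula for a point mass, alpha = 4GM/(c^2 b), evaluated for light grazing a sun-like star; the famous ~1.75-arcsecond answer is what the 1919 eclipse expedition tested. The constants are standard values, not numbers from the JWST analysis:

    import math

    # Deflection of light grazing a point mass: alpha = 4*G*M / (c^2 * b)
    G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8         # speed of light, m/s
    M = 1.989e30        # mass of the sun, kg
    b = 6.957e8         # impact parameter: the solar radius, m

    alpha = 4 * G * M / (c**2 * b)              # deflection in radians
    arcsec = alpha * (180 / math.pi) * 3600     # convert to arcseconds
    print(f"Deflection at the sun's limb: {arcsec:.2f} arcseconds")  # ~1.75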

Some of the stars in these distant galaxies and nebulae are being nurtured in stellar nurseries; others are in their end stages and might be turning into neutron stars, supernovae or black holes. And since light from more distant objects takes longer to reach us, the farther out we look, the further back in time we see. This makes the image a gigantic hodgepodge of older and newer photographs, ranging from objects dating to a few hundred million years after the Big Bang to very close (on a cosmological timescale) objects like the Carina Nebula, only a few thousand light years away, and Stephan's Quintet, a group of galaxies a few hundred million light years away.

It is a significant and poignant fact that we are seeing objects not as they are but as they were. The Carina Nebula is 8,500 light years away, so we are seeing it as it looked 8,500 years ago, during the Neolithic Age, when humanity had just taken to farming and agriculture. On the oldest timescale, objects that are billions of light years away look the way they did during the universe's childhood. The fact that we are seeing old photographs of stars, galaxies and nebulae gives the image a poignant quality. For a younger audience that has grown up with Facebook, imagine being presented with a hodgepodge of images of people from Facebook over the last fifteen years: some of the people are alive and some no longer so, and some look very different from when their photo was last taken. It would be a poignant feeling. But the JWST image also fills me with joy. Looking at the vast expanse, the universe feels not like a cold, inhospitable place but like a living thing pulsating with old and young blood. We are a privileged part of this universe.

There's little doubt that one of the biggest questions stimulated by these images will be whether we can detect any signatures of life on one of the many planets orbiting the stars the JWST sees. By now we have discovered thousands of extrasolar planets, so there's no doubt that there will be many more in the regions the JWST is capturing. Analysis of the telescope's data already indicates a steamy atmosphere containing water on a planet about 1,150 light years away. Detecting elements like nitrogen, carbon, sulfur and phosphorus is a good start to hypothesizing about the presence of life, but much more would be needed to clarify whether these elements arise from an inanimate process or a living one. It may seem impossible that a landscape as gargantuan as this one is completely barren of life, but given the improbability of life - especially intelligent life - arising through a series of accidents, we may have to search very wide and long.

I was gratified when my twitter timeline - otherwise mostly a cesspool of arguments and ad hominem attacks punctuated by all-too-rare tweets of insight - was completely flooded with the first images taken by the JWST. The images proved that humanity is still capable of coming together and focusing on a singular achievement of science and technology, however briefly. Most of all, they prove both that science is indeed bigger than all of us and that we can comprehend it if we put our minds and hands together. It's up to us to decide whether we distract ourselves and blow ourselves up with our petty disputes or explore the universe revealed by the JWST and other feats of human ingenuity in all its glory.

Image credits: NASA, ESA, CSA and STScI

Book Review: "The Rise and Reign of the Mammals: A New History, From the Shadows of the Dinosaurs to US", by Steve Brusatte

A terrific book by Edinburgh paleontologist Steve Brusatte on the rise of the mammals. Engaging, personal and packed with simple explanations and analogies. Brusatte tracks the evolution of mammals from about 325 million years ago, when our reptilian ancestors split into two groups - the synapsids and the diapsids. The diapsids gave rise to reptiles like crocodiles and snakes, while the synapsids eventually gave rise to us. The synapsids evolved with a hole behind the eye socket; it’s now covered with a set of muscles which you can feel if you touch your cheek while chewing.

Much of the book focuses on how mammals evolved different anatomical and physiological functions against the backdrop of both catastrophic and gradual climate change, including the shifting of the continents and major extinctions driven by volcanic eruptions, meteors (as in the K-T extinction event that killed the dinosaurs), sea level rises and ice ages. That mammals survived these upheavals is partly a result of chance and partly a result of some remarkable adaptations which the author spends considerable time describing. These adaptations include milk production, temperature regulation, hair, bigger brains and stable locomotion, among others.
Some of these changes were simple but significant. For instance, a principle known as Carrier's constraint limits lung capacity in slithering reptiles because each lung is alternately compressed during sidewinding motion. When mammalian ancestors lifted their bodies off the ground and evolved a set of bones that braced the rib cage, their lungs could keep taking in oxygen while the animal moved and while it ate. Needless to say, the ability to breathe while moving and eating was momentous for survival in an environment in which predators abounded.
Another adaptation was the development of the specialized set of teeth that marks all mammals, including humans - the incisors, canines, premolars and molars. Because these teeth form a specialized, complex apparatus, they emerge only twice in mammals - once during infancy and once more during adulthood. But our chewing apparatus gave rise to another remarkable adaptation: in an evolutionary migration spread out over millions of years, bones of the jaw became the bones of the ear. The ear bones are a set of finely orchestrated, sensitive sound detectors that gave mammals an acute sense of hearing and enabled them to seek out mates and avoid predators.
Quite naturally, the book spends a good amount of time on the mystery of why mammals survived the great meteor impact that wiped out the dinosaurs and much of other life on the planet. Except that it's no mystery. Dinosaurs were bulky, specialized, cold-blooded eaters that lived fully exposed. Mammals were furry, rodent-like, warm-blooded omnivores which could hide out underground and eke out an existence on charred vegetation and dead flesh in the post-apocalyptic environment. After the K-T event, there was no turning back for mammals.
The rest of the book discusses particular features of mammalian evolution, like flight in bats and the odd monotremes, such as the egg-laying duck-billed platypus. A particularly memorable discussion concerns the whales, the biggest mammals that have ever lived, which evolved from land mammals that would occasionally take to water to escape predators and seek out new food. With their exceptionally big brains and bat-like echolocation, whales remain a wonder of nature.
Brusatte also spices up his account with adventurous stories of intrepid paleontologists and archeologists who have dug up pioneering fossils in extreme environments ranging from the blistering tropical forests of Africa to the Gobi Desert of Mongolia. Paleontology comes across as a truly international endeavor, with Chinese paleontologists making especially significant contributions; they were among the first, for instance, to discover a feathered dinosaur, attesting to the reptile-to-bird evolutionary transition. And unlike in the old days, when Victorian men did most of the digging, women now make up a healthy percentage of the field.
Human evolution occupies only a few chapters of Brusatte’s book, and for good reason. While humans occupy a unique niche because of their intelligence, evolutionarily they are no more special or fascinating than whales, bats, platypuses, elephants or indeed the earliest synapsids. What we can take heart from is the fact that we are part of an unbroken thread of evolution running through all these creatures. Mammals have survived catastrophic extinctions and climate change events; humans are now responsible for causing one. Whether they engineer their own extinction or show the kind of adaptability their ancestors did is a future only they can determine.

Book Review: "Don't Tell Me I Can't: An Ambitious Homeschooler's Journey", by Cole Summers (Kevin Cooper)

I finished this book with a profound sense of loss combined with an inspired feeling of admiration for what young people can do. Cole Summers grew up in the Great Basin Desert region of Nevada and Utah with a father who had tragically become confined to a wheelchair after an accident in military training. His parents were poor, but they wanted Cole to become an independent thinker and doer. From the time he was a small child, they never said "No" to him and let him try everything he wanted to. When four-year-old Cole wanted to plant and grow a garden, they let him, undeterred by the minor cuts and injuries along the way.

Partly because of financial reasons and partly because there were no good schools available in their part of town, Cole's parents decided to homeschool him. But homeschooling for Cole happened on his terms. When they saw him watching Warren Buffett, Charlie Munger and Bill Gates videos on investing and business, they told him it was ok to learn practical skills by watching YouTube videos instead of reading school books. Cole describes many lessons he learnt from Munger and Buffett about patience and common mistakes in investing. When other kids were reciting the names of planets, Cole was reading company balance sheets and learning how to write off payroll expenses as tax deductions through clever investing.

This amazing kid had, by the age of fourteen, started two businesses - one raising rabbits and one farming. He parlayed his income into buying a beat-up house and a sophisticated John Deere tractor. He fixed up the house from scratch, learning everything about roofing, flooring, cabinet installation and other important aspects of construction from YouTube videos and from some local experts. He learnt, sometimes through hard experience, how to operate a tractor and farm his own land. He made a deep study of the Great Basin desert water table, which is dropping a few feet every year, and came up with a novel, detailed proposal to stem the decline by planting low-water plants. He also proposed solutions to the supply chain problems plaguing timber and farm equipment.

A week or two ago, Cole and his brother were kayaking and horsing around in a local reservoir when Cole drowned. He leaves behind a profound sense of loss at an incredible life snuffed out too young, and a deep wisdom that most of us, having lived our entire lives, still don't possess.

The main lesson Cole wants to leave us with is to let kids do what they want, to not tell them they can't do things, and to give them the freedom to explore and spend unhurried time learning in unconventional ways. He rightly says that we have structured parenting in such a way that every minute of a kid's day is oversubscribed. He is also right that many modern parents err too far on the side of caution.

That oversubscribed style was certainly not the way my parents let me use my time when I was growing up: I was free to explore the local hills looking for insects, haunt libraries reading books and do dangerous experiments in my home lab from an early age. There is little doubt that this relaxed style of parenting on my parents' part significantly contributed to who I am.

I strongly believe that if you let kids do what they want (within some limits, of course), not only will they turn out ok but they will do something special. Cole Summers seems to me to be the epitome of this ideal. May we all, parents and kids, learn from his extraordinary example and memory.