Field of Science

Galton's "Hereditary Genius" (1871)




As someone who loves collecting vintage books, I was stoked to acquire a first American edition of Francis Galton's pioneering book “Hereditary Genius” for the bizarrely low price of $25; most copies in good condition like this one sell for at least a few hundred dollars.

First published in 1869, “Hereditary Genius” is an important book in the history of science as well as a good example of how racist ideas can be respectable in their own time. Galton was a statistician, geneticist and brilliant polymath who was one of the founders of statistics (among other ideas, he developed the concept of regression to the mean) and of biometry, or biological measurement. He was also Darwin’s cousin and was heavily influenced by Darwin’s ideas on natural selection and survival of the fittest.
His book was the first to make a serious and fairly exhaustive case that intelligence is inherited. He made this case three decades before the rediscovery of Gregor Mendel's work revealed the nature of genes. To do this Galton made a detailed survey of what he called “eminent men” (no women, although he acknowledges this deficiency) and traced their lineages through several generations, making the case that intelligence was preserved across them. The eminent men included men as diverse as scientists, poets, writers, “divines”, “oarsmen” and “wrestlers from the north country”.
The book is clearly written and argued and was hugely successful both in Europe and the United States. Darwin was smitten by it and wrote:
“I have only read 50 pages of your book (to Judge), but I must exhale myself, else something will go wrong with my inside. I do not think I ever in all of my life read anything more interesting and original—and how well and clearly you put every point!”
But the book was a double-edged sword. While the hereditary nature of intelligence is now accepted, Galton ended up making the case for eugenics, social Darwinism and the superiority of certain races (the examples in his book are all Caucasian), arguments that were unsurprising for the times he lived in. While today the book is considered clearly incomplete and flawed, because of its novelty, its clarity and the reputation of its author it became a rallying cry for eugenicists and white supremacists, especially in the United States, who advocated the culling of “inferior stock” to preserve intelligent races, which in their view naturally meant the Anglo-Saxon race.
An important and readable book, very much a product of its times, correct in certain fundamental ways but incorrect, incomplete and dangerous in others.

John Polkinghorne's "Belief in God in an Age of Science"

A book I have been enjoying recently is John Polkinghorne's "Belief in God in an Age of Science." Polkinghorne, who died recently, was a noted theoretical physicist who was also a theologian. Unlike Polkinghorne I am an atheist, but he makes a good case for why religion, science, poetry, art and literature should all be welcomed as sources of truth about the universe and about human beings. A quote I particularly like from it:

"If we are seeking to serve the God of truth then we should really welcome truth from whatever source it comes. We shouldn’t fear the truth. Some of it will be from science, obviously, but by no means all of it. It will sometimes be perplexing, how this bit of truth relates to that bit of truth; we know that within science itself often enough and we find it outside of science as well. The crucial thing is to be honest.”
I would quibble with the catch-all definition of truth in Polkinghorne's quote (scientific "truth" by its very nature is tentative) but otherwise agree. In my scientific career I have found this as well. Often Tolstoy or the Bhagavad Gita or Bach have taught me deep truths about human beings that I never found in any physics or chemistry or mathematics textbook. The great thing about human life is its diversity. Science is the most important thing that enriches it, but it's not the only one. That's a good thing. These multiple sources of truth should keep us busy for as long as there is a human species.

Tolman, “The Principles of Statistical Mechanics”, Chapter 1, Part 1

Survey of classical mechanics: generalized coordinates and momenta, the Lagrangian equations, derivation of Hamilton’s equations from the Lagrangian, Poisson brackets, and the Hamiltonian as representing the invariant energy E for conservative systems.
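For quick reference, the standard relations surveyed above can be written compactly (this is textbook material, not Tolman's specific notation):

```latex
% Conjugate momenta and the Hamiltonian, from the Lagrangian L(q, \dot{q}, t)
p_i = \frac{\partial L}{\partial \dot{q}_i}, \qquad
H(q, p, t) = \sum_i p_i \dot{q}_i - L

% Hamilton's equations of motion
\dot{q}_i = \frac{\partial H}{\partial p_i}, \qquad
\dot{p}_i = -\frac{\partial H}{\partial q_i}

% Time evolution of any phase-space function f via the Poisson bracket
\frac{df}{dt} = \{f, H\} + \frac{\partial f}{\partial t}, \qquad
\{f, g\} = \sum_i \left( \frac{\partial f}{\partial q_i}\frac{\partial g}{\partial p_i}
                       - \frac{\partial f}{\partial p_i}\frac{\partial g}{\partial q_i} \right)

% For conservative systems (\partial H/\partial t = 0), H is the invariant energy E
\frac{dH}{dt} = \frac{\partial H}{\partial t} = 0 \;\Rightarrow\; H = E
```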

“Pull quote”: Something simple and seemingly obvious but actually deep and foundational

Some notes (not checked for typos!)





100 Desert Island Books

Finally got around to making that "100 books I would want on a desert island" list. Another title would be "100 books that I consider essential reading for *my* life": this is a personal selection. I don't claim that this list covers the most important aspects of human life or the universe, nor do I expect "famous" books to be on it (although some of them are). The list just reflects my personal interests - history and philosophy of science has the most entries, followed by science textbooks, general history, philosophy and theology, and a tiny sliver of fiction (I started reading fiction seriously only recently). One condition in listing these books was that I should have read them in their entirety: this is true of all of them except "Gödel, Escher, Bach", which I think I am going to keep soldiering through my whole life. I am very privileged to call some of the authors here my friends.


One common thread running through most of these books is that I discovered them early, when I was in high school, college and graduate school, in most cases in either the college or university library or the British Library which was a stone's throw from where I grew up. Early impressions are often the strongest, so I keep coming back to these volumes and they keep inspiring and instructing me.

I have thousands of books on my shelves and always find it hard to give any away. There are many others I love that I haven't listed here, but if I actually had just these 100 (110 to be precise), I wouldn't be entirely depressed (just don't tell my significant other...).

HISTORY AND PHILOSOPHY OF SCIENCE (INCLUDING BIOGRAPHY AND AUTOBIOGRAPHY)

Richard Rhodes - The Making of the Atomic Bomb
Richard Rhodes - Dark Sun: The Making of the Hydrogen Bomb
Freeman Dyson - Disturbing the Universe
Freeman Dyson - Infinite in All Directions
George Dyson - Turing’s Cathedral
George Dyson - Darwin Among the Machines
Edward Wilson - Naturalist
Edward Wilson - Consilience
James Gleick - Chaos
John Horgan - The End of Science
Robert Serber - Peace and War
Jeremy Bernstein - Hans Bethe: Prophet of Energy
Silvan Schweber - In the Shadow of the Bomb
Silvan Schweber - QED and the Men Who Made It
David Kaiser - Drawing Theories Apart
Kip Thorne - Black Holes and Time Warps
Robert Kanigel - The Man Who Knew Infinity
Paul Hoffman - The Man Who Loved Only Numbers
Robert Crease and Charles Mann - The Second Creation
Douglas Hofstadter - Gödel, Escher, Bach
Alice Kimball Smith and Charles Weiner - Robert Oppenheimer: Letters and Recollections
Peter Galison - Image and Logic
Emanuel Derman - My Life as a Quant
Kameshwar Wali - Chandra
John Gribbin - In Search of Schrödinger’s Cat
John Casti - Paradigms Lost
John Casti - The Cambridge Quintet
John Casti - Gödel: A Life in Logic
George Johnson - Strange Beauty
Roger Penrose - The Emperor’s New Mind
Roger Penrose - The Road to Reality
Richard Dawkins - Climbing Mount Improbable
Gerald Durrell - My Family and Other Animals
Konrad Lorenz - King Solomon’s Ring
Robert Laughlin - A Different Universe
Horace Freeland Judson - The Eighth Day of Creation
Peter Michelmore - The Swift Years: The Robert Oppenheimer Story
Richard Feynman - Surely You’re Joking Mr. Feynman
Stanislaw Ulam - Adventures of a Mathematician
Laura Fermi - Atoms in the Family
Werner Heisenberg - Physics and Philosophy
Ronald Clark - Einstein
Steven Pinker - The Blank Slate
David Deutsch - The Beginning of Infinity
Steven Weinberg - Dreams of a Final Theory
J. Robert Oppenheimer - The Open Mind
Stuart Kauffman - Reinventing the Sacred
Barry Werth - The Billion Dollar Molecule
Oliver Sacks - On the Move
Carl Sagan - The Demon-Haunted World
Max Perutz - I Wish I’d Made You Angry Earlier
Jonathan Allday - Quarks, Leptons and the Big Bang
Philip Ball - H2O: A Biography of Water
Philip Ball - The Self-Made Tapestry
Alan Lightman - Einstein’s Dreams
Alan Lightman - The Accidental Universe
Brown, Pais and Pippard - Twentieth Century Physics (3 volumes)
Ed Regis - Who Got Einstein’s Office?
C. P. Snow - The Physicists

TEXTBOOKS

Ira Levine - Quantum Chemistry
Peter Atkins - Molecular Quantum Mechanics
Lubert Stryer - Biochemistry
Albert Lehninger - Biochemistry
George Simmons - Introduction to Topology and Modern Analysis
George Simmons - Differential Equations
Richard Feynman - The Feynman Lectures on Physics
David Griffiths - Introduction to Electrodynamics
John Lee - Inorganic Chemistry
Samuel Glasstone - Sourcebook on Atomic Energy
Samuel Glasstone - Thermodynamics for Chemists
Arthur Beiser - Concepts of Modern Physics
Gautam Desiraju - The Weak Hydrogen Bond
Linus Pauling - The Nature of the Chemical Bond
Linus Pauling and Edward Bright Wilson - Introduction to Quantum Mechanics
Clayden, Warren, Reeves and Wothers - Organic Chemistry
Eric Anslyn and Dennis Dougherty - Modern Physical Organic Chemistry
Wells, Wells and Huxley - The Science of Life
Goodman and Gilman - The Pharmacological Basis of Therapeutics
Jerry March - Advanced Organic Chemistry

HISTORY

Barbara Tuchman - The Guns of August
William Shirer - The Rise and Fall of the Third Reich
James Swanson - Manhunt: The 12-Day Chase for Lincoln’s Killer
David McCullough - Truman
James Scott - Against the Grain
James McPherson - Battle Cry of Freedom
Gordon Wood - Empire of Liberty
John Barry - Roger Williams and the Creation of the American Soul
Bernard Bailyn - The Ideological Origins of the American Revolution
Robert Caro - The Years of Lyndon Johnson (Vols. 1-4)
Rick Atkinson - An Army at Dawn
Will Durant - Our Oriental Heritage
Russell Shorto - The Island at the Center of the World
Nick Bunker - An Empire on the Edge
Brad Gregory - Rebel in the Ranks
Cornelius Ryan - The Longest Day

PHILOSOPHY AND THEOLOGY

Sam Harris - The End of Faith
David Edmonds and John Eidinow - Wittgenstein's Poker
Plato - The Republic
Matthew Stewart - The Courtier and the Heretic
Isaiah Berlin - The Proper Study of Mankind
Bertrand Russell - Unpopular Essays
Bertrand Russell - Why I am Not a Christian

FICTION

Vasily Grossman - Life and Fate
Haruki Murakami - What I Talk About When I Talk About Running
Cormac McCarthy - Blood Meridian
Cormac McCarthy - The Road
Isaac Asimov - Asimov’s Mysteries
Cordwainer Smith - No, No, Not Rogov! (this is a single story but it is very striking in its vividness and poetry and made a deep impression)
Leo Tolstoy - War and Peace
Fyodor Dostoevsky - Notes from the Underground
William Faulkner - As I Lay Dying
H. G. Wells - The Time Machine
Chekhov - Stories

Brenner, von Neumann and Schrödinger

Erwin Schrödinger's book "What Is Life?" inspired many scientists, like Crick, Watson and Perutz, to go into molecular biology. While many of the details in the book were wrong, its central message, that the time was ripe for a concerted attack on the structure of the gene based on physical principles, strongly resonated.

However, influence and importance are two different things, and unfortunately the two aren't always correlated. As Sydney Brenner recounts in detail here, the founding script for molecular biology should really have been John von Neumann's 1948 talk at Caltech, delivered as part of the Hixon Symposium and titled "The General and Logical Theory of Automata". In retrospect the talk was seminal and far-reaching. Brenner is one of the very few scientists who seems to have appreciated that von Neumann's influence on biology was greater than Schrödinger's, and that on the central question von Neumann was right and Schrödinger wrong. Part of the reason is that while many biologists like Crick and Watson had read Schrödinger's "What Is Life?", almost nobody had read von Neumann's "The General and Logical Theory of Automata".

As Brenner puts it, Schrödinger postulated that the machinery for replication (chromosomes) also included the means of reproducing it. Von Neumann realized that the machinery did not include the means themselves but only the *instructions* for those means.
That's a big difference; the instructions are genes, the means are proteins. In fact as Freeman Dyson says in his "Origins of Life", von Neumann was the first to clearly realize the distinction between software (genes) and hardware (proteins). Why? Because as a mathematician and a generalist (and pioneer of computer science), he had a vantage point that was unavailable to specialist biologists and chemists in the field.

Unfortunately abstract generalists are often not recognized as the true originators of an idea. It's worth noting that in his lecture, von Neumann laid out an entire general program for what we now call translation, five years before Watson, Crick, Franklin and others even solved the structure of DNA. The wages of the theoretician are sparse, especially for the one who, as mathematician John Casti put it, solves "only" the general case.

On change

Two weeks ago, outside a coffee shop near Los Angeles, I discovered a beautiful creature, a moth. It was lying still on the pavement and I was afraid someone might trample it, so I gently picked it up and carried it to a clump of garden plants on the side. First, though, I showed it to my 2-year-old daughter, who let it walk slowly over her arm. The moth was brown and huge, almost the size of my hand. It had the feathery antennae typical of a moth and two black eyespots on its wings. It moved slowly and gradually disappeared into the protective shadow of the plants when I put it down.

Later I looked up the species on the Internet and found that it was a male Ceanothus silk moth, common in the western United States. The reason it’s not seen very often is that the males live only for about a week or two after they take flight. During that time they don’t eat; their only purpose is to mate and die. When I read about it I realized that I had held in my hand a thing of indescribable beauty, indescribable precisely because of the brevity of its life. Then I realized that on a relative scale our own lives are perhaps not all that long either. Assuming that an average human lives for about 80 years, the moth’s lifespan is about 2000 times shorter than ours. But our lifespans are in turn much shorter than those of redwood trees. Might we not appear to redwood trees the way Ceanothus moths or ants appear to us, brief specks of life fluttering for an instant and then disappearing? The difference, as far as we know, is that unlike redwood trees we can consciously understand this impermanence. Our lives are no less beautiful because on a relative scale of events they are no less brief. They are brief instants in the lives of redwood trees just as redwood trees’ lives are brief instants in the intervals between the lives of stars.
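The arithmetic behind these comparisons is easy to check; a minimal sketch, using the 2-week and 80-year figures above plus a commonly cited ~2,000-year coast redwood lifespan (the redwood figure is my assumption, not from the text):

```python
# Rough lifespan comparisons, everything converted to weeks.
WEEKS_PER_YEAR = 52

moth_weeks = 2                        # a male Ceanothus silk moth's flight period
human_weeks = 80 * WEEKS_PER_YEAR     # an 80-year human life
redwood_weeks = 2000 * WEEKS_PER_YEAR # an old coast redwood (assumed ~2,000 years)

human_vs_moth = human_weeks / moth_weeks        # ~2,080, i.e. "about 2000 times"
redwood_vs_human = redwood_weeks / human_weeks  # redwoods outlast us ~25-fold

print(f"A human life is ~{human_vs_moth:.0f}x a moth's flight period")
print(f"A redwood's life is ~{redwood_vs_human:.0f}x a human's")
```

The ratios are not symmetric - we outlive the moth by a far larger factor than the redwood outlives us - but the point about relative scales stands either way.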

I have been thinking about change recently, perhaps because it’s the standard thing to do for someone in their forties. But as a chemist I have thought about change a great deal in my career. The gist of a chemist’s work deals with the structure of molecules and their transformations into each other. The molecules can be natural or synthetic. They can be as varied as DNA, nylon, chlorophyll, rocket fuel, cement and aspirin. But what connects all of them is change. At some point in time they did not exist and came about through the union of atoms of carbon, oxygen, hydrogen, phosphorus and other elements. At some point they will cease to be and those atoms will become part of some other molecule or some other life form.

Sometimes popular culture can capture the essence of science and philosophy well. In this case, chemistry as change was captured eloquently by the character of Walter White in the TV show “Breaking Bad”. In his first lecture as a high school chemistry teacher White says,

“Chemistry is the study of matter. But I prefer to think of it as the study of change. Now, just think about this. Electrons change their energy levels. Elements, they change and combine into compounds. Well, that’s…that’s all of life, right? It’s the constant, it’s the cycle, it’s solution, dissolution, just over and over and over. It’s growth, then decay, then transformation. It is fascinating, really.”

Changes in the structure of atoms and molecules are ultimately dictated by the laws of atomic physics and the laws of thermodynamics. The second law of thermodynamics, which loosely states that disorder is more likely than order, guarantees that change will occur. At its root the second law is an argument from probability: there are simply many more ways for a system to be disordered than to be ordered. The miracle of life and the universe at large is that complex systems like biological ones can briefly and locally defy the second law, assembling order from disorder and letting it persist for a few short decades, during which that order can do astonishing things like make music and art and solve mathematical equations enabling it to understand where it came from. The biologist Carl Woese once gave an enduringly beautiful metaphor for life, comparing it to a child playing in a stream.
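The counting argument can be made concrete with a toy model; a minimal sketch, assuming a "system" of 10 coins where "perfectly ordered" means all heads (my illustration, not from the text):

```python
from math import comb

# Toy model of the second law's counting argument: 10 two-state particles
# (coins). A "macrostate" is the number of heads; each distinct head/tail
# sequence is one equally likely "microstate".
n = 10
total_microstates = 2 ** n        # 1024 possible sequences
all_heads = comb(n, n)            # exactly 1 perfectly ordered microstate
half_heads = comb(n, n // 2)      # 252 maximally mixed microstates

print(f"P(all heads)  = {all_heads / total_microstates:.4f}")
print(f"P(half heads) = {half_heads / total_microstates:.4f}")
```

Even with 10 particles, disorder is already 252 times likelier than perfect order; with the ~10^23 particles of everyday matter, the imbalance becomes so overwhelming that the drift toward disorder is effectively a law.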

“If not machines, what are organisms? A metaphor far more to my liking is this. Imagine a child playing in a woodland stream, poking a stick into an eddy in the flowing current, thereby disrupting it. But the eddy quickly reforms. The child disperses it again. Again it reforms, and the fascinating game goes on. There you have it! Organisms are resilient patterns in a turbulent flow—patterns in an energy flow.”

Woese’s metaphor perfectly captures both the permanence and impermanence of life. The structure is interrupted, but over time its essence persists. It changes and yet stays the same.

Although thermodynamics and Darwin’s theory of evolution help us understand how ordered structures can perform these complex actions, ultimately we don’t really understand it at the deepest level. The best illustration of our ignorance is the most complex structure in the universe – the human brain. The brain is composed of exactly the same elements as my table, my cup of coffee and the fern growing outside my window. Yet the same elements that, when assembled one way, create a fern somehow, when assembled in another, very specific way, create a 3-pound, jellylike structure that can perform seeming miracles like writing ‘Hamlet’, finding the equations of spacetime curvature and composing the Choral Symphony. We have loose terminology like ‘emergence’ to describe the unique property of consciousness that arises when human brains are assembled from inanimate elements, but if we are to be honest as scientists, we must admit that we don’t understand exactly how that happens. The ultimate example of change, the one that makes the essence of us as humans possible, remains an enduring mystery. Will we ever solve it? Even some of the smartest scientists on the planet, like the theoretical physicist Edward Witten, think we may not. As Witten puts it,

“I think consciousness will remain a mystery. Yes, that’s what I tend to believe. I tend to think that the workings of the conscious brain will be elucidated to a large extent. Biologists and perhaps physicists will understand much better how the brain works. But why something that we call consciousness goes with those workings, I think that will remain mysterious. I have a much easier time imagining how we understand the Big Bang than I have imagining how we can understand consciousness…”

In other words, what Witten is saying is that even if someday we may understand the how and the what of consciousness, we may never understand the why. One of the biggest examples of change in the history of the universe may well remain hidden behind a veil.

I think about change a lot not just because I am a chemist but because I am a parent. Sometimes it feels like our daughter who is now two and a half years old has changed more in that short time than a caterpillar changes into a butterfly. Her language, reasoning, social and motor skills have undergone an amazing change since she was born. And this is, of course, a change that is observed by every parent: children change an incredible amount during their first few years. Some of that change can be guided by parents, but other change is genetic as well as idiosyncratic and unpredictable. Just like you can coax simple arrangements of atoms into certain compounds but not others, as a parent you have to make peace with the fact that you will be able to mold your child’s temperament, personality and trajectory in life to a certain extent but not beyond that. As the old alchemists figured out, you cannot change mercury into gold or gold into mercury no matter how hard you try. And that’s ultimately for the better because, just like the diversity of elements, we then get a diversity of novel and surprising life trajectories for our children.

Children undergo change but they are also often the best instruments for causing it. Recently I finished reading Octavia Butler’s remarkable “Parable of the Sower”, which is set in a 2024 California racked by violence and arson by desperate, homeless people who break into gated communities and burn, murder and rape. The protagonist of the story is a clear-eyed, determined 18-year-old named Lauren Olamina who, after her family is murdered, starts out by herself with the goal of founding a new religion called Earthseed amidst the madness surrounding her. Earthseed sees God as a changeable being and embraces change as the essence of living. Lauren thinks that in a world where people have to deal with unpredictable, seismic, sometimes violent change, a religion that makes the very nature of change a blueprint for God’s work can not just survive but thrive. For an atheist like myself, Earthseed seems as good a religion as any to believe in if we want to thrive in an uncertain world. Butler’s story tells us that, just like they always have, our children exist to fix the problems our generation has created.

Change permeates the largest scales of the universe as much as it does ourselves, our children and our bodies and brains. One of the most philosophically shattering experiences in the development of science was the realization by Galileo, Brahe, Newton and others that the perfect, crystalline, quiet universe of Aristotle and the other ancients was in fact a dynamic, violent one. In the mid-20th century, astrophysicists worked out that stars go through a life sequence much as we do. When they are born they furiously burn hydrogen into helium and form the lighter elements. As they age they can go in one of several directions. Stars the size of the sun will first swell into red giants and then quietly settle into life as white dwarfs. But stars much more massive than the sun can turn into supernovae and black holes, ending their lives in a spectacular explosion or a fiery gravitational collapse.

When our sun turns into a red giant, about 6 billion years from now, its outer shell will expand to embrace the orbits of Mercury, Venus and Earth. There is no reason to believe that those planets will survive the encounter. By that time the human race will either be extinct or will have migrated to other star systems; the worst thing it could do is stay put. Even after that we will not escape change. The science of eschatology, the study of the ultimate fate of the universe, has mapped out many changes that will be unstoppable in the far future. At some point the Andromeda galaxy will collide with our Milky Way. Eventually the stars in the universe will run out of fuel and cease to shine; the universe will become a quieter and darker place. At some point it will contain only black holes, and at a point far beyond that even the black holes will evaporate through Hawking radiation. And if, as some theories predict, the proton, usually considered a stable particle, eventually decays, matter as we know it will dissolve into nothingness. The accelerated expansion of our universe will ensure that most of these processes take place. The exact fate of the universe is too uncertain to predict beyond these unimaginable gulfs of time, but there is little doubt that it will be profoundly different from what it is now and what it has been before.

The elements from which my body and brain are composed will one day be given back to the universe (I like to think that they will perhaps become part of a redwood tree). That fact does not fill me with dread or sadness but instead fills me with peace, joy and gratitude. The ultimate death of the universe described above evokes similar feelings. Sometimes I like to sit back, close my eyes and imagine a peaceful, lifeless universe, the galaxies receding past the cosmic horizons, the occasional supernova going off. The carbon, oxygen, nitrogen and other heavier elements in my body came from such supernova explosions a long time ago; the hydrogen came from the Big Bang. Those are astounding facts that science has discovered in the last few decades. Of all the things that could have happened to those elements forged in the furnace of a far-off supernova, what were the chances that they would assemble into the exact, specific arrangement that would be me? While we now understand how that happens, it could well have gone countless other ways. I feel privileged to exist in that brief interval between supernova explosions, to be able to understand, in my own modest way, the workings of our universe. To be a tiny part of the change that makes the universe what it is.

Book Review: Chip War: The Fight for the World's Most Critical Technology

In the 19th century it was coal and steel; in the 20th century it was oil and gas. What will it be in the 21st century? The answer, according to Chris Miller in this lively and sweeping book, is semiconductor chips.

There is little doubt that chips are ubiquitous, not just in our computers and cell phones but in our washers and dryers, our dishwashers and ovens, our cars and sprinklers, in hospital monitors and security systems, in rockets and military drones. Modern life as we know it would be unimaginable without these marvels of silicon and germanium. And as Miller describes, we have a problem, because much of the technology to make these essential devices is the province of a handful of companies and countries that are caught in geopolitical conflict.
Miller starts by tracing the arc of the semiconductor industry and its growth in the United States, driven by pioneers like William Shockley, Andy Grove and Gordon Moore and fueled by demands from the defense establishment during the Cold War. Moore's Law has guaranteed that both the demand for and the supply of chips have exploded in the last few decades; pronouncements of its decline have often been premature. Miller also talks about little-known but critically important people like Weldon Ward, who designed chips that made precision missiles and weapons possible; Secretary of Defense Bill Perry, who pressed the Pentagon to fund and develop precision weapons; and Lynn Conway, a transgender scientist who laid the foundations for chip design.
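To get a feel for what that exponential growth means, here is a back-of-the-envelope sketch (my illustration, not Miller's; the Intel 4004's roughly 2,300 transistors in 1971 is a standard figure, and a doubling every two years is the usual statement of Moore's Law):

```python
# Naive Moore's Law arithmetic: transistor counts doubling every two years,
# starting from the Intel 4004 (1971, ~2,300 transistors).
start_year, start_count = 1971, 2300

for year in range(1971, 2022, 10):
    doublings = (year - start_year) / 2
    count = start_count * 2 ** doublings
    print(f"{year}: ~{count:,.0f} transistors")
```

Fifty years of doubling turns a few thousand transistors into tens of billions, which is the right order of magnitude for today's largest chips - a growth curve no other industry has matched.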
Weldon Ward's early design for a precision-guided missile in Vietnam was particularly neat: a small window in the tip of the warhead relayed reflected laser light to a chip that was divided into four quadrants. If one quadrant started receiving more light than the others, you would know the missile was off course, and its course could be adjusted. Before he designed the missile, Ward was shown photos of a bridge in Vietnam surrounded by craters that indicated where earlier missiles had hit. After he designed his missile, there were no more craters, only a destroyed bridge.
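The quadrant scheme is easy to sketch in code; a minimal illustration of the principle (the function and the numbers are mine, not Ward's actual design):

```python
def steering_correction(upper_left, upper_right, lower_left, lower_right):
    """Compare reflected laser intensity on the four quadrants of the
    seeker chip and return a (horizontal, vertical) correction:
    positive horizontal = steer right, positive vertical = steer up."""
    horizontal = (upper_right + lower_right) - (upper_left + lower_left)
    vertical = (upper_left + upper_right) - (lower_left + lower_right)
    return horizontal, vertical

# Laser spot centered on the chip: all quadrants equally lit, no correction.
print(steering_correction(1.0, 1.0, 1.0, 1.0))   # (0.0, 0.0)

# Spot drifted toward the upper right: steer right and up to re-center it.
print(steering_correction(0.2, 1.2, 0.2, 0.4))
```

The elegance of the design is that the chip itself does almost nothing; a simple difference of intensities between quadrants is enough to produce a continuous steering signal.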
There are three kinds of chips: memory chips, which store data (such as the RAM in your computer); logic chips, which process it (such as the CPU); and analog chips, which handle things like temperature and pressure sensing in appliances. While much of the pioneering work in designing transistors and chips was spearheaded by American scientists at companies like Intel and Texas Instruments, the landscape soon shifted. First the Japanese, led by Sony's Akio Morita, captured the market for memory or DRAM chips in the 80s, before Andy Grove brought it back to the US by foreseeing the personal computer era and retooling Intel to make laptop chips. The landscape also shifted because the U.S. found cheap labor in Asia and outsourced much of its chip manufacturing.
But the real major player in this shift was Morris Chang. Chang was one of the early employees at Texas Instruments, and his specialty was optimizing the chemical and industrial processes for yielding high-quality silicon. He rose through the ranks and advised the defense department. But, in one of those momentous quirks of history that seem trivial at the time, he was passed over for the CEO position. Fortunately he found a receptive audience in the Taiwanese government, which gave him a no-strings-attached opportunity to set up a chip manufacturing plant in Taiwan.
The resulting company, TSMC, has been both the boon and the bane of the electronics age. If you use a device with a chip in it, that chip has most probably been made by TSMC. Apple, Amazon, Tesla and Intel all design their own chips but have them made by TSMC. It does not help that TSMC is located in a country that both sits on top of a major earthquake fault and is the target for invasion or takeover by a gigantic world power. The question of whether our chip-dependent modern technology can thrive is closely tied to whether China is going to invade Taiwan.
The rest of the supply chain for making chips is equally far-flung. But although it sounds globalized, it's not. For instance, the stunningly sophisticated process of extreme ultraviolet lithography (EUV) that etches designs onto chips is essentially monopolized by one company - ASML in the Netherlands. The machines that do this cost more than $100 million each and have about 500,000 moving parts. If something were to happen to ASML, the world's chip supply would grind to a halt.
The same goes for the companies that make the software for designing chips. Three companies in particular - Cadence, Synopsys and Mentor - make 90% of chip design software. There are a handful of other companies making specialized software and hardware, but they too are narrowly concentrated.
Miller argues that the future of chips, and therefore of modern technology at large, is going to depend on geopolitics, especially the relationship between China and the United States. The good news is that China currently lags significantly behind the U.S. in almost all aspects of chip design and manufacturing; the major centers for these processes are either in the U.S. or in countries that are its allies. In addition, replicating machinery of the kind ASML uses for etching is hideously complicated. The bad news is that China has a lot of smart scientists and engineers and uses theft and deception to gain access to chip design and manufacturing technology. Using front companies and legitimate buyouts, it has already tried to gain such access. While it will still take years for China to catch up, it is more a question of when than if.
If we are to continue our modern way of life, which depends on this critical technology, we will have to work on multiple fronts, some of which are already being set in motion. Intel is now setting up its own foundry business and trying to replicate some of the technology ASML uses. China will have to be brought to the bargaining table, and every attempt will have to be made to ensure that it plays fair.
But much of the progress also depends on funding basic science. It's worth remembering that much of the early pioneering work on semiconductors was done by physicists and chemists at places like Bell Labs and Intel, a lot of it by immigrants like Andy Grove and Morris Chang. Basic research at national labs like Los Alamos and Sandia laid the foundations for ASML's etching technology. Attempts to circumvent the limits of Moore's Law will also have to continue; as transistors shrink to single-digit nanometer sizes, quantum effects make their functioning more uncertain. However, there are plans to avoid these issues through strategies like stacking transistors vertically. All these strategies depend on training the next generation of scientists and engineers, because progress in technology ultimately depends on education.

A Science Thanksgiving

It’s Thanksgiving weekend here in the U.S., and there’s an informal tradition on Thanksgiving to give thanks for all kinds of things in our lives. Certainly there’s plenty to be thankful for this year, especially for those of us whose lives and livelihoods haven’t been personally devastated by the coronavirus pandemic. But I thought I would do something different this year. Instead of being thankful for life’s usual blessings, how about being thankful for some specific facts of nature and the universe that are responsible for our very existence and make it wondrous? Being employed and healthy and surrounded by family and friends is excellent, but none of that would be possible without the amazing unity and diversity of life and the universe. So without further ado and in no particular order, I present an entirely personal selection of ten favorites for which I am eternally thankful.

I am thankful for the value of the resonance level energy of the excited state of carbon-12: carbon-12, which is the basis of all organic life on earth, is formed in stars through the reaction of beryllium-8 with helium-4. The difference in energies between the starting materials (beryllium + helium) and carbon is only about 4%. If this difference had been even slightly larger, the unstable beryllium-8 would have disappeared long before it could transmute into carbon-12, making life impossible.

I am thankful for the phenomenon of horizontal gene transfer (HGT): it allowed bacteria during early evolution to jump over evolutionary barriers by sharing genetic material between themselves instead of just with their progeny. The importance of HGT for evolution may be immense since regular HGT early on might have led to the universality of the genetic code. HGT mixed and matched genetic material in the cauldron of life, eventually leading to the evolution of multicellular organisms including human beings.

I am thankful for the pistol shrimp: an amazing creature that can snap its claw shut and send out a high-pressure bubble at lightning speed to kill its prey. This cavitation bubble can produce a flash of light when it collapses, and the collapse is so fast that the temperature inside the bubble can briefly approach the surface temperature of the sun. The pistol shrimp shows us that nature hides phenomena that are not dreamt of in our philosophy, leading to an inexhaustible list of natural wonders for us to explore.

I am thankful to the electron: an entire universe within a point particle that performs the subtlest and most profound magic, making possible the chemistry of life; giving rise to the electromagnetic force that holds ordinary matter together; ultimately creating minds that can win prizes for studying electrons.

I am thankful to the cockroach: may humanity have the resilience to survive the long nights of our making the way you have.

I am thankful to the redwoods: majestic observers and guardians of nature who were here before us, who through their long, slow, considered lives have watched us live out our frantic, anxious lives the way we watch ants live out theirs, and whose survival is now consequentially entwined with our own.

I am thankful to the acetyl group, a simple geometric arrangement of two carbon atoms and one oxygen atom whose diverse, myriad forms fueling life and alleviating pain – acetylcholine, acetyl-coenzyme A, acetaminophen – are a tribute to the ingenuity of both human minds and nature.

I am thankful to i, the square root of minus one: who knew that this diabolical creature, initially alien even to the abstract perceptions of mathematicians, would be as “real” as real numbers and, more importantly, underlie our most hallowed descriptions of nature, such as quantum theory.

I am thankful to the black hole: an endless laboratory of the most bizarre and fantastic wonders; trapping light but letting information escape; providing the ultimate playground for spacetime curvature; working relentlessly over billions of years as a clearinghouse and organizing principle for the universe’s wayward children; proving that the freaks of the cosmos are in fact the soul food of its very existence.

I am thankful for time: that elusive entity which, in the physicist John Wheeler’s words, “keeps everything from happening all at once”; which waits for no one and grinds kings and paupers into the same ethereal dust; whose passage magically changes children every day before our very eyes; whose very fleeting nature makes life precious and gives us the most to be thankful for.

Book review: A Divine Language: Learning Algebra, Geometry, and Calculus at the Edge of Old Age, by Alec Wilkinson

A beautifully written account of mathematics lost and found. The author got "estranged" from mathematics in school and now, at the age of 65 and after a distinguished writing career, has taken it upon himself to learn the fundamentals of algebra, geometry and calculus. The book is by turns funny and sad as Wilkinson recounts his struggling attempts to master material that would be child's play for many bright teenagers. He is helped in his efforts by his niece Amie Wilkinson, an accomplished mathematician at the University of Chicago. I could empathize with the author since I too had an estrangement of sorts with the subject in high school because of a cruel, vindictive teacher, and it took until college, thanks to brilliant and empathetic teachers, for me to claw my way back and start appreciating it.

But while he may struggle even with high school mathematical skills (and he and I share a particular loathing for word problems), Wilkinson brings a poetic, philosophical sensibility acquired through a long career to bear on the topic that no 15-year-old whippersnapper math genius could commit to paper. He ruminates on the platonic beauty of math and wonders whether and how some people's minds might be wired differently for it. He does not always understand how his brilliant mathematical niece Amie always "gets it", and she in turn doesn't always understand why her uncle has trouble with ideas that are second nature to her.

Often quoting from eloquent mathematicians and physicists like Bertrand Russell, G. H. Hardy and Roger Penrose, Wilkinson brings a fresh, beautiful perspective to the utility and beauty of mathematics; to the struggle inherent in mastering it and the rewards that await those who persevere. I would highly recommend the book to those who may have lost faith in mathematics in high school and want to pick up some of the concepts later, or even to young students of math who may be wizards at solving equations but who might want to acquire a broader, more philosophical perspective on this purest of human endeavors.

Temple Grandin vs algebra

There's a rather strange article by Temple Grandin in the Atlantic, parts of which had me vigorously nodding my head and parts of which had my eyebrows crawling straight up. It's a critique of how our school system takes a one-size-fits-all approach that does a lot of students a disservice, but it more specifically takes aim at algebra.

First, let me say how much I admire Temple Grandin. A remarkable woman who struggled with severe autism throughout her childhood (there's a very good profile of her in Oliver Sacks's "An Anthropologist On Mars"), she rose above her circumstances and channeled her unusual abilities into empathy for animals, becoming one of the world's leading experts in the design of humane housing and conditions for livestock. She has without a doubt demonstrated the value of what we might call 'non-standard' modes of thinking, teaching and learning that draw on visual and tactile ability. So she starts off strong enough here:

As a professor of animal science, I have ample opportunity to observe how young people emerge from our education system into further study and the work world. As a visual thinker who has autism, I often think about how education fails to meet the needs of our very diverse minds. We are shunting students into a one-size-fits-all curriculum instead of nurturing the budding builders, engineers, and inventors that our country needs.

So far so good. In fact, let me digress a bit here. When I was in high school I was very good at geometry but terrible at algebra; I still remember one midterm where I got an A in geometry - in fact the highest score in the class - but almost flunked algebra. It took me a long time to claw my way back to a point where algebra made sense to me. This appreciation of visual explanations was part of what drew me to chemistry, so I fully appreciate what Grandin says about being sympathetic to students who might have more of a visual capacity.

But further down the pages she takes a detour into the evils of algebra that doesn't make sense to me. Again, some of what she says is spot on; for instance the fact that algebra (and math in general) can be taught much better if you can relate it to the real world. Too often it's presented simply as abstraction and symbol manipulation. But then there's this:

Cognitive skills may simply not be developed enough to handle abstract reasoning before late adolescence, which suggests that, at the very least, we’re teaching algebra too early and too fast. But abstract reasoning is also developed through experience, which is a good argument for keeping all those extracurriculars.

This part makes more of a case for tying algebra to specific real-world applications than for doing away with abstraction per se. The fact of the matter is that math is abstract; it's precisely this abstraction that makes it a powerful general tool. There are good and bad ways of teaching that abstraction, but the solution isn't to get rid of it or delay it. In fact, that kind of thinking feeds into the belief, popular in some quarters these days, that algebra and calculus should both be optional classes.

It's when she gets to the end of the piece, however, that Grandin completely loses me:

"No two people have the same intelligence, not even identical twins. And yet we persist in testing—and teaching—people in the same way. We don’t need Americans to be better at algebra, per se. We need future generations that can build and repair infrastructure, overhaul energy and agriculture, develop robotics and AI. We need kids who grow up with the imagination to invent the solutions to pandemics and climate change. When school fails them, it fails all of us."

Say what? Building and repairing infrastructure, overhauling energy and agriculture and - especially - developing robotics and AI do not need algebra? In fact, most of these professions require a very solid grounding in abstract aspects of algebra and calculus. I think Grandin is sliding rather too easily from saying that algebra should be taught better to saying that we should get rid of it or make it optional. Those are two very different things.

My concern, based on this article and others I have been reading lately, is that in our drive to reform the system we will come to consider algebra itself unnecessary. That is a grave mistake. Algebra and calculus - and for that matter music and art - are things that, even beyond the practical utility of the first two, help us better appreciate our place in society and the cosmos and in general teach us to be more human. Make them better we certainly should, but let's not burn the building down in our zeal.