A Christmas message from Steve Jobs for our friends in pharma: 2015 version

I had posted this at the end of 2011, and it is both fascinating and disconcerting that Steve Jobs's lament - that a focus on sales rather than product design hastens the decline of entire industries - rings even truer for pharma in 2015 than it did in 2011. If anything, the string of mergers and layoffs in Big Pharma over the last four years has underscored even more starkly what happens when an industry starts to worry more about perception and short-term shareholder value than its core reason for existence. Most would agree that that's not how you innovate and that's not how you solve the really hard problems. Let's hope Steve will have a different message for us in 2019.

I am at the end of Walter Isaacson's excellent biography of Steve Jobs and it's worth a read even if you think you know a lot about the man. Love him or hate him, it's hard to deny that Jobs was one of those who disturbed our universe in the last few decades. You can accuse him of a lot of things, but not of being a lackluster innovator or product designer.

The last chapter, titled "Legacy", has a distillation of Jobs's words about innovation, creativity and the key to productive, sustainable companies. In that chapter I found this:

"I have my own theory about why decline happens at companies like IBM or Microsoft. The company does a great job, innovates and becomes a monopoly or close to it in some field, and then the quality of product becomes less important. The company starts valuing the great salesmen, because they're the ones who can move the needle on revenues, not the product engineers and designers. So the salespeople end up running the company. John Akers at IBM was a smart, eloquent, fantastic salesperson but he didn't know anything about product. The same thing happened at Xerox. When the sales guys run the company, the product guys don't matter so much, and a lot of them just turn off."

Jobs could be speaking about the modern pharmaceutical industry, where the "product designers" are of course the scientists. Although many factors have been responsible for the decline of innovation in modern pharma, one variable that strongly correlates with it is the replacement of product designers at the helm by salespeople and lawyers, beginning roughly in the early 90s.

There's a profound lesson in there somewhere. Not that wishes come true, but it's Christmas, and while we don't have the freedom to innovate, hold a stable job and work on what really matters, we do have the freedom to wish. So with this generous dose of wishful thinking, I wish you all a Merry Christmas.

The AAAS's nomination of Prof. Patrick Harran does a grave disservice to their stated mission

I hear through the Twitterverse that the American Association for the Advancement of Science (AAAS) has elected Prof. Patrick Harran of UCLA as a new fellow for 2015. I have to say that this choice leaves me both befuddled and disappointed.

Most of us will remember that Prof. Harran was charged on four felony counts for the death of research assistant Sheri Sangji, who was fatally burned in a fire in his laboratory in December 2008. The case dragged on for several years, and in 2014 Prof. Harran and UCLA struck a deal with prosecutors that allowed him to avoid the charges in exchange for a fine and community service. The charges were not refuted; they were negotiated away.

Now I am certainly not of the opinion that someone like Prof. Harran should not be rehabilitated into the scientific community in some way or another. Nor do I think that he should never be recognized for his ongoing research. I am also not in a position to pass legal judgement on the degree of his culpability.

But that's not the point here at all. If the award were from, say, the ACS, for purely technical achievement, I would have been less miffed. As it happens, it's a recognition from the AAAS: the American Association for the Advancement of Science.

Advancement of Science does not just mean advancement of the technical aspects of science; it means advancement of the sum total of the scientific enterprise, a key component of which is the intersection of science with public appreciation and public policy. The AAAS was set up in 1848 with the express goal of not just recognizing scientific achievement but of facilitating scientific discourse in the public sphere. Past presidents of the AAAS have included Robert Millikan and Stephen Jay Gould, both of whom put a premium on scientists actively engaging with the public.

Let's take a look at the official mission of the AAAS as noted on their own website:

  • Enhance communication among scientists, engineers, and the public;
  • Promote and defend the integrity of science and its use;
  • Strengthen support for the science and technology enterprise;
  • Provide a voice for science on societal issues;
  • Promote the responsible use of science in public policy;
  • Strengthen and diversify the science and technology workforce;
  • Foster education in science and technology for everyone;
  • Increase public engagement with science and technology; and
  • Advance international cooperation in science.
In my opinion, the election of Prof. Harran goes against at least four of these goals: enhancement of communication between scientists and the public, strengthening of support for the scientific enterprise, increasing public engagement with science and, most importantly, "promoting and defending the integrity of science and its use".

It's quite clear from the AAAS's mission statement that scientific responsibility and scientific outreach are two of its major aims. In fact, one can argue that the AAAS, along with the NAS (National Academy of Sciences), is one of the two policy organs in this country which represent the public face of the scientific enterprise. For more than a century now the AAAS has been an integral part of the nationwide scientific dialogue involving scientists, the government and the people. Perhaps it's fitting in this regard that the current CEO of the AAAS is former New Jersey Congressman Rush Holt, one of the few politicians in this country who is not only finely attuned to the truth-seeking nature of science and its potential corruption but was also, at one point, a serious practicing scientist himself.

All this makes the matter even murkier and harder to understand. How is the election of someone who is still under a cloud of suspicion for not having implemented responsible safety practices in his laboratory at a major university, and who never refuted the charges against him, a healthy reaffirmation of the dialogue between scientists and the public? How does this election help the AAAS in its stated mission of "promoting the integrity of science and its use" when Prof. Harran's actions and the charges against him clearly called that integrity into question?

In addition, the statement from a spokesperson of the AAAS saying that they were "unaware of the charges against Harran" is simply bizarre. The Harran and Sangji stories have been all over the news for more than seven years now; how much more exposure do they need for an organization of the size and reach of the AAAS to take notice?

The whole episode is deflating and incomprehensible. Again, this is not about Prof. Harran's merits purely as a scientist; in fact, in a sense it's not about him at all. It's about what the AAAS wants to be. Does it want to be an institution purely recognizing technical achievement, or does it want to be one which promotes scientific responsibility and outreach? If - as its history of more than a century and a half indicates - it wants to be the latter, it can surely do better than this.

Note: For a comprehensive view of the details of the case as they unfolded, see C&EN reporter Jyllian Kemsley's outstanding coverage here.

Abraham Flexner, the Institute for Advanced Study, and the usefulness of useless knowledge

The most succinct encapsulation of the value of curiosity to practical pursuits came from Michael Faraday: when asked by William Gladstone, Chancellor of the Exchequer, about the utility of electricity, Faraday is purported to have replied, "One day, sir, you may tax it". Apocryphal or not, the remark accurately captures the far-reaching, often universal material benefits of the most fundamental of scientific investigations. Faraday's basic research on the relationship between electricity and magnetism ushered in the electrical age, as much as it shed light on one of Nature's deepest secrets.

Part of Faraday's sentiment saw its flowering in the establishment of the Institute for Advanced Study (IAS) in Princeton. The IAS was set up in 1930 by Abraham Flexner, a far-sighted educator and reformer, with the explicit purpose of providing a haven for the world's purest thinkers that was free of teaching, administrative duties and the myriad interferences of the modern university. Funds came from the wealthy Bamberger family, who did the world a favor by switching their monetary support from a medical school to the institute (some might think they did us an even bigger favor by founding the department store Bamberger's, which was later absorbed into Macy's).

Flexner's paean to unadulterated pure thought was duly enshrined in the institute's founding by an invitation to Albert Einstein to serve as its first permanent member in 1933; other intellectual giants, including John von Neumann, Hermann Weyl and Kurt Gödel, followed suit, finding a safe refuge from a continent which seemed to have gone half-mad. Over the next eight decades the institute produced scores of leading thinkers and writers, many of whom have inaugurated new fields of science and the humanities and been associated with prestigious prizes like the Nobel Prize, the Fields Medal and the Pulitzer Prize. Later permanent members have included the diplomat George Kennan, the physicist Freeman Dyson and the art historian Erwin Panofsky.

Flexner's pioneering thinking found its way into a 1939 issue of Harper's Magazine in the form of an article with the memorable title "The Usefulness of Useless Knowledge". The document still provides one of the clearest and most eloquent arguments I have come across for supporting thinking without palpable ends. The very beginning makes a telling case for science as a candle in the dark, a case that must have shone like a gem on a mountaintop in the dark year of 1939:
"Is it not a curious fact that in a world steeped in irrational hatreds which threaten civilization, men and women old and young detach themselves wholly or partly from the angry current of daily life to devote themselves to the cultivation of beauty, to the extension of knowledge, to the cure of disease, to the amelioration of suffering, just as though fanatics were not simultaneously engaged in spreading pain, ugliness and suffering?"
Flexner then goes on to give the examples of half a dozen scientists, including Maxwell, Faraday, Gauss, Ehrlich and Einstein, whose passionate tinkering with science and mathematics led to pioneering applications in industry, medicine and transportation. Each of these scientists was pursuing research for its own sake, free of concerns regarding future application. Paul Ehrlich's case is especially instructive. Ehrlich, who is the father of both modern antibiotic research and drug discovery, was asked by his supervisor, Wilhelm von Waldeyer, why he spent so much time tinkering aimlessly with bacterial broths and petri dishes; Ehrlich simply replied, "Ich probiere", which can be loosely translated as "I am just fooling around". Waldeyer wisely left him to fool around, and Ehrlich ended up suggesting the function of protein receptors for drugs and discovering Salvarsan, the first remedy for the scourge of syphilis.

The theme repeats throughout the history of science: Alexander Fleming mulling over unexplained bacterial genocide, Claude Shannon obsessed with the mathematization of information transfer, Edward Purcell and Isidor Rabi investigating the behavior of atoms in magnetic fields. Each of these studies led to momentous practical inventions: antibiotics, information technology and MRI, respectively.

Thus it should not be hard to make a case for why untrammeled intellectual wandering should be encouraged. It's of course not true that pure thinking always leads to the next iPad or brain scanner. But as Flexner eloquently put it, even the occasional benefits far outweigh the perceived waste:
"I am not for a moment suggesting that everything that goes on in laboratories will ultimately turn to some unexpected practical use or that an ultimate practical use is its actual justification. Much more am I pleading for the abolition of the word use, and for the freeing of the human spirit. To be sure, we shall free some harmless cranks. To be sure, we shall thus waste some precious dollars. But what is infinitely more important is that we shall be striking the shackles off the human mind and setting it free for the adventures which in our own day have, on the one hand, taken Hale and Rutherford and Einstein and their peers millions upon millions of miles into the uttermost realms of space, and on the other, loosed the boundless energy imprisoned in the atom."
It is clear from Flexner's words that the sheer motive power of pure thinking by an Einstein or a Bohr makes the accompanying modest wastage of funds or an entry by the occasional crank a mere trifle. However to Flexner's credit, he also defuses the myth of the Great Man of Science, noting that sometimes practical discoveries very much rest on the shoulders of aimless meandering; a fact that makes the web of both pure and applied discovery a highly interconnected and interdependent one:
"Thus it becomes obvious that one must be wary in attributing scientific discovery wholly to any one person. Almost every discovery has a long and precarious history. Someone finds a bit here, another a bit there. A third step succeeds later and thus onward till a genius pieces the bits together and makes the decisive contribution. Science, like the Mississippi, begins in a tiny rivulet in the distant forest. Gradually other streams swell its volume. And the roaring river that bursts the dikes is formed from countless sources."
A deeper question, though, is why this relationship between idea and use exists - why even the purest of thought so often leads to the most practical of inventions. In 2012 I attended the Lindau meeting of Nobel Laureates in Germany, where the uncertain and yet historically immensely fruitful relationship between ideas and use was amply driven home.

Physicist David Gross put his finger on the essential reason when he pointed out that "Nature is a mistress who shares her secrets reluctantly". The case for basic research thus boils down to a practical consideration: the recognition that a stubborn sea of scientific possibilities will yield its secrets only to the one who casts the net widest, takes the biggest risks, makes the most unlikely and indirect connections, and pursues a path of discovery for the sheer pleasure of it. Even from a strictly practical viewpoint, you encourage pure research because you want to maximize the odds of a hit in the face of uncertainty about the landscape of facts. It's simply a matter of statistical optimization.

At the Lindau meeting we could hear firsthand accounts from scientists about how the uselessness of their investigations turned into useful, sometimes wholly unexpected knowledge. There was Steven Chu talking about how his work on using lasers to cool atoms is now being used by spacecraft studying global warming by tracking the motion of glaciers down to millimeter accuracy. Interestingly, Chu also defused the popular notion that research in the exalted corridors of Bell Labs was entirely pie in the sky; as he noted, both the transistor and information theory arose from company concerns about communicating through noisy channels and finding urgent replacements for vacuum tubes. Pure and applied research certainly don't need to be antagonists.

Others recounted their own stories. There was Alan Heeger, who on a whim mixed a conducting polymer with fullerenes, thus anticipating ultrafast electron transfer. And Hartmut Michel, the Frankfurt chemist who is known for not mincing words, told the audience how archeological applications of DNA technology are transforming our knowledge of the deepest mysteries of human origins. Michel also pointed out the important fact that a third or more of Nobel Prizes have been awarded for methods development, a pattern which indicates that technical engineering for its own ends is as much a part of science as idea generation. There is great art both in the fashioning of the most abstract equations and in the machining of the simplest tools of science.

The life and times of the successful scientists on stage made the immense spinoffs and unanticipated benefits of seemingly aimless research clear. And they did not even touch on the fact, amply documented by Flexner in his essay, that these aimless investigations have opened up windows into the workings of life and the universe that would have been inconceivable even a hundred years ago. Man is something more than the fruits of his labors, and that something is well worth preserving, even at the cost of billions of dollars and countless blind alleys.

The useful pondering of useless knowledge makes no claim to infallible wisdom or a steady stream of inventions. But what it promises is something far more precious: freedom from fear and the opportunity to see the light wherever it exists. Ich probiere.

This is an updated version of a past post.

Environmentalism is not climate change; climate change is not environmentalism

Freeman Dyson has an op-ed in the Boston Globe about the ongoing climate change talks in Paris in which he makes a cogent point - that environmentalism does not equal climate change, and that a focus on climate change should not distract us from other environmental problems which may be largely unrelated to global warming.
"The environmental movement is a great force for good in the world, an alliance of billions of people determined to protect birds and butterflies and preserve the natural habitats that allow endangered species to survive. The environmental movement is a cause fit to fight for. There are many human activities that threaten the ecology of the planet. The environmental movement has done a great job of educating the public and working to heal the damage we have done to nature. I am a tree-hugger, in love with frogs and forests.  
But I am horrified to see the environmental movement hijacked by a bunch of climate fanatics, who have captured the attention of the public with scare stories. As a result, the public and the politicians believe that climate change is our most important environmental problem. More urgent and more real problems, such as the over-fishing of the oceans and the destruction of wild-life habitat on land, are neglected, while the environmental activists waste their time and energy ranting about climate change. The Paris meeting is a sad story of good intentions gone awry."
I rather agree with him that for many people the term environmentalism has become largely synonymous with climate change. However, the two are not the same, as becomes clear when he mentions overfishing, a real problem largely unconnected with climate change. I have recently been reading about the state of the world's fish in Paul Greenberg's excellent book "Four Fish". Greenberg focuses on the history and future of the four fish that have largely dominated the Western world's diet - salmon, sea bass, cod and tuna.

The book talks about how most of these fish were overharvested and almost driven to extinction by humans building dams and poisoning rivers with industrial effluent. Neither of these two issues is directly connected with climate change, but how often do we see high-profile global meetings which press everyone to deal with dam-building and water pollution on a war footing, let alone meetings led by presidents and prime ministers? 

Overfishing also brings another aspect of the climate change problem into sharp perspective. The only reason certain places in the US, such as the Salmon River in New York, are now full of fish is that those fish have been carefully cultivated in captivity and then released into the wild. Without this human intervention the Salmon River would have stayed barren. Then there is the revolution in fish farming, also described in "Four Fish", which has brought expensive fish to the plates of literally millions of people who were previously deprived of it. The equivalent of fish farming in the case of climate change is geoengineering. Geoengineering carries more risks but also more potential benefits than fish farming, and it too deserves serious consideration in a meeting on climate change. As far as I know, most of these high-profile meetings on the topic focus on prevention rather than mitigation in the form of geoengineering. As exemplified by the solutions to overfishing, any serious discussion of climate change should at least involve discussions of geoengineering.

Politically, the sad history of the climate change wars seems to me to have a simple explanation. Before 2004 or so, when the effects of climate change were not as well known and anti-science Republicans dominated the government, conservative deniers largely ruled the debate and the media. After 2004 or so, in part because of better data and in part because of relentless and wide publicity by people like Al Gore, the media started paying much more attention to the issue. I do not blame the left for going a little overboard with emphasizing the case for climate change at the beginning, when it was important to counter the right-wing extremism on the topic. But since then the world has been sold on the issue, and there is no longer a need to be overzealous about it. Unfortunately, segments of the left have continued the crusade which they started in good faith, and many of them have now turned into hardliners on the issue, extending the theory beyond where the evidence might lead and denouncing almost any opponent as motivated by politically enabled bigotry. This has led to the silencing of reasonable critics along with irrational deniers (see my previous post for a discussion of the distinction between deniers and skeptics).

But whatever our feelings about the political rancor surrounding climate change, I do share Dyson's concerns that it might distract us from problems that are at least equally important. I think environmentalism has been one of the most important movements for the common good in history, too important to be pigeonholed into one category. Climate change is not environmentalism; environmentalism is not climate change.

The second aspect of the op-ed is Dyson's contention that the science of climate change is not settled. Many people have attacked him for this contention in the past and will no doubt attack him now, but both in conversations with him and in what he has written, I have found that most of his issues with the science are very general and not extreme at all. He says we don't understand a system as complex as the climate well enough to make detailed, accurate predictions, and he also says that much of the debate has unfortunately turned so political and rancorous that it has become hard for reasonable people to disagree even on the scientific details. These statements are both quite true and should ideally be uncontroversial. In addition, I have discussed parallels between molecular modeling and climate modeling with him; in both cases we see a healthy amount of uncertainty as well as an incomplete understanding of key components of the process. Water, for instance, seems to be a common culprit: we have as fuzzy an understanding of water in cloud formation as of water around the surfaces of proteins and small organic molecules.

I think climate change is a serious problem that deserves our attention. There is little doubt that we have injected unprecedented amounts of carbon dioxide into the atmosphere since the industrial revolution, and we would be naive to think that these have not impacted the climate at all. But the devil is in the details. It seems not just bad science but bad policy to me to hold high-profile meetings on the topic every year while neglecting other equally valid topics, all the time making detailed plans for mitigation (and not active intervention) in a system which is not understood well enough right now for detailed preemptive actions which will impact the lives of billions of people, especially in the developing world. 

To me it seems reasonable to think that climate change should be part of a larger portfolio we should invest in if we want to try to protect our future. As they say, it's always best to diversify your portfolio to hedge your bets against future risks.

The late Paul Kalanithi's book "When Breath Becomes Air" is devastating, edifying, eloquent and very real

I read this book in one sitting, long after the lights should have been turned off. I felt like not doing so would have been a disservice to Paul Kalanithi. After reading the book I felt stunned and hopeful in equal parts. Stunned because of the realization that someone as prodigiously talented and eloquent as Paul Kalanithi was taken from the world at such an early age. Hopeful because even in his brief life of thirty-six years he showcased what we as human beings are capable of in our best incarnations. His family can rest assured that he will live on through his book.

When Breath Becomes Air details Dr. Kalanithi's life as a neurosurgeon and his fight against advanced lung cancer. Even in his brief life he achieved noteworthy recognition as a scholar, a surgeon, a scientist and now - posthumously - as a writer. The book is a tale of tribulations and frank reflections. Ultimately there's not much triumph in it in the traditional sense of the word, but there is a dogged, quiet resilience and a frank earthiness that endures long after the last word appears. The tribulations occur in both Dr. Kalanithi's stellar career and his refusal to give in to the illness which ultimately consumed him.

The first part of the book could almost stand alone as an outstanding account of the coming of age of a neurosurgeon and writer. Dr. Kalanithi talks about his upbringing as the child of hardworking and inspiring Indian immigrant parents and his tenacious and passionate espousal of medicine and literature. He speaks lovingly of his relationship with his remarkable wife - also a doctor - whom he met in medical school and who played an outsized role in supporting him through everything he went through. He had a stunning and multifaceted education, studying biology and literature at Stanford, then history and philosophy of medicine at Cambridge, then medicine at Yale, and finally training in neurosurgery back at Stanford.

Along the way he became not just a neurosurgeon who worked grueling hours and tried to glimpse the very soul of his discipline, but also a persuasive writer. The mark of a man of letters is evident everywhere in the book, and quotes from Eliot, Beckett, Pope and Shakespeare make frequent appearances. Accounts of how Dr. Kalanithi wrestled with walking the line between objective medicine and compassionate humanity when it came to treating his patients give us an inside view of medicine as practiced at its most intimate level. Metaphors abound and the prose often soars: when describing how important it is to develop good surgical technique, he tells us that "Technical excellence was a moral requirement"; meanwhile, the overwhelming stress of late-night shifts, hundred-hour weeks and patients with acute trauma made him occasionally feel like he was "trapped in an endless jungle summer, wet with sweat, the rain of tears of the dying pouring down". This is writing that comes not from the brain or from the heart, but from the gut. When we lost Dr. Kalanithi we lost not only a great doctor but a great writer spun from the same cloth as Oliver Sacks and Atul Gawande.

It is in the second part of the book that the devastating tide of disease and death creeps in, even as Dr. Kalanithi is suddenly transformed from a doctor into a patient (Eliot helps him find the right words here: "At my back in a cold blast I hear, The rattle of bones, and a chuckle spread from ear to ear"). It must be slightly bizarre to be on the other side of the mirror and intimately know everything that is happening to your body, and Dr. Kalanithi is brutally frank in communicating his disbelief, his hope and his understanding of his fatal disease. It's worth noting that this candid recognition permeates the entire account; almost nothing is sanitized. Science mingles with emotion as compassionate doctors, family and a battery of medications and tests become a mainstay of life. The doctor finds out that difficult past conversations with terminal patients can't really help him when he is one of them.

The painful uncertainty which Dr. Kalanithi documents - in particular the tyranny of statistics which makes it impossible to predict how a specific individual will react to cancer therapy - must sadly be familiar to anyone who has had experience with the disease. As he says, "One has a very different relationship with statistics when one becomes one". There are heartbreaking descriptions of how at one point the cancer seemed to have almost disappeared and how, after Dr. Kalanithi had again cautiously made plans for a hopeful future with his wife, it suddenly returned with a vengeance and became resistant to all drugs. There is no bravado in the story; as he says, the tumor was what it was and you simply experienced the feelings it brought to your mind and heart.

What makes the book so valuable is this ready admission of what terminal disease feels like, especially an admission that is nonetheless infused with wise acceptance, hope and a tenacious desire to live, work and love normally. In spite of the diagnosis Dr. Kalanithi tries very hard - and succeeds admirably - to live a normal life. He returns to his surgery, he spends time with his family and, most importantly, he decides to have a child with his wife. His everyday struggles are a chronicle of the struggles that we will all face in some regard, and which thousands of people face on a daily basis. His constant partner in this struggle is his exemplary wife Lucy, whose epilogue is almost as eloquent as his own writing; I really hope that she picks up the baton where he left off.

As Lucy tells us in the epilogue, this is not some simple tale of a man who somehow "beats" a disease by refusing to give up. It is certainly a tale of refusing to give up, but it's much more, because it's a very human tale of failure and fear, of uncertainty and despair, of cynicism and anger. And yes, it is also a tale of scientific understanding, of battling a disease even in the face of uncertainty, of poetry and philosophy, of love and family, and of bequeathing a legacy to a two-year-old daughter who will soon understand the kind of man her father was and the heritage he left behind. It's as good a testament to Dr. Kalanithi's favorite Beckett quote as anything I can think of: "I can't go on. I'll go on".

Read this book; it's devastating and heartbreaking, inspiring and edifying. Most importantly, it's real.

Science as a messy human endeavor: The origin of the Woodward-Hoffmann rules

A 1973 slide from Roald Hoffmann displaying the 'Woodward Challenge' - four mysterious reactions which spurred the Woodward-Hoffmann rules
There is a remarkable and unique article by my friend, the noted historian of chemistry Jeff Seeman, that has just come out in the Journal of Organic Chemistry. The paper deals with seven pivotal months in 1964 when Robert Burns Woodward and Roald Hoffmann worked out the basic structure of what we now call the Woodward-Hoffmann rules.

Organic chemists need no introduction to these seminal rules, but for non-chemists it might suffice to say that they opened the door to an entire world of key chemical reactions - both in nature and in the chemist's test tube - whose essential details had hitherto stayed mysterious. These details include the probability of such reactions occurring in the first place and the stereochemistry (geometric disposition) of their molecular constituents. The rules were probably the first significant meld between theoretical and organic chemistry - ten commandments carried down from a mountain by Woodward and Hoffmann, pointing to the discovery of the promised land. The recognition of their importance was relatively quick: in 1981 Hoffmann shared a Nobel Prize for his contributions, and Woodward would have shared it too (it would have been his second) had he not suddenly passed away in 1979.

The first paper on these rules was submitted in November 1964 and it came out in January 1965. Jeff's piece essentially traces the conception of the rules in the preceding months. The article is very valuable for the light it sheds not just on the human aspect of scientific discovery but on its meandering, haphazard nature. It is one of the best testaments to science as a process of fits and starts that I have recently seen. Even from a strictly historical perspective Jeff's article is unique. He had unprecedented access to Hoffmann in the form of daylong interviews at Cornell as well as unfettered access to Hoffmann's office. He has also interviewed many other important historical figures who were working in physical organic chemistry at the time, such as Andrew Streitwieser, George Whitesides and Jack Roberts; insightful and amusing quotes from all these people (such as Whitesides's reference to the demise of a computer at MIT implying that he would now have to perform calculations using an abacus or his toes) litter the account. And there are copious and fascinating images of scores of notebook pages from Hoffmann's research, as well as amusing and interesting letters to editors, lists of publications, scribblings in margins and other correspondence between friends and colleagues. Anyone who knows Jeff and has worked with him will be nodding their heads when they see how thorough the job here is.

The story begins when Woodward was already the world's most acclaimed organic chemist and Hoffmann was an up-and-coming theoretical chemistry postdoc at Harvard. Then as now, Hoffmann was the quintessential fox whose interests knew no bounds and who was eager to apply theoretical knowledge to almost any problem in chemistry that suited his interests. By then he had already developed Extended Hückel Theory (EHT), a method for calculating energies and orbitals of molecules which was the poster child for a model: imprecise, inaccurate, semiquantitative and yet pitched at the right level so that it could explain a variety of facts in chemistry. Woodward had already been interested in theory for a while and had worked on some theoretical constructs like the octant rule. It was a marriage made in heaven.

The most striking thing that emerges from Jeff's exhaustive and meticulous work is how relatively laid back Woodward and Hoffmann's research was. Hoffmann became aware of what was called the 'Woodward challenge' early in 1964 during an important meeting; this challenge involved the then mysterious stereochemical disposition of some well-known four- and six-electron reactions, reactions whose jargon ("electrocyclization", "conrotatory") has now turned into household banter for organic chemists. The conventional story would have had both Woodward and Hoffmann burning the midnight oil and persisting doggedly for the next few months until they cracked the puzzle like warriors on a quest. This was far from the case. Both pursued other interests, often traveled and only occasionally touched base. Why they did this is unclear, but then it's no more unclear than why humans do anything else for that matter. Once they realized that they could crack the puzzle, however, they kicked the door open. The paper that emerged in early 1965 was so long and comprehensive that they worried about its suitability for JACS in a cover letter to the editor.

Jeff's story also touches on a tantalizing conundrum whose solution many readers would have loved to know - E. J. Corey's potential role, or the lack thereof, in the conception of the rules, a role Corey unambiguously claimed in his 2004 Priestley Medal address, setting off a firestorm. Unfortunately Corey declined to talk to Jeff for this article (although he does dispute the timing of Woodward and Hoffmann's first meeting). His side of the story may never be known.

There is a lot of good stuff in the 45-page article that I can only mention in passing here. Many of the actual mechanistic and technical details would be of interest only to organic chemists. But the more general message should not be lost on non-specialist readers: science is a messy, almost always unheroic, haphazard process. In addition, its real story is often warped by malleable memory, shifting egos, mundane oversights and blind alleys. For a long time science was described in bestselling books and newspaper articles as a determined, heroic march to the truth. These days there is an increasing number of books aimed at uncovering science's massive storehouses of failure and ignorance. But there is a third view of science - that of a journey to the truth which is more mundane, more complex, perpetually puzzling because of its mystery and perpetually comforting because of its human nature.

In this case even Jeff's exhaustive research leaves us with a kaleidoscope of questions, questions that may well remain unanswered. These pertain to Woodward and Hoffmann's occasional indifference to what was clearly a pivotal piece of research, to Corey's claim about the reactions, to the potential cross-fertilization between whatever else Woodward and Hoffmann were doing during this time and the project in question, and to the insights they might have imbibed from the community at large. Jeff conjectures answers to these questions, but even his probing mind provides no comforting conclusions, probably because there are none. The quote from Roald Hoffmann with which the piece ends captures the humanity quite well:
"Life is messy. Science is not all straight logic. And all scientists are not always logical. We're just scrabbler for knowledge and understanding."
Here's to the messy scrabblers.