Field of Science

Prof. Erland Stevens's online medicinal chemistry course

This spring I had a post up about an online medicinal chemistry course taught by Prof. Erland Stevens of Davidson College. It looks like the course was received quite well, with 14,000 students signing up. Now it's going to be taught again, starting from October 13th. Here's some information about the course that Prof. Stevens sent along; from the list of topics it seems to cover most major aspects of the basic principles and practice of drug discovery from a chemistry perspective. Would definitely recommend it.

  • The edX course (Medicinal Chemistry) starts October 13th and runs eight weeks
  • Cost: free, with both free and for-pay certification options
  • Prerequisites: general chemistry (binding energies, intermolecular forces), some organic chemistry (line-angle structures and functional groups), knowledge of cell parts and functions, comfort with logarithmic and exponential equations
  • Time required: 1 hour of video per week; completing all assignments will require approx. 1 hour per day
  • Topics (approx. 1 wk each)
    (1) drug approval process (early drugs, clinical trials, IP factors)
    (2) enzymes and receptors (inhibition, Ki, types of ligands, Kd)
    (3) pharmacokinetics (Vd, CL, compartment models)
    (4) metabolism (phase I and II, genetic factors, prodrugs)
    (5) molecular diversity (drug space, combi chem, libraries)
    (6) lead discovery (screening, filtering hits)
    (7) lead optimization (FG replacements, isosteres, peptidomimetics)
    (8) important drug classes (selected examples)
  • Target audience: anyone with an interest in the structural basis of how drugs are designed

It's the most wonderful time of the year... #2014Nobels


...and the carolers are already out. Thomson Reuters already has its 2014 Nobel Prize prediction list out, and past tradition obliges me to respond. C&EN is also hosting a nice panel discussion on the prize at 3:30 PM EDT on September 30th, so if you want to contribute to the chest-thumping/eye-gouging polite discussion, feel free to join.

I was of course quite tickled when my own field won the prize last year, so I think it's safe to discount a theoretical or computational chemistry prize this year (I am thinking of you, David Baker; you'll just have to wait). Despite there being absolutely no evidence to support this, I continue to believe that the prize committee does try to preserve a modicum of diversity, so that they don't end up recognizing similar areas every year.

This year I will keep my list short. The past ten years have recognized theoretical/computational chemistry (1 out of 10), biochemistry/bioorganic chemistry (5 out of 10!), organic synthesis (2 out of 10), surface chemistry (1 out of 10) and quasicrystals (1 out of 10) so I am going to be partial to other fields this time. 

Top of my list is John Goodenough for being a co-developer of the lithium-ion battery. I have thought for a while now that this is a prize that has probably been neglected because it's too obvious (remember the prize for IVF?). But that's precisely what makes it so important; lithium-ion batteries are everywhere. Giving a prize for batteries will also be a recognition of the growing importance of energy, and especially non-fossil fuel based energy. Another way to recognize the importance of energy would be to award a prize to Michael Grätzel for the dye-sensitized solar cell.

Speaking of electrochemistry, other often-cited names are those of Allen Bard and Harry Gray, who have devoted their careers to studying electron transfer, albeit in different kinds of systems (Gray mainly in proteins). The last electron transfer prize was given to Rudolf Marcus in 1992, so another may be due. Personally I think that the work Bard and Gray have done is undoubtedly fascinating and important, but I am a little skeptical about whether it's broadly applicable enough to get top billing from the Nobel committee.

Another favorite of mine for this year is Ching Tang of the University of Rochester, who made key contributions to the organic light-emitting diode (OLED). Like the lithium-ion battery, the organic LED is a mainstay of modern life, so saluting its utility would be no surprise. There is also a chance that this contribution could be poached by the physicists. Thomson Reuters also has Tang on their list, and he received the 2011 Wolf Prize, which is almost as good as getting a Nobel.

Turning to what I think are less probable but still attractive choices, I think dendrimers (Frechet) and click chemistry (Sharpless) might stand a chance; the former more than the latter in terms of its longer life and more extensive validation in chemistry. Metal-organic frameworks (Yaghi and others) are also probably validated and used enough to be recognized. I don't really see a straight prize for organic synthesis or methodology, partly because two methodology prizes (palladium-catalyzed reactions and metathesis) have been awarded in the last decade. I would love to see a chemistry/medicine prize awarded to Carl Djerassi for his invention of the pill, but I increasingly get the feeling that that ship has sailed for good.

So that's my personal list for this year's chemistry prize - energy as a field is at the top, and LEDs, batteries and electron transfer are my favorite contenders.

Occasionally the prizes for chemistry and medicine overlap so it's worth speculating a bit on the medicine prize. In the "obvious" category are people like Alec Jeffreys (DNA fingerprinting), Leroy Hood and Craig Venter (the nuts and bolts of gene sequencing), Robert Langer (drug delivery) and a longstanding favorite of many people - nuclear receptors (Chambon and Evans; Elwood Jensen sadly passed away in 2012). Thomson Reuters is proposing David Julius who did important work on figuring out the receptors responsible for pain, and that seems like a good choice to me. More speculatively, I feel convinced that there are prizes for CLARITY and CRISPR sometime in the future, but not yet.

Although I can't predict who will win, one thing that I think we can all bet on is that someone will be woken up at the crack of dawn by a phone call, someone will inevitably be left out, and someone's ardent belief in the possibility that his or her favorite scientist was somehow hijacked by a rival field will be shored up by another data point.

I will make a list of other predictions as I hear them.

Update: Here's Everyday Scientist and here's the Pipeline.

What if the Manhattan Project had been like an Alzheimer's disease drug discovery project

The vast K-25 gaseous diffusion plant at Oak Ridge - an engineering endeavor (Image: Nuclear Secrecy Blog)
Every once in a while you will find someone comparing a major scientific or technological challenge to the Manhattan Project - among such comparisons are the Human Genome Project and the Brain Map Initiative. It's also not unheard of for drug discovery to be uttered in the same breath as the Manhattan Project; for instance, administrators and scientists have been calling for new antibiotic discovery to be placed on the same footing as the wartime quest to build the first atomic bomb.

To be honest, most of these comparisons obfuscate more than they instruct. It's not that they are entirely invalid, but whatever kernels of truth they contain are outweighed by fundamental differences between the two endeavors.

To see why, let's compare a typically challenging, novel drug discovery project like finding a cure or mitigating therapy for Alzheimer's disease to designing Fat Man or Little Boy. The mandate for the Manhattan Project was, "produce a practical military weapon that works by harnessing energy from nuclear fission of uranium or plutonium". The mandate for Alzheimer's disease would be "discover and develop a small molecule that mitigates or cures the symptoms of Alzheimer's disease and that is potent, safe, enters and exits the body in a reasonable period of time and causes minimal side effects".

The first thing to notice is the difference between the words "produce" and "discover". Manhattan was not a discovery project, and the word "produce" in fact appeared in the first line of 'The Los Alamos Primer', the indoctrination lectures given by physicist Robert Serber at the beginning of the adventure. Production is more akin to engineering than science, so that word sets the tone for the entire project. That is not to say that Manhattan did not involve science and scientists - of course it did. But the key thing to realize is that the basic discovery part of the science had been done between 1938 and 1942. This part was symbolized by four milestones: the discovery of fission in December 1938 by Hahn and Strassmann, the first 'proof of principle' calculations by Frisch and Peierls in March 1940 indicating the feasibility of a fission weapon, the working out of the actual mechanism and effects of a bomb in the summer of 1942 by Oppenheimer and his associates at Berkeley, and the successful initiation of a nuclear chain reaction by Fermi and his associates in Chicago in December 1942. By the time the Los Alamos laboratory opened in March 1943, the atomic constitution of matter was thus firmly mapped, and the elementary particles required to harness fission had all been discovered and their properties charted as either 'known knowns' or 'known unknowns'.

The major challenges associated with the project were thus not discovery challenges. It was fully known by the beginning of the project that if you suddenly bring a sufficiently large lump of highly purified uranium-235 together you would cause a very big bang. The key words there were 'large', 'purified' and 'suddenly' and these words really signaled the enormous engineering challenges. To produce a large and purified lump of uranium or plutonium would take vast chemical and engineering complexes in Oak Ridge, TN and Hanford, WA which employed hundreds of thousands of workers and whose use of resources like electricity and copper would rival the size of the US automobile industry. That was really mostly engineering, the unimaginably strenuous application of man and machine in separating two isotopes from each other and in creating a novel element.

The 'sudden' part of the challenge was equally important and again heavily steeped in engineering. For the uranium bomb this was not a big issue, since a modified heavy gun would do the job. The really novel find - and this would be classified as a discovery, albeit of an applied kind - was the mechanism of the plutonium weapon. In the summer of 1944 it was realized once and for all that, given the spontaneous fission rate of the Pu-240 contaminating the samples of Pu-239 produced in the Hanford reactor, a gun-type design that would work for uranium would simply cause an equivalent plutonium weapon to pre-detonate. The solution circumventing this problem - implosion using shaped 'lenses' - was probably the most novel discovery/invention to come out of the Manhattan Project. The novelty of this solution was why the plutonium bomb, but not the uranium bomb, was tested in July 1945. In fact implosion could be considered the one truly novel 'secret' to come out of the project.

But again, putting implosion into practice was almost completely engineering. Brilliant scientists like John von Neumann did contribute by calculating how to get a perfectly symmetrical, inward-moving shock wave to compress the plutonium core, but the key challenges were designing and fashioning the explosive lenses - layered arrangements of slow- and fast-burning charges that would alternately diverge and then precisely converge a shock wave - that would make implosion possible, as well as crafting detonators that would fire simultaneously on the surface of the weapon to send the shock waves in. To that end the metallurgists and chemists at Los Alamos were put to work refining these explosive arrangements from hot molds, shaping them and smoothing out even the tiniest imperfections in the form of air bubbles that would cause the shock waves to deviate from perfect symmetry. Similarly, experts in electronics were put to work inventing new timing systems for detonators. This work was all chemistry, chemical engineering, electronics and machine shop. It involved not exalted, Nobel Prize-winning minds but the most practical minds, minds which sometimes lacked even a college degree but which could work wonders with their hands (one of these minds happened to be that of David Greenglass, the machinist who later passed implosion secrets to the Soviets). In fact this part of the project was so important that many of the details are still classified, and rightly so. Master the design of explosive lenses and you command the explosive consequences of plutonium.

The culmination of all this engineering and science is well known, but the tone for that was set in 1943. No wonder Richard Feynman, when describing the Manhattan Project in his memoirs, called it "not science, mostly engineering". He was right.

Now what if the Manhattan Project had been like a novel drug discovery project for Alzheimer's disease? The physicists working on it would still be in the dark ages. The equivalent of fission in Alzheimer's would be the mechanism(s) that causes the disease, both on the molecular level and on a more global level. We don't know that yet. The Alzheimer's equivalent of protons, neutrons and electrons would be the molecular or epidemiological components that cause the disease. There are scattered clues about them, but we really don't know those yet either. Consider the rogue, misfolded protein Aβ (amyloid beta) for instance: ten years ago it was regarded as possibly the major culprit in the disease; now it is regarded simply as something associated with the disease in a major way, but nobody knows exactly how. Every major clinical trial promising to target interesting mechanisms and components in Alzheimer's has failed miserably over the last few years, which is probably not too surprising if we are ignorant of the real molecular components and are targeting the wrong mechanisms to begin with.

And even if we knew the mechanisms and the components, the ensuing development of an Alzheimer's drug would be no mere engineering challenge. That's because even the most basic processes of drug discovery - things like getting drugs across cell membranes, or even getting them to dissolve in aqueous solution - are too poorly understood to be predicted accurately. Thus the engineering part of drug discovery is far more tied to a woefully deficient understanding of the science than the engineering part of the Manhattan Project was. The implication is that because prediction is largely futile, we have to test tens of thousands of candidates in drug discovery to see what works. What if the Manhattan Project had needed to build thousands of bomb prototypes to find the one that finally worked? The difference between bomb design and drug design is thus not one of manpower, resources or engineering; it is one of a basic lack of information and deep, dark patches of ignorance. And it arises from the fundamental complexity of biological systems compared to engineering systems.

Thus, if the Manhattan Project were truly an Alzheimer's disease drug discovery project, the physicists working on it would have started not knowing about nuclear fission and not even knowing about protons, neutrons and electrons, let alone about cross sections or plutonium. Here's what their mandate would have looked like then: "Discover what stuff is made up of. Find if any of it can be manipulated to release large amounts of energy. And if you find this, then try to figure out if you can get enough of this special material to make a practical military weapon." In other words, there probably would not have been a mandate to build an atomic bomb in the first place.

However I am ready to take bets on whether this mandate - given to all those brilliant minds in 1942 - would have led to Little Boy or Fat Man by 1945.

Afterthought: So if the Manhattan Project was in fact mostly engineering, it's worthwhile asking why it's associated with science and scientists - and especially physics as opposed to chemistry which was equally important - in the public imagination. I believe the answer in one word is 'myth-making'. Men like Feynman and Oppenheimer are considered so brilliant and fascinating that they have inevitably come to stand in for the whole project.

Other posts on the complexities of biology and the futility of comparing drug discovery with engineering challenges or physics:

1. Why chemistry (and biology) is not physics.
2. Why drug design is like airplane design. And why it isn't.
3. Derek Lowe on what he calls the 'Andy Grove Fallacy': 1, 2.
4. Why it's hard to explain drug discovery to physicists.

Happy Birthday to the man who found life slipping away from his fingers



“In my quest for the secret of life I started my research in histology. Unsatisfied by the information that cellular morphology could give me about life, I turned to physiology. Finding physiology too complex, I took up pharmacology. Still finding the situation too complicated, I turned to bacteriology. But bacteria were even too complex, so I descended to the molecular level, studying chemistry and physical chemistry. After twenty years' work, I was led to conclude that to understand life we have to descend to the electronic level and to the world of wave mechanics. But electrons are just electrons and have no life at all. Evidently on the way I lost life; it had run out between my fingers.”
This quote comes from Albert Szent-Györgyi, born 121 years ago today. Szent-Györgyi was a Hungarian biochemist and Nobel laureate who discovered vitamin C and worked out many of the components of what we now call the Krebs cycle.

His quote is the best encapsulation I know of the limitations of reductionism, and of the non-reduction of biology to physics in particular. Szent-Györgyi zeroes in on the essential problem: as we drill down from unquestionably living cells to molecules to unquestionably non-living individual electrons, life somehow slips away somewhere during the transition, in between our microscopes, pipettes and petri dishes.

Exactly at what point it does this, and how, is a quest that will continue to occupy us for as long as there is a human mind capable of scientific reflection.

Happy Birthday to the man with five brains

Today is Murray Gell-Mann's birthday. John Brockman calls him "the man with five brains, each one of which is smarter than yours". We are thankful he is still with us and holding forth on a variety of important problems. Gell-Mann of course is famous as the man who, inspired by a line from a book that he would predictably be just the right person to have read, invented quarks. Nobody has observed an isolated quark yet, but there have been plenty of Nobel Prize-winning experiments confirming their existence through other incontrovertible means.

In the 1960s there was a famous running rivalry between Gell-Mann and Richard Feynman for the title of Smartest Man on the Planet (there was another lesser-known rivalry between Gell-Mann and Einstein biographer Abraham Pais, whom Gell-Mann once bitterly called "that little dwarf"). In terms of raw I.Q. the two New Yorkers certainly were equal to each other. It was the joint presence of these wunderkinder that made Caltech perhaps the most exciting place for theoretical physics in the 60s and 70s. Now most of the public believes Feynman won the contest, but that's probably because, as Gell-Mann put it, Feynman was inordinately fond of generating anecdotes about himself and making himself appear larger than life. The two worked together for a while and admired each other for the rest of their lives, but according to Gell-Mann he eventually got tired of what he thought was Feynman's preoccupation with self-promotion.

In some sense Gell-Mann was the perfect foil to Feynman's anecdote generator, just as Feynman was often the perfect foil to Gell-Mann's predilection for tossing out trivia. I would wager that Gell-Mann's quarks were at least as important as Feynman's quantum electrodynamics, laying the foundation for all the particle physics that followed, including the culmination of the efforts of many people in the Standard Model. Gell-Mann also made other important contributions to physics, including to current algebra and quantum chromodynamics. Ironically, famous rival as he was, it was Feynman who paid Gell-Mann the ultimate compliment: "Our knowledge of fundamental physics contains not one fruitful idea that does not carry the name of Murray Gell-Mann".

Gell-Mann would easily be Feynman's contender for the title of world's smartest man, not only because he racked up a tremendous amount of achievement in physics but because he also, in the words of his biographer George Johnson, seems to know everything about everything. Simply conquering physics was not enough for Gell-Mann; he wanted to conquer the depths of all human knowledge. His christening of quarks after a line in James Joyce's dense "Finnegans Wake" was no coincidence. The range of his intellectual faculties equaled that of Oppenheimer, and he can tell you as much about linguistics or classical history as he can about physics; it was his propensity to generously offer trivia about these topics that in part used to drive Feynman crazy. He is fluent in half a dozen languages and known for correcting native speakers' pronunciations of words from their own language. With such a prodigious command of the world's knowledge at his disposal, it's not surprising that he does not suffer fools gladly; he is known to walk out of meetings (like his august predecessor Wolfgang Pauli) if he thinks less of the speaker. I was not afraid of knocking on Freeman Dyson's door, but I would have to fortify my nerves with a few shots of gin before contemplating an encounter with Gell-Mann.

Regarding literature on or by Gell-Mann, you can do no better than George Johnson's "Strange Beauty", which along with James Gleick's "Genius" is the best physics biography I have ever read. Gell-Mann himself has found it notoriously painful to put pen to paper all his life, so it is no surprise that he almost disowned his own book - part scientific meditation, part memoir - after it was written. And yet I will recommend it warmly: in "The Quark and the Jaguar" Gell-Mann ruminates across a vast range of time and length scales of the cosmos, from quarks to humans to the entire universe. It has sparkling and remarkably clear discussions of topics like algorithmic complexity and quantum electrodynamics, and reading it feels akin to getting an intellectual workout in a swanky gym. With Gell-Mann holding forth on science and the universe, life is at least not dull.

And therefore the best tribute to the man with five brains would be a vodka martini, shaken with a generous helping of nature's fundamental forces and stirred with the ingredients of the cosmos, and held up with a full-throated cry of "Three quarks for Muster Mark!".

Modular drug design software?

The latest issue of C&EN has an interesting article (unfortunately subscription only) about how quantum chemists are making code for standard protocols in quantum chemistry calculations available to each other as off-the-shelf modules. The movement has been driven by the realization that whenever someone develops a new quantum chemistry program, he or she has to go through the tedious process of rewriting code for standardized algorithms like the Hartree-Fock method for calculating the potential energies of molecules. Why reinvent the wheel when you can simply buy it off the shelf at a centralized tire shop?

I like this idea and I applaud the quantum chemists for their generosity in sharing their code. But it left me wondering how soon something similar could happen in the world of computational drug design, or whether it would even be feasible.

The essence of methods like Hartree-Fock is that their highly iterative and standardized nature makes them instantly amenable to computation. Your code for Hartree-Fock may be faster and cleaner than the other fellow's, but the basic methodology, which can be captured in a well-defined flowchart, is not going to change. Contrast this with 'standard' drug design software protocols like docking, similarity searching and molecular dynamics (MD) calculations.

Even though the objective is the same in every case, every practitioner uses his or her own favorite technique; docking, for instance, can be physics-based or knowledge-based, or it may depend on genetic algorithms. The sampling algorithms in MD may similarly differ in every case. Docking and MD are thus not as 'standardized' as, say, the Hartree-Fock method, so it may be difficult to offer these protocols as standardized modules.
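To make the contrast concrete, what makes Hartree-Fock so module-friendly is that it is, at its heart, a fixed self-consistency loop: guess, build an operator from the guess, solve, and repeat until the output reproduces the input. Here is a toy sketch of that loop structure in Python; the "Fock builder" below is a made-up stand-in (a simple contractive map), not real quantum chemistry, so the point is the reusable loop, not the physics.

```python
# Generic self-consistent-field (SCF) skeleton. The loop is identical no
# matter whose Fock builder you plug in, which is why it is a natural
# off-the-shelf module. The 'Fock builder' here is a toy stand-in.

def scf_loop(build_fock, guess, tol=1e-8, max_iter=100):
    """Iterate x_new = build_fock(x) until self-consistency within tol."""
    x = guess
    for iteration in range(max_iter):
        x_new = build_fock(x)
        if abs(x_new - x) < tol:      # converged: output reproduces input
            return x_new, iteration + 1
        x = x_new
    raise RuntimeError("SCF failed to converge")

# Toy 'Fock builder': a contractive map with an easily verified fixed point.
toy_fock = lambda x: 0.5 * x + 1.0    # fixed point at x = 2.0

energy, n_iter = scf_loop(toy_fock, guess=0.0)
print(round(energy, 6))               # converges to 2.0
```

The same skeleton would accept a genuine Fock builder operating on matrices instead of scalars; only `build_fock` and the convergence test would change, which is exactly the separation of concerns that makes a shared module possible.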

However, I cannot see why it should not be possible to offer more specialized components that are in fact standard for the wider use of the community. For instance, certain force fields - the parameters and equations for calculating structure and energetics - are pretty standard; the MMFF force field will have one set of components and MM2 another. Similarly, within a protocol like MD, the precise sampling methods can be much more standardized than the overall package. So in principle these methods could be packaged as standardized modules and offered to users.
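A force-field component really is just a fixed functional form plus a published parameter table, which is what makes it shareable. A minimal Python sketch of one such component, a harmonic bond-stretch term of the generic form E = k(r - r0)^2; note that the parameter values below are illustrative placeholders, not actual MMFF or MM2 parameters.

```python
# A force-field 'module' is a functional form plus a parameter table.
# Shown here: the harmonic bond-stretch term E = k * (r - r0)^2 common to
# many force fields. Parameter values are placeholders, NOT real MMFF/MM2.

BOND_PARAMS = {
    ("C", "C"): {"k": 300.0, "r0": 1.53},  # kcal/mol/A^2, Angstrom (placeholders)
    ("C", "H"): {"k": 340.0, "r0": 1.09},
}

def bond_stretch_energy(atom1, atom2, r):
    """Harmonic bond-stretch energy for a bond of length r (Angstrom)."""
    p = BOND_PARAMS[tuple(sorted((atom1, atom2)))]
    return p["k"] * (r - p["r0"]) ** 2

# At the equilibrium length the term contributes nothing...
assert bond_stretch_energy("C", "C", 1.53) == 0.0
# ...and stretching the bond costs energy.
print(round(bond_stretch_energy("C", "H", 1.19), 6))   # 340 * 0.1^2 = 3.4
```

Swapping MMFF for MM2 would mean swapping the parameter table (and perhaps the functional form) while the calling code stays the same - the essence of a standardized module.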

The ideal situation for computational drug design would be an age where a variety of protocols ranging from quantum chemistry, docking and MD to homology modeling, gene and protein sequence comparison tools and toxicity and PK prediction algorithms would be available for any user to patch together, rearrange and deploy in the solution of his or her particular problem. 

Going even further, we could envisage an age where the tools of systems and computational biology are thoroughly ingrained in the drug discovery process so that one can now add standard systems tools to the toolbox; for instance, in such an age, not only would I be able to snatch standard docking protocols from a website but I would also be able to combine them with some kind of a wiring diagram of the protein which I am trying to target linked to its partners, so that I know exactly which partner hubs I should additionally dock my drug against in order to maximize its efficacy and minimize its toxicity. And who knows, maybe I can even get to a stage where I can download some kind of a minimalist but accurate model of an entire cell and observe how my drug will qualitatively perturb its network of organelles and signaling pathways.
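The 'wiring diagram' idea can be sketched in miniature: represent the interaction map as a graph and rank the target's partners by connectivity, so that the most highly connected hubs get the additional docking attention. The protein names and interactions below are invented purely for illustration; real interaction data would come from a pathway or interactome database.

```python
# Toy protein 'wiring diagram' as an adjacency map. All names and
# interactions here are invented for illustration only.

INTERACTIONS = {
    "TARGET": ["P1", "P2", "P3"],
    "P1":     ["TARGET", "P2", "P4", "P5"],  # well-connected hub
    "P2":     ["TARGET", "P1"],
    "P3":     ["TARGET"],
    "P4":     ["P1"],
    "P5":     ["P1"],
}

def partner_hubs(graph, target, min_degree=2):
    """Return the target's partners ranked by how many interactions each has.

    Highly connected partners ('hubs') are candidates for additional docking
    runs, since perturbing them propagates furthest through the network.
    """
    partners = graph[target]
    ranked = sorted(partners, key=lambda p: len(graph[p]), reverse=True)
    return [p for p in ranked if len(graph[p]) >= min_degree]

print(partner_hubs(INTERACTIONS, "TARGET"))   # ['P1', 'P2'] - P3 is peripheral
```

This is of course a cartoon of the real problem, but it shows how a systems-level module could hand a docking module a prioritized list of additional targets.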

For now this sounds like a pipe dream, although I suspect that the cultural barriers to sharing algorithms with commercial potential may be much harder to overcome than the scientific hurdles to actually incorporating systems biology in drug discovery and making the whole process modular. That's where the siren song of these socialist quantum chemists would be particularly relevant.

The 2014 Fields and Nevanlinna prizes: Celebrating diversity

"And if we cannot end now our differences, at least we can help make the world safe for diversity." - John F. Kennedy
An Iranian woman, a first- and a second-generation Indian, an Englishman and a Brazilian - most of them working in the United States. The 2014 Fields and Nevanlinna prizes celebrate diversity like no other.
Quanta Magazine has a wonderful set of profiles of this year's top math prize winners that are worth reading. 
Maryam Mirzakhani is especially notable as the first woman to win the prestigious prize. The profiles are accompanied by short videos. The prizewinners are a varied bunch whose interests and origins are spread across geography and mathematics. From topology to number theory, from geometry to chaos theory, they seem to have it all covered.

Diversity and bridge-building across nations and cultures have always been an important part of science - witness Eddington's confirmation of Einstein's general theory of relativity right after Germany and England had been embroiled in a catastrophic war. But in no field is this more apparent than in pure mathematics where people across the world can be connected purely by way of ideas, unencumbered by political or religious affiliations or commercial applications. Hopefully we can look forward to more such celebrations.