Field of Science

The flame of life and death: My favorite (insufferable) chemical reaction

For me, the most astounding thing about science has always been the almost unimaginably far-reaching and profound influence that the most trite truths about the universe can have on our existence. We may think that we are in charge of our lives through our seemingly sure control of things like food, water, energy and material substances, and we pride ourselves on our species' ability to stave off the worst ravages of the natural environment such as disease, starvation and environmental catastrophe. We have done such a good job of sequestering ourselves from the raw power of nature that it's all too easy to take our apparent triumph over the elements for granted. But the truth is that we are all without exception critically and pitifully beholden to a few numbers and a few laws of physics.

And a few simple chemical reactions. Which brings me to my favorite reaction for this month's blog carnival. It's a reaction so elementary that it will occupy barely a tenth of the space on a napkin or t-shirt and which could (and should) be productively explained to every human being on the planet. And it's a reaction so important that it both sustains life and very much has the potential to end it.

By now you might have guessed it. It's the humble combination of hydrocarbons with oxygen, known to all of us as combustion.

First the reaction itself, which is blindingly simple:

CnH2n+2 + (3n+1)/2 O2 → (n+1) H2O + n CO2 + Energy

That's all there is to it. There, in one line, is a statement about our world that packs at least as much information into itself as all of humanity's accumulated wisdom and follies. A hydrocarbon with a general formula CnH2n+2 reacts with oxygen to produce carbon dioxide, water and energy. That's it. You want a pithy, multifaceted (or two-faced, take your pick) take on the human condition, there you have it. While serving as the fundamental energy source for life and all the glory of evolution, it's also one that drives wars, makes enemies out of friends, divides and builds ties between nations and will without a doubt be responsible for the rise, fall and future of human civilization. Faust himself could have appeared in Goethe's dream and begged him to use this reaction in his great work.
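Skeptical readers can check the balance for themselves. Here's a quick sketch in Python (the `combustion_balance` helper is just a made-up name for illustration) that tallies atoms on each side of the equation for any chain length n:

```python
# Verify that CnH2n+2 + (3n+1)/2 O2 -> n CO2 + (n+1) H2O balances for any n.

def combustion_balance(n):
    """Atom counts on each side of the general alkane combustion equation."""
    left = {"C": n, "H": 2 * n + 2, "O": 3 * n + 1}           # (3n+1)/2 O2 supplies 3n+1 O atoms
    right = {"C": n, "H": 2 * (n + 1), "O": 2 * n + (n + 1)}  # n CO2 + (n+1) H2O
    return left, right

for n in (1, 4, 8):  # methane, butane, octane
    left, right = combustion_balance(n)
    assert left == right, f"unbalanced for n = {n}"
print("balanced for every n tested")
```

The algebra works out identically for every n, which is part of what makes the equation such a compact statement.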

First, the hydrocarbon itself. Humanity launched itself onto a momentous trajectory when it learnt how to dig carbon out of the ground and use it as fuel. Since then we have been biding our time for better or worse. The laws of quantum mechanics could not have supplied us with a more appropriate substance. Carbon in stable hydrocarbons is in its most reduced state, which means that you can get a bigger bang for your buck by oxidizing it compared to almost any other substance. What billions of controlled experiments over the years in oil and natural gas refineries and coal plants have proven is that you really can't do better than carbon when it comes to balancing energy density against availability, cost, ease of handling and transportation, and safety. In its solid form you can burn it to stay warm and to produce electricity; in its liquid form you can pump it into an incredibly efficient and compact gas tank. For better or worse we are probably going to be stuck with carbon as a fuel (although the energy source can wildly differ).

The second component of the chemical equation is oxygen. Carbon is very fortunate in not requiring a pure source of oxygen to burn; if it burned, say, only in an environment with 70% or more oxygen, modern civilization as we know it could never have gotten started. Air is good enough for combusting carbon. In fact the element can burn under a wide range of oxygen concentrations, which is a blessing because it means that we can burn it safely in a very controlled manner. Varying the amount of oxygen can also lead to different products and can minimize the amount of soot and toxic byproducts. The marriage of carbon and oxygen is a wonderfully tolerant and productive one, and we have gained enormously from this union.

The right side of the combustion equation is where our troubles begin. First off, water. It may seem like a trivial, harmless byproduct of the reaction but it's precisely its benign nature that allows us to use combustion so widely. Just imagine if the combustion of carbon had produced some godforsaken toxic substance in addition to carbon dioxide as a byproduct. Making energy from combustion would then have turned into a woefully expensive activity, with special facilities required to sequester the poisonous waste. This would likely have radically altered the global production and distribution of energy and human development would have been decidedly hampered. We may then have been forced to pick alternative sources of energy early on in our history, and the face of politics, economics and technology would consequently have been very different.

Moving on we come to what's almost ubiquitously regarded as a villain these days: carbon dioxide. If carbon dioxide were as harmless as water we would live in a very different world. Sadly it's not, and its properties again underscore the profound influence that a few elementary facts of physics and chemistry can have on our fate. The one property of CO2 that causes us so much agony is that it absorbs long-wavelength infrared radiation, thus warming the surroundings. This is not a post to discuss global warming, but it's obvious to anyone not living in a cave that the issue has divided the world like no other. We still don't know for sure what it will do, either by itself or because of the actions taken by human beings from merely perceiving its effects. But whatever it is, it will profoundly alter the landscape of human civilization for better or worse. We can all collectively curse the day that the laws of physics and chemistry decided to produce carbon dioxide as a product of combustion.

Finally we come to the pièce de résistance. None of this would have mattered if it weren't for the most important thing combustion produces: energy (in fact we wouldn't have been around to give a fig). In this context combustion is exactly like nuclear fission; twentieth-century history would have been very different if all uranium did was break up into two pieces. Energy production from combustion is what drives life and human greed. We stay alive by eating carbon-rich compounds - especially glucose - which are then burned in a spectacularly controlled manner to provide us with energy. The energy liberated cannot be used directly for our actions and thoughts. Instead it is used to construct devilishly clever chemical packages of ATP (adenosine triphosphate), which then serve as the energy currency.

Our bodies (and those of other creatures) are staggeringly efficient at squeezing oxidation-derived energy out of compounds like glucose; for instance in the aerobic oxidation of glucose, a single glucose molecule can generate 32 molecules of ATP. Put another way, the oxidation of a gram of glucose yields about 4 kilocalories of energy. This may not seem like a lot until we realize that the detonation of a gram of TNT yields only about 1 kilocalorie (the reason the latter seems so violent is because all the energy is liberated almost instantaneously). Clearly it is the all-important energy term in the combustion equation that has made life on earth possible. We are generously contributing to this term these days by virtue of quarter pounders and supersizing but our abuse does not diminish its importance.
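The arithmetic behind that comparison is simple enough to script. Here's a sketch using the approximate figures above (~32 ATP per glucose, ~4 kcal/g for glucose, ~1 kcal/g for TNT; the glucose molar mass of ~180 g/mol is the only number not stated in the text):

```python
# Back-of-the-envelope energetics: glucose oxidation vs. TNT detonation.

GLUCOSE_KCAL_PER_G = 4.0     # aerobic oxidation (figure from the text)
TNT_KCAL_PER_G = 1.0         # detonation (figure from the text)

ratio = GLUCOSE_KCAL_PER_G / TNT_KCAL_PER_G
print(f"Gram for gram, glucose releases ~{ratio:.0f}x the energy of TNT")

ATP_PER_GLUCOSE = 32         # aerobic yield per glucose molecule
GLUCOSE_MOLAR_MASS = 180.16  # g/mol (standard value, not from the text)
AVOGADRO = 6.022e23          # molecules per mole

atp_per_gram = ATP_PER_GLUCOSE * AVOGADRO / GLUCOSE_MOLAR_MASS
print(f"One gram of glucose buys ~{atp_per_gram:.1e} molecules of ATP")
```

The difference, of course, is rate: TNT dumps its kilocalorie in microseconds, while the cell meters out its four over many careful enzymatic steps.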

The same term of course is responsible for our energy triumphs and problems. Fossil fuel plants are nowhere near as efficient at extracting energy from carbon-rich hydrocarbons as our bodies, but what matters is whether they are cheap enough. It's primarily the cost of digging, transporting, storing and burning carbon that has dictated the calculus of energy. Whatever climate change does, of one thing we can be sure: we will continue to pay the cheapest price for our fuel. Considering the many advantages of carbon, it doesn't seem like anything is going to substitute for its extraordinarily fortuitous properties anytime soon. We will simply have to find some way to work around, over or through its abundance and advantages.

If we think about it then, the implications of combustion for our little planet and its denizens are overwhelming and sometimes it's hard to take it all in. At such times we only need to take a deep breath and remember the last words spoken by Kevin Spacey's character from "American Beauty":

"Sometimes I feel like I'm seeing it all at once, and it's too much, my heart fills up like a balloon that's about to burst... And then I remember to relax, and stop trying to hold on to it, and then it flows through me like rain and I can't feel anything but gratitude for every single moment of my stupid little life..."

That's right. Let's have it flow through us like rain. And watch it burn.


Rookie mistakes in molecular modeling: Part 1

Molecular modeling as a general approach is no longer utilized only by experts but has reached the masses. Improved hardware and software capabilities combined with easy-to-use graphical user interfaces have enabled experimental chemists of all kinds to build models of molecules and perform relatively sophisticated calculations on them. Calculations which once required supercomputers can now be routinely done on desktops by organic, inorganic and biological chemists who can use the results to explain, support and predict chemical phenomena. In the coming years we can be confident that we will witness an increasing use of modeling techniques by experimentalists.

An unfortunate (but probably not unexpected) consequence of this ease of use is that it has also become easier to make mistakes while building molecular structures. The main source of errors arises during the translation of 2D chemical structures to their 3D counterparts using some energy minimization protocol. The apparently simple process of drawing a 3D-worthy 2D structure is trickier than it sounds and is therefore quite prone to error. Conformation, which was not as important when drawing in 2D, is suddenly of overriding importance, and it's relatively easy to get it wrong.

As a modeler who has interacted closely with experiment, I have come across a number of rookie mistakes which I have seen myself and others make over the years. Sometimes these mistakes don't matter too much for the final result but sometimes they can completely change it. So I thought I would make a short list of easy-to-avoid errors which may provide checks on modeling structures. In part 1 I will describe mistakes commonly seen during the simple building of structures. Part 2 will deal with interactions with experimentalists.

1. Getting the ionization state wrong: I put this rookie mistake at the top because it's remarkable how many times I have seen even experienced modelers make it. Always remember: amines are protonated at physiological pH while carboxylic acids are deprotonated. Getting this right is important because it can completely change results from protocols like docking. Just think of the difference a protonated vs unprotonated carboxylate makes for binding to a protein. Also, many modeling algorithms use force fields which are dominated by electrostatic interactions; the wrong protonation state can therefore make a world of difference. A corollary of the ionization state problem results when replacing atoms. For instance you may have a protonated amine which you then want to turn into an alcohol by replacing the N with an O. Unfortunately the atom does change but not the ionization state, and you end up with a weird positively charged doubly bonded oxygen. On a related note, it goes without saying that you shouldn't charge up inappropriate atoms such as those which are conjugated to aromatic systems. The best way to overcome these issues is to simply display charges for all heteroatoms in your final structure.
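The rule of thumb follows straight from the Henderson-Hasselbalch equation. Here's a minimal sketch (the pKa values are typical textbook figures for an aliphatic amine and a carboxylic acid, and `fraction_protonated` is a made-up helper name, not any modeling package's API):

```python
# Henderson-Hasselbalch: for a base B + H+ <-> BH+,
# [BH+]/[B] = 10**(pKa - pH). The same form works for acids (HA vs A-).

def fraction_protonated(pka, ph):
    """Fraction of molecules carrying the extra proton at a given pH."""
    ratio = 10 ** (pka - ph)   # [protonated] / [deprotonated]
    return ratio / (1 + ratio)

PH = 7.4  # physiological pH
print(f"Aliphatic amine (pKa ~10.6): {fraction_protonated(10.6, PH):.2%} protonated")
print(f"Carboxylic acid (pKa ~4.2): {1 - fraction_protonated(4.2, PH):.2%} deprotonated")
```

With pKa values roughly three units away from 7.4 on either side, both species are >99.9% ionized at physiological pH, which is why drawing the neutral forms is almost always wrong.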

2. Getting the stereochemistry wrong: The CIP rules were taught to us because they really matter. Here's a typical stereochemical mistake: You construct a structure in 2D and come across a stereocenter. You may even build that stereocenter with the right absolute (R or S) stereochemistry. And then you attach something else to that center and forget to recheck the stereochemistry which may have changed because of the change in CIP priority. The simplest way to make sure about stereochemistry is to always have the program display all absolute stereochemistry for the final structure.

3. Forgetting basic conformational rules: This mistake is most commonly made when converting a 2D structure into a 3D structure. The problem is that when you build a 2D structure, your placement of bonds and angles is somewhat ad hoc, based on the rather random way in which you are conveniently rotating and viewing the structure. When you then suddenly convert 2D to 3D, you may end up with axial substituents on six-membered rings, syn-pentane or eclipsing interactions between substituents, funky substructures like non-planar aromatic rings resulting from strain or, in the worst cases, even boats for cyclohexanes. Here's another common pitfall: You may try to close a ring by building an unrealistically long bond between two initially separated atoms, thinking that when you then minimize this structure the program will take care of the bond by shortening it to its standard length. This usually happens, but in the process some other part of your molecule gets messed up. Again, judicious inspection can avoid most of these issues.

4. Cis and trans: The process of building unrealistic bonds between distant atoms and then simply minimizing a structure that I just mentioned can sometimes result in amide bonds becoming cis, and this is important enough to be listed as a separate point. This is also a common consequence of importing 2D files in SDF format (which lack hydrogens) and asking a program to add hydrogens. The same thing can happen with double bonds.

5. Forgetting basic chemistry: This mistake has more to do with forgetting basic rules of bonding and chemistry than with modeling. Occasionally you may do things like exceeding the allowed valency of an atom, putting a double bond at a bridgehead carbon (violating Bredt's rule), generating antiaromatic rings, forgetting Baldwin's rules for ring closure, building a vinyl amine or a geminal amino alcohol...and generally creating all sorts of unstable and "impossible" molecules. The problem is that your program won't always raise red flags notifying you about these errors, so you need to remember your chemistry and make sure you don't recommend some wacky molecule to the synthetic chemist (one of the constant sources of friction between experimentalists and modelers arises from the latter forgetting what's synthetically feasible and stable).
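Since your program won't always catch these for you, even a crude script can flag the grossest valence violations. Here's a toy sketch (the adjacency-list representation and the `check_valences` helper are simplifications of mine; it ignores formal charges and hypervalent elements entirely):

```python
# Toy valence sanity check: flag atoms whose total bond order exceeds
# the usual allowed valence for a neutral atom (a deliberate simplification).

ALLOWED_VALENCE = {"C": 4, "N": 3, "O": 2, "H": 1}

def check_valences(bonds):
    """bonds: list of (atom1, atom2, bond_order) with atoms named 'C1', 'O2', ...
    Returns the atoms that exceed their element's allowed valence."""
    totals = {}
    for a1, a2, order in bonds:
        totals[a1] = totals.get(a1, 0) + order
        totals[a2] = totals.get(a2, 0) + order
    return [atom for atom, total in totals.items()
            if total > ALLOWED_VALENCE[atom.rstrip("0123456789")]]

# A "carbon" with five single bonds -- the kind of impossible structure
# a careless build can quietly produce:
bad = [("C1", "H1", 1), ("C1", "H2", 1), ("C1", "H3", 1),
       ("C1", "H4", 1), ("C1", "H5", 1)]
print(check_valences(bad))  # prints ['C1']
```

A real cheminformatics toolkit does this (and much more) during structure sanitization, but the point stands: the check is mechanical, and there's no excuse for letting a pentavalent carbon reach the synthetic chemist.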

Ultimately, the path to a well-constructed molecule simply depends on being vigilant and judiciously checking your final structure. Remember the well-worn adage: computers don't know any chemistry whatsoever, and they are only as good as the code that goes into them. Nothing can trump a sound knowledge of basic chemical principles.

In which creationists' understanding of amyloid appears...tangled

The biophysicist David Eisenberg of UCLA recently published a paper in which his group surveyed what they called the "amylome", the set of all possible proteins that can potentially form the deadly amyloid aggregate implicated in diseases like Alzheimer's. I haven't read the whole paper yet and will describe it in another post but it has some very intriguing conclusions (see the Nature News piece).

Eisenberg's group end up finding common segments possessing amyloid-forming propensity in pretty much every protein, given the right conditions. Not surprisingly, these segments are mostly kept tucked inside protein cores; if they become exposed, the proteins are refolded by chaperones or eliminated as non-functional. This ties in with Chris Dobson's work which I described in a recent post. Dobson's most recent paper seemed to conclude that amyloid is actually the most thermodynamically stable state of a protein, with "normal" protein states being metastable.

All this is fascinating stuff, but you can always trust creationists to put an anti-evolutionist spin on almost any scientific finding. Someone named Cornelius Hunter asserts on his blog that the fact that the most stable state of proteins seems to be amyloid, and that most real proteins don't actually exist in this state, seems to be a kind of miracle, or at the very least indicates the enormous difficulties attendant on creating complex biomolecular structures.

Mr. Hunter seems to be indulging in a common fallacy: that of assuming that evolution somehow tends to an ideal. This is just not the way the process works. Considering the stringent constraints and time in which evolution has to work, it can only explore the available space of solutions and not the entire possible space. It can never achieve the best possible result in solution space, only one that is good enough under the given circumstances. I don't know if it's really that hard to understand or whether creationists like Mr. Hunter want to deliberately obfuscate the issue. Evolution can only work on what's already available; it can only mix and match existing motifs, and what exists need not be perfect at all. As an aside, this aspect of evolution reminds me of the concept of "satisficing" or "bounded rationality" in economics; lacking perfect knowledge of all solutions and unlimited time, economic actors like us can only pick solutions optimal within current constraints, not "global" maxima on the solution landscape.

There's therefore no reason to believe that biological evolution should create the most thermodynamically stable state of a protein. Creating a state that is functionally relevant is enough, even if it's thermodynamically metastable. In fact one can even make an argument that a thermodynamically superstable state might lead to an evolutionary dead end since it will be hard to tinker with. Given such constraints it's indeed impressive that evolution creates enzymes speeding up chemical reactions by twelve orders of magnitude, but even these enzymes are few and probably not the best possible in all of enzyme space.


So yes, the fact that it's amyloid and not the normal state of proteins that is the most thermodynamically stable one is fascinating, but in no way does this present a great challenge to evolution. In fact it reinforces evolution's essential character, to hunker down and make do as well as it can under the given circumstances. Don't we all?

2011 Nobel Prizes


So it's that time of the year again, the time when just like Richard Feynman and Paul Dirac, three select individuals get to mull over whether they will incur more publicity by accepting the Nobel Prize or rejecting it.

Predicting the Nobel Prizes gets easier every year (I said predicting, not getting your predictions right) since there's very little you can add to the previous year's list, although there are a few changes; the Plucky Palladists can now happily be struck off the list. As before, I am dividing categories into 'easy' and 'difficult' and assigning pros and cons to every prediction.

The easy ones are those regarding discoveries whose importance is (now) ‘obvious’; these discoveries inevitably make it to lists everywhere each year and the palladists clearly fell into this category. The difficult predictions would either be discoveries which have been predicted by few others or ones that are ‘non-obvious’. But what exactly is a discovery of ‘non-obvious’ importance? Well, one of the criteria in my mind for a ‘non-obvious’ Nobel Prize is one that is awarded to an individual for general achievements in a field rather than for specific discoveries, much like the lifetime achievement Academy Awards given out to men and women with canes. Such predictions are somewhat harder to make simply because fields are honored by prizes much less frequently than specific discoveries.

When predicting the Nobel prize it's also prudent to be cognizant of discoveries whose recognition makes you go "Of course! That's obvious". Prizes for the charge-coupled device (CCD) (2009), the integrated chip (2000) and in-vitro fertilization (2010) fall into this category.

Anyway, here's the N-list.

CHEMISTRY:

2. Computational chemistry and biochemistry (Difficult):
Pros: Computational chemistry as a field has not been recognized since 1999 so the time seems due. One obvious candidate would be Martin Karplus. Another would be Norman Allinger, the pioneer of molecular mechanics.
Cons: This would definitely be a lifetime achievement award. Karplus did do the first MD simulation of a protein ever but that by itself wouldn’t command a Nobel Prize. The other question is regarding what field exactly the prize would honor. If it’s specifically applications to biochemistry, then Karplus alone would probably suffice. But if the prize is for computational methods and applications in general, then others would also have to be considered, most notably Allinger but perhaps also Ken Houk who has been foremost in applying such methods to organic chemistry. Another interesting candidate is David Baker whose program Rosetta has really produced some fantastic results in predicting protein structure and folding. It even spawned a cool game. But the field is probably too new for a prize and would have to be further validated.

3. Chemical biology and chemical genetics (Easy)
Another favorite for years, with Stuart Schreiber and Peter Schultz being touted as leading candidates.
Pros: The general field has had a significant impact on basic and applied science
Cons: This again would be more of a lifetime achievement award which is rare. Plus, there are several individuals in recent years (Cravatt, Bertozzi, Shokat) who have contributed to the field. It may make some sense to award Schreiber a ‘pioneer’ award for raising ‘awareness’ but that’s sure going to make a lot of people unhappy. Also, a prize for chemical biology might be yet another one whose time has just passed.

4. Single-molecule spectroscopy (Easy)
Pros: The field has obviously matured and is now a powerful tool for exploring everything from nanoparticles to DNA. It's been touted as a candidate for years. The frontrunners seem to be W. E. Moerner and M. Orrit, although Richard Zare has also been floated often.
Cons: The only con I can think of is that the field might yet be too new for a prize.

5. Electron transfer in biological systems (Easy)
Pros: Another field which has matured and has been well-validated. Gray and Bard seem to be leading candidates.

Among other fields, I don’t really see a prize for the long lionized birth pill and Carl Djerassi; although we might yet be surprised, the time just seems to have passed. Then there are fields which seem too immature for the prize; among these are molecular machines (Stoddart et al.) and solar cells (Gratzel).

MEDICINE:

1. Nuclear receptors (Easy)
Pros: The importance of these proteins is unquestioned. Most predictors seem to converge on the names of Chambon/Jensen/Evans.

2. Chaperones: (Easy)
Arthur Horwich and Franz-Ulrich Hartl just won this year's Lasker Award for their discovery of chaperones. Their names have been high on the list for some time now.
Pros: Clearly important. Chaperones are not only important for studying protein folding on a basic level but in the last few years the malfunctioning of chaperones such as heat-shock proteins has been shown to be very relevant to diseases like cancer.
Cons: Too early? Probably not.

3. Statins (Difficult)
Akira Endo’s name does not seem to have been discussed much. Endo discovered the first statin. Although this particular compound was not a blockbuster drug, since then statins have revolutionized the treatment of heart disease.
Pros: The “importance” as described in Nobel’s will is obvious since statins have become the best-selling drugs in history. It also might be a nice statement to award the prize to the discovery of a drug for a change. Who knows, it might even boost the image of a much maligned pharmaceutical industry...
Cons: The committee is not really known for awarding actual drug discovery. Precedents like Alexander Fleming (antibiotics), James Black (beta blockers, antiulcer drugs) and Gertrude Elion (immunosuppressants, anticancer agents) exist but are few and far between. On the other hand this fact might make a prize for drug discovery overdue.

4. Genomics (Difficult)
A lot of people say that Venter should get the prize, but it’s not clear exactly for what. Not for the human genome, which others would deserve too. If a prize was to be given out for synthetic biology, it’s almost certainly premature. Venter’s synthetic organisms from last year may rule the world, but for now we humans still prevail. On the other hand, a possible prize for genomics may rope in people like Carruthers and Hood who pioneered methods for DNA synthesis.

5. DNA fingerprinting and synthesis (Easy)
Now this seems to me to be very much a field from the "obvious" category. The impact of DNA fingerprinting and Western and Southern Blots on pure and applied science- everything from discovering new drugs to hunting down serial killers- is at least as big as the prizeworthy PCR. I think the committee would be doing itself a favor by honoring Jeffreys, Stark, Burnette and Southern.

And while we are on DNA, I think it's also worth throwing in Marvin Caruthers whose technique for DNA synthesis really transformed the field. In fact it would be nice to award a dual kind of prize for DNA- for both synthesis and diagnosis.

Cons: Picking three might be tricky.

6. Stem Cells (Easy)
This seems to be yet another favorite. McCulloch and Till are often listed. Unfortunately McCulloch died earlier this year so it would be a little unfair to award just Till. However such a thing is not unprecedented. For example, the psychologist Daniel Kahneman shared the 2002 Economics Nobel Prize with Vernon L. Smith. Left out was his long-time collaborator Amos Tversky, who had died in the 90s; it's pretty much regarded as a given that Tversky would have shared the prize had he been alive.
Pros: Surely one of the most important biological discoveries of the last 50 years, promising fascinating advances in human health and disease.
Cons: Politically controversial (although we hope the committee can rise above this). Plus, a 2007 Nobel was awarded for work on embryonic stem cells using gene targeting strategies so there’s a recent precedent.

7. Membrane vesicle trafficking (Easy)
Rothman and Schekman
Pros: Clearly important. The last trafficking/transport prize was given out in 1999 (Blobel) so another one is due, and Rothman and Schekman seem to be the most likely candidates. Plus, they have already won the Lasker Award, which in the past has been a good indicator of the Nobel.

8. GPCR structures (Difficult)
A commenter reminded me of this. When the latest GPCR structure (the first one of a GPCR bound to a G protein) came out I remember remarking that Kobilka, Stevens and Palczewski are probably up for a prize sometime.
Palczewski solved the first structure of rhodopsin and Stevens and Kobilka have been churning out structure after important structure over the last decade, including the first structure of an active receptor along with several medicinally important ones including the dopamine D3 and CXCR4 receptors. These feats are definitely technical tour de forces.
Pros: GPCRs are clearly important for basic and applied science, especially drug discovery, where 30% of drugs already target these proteins.
Cons: Too early.

PHYSICS

I am not a physicist
But if I were
I would dare
To shout from my lair
“Give Hawking and Penrose the Prize!”
For being rock stars of humungous size

Also, Anton Zeilinger, John Clauser and Alain Aspect deserve it for bringing the unbelievably weird phenomenon of quantum entanglement to the masses. Zeilinger's book "Dance of the Photons" presents an informative and revealing account of this work.

I have also always wondered whether non-linear dynamics and chaos deserves a prize. The proliferation and importance of the field certainly seems to warrant one; the problem is that there are way too many deserving recipients (and Mandelbrot is dead).

Other predictions: Canine Ed, Sam@EverydayScientist