More model perils; parametrize this

Now here's a very interesting review article that puts into perspective some of the pitfalls of models that I have mentioned on these pages. The article is by Jack Dunitz and his long-time colleague Angelo Gavezzotti. Dunitz is in my opinion one of the finest chemists and technical writers of the last half century, and I have learnt a lot from his articles. Two that are on my "top 10" list are his article showing the entropic gain accrued by displacing water molecules in crystals and proteins (a maximum of 2 kcal/mol for a strongly bound water) and his paper demonstrating that organic fluorine rarely, if ever, forms hydrogen bonds.

In any case, in this article he talks about an area in which he is the world's acknowledged expert: organic crystal structures. Understanding and predicting (the horror!) crystal structures essentially boils down to understanding the forces that make molecules stick to each other. Dunitz and Gavezzotti describe theoretical and historical attempts to model forces between molecules, and many of their statements about the inherent limitations of modeling these forces rang as loudly in my mind as the bell in Sainte-Mère-Église during the Battle of Normandy.

Dunitz has a lot to say about atom-atom potentials, which are the most popular framework for modeling inter- and intramolecular interactions. Basically, such potentials assume simple functional forms that model the attractive and repulsive interactions between atoms, which are treated as rigid balls. This is of course also the fundamental approximation behind molecular mechanics and force fields. The interactions are basically Coulombic interactions (relatively simple to model) and more complicated dispersion interactions, which are essentially quantum mechanical in nature. The real and continuing challenge is to model these weak dispersive interactions.
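
To make this concrete, here is a minimal sketch of what an atom-atom potential of the classic exp-6-plus-Coulomb type looks like in code. The functional form is the standard one; the charges and the A, B, C coefficients below are invented for illustration, not taken from any published force field:

```python
import math

# One atom-atom pair energy in the exp-6-plus-Coulomb form:
#   E(r) = k*qi*qj/r + A*exp(-B*r) - C/r**6
# i.e. a simple Coulomb term, an exponential repulsion, and an
# r^-6 dispersion term. Parameters are placeholders for illustration;
# real force fields fit A, B, C (and the charges) to crystal data.

K_COULOMB = 332.06  # kcal*A/(mol*e^2): converts q*q/r into kcal/mol

def pair_energy(r, qi, qj, A=42000.0, B=3.6, C=580.0):
    """Energy (kcal/mol) of one atom pair at separation r (angstrom)."""
    coulomb = K_COULOMB * qi * qj / r
    repulsion = A * math.exp(-B * r)
    dispersion = -C / r**6
    return coulomb + repulsion + dispersion

# A lattice or dimer energy is then just a sum over intermolecular pairs:
# E_total = sum(pair_energy(r_ij, q_i, q_j) for all pairs i, j)
for r in (3.0, 3.5, 4.0, 5.0):
    print(f"r = {r:.1f} A  E = {pair_energy(r, -0.2, 0.1):+.3f} kcal/mol")
```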

But the problem is fuzzy. As Dunitz says, atom-atom potentials are popular mainly because they are simple in form and easy to calculate. However, they have scant connection, if any, to "reality". This point cannot be stressed enough. As this blog has noted several times before, we use models because they work, not because they are real. The coefficients in the functional forms of the atom-atom potentials are essentially varied to minimize the potential energy of the system, and there are several ways to skin this cat. For instance, atomic point charges are rather arbitrary (and definitely not "real") and can be calculated and assigned by a variety of theoretical approaches. In the end, nobody knows whether the final values, or even the functional forms, have much to do with the real forces inside crystals. It all comes down to the parameterization that gives you the answer, and while parameterization may seem like a magic wand that gives you anything you want, that is precisely its problem: it may give you anything you want without reproducing the underlying reality. Overfitting is a related and constant headache, in my opinion one of the biggest problems with any modeling, whether in chemistry, quantitative finance or atmospheric science. More on that later.
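
Since overfitting will come up again, a toy example (unrelated to any particular force field) shows why it is so insidious: a model with enough adjustable coefficients can thread every training point and still fail on data it has never seen.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ten noisy "training" observations of a simple underlying law, y = 2x + 1.
x_train = np.linspace(0, 1, 10)
y_train = 2 * x_train + 1 + rng.normal(0, 0.2, x_train.size)

# Fresh data from the same law, never seen during fitting.
x_test = np.linspace(0.05, 0.95, 10)
y_test = 2 * x_test + 1 + rng.normal(0, 0.2, x_test.size)

for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    rmse_train = np.sqrt(np.mean((np.polyval(coeffs, x_train) - y_train) ** 2))
    rmse_test = np.sqrt(np.mean((np.polyval(coeffs, x_test) - y_test) ** 2))
    print(f"degree {degree}: train RMSE {rmse_train:.3f}, test RMSE {rmse_test:.3f}")

# The degree-9 polynomial passes through every training point (train RMSE
# near zero) but does worse on the new data: it has fit the noise, not
# the underlying reality.
```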

An accurate treatment of intermolecular forces has to take electron delocalization into consideration. The hardest part to deal with is the region close to the bottom of the famous van der Waals energy curve, where there is an extremely delicate balance between repulsion and attraction. Naturally one thinks of quantum mechanics to handle such fine details, and a host of sophisticated methods have been developed to calculate molecular energies and forces. But those who think QM will take them to heaven may be mistaken; it may in fact take them to hell.
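
How delicate that balance is can be shown numerically. The sketch below uses a generic Lennard-Jones 12-6 well with made-up (though plausibly sized) parameters; near the minimum the attractive and repulsive terms cancel so nearly that the energy changes by only hundredths of a kcal/mol over tenths of an angstrom.

```python
# Lennard-Jones 12-6 well with epsilon = 0.3 kcal/mol, sigma = 3.4 A.
# The values are merely of the right order of magnitude for a pair of
# carbon-like atoms; the minimum sits at r_min = 2**(1/6) * sigma.
eps, sigma = 0.3, 3.4

def lj(r):
    return 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

r_min = 2 ** (1 / 6) * sigma
for dr in (-0.2, -0.1, 0.0, 0.1, 0.2):
    r = r_min + dr
    print(f"r = {r:.2f} A  E = {lj(r):+.4f} kcal/mol")
# Near the bottom the curve is almost flat: getting the depth and position
# of this minimum right is exactly where small errors in a model hurt most.
```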

Let's start with the basics. In any QM calculation one uses a certain theoretical framework and a certain basis set to represent atomic and molecular orbitals, and one then adds terms to the basis set to improve accuracy. Consider Hartree-Fock theory. As Dunitz says, it is essentially useless for dealing with dispersion, because dispersion is an electron correlation effect that Hartree-Fock does not capture, no matter how large a basis set you use. More sophisticated methods have names like "second-order Møller-Plesset perturbation theory" (MP2), but these may greatly overestimate the interaction energy, and, more importantly, the calculations become hideously computer-intensive for anything more than the simplest molecules.
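
The formal cost scalings are well known: Hartree-Fock grows roughly as N^4 in the number of basis functions, MP2 as N^5, and coupled cluster with perturbative triples, CCSD(T), as N^7. A two-line calculation shows how quickly that bites:

```python
# Doubling the size of the system multiplies the work by 2**exponent.
methods = {"HF": 4, "MP2": 5, "CCSD(T)": 7}
for name, p in methods.items():
    print(f"{name:8s} doubling N costs {2 ** p:4d}x more compute")
# HF: 16x, MP2: 32x, CCSD(T): 128x. This is why "hideously
# computer-intensive" arrives so quickly beyond the simplest molecules.
```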

True, there are "model systems" like the benzene dimer (which has been productively beaten to death) for which extremely high levels of theory approach experimental accuracy within a hairsbreadth. But firstly, model systems are just that, model systems; the benzene dimer is not exactly a molecular arrangement that real-life chemists deal with all the time. Secondly, a practical chemist would rather have an accuracy of 1 kcal/mol for a large system than an accuracy of 0.1 kcal/mol for a simple system like the benzene dimer. Thus, while MP2 and related methods may give you unprecedented accuracy for some model systems, they are too expensive for most systems of biological interest to be very useful.

DFT still seems to be one of the best techniques around for dealing with intermolecular forces. But "classical" DFT suffers from a well-known inability to treat dispersion. "Parameterized", dispersion-corrected DFT, in which an inverse sixth-power term is added to the basic equations, can work well and promises to be a very useful addition to the theoretical chemist's arsenal. More parameterization, though.
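
The flavor of such a correction is easy to convey. The sketch below follows the general Grimme-style recipe of a damped -C6/r^6 term added on top of the DFT energy; the s6, d and per-pair numbers are placeholders, not production parameters.

```python
import math

def dispersion_correction(pairs, s6=0.75, d=20.0):
    """Sum of damped -C6/r^6 terms, to be added to a DFT energy.

    pairs: (r, c6, r_vdw) tuples giving, for each atom pair, the
    separation, the dispersion coefficient, and the summed van der
    Waals radii. The damping function switches the correction off at
    short range, where DFT already describes the overlapping densities.
    """
    e_disp = 0.0
    for r, c6, r_vdw in pairs:
        f_damp = 1.0 / (1.0 + math.exp(-d * (r / r_vdw - 1.0)))
        e_disp += -s6 * c6 / r ** 6 * f_damp
    return e_disp

# E(DFT-D) = E(DFT) + dispersion_correction(...); illustrative numbers:
print(dispersion_correction([(3.8, 30.0, 3.6), (4.5, 24.0, 3.4)]))
```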

And yet, as Dunitz points out, problems remain. Even if one can accurately calculate the interaction energy of the benzene dimer, it is not really possible to know how much of it comes from dispersion and how much comes from higher-order terms. Atom-atom potentials are happiest calculating interaction energies at large distances, where the Coulomb term is pretty much the only one that survives. But at the small interatomic distances that are of most interest to the chemist and the crystallographer, a complex dance of attraction and repulsion, of monopoles, dipoles, multipoles and overlapping electron clouds, manifests itself. The devil himself would have a hard time calculating interactions in these regions.

The theoretical physicist turned Wall Street quant Emanuel Derman (author of the excellent book "My Life as a Quant: Reflections on Physics and Finance") says that one of the problems with financial modelers on Wall Street is that they suffer from "physics envy". Just like physicists, they want to discover three laws that govern 99% of the financial world. Predictably, as Derman says, they end up discovering 99 laws that seem to govern 3% of the financial world, with varying error margins. I would go a step further and say that even physics is accurate only in the limit of ideal cases, and this deviation from absolute accuracy shows distinctly in theoretical chemistry. Just consider that the Schrödinger equation can be solved exactly only for the hydrogen atom, which is where chemistry only begins. For anything more complicated than that, even the most austere physicist cannot help but approximate, parametrize, and constantly struggle with errors and noise. As much as the theoretical physicist would like to tout the Platonic purity of his theories, their practical applications without exception involve much approximation. There is a reason why that pinnacle of twentieth-century physics is called the Standard Model.

I would say that computational modelers in virtually every field, from finance to climate change to biology and chemistry, suffer from what Freeman Dyson has called "technical arrogance". We have made enormous progress in understanding complex systems in the last fifty years, and yet when it comes to modeling the stock market, the climate or protein folding, we seem to think that we know it all. But we don't; far from it. Until we do, all we can do is parametrize, and try to avoid the fallacy of equating our models with reality.

That's right, Dorothy. Everything is a model. Let's start with the benzene dimer.

Dunitz, J. D., & Gavezzotti, A. (2009). How molecules stick together in organic crystals: weak intermolecular interactions. Chemical Society Reviews, 38(9). DOI: 10.1039/b822963p

California axes science

From the NYT:
As the University of California struggles to absorb its sharpest drop in state financing since the Great Depression, every professor, administrator and clerical worker has been put on furlough amounting to an average pay cut of 8 percent.

In chemistry laboratories that have produced Nobel Prize-winning research, wastebaskets are stuffed to the brim on the new reduced cleaning schedule. Many students are frozen out of required classes as course sections are trimmed.

And on Thursday, to top it all off, the Board of Regents voted to increase undergraduate fees — the equivalent of tuition — by 32 percent next fall, to more than $10,000. The university will cost about three times as much as it did a decade ago, and what was once an educational bargain will be one of the nation’s higher-priced public universities.

There was a time when people went to Berkeley for the lower tuition. It seems the last refuges of affordable education are gradually eroding away.

What did you say the error was??

I was looking at some experimental data for drug molecules binding to a pharmaceutically relevant protein.

The numbers were reported as percentages of binding relative to a standard defined to be 100%. Here's how they looked:

97.3 ± 68.4
79.4 ± 96.1
59.5 ± 55.3
1.4 ± 2.5
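
A few lines of arithmetic, using nothing but the four reported values, make plain just how uninformative these numbers are:

```python
# (value, reported error) for each measurement, in percent binding.
data = [(97.3, 68.4), (79.4, 96.1), (59.5, 55.3), (1.4, 2.5)]

for value, err in data:
    low, high = value - err, value + err
    print(f"{value:5.1f} +/- {err:5.1f} -> anywhere from {low:6.1f} to "
          f"{high:6.1f} ({100 * err / value:.0f}% relative error)")

# 79.4 +/- 96.1 spans -16.7 to 175.5: that "79.4% binder" is statistically
# indistinguishable from a compound that does not bind at all.
```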

Seriously, how did the reviewers allow this to go through?

A new look, with hair combed and shoes brushed

As you may have noticed, I have transitioned to a spanking new look at Field of Science (FoS), thanks to Edward's invitation, and am loving it. I also join a super team of fellow bloggers whom I hope to read regularly. You won't have to update your bookmarks, since you will be automatically redirected here when you click on the old link.

I have to admit that after exercising my primitive blog management skills for the last five years, this feels like warp speed and spinach combined.

A hermitian operator in self-imposed exile

Perfect Rigor: A Genius and the Mathematical Breakthrough of the Century
Masha Gessen (Houghton Mifflin Harcourt, 2009)

Pure mathematicians have a reputation for being otherworldly and divorced from practical matters. Grisha (Grigory) Perelman, the Russian mathematician who at the turn of this century solved one of the great unsolved problems in mathematics, the Poincaré Conjecture, is sadly, or perhaps appropriately, an almost perfect specimen of this belief. For Perelman, even the rudiments of any kind of monetary, professional or material reward resulting from his theorem were not just unnecessary but downright abhorrent. He has turned down professorships at the best universities in the world, declined the Fields Medal, and will probably not accept the million-dollar prize awarded by the Clay Mathematics Institute for the solution of some of the most daunting mathematical problems of all time. He cut himself off from the world after seeing the publicity that his work received and has become a recluse, living with his mother in St. Petersburg. For Perelman, mathematics should purely and strictly be done for its own sake and could never be tainted with any kind of worldly stigma. Perelman is truly a mathematical hermit, or what a professor of mine would call, using mathematical jargon, a "Hermitian operator".

In "Perfect Rigor", Masha Gessen tells us the story of this remarkable individual, but even more importantly tells us the story of the Russian mathematical system that produced this genius. The inside details of Russian mathematics were cut off from the world until the fall of the Soviet Union. Russian mathematics was nurtured by a small group of extraordinary mathematicians including Andrey Kolmogorov, the greatest Russian mathematician of the twentieth century. Kolmogorov and others who followed him believed in taking latent, outstanding talent in the form of young children and single-mindedly transforming them into great problem solvers and thinkers. Interestingly in the early Soviet Union under Stalin's brutal rule, mathematics flourished where other sciences languished partly because Stalin and others simply could not understand abstract mathematical concepts and thus did not think they posed any danger to communist ideology. Soviet mathematics also got a boost when its great value was recognized during the Great Patriotic War in building aircraft and later in work on the atomic bomb. Mathematicians and physicists thus became unusually valued assets to the Soviet system.

Kolmogorov and a select band of others took advantage of the state's appreciation of mathematics and created small, elite schools to train students for the mathematical olympiads. Foremost among the teachers was a man named Sergei Rukshin, whom Gessen talks about at length. Rukshin believed in completely enveloping his students in his world. In his schools the students entered a different universe, forged by intense thought and mathematical camaraderie. They were largely shielded from outside influences and coddled. The exceptions were women and Jews. Gessen tells us about the rampant anti-Semitism in the Soviet Union, which lasted until its end and prevented many bright Jewish students from showcasing their talents. Perelman was one of the very few Jews who made it, and only because he achieved a perfect score at the International Mathematical Olympiad.

Perelman's extreme qualities were partly a result of this system, which had kept him from knowing about politics and the vagaries of human existence and insulated him from a capricious world where compromise is necessary. For him, everything had to be logical and utterly honest. There was no room for things such as diplomacy, white lies, nationalism or manipulation to achieve one's personal ends. If a mathematical theorem was proven to be true, then any further acknowledgment of its existence in the form of monetary or practical benefits was almost vulgar. This was manifested in his peculiar behavior in the United States. When he visited the US in the 90s as a postdoctoral researcher, he had already made a name for himself, and Princeton offered the twenty-nine-year-old an assistant professorship, a rare and privileged opportunity. However, Perelman would settle for nothing less than a full professorship and was repulsed even by the request that he officially interview for the position (which would have been a mere formality) and submit his CV. Rudimentary formalities that would be normal for almost anyone else were abhorrent to Perelman.

After being disillusioned with what he saw as an excessively materialistic academic food chain in the US, Perelman returned to Russia. For five years after that he virtually cut himself off from his colleagues. But it was then that he worked on the Poincaré Conjecture and created his lasting achievement. Sadly, the time spent working intensely alone in Russia seems to have made him even more sensitive to real and perceived slights. Still, he publicly posted his proofs on the internet in 2002 and then visited the US. For a brief period he even seemed to enjoy the reception he received, with mathematicians everywhere vying to secure his services for their universities. He was unusually forthcoming, giving several talks and patiently explaining his proof to mathematicians. Yet it was clear he was indulging in this exercise only for the sake of clarifying the mathematical concepts, and not to be socially acceptable.

However, after this brief period of normalcy, a series of events made Perelman reject the world of human beings and even that of his beloved mathematics. He was appalled by the publicity he received in newspapers like the New York Times, which could not understand his work. He found the rat race to recruit him, with universities climbing over each other and making him fantastic offers of salary and opportunity, utterly repulsive. After rejecting all these offers and even accusing some of his colleagues of being traitors who gave him undue publicity, he withdrew to Russia and definitively severed himself from the world. The final straw may have been two events: the awarding of the Fields Medal, whose citation, since his work was still being verified, could not explicitly state that he had proven the Poincaré Conjecture, and the publication of a paper by Chinese mathematicians that in hindsight clearly seems to have been written to steal the limelight and the honors from Perelman. For Perelman, all this (including having to share the Fields with three other mathematicians) was a grave insult and unbecoming of the pursuit of pure mathematics.

Since then Perelman has been almost completely inaccessible. He does not answer emails, letters or phone calls. In an unprecedented move, the president of the International Mathematical Union, which awards the Fields Medals, personally went to St. Petersburg to talk him out of declining the prize. Perelman was polite, but the conversation was to no avail. Neither is there any indication that he will accept the million-dollar Clay prize. Gessen herself could never interview him, and because of this the essence of Perelman remains vague; we never really get to know him in the book. Since Gessen is trying to somewhat psychoanalyze her subject and depends on second-hand information to draw her conclusions, her narrative sometimes lacks coherence and meanders. As other reviewers have noted, the discussion of the actual mathematics is sparse and disappointing, but then this book is not really about the math; it is about the man and his social milieu. The content remains intriguing and novel.

Of course, Perelman's behavior is bizarre and impenetrable only to us mere mortals. For Perelman it forms a subset of what has in his mind always been a perfectly internally consistent and logical set of postulates and conclusions. Mathematics has to be done for its own sake. Academic appointments, prizes, publicity and professional rivalries should have no place in the acknowledgement of a beautiful mathematical proof. While things like interviewing for jobs and negotiating offers may seem to us perfectly reasonable components of the real world, or at worst necessary evils, for Perelman they are simply evils interfering with a system of pure thought, to be completely rejected. He is the epitome of the Platonic ideal: where pure ideas are concerned, any human association could only be a deeply unsettling imposition.

Constancy of the discodermolide hairpin motif

Our paper on the conformational analysis of discodermolide is now up on the ACS website. The following is a brief description of the work.

Discodermolide (DDM) is a well-known, highly flexible polyketide and the most potent microtubule polymerization agent known. In this capacity it functions much like taxol and the epothilones. However, the binding mode of DDM will intimately depend on its conformations in solution.

To this end we performed conformational searches on DDM with four different force fields, and the first surprising observation was that all four force fields located the same global minimum geometry for the molecule. This is surprising because, given the dissimilar parameterization criteria used in different force fields, the minima obtained for flexible organic molecules usually differ from force field to force field. Not only that, but all the minima closely superimposed on the x-ray structure of DDM, a fold we call the "hairpin" motif. This too is surprising, since the solid-state structure of such a highly flexible molecule should not generally bear much resemblance to a theoretically calculated global minimum.

Next, we used our NAMFIS methodology, which fits the conformations found in the searches to coupling constants and interproton distances obtained from NMR, to determine the DDM conformational ensemble in two solvents, water and DMSO. We were again surprised to see the x-ray/force-field global minimum appear as a major component of the complex solution ensemble. In many earlier studies the x-ray structure had turned up only as a minor solution component, so this too was unexpected.
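
For readers unfamiliar with NAMFIS, the core idea is a constrained fit: find non-negative conformer populations whose weighted averages best reproduce the NMR observables. Here is a deliberately toy-sized sketch with invented numbers; the real method also handles the proper <r^-3>/<r^-6> distance averaging and error weighting.

```python
import numpy as np
from scipy.optimize import nnls

# Rows are observables (interproton distances, J couplings); columns are
# conformers. Entry [i, j] is observable i computed for conformer j from
# its force-field geometry. All numbers here are invented.
computed = np.array([
    [2.4, 3.6, 4.1],   # an interproton distance (A) in conformers 1-3
    [3.1, 2.5, 3.8],   # another distance (A)
    [9.8, 4.2, 7.1],   # a 3J coupling constant (Hz)
])
observed = np.array([2.9, 2.9, 7.5])  # the NMR-measured ensemble averages

# Non-negative least squares, then normalize the weights so that the
# populations sum to one.
weights, residual = nnls(computed, observed)
populations = weights / weights.sum()
print("populations:", populations.round(2), "fit residual:", round(residual, 2))
```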

Remarkably, this same structure has also been implicated as the bioactive conformation bound to tubulin by a series of elegant NMR experiments. To our knowledge, this is the first tubulin binder that shows a single dominant preferred conformation in the solid state, as a theoretical global minimum in multiple force field conformational searches, in solution, and in the binding pocket of tubulin. In fact, I personally don't know of any other molecule of this flexibility that exists as one dominant conformation in such extremely diverse environments. If this happened for every, or even most, molecules, drug discovery would suddenly become easier by an order of magnitude, since all we would have to do to predict the binding mode of a drug would be to crystallize it or look at its theoretical energy minima. To rationalize this very pronounced conformational preference of DDM, we analyzed the energetics of three distributed synthons (methyl-hydroxy-methyl triads) in the molecule using molecular mechanics and quantum chemical methods; it appears that these three synthons modulate the conformational preferences of the molecule and essentially override its interactions with solvent, adjacent molecules in the crystal, and amino acid residues in the protein.

Finally, we supplemented this conformational analysis with a set of docking experiments, which led to a binding mode different from the one postulated earlier by NMR (as of now there is no x-ray structure of DDM bound to tubulin). We rationalize this binding mode in light of the SAR data for the molecule and describe why we prefer it to the previous one.

In summary then, DDM emerges as a unique molecule which seems to exist in one dominant conformation in highly dissimilar environments. The study also indicates the use of reinforcing synthons as modular elements to control conformation.

Jogalekar, A., Kriel, F., Shi, Q., Cornett, B., Cicero, D., & Snyder, J. (2009). The discodermolide hairpin structure flows from conformationally stable modular motifs. Journal of Medicinal Chemistry. DOI: 10.1021/jm9015284

Warren DeLano

This is quite shocking. I heard him speak at the eChemInfo conference just two weeks back and talked to him briefly. His visualization software PyMOL was *the* standard for producing and manipulating beautiful molecular images, and almost all the images in all of my papers to date were created with PyMOL.

He could not have been older than his late 30s. I have heard him give talks a couple of times and spoke with him at another conference, where he was naturally pleased to see PyMOL used all over my poster. I think everyone who met him can vouch that he was a cool and fun person. I really wonder what will happen to PyMOL without him.

Here is a brief note posted by Dr. Axel Brunger, in whose lab he contributed greatly to the X-PLOR and CNS programs for crystallography and modeling.

Dear CCP4 Community:

I write today with very sad news about Dr. Warren Lyford DeLano.

I was informed by his family today that Warren suddenly passed away at home on Tuesday morning, November 3rd.

While at Yale, Warren made countless contributions to the computational tools and methods developed in my laboratory (the X-PLOR and CNS programs), including the direct rotation function, the first prediction of helical coiled-coil structures, and the scripting and parsing tools that made CNS a universal computational crystallography program.

He then joined Dr. Jim Wells' laboratory at UCSF and Genentech, where he pursued a Ph.D. in biophysics, discovering some of the principles that govern protein-protein interactions.

Warren then made a fundamental contribution to biological sciences by creating the Open Source molecular graphics program PyMOL that is widely used throughout the world. Nearly all publications that display macromolecular structures use PyMOL.

Warren was a strong advocate of freely available software and the Open Source movement.

Warren's family is planning to announce a memorial service, but arrangements have not yet been made. I will send more information as I receive it.

Please join me in extending our condolences to Warren's family.

Sincerely yours,
Axel Brunger

Axel T. Brunger
Investigator, Howard Hughes Medical Institute
Professor of Molecular and Cellular Physiology
Stanford University

A wrong kind of religion; Freeman Dyson, Superfreakonomics, and global warming

The greatest strength of science is that it tries to avoid dogma. Theories, explanations, hypotheses, everything is tentative, true only as long as the next piece of data does not invalidate it. This is how science progresses, by constantly checking and cross-checking its own assumptions. At the heart of this engine of scientific progress are constant skepticism and questioning. This skepticism and questioning can often be exasperating. You can enthusiastically propound your latest brainwave only to be met with hard-nosed opposition, deflating your long-harbored fervor for your pet idea. Sometimes scientists can be vicious in seminars, questioning and cross-questioning you as if you were a defendant in court.

But you learn to live with this frustration. That's because in science, skepticism always means erring on the safer side. As long as skepticism does not descend into outright irrational cynicism, it is far better to be skeptical than to buy into a new idea. This is science's own way to ensure immunity to crackpot notions that can lead it astray. One of the important lessons you learn in graduate school is to make peace with your skeptics, to take them seriously, to be respectful to them in debate. This attitude keeps the flow of ideas open, giving everyone a chance to voice their opinion.

Yet the mainstay of science is also a readiness to test audacious new concepts. Sadly, whenever a paradigm reaches something like universal consensus, the opposite can happen. New ideas and criticism are met with so much skepticism that it borders on hostility, and bold conjectures are shot down mercilessly, sometimes without even considering their possible merits. Universal consensus can turn the majority of scientists into a vocal and even threatening wall of obduracy against new ideas. From what I have seen in recent times, this unfortunately seems to have happened to the science of global warming.

First, a disclaimer. I have always been firmly in the "Aye" camp when it comes to global warming. There is no doubt that the climate is warming due to greenhouse gases, especially CO2, and that human activities are most probably responsible for the majority of that warming. There is also very little doubt that this rate of warming is unprecedented far back into the past. It is also true that, left unchecked, these developments will cause dangerous and unpredictable changes in the composition of our planet and its biosphere. Yet it does not stop there. Understanding and accepting the details of climate change is one thing; proposing practical solutions for mitigating it is a whole different ball game. That ball game involves more economics than science, since any such measures will have to be adopted on a very large scale and would significantly affect the livelihoods of hundreds of millions. We need vigorous discussion on solutions to climate change from all quarters, and the question is far from settled.

But even from a scientific perspective, there are many details of climate change that are still open to healthy debate. One would therefore think that skepticism about certain details would be met with the same kind of lively, animated argument that is the mainstay of science. Sadly, that does not seem to be happening. Probably the most prominent recent example occurred when the New York Times magazine ran a profile of the distinguished physicist Freeman Dyson. Dyson is a personal scientific hero of mine and I have read all of his books (except his recent, very technical book on quantum mechanics). Climate change is not one of Dyson's main interests and has occupied very little of his writing, although more so recently. To me Dyson appears as a mildly interested climate change buff who has some opinions on some aspects of the science. He is by no means an expert on the subject, and he never claims to be one. However, he has certain ideas, ideas which may be wrong, but which he thinks make sense (in his own words, "It is better to be wrong than to be vague"). For instance, he is quite skeptical about computer models of climate change, a skepticism I share based on my own experience with the uncertainty in modeling even "simple" chemical systems. Dyson, who is also well known as a "futurist", has proposed a very interesting possible solution to climate change: the breeding of special genetically engineered plants and trees with an increased capacity for capturing carbon. I see no reason why this possibility could not be looked into.

Now if this were the entire story, the most one would expect would be experts in climate change respectfully debating and refuting Dyson's ideas strictly on a factual basis. But surprisingly, that's not what followed the Times profile. There were ad hominem attacks calling him a "crackpot", "global warming denier", "pompous twit" and "faker". Anyone who knows the first thing about Dyson knows that the man does not have a political agenda and has always been, if anything, utterly honest about his views. Yet his opponents spared no pains in painting him with a broad denialist brush, even discrediting his admirable work in physics in order to debunk his climate change views. What disturbed me immensely was not that they were attacking his facts (that is, after all, how science works and is perfectly reasonable) but that they were attacking his character, his sanity and his general credibility. The respected climate blogger Joe Romm rained down on Dyson like a ton of bricks, and his criticism was full of condescension and efforts to discredit Dyson's other achievements. My problem was not with Romm's expertise or his debunking of facts but with his tone; note for instance how Romm calls Dyson a crackpot right in the title. One got the feeling that Romm wanted to portray Dyson as a senile old man who was off his rocker. Other bloggers seized upon Romm-style condescension and dismissed Dyson as a crank. Since then Dyson has expressed regret over the way his views on global warming were overemphasized by the journalist who wrote the piece. But the fact is that it was this piece that made Freeman Dyson notorious as some great global warming contrarian, when the truth was much simpler. In a Charlie Rose interview Dyson talked about how global warming occupies very little of his time, and his writings clearly demonstrate this. Yet his views on the topic were blown out of proportion. Sadly, such vociferous, almost violent reactions to even reasonable critics of climate change seem to be becoming commonplace. If this is how the science of global warming looks, the outlook for the future is not favourable.

If Dyson is Exhibit A in the list of zealous reactions to unbiased critics of climate change, then the recent book "Superfreakonomics" by Steven Levitt and Stephen Dubner (authors of the popular "Freakonomics") is surely Exhibit B. There is one chapter among six in their book about global warming, and yet almost every negative review on Amazon focuses on that chapter. The authors have been bombarded with accusations of misrepresentation, political agendas and outright lies. Joe Romm again penned a rather propagandistic and sensationalist-sounding critique of the authors' arguments, and others duly followed. In response, the authors wrote a couple of posts on their New York Times blog answering these critics. One of the posts was written by Nathan Myhrvold, previously Chief Technology Officer of Microsoft and now the head of a Seattle-based think tank called Intellectual Ventures; Myhrvold is one of the prominent players in the book. Just note the calm, rational response that he pens and compare it to one of Joe Romm's posts filled with condescending personal epithets. If this is really a scientific debate, then Myhrvold surely seems to be behaving like the objective scientist here.

So are the statements made by Levitt and Dubner as explosive as Romm and others would have us believe? I promptly bought the book and read it, reading the chapter on climate change twice to make sure. The picture that emerged was quite different from the one I had been presented with until then. Firstly, the authors' style is matter-of-fact, not sensationalist or contrarian-sounding at all. Secondly, they never deny climate change anywhere. Thirdly, they make the very important general point that complex problems like climate change are not necessarily beyond easy, cheap solutions, and that people sometimes don't readily think of these; they cite hand washing to drastically reduce infections and seat belts to reduce fatal car crashes as two simple and cheap innovations that saved countless lives. But on to Chapter 5, on warming.

Now let me say upfront that at least some of Levitt and Dubner's research is sloppy. They unnecessarily focus on the so-called "global cooling" scare of the 70s, which by no means refutes global warming. They also seem to cherry-pick the words of Ken Caldeira, a leading expert on climate change. But most of the chapter is devoted to possible cheap, easy solutions to climate change. To tell this story, they focus on Nathan Myhrvold and his team at Intellectual Ventures, who have come up with two extremely innovative and interesting proposals. Both are based on injecting sulfate aerosols into the upper atmosphere. The rationale rests on a singular event, the 1991 eruption of Mount Pinatubo in the Philippines, which sent millions of tons of sulfates and sulfur dioxide into the atmosphere and circulated them around the planet. Sulfate aerosols reflect sunlight and tend to cause cooling, and remarkably, global temperatures fell by a small amount for a few years afterwards. The phenomenon was carefully and exhaustively documented, and it was a key contributor to the development of the ideas that fall under the rubric of "geoengineering": artificially modulating the atmosphere to offset the warming effects of CO2. Geoengineering is controversial and hotly debated, but it is supported by several very well-known scientists, and nobody has come up with a good reason why it would not work. In light of the seriousness of global warming, it deserves to be investigated. With this in mind, Myhrvold and his team came up with a rather crazy-sounding idea: to send up a large hose, held aloft by helium balloons and driven by motors, which would pump sulfates and sulfur dioxide into the stratosphere. Coupled with this, they came up with an even crazier-sounding idea: to thwart hurricanes by erecting large, balloon-like structures on coastlines that would essentially suck the hot air out of the hurricanes. With their power source gone, the hurricanes might quieten down.

Are these ideas audacious? Yes. Would they work? Maybe, maybe not. Are they testable? Absolutely, at least on a prototypical, experimental basis. Throughout its history, science has never been fundamentally hostile to crazy ideas as long as they could be tested. Most importantly, the authors propose these ideas because their analysis indicates them to be much cheaper than long-term measures designed to reduce carbon emissions. Solutions to climate change need to be cheap as well as scientifically viable.

So let's get this straight: the authors are not denying global warming; in fact, in their own words, they are proposing a possible solution that could be cheap and relatively simple. And they propose this solution only as a temporary gag on global warming, so that long-term measures can be researched at relative leisure. In fact they are not even claiming that the scheme would work, only that it deserves research attention. Exactly what part of this argument screams "global warming denial"? One would imagine that opponents of these ideas would pen objective, rational objections based on hard data and facts. And yet, with a few exceptions, the vociferous critics of Levitt and Dubner have not engaged in such an exercise. Most responses are of the "Oh my God! Levitt and Dubner are global warming deniers!!" kind. Science simply does not progress in this manner. All we need to do here is debate the merit of a particular set of ideas. Sure, they could turn out to be bad ideas, but we will never know until we test them. The late Nobel laureate Linus Pauling said it best: "If you want to have a good idea, first have lots of ideas, then throw the bad ones away". A problem as big as climate change especially needs ideas flying in from all quarters, some conservative, some radical. And as the authors indicate, cheap and simple ideas ought to be especially welcome. Yet the reception to Superfreakonomics looked to me as if the authors were being castigated and resented simply for having ideas. The last thing scientific progress needs is a vocal majority that thwarts others' ideas and encourages them to shut up.

Freeman Dyson once said that global warming sometimes looks like a province of "the secular religion of environmentalism", and sadly there seems to be some truth to this statement. It is definitely the wrong kind of religion. As I mentioned before, almost any paradigm that reaches near-universal consensus runs the risk of being forged into a religion. At such a point it is even more important to respect critics and give them a voice. Otherwise, going by the almost violent reactions against both Dyson and the authors of Superfreakonomics, I fear that global warming science will descend to the status of biological studies of race. Any research that has to do with race is so politically sensitive and so fraught with liabilities and racist overtones that even reasonable scientists who feel there is something beneficial to be gained from the study of race (and there certainly is; nobody would deny that certain diseases are more common in certain ethnic groups) are extremely afraid to speak up, let alone apply for funding.

We cannot let such a thing happen with the extremely important issue of climate change. Scientific progress itself would be in a very sad state if critics of climate change with no axe to grind were so vilified and resented that they felt inclined to shut up. Such a situation would trample the very core principles of science underfoot.

That is verboten

I have been poring over some manuscripts recently and realized that there are some words that are best avoided in any scientific paper. I hope never to use them myself, and I find myself grimacing when someone else does.

Probably the most verboten word is "prove". There is no proof in science, only in mathematics. This is especially true in a science where almost everything we do consists of building a model, whether a model of a protein-ligand interaction, of a stereoselective organic reaction, or of a transition state. A model can never be "proven" to be "true". It can only be shown to correlate with experimental results. Thus anyone who claims that such and such a piece of data "proves" his model should get the referees' noose right away.

So what would be a better word? "Validate"? Even that sounds too strong to me, and so does "justify". How about "support"? Perhaps. I think the best any of us can say is that our model is consistent with the experimental data. This statement makes it clear that we aren't even proposing it as the sole model, only as a model that agrees with the data.

Even here the comparison is tricky, since not all pieces of data are created equal. For instance, one might have a model of a drug bound to a protein that is consistent with a lot of SAR data but does not seem to agree with one key data point. The questions to ask are what the degree of disagreement is and what the quality of that data point is. If the disagreement is strong, this should be made clear in the presentation of the model. It is often messy to tally the validity of a model against a plethora of diverse data points of differing quality; but the quality of data, and the underreporting of its errors, is something we will leave for another time.
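
One hypothetical way to make "consistent with the data" concrete is to score each data point by how many standard errors separate prediction from measurement. All names and numbers below are invented:

```python
# (label, measured value, experimental error, model prediction)
data = [
    ("analog 1", 7.2, 0.3, 7.0),
    ("analog 2", 5.1, 0.4, 5.6),
    ("analog 3", 8.9, 0.2, 7.4),   # the troublesome key data point
]

for label, measured, err, predicted in data:
    z = abs(predicted - measured) / err
    verdict = "consistent" if z < 2 else "disagrees"
    print(f"{label}: off by {z:.1f} sigma -> {verdict}")

# A point several sigma away either breaks the model or carries an
# underestimated error; deciding which is the judgment call described above.
```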

For now we can try to keep the proofs out of the manuscripts.