In the absence of rational approaches, this technique (which these days is called "repurposing") could be surprisingly fruitful. However, as molecular biology, crystallography and structure-based drug design took off in the 1980s, rational drug discovery became much more focused on protein structure, and drug developers began trying to predict drug function by looking at target similarity in terms of sequences and binding pockets.
But the relative lack of dividends from structure-based drug design (which nonetheless continues to be important) has led to a re-examination of the old paradigm. One of the most interesting advances to come out of this thinking was pioneered by Brian Shoichet's and Bryan Roth's groups (at UCSF and UNC-Chapel Hill) a few years back. Their rationale was simple too: look at drug similarity, albeit using modern computational methods and metrics, and try to cross-correlate drug targets using these similarity metrics. Similar drugs should hit each other's targets. The method seems somewhat crude but has worked surprisingly well; it was even listed as one of Wired magazine's top ten scientific breakthroughs of 2009.
In a recent publication the authors take the method a step further and apply it to phenotypic screening, which has emerged as an increasingly popular method in the pharmaceutical industry. Phenotypic screening is attractive because it bypasses having knowledge of the exact molecular target; basically you just inject different potential drugs into a test system and look at the results, which are usually quantified by a phenotypic response such as a change in membrane potential, cell differentiation or even something as general as locomotion. Once a compound has elicited a particular response, we can start the arduous task of finding out what it's doing at a molecular level. Not surprisingly, several proteins can be responsible for a general response like locomotion, and detecting all of them is non-trivial to say the least. It is for this rather involved exercise that the authors provide a remarkably simple potential (partial) solution.
The current study looks at phenotypic screening on the zebrafish, a favorite model organism of biologists. 14,000 molecules were screened for their ability to elicit a photomotor response in zebrafish embryos. Out of these, about 700 were deemed active. To find out the targets for these molecules, the authors interrogated their chemical "similarity" against a large group of compounds for which targets are known. Importantly, the authors use a statistical technique to calculate an expectation value (E value) which indicates whether the similarity arises by chance alone; a lower E value means the similarity is less likely to be coincidental, and thus more likely to be significant. One of the most remarkable things in these studies is that the metric for similarity is a simple, computationally cheap 2D metric called a fingerprint which looks at the presence or absence of certain functional groups in the molecules. That such a metric can work at all is remarkable because we know that an accurate estimation of similarity should ideally include the 3D conformation that the drug presents to the protein target.
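To make the idea concrete, here is a minimal sketch of how such a 2D fingerprint comparison might work. This is not the authors' actual code or fingerprint scheme; the functional-group vocabulary and example molecules below are hypothetical, and the comparison uses the Tanimoto (Jaccard) coefficient, a standard choice for bit-based fingerprints.

```python
# Toy sketch (not the study's actual method): a 2D "fingerprint" records
# the presence or absence of functional groups as set bits, and the
# Tanimoto coefficient measures how many bits two molecules share.

def fingerprint(groups_present, vocabulary):
    """Return the set of bit positions for the functional groups present."""
    return {i for i, g in enumerate(vocabulary) if g in groups_present}

def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) coefficient: shared bits / total bits set."""
    if not fp_a and not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

# Hypothetical vocabulary and molecules, for illustration only.
VOCAB = ["hydroxyl", "carboxyl", "amine", "phenyl", "halide", "carbonyl"]

mol_a = fingerprint({"carboxyl", "phenyl", "carbonyl"}, VOCAB)
mol_b = fingerprint({"carboxyl", "phenyl"}, VOCAB)

print(round(tanimoto(mol_a, mol_b), 2))  # → 0.67 (2 shared bits / 3 total)
```

Real fingerprints enumerate thousands of substructural features rather than six, but the principle is the same: two molecules that set many of the same bits are deemed similar, and similar molecules are predicted to share targets.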
Nonetheless, 284 molecules predicted to be active on 404 targets were picked based on their low E values. Out of these, 20 molecules were especially interesting because they represented novel drug-target associations not seen before. When the authors tested these molecules against the relevant targets, 11 of them showed activities ranging from low nanomolar to 10 µM. For a computational method this hit rate is quite impressive, although a more realistic measure of the hit rate would have come from testing all 284 compounds. The activity of the molecules was validated by running comparisons with molecules that were known to elicit the same response from the predicted targets. Confirmation also came from competition experiments with antagonists. Some of the unexpected targets predicted include the beta-adrenergic receptor, dopamine receptors and potassium ion channels.
I find these studies very encouraging. For one thing, the computational method can potentially save a huge amount of time needed to experimentally uncover the target for every active compound. As mentioned above, it's also remarkable that a very simple metric of 2D similarity yields useful results. The success rate is impressive; however, an even lower rate would still have been worth the modest effort if it resulted in new drug-target relationships (which are far more useful than new chemically similar ligands for the same target). However, I do think it would have been very interesting to look at the failures. An obvious source of failure comes from using the wrong measure of similarity; at least some compounds presumably fail because their 2D similarity does not translate to the 3D similarity in conformation required for binding to the protein target. In addition, protein flexibility could result in very different binding for supposedly similar compounds. Medicinal chemists are well aware of "activity cliffs", where small differences in chemical structure lead to great differences in activity. These cliffs could also lead to lack of binding to a predicted target.
Nevertheless, in an age when drug discovery is only getting harder and phenotypic screening seems to be an increasingly important technique, these computational approaches promise to be a useful complement. Gratifyingly, the authors have developed a website where the algorithm is available for free. The technique has also spawned a startup.
"Ubiquitin and the ribosome, fluorescent proteins and ion channels are as fundamentally chemical as metal surfaces, enantioselective catalysts, olefin metathesis, or, just to name some fields squarely in our profession that should be (or should have been) recognized, laser chemistry, metal–metal multiple bonding, bioinorganic chemistry, oral contraception, and green or sustainable chemistry."
And ultimately he emphasizes something that we should all constantly remind each other of: it's a prize, awarded by human beings. It's an honor all right, but it does very little to highlight the objective value of the research, which is usually evident long before the actual recognition. The fact that we were informally nominating Robert Grubbs or Roger Tsien years before they received the prizes makes it clear that no prize was really going to change our perception of how important their work was. Today we look at Tsien's research on green fluorescent protein with the same joyful interest that we did ten years ago.
Hoffmann sees the principal function of the Nobel Prize as providing an incentive for young students and researchers from scientifically underprivileged countries, and he cites the examples of Kenichi Fukui and Ahmed Zewail inspiring their fellow countrymen. The Nobel Prize certainly serves this function, but I have always been a little wary of pitching the benefits of scientific research by citing any kind of prize. The fact is that most people who do interesting research will never win the Nobel Prize, and this does nothing to undervalue the importance of their work. So even from a strictly statistical standpoint, it would continue to be much more fruitful to point out the real benefits of science to young people - as a means of understanding the world and having fun while you are at it. Prizes may or may not follow.
Hat tip: Excimer
The book is roughly divided into four parts. The first part details Anderson's views on the history and philosophy of science including his own field - solid-state physics. The second part collects Anderson's reminiscences and thoughts on his scientific peers, mostly in the form of book reviews that he has written for various magazines and newspapers. The third part deals with science policy and politics, and the fourth is dedicated to "attempts" at popularizing science.
Some of the chapters are full of scientific details and can be best appreciated by physicists but there's also a lot of fodder for the layman in here. A running thread through several essays is Anderson's criticism of ultra-reductionism in science which is reflected in the title of the book, "More and Different". Anderson's basic viewpoint is that more is not just quantitatively but qualitatively different from less. In 1972 he made a splash by discussing in an article in Science magazine how "higher-level" sciences are based on their own fundamental laws which cannot be reduced to physics. In the book he details this philosophy through several examples from physics, chemistry, biology and psychology. He does not deny the great value of reductionism in the development of modern science but he incisively explores its limits.
Other chapters contain critiques of the latest fads in physics including string theory. Anderson bemoans string theory's lack of connection to concrete experiment and its failure to predict unique, robust solutions. He makes it clear that string theory is really mathematics and that it fails to adhere to the tried and tested philosophy of science which has been successful for almost five hundred years. Other chapters have insightful commentary on the role of mathematics in physics, Bayesian probability and physics at Bell Labs. A particularly amusing essay critiquing the current funding situation in the United States proposes a hypothetical alternative history of quantum mechanics in the US, where scientific pioneers like Dirac and Heisenberg may not have been able to do groundbreaking research because of the publish-or-perish environment and the dominance of the old guard.
There's also some valuable material in here about the sociology of science. This is exemplified by an especially insightful and detailed chapter on scientific fraud, where Anderson explores the reasons why some scientists commit fraud and others don't expose it as widely as they should. In Anderson's opinion the most valuable way to expose fraud is to ask whether a claimed discovery destroys what he calls the "seamless web of science" - the existing framework of fundamental laws and principles that allows relatively little room for revolutionary breakthroughs on a regular basis. In many cases the new finding is clearly inconsistent with the web's integrity, and the rare case where the web can subsume the new discovery and still stay intact leads us into genuinely new scientific territory. He also takes scientists to task for failing to point out the destruction of this seamless web by apparently far-reaching but fundamentally flawed new discoveries. In other chapters Anderson also comes down hard on the postmodernist distortion of science, critiquing such philosophers as Nancy Cartwright and upholding the views of debunkers like Alan Sokal. He also has some valuable commentary on science policy, especially on Star Wars and missile defense. Other writers have written much more detailed critiques of such programs, but Anderson succinctly demonstrates the limitations of the concept using commonsense thinking (the bottom line: decoys can easily foil the system, and a marginal improvement by the offense will result in a vastly increased cost for the defense).
Finally, the book contains mini sketches of some of Anderson's peers who happened to be some of the great scientific minds of the twentieth century. Anderson reviews books by and about Richard Feynman, Murray Gell-Mann, Stuart Kauffman, Stephen Hawking, Roger Penrose, John Bardeen and William Shockley among others. I happen to agree with him that books by scientists like Hawking, Penrose and Greene, while fascinating to read, paint a rather biased picture of physics and science. For one thing, they usually oversell the whole reductionist methodology in their constant drive to advertise the "Theory of Everything". But more importantly, they make it sound like particle physics and cosmology are the only games in town worth thinking about and that everything else in physics is done on the periphery. This is just not true. As Anderson makes it clear, there are lots of fields of physics including condensed matter physics, biophysics and non-linear dynamics which contain questions as exciting, fundamental and research-worthy as anything else in science. As just one example, classical physics was considered a staid old backwater of the physics world until chaos burst upon the scene. It's also clear, as was the case with chaos, that some of the most exciting advances will come from non-physicists. There are foundational phenomena and rich dividends to be mined from the intersection of physics with other fields in the twenty-first century.
Anderson's book might be precisely the kind of writing ignored by the public because they are too taken with the Hawkings, Greenes and Randalls. To those folks this volume would be an essential and healthy antidote. There's something in here for everyone, and it makes it clear that science still presents infinite horizons on every level. After all, more is different.