Field of Science

Why the same can be different: The case of the two enantiomers

The R enantiomer (green) allows Tyr337 to adopt
two different orientations. The S (yellow) does not.
Since we were discussing thermodynamics in biological systems the other day, here's a neat example from Angewandte Chemie of a system where thermodynamics reveals something surprising. The authors from Umeå University in Sweden were looking at two enantiomers of a ligand binding acetylcholinesterase. It's a robust, well-studied system and you don't really expect anything unexpected.

Except that it does do something unexpected. The first surprise was that both enantiomers bound with the same binding affinity. This observation violates a central tenet of biochemistry, namely that since receptors are chiral, enantiomeric ligands should bind with different affinities. The second surprise came when they dissected the similar free energies of binding into entropic and enthalpic components: the S enantiomer had a much more unfavorable entropy of binding (1.5 e.u.) than the R (8.5 e.u.). Since the free energies were the same, there had to be enthalpy-entropy compensation, which meant in turn that the S enantiomer must have the more favorable enthalpy.

To investigate the origins of these differences, the two enantiomers were crystallized with the protein. Observation of the binding site indicated something interesting; the R enantiomer bound in a way that allowed a critical tyrosine residue (Tyr337) to adopt two different orientations. However, the S enantiomer shoved an ethyl group next to the tyrosine, essentially precluding this movement. Greater conformational flexibility for the tyrosine translated to greater disorder, hence the more favorable entropy for the R. What about enthalpy? Here it turns out that the S enantiomer, while sacrificing entropic freedom for the tyrosine, compensates by making stronger interactions with it. This was analyzed by quantum chemical calculations on a "reduced" version of the protein. Interestingly, the interactions are not "normal", respectable hydrogen bonds but "unnatural" C-H---O hydrogen bonds. For the S enantiomer, even these relatively weak interactions were enough to confer an enthalpic advantage that offset the entropic disadvantage.

This is why chemistry in general and biochemistry in particular are endlessly interesting; conventional wisdom is always being challenged even in well-studied systems, weak can be important, every example is unique and best of all, surprises lurk around almost every corner. As Arthur Kornberg put it, "I never met a dull enzyme".

Angewandte Chemie retracts hexacyclinol paper. Sort of

So it seems that the infamous hexacyclinol saga has finally been put to rest and Angewandte Chemie has retracted the paper. For those chemists who might still be unfamiliar with it, it's not hard to explain: Total synthesis paper published in 2006 with more holes than the vacuum of deep space. Multiple blog postings and papers demolish the claim within months. Journal does not retract the paper for six years.

Well, now the journal has published the retraction. Here's what it has to say:

The following article from Angewandte Chemie International Edition, “Total Syntheses of Hexacyclinol, 5-epi-Hexacyclinol, and Desoxohexacyclinol Unveil an Antimalarial Prodrug Motif” by James J. La Clair, published online on February 9, 2006 in Wiley Online Library, has been retracted by agreement between the author, the journal Editor in Chief, Peter Gölitz, and Wiley-VCH Verlag GmbH & Co. KGaA. The retraction has been agreed due to lack of sufficient Supporting Information. In particular, the lack of experimental procedures and characterization data for the synthetic intermediates as well as copies of salient NMR spectra prevents validation of the synthetic claims. The author acknowledges this shortcoming and its potential impact on the community.

What I find disappointing about this retraction is that it's just not strong enough in denouncing the paper. It's not just that the procedures were irreproducible or that the supporting information was incomplete; it's that the whole synthesis was essentially... make-believe. This was made clear by papers published later (re-synthesizing the natural product and calculating and comparing NMR spectra) which demonstrated beyond any reasonable doubt that whatever was supposedly synthesized in the paper simply couldn't correspond to the structure of hexacyclinol as we know it.

I think this is an important difference that the retraction does not acknowledge; it's the difference between saying "we think this could be wrong but we can't be sure since we can't reproduce the data" and "we are almost certain this is wrong since independent studies have convincingly demonstrated its utter implausibility".

Update: Carmen Drahl from C&EN has a superb Storify summary of the hexacyclinol saga over the last six years which features some of the blog posts commenting on the debacle. Carmen was also kind enough to post a picture of my cherished hexacyclinol t-shirt which I am still eager to break out; as I said in my email to her, I am still waiting to wear it at a big party where fellow t-shirters get together, laugh with sadistic glee, and mock the scattered bones of hexacyclinol's atomic constituents.

Occam, me and a conformational medley

Originally posted on the Scientific American Blog Network.

William of Occam, whose principle of parsimony has been used and misused (Image: WikiCommons)
The philosopher and writer Jim Holt, author of the sparkling new book “Why Does The World Exist?”, recently wrote an op-ed column in the New York Times gently chiding physicists for being ‘churlish’ and urging them to appreciate the centuries-old interplay between physics and philosophy. Holt’s point was that science and philosophy have always co-existed, even if their relationship has been more of an uneasy truce than an enthusiastic embrace. Some of the greatest physicists, including Bohr and Einstein, were also great philosophers.

Fortunately – or unfortunately – chemistry has had little to say about philosophy compared to physics. Chemistry is essentially an experimental science and for the longest time, theoretical chemistry had much less to contribute to chemistry than theoretical physics had to physics. This is now changing; people like Michael Weisberg, Eric Scerri and Roald Hoffmann proclaim themselves to be bona fide philosophers of chemistry and bring valuable ideas to the discussion.

But the interplay between chemistry and philosophy is a topic for another post. In this post I want to explore one of the very few philosophical principles that chemists have embraced so wholeheartedly that they speak of it with the same familiar nonchalance with which they would toss around facts about acids and bases. This principle is Occam’s Razor, a sort of guiding vehicle that allows chemists to pick between competing explanations for a phenomenon or observation. Occam’s Razor owes its provenance to William of Occam, a 14th century Franciscan friar who dabbled in many branches of science and philosophy. Fully stated, the proposition tells us that “entities should not be multiplied unnecessarily”; that is, the fewer the assumptions underlying an analysis, the more it is to be preferred over alternatives of equal explanatory power. More simply put, simple explanations are better than complex ones.

Sadly, the multiple derivative restatements of Occam’s Razor, combined with our tendency to look for simple explanations, can sometimes lead to erroneous results. Part of the blame lies not with Occam’s razor but with its interpreters; the main problem is that it’s not clear what “simple” and “complex” mean when applied to a natural law or phenomenon. In addition, nature does not really care about what we perceive as simple or complex, and what may seem complex to us may appear perfectly simple to nature because it’s…real. This was driven home to me early in my career.

Most of my research in graduate school was concerned with finding out the many conformations that complex organic molecules adopt in solution. Throw an organic molecule like ibuprofen in water and you don’t get a static picture of the molecule standing still; instead, there is free rotation about single bonds joining various atoms leading to multiple, rapidly interconverting shapes, or conformations, that are buffeted around by water like ships on the high seas. The exact percentage of each conformation in this dance is dictated by its energy; low-energy conformations are more prevalent than high-energy ones.
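The energy-to-population relationship described above is just a Boltzmann weighting, and it can be sketched in a few lines of Python; the conformer energies here are invented purely for illustration:

```python
import math

def boltzmann_populations(energies_kcal, temp_k=298.15):
    """Convert relative conformer energies (kcal/mol) to equilibrium populations."""
    RT = 0.0019872 * temp_k  # gas constant in kcal/(mol·K) times temperature
    weights = [math.exp(-e / RT) for e in energies_kcal]
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical conformer energies relative to the global minimum (kcal/mol)
pops = boltzmann_populations([0.0, 0.5, 1.2, 2.0])
print([round(p, 3) for p in pops])
```

Even a conformer 2 kcal/mol above the minimum retains a few percent of the population at room temperature, which is why the solution picture is a dance of shapes rather than a single structure.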
Different shapes of conformations of cyclohexane - a ring of six carbon atoms - ranked by energy (Image: Mcat review)

Since the existence of multiple conformations enabled by rotation around single bonds is a logical consequence of the basic principles of molecular structure, it would seem that this picture would be uncontroversial. Surprisingly though, it’s not always appreciated. The reason has to do with the fact that measurements of conformations by experimental techniques like nuclear magnetic resonance (NMR) spectroscopy always result in averages. This is because the time-scales for most of these techniques are longer than the time-scales needed for interconversion between conformations and therefore they cannot make out individual differences. The best analogy is that of a ceiling fan; when the fan is rotating fast, all we see is a contiguous disk because of the low time resolution of our eye. But we know that in reality, there are separate individual blades (see figure at end of post). NMR is like the eye that sees the disk and mistakes it for the fan.

Such is the problem with using experimental techniques to determine individual conformations of molecules. Their long time scales lead to average data to which a single, average structure is assigned. Clearly this is a flawed interpretation, but partly because of entrenched beliefs and partly because of a lack of methods to tease apart individual conformations, scientists through the years have routinely published single structures as representing a more complex distribution of conformers. Such structures are sometimes called “virtual structures”, a moniker that reflects their illusory – essentially non-existent – nature. A lot of my work in graduate school was to use a method called NAMFIS (NMR Analysis of Molecular Flexibility In Solution) that combined average NMR data with theoretically calculated conformations to tease apart the data into individual conformations. There are other such methods. Here's an article on NAMFIS that I wrote for college students.
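The fan-and-disk problem can be put in numbers. In the toy sketch below (the coupling constants and the 50:50 populations are invented for illustration; this is not the actual NAMFIS algorithm), the time-averaged NMR observable matches neither real conformer, which is exactly the trap behind "virtual structures":

```python
# Toy illustration of why average NMR data can imply a "virtual structure".
# Two hypothetical conformers with very different H-H coupling constants:
j_anti, j_gauche = 11.0, 2.0   # Hz; typical-magnitude values, for illustration
populations = (0.5, 0.5)       # assumed 50:50 rapidly interconverting mixture

# Fast exchange on the NMR time scale means we observe only the weighted mean:
j_observed = populations[0] * j_anti + populations[1] * j_gauche
print(j_observed)  # 6.5 Hz, a value corresponding to NEITHER real conformer
```

Assigning a single structure with a 6.5 Hz coupling would be like mistaking the spinning disk for the fan; deconvolution methods instead fit the populations so that the weighted sum of calculated conformer observables reproduces the measured average.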

When the time came to give a talk on this research, a very distinguished scientist in the audience told me that he found it hard to digest this complicated picture of multiple conformations vying for a spot on the energy ladder. Wouldn’t the assumption of a single, clean, average structure be more pleasing? Wouldn’t Occam’s Razor favor this interpretation of the data? That was when I realized the limitations of Occam’s principle. The “complicated” picture of multiple conformations was the real one in this case, and the simple picture of a single average conformation was unreal. In this case, it was the complicated and not the simple explanation that turned out to be the right one. This interpretation was validated when I also managed to find, among the panoply of conformations, one which bound to a crucial protein in the body and turned the molecule into a promising anticancer drug. The experience again drove home the point that nature often doesn’t care about what we scientists find simple or complex.

Recently Occam made another appearance, again in the context of molecular conformations. This time I was studying the diffusion of organic molecules through cell membranes, a process that’s of great significance in drug discovery since even your best test-tube drug is useless if it cannot get into a cell. A chemist from San Francisco has come up with a method to calculate different conformations of molecules. By looking at the lowest-energy conformation, he then predicts whether that conformation will be stable inside the lipid-rich cell membrane. Based on this he predicts whether the molecule will make it across. Now for me this posed a conundrum and I found myself in the shoes of my old inquisitor; we know that molecules have several conformations, so how can only the single, lowest-energy conformation matter in predicting membrane permeability?

I still don’t know the answer, but a couple of months ago another researcher did a more realistic calculation in which she did take all these other conformations into consideration. Her conclusion? More often than not the accuracy of the prediction becomes worse because by including more conformations, we are also including more noise. Someday perhaps we can take all those conformations into account without the accompanying noise. Would we then be both more predictive and more realistic? I don’t know.

These episodes from my own research underscore the rather complex and subtle nature of Occam’s Razor and its incarnation in scientific models. In the first case, the assumption of multiple conformations is both realistic and predictive. In the second, the assumption of multiple conformations is realistic but not predictive because the multiple-conformation model is not good enough for calculation. In the first case, a simple application of Occam’s razor is flawed, while in the second, the flawed simple assumption actually leads to better predictions. Thus, sometimes simple assumptions can work not because the more complex ones are wrong, but because we simply lack the capacity to implement the more complex ones.

I am glad that my work with molecular conformations invariably led me to explore the quirky manifestations of Occam’s razor. And I am thankful to a well-known biochemist who put it best: “Nature doesn’t always shave with Occam’s Razor”. In science as in life, simple can be quite complicated, and complicated can turn out to be refreshingly simple.

A rotating ceiling fan - Occam's razor might lead us to think that the fan is a contiguous disk, but we know better.

Does modern day college and graduate education in chemistry sacrifice rigor for flexibility?

This is what I love about blogging; it's the classic "one thing leads to another" device. The previous discussion on the paucity of thermodynamics in college coursework led to a more general and important exchange in the comments section that basically asked: Are we sacrificing rigor for flexibility by giving students too much freedom to pick and choose their courses?

The following sentiments (or variations thereof) were expressed:

- There should be a core curriculum for chemistry students that exposes them to mandatory courses in general, organic and physical chemistry at the very least. These requirements seem to be more widespread among physics departments. To my knowledge, Caltech is one of the few schools with a general core curriculum for all science majors. How many other schools have this?

- Part of the lack of exposure to important topics in grad school results from emphasizing research at the cost of coursework. And this is related to a more widespread sentiment of woe: it's become all too easy to get a Ph.D., partly because of the curse of academia that encourages one to become a glorified technician at the cost of instilling creative scientific thought. The belief is that many professors (and there are many exceptions) would rather produce well-trained manual laborers who contribute to the Grant and Paper Production Factory than independent scientific thinkers who can assimilate ideas from diverse scientific fields. You shouldn't really get a Ph.D. just for putting in 80-hour weeks.

We need to hold students to higher standards, but I think this is not going to happen until the publish-or-perish culture is fundamentally transformed and the movers and shakers of academic research take a hard look at what they are doing to their graduate students.

- Many textbooks are mired in the age-old, classical presentation of thermodynamics that emphasizes Carnot cycles and Maxwell relations much more than any semi-quantitative feel for the operation of thermodynamics in practical chemical and biological systems. We are just not doing a good job communicating the real-world importance of topics like thermodynamics; add to this students who are not going to study something if it's not required and we are in a real bind.

- Physical organic chemistry - the one discipline that can naturally build bridges between physical and organic chemistry - is disappearing from the curriculum. Those who intellectually matured in its heyday were naturally exposed to thermodynamics and kinetics. Graduate students in organic chemistry shouldn't be able to get away with just synthesis and spectroscopy courses.

- Matt, who, unlike most of us armchair philosophers, is a live professor at an actual research institution, makes the point that we should do an outstanding job of emphasizing thermodynamics in the freshman general chemistry class. We should do such a good job that students can always connect those concepts to anything else that they study later. As Matt recommends, we could include the more qualitative, important real-life applications of thermodynamics (and not just to antiquated heat engines), like those in drug discovery, in this gen chem class.

All great points in my opinion. I have strong feelings about all this myself, but I have not done any detailed study of college curricula so my opinions are mostly anecdotal. Feel free to chime in with actual data or more opinions in the comments section.

Who's afraid of Big Bad Thermodynamics?

George Whitesides, in a trademark outfit.
In a talk at Northeastern University yesterday, George Whitesides asked the students in the audience if they had ever studied thermodynamics. Not a single hand went up.

Even accounting for the fact that some students might have been reluctant to flag themselves in a large audience, I find this staggering, especially if these students are planning to go into basic drug discovery research. But I can’t completely blame them. I happened to take one (mandatory) thermodynamics and one (non-mandatory) basic statistical mechanics class in college and was exposed to thermodynamics in graduate school through my work on conformational equilibria and NMR. But most of my fellow graduate students in organic and biochemistry had little inkling of thermodynamics; it certainly wasn't a part of their standard intellectual toolkit.

The problem’s made worse by misunderstandings about thermodynamics that seem to linger in students’ heads even later in their career. These misunderstandings stem from a larger rift between organic and physical chemistry; the latter is supposed to be highly mathematical and abstract and rather irrelevant to the former. This is in spite of the overwhelming importance of concepts from p-chem in classical physical organic chemistry. Sadly, classical physical organic chemistry itself is disappearing from the college and grad school curriculum, and an argument in favor of emphasizing thermodynamics is also a plug for not letting physical organic chemistry become a relic of the past. No other topic gives you as good a basic feel for structure, function and reactivity in organic chemistry.

But coming back to thermodynamics, this impression that many students have about thermodynamics being all about Maxwell relations and Carnot cycles and virial theorems is rather misleading. It’s not that these things are not important, it’s just that that’s not the kind of thermodynamics Whitesides was talking about when he was talking about drug discovery. Thermodynamics in drug discovery is often much less complicated than bona fide textbook thermodynamics. And it’s of foundational importance in the field since at its core, drug discovery is about understanding molecular recognition, which is a thoroughly thermodynamic phenomenon governed by free energy.

For a drug designer, the key thermodynamic circus to understand is the interplay between G, H, and S as manifested in the classic equation ∆G = ∆H – T∆S. It’s key to get a feel for how opposing values of H and S can lead to the same value of G, since this is at the heart of protein-drug recognition. It’s also important to know the different features of water, protein and solvent that contribute to changes in these parameters. Probably the most important thermodynamic effect that drug designers need to be aware of is the hydrophobic effect. They need to know that the hydrophobic effect is largely an entropic effect arising from the release of bound waters, although as Whitesides has himself demonstrated, reality can be more complicated. But the fact is that we simply cannot understand water without thermodynamics, and we cannot understand drug action without understanding water.
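To see how opposing values of H and S can produce the same G, here is a minimal numerical sketch of ∆G = ∆H – T∆S; the enthalpies and entropies are invented for illustration and do not come from any particular protein-ligand system:

```python
# Enthalpy-entropy compensation: two binders with the same free energy of
# binding (dG = dH - T*dS) arrived at through opposite thermodynamic routes.
# All numbers are illustrative, in kcal/mol, with T = 298 K.
T = 298.0

def delta_g(dH, dS_cal):
    """dH in kcal/mol; dS in cal/(mol·K), i.e. entropy units (e.u.)."""
    return dH - T * dS_cal / 1000.0

# Enthalpy-driven binder: strong interactions, pays an entropic penalty
g1 = delta_g(dH=-12.0, dS_cal=-6.7)
# Entropy-driven binder: weaker interactions, gains disorder upon binding
g2 = delta_g(dH=-7.0, dS_cal=10.1)

print(round(g1, 2), round(g2, 2))  # both come out to about -10 kcal/mol
```

Two very different binding mechanisms, one indistinguishable affinity, which is exactly why calorimetric dissection into H and S (as in the acetylcholinesterase story above) can reveal what a binding constant alone cannot.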

Also paramount is to understand the relationship between thermodynamics and kinetics, something that again benefits from studying reactions under thermodynamic and kinetic control and things like the Curtin-Hammett principle in classical physical organic chemistry. It’s crucial to know the difference between thermodynamic and kinetic stability, especially when one is dealing with small molecule and protein conformations. Finally, it’s enormously valuable to have a feel for a few key numbers, foremost among which may be the relationship between equilibrium constant and free energy; knowing this tells you for instance that it takes only a difference of 1.8 kcal/mol of free energy between two conformers to almost completely shift the conformational equilibrium on to the side of the more stable one. And when that difference is 3 kcal/mol, the higher-energy conformation is all gone, well beyond the detection limits of techniques like NMR. Speaking of which, a good understanding of thermodynamics also tells you why it’s incorrect to rely on average NMR data to tease apart the details of multiple conformations in solution.
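The equilibrium-constant arithmetic behind those numbers is a one-liner; a quick sketch for a two-state equilibrium at room temperature reproduces the 1.8 and 3 kcal/mol figures quoted above:

```python
import math

RT = 0.0019872 * 298.15  # kcal/mol at room temperature

def minor_fraction(dG_kcal):
    """Fraction of the higher-energy conformer in a two-state equilibrium."""
    K = math.exp(-dG_kcal / RT)  # K = [minor]/[major] = exp(-dG/RT)
    return K / (1.0 + K)

print(round(minor_fraction(1.8) * 100, 1))  # ~4.6% minor conformer left
print(round(minor_fraction(3.0) * 100, 1))  # well under 1%, invisible to NMR
```

At 1.8 kcal/mol the equilibrium sits at roughly 95:5, and at 3 kcal/mol the minor conformer has dropped below one percent, beneath typical NMR detection limits.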

All this knowledge about thermodynamics is ingrained more easily than complicated mathematical derivations of configuration integrals in free-energy perturbation theory. Students need to realize that the thermodynamics that they need to tackle drug discovery is a semi-quantitative blend of ideas relying much more on a rough feel for numbers and competing entropic and enthalpic effects. This kind of feel can lead to some very useful insights, for instance regarding relationships between G, H and S in the evolution of new drugs.

It’s time to incorporate this more general thermodynamics outlook in drug discovery classes and even in regular chemistry classes. It’s simple enough to be taught to undergraduates and bypasses the more sophisticated and intimidating ideas of statistical mechanics.

At yesterday’s conference, the chairman had the last laugh. Half-jokingly he emphasized that Northeastern’s chemistry course is ACS certified, which means that one semester of p-chem and thermodynamics is mandatory. Apparently Harvard’s is not. To which Whitesides replied that he can guarantee that you will find students at Harvard who are also not familiar with thermodynamics.

Whitesides's appeal to give pharmaceutical scientists-in-training a firm grounding in thermodynamics applies across the board. On top of Plato’s Academy there was rumored to be a sign which said “Let no one ignorant of geometry enter”. Perhaps one can make a similar case for thermodynamics in the pharmaceutical industry?


I have often written about thermodynamics in drug design on my blog. A few potentially useful posts:

Some useful references:

George Whitesides - Designing ligands to bind tightly to proteins (book chapter PDF): Includes much of the material from Whitesides's talk.
Jonathan Chaires - Calorimetry and Thermodynamics in Drug Design.

Live blogging the Northeastern Drug Discovery conference

So I figured that since I am attending the drug discovery conference at Northeastern University I might as well jot down a few thoughts about the talks.

Leroy Hood: Institute for Systems Biology (Seattle).

Lots of optimism, which is usually the case with systems biology talks; being able to distinguish "wellness" genes from "disease" genes for everybody in about ten years, being able to map all disease-related biomarkers from blood analysis etc. But there were some interesting tidbits:

- Noise - especially biological noise - cannot be handled by traditional machine learning approaches. Signal to noise ratio is very low especially when picking biomarkers.

- SysBio can help pharma pick targets (which it is increasingly getting worse at).
- Cost can be minimized in optimal cases; e.g. the FDA approved Herceptin, specific for 20% of patients, based on only a 40-patient sample (Genentech).
- Descriptive and graphical models can be enormously useful; in fact complexity often precludes mathematical modeling.
- Example of prions injected into mouse: expression of 33% genes changed. Biological noise can be “subtracted” by judiciously picking strains that get rid of one effect and preserve others.

My own take on systems biology has always been that, while it is likely to become quite significant at some point:

a. It's going to take longer than we think.

b. Separating signal from noise and homing in on the handful of approaches which will be robust and meaningful is going to give us a lot of grief. This will likely be Darwinian selection at its best.

Patricia Hurter (Vertex): Formulation

For people like me in discovery, formulation is a whole new world. Compaction, rolling, powder flow, force-response curves; engineers would feel right at home with these concepts, and in fact they do. And of course, you don’t talk about anything less than 25 kilograms.

Eric Olson (Vertex): Cystic Fibrosis

- Most common mutation is F508del (present in 88% of patients)

- Two potential drug classes: potentiators (for restoring channel function) and correctors (for moving the protein from the ER to the membrane surface).
- However, only potentiators are needed for the G551D mutation (present in 4% of patients). Ivacaftor increases the probability of the channel being open; more beating cilia (nice video).
- Development challenges: little CF expertise, limited patient pop, no defined preclinical and regulatory path, outcomes for proof-of-concept and phase 3 not well established for mechanism-of-action.

I thought that the development of Vertex's CF drugs is a model example of charting out drug development in a novel, unexplored area.

Arun Ghosh (Purdue): Darunavir

From a medicinal chemistry standpoint this was probably the most impressive talk. Ghosh is one of the very few academic scientists to have a drug (Darunavir) named after them. He described the evolution of Darunavir from the key idea of targeting the backbone of HIV protease; the belief was that while side-chains differ between HIV mutants, the backbone stays constant, and therefore a compound binding to the backbone would be effective against resistant strains.

This idea turned out to be remarkably productive, and Ghosh described a series of studies that just kept on improving potencies against virtually any mutant HIV strain that the biologists threw at the compound. It was a medicinal chemist’s dream; there was a wealth of crystal structure data, compounds routinely turned out to have picomolar potencies, and almost every single modification that the chemists designed worked exactly as expected. Some of this success was of course good luck, but that’s something that’s usually a given in drug discovery. Darunavir and its analogs got fast-track FDA approval against HIV strains that had failed to respond to every other medication. Ghosh’s study was a powerful reminder that the right kind of design principle can lead to exceptional success, even against a target that's been beaten to death.

George Whitesides (Harvard): Challenges

Interesting talk by Whitesides. A pretty laid back speaker. The first half was a general rumination on the state of pharma and drug discovery ("the current model of capitalism is not working"; "the FDA has become unreasonable"; "if the best we can do in cancer is to invent a drug that gives someone 3 extra months with a lot of side effects, then we are doing something wrong").

The second half concerned his work on the hydrophobic effect. The papers deal with ligand binding to carbonic anhydrase. Basically he found that the so-called entropic signature of the hydrophobic effect (an increase in entropy from the release of bound water molecules) is more complicated than the textbook picture suggests.

A few notes:

- Designing drugs is hard because we are robust, multiplexed, complex systems.
- Cost of healthcare in the US is ~17% of GDP: also, no correlation between health cost and quality, as evidenced by low standing of US.
- Quoted Anna Karenina’s happy and unhappy families, which has something to do with drug development: every successful drug is successful in the same way; every unsuccessful drug is unsuccessful in its own way.
- Pharmaceutical crisis has nothing to do per se with science, everything to do with costs.

Finally, he made an important point: biochemists have always done experiments in dilute phosphate buffer. Interior of cell is anything but.

Favorite quote, regarding the limitations of animal models: "Whatever else you may think of me, I am not a large, hairless mouse”.

Book review: Benoit Mandelbrot's "The Fractalist"

"My life", says Benoit Mandelbrot in the introduction to his memoir, "reminds me of that fairy tale in which the hero finds a hitherto unseen thread, and as he unravels the thread it leads him to unimaginable and unknown wonders". Mandelbrot not only found those wonders, but bequeathed to us the thread which will continue to lead us to more wondrous discoveries.

Mandelbrot was one of those chosen few scientists in history who are generalists, people whose ideas impact a vast landscape of fields. A maverick in the best sense of the term, he even went one step further and created his own field of fractal geometry. In a nutshell, he developed a "theory of roughness", and the fractals which represent this roughness are now household names, even making it into "Jurassic Park". Today fractals are known to manifest themselves in a staggering range of phenomena: the rhythms of the heart, the distribution of galaxies, market fluctuations, the rise and fall of species populations, the shapes of blood vessels, earthquakes, and the weather. Before Mandelbrot, scientists liked to deal with smooth averages and equilibria, assuming that the outliers, the anomalies, the sudden jumps from normalcy were rare and could be ignored. Mandelbrot proved that they couldn't, and found methods to tame them and bring them into the mainstream. His insights into this new view of nature effected minor and major revolutions in fields as diverse as economics, astronomy, physiology and fluid dynamics. More than almost any other thinker he was responsible for teaching natural and social scientists to model the world as it is rather than the abstraction which they want it to be.

In this memoir Mandelbrot describes his immensely eventful and somewhat haphazard journey to these revelations. The volume is quirky, charming, wide-ranging, often lingering on self-similar themes, much like his fractals. It is divided into three parts. The first deals with family history and childhood influences. The second deals with a peripatetic, broad scientific education. The third details Mandelbrot's great moments of discovery, the ones he calls "Keplerian moments" in homage to the great astronomer who realized the power of abstract mathematical notions to illuminate reality.

Mandelbrot grew up in a Lithuanian family, first in Warsaw and then in France. He came from an educated and intellectually alert household. His most formative influences were his garment-maker father and dentist mother, and especially his mathematician uncle Szolem. The parents had acquired great reserves of tenacity, having been uprooted from one place to another at least six times because of the Depression. Szolem had toured the great centers of European mathematics and knew quite a few famous mathematicians himself. Mandelbrot grew up steeped in the mathematical beauty and folklore which Szolem vividly imparted to him. A dominant theme in the household was self-improvement, constantly challenging oneself to do better. This theme served Benoit well.

Mandelbrot's early years were marked by the rise of Nazism. After the fall of France his family fled Paris, taking refuge in the south of France before the country was liberated. There were dangerous moments, like his father narrowly escaping a strafing and Benoit and his cousin being interrogated by the Vichy police. After the war Mandelbrot studied at the prestigious École Polytechnique. At this point his central character traits started to reveal themselves: an intellectual restlessness that inspired forays into diverse fields, a thirst for knowledge that would take him to many corners of the globe, a tendency to question orthodox wisdom and, most importantly, an unwillingness to be a specialist. All these traits would turn out to be paramount in his future discoveries. Throughout his life Mandelbrot was known as a sometimes cantankerous and difficult person, but while there is a trace of these qualities in his memoir, most of the volume is generous in acknowledging the influence of family, friends, colleagues and institutions.

His intellectual restlessness led him across the Atlantic to major centers of scientific research including Caltech, MIT and the Institute for Advanced Study in Princeton, where he was the last postdoc of the great mathematician John von Neumann. Part of the joy of the book comes from Mandelbrot's accounts of encounters with a veritable who's who of late twentieth-century science, including von Neumann, Oppenheimer, Wiener, Feynman, Chomsky and Stephen Jay Gould. A particularly memorable incident has him flabbergasted by a penetrating comment from an audience member, with Oppenheimer and von Neumann coming to his defense and explaining his ideas even better than he could. At all these institutions Mandelbrot worked on a remarkable variety of problems, from aircraft design to linguistics, and acquired a rare, extremely broad education that would serve him in good stead.

As he explains, the trajectory of Mandelbrot's life was irrevocably changed when his uncle Szolem introduced him to Zipf's law, which deals with the frequencies of words in various languages. Mandelbrot discovered that Zipf's law led to some counterintuitive and universal results that could only be explained by non-standard distributions; this was when he discovered the high prevalence of what many had previously considered to be "rare" events. His work in this area as well as some preliminary work in economics led him to a highly productive position at IBM. Mandelbrot describes IBM's remarkable scientific culture that allowed scientists like him to pursue unfettered basic scientific research; sadly, that culture has now all but vanished in many organizations. During this time he stayed in touch with academia, giving seminars at many leading universities. Ironically, it was Mandelbrot's lack of specialization that made universities reluctant to hire him; implicitly, his experience is also a critique of an academic system that discourages broad thinkers and generalists. The difficulty of pinning down an unconventional thinker like Mandelbrot is reflected in the fact that Chicago found his interests too spread out while Harvard thought them too narrow!
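Zipf's law itself is easy to state: a word's frequency is roughly inversely proportional to its rank, so the second most common word appears about half as often as the first, the third about a third as often, and so on. A minimal sketch in Python (my own illustration, not code from the book):

```python
from collections import Counter

def rank_frequencies(text):
    """Word frequencies sorted from most to least common (rank order)."""
    counts = Counter(text.lower().split())
    return [freq for _, freq in counts.most_common()]

def zipf_prediction(freqs):
    """Zipf's law predicts freq(rank) ~ freq(1) / rank."""
    top = freqs[0]
    return [top / rank for rank in range(1, len(freqs) + 1)]
```

On real corpora the observed frequencies only roughly track the 1/rank curve; Mandelbrot's refinement, now called the Zipf-Mandelbrot law, added parameters to capture exactly those deviations.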

But IBM was more than happy to support his multiple intellectual forays, and in addition to his own explorations he also recounts IBM's pioneering work in software and graphics design. It was while at IBM that Mandelbrot discovered what he is most famous for - fractals. As the book recounts, the work arose partly from analyzing price and market fluctuations. Mandelbrot was struck by the uncanny similarity of disparate price and income curves and realized that the equilibrium models economists had relied on for decades were of little use in analyzing real-world jumps, which tended to be much more frequent than normal distributions would indicate. In a set of stunning and sweeping intellectual insights engendered by his broad scientific background, Mandelbrot realized that the math underlying an astonishing range of phenomena, from economic fluctuations to geographic coastlines, is the same. His work in this area was seminal by any standard, but it was not adopted by economists, partly because they found it difficult to use and partly because the field was entrenched in established ideas from equilibrium models. It was only in the 1980s that his insights became accepted into the mainstream, and the global recession of 2008 and the shocks to the economy have soundly validated his fractal fluctuation models. Outliers are not so rare after all, and as Nassim Taleb has documented, their impact can be tremendous and unpredictable. The parts of the book charting the road leading to fractals are fascinating and clearly detail the advantage of having a broad scientific education.

In spite of the lukewarm reception by economists Mandelbrot persevered along his general line of thinking, and in the late 1970s he discovered the iconic Mandelbrot set which made him a household name. Starting from an almost laughably simple formula, one quickly generates what has been called the most complex object in mathematics. The stunning geometry of the set today dots everything from murals to coffee mugs and there are hundreds of websites on which you can generate the set and examine it. Zooming in on the picture reveals a thick and endlessly complex jungle of self-similar geometric shapes and convolutions; one can gaze at this mesmerizing creature for hours.
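That "laughably simple formula" is the iteration z → z² + c starting from z = 0: a complex number c belongs to the Mandelbrot set if the iterates stay bounded forever (once |z| exceeds 2, they are guaranteed to escape to infinity). A bare-bones membership test, my own sketch rather than anything from the memoir:

```python
def in_mandelbrot(c, max_iter=100):
    """Iterate z -> z*z + c from z = 0; treat c as a member if |z|
    stays within the escape radius 2 for max_iter steps."""
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False  # escaped: c lies outside the set
    return True  # still bounded after max_iter steps

# The familiar pictures color each pixel c by how many iterations it
# takes to escape; the points that never escape form the black set.
```

The endless complexity one sees when zooming in all comes from this single line of arithmetic, which is precisely what made the discovery so startling.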

Mandelbrot retired from IBM in the 80s, and his career culminated in his appointment as Sterling Professor at Yale University. His eventful journey, from Warsaw to New Haven, holds many key lessons for us. He taught us to celebrate diversity and broad interests in an era of specialization. He shifted the focus of scientists from the idealized experiments of their laboratories to the messy world of reality. And he made it clear that many of the most penetrating insights into nature, like fractals, emerge from asking simple questions and exploring the obvious: What's the length of Britain's coastline? What's the shape of clouds? How does the heart beat?

It is hard to think of a twentieth century thinker whose ideas have influenced so many disciplines, and the fruits of Mandelbrot's labors promise continuing revelations long after his death in 2010. His memoir makes a resounding case for the virtues of indulging in, in Feynman's words, "perfectly reasonable deviations from the beaten track".