
Thanksgiving

When the Protein Data Bank releases a much-awaited protein structure which shows you *exactly* what you wanted to see, it's much like looking at a newly written Mozart symphony. There is a purpose to every single atom and water molecule, a rhythm to every single bond and interaction, a graceful yet deliberate elegance to every single curve of a loop, to every turn of an alpha helix, to the meandering pleats of every beta sheet.

The structure contains precisely the right number of interacting parts, not a single one more or less. Only this time the composer is nature, and the conductor is evolution. Looking at the structure you cannot help but get the feeling that nature has lifted a corner of her great veil for you to gaze in awe and appreciation at the edifice.

I don't say it enough, but today I feel grateful and humbled to be a scientist, to be granted the privilege of being able to sit in my modest little corner of the universe and revel in a movement from this grand performance.

Good luck to B.R.S.M!


B.R.S.M. is off to the US for a postdoc. And he does Woodward Wednesdays. What's not to like?

1. What is your message for BRSM?

It's the only time in your life when (unless you have a family) your only responsibility will be research; no teaching, no exams. Make the most of it while it lasts.

2. What is one postdoc survival tip you would give to BRSM?
Time is of the essence. While it's essential to have fun during your postdoc, you should hit the ground running and start making plans for job applications, grant proposals etc. right away.

3. Do you have a fun story you could share from your postdoc and/or US academic experience?
There was the time when a naive rotation student referred to the abbreviated name of a dye in her PowerPoint presentation. Some of us wanted to know the structure, but before we had second thoughts and could stop her, she had already done a Google image search. Turns out the abbreviated dye name was also the name of a porn star. I thought she was going to faint with embarrassment (the rotation student, that is; porn stars never faint with embarrassment).

4. A survival tip for living in the US?
Unless you are living in a city, get a car. You don't want to add to your hectic postdoc schedule by walking in the heat/cold or waiting for the bus that never shows up.

5. What would you like to see on BRSM blog in the future?
More Woodward Wednesdays, naturally.

6. Anything else?
Live long and reflux!

Chemistry, fluid dynamics and an awful radioactive mess

When it comes to handling radioactive waste, the Hanford site in southeastern Washington state is the opposite of a role model. Ever since its reactors started producing the plutonium that was used in the Nagasaki bomb, Hanford has been generating waste with little foresight or responsibility. It has the dubious honor of being the most contaminated radioactive site in the country.

Scientific American has an article which gives an idea of how truly awful the problem is. It's not just that there's a lot of waste or that it's everywhere. It seems like the waste basically conforms to the devil's definition of the word "heterogeneous" and takes a form representing the average nuclear chemist's version of hell:
"Overall, the waste tanks hold every element in the periodic table, including half a ton of plutonium, various uranium isotopes and at least 44 other radionuclides—containing a total of about 176 million curies of radioactivity. This is almost twice the radioactivity released at Chernobyl, according to Plutopia: Nuclear Families, Atomic Cities, and the Great Soviet and American Plutonium Disasters, by Kate Brown, a history professor at the University of Maryland, Baltimore County. The waste is also physically hot as well as laced with numerous toxic and corrosive chemicals and heavy metals that threaten the integrity of the pipes and tanks carrying the waste, risking leakage. 
The physical form of the waste causes problems, too. It’s very difficult to get a representative sample from any given tank because the waste has settled into layers, starting with a baked-on “hard heel” at the bottom, a layer of salt cake above that, a layer of gooey sludge, then fluid, and finally gases in the headspace between the fluid and the ceiling. Most of the radioactivity is in the solids and sludge whereas most of the volume is in the liquids and the salt cake."
"Plutopia", by the way, is a very interesting book. In any case, the waste problem at Hanford looks like it will engage the services of every conceivable kind of chemist, engineer and fluid dynamics expert that I can imagine.
"All of these considerations contribute to the overall problem, which can be summed up in one word: flow. To get to the glass log stage the waste has to travel through an immense labyrinth of tanks and pipes. It has to move at a fast enough clip to avoid pipe and filter clogs as well as prevent solids from settling. This is quite a challenge given the multiphasic nature of the waste: solids, liquids, sludge and gases all move differently. The waste feed through the system will be in the form of a “non-Newtonian slurry”—a mixture of fluids and solids of many different shapes, sizes and densities. If the solids stop moving, problems ensue."
The article also talks about two serious concerns: the possibility that enough plutonium in the waste could build up to trigger a chain reaction (although one which in bomb parlance would be a "fizzle") and the possibility that the heat and radiation could split water and lead to a buildup of hydrogen. For now these concerns are about unlikely events and are secondary in any case to the much more important problems of Sludge Management and the Battle against Viscosity. It just tells you how important it is to nip problems with reactor waste in the bud before they turn into a godforsaken headache for future generations.
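To get a feel for why "if the solids stop moving, problems ensue", here is a minimal sketch of a yield-stress fluid using the Herschel-Bulkley model commonly applied to slurries like these. Every parameter value below is invented purely for illustration:

```python
def herschel_bulkley_stress(shear_rate, tau_y, K, n):
    """Shear stress of a flowing Herschel-Bulkley fluid: tau = tau_y + K * rate**n."""
    return tau_y + K * shear_rate ** n

# Invented, illustrative parameters: yield stress (Pa), consistency index,
# and flow behavior index (n < 1 means shear-thinning)
tau_y, K, n = 5.0, 2.0, 0.6

for rate in [0.1, 1.0, 10.0, 100.0]:
    tau = herschel_bulkley_stress(rate, tau_y, K, n)
    print(f"shear rate {rate:6.1f} 1/s -> stress {tau:6.2f} Pa")

# Below the yield stress tau_y the slurry behaves like a solid rather than a
# liquid, which is exactly how solids settle out and clog pipes once flow slows.
```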

On synthesis, design and chemistry's outstanding philosophical problems


Chemists need to move from designing structure - exemplified by this synthetic receptor - to designing function (Image: Max Planck Institute).
Yesterday I wrote a post about a perspective by multifaceted chemist George Whitesides in which he urged chemists to broaden the boundaries of their discipline and think of big picture problems. But the article spurred me to think a bit more about a question which I (and I am sure other chemists) have often thought about; what’s the next big challenge for chemistry?

And when I ask this question I am not necessarily thinking of specific fields like energy or biotechnology or food production. Rather, I am thinking of the next outstanding philosophical question confronting chemistry. By philosophical question I don’t mean an abstract goal which only armchair thinkers worry about. The philosophical questions in a field are those which define the field’s big problems in the most general sense of the term. For physicists it might be understanding the origin of the universe, for biologists the origin of life. These problems can also be narrowly defined questions that nonetheless expand the understanding and scope of a field; for instance in the early twentieth century physicists were struggling to make sense of atomic spectra, which turned out to be important for the development of quantum theory. It’s also important to note that the philosophical problems of a field change over time, and this is one reason why chemists should be aware of them; you want to move with the times. If you were a “chemist” in the sixteenth century the big question was transmutation. In the nineteenth century, when chemistry finally was cast in the language of elements and molecules, the big question became the constitution of molecules in the form of atomic arrangements.

Synthesis is no longer chemistry’s outstanding general problem

When I think about the next philosophical question confronting chemistry I also feel a sense of despondency. That’s because I increasingly feel that the great philosophical question that chemists are going to face in the near future is emphatically not one whose answer they will locate in the all-pervasive activity that always made chemistry unique: synthesis. What always set chemistry apart was its ability to make new molecules that never existed before. Through this activity chemistry has played a central role in improving our quality of life.

The point is, synthesis was the great philosophical question of the twentieth century, not the twenty-first. Now I am certainly not claiming that synthesizing a complex natural product with fifty rotatable bonds and twenty chiral centers is even today a trivial task. I am also not saying that synthesis will cease to be a fruitful source of solutions for humanity’s most pressing problems, such as disease or energy; as a tool the importance of synthesis will remain undiminished. What I am saying is that the general problem of synthesis has now been solved in an intellectual sense (as an aside, this would be consistent with the generally pessimistic outlook regarding total synthesis seen on many blogs).

The general problem of synthesis was unsolved in the 30s. It was also unsolved in the 50s. Then Robert Burns Woodward came along. Woodward was a wizard who made molecules whose construction had defied belief. He had predecessors, of course, but it was Woodward who solved the general problem by proving that one could apply well-known principles of physical organic chemistry, conformational analysis and spectroscopy to essentially synthesize any molecule. He provided the definitive proof of principle. All that was needed after that was enough time, effort and manpower. If chemistry were computer science, then Woodward could be said to have created a version of the Turing Machine, a general formula that could allow you to synthesize the structure of any complex molecule, as long as you had enough NIH funding and cheap postdocs to fill in the specific gaps. Every synthetic chemist who came after Woodward has really developed his or her own special versions of Woodward’s recipe. They might have built new models of cars, but their Ferraris, Porsches and Bentleys – as elegant and impressive as they are – are a logical extension of Woodward and his predecessors’ invention of the internal combustion engine and the assembly line.

A measure of how the general problem of synthesis has been solved is readily apparent to me in my own small biotech company which specializes in cyclic peptides, macrocycles and other complex bioactive molecules. The company has a vibrant internship program for undergraduates in the area. To me the most remarkable thing is to see how quickly the interns can bring themselves up to speed on the synthetic protocols. Within a month or so of starting at the bench they start churning out these compounds with the same expertise and efficiency as chemists with PhDs. The point is, synthesizing a 16-membered ring with five stereocenters has not only become a routine, high-throughput task but it’s something that can be picked up by a beginner in a month. This kind of synthesis might have easily fazed a graduate student twenty years ago and taken up a good part of his or her PhD project. The bottom line is that we chemists have to now face an uncomfortable fact: there are still a lot of unexpected gems to be found in synthesis, but the general problem is now solved and the incarnation of chemical synthesis as a tool for other disciplines is now essentially complete.

Functional design and energetics are now chemistry’s outstanding general problems

So if synthesis is no longer the general problem, what is? My own field of medicinal chemistry and molecular modeling provides a good example. It may be easy to synthesize a highly complex drug molecule using routine techniques, but it is impossible, even now, to calculate the free energy of binding of an arbitrary simple small molecule to an arbitrary protein. There is simply no general formula, no Turing Machine that can do this. There are of course specific cases where the problem can be solved, but the general solution seems light years away. And not only is the problem unsolved in practice but it is also unsolved in principle. Sure, we modelers have been saying for over twenty years that we cannot calculate entropy or account for tightly bound water molecules. But these are mostly convenient explanations which, when enunciated, make us feel more emotionally satisfied. There have certainly been some impressive strides in addressing each of these and other problems, but the fact is that when it comes to calculating the free energy of binding, we are still today where we were in 1983. So yes, the calculation of free energies – for any system – is certainly a general problem that chemists should focus on.
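To make the gap concrete: going from a measured dissociation constant to a binding free energy is trivial thermodynamics, as the minimal sketch below shows (the 10 nM Kd is a made-up example). Going the other way, predicting that number from structure alone, is the unsolved general problem.

```python
import math

R = 1.987e-3  # gas constant, kcal/(mol K)
T = 298.15    # room temperature, K

def binding_free_energy(kd_molar):
    """Binding free energy from a dissociation constant: dG = RT * ln(Kd)."""
    return R * T * math.log(kd_molar)

# A hypothetical 10 nM inhibitor: dG comes out to about -10.9 kcal/mol
print(f"dG = {binding_free_energy(10e-9):.1f} kcal/mol")
```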

But here’s the even bigger challenge that I really want to talk about: We chemists have been phenomenal in being able to design structure, but we have done a pretty poor job in designing function. We have of course determined the function of thousands of industrial and biological compounds, but we are still groping in the dark when it comes to designing function. Here are a few examples: Through combinatorial techniques we can now synthesize antibodies that we want to bind to a specific virus or molecule, but the very fact that we have to adopt a combinatorial, brute force approach means that we still can’t start from scratch and design a single antibody with the required function (incidentally this problem subsumes the problem of calculating the free energy of antigen-antibody binding). Or consider solar cells. Solid-state and inorganic chemists have developed an impressive array of methods to synthesize and characterize various materials that could serve as more efficient solar materials. But it’s still very hard to lay out the design principles – in general terms – for a solar material with specified properties. In fact I would say that the ability to rapidly make molecules has even hampered the ability to think through general design principles. Who wants to go to the trouble of designing a specific case when you can simply try out all combinations by brute force?

I am not taking anything away from the ingenuity of chemists – nor am I refuting the belief that you do whatever it takes to solve the problem – but I do think that in their zeal to perfect the art of synthesis chemists have neglected the art of de novo design. Yet another example is self-assembly, a phenomenon which operates in everything from detergent action to the origin of life. Today we can study the self-assembly of diverse organic and inorganic materials under a variety of conditions, but we still haven’t figured out the rules – either computational or experimental – that would allow us to specify the forces between multiple interacting partners so that these partners assemble in the desired geometry when brought together in a test tube. Ideally what we want is the ability to come up with a list of parts and the precise relationships between them that would allow us to predict the end product in terms of function. This would be akin to what an architect does when he puts together a list of parts that allows him to not only predict the structure of a building but also the interplay of air and sunlight in it.

I don’t know what we can do to solve this general problem of design but there are certainly a few promising avenues. A better understanding of theory is certainly one of them. The fact is that when it comes to estimating intermolecular interactions, the theories of statistical thermodynamics and quantum mechanics do provide – in principle – a complete framework. Unfortunately these theories are usually too computationally expensive to apply to the vast majority of situations, but we can still make progress if we understand what approximations work for what kind of systems. Psychologically I do think that there has to be a general push away from synthesis and toward understanding function in a broad sense. Synthesis still rules chemical science and for good reason; it's what makes chemistry unique among the sciences. But that also often makes synthetic chemists immune to the (well deserved) charms of conformation, supramolecular interactions and biology. It’s only when synthetic chemists seamlessly integrate themselves into the end stages of their day job that they will learn better to appreciate synthesis as an opportunity to distill general design principles. Let the synthetic chemist interact with the physical biochemist, the structural engineer, the photonics expert; let him or her see synthesis through the requirement of function rather than structure. Whitesides was right when he said that chemists need to broaden out, but another way to interpret his statement would be to ask other scientists to channel their thoughts into synthesis in a feedback process. As chemists we have nailed structure, but nailing design will bring us untold dividends and will help make the world a better place.

First published on the Scientific American Blog Network.

George Whitesides on the responsibility of chemists and the future of chemistry

Catching up on a few articles I had missed, I came across a characteristically deep and wide-ranging essay called "Assumptions" by George Whitesides about science, its future and our responsibility as scientists. It's a very general and kaleidoscopic essay not restricted to chemistry, but the bits about chemistry, its role in understanding the major problems confronting humanity and chemists' responsibility in extending the scope of chemical science are quite thought-provoking:

Chemistry, by its culture, has been almost blindly reductionist. I am repeatedly reminded that “Chemists work on molecules”, as if to do anything else was suspect. Chemists do and should work on molecules, but also on the uses of molecules, and on problems of which molecules may be only a part of the solution. If chemists move beyond molecules to learn the entire problem—from design of surfactants, to synthesis of colloids, to MRI contrast agents, to the trajectories of cells in the embryo, to the applications of  regenerative medicine—then the flow of ideas, problems, and solutions between chemistry and society will animate both. 
Whitesides is clearly making a plea for chemists to become even more interdisciplinary than they already are, to pursue not just the development of the solution but its application and integration; his own group provides a remarkable example of chemists, physicists, biologists and engineers working together on highly multidisciplinary problems. It's quite clear that to achieve this interdisciplinary expertise we have to completely break down the traditional barriers between synthesis, structure determination, biology and materials (in this world the professor who rejected my biochemical literature seminar topic because it "did not include any synthesis" would be an anachronism). The next paragraph makes clear the role of the "central science":
As a technology, chemistry has built the foundation from which many of the discoveries of “biology” or “microelectronics” or “brain science” (or “planetary exploration”, for that matter) have grown. There would be no genomics without chemical methods for separating fragments of DNA, and for synthesizing primers and probes, and for separating restriction endonucleases into pure activities. There would be no nuclear ICBMs without methods of refining plutonium, and making explosive lenses. There would be no drugs without synthesis and mass spectroscopy. There would be no interplanetary probes without fuels, and carbon/carbon rocket throat nozzles, and silicon single crystals. 
And here's something about what the future of chemistry should be:
Those are the past. What about the future? Chemistry is, still, everywhere: It must be! It is the science of the real world. But to remain a star in the play rather than a stagehand, it must open its eyes to new problems. It is impossible that the human life span will increase dramatically without manipulation of the molecules of the human organism, but understanding this problem will require more than manipulating molecules. Communication between the living and non-living will require engineering a molecular interface between them, but designing this interface will require understanding the nature of “information” in organisms and in computers, and how to translate between them. A society that uses information technology to interweave all its parts requires new systems for generating, distributing, and storing power, but batteries will be only one part of these systems.  
Chemistry has always been the invisible hand that builds and operates the tools, and sustains the infrastructure. It can be more. We think of ourselves as experts in quarrying blocks from granite; we have not thought it our job to build cathedrals from them. Whether we choose to focus on the molecules, materials, and tools that are at the beginnings of discovery, or bring our particular, unique understanding of the world to bear on unraveling the problems at the end, is for us to decide.  I believe that everything from methane to sentience is chemistry, and that we should reexamine our own assumptions concerning the boundaries of our field. Examining the broader assumptions that follow may provide some stimulus to do so.  
Indeed, examining the "broader assumptions" of their field in the broadest sense of the term is what chemists should do. The first paragraph presents a fair sampling of the myriad problems in which chemistry can play a central role. They involve everything from engineering interfaces between computers or electronics and human brains to harnessing the power of chemistry in generating, storing, interconverting and deploying energy in all its forms. I strongly think that the future of chemistry lies in recasting itself as an informational science in the broadest sense. At the level of biology chemistry has already manipulated information in the form of sequencing and genomics; synthetic biology will take this capability to a whole new level. But there are other areas in which chemistry can serve to manipulate information, and part of what Whitesides is doing is challenging chemists to become informational scientists in hitherto unexplored areas like energy and transportation.

The essay ends with a systems-level view of chemistry that every chemist should keep in mind, even as she works in her narrow world of natural products, zeolites, ROMP or kinases.
Because chemistry contributes broadly to the foundations of technology, it is particularly difficult to guess its future impact: a new chemical reaction might be used to make a cancer therapeutic, or a chemical weapon. Some of the opportunities that seem within the reach of investigation, if not within the reach of solution—technologies that might substantially prolong life, or develop new forms of life, or lead to sentient systems that rival us in intelligence—will do both good and harm. At minimum, those of us who pursue these problems should accept an obligation to explain to our fellow citizens fully and clearly what we are doing, and why, and (to the limited extent we can) with what possible outcomes. Humankind will do what it will do, but at least everyone should understand—in so far as is possible—what the choices are, and what the consequences might be. Chemistry, if it takes more interest in (and responsibility for) the full scope of programs—from molecules, to applications, and to influence on society—may be able to use the very breadth of its connections to technology to help in this explanation.
Whitesides Image: Boston.com

Splenda and - wait for it - DDT? You've got to be kidding me

Just when you think the perpetrators of chemophobia (actually this particular case makes chemophobia look like a knight in shining armor) cannot outdo themselves, someone hits a new low.

This time it's "alternative" "medicine" "physician" Joseph Mercola. In a diatribe against Splenda he tosses out this gem:

"Splenda—"Made from Sugar" But More Similar to DDT...

That's right.
The catchy slogan "Made from sugar so it tastes like sugar" has fooled many, but chemically, Splenda is actually more similar to DDT than sugar."
There is no mention of how exactly Splenda is even remotely close to DDT in structure, function or any other conceivable parameter for that matter. I really shouldn't have to do this, but here are the structures of the two molecules.



Why in the name of merciful Odin would these be considered similar? Because both of them have chlorines and we all know that chlorine is a toxic gas used in World War I? I would say then that they share even more hydrogens than chlorines, and hydrogen is of course a flammable gas, which can only mean that both Splenda and DDT have got to explode when consumed.
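For the record, chemists have perfectly standard ways to quantify molecular similarity. Here is a minimal RDKit sketch comparing fingerprint similarities; the SMILES strings are drawn without stereochemistry and are worth double-checking against a database, so treat the whole thing as illustrative:

```python
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

# Non-stereochemical SMILES, for illustration only
smiles = {
    "sucralose": "OCC1OC(OC2(CCl)OC(CCl)C(O)C2O)C(O)C(O)C1Cl",
    "sucrose":   "OCC1OC(OC2(CO)OC(CO)C(O)C2O)C(O)C(O)C1O",
    "DDT":       "ClC(Cl)(Cl)C(c1ccc(Cl)cc1)c1ccc(Cl)cc1",
}

# Morgan (circular) fingerprints, radius 2
fps = {name: AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(s), 2, nBits=2048)
       for name, s in smiles.items()}

print("sucralose vs sucrose:", DataStructs.TanimotoSimilarity(fps["sucralose"], fps["sucrose"]))
print("sucralose vs DDT:    ", DataStructs.TanimotoSimilarity(fps["sucralose"], fps["DDT"]))
# Expect sucralose to look far more like sugar than like DDT.
```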

Naturally this goes beyond chemophobia and handily ends up way inland in the territory of unadulterated twaddle. Read the entire page if you are craving that migraine and nausea you have been longing for. I haven't read the whole thing, and who could blame me? A quick look through some of the references reveals the usual egregious howlers (studies extrapolated from rats that have been fed unrealistic doses of the material for an abnormally long period of time, etc.) and I have no reason to believe that a more detailed look would turn up anything different.

It strains my imagination to contemplate how even the loopiest of quacks could actually write something like this, let alone sincerely believe it. It's one of those very few times when freedom of speech starts sounding like a bad idea.

H/T: The promising "Chemicals are your Friends" page on Facebook, via Stuart Cantrill.

Stephen Hawking's advice for twenty-first century grads: Embrace complexity


Charles Joseph Minard's famous graph showing the decreasing size of Napoleon's Grande Armée as it marches to Moscow; a classic in data visualization (Image: Wikimedia Commons)
As the economy continues to chart its own tortuous, uncertain course, there seems to have been a fair amount of much-needed discussion on the kinds of skills new grads should possess. These skills of course have to be driven by market demand. As chemist George Whitesides asks, for instance: what's the point of getting a degree in organic synthesis in the United States if most organic synthesis jobs are in China?

Upcoming grads should indeed focus on what sells. But from a bigger standpoint, especially in the sciences, new skill sets are also inevitably driven by the course that science is taking at that point. The correlation is not perfect (since market forces still often trump science) but a few examples make this science-driven demand clear. For instance if you were growing up in the immediate post-WW2 era, getting a degree in physics would have helped. Because of its prestige and glut of government funding, physics was in the middle of one of its most exciting periods. New particles were streaming out of the woodwork, giant particle accelerators were humming and federal and industrial labs were enthusiastically hiring. If you were graduating in the last twenty years or so, getting a degree in biology would have been useful because the golden age of biology was just entering its most productive years. Similarly, organic chemists enjoyed a remarkably fertile period in the pharmaceutical industry from the 50s through the 80s because new drugs were flowing out of drug companies at a rapid pace and scientists like R. B. Woodward were taking the discipline to new heights.

Demand for new grads is clearly driven by the market, but it also depends on the prevalence of certain scientific disciplines at specific time points. This in turn dictates the skills you should have; a physics-heavy market would need skills in mathematics and electronics, for instance, while a biology-heavy market would mop up people who can run Western blots and PCR. Based on this trend, what kind of skills and knowledge would best serve graduates in the twenty-first century?

To me the answer partly comes from an unlikely source: Stephen Hawking. A few years ago, Hawking was asked what he thought of the common opinion that the twentieth century was that of physics and the twenty-first century would be that of biology. Hawking replied that in his opinion the twenty-first century would be the "century of complexity". That remark probably holds more useful advice for contemporary students than they realize since it points to at least two skills which are going to be essential for new college grads in the age of complexity: statistics and data visualization.

Let's start with the need for statistics. Many of the most important fields of twenty-first century research including neuroscience, synthetic and systems biology, materials science and energy are inherently composed of multilevel phenomena that proliferate across different levels of complexity. While the reductionist zeitgeist of the twentieth century yielded great dividends, we are now seeing a movement away from strict reductionism toward emergent phenomena. While the word "emergence" is often thrown around as a fashionable buzzword, the fact is that complex, emergent phenomena do need a different kind of skill set.

The hallmark of complexity is a glut of data. These days you often hear talk of the analysis of 'Big Data' as an independent field and you hear about the advent of 'data scientists'. Big Data has now started making routine appearances in the pharmaceutical and biotech industry, whether in the form of extensive multidimensional structure-activity relationship (SAR) datasets or as bushels of genomic sequence information. It's also important in any number of diverse fields ranging from voter behavior to homeland security. Statistical analysis is undoubtedly going to be key to analyzing this data. In my own field of molecular modeling, statistical analysis is now considered routine in the analysis of virtual screening hits, although it's not as widely used as it should be.
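As one concrete illustration, here is a minimal sketch of the sort of routine check a modeler might run on virtual screening scores using scikit-learn; the data are randomly generated stand-ins, not real screening results:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=1000)            # 1 = true active, 0 = decoy
scores = rng.normal(loc=0.5 * labels, scale=1.0)  # fake scores, mildly enriched for actives

# ROC AUC: 0.5 means the screen does no better than random, 1.0 is perfect
print(f"ROC AUC: {roc_auc_score(labels, scores):.2f}")
```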

Statistics was of course always a useful science but now it's going to be paramount; positions explicitly looking for 'data scientists', for instance, specifically ask for a mix of programming skills and statistics. Sadly, many formal college requirements still don't include statistics, and most scientists, if they learn it at all, learn statistics on the job. For thriving in the new age of complexity this scenario has to change. Statistics must now become a mandatory part of science majors. A modest step in this direction is the publication of user-friendly, popular books on statistics like Charles Wheelan's "Naked Statistics" or Nate Silver's "The Signal and the Noise", which have been quickly devoured by science-savvy readers. Some of these are good enough to be prescribed in college statistics courses for non-majors.

Along with statistics, the other important skill for students of complexity is going to be data visualization and formal college courses should also reflect this increasingly important skill set. Complex systems often yield data that's spread over different levels of hierarchy and even different fields. It's quite a challenge to visualize this data well. One resource that's often recommended for data visualization is Edward Tufte's pioneering series of books. Tufte shows us how to present complex data often convoluted by the constraints of Excel spreadsheets. Pioneering developments in human-computer interaction and graphics will nonetheless ease visual access to complicated datasets. Sound data visualization is important not just to simply understand a multilayered system or problem but also to communicate that understanding to non-specialists. The age of complexity will inherently involve researchers from different disciplines working together. And while we are at it, it's also important to stress - especially to college grads - the value of being able to harmoniously co-exist with other professionals.
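As a small, self-contained illustration of one Tufte-inspired technique, here is a matplotlib sketch of small multiples: the same simple plot repeated across panels so that layered data can be compared at a glance. The data are synthetic placeholders:

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 10, 100)

# Six panels sharing axes, one per (made-up) experimental condition
fig, axes = plt.subplots(2, 3, figsize=(9, 5), sharex=True, sharey=True)
for i, ax in enumerate(axes.flat):
    ax.plot(x, np.sin(x + i) * np.exp(-0.1 * i * x), lw=1)
    ax.set_title(f"condition {i + 1}", fontsize=9)

fig.suptitle("Small multiples: one variable, six conditions")
plt.tight_layout()
plt.show()
```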

Hawking's century of complexity will call upon all the tools of twentieth century problem solving along with a few more. Statistics and data visualization are going to be at the forefront of the data-driven revolution in complex systems. It's time that college requirements reflected these important paradigms.

First published on the Scientific American Blog Network.

Some thoughts on the events around Boston

We were enjoying a quiet evening of music and reading on Thursday when my wife alerted me to a message she got from the MIT emergency system that there had been a shooting somewhere on the campus. A while later we came to know that a police officer had been shot right in front of my wife's department. After making sure that folks we knew from MIT were safe, we stayed awake for about two more hours reading the news updates. By the time we went to sleep we had found out that there was a connection between the shooting of the MIT police officer - a promising young man who tragically died of his wounds - and the Boston marathon bombing.

When we woke up the next day the situation was a little surreal: "Has Boston turned into Baghdad?", a friend tweeted. The police had pursued the two bombing suspects into the neighboring suburb of Watertown where there had been a terrific firefight. One suspect died (died, as it turned out, because his brother ran him over) and his brother escaped. By the time we woke up in Cambridge, Watertown was already in lockdown and police were getting ready for house to house searches within a 20 block perimeter.


Then we heard that Boston and a few of its suburbs - roughly an area comprising a million inhabitants - were in lockdown and all residents had been asked to stay at home. I thought then and I still think that this was an overreaction. Watertown, where the suspect was thought to be hiding? Sure. But Boston, Cambridge, Belmont, Newton and four others? A little over the top in my opinion. I understand that many people stayed home out of deference to authorities' wishes to be able to do their job unfettered. It's also ok to ask residents to be vigilant and to venture out at their own risk, but we do this anyway. Every time we are out we run the risk of being in a traffic accident. I suspect that this risk in a random suburb which is not Watertown is probably higher than that of a 19-year-old fanatic suffering from blood loss coming out of the blue with guns blazing and shooting at you. Now I understand that the police did not force people to stay indoors, but they were also quite emphatic about this; I watched a woman who stepped out in the middle of the day to walk her dog being emphatically told to stay inside by two officers out on patrol.


The huge police presence in Watertown also seemed like an overreaction to me. By one account there were 9000 local, state, and federal authorities looking for this kid. Armored vehicles patrolled the streets, and I am not sure what additional purpose they would have served. Sure, the authorities were erring on the side of safety and they were clearly anxious to apprehend the suspect as soon as possible, but I think it's constitutionally healthy to be skeptical when your whole neighborhood resembles a war zone and armed officers wielding every kind of weapon perform intrusive house searches.


For me the ultimate irony may be that this guy was located - not by one of the 9000 officers and military personnel - but by an ordinary citizen. In a boat in an area that was not part of the 20 block perimeter. After the lockdown order had been rescinded.


What happened there? I know that hindsight is always twenty-twenty but here's something that bothers me: From what I read it seems that the spectacular shootout occurred at the intersection of Laurel St and Dexter Ave in Watertown. The suspect was found hiding in the boat at 67 Franklin St. If you look at these locations on Google Maps they are less than a mile apart. For all the meticulous house-to-house searches and lockdowns, why did the perimeter not include a location that was less than a mile from where the shootout took place? And most importantly, how could the police miss the boat, a large, roomy object that's ideal for a human being to hide in? Can you say that your operation was really successful when an ordinary citizen locates a suspect only after you are done with house-to-house searches? So on one hand there seemed to be an overreaction and on the other, the meticulous operation seems to have been unsuccessful in its primary purpose.


I understand that there were a lot of police officers and other personnel who immersed themselves in this investigation. Many of them had not slept in 24 hours and they were clearly committed to finding this guy as soon as possible. These people clearly did an admirable job and we should applaud their dedication. But in my opinion there seem to be a few important clues that were missed, and discussing these clues is not only an important part of a healthy democracy where public officials are answerable to the public but also a part of any system of self-improvement and feedback where you learn from your mistakes. Most importantly though, when a 19-year-old nutjob brings a major American city to a standstill, makes it resemble a state under martial law and makes people stay put in their houses and away from their jobs in anxiety, if not fear, the terrorists have already won (as the cliche goes, in this case because it's true). As Ben Franklin memorably put it, if you sacrifice freedom for security you risk losing both. And the key here is to realize that this sacrifice may not even be forced upon you by the state; it can be entirely self-imposed.


At 5 PM I grew really restless and decided to go outside to get some milk (I need my morning coffee fix, terrorist scares be damned). Everything except for one convenience store was closed. Parking on Massachusetts Ave never looked better. The next day we went to the Esplanade along the Charles River. The cherry blossoms were in full bloom. Something about fear being the only thing we should truly fear came to my mind.


Moore's Law for batteries: No dice


The REVAi/G-Wiz i electric car charging at an on-street station in London (Image: Wikimedia Commons)
Ever since Gordon Moore came up with the ubiquitous law bearing his name, it has been applied to paradigms far beyond those which it was intended for. This is perhaps not surprising; the history of science and technology - and of religion - has consistently demonstrated that the followers of a prophet usually extend his principles into domains which the prophet never really approved of.

Transistor technology does seem to follow the Moore's Law curve neatly, and a few other cutting-edge technologies like genome sequencing also seem to do this. Yet Moore's proselytizers have extended his law to pretty much everything. The law especially seems to break down when applied to biomedical research; for instance, a review from last year pointed out how the pace of drug development almost seems to have been following a reverse law of declining productivity, dubbed "Eroom's Law" ("Moore" spelled backwards). Kurzweilian prognostications notwithstanding, research in neuroscience might follow the same trajectory, with a burst of rapid mapping of neuronal connectivity followed by a long, fallow period in which we struggle to duplicate these processes by artificial means.

There are two basic reasons why an emerging technology may not follow Moore's Law: either we tend to underestimate the complexity of the system to which the technology is applied, or we underestimate the basic principles of physics and chemistry which would inherently constrain a Moore-type breakthrough in that field. In the case of medical research both these constraints seem to rear their ugly, emergent heads, and this is the main problem I have with futurists like Ray Kurzweil who seem to imagine an entire universe governed by Moore's Law-type exponential progress in every field. Not all levels of complexity are created equal, and we just don't have enough evidence to know how general Moore's Law (which I think should simply be re-named "Moore's Observation") is in the world of practical problem-solving.

The argument about basic science limitations may especially apply to much-touted battery research whose proponents often seem to declare the next breakthrough in battery technology as being just around the corner. But a perspective from Fred Schlachter from the American Physical Society in the Proceedings of the National Academy of Sciences puts a brake on these optimistic predictions. His point is simple: any kind of Moore's Law for batteries may be limited by the fundamental chemistry inherent in a battery's workings. This is unlike transistors, where finer lithography techniques have essentially enabled a repetitive application of miniaturization over the years.
There is no Moore’s Law for batteries. The reason there is a Moore’s Law for computer processors is that electrons are small and they do not take up space on a chip. Chip performance is limited by the lithography technology used to fabricate the chips; as lithography improves, ever smaller features can be made on processors. Batteries are not like this. Ions, which transfer charge in batteries, are large, and they take up space, as do anodes, cathodes, and electrolytes. A D-cell battery stores more energy than an AA-cell. Potentials in a battery are dictated by the relevant chemical reactions, thus limiting eventual battery performance. Significant improvement in battery capacity can only be made by changing to a different chemistry.
And even this different chemistry is going to be governed by fundamental parameters like the sizes of ions and the rates of chemical reactions and current flow. Schlachter goes on to note the problems that lithium batteries have recently encountered, including fires. There is thus no guarantee that there will be a breakthrough in battery technology that's equivalent to that in computer technology over the last thirty years. And the article is right that while we are waiting for such breakthroughs, it's a really good idea to push forward with improving energy efficiency in cars, making them lighter, smaller and more powerful. Energy efficiency would not ultimately solve pollution problems since the cars would still be fueled by gasoline, but it would certainly take us a long way while we are waiting for the next battery breakthrough engineered by Moore's Law, a law which may not really hold when it comes to next generation electric technology.
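A quick back-of-the-envelope sketch makes Schlachter's point concrete: a cell's theoretical specific energy is pinned down by its chemistry (voltage, electrons transferred, and the mass of the reacting species), not by any fabrication trick. The numbers below are rounded, illustrative values for a lithium cobalt oxide/graphite couple:

```python
F = 96485  # Faraday constant, C/mol

def specific_energy_wh_per_kg(voltage, n_electrons, molar_mass_g):
    """Theoretical specific energy set by the cell chemistry alone."""
    # n*F*V gives J per mole of reaction; dividing by the molar mass (g/mol)
    # gives J/g, and J/g divided by 3.6 equals Wh/kg
    return n_electrons * F * voltage / (3.6 * molar_mass_g)

# ~3.7 V, one electron per lithium, ~170 g of active electrode mass per mole of Li
print(f"~{specific_energy_wh_per_kg(3.7, 1, 170):.0f} Wh/kg theoretical ceiling")
# Real packs come in far lower once electrolyte, separators and casing are added.
```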

First published on the Scientific American Blog Network.

Friday levity: 'Nature' discusses ghosts

One of the pleasures of thumbing through old issues of science journals is the opportunity to accidentally discover articles or letters that make you do double takes, often followed by face palms. 

As I was about to read a letter in Nature bemoaning the closure of the Hoffmann-La Roche Institute of Molecular Biology in Nutley, NJ (Nature, 1995, 373, 184; déjà vu, anyone?), I came across a letter on the same page that pristinely tosses out the following for readers' benefit (click for clarity...or the lack thereof).



I love the fact that the letter writer dismisses one hypothesis about ghosts only to come up with another. And this isn't 1885, it's 1995. Oh how I miss the old Nature.