The LHC is a product of both time and multiple disciplines
It is not uncommon to hear experimentalists from other disciplines, and even modelers themselves, grumbling about the unsatisfactory state of the field, and with good reason. Nor are the reasons entirely new: the techniques are based on an incomplete understanding of the behavior of complex biological systems at the molecular level; they are parametrized on limited training sets and are therefore not generally applicable; and they do a much better job of explaining than of predicting (a valid point, although it's easy to forget that explanation is as important in science as prediction).
To most of these critiques my brethren and I plead guilty; and nothing advances a field like informed criticism. But I also have a few responses, foremost among which is one that is often under-appreciated: on the scale of scientific revolutions, computational chemistry and molecular modeling are nascent fields, only just emerging from the cocoon. Or, to be pithier: give it some more time. This may seem like a trivial point, but it's an important one and worth contemplating. Turning a scientific discipline from an unpolished, rough gem-in-the-making into the Koh-i-Noor diamond takes time. To drive this point home I want to compare the state of molecular modeling - a fledgling science - with physics - perhaps the most mature science. Today physics has staked its claim as the most accurate and advanced science we know. It has mapped everything from the most majestic reaches of the universe at the largest scale to the production of virtual particles inside the atom at the smallest. The accuracy of both its calculations and its experiments can beggar belief: on one hand we can calculate the magnetic moment of the electron to better than ten decimal places using quantum electrodynamics (QED), and on the other we can measure the same quantity to a comparable accuracy using ultra-sensitive equipment.
But consider how long it took us to get there. Modern physics as a formal discipline can be said to have started with Isaac Newton in the mid-17th century. Newton was born in 1642; QED came of age around 1950, roughly 300 years later. So it took about three centuries for physics to go from the development of its basic mathematical machinery to divining the magnetic moment of the electron from first principles to a staggering level of accuracy. That's a long time to mature. Contrast this with computational chemistry, a discipline that spun off from the tree of quantum mechanics after World War 2. The application of the discipline to complex molecular entities like drugs and materials is even more recent, taking off in the 1980s - thirty years ago. Thirty years versus three hundred: no wonder physics is so highly developed while molecular modeling is still learning to walk. It would be like criticizing the physics of 1700 for not being able to launch a rocket to the moon. A more direct comparison for modeling is with synthetic chemistry - a mainstay of drug discovery - which is now capable of making almost any molecule on demand. Synthetic chemistry began in about 1828, when the German chemist Friedrich Wöhler first synthesized urea from simple inorganic compounds. That's a period of almost two hundred years for synthetic chemistry to mature.
But it's not just the time required for a discipline to mature; it's also the development of all the auxiliary sciences that play a crucial role in its evolution and make its culmination possible. Consider again the mature state of physics in, say, the 1950s. Before it could get to that stage, physics needed critical input from other disciplines, including engineering, electronics and chemistry. Where would physics have been without cloud chambers and Geiger counters, without cyclotrons and lasers, without high-quality ceramics and polymers? The point is that no science is an island, and the maturation of any particular field requires the maturation of a host of others. The same goes for the significant developments in mathematics - multivariate calculus, the theory of Lie groups, topology - that made progress in modern physics possible. Similarly, synthetic chemistry would not have been possible had NMR spectroscopy and X-ray diffraction not provided the means to determine the structures of molecules.
Molecular modeling is similarly constrained by input from other sciences. Simulation really took off in the 80s and 90s with the rapid advances in computer hardware and software; before then, chemists and physicists had to devise clever theoretical algorithms to calculate the properties of molecules simply because they lacked the computational firepower. Now consider which other disciplines modeling depends on - most notably chemistry. Without chemists rapidly making molecules and providing both robust databases and predictive experiments, it would be impossible for modelers to validate their models. Modeling has also received a tremendous boost from the explosion of protein crystal structures engendered by genomics, molecular biology, synchrotron sources and computer software for data processing. The evolution of databases, data-mining methods and the whole infrastructure of informatics has likewise fed into the growth of modeling. One can even say without exaggeration that molecular modeling is ultimately a product of our ability to manipulate elemental silicon and produce it in ultrapure form.
Thus, just as physics depended on mathematics, chemistry and engineering, modeling has depended crucially on biology, chemistry, and computer science and technology. And compared to physics, these disciplines are relatively new too. Biology especially is still taking off, and even now it cannot easily supply the kind of data needed to build robust models. Computer technology is very efficient, but still not efficient enough to run quantum mechanical calculations on complex molecules in a high-throughput manner (I am still waiting for that quantum computer). And of course, we still don't fully understand all the forces and factors that govern the binding of molecules to one another, nor how to capture those factors in clean, user-friendly computer algorithms and graphical interfaces. It's a bit like asking physics to progress without access to high-voltage sources, lasers, group theory and a proper understanding of the structure of the atomic nucleus.
Thus, thirty years is simply not enough time for a field to claim a very significant degree of success. In fact, considering how new the field is and how many unknowns it still grapples with, I would say that molecular modeling is actually doing quite well. The fact that computer-aided molecular design was over-hyped at its inception does not make it any less useful, and it's silly to think so. In the past twenty years we have at least gotten a good handle on the major challenges we face, and we have a reasonably good idea of how to proceed. In major and minor ways, modeling continues to make useful contributions to the very complicated and unpredictable science and art of drug design and discovery. For a field that's thirty years old, I would say we aren't doing so badly. And considering the history of science and technology, as well as the success of human ingenuity in so many forms, I would say the future is undoubtedly bright for molecular simulation and modeling. That conviction is as realistic as any other in science, and it's one of the things that gets me out of bed every morning. In science fortune always favors the patient, and modeling and simulation will be no different.