Modular drug design software?

The latest issue of C&EN has an interesting article (unfortunately subscription only) about how quantum chemists are making code for standard protocols in quantum chemistry calculations available to each other as off-the-shelf modules. The movement has been driven by the realization that whoever develops a new quantum chemistry program has to go through the tedious process of rewriting code for standardized algorithms like the Hartree-Fock method for calculating the energies of molecules. Why reinvent the wheel when you can simply buy one off the shelf at a centralized tire shop?

I like this idea and applaud the quantum chemists for their generosity in sharing their code. But it left me wondering how soon something similar could happen in the world of computational drug design, or whether it would even be feasible.

The essence of methods like Hartree-Fock is that their highly iterative and standardized nature makes them instantly amenable to computation. Your code for Hartree-Fock may be faster and cleaner than the other fellow's, but the basic methodology, which can be captured in a well-defined flowchart, is not going to change. Contrast this with 'standard' drug design software protocols like docking, similarity searching and molecular dynamics (MD) calculations.
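To make the "well-defined flowchart" point concrete, here is a minimal sketch of the self-consistent field (SCF) loop at the heart of any Hartree-Fock code. The function and argument names are mine, not from any particular package, and the integral arrays are assumed to be computed elsewhere:

```python
import numpy as np
from scipy.linalg import eigh

def scf_loop(h_core, overlap, eri, n_electrons, max_iter=100, tol=1e-8):
    """Skeleton of the restricted Hartree-Fock SCF iteration.
    h_core, overlap and eri are precomputed integral arrays in
    chemists' (pq|rs) notation; the names here are illustrative."""
    n = h_core.shape[0]
    density = np.zeros((n, n))  # simple initial guess: empty density
    energy = 0.0
    for _ in range(max_iter):
        # Fock matrix: core Hamiltonian + Coulomb - 1/2 exchange
        coulomb = np.einsum('pqrs,rs->pq', eri, density)
        exchange = np.einsum('prqs,rs->pq', eri, density)
        fock = h_core + coulomb - 0.5 * exchange
        # Solve the generalized eigenproblem F C = S C e
        _, coeffs = eigh(fock, overlap)
        occ = coeffs[:, : n_electrons // 2]   # doubly occupied orbitals
        new_density = 2.0 * occ @ occ.T
        new_energy = 0.5 * np.sum((h_core + fock) * new_density)
        if abs(new_energy - energy) < tol:
            return new_energy, new_density    # converged
        density, energy = new_density, new_energy
    raise RuntimeError("SCF did not converge")
```

Every Hartree-Fock implementation, however fast or clean, is ultimately some elaboration of this loop, which is exactly what makes it a natural off-the-shelf module.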

Even though the objective is the same in each case, every practitioner uses his or her own favorite technique: docking, for instance, can be physics-based or knowledge-based, or it may depend on genetic algorithms. The sampling algorithms in MD can similarly differ from one package to the next. Docking and MD are thus not as 'standardized' as, say, the Hartree-Fock method, so it may be difficult to offer these protocols as standardized modules.

However, I see no reason why more specialized components that are in fact standard could not be offered for wider use by the community. For instance, certain force fields - the parameters and equations used to calculate structure and energetics - are pretty standard; the MMFF force field will have one set of components and MM2 will have another. Similarly, within a protocol like MD, the precise sampling methods can be much more standard than the overall package. So in principle these methods could be packaged as standardized modules and offered to users.
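As a toy illustration of what such a standardized component might look like, here is a harmonic bond-stretch term of the kind every force field contains. The parameter values below are placeholders for illustration, not actual MMFF or MM2 numbers:

```python
def bond_stretch_energy(r, r0, k):
    """Harmonic bond-stretch term: E = 0.5 * k * (r - r0)**2.
    r0 and k come from a force-field parameter table."""
    return 0.5 * k * (r - r0) ** 2

# A force field is, in essence, equations like the one above plus a
# parameter table. Hypothetical parameters keyed by bond type:
BOND_PARAMS = {
    ('C', 'C'): {'r0': 1.53, 'k': 300.0},  # illustrative numbers only
    ('C', 'H'): {'r0': 1.09, 'k': 340.0},
}

def total_bond_energy(bonds):
    """bonds: iterable of (atom_type_1, atom_type_2, bond_length)."""
    total = 0.0
    for a, b, r in bonds:
        p = BOND_PARAMS[tuple(sorted((a, b)))]
        total += bond_stretch_energy(r, p['r0'], p['k'])
    return total

print(total_bond_energy([('C', 'C', 1.55), ('C', 'H', 1.10)]))
```

Swapping MMFF for MM2 would then amount to swapping one parameter table and set of functional forms for another behind the same interface.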

The ideal situation for computational drug design would be an age in which a variety of protocols - quantum chemistry, docking and MD, homology modeling, gene and protein sequence comparison tools, and toxicity and PK prediction algorithms - would be available for any user to patch together, rearrange and deploy on his or her particular problem.
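In software terms, that vision amounts to little more than composable modules behind a common interface. A minimal sketch, with entirely hypothetical module names standing in for real tools:

```python
from typing import Any, Callable

# A "protocol" here is just a function from one intermediate result to
# the next; any vendor's docking engine could hide behind the same
# interface.
Protocol = Callable[[Any], Any]

def pipeline(*stages: Protocol) -> Protocol:
    """Compose off-the-shelf protocol modules into a single workflow."""
    def run(data: Any) -> Any:
        for stage in stages:
            data = stage(data)
        return data
    return run

# Invented placeholder modules, for illustration only:
def homology_model(sequence):
    return {'structure': 'model built from %s' % sequence}

def dock(state):
    return dict(state, poses=['pose-1', 'pose-2'])

def predict_pk(state):
    return dict(state, pk_profile='acceptable')

workflow = pipeline(homology_model, dock, predict_pk)
print(workflow('MKTAYIAKQR'))  # toy sequence in, toy results out
```

The point is that rearranging the workflow is just reordering or substituting arguments to `pipeline`, which is exactly the kind of modularity the quantum chemists are after.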

Going even further, we could envisage an age in which the tools of systems and computational biology are so thoroughly ingrained in the drug discovery process that standard systems tools join the toolbox. In such an age, not only would I be able to snatch standard docking protocols from a website, but I could also combine them with some kind of wiring diagram linking the protein I am trying to target to its partners, so that I know exactly which partner hubs I should additionally dock my drug against in order to maximize its efficacy and minimize its toxicity. And who knows - maybe I could even get to a stage where I download some kind of minimalist but accurate model of an entire cell and observe how my drug qualitatively perturbs its network of organelles and signaling pathways.
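Purely as a sketch of the "wiring diagram" idea - using an invented toy network, since no such standard resource exists - the hub-picking step might look like this:

```python
# Toy protein-interaction network as an adjacency map; all names and
# connectivities are made up for illustration.
interactions = {
    'TARGET': ['P1', 'P2', 'P3'],
    'P1': ['TARGET', 'P4', 'P5', 'P6'],
    'P2': ['TARGET'],
    'P3': ['TARGET', 'P4'],
}

def partner_hubs(network, target, min_degree=3):
    """Partners of the target that are themselves highly connected --
    the hubs one might additionally dock a drug candidate against."""
    return [p for p in network.get(target, [])
            if len(network.get(p, [])) >= min_degree]

print(partner_hubs(interactions, 'TARGET'))  # -> ['P1']
```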

For now this sounds like a pipe dream, although I suspect that the cultural barriers to sharing algorithms with commercial potential may be much harder to overcome than the scientific hurdles to actually incorporating systems biology in drug discovery and making the whole process modular. That's where the siren song of these socialist quantum chemists would be particularly relevant.

3 comments:

  1. Pipelining tools like KNIME and Pipeline Pilot currently allow you to stitch together tools from different sources and vendors. They even have network visualization and analysis tools. Is this the sort of thing you have in mind?

  2. I don't mean to sound snarky, but this kind of software has been available since the early '80s, possibly earlier than that. Code was available and shared on mainframes, Vaxen and workstations. Recall QCPE and other program libraries. There have been commercial and semi-commercial vendors of QM and modeling code since that time. Over the last 10+ years, toolkit approaches from both commercial and academic sources have proliferated.

    People can go the handholding route if they wish by writing checks to get what they want. They can indulge their programming desires working with Python/Java/C++/... toolkits, not to mention networking tools like KNIME. Or they can support F/OSS efforts if that's what they want to do. FWIW, KNIME is merely the latest in a series of network/pipelining tools that go back to the E&S machines of the '70s and '80s.

  3. Sure, workflow tools like KNIME are great. I was thinking of standard docking and MD protocols. The QM community seems to have gotten there first for a variety of reasons.

