
Arsenic DNA, chemistry and the problem of differing standards of proof in cross-disciplinary science


Arsenic-based linkages in DNA would be unstable and would quickly break, a fact suspected by chemists for years (Image: Johannes Wilbertz)
When the purported discovery of the now infamous “arsenic DNA” bacteria was published, a friend of mine who was studying astrobiology could not stop praising it as an exciting scientific advance. When I expressed reservations about the discovery, based mainly on my understanding of the instability of biomolecules containing arsenic, she gushed, “But of course you will be skeptical; you are an organic chemist!”

She was right. As chemists, my colleagues and I could not help but zero in on what we thought was the most questionable aspect of the whole discovery: the fact that somehow, contrary to everything we understood about basic chemistry, the “arsenic DNA” inside the bacteria was stably chugging along, replicating and performing its regular functions.

It turned out that the chemists were right. Measurements on arsenic DNA analogs made by researchers several months later found that the arsenic analogs differed in stability from their phosphate counterparts by a mind-boggling factor of 10^17. Curiously, physicists, astronomers, geologists and even biologists were far more accommodating about the validity of the discovery. For some reason the standards used by these scientists were different from those used by chemists, and in the end the chemists’ standard turned out to be the “correct” one. This is not a triumph of chemists and a blemish on other sciences, since there could well be cases where other sciences use the correct standards in nailing down the truth or falsehood of an unprecedented scientific finding.
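To get a feel for what a factor of 10^17 means in practice, here is a back-of-the-envelope sketch in Python. The half-life numbers are illustrative assumptions on my part (the uncatalyzed hydrolysis of DNA's phosphodiester backbone is often quoted as taking tens of millions of years), not figures taken from the measurements themselves:

    # Illustrative arithmetic only: the half-lives below are rough assumed
    # figures, not measurements from the stability studies.
    SECONDS_PER_YEAR = 3.156e7

    # An often-quoted order of magnitude for the uncatalyzed hydrolysis
    # half-life of DNA's phosphodiester backbone: ~30 million years.
    phosphate_half_life_s = 3.0e7 * SECONDS_PER_YEAR

    # A 10^17-fold stability gap then implies, for an arsenate backbone:
    arsenate_half_life_s = phosphate_half_life_s / 1e17

    print(f"phosphate backbone half-life:        {phosphate_half_life_s:.1e} s")
    print(f"implied arsenate backbone half-life: {arsenate_half_life_s:.0e} s")  # ~1e-2 s

On those assumptions an arsenate backbone would survive for about a hundredth of a second in water, which is why the claim of a happily replicating arsenic genome struck chemists as extraordinary.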

The arsenic DNA fiasco thus illustrates a very interesting aspect of modern cross-disciplinary science – the need to reconcile what can be differing standards of evidence or proof between different sciences. This aspect is the focus of a short but thought-provoking piece by Steven Benner, William Bains and Sara Seager in the journal Astrobiology.

The article explains why standards of proof that were acceptable, to varying degrees, to geologists, physicists and biologists were unacceptable to chemists. The answer pertains to what we call “background knowledge”. In this case, chemists were compelled to ask how DNA with arsenic replacing phosphorus in its backbone could possibly be stable, given everything they knew about the instability of arsenate esters. The latter had been studied for several decades, and while arsenic DNA itself had never been synthesized, simpler arsenate esters were known to be highly unstable in water. The chemists were quite confident in extrapolating from these simple cases to question the stable existence of arsenic DNA; if arsenic DNA were indeed so stable, then almost everything they had known about arsenate esters for fifty years would have been wrong, a possibility that was highly unlikely. Thus for chemists, arsenic DNA was an extraordinary claim. And as Carl Sagan said, they needed to see extraordinary evidence before they could believe it, evidence that was ultimately not forthcoming.

For geologists, however, it was much easier to buy into the claims. That is because, as the article points out, there are several cases where elements in minerals are readily interchanged for other elements in the same column of the periodic table. Arsenic in particular is known to replace phosphorus in rocks bearing arsenate and phosphate minerals. Unlike chemists, geologists found the claim of arsenic replacing phosphorus quite consistent with their experience. Physicists too bought readily into the idea. As the authors say, physicists are generally tuned to distinguishing two hypotheses from one another; in this case, the hypothesis that the DNA contained arsenic versus the hypothesis that it did not. The physicists thus took the many tests apparently indicating the presence of arsenate in the DNA as support for one hypothesis over the other. What the physicists did not appreciate was that the key question to ask concerned the stability of arsenic DNA.

Like chemists, biologists were also skeptical. Biologists usually check the validity of a claim for a new form of life by comparing it to existing forms. In this case, when the genetic sequence and lineage of the bacteria were inspected, they were found to be very similar to those of garden-variety, phosphate-containing bacteria. The biologists’ background knowledge thus compelled them to ask how a bacterium that was otherwise so similar to existing bacteria could suddenly survive on arsenic instead of phosphorus.

In the end, of course, none of the replication studies found arsenic in the DNA of the GFAJ-1 bacteria. This was probably least surprising to chemists. The GFAJ-1 case thus shows that different sciences can have different standards for what they regard as “evidence”. What may be suitable for one field may be controversial or unacceptable for others. This fact helps answer at least one question about the GFAJ-1 paper: why was it accepted by a prestigious journal like Science? The answer almost certainly concerns the shuttling of the manuscript to planetary scientists, rather than chemists or biologists, as reviewers. These scientists had different standards of evidence, and they enthusiastically recommended publication. One of the key lessons here is that any paper on a cross-disciplinary topic should be sent to at least one specialist from each discipline comprising the field. Highly interdisciplinary fields like astrobiology, drug discovery and social psychology are prime candidates for this kind of policy.

Discipline-dependent standards of proof not only explain how bad science occasionally gets published or how promising results get rejected; they also bear on the deeper issue of what in fact constitutes “proof” in science. This question reminds me of the periodic debates about whether psychology or economics is a science. The fact is that the standard of proof in psychology or economics will often be unacceptable to a physicist or statistician. As a simple example, it is often impossible to get correlations better than 0.6 in a psychological experiment. And yet such standards can be accepted as proof in the psychological community, partly because an experiment on human beings is too complex to yield more accurate numbers; after all, most human beings are not inclined planes or balls dropped from a tower. In addition, one may not always need accurate correlations to discern valuable trends and patterns. Statistical significance may not always track real-world significance (researchers running clinical trials are especially aware of this fact).
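To see concretely how statistical and real-world significance can come apart, here is a minimal Python sketch; the sample sizes and the underlying correlation of about 0.1 are arbitrary, assumed numbers chosen purely for illustration:

    # A weak correlation becomes arbitrarily "significant" as the sample grows,
    # even though it never explains more than ~1% of the variance.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    for n in (20, 200, 20000):
        x = rng.normal(size=n)
        y = 0.1 * x + rng.normal(size=n)  # true correlation ~0.1
        r, p = stats.pearsonr(x, y)
        print(f"n={n:6d}  r={r:+.3f}  p={p:.1e}")

At n = 20 the same weak effect is statistically invisible; at n = 20,000 the p-value is astronomically small, yet r ≈ 0.1 still corresponds to a trivial effect in the real world.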

The article by Benner, Bains and Seager concludes by asking how conflicting standards of proof can be reconciled in highly cross-disciplinary sciences, a question that is going to become increasingly important in an age of inherently cross-disciplinary research.

I think the GFAJ-1 fiasco itself provides one answer. In that case the most “obvious” objection was raised by chemists, based on years of experience. In addition it was a “strong” objection in the sense that it really raised the stakes for their discipline; as noted before, if arsenic DNA existed, then much of what chemists know about elementary chemical reactivity would have to be revised. In that sense it was really the kind of falsifiable, make-or-break test advocated by Karl Popper. So one cogent strategy might be to first consider these strong, obvious objections, no matter which discipline they arise from. If a finding passes the test of these strong objections, it could then be subjected to the less obvious and more relaxed criteria provided by other disciplines. If it passes every single criterion across the board, then we might actually be able to claim a novel discovery, of the kind that rarely comes along and advances the entire field.

First published on the Scientific American Blog Network.

9 comments:

  1. Even if you say "This is not a triumph of chemists and a blemish on other sciences," it sounds a little smug to say that different disciplines have different standards and that the chemists were right because of the standard they used. Actually, if non-chemists were more accommodating about the validity of the study, I think that was probably because they trusted the expertise of the chemists who were presumably involved in the publication, including the co-authors, other collaborators whom the authors surely must have consulted even if they are not listed as co-authors, and the reviewers. It is difficult to make judgements on matters outside of your speciality. There was a perception that specialists had OKed this paper.

    I am an ex-physics major who became a biologist, so I can present the views of both physicists and biologists.

    Most physicists, other than Paul Davies, who was one of the co-authors, did not have any stake in this discovery. For them, this sounded like a cool discovery far outside of their field. So I don't think there was anything wrong if they took this discovery at face value, just as laypersons would. In any case, their opinions don't matter much because they are mostly not researchers in this field.

    For biologists, this was an odd little paper. Some obvious controls were lacking. There were indications that the experiments were carried out sloppily. Some obvious, simple experiments were not done. There were many things to criticize from a biologist's point of view, and indeed biologists did criticize it. But one problem was that the study also included fancy, highly specialized physico-chemical analyses. This is not what we biologists are familiar with. We defer to the experts, in this case the chemists. If you assume that the chemists think the evidence is strong, this could be considered worth publishing in a prestigious journal, even if the biology is sloppy. It turned out that chemists also had problems with this study.

    So, I think it was not so much that we have different standards as that we have different expertise. In order to evaluate a cross-disciplinary study like this, experts in all the relevant disciplines are needed and they need to talk to each other.

    This is a little different from the sloppy statistics used in psychological experiments. Psychologists need to be better statisticians because their work depends on statistics. Sloppy statistics is now undermining their field. Sloppy statistics is a huge problem in biology, too. This is where a higher standard is needed.

    1. Yes, different expertise is what can lead to different standards. Perhaps the word "standard" (from the original paper) deflects attention from the central point, but its goal is not to besmirch the quality of expert opinion. Here's a crude analogy: if a Geiger counter is your only piece of equipment and you use it to try to study viruses, then you are using the wrong standard. This does not mean you are a lesser scientist; it just means that your field gives precedence to certain kinds of questions or approaches that happen to be inadequate for one particular task. In this particular case it turned out to be the chemists' standards that were correct, but the paper also points out other cases where other specialists used the right criteria and chemists did not. In one sense the paper is simply saying that certain sciences are better equipped to answer certain questions, a statement that should be wholly uncontroversial.

      As for your remark that "In order to evaluate a cross-disciplinary study like this, experts in all the relevant disciplines are needed and they need to talk to each other," I (and the paper) couldn't have put it better.

    2. Actually, a Geiger counter could be used to study viruses. Biologists often use radioisotopes for labeling nucleic acids or proteins, and measuring radioactivity is definitely part of what they do. If not Geiger counters, then scintillation counters have definitely been used. For example, reverse transcriptase was discovered in a virus by monitoring radioactivity in a biochemical assay. And biologists don't need to know exactly how the Geiger counters or the scintillation counters work. What is required is that the physicists and engineers who designed the instruments did an adequate job.

      Likewise, non-chemists have to rely on the expertise of chemists when it comes to research that is chemical in nature. Chemists did voice objections to the arsenic DNA paper. But that didn't happen during the writing and reviewing phase. Either no chemist was asked to review the paper, or the one who was asked did not do his or her job. Either way, the expertise was not properly used when the paper was written and reviewed.

    3. I was talking about non-radioactive viruses, but anyway, you get the point! The fact that chemists did not voice objections to the paper during the reviewing process is exactly the flaw that the authors of the paper are citing. They say that the paper was shuttled to planetary scientists who were not chemists, although I don't know how they found this out.

  2. Did you track the blog of fellow Field of Science Blogger Rosie Redfield (RRResearch)?
    Dr. Redfield is a member of the Department of Zoology at UBC and her work provided evidence to debunk the arsenic paper. Perhaps you should ask her if she considers herself to be a biologist or a chemist?

    1. In my book Rosie Redfield is as much a chemist as a biologist. The paper did say that biologists also expressed due criticism, and Rosie Redfield was outstanding in leading it.

  3. Let's not forget cold fusion, where chemists were the ones lacking in appropriate skepticism.

    1. Indeed. Cold fusion was an example where chemists' expertise simply did not equip them to ask the right questions and apply the right standards. One of the commentators on the original SA post has pointed this out.

  4. Anybody have the Benner Astrobiology article NOT behind a paywall?

