What did you say the error was??

I was looking at some experimental data for drug molecules binding to a pharmaceutically relevant protein.

The numbers reported were as percentages of binding relative to a standard which was defined to be 100%. Here's how they looked:

97.3 ± 68.4
79.4 ± 96.1
59.5 ± 55.3
1.4 ± 2.5
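For illustration, here is a minimal sketch of why these numbers are hard to use: for most of them the error bar is comparable to, or larger than, the value itself, so the interval reaches zero (this assumes the ± is a symmetric spread like a standard deviation, which the post does not specify):

```python
# Reported binding (% of standard) with their ± errors, from the post above.
data = [(97.3, 68.4), (79.4, 96.1), (59.5, 55.3), (1.4, 2.5)]

for mean, err in data:
    rel_error = err / mean            # error as a fraction of the value
    reaches_zero = mean - err <= 0    # does the interval extend to zero or below?
    print(f"{mean:5.1f} ± {err:5.1f}   relative error {rel_error:4.0%}   reaches zero: {reaches_zero}")
```

Two of the four measurements cannot even be distinguished from zero binding on this reading.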

Seriously, how did the reviewers allow this to go through?


  1. um... it was probably obtained via a plate reader and an enzymologist. ±100% is par for the course where I am from.

  2. Once I received some odd luciferase assay results from a pharma company, from an experiment we were collaborating on. According to them, my plasmid was not working.

    The results were so unexpected that I asked for the raw data to understand them better. As usual, they were normalizing the luminometer data by protein content. Apparently they had trouble lysing the cells, and both, BOTH, the luminometer and protein microplate readings were negative (slightly below the blanks). They didn't care, divided one by the other, and obtained this strange result:

    raw RLU: 12
    luminometer blank: 20
    net RLU: -8

    raw protein: 0.100
    protein blank: 0.300
    net protein: -0.200

    RLU/prot = -8/-0.2 = 40 !!!
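    The failure mode is easy to reproduce: once both background-subtracted values go negative, the signs cancel on division and the ratio looks perfectly healthy. A minimal sketch (the `normalize` helper is hypothetical, not from the story) with the sanity check that was skipped:

    ```python
    def normalize(raw_rlu, blank_rlu, raw_prot, blank_prot):
        """Background-subtract, then normalize luminescence by protein content."""
        net_rlu = raw_rlu - blank_rlu
        net_prot = raw_prot - blank_prot
        # Sanity check: a net value at or below zero means the signal never rose
        # above background (e.g. the cells were not lysed) -- the ratio is meaningless.
        if net_rlu <= 0 or net_prot <= 0:
            raise ValueError(f"net values not above background: RLU={net_rlu}, protein={net_prot:.3f}")
        return net_rlu / net_prot

    # The story's numbers: net RLU = -8, net protein = -0.2.
    # Blind division would give -8 / -0.2 = 40, but the check refuses:
    try:
        normalize(12, 20, 0.100, 0.300)
    except ValueError as e:
        print("rejected:", e)
    ```

    Any guard of this sort, applied before division, would have flagged the failed lysis instead of reporting 40.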

  3. 96well: That sounds pretty ridiculous! I try to do "predictive" modeling, and it's not clear to me how to use such data and assign activity values in a model. In any case, as you note, it can be immensely valuable to look at the raw data, which unfortunately is not always reported.

    Milo: Then how does one actually use such numbers?

    The data are from a binding assay for a GPCR, by the way.

  4. I've come across an article that does QSAR with such values (don't ask, I don't remember which :)

    Also, I think such values are quite normal for IC50 (EC50, etc.) measurements.

  5. Should I then be surprised that predictive modeling does not work?!

