I have more than once had the experience of seeing a research report cited in one article as supporting one or another point, getting the cited report, and discovering that no, it doesn't say that after all. We now have an important study by Steven Greenberg, of the World's Greatest University, that shows how the scientific enterprise can go awry in a big way for a long time. (BMJ has been making its peer-reviewed research reports, such as this one, open access while restricting the rest of the journal to subscribers. I find this policy annoying, since it is usually the commentaries, policy analyses, and lay summaries that are of more interest to the general public, but I have faith in y'all to wade through this rather technical discussion if you are so inclined.)
Greenberg selects a rather arcane proposition for scrutiny: that the protein beta amyloid, well known for its association (though not necessarily a causal one) with Alzheimer's disease, is also associated, perhaps in ways that suggest a causal role, with an uncommon muscular disorder called inclusion body myositis (IBM). He finds that initially there were four papers, all from the same laboratory and probably based on only three different sets of observations, that supported the hypothesis, and six papers that contained data undermining or weakening the claim. However, the positive findings got almost all of the subsequent citations in other peer-reviewed articles, while some of the negative findings got none at all.
It gets worse. Later review articles tended to selectively cite the positive findings, and review articles in turn generally receive many citations, which further amplified the dominance of the positive view. Some articles cited earlier papers that had presented the association between beta amyloid and IBM as a hypothesis, but described it as a finding. Others even cited research that refuted the hypothesis as supporting it, or articles that had essentially nothing to do with it. Using the Freedom of Information Act, Greenberg was able to discover that these same biases existed in funding proposals to the National Institutes of Health.
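To see how a modest initial imbalance can snowball into near-unanimity, here is a toy simulation of my own (not Greenberg's method, and the bias factor is an assumption, not a measured quantity): new papers cite old ones in proportion to the citations those papers already have, with extra weight on supportive findings.

```python
import random

random.seed(42)

# Toy model, not Greenberg's data analysis: start from his paper counts,
# 4 supportive and 6 critical primary papers, each seeded with one citation
# so that every paper has a nonzero chance of being cited at all.
stances = ["supportive"] * 4 + ["critical"] * 6
citations = [1] * 10
BIAS = 3.0  # assumption: a supportive paper is 3x as attractive to cite

# Each of 500 subsequent papers cites one earlier paper, chosen with
# probability proportional to current citations (preferential attachment)
# times the positivity bias.
for _ in range(500):
    weights = [c * (BIAS if s == "supportive" else 1.0)
               for s, c in zip(stances, citations)]
    i = random.choices(range(len(stances)), weights=weights)[0]
    citations[i] += 1

supportive = sum(c for s, c in zip(stances, citations) if s == "supportive")
critical = sum(c for s, c in zip(stances, citations) if s == "critical")
print(f"supportive citations: {supportive}, critical citations: {critical}")
```

Even though critical papers outnumber supportive ones at the start, the rich-get-richer dynamic plus a citation bias hands the supportive camp the bulk of the citation record, which is the qualitative pattern Greenberg documents.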
Why does this happen? Are scientists unconcerned with the truth after all? No, but we are all subject to forces which bias our assessment of evidence. One fundamental problem is publication bias: it is much easier to get papers published that confirm a phenomenon than papers that find no association. The former seems like news, whereas the latter seems less interesting to reviewers and editors. But in fact, both make an equal contribution to knowledge. Ruling out is just as much new information as is ruling in, but it doesn't stir the passions.
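The publication-bias mechanism can be sketched with another toy simulation (mine, with assumed numbers): suppose the true effect is exactly zero, many small studies are run, and only those crossing the conventional significance threshold get published.

```python
import math
import random
import statistics

random.seed(0)

# Assumed setup: true effect is zero; each "study" averages n noisy
# observations, and only studies whose mean crosses the ~5% significance
# threshold strike reviewers and editors as newsworthy.
n, studies = 30, 2000
threshold = 1.96 / math.sqrt(n)  # |mean| beyond this is "significant"

published = []
for _ in range(studies):
    effect = statistics.mean(random.gauss(0, 1) for _ in range(n))
    if abs(effect) > threshold:
        published.append(effect)

print(f"published {len(published)} of {studies} studies")
print(f"mean |effect| in the published record: "
      f"{statistics.mean(abs(e) for e in published):.2f}")
```

Roughly five percent of the studies clear the bar by chance alone, and the published record then shows a substantial apparent effect where none exists: ruling out never makes it into print.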
Second, funders such as NIH are strongly committed to hypothesis-driven research. Exploratory research intended to describe phenomena and generate new hypotheses has a much harder time finding sponsorship. So once a hypothesis such as this one is out there, the way to get funding is to pursue it. A proposal claiming that we don't actually have any good idea of what causes IBM, and asking to poke around with an open mind, is just not going to shake down the long green so long as there is an ongoing program of research based on a generally accepted model.
The result is that large chunks of the scientific establishment can become committed to a wrong idea, and invest a lot of time and money in it, in a self-perpetuating process. Now, Greenberg has exposed only a single case. We don't know how common this sort of amplification of weak hypotheses may be. Obviously many biomedical hypotheses have proven fruitful and the programs of research based upon them have yielded effective treatments. But this is nevertheless an important cautionary finding.
I am personally committed, in both research reports and funding proposals, to reading the literature as thoroughly as possible, to going to primary sources and not relying on review articles or the description of findings in citations, and to being as balanced as I can in presenting both supportive and critical findings regarding my own hypotheses and conclusions. Of course I may try to make a case for one conclusion or another, but I pledge to do it honestly and to accurately present objections. Unfortunately, there is no real accountability right now for citation abuse. On the contrary, it is rewarded. We have got to fix this.
Monday, July 27, 2009
Weird Science
5 comments:
i did try the google, without success, to find out if my recollection of some factoid about the failure of an experiment or test yielding more info than a success is correct. so now i might have more info about google, or something like that.
that's an interesting bias of scientific literature that you describe.
Well, I would say that's an imponderable and depends on the specifics of the case. But the convention of labelling an experiment with negative results as a "failure" is just an indication of the problem. The idea of an experiment is to see what happens. If the experiment shows no association between A and B, it hasn't "failed," that's just the result of the experiment. It succeeded in demonstrating the negative. But you will have a hard time getting it published.
I suffer from IBM.
It is a confounding affliction. It has taken me from an active, independent life to life in a wheelchair.
I need a stand aid to get out of bed and on and off the commode as my hip flexors don't work anymore.
My arm and shoulder strength are fading fast despite exercise.
Please keep looking for a reason, don't give up. I haven't!
I have a friend (yes, a friend) who argues that climate change isn't anthropogenic and the reason researchers are claiming it is, is because of the scenario you lay down in this post. Climate change research is where the money is, there's a bias towards papers that find anthropogenic change, etc.
He's one of the smartest people I know, but he's also a devout Mormon, so I figure this must have stepped on his religious beliefs somehow.