Map of life expectancy at birth from Global Education Project.

Friday, January 14, 2011

Most of this is kind of complicated . . .

But one piece is real simple. BMJ makes only its full research reports open access, when it should probably be the social and political analysis and commentary that are available to the public. This time, however, it works out, with this report from RD Smyth and colleagues, PR Williamson senior author, Liverpudlians all.

To try to put this in a pistachio shell, they tracked down investigators whose published clinical trials seemed not to report all of the results they had obtained or specified in the trial protocol. We all know about publication bias -- negative findings (in this context meaning, the medication didn't work) tend not to get published, partly because of drug company perfidy and partly because of the biases of journal reviewers and editors. But these folks wanted to find out from the horse's mouth (where does that expression come from, anyway?) why these investigators only published some of what they found, or didn't publish on their protocol outcomes.

I will digress to remind readers that it is very important to a) report your pre-specified outcomes and b) report negative findings. Not doing so can bias the overall weight of evidence. Pre-specified outcomes are important because those represent true hypothesis tests. Therefore the associated p values are technically valid and the evidence is strong. Reporting on findings you weren't originally looking for can result in spurious observations being accepted as convincing evidence. Negative findings are important because bias toward positive findings obviously makes interventions look better than they really are. And even more obviously, evidence of harm must be revealed.
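To put a number on why after-the-fact outcomes are dangerous, here's a back-of-the-envelope sketch (the function name is mine, not from the BMJ paper): if a trial with no pre-specified outcome goes fishing through k independent null outcomes at the usual 0.05 threshold, the chance of at least one spurious "significant" finding grows fast.

```python
def chance_of_false_positive(k, alpha=0.05):
    """Probability of at least one spurious 'significant' result when
    testing k independent outcomes with no true effect at level alpha."""
    return 1 - (1 - alpha) ** k

for k in (1, 5, 20):
    print(f"{k:2d} outcomes tested -> {chance_of_false_positive(k):.3f}")
# With 20 unplanned outcomes, the chance of a fluke 'positive' is about 0.64.
```

That's the arithmetic behind "technically valid p values": a single pre-specified test really does carry a 5% false-positive risk, while a grab bag of unplanned outcomes is closer to a coin flip.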

There was a very high refusal rate for this study, and you can reasonably presume that people were more likely to refuse when they thought they had something to hide. Also, industry funding was associated with frequent claims that trial protocols were confidential. So we can bet that the situation is worse than what the Liverpudlians could uncover.

Mostly, respondents said things like, they thought that negative and non-significant findings just weren't interesting; the word limits for journal articles made it hard to talk about everything; they just didn't think one or another result was important; or it turned out they couldn't bring in the sample size they needed within their budget. Somewhat shockingly, about half of protocols -- which managed to get funding and ethical approval -- did not have specified primary outcomes. Hmm. I'd like the contact info for those funders.

So this is not good news. For those who are suspicious of how evidence-based evidence-based medicine really is, your antennae should be tingling, because it looks like a lot of investigators aren't being trained properly and don't understand what they need to be doing. But there is also this quote:

When we looked at that data, it actually showed an increase in harm amongst those who got the active treatment, and we ditched it because we weren’t expecting it and we were concerned that the presentation of these data would have an impact on people’s understanding of the study findings. It wasn’t a large increase but it was an increase. I did present the findings on harm at two scientific meetings, with lots of caveats, and we discussed could there be something harmful about this intervention, but the overwhelming feedback that we got from people was that there was very unlikely to be anything harmful about this intervention, and it was on that basis that we didn’t present those findings. The feedback from people was, look, we don’t, there doesn’t appear to be a kind of framework or a mechanism for understanding this association and therefore you know people didn’t have faith that this was a valid finding, a valid association, essentially it might be a chance finding. I was kind of keen to present it, but as a group we took the decision not to put it in the paper. The argument was, look, this intervention appears to help people, but if the paper says it may increase harm, that will, it will, be understood differently by, you know, service providers. So we buried it. I think if I was a member of the public I would be saying ‘what you are promoting this intervention you thought it might harm people—why aren’t you telling people that?

Res ipsa loquitur. And I'd sure like to know what that shit is so I can make sure not to take it.

8 comments:

kathy a. said...

that sounds like a pretty open [but anonymous] response about why bad findings aren't included. the risk may be small, but aren't patients and doctors supposed to weigh risks with a full understanding of what the risks are? i don't know how patients and doctors are supposed to make "informed decisions" if information is being hidden.

i don't know of any medication that carries no risks. aspirin was the old OTC standby in my youth, and more recent research suggests that low doses have protective effects for cardiovascular problems and lowering cancer risk. but it still carries risks: reye's syndrome in kids, stomach irritation, excess bleeding, and allergic responses in susceptible people.

i can, as a consumer, find that information about aspirin. but there's a problem when i'm expected to make an informed decision about some RX, and neither my doctor nor i can have access to any possible down sides.

Cervantes said...

Actually I would say your reaction is pretty mild. I find this extraordinarily egregious. The way I see it, this is something docs are doing, and they don't want any possible bad news about it to get out. They just choose not to believe it.

These results need to be published. They might be spurious but if they are published, they will be tested and confirmed or not. The quote demonstrates highly unethical conduct, and something fundamentally wrong with the culture of medicine, as far as I'm concerned.

kathy a. said...

we aren't in disagreement. there's a collision here between marketing and research ethics, and both patients and their treating physicians come out on the short end. which is unacceptable, scary, and could be very dangerous for individuals. i wonder how widespread the practice of burying bad findings is?

what is kind of interesting is that the anonymous commenter wanted to publish the bad findings -- so he knew there was a problem with burying them -- but he did not seem to recognize the seriousness or the ethical problems.

Cervantes said...

He appears to have been junior faculty or a post-doc who got talked out of it by the poo-bahs.

C. Corax said...

I'm confused. Are you saying that doctors who participate in the clinical trials are aware of the unreported findings? Are these trials on behalf of Big Pharma?

Also, ethical or not, if negative findings tend not to get published and someone's working in a publish or perish environment, then of course they're going to suppress negative outcomes.

It seems that the whole publication process needs an overhaul.

roger said...

that's a very damning quote. no wonder so many people don't trust doctors. the BMJ report carries on the same process by not identifying the studies they studied. apparently they are no more ready to inform the public than the ones who hid the data. they study publication bias in secret. they gave us findings with no way for anyone to check on their own biases.

Cervantes said...

Yes, the whole point of the study is that investigators don't publish all of their findings, and do publish stuff they shouldn't, at least without representing it properly. The main bias is to turn negative findings (i.e., the drug doesn't work) into positive ones (it does something good), either to please the funders or just to have a better chance of publication.