Thursday, January 17, 2008


Item 1: It is my duty to comment on the study published last week in NEJM that came up with a new estimate of civilian deaths in Iraq following the 2003 invasion. The consensus spin on this was: "See, those peaceniks got all hysterical over the earlier estimate of 600,000; it was only 150,000" -- so I guess the war was a good idea after all. Actually, neither study is conclusive. The new one was based on a larger sample, which, all else being equal, is better, but it was conducted later, which is worse. The basic problem is that households in which people have died are less likely to be around to be sampled, and the problem gets worse as time goes on. This happens for a few reasons: some households get wiped out entirely, and then there is no one left to sample; households that have lost the breadwinner are likely to dissolve, to move in with relatives, or to leave the country entirely -- as 2 million Iraqis have done. The researchers tried to correct for this using various assumptions, but those were largely guesswork. Probably the worst thing about this study is that it used the Iraq Body Count database for geographic weighting. I could go on, but the bottom line is that this study might make you lean toward the lower bound of the Johns Hopkins confidence interval, but it doesn't make me reject the earlier study. The truth might lie somewhere between the two, but who cares? 200,000, 400,000, 600,000 -- it's still horrific.
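The survivor-bias problem is easy to see with a toy simulation. All the numbers below are made up for illustration -- they are not estimates of anything about Iraq -- but the mechanism is the point: if bereaved households are more likely to have dissolved or emigrated by the time surveyors arrive, a later survey systematically undercounts deaths.

```python
import random

random.seed(1)

N_HOUSEHOLDS = 100_000
P_DEATH = 0.10      # hypothetical: 10% of households suffered a death
P_DISSOLVE = 0.30   # hypothetical: chance a bereaved household has vanished

households = []
for _ in range(N_HOUSEHOLDS):
    had_death = random.random() < P_DEATH
    # Only bereaved households face the extra risk of vanishing
    # (dissolving, moving in with relatives, leaving the country)
    # before the survey team shows up.
    still_present = not (had_death and random.random() < P_DISSOLVE)
    households.append((had_death, still_present))

true_rate = sum(d for d, _ in households) / N_HOUSEHOLDS
sampled = [d for d, present in households if present]
surveyed_rate = sum(sampled) / len(sampled)

print(f"true fraction of households with a death:  {true_rate:.3f}")
print(f"fraction a later survey would measure:     {surveyed_rate:.3f}")
```

With these assumed numbers the survey sees roughly 7% of households reporting a death when the true figure is 10% -- and the later the survey, the larger the gap.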

Item 2: The Vytorin debacle -- which I linked to a couple of days ago -- is worse than you think, on many levels. To recap: if you watched even one hour of TV in the past few months, you saw that weird ad featuring people dressed up to look like various items of food. It was telling you that a combination of a (patented, expensive) statin and a drug called ezetimibe is the best thing for preventing heart disease. The fact is, the manufacturers never had any basis for believing that. The combination lowers LDL cholesterol more than a statin alone, but that doesn't prove it is more effective at the ostensible purpose, which is preventing heart attacks. The FDA nevertheless approves drugs based on these so-called "surrogate endpoints," even though it has turned out, repeatedly, that they do not predict health outcomes after all. Even worse, the main point of the two-drug combination was to force doctors to prescribe the more expensive statin, instead of a cheaper generic, if they wanted to give people ezetimibe. Now it turns out that the companies knew for months the results of studies showing that people taking Vytorin had worse arterial plaque than people taking a statin alone, and they delayed releasing those findings while they continued the massive advertising campaign, obviously trying to milk every last dollar out of the situation. Last but not least, ezetimibe has side effects, sometimes quite severe -- which they were inflicting on people for no benefit, at the same time they were robbing them. But that's just typical drug company behavior. The jails are full of crack dealers and smack dealers; let's put some Vytorin dealers in jail for a change.

Item 3: Suppose one of the leading Democratic presidential candidates was a radical extremist who wanted to amend the constitution to accord with Marxist doctrine. How do you think that candidate would be treated by the corporate media? Then we have Mike Huckabee.

Item 4: This just makes my blood boil. (Abstract only available to you uncredentialed scum.) The incremental reforms at the FDA now include the requirement that drug companies register all the trials they have done on their dope, including the ones that aren't published. So these sleuths managed to wangle all the data on antidepressants. Of the trials testing antidepressants, almost exactly half found a beneficial effect, and half did not. Guess which half got published. Actually, it's worse than that -- some of the ones with negative findings were published, but claimed positive findings. They get away with that by fishing around for a comparison somewhere in the data that makes the drug look good, even though that wasn't what the trial was intended to test, and then pretending that was the point of the trial all along. Often they omitted any mention of the negative finding at all. (The reason that is totally not legit is that some apparently significant findings will appear by chance, even when there isn't any real effect. If you comb through your data looking for significant p-values and report just those, you're committing a kind of statistical fraud.)
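The parenthetical point about chance findings is easy to demonstrate. The sketch below uses assumed numbers, not anything from the study: it simulates trials of a drug with no real effect at all. If you comb through 20 secondary comparisons at the conventional p < 0.05 threshold, the chance of at least one spuriously "significant" result is about 64%.

```python
import random

random.seed(42)

ALPHA = 0.05        # conventional significance threshold
N_COMPARISONS = 20  # assumed: secondary outcomes combed through per trial
N_SIM_TRIALS = 10_000

# Under the null hypothesis (no real effect), each p-value is
# uniformly distributed on [0, 1], so any single comparison is
# "significant" with probability ALPHA.
trials_with_false_positive = 0
for _ in range(N_SIM_TRIALS):
    p_values = [random.random() for _ in range(N_COMPARISONS)]
    if any(p < ALPHA for p in p_values):
        trials_with_false_positive += 1

observed = trials_with_false_positive / N_SIM_TRIALS
expected = 1 - (1 - ALPHA) ** N_COMPARISONS  # about 0.64
print(f"expected rate of >=1 spurious finding: {expected:.2f}")
print(f"simulated rate:                        {observed:.2f}")
```

In other words, a company that runs a null trial and is allowed to pick its comparison after the fact will "find" a benefit most of the time. That is exactly why switching endpoints after the data are in, and reporting only the flattering comparison, is fraud rather than science.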

I mean it. The wrong people are in jail.
