Y'all know how much I love those Europeans for being a beacon unto the world when it comes to health care policy, including the ban on Direct to Consumer drug advertising and the UK's wise and delightsome National Institute for Health and Clinical Excellence (NICE), which produces evidence-based guidelines for cost-effective treatment. Yes, those are prosperous, democratic capitalist countries with civil liberties and markets and comfortable middle classes and all that good stuff, yet somehow they manage to do it without letting drug companies run the health care show. What would the Cato Institute say?
Anyhow, the drug companies aren't giving up without a fight. Hannah Brown in the new BMJ tells us how they are trying to undermine the EU ban on DTC advertising, and Clare Dyer tells us how Pfizer is trying to overturn NICE's guidelines for use of the (largely useless) drug Aricept for dementia. (Sorry, the second one is subscription only but you can read the first couple of paragraphs.)
As Brown explains, the drug companies aren't asking the EU parliament to let them advertise, exactly, but just to let them fund "informational resources" for consumers. They claim that otherwise, people won't have access to reliable information, and that the ban harms the "competitiveness" of the European drug industry. Both claims are obviously nonsensical on their face: there are plenty of reliable informational resources for consumers about drugs, and there is certainly no reason to think that information sponsored by drug manufacturers is somehow going to be less biased than what is already out there. As drug company critics point out, it's pretty much impossible to clearly delineate where "information" ends and "advertising" begins, when the vendor is providing the information. If there's a concern about people getting info from unreliable sources, all the EU needs to do is put its stamp of approval on high quality info. And the European drug industry doesn't somehow become more competitive when all drug companies can shill equally in Europe.
As for Aricept, it's a perfect example of why we need a NICE here in the U.S. It has been shown that it can briefly slow the progress of Alzheimer's disease. Then, after a few months, it stops working. So, NICE has approved it for moderate AD, but not for early or late stages, which is perfectly sensible. Use it when it can do the most good, and don't waste money on it when it won't do anything. But Pfizer wants to be able to sell it to people for years on end, as it does in the U.S., even though the evidence shows that is pointless.
So, we'll see if the superior wisdom of our friends across the ocean holds up. Whether the Greatest Country in History ever discovers common sense remains to be seen.
Friday, March 30, 2007
These guys are relentless
Thursday, March 29, 2007
Yes, we take requests
We had an inquiry a while back about how to dispose of prescription drugs, specifically prescription opioids. I thought that it would be okay to flush that particular class of drugs, since the biological effect would be trivial once they were highly diluted, and they will biodegrade and not bioaccumulate. For what it's worth, the White House Office of National Drug Control Policy agrees with me. They're wrong about everything else, but I suppose we can trust them on this one since it doesn't intersect in any obvious way with their warped ideology.
However, they recommend this only for drugs whose label says to do so, which basically are those which are particularly subject to diversion and abuse, including opioids and amphetamine-like drugs. (Check the link for a list.) They don't say so, but I would strongly emphasize that people should not flush antibiotics or hormones.
For most drugs, ONDCP recommends taking them out of their original containers, mixing them with coffee grounds or kitty litter or some such so nobody is tempted to try them, putting the mixture in a sandwich bag, and throwing it in the trash.
Best of all, however, are community take-back programs where you can drop off your left-over pharmaceuticals for proper disposal (ultimately incineration, I should think). But they aren't available to most people. We really need to consider comprehensive "take back" centers for several classes of items, including used electronic equipment, compact fluorescent light bulbs, and hazardous chemicals such as pesticides, solvents and finishing materials. Many of these items contain recoverable metals, including mercury, which is hazardous, or chemicals that can be safely neutralized. It's good to keep these out of landfills and to recycle where possible.
An unhappy announcement
I will no longer be participating in Today in Iraq. I hope to write more about Iraq here, and perhaps there will be other opportunities for me to help bring the story of that unhappy land to English language readers. Unfortunately an interloper, for motives which are not clear to me, seized control of the site from the community which had been sustaining it, and the project is effectively over.
This medium, obviously, presents a whole new context for human interaction and communication. It's more remote than the personal contact our species grew up with, but it can bring us into closer interaction than mere pen pals. For one thing, web sites are material resources that people can control, improve, or damage. That gives us something to collaborate or fight over, honorably or dishonorably, honestly or dishonestly. So, just as in the world of atoms, watch out for con artists, thieves, and liars in the world of bits.
Wednesday, March 28, 2007
None dare call it treason
It so happens I'm doing a study right now which required me to get some information from an agency of the federal government -- a presumably non-political branch of CDC, to be slightly less vague -- about the racial and ethnic identification standards used in a particular data system. After a couple of weeks of trying, I couldn't get anybody to answer my phone calls or e-mails, so I called a friend somewhere in a relevant federal office.
She told me that she didn't think she could be of any help. Career civil servants right now are being bird-dogged by political appointees, who subtly threaten them. Questions from the general public about any potentially politically charged issues -- with race and ethnicity obviously being one -- are booby traps, and they don't want to touch them. I could FOIA this info, but who knows how long it would take to extract it, and as it happens it isn't worth it to me. My friend, naturally, is looking for a job.
The neo-cons have substantially succeeded in their project of trashing the federal government. Even if we do get rid of these depraved fanatics in 2009, it's going to take decades to rebuild the infrastructure of non-political, professional, effective government that we once took for granted. And that's not just in the places where it's gotten a lot of public attention -- DoD, Justice, State, EPA, even NOAA. It's everywhere.
As usual, I put the blame for this immeasurable national disaster mostly on the corporate media -- the TV networks, the New York Times, the Washington Post, and most of the rest of them. They inflicted these malignant clowns on us, and they still haven't grasped what they have done.
Tuesday, March 27, 2007
Another Open Door Crashed Through
No doubt you have encountered all the flapdoodle over the new findings that angioplasty and stenting in coronary arteries do not prevent heart attacks or reduce the risk of death. (Authorship by a multitude. This does not apply to people who have an acute blockage of a coronary artery, for whom an immediate roto-rooter intervention to reopen the blocked artery is indeed beneficial.) This news immediately caused the stock of Boston Scientific, which manufactures stents, to dive.
But, uhh, this is not exactly unexpected, or even news. There has never been any evidence that angioplasty and stenting as a prophylactic strategy can prevent heart attacks or prolong life. The only justification for these interventions in people who are not already having an acute heart attack is to relieve symptoms of angina. (Ditto for coronary artery bypass graft surgery.) What is new, if anything, about this study is that it finds that even those benefits are very modest.
Nevertheless, there has been a sort of vague impression out there among the public, and evidently many doctors, that these procedures do have preventive value. I can't cite chapter and verse on that, but the vast excitement that greets this study is proof of it. As the authors state right up front, with a veritable torrent of footnotes:
PCI reduces the incidence of death and myocardial infarction in patients who present with acute coronary syndromes,5,6,7,8,9,10 but similar benefit has not been shown in patients with stable coronary artery disease.11,12,13,14,15 This issue has been studied in fewer than 3000 patients,16 many of whom were treated before the widespread use of intracoronary stents and current standards of medical management.17,18,19,20,21,22,23,24,25,26,27,28
Although successful PCI of flow-limiting stenoses might be expected to reduce the rate of death, myocardial infarction, and hospitalization for acute coronary syndromes, previous studies have shown only that PCI decreases the frequency of angina and improves short-term exercise performance.11,12,15 Thus, the long-term prognostic effect of PCI on cardiovascular events in patients with stable coronary artery disease remains uncertain.
So, hundreds of thousands of people have undergone these procedures largely on the basis of speculation. I think this is a very powerful example of the interventionist bias in American medicine. We fear that a government financed, single-payer system will mean that health care needs to be "rationed," but the fact is, that will be a good thing. The compulsion we feel to "do something" is very likely to lead us to do more harm than good.
Of course, in the case of coronary artery disease there is a good deal we can and should do, both for prevention and treatment. Number one, do that boring stuff that doesn't make money for any Fortune 500 company, that is lose weight, eat right, exercise more, quit smoking. Number two, there are medications that are beneficial under the right circumstances, but for the most part, they are available as inexpensive generics. (Please ignore the whorish Dr. Jarvik in those slick TV ads, trying to sell you an expensive patented statin.) But it isn't nearly as exciting, or profitable.
Monday, March 26, 2007
Sesquipedalianism in the service of truth
I mentioned the Transtheoretical Model a couple of posts back and I guess I should explain what that is. It isn't really a sociological theory, it's actually a concept in applied counseling psychology. Here's an abstract, written by one of the developers and a colleague (the other originator was DiClemente), which brags about how great it is.
Okay, this is really just a common sense idea that is dressed up in a lot of jargon and high sounding rhetoric. If someone has a bad habit -- "bad" defined as we think they ought to stop doing it -- they may not be thinking about stopping at all. (I don't have a drinking problem. I drink, I get drunk, I fall down, no problem.) They are in "pre-contemplation." Maybe some other people are sorta kinda thinking about stopping but they aren't really trying yet. They are in "contemplation." Then you have people who have decided that they really ought to do something about this, but they haven't really tried yet. They are in "preparation." Then you have people who are going to counseling or AA meetings or just seriously working at stopping. They are in "action." They may relapse briefly and still be in the action phase, or they may backslide all the way to preparation or contemplation, but eventually they may get sober and stay that way for a while, in which case they are in "maintenance." Finally, some people get to the point where they don't even have to keep working at it, and they are in termination. (That's me with tobacco -- I never even think about it any more.)
Now, this isn't really the sort of idea that can be true or not true. It's tautological -- applying a set of labels to states that common sense tells us do exist. If you fit the description of contemplation, then you're in contemplation. QED.
Its main virtue is that it guides counselors to talking with people about the right stuff. If they aren't even trying, then concentrate on why they need to try. If they are trying, concentrate on how to succeed. If they're maintaining, concentrate on how to maintain. If they're terminated, say goodbye and stop cashing their checks. (Hah!)
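Since the model really is just a labeled state machine, the whole thing fits in a few lines of code. Here's a toy Python sketch of it -- the stage names come from the model itself, but the `counseling_focus` function and its advice strings are my own illustrative mapping, not anything the model's authors published:

```python
from enum import Enum, auto

class Stage(Enum):
    """The stages of change from the Transtheoretical Model."""
    PRECONTEMPLATION = auto()  # not thinking about stopping at all
    CONTEMPLATION = auto()     # sorta kinda thinking about it
    PREPARATION = auto()       # decided to act, but hasn't really tried yet
    ACTION = auto()            # seriously working at stopping
    MAINTENANCE = auto()       # stopped, and staying stopped for a while
    TERMINATION = auto()       # doesn't even have to work at it anymore

def counseling_focus(stage: Stage) -> str:
    """What the counselor should concentrate on -- an illustrative mapping."""
    return {
        Stage.PRECONTEMPLATION: "why they need to try",
        Stage.CONTEMPLATION: "why they need to try",
        Stage.PREPARATION: "how to get started",
        Stage.ACTION: "how to succeed",
        Stage.MAINTENANCE: "how to maintain",
        Stage.TERMINATION: "say goodbye and stop cashing their checks",
    }[stage]
```

Which is pretty much the point: if you fit the label, you get the label's advice, and there's nothing more to it than that.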
It works, but it's not exactly rocket science. However, if you want to get published in peer reviewed academic journals, you have to talk in an exalted manner. So you call it the "transtheoretical model." Whatever.
Friday, March 23, 2007
I've got a bad feeling . . .
This is relevant to public health because blowing people up contributes to morbidity and mortality.
As you may have heard, a GAO report says that something like 400 munitions depots in Iraq are unguarded, and people are just bopping in and helping themselves to artillery shells, RPGs, high explosives, and what not. It seems the original plan was to have the Iraqi army guard the sites, but then LarryPaulJerryWhateverTFHisNameIsPresidentialMedalOfFreedom Bremer disbanded the Iraqi army and well, nobody had another idea. The U.S. military is too busy right now driving around getting blown up by the bombs made from all these free explosives to actually guard the places where the bombers are getting their ingredients, and you can't trust the new Iraqi army to guard them because they're mostly the same people who are looting them and blowing up Americans and each other in the first place.
Thank God the grownups are in charge.
Okay. This makes at least as much sense as invading Iraq in the first place. And while I think it's a great thing that Congress is doing oversight and passing legislation in the House that won't pass the Senate and would be vetoed even if it did, I'm not jumping nearly as high for joy as some of my friends.
It's terrific that Henry Waxman and Patrick Leahy are going to force the corporate media to remind everybody that the gang of murdering thieves lied to start the war, massively screwed it up and lied about it the whole time, let their pals loot the Iraqi and U.S. treasuries to the tune of a few tens of billions, sent the troops into combat without enough armor or equipment, let them lie in their own excrement after they came home with half their brains blown out, betrayed the country by blowing the cover of a CIA agent who actually was trying to do something about illicit weapons in the Middle East, fired U.S. Attorneys who refused to maliciously prosecute their political rivals or who legitimately prosecuted their crooked friends, systematically violated the 1st, 4th and 5th Amendments, etc. . . .
Fine. Congress can hold hearings but Alberto Gonzales isn't about to prosecute himself, therefore so what? And the Decider has already decided what he's going to do to get back in the Wartime President groove. If they get close to checkmating him, he's going to knock over the board. He's going to start a war with Iran.
Thursday, March 22, 2007
Why can't you behave yourself?
In order to get funding for a study or an intervention, you need to articulate a behavioral theory on which your project will be based. These theories are incredibly banal ideas tarted up in fancy dress -- with names like the Health Beliefs Model, Theory of Reasoned Action, Social Learning Theory, and the powerhouse Transtheoretical Model, also known as Stages of Change.
The Health Beliefs Model is more or less how your doctor is supposed to think. You make a so-called "rational" analysis comparing the risks of taking (or not taking) an action, the alternatives, the benefits, and do the math. Let's see, if I don't take the pills I'll have a reduction in life expectancy of 4.6 months, with a 20% risk of dying 2 years early, and a 30% chance that it won't make any difference, plus a .05% chance of significant side effects. If I do take the pills it will cost me $450 a year for the next 20 years which if invested in a conservative bond fund in a Roth IRA will be worth $6,000 by the time I'm 64 . . .
Uh, no, we don't really think that way. Social Learning Theory notices that we tend to imitate what other people do, and do what we are taught to do, in other words we start smoking because we aren't even thinking about dying of lung cancer, we start smoking because the cool kids do it. The Theory of Reasoned Action means I get benefits from having unsafe sex, such as I enjoy it or whatever, so maybe I'm not so dumb after all.
You get the idea. When people behave in ways that may harm their health, in the long or short term, they are making choices for which from their point of view they may have perfectly good reasons, even if we older, wiser adult types don't agree with those reasons. Can we really claim that the world will be better off if everybody accepts our idea of what's good for them, instead of their own? One argument in favor of that proposition is that the equation usually looks a lot different to people later on when they find that they actually have lung cancer or HIV or heart disease, and they regret the choices they made earlier. Another argument is that the rest of us have to take care of them. And, if we invest in efforts to persuade them or motivate them to make healthier choices, those will still be their choices, after all -- we aren't forcing them, we're just looking for ways to rebalance their Social Learning and their Reasoned Action.
But, it turns out to be very difficult. Community interventions to influence health related behaviors often have some effect, but usually small and not a lot of bang for the buck. One exception in the U.S. has been tobacco control -- the prevalence of smoking has gone down steadily for the past 20 years, although it has accelerated considerably with the introduction of coercive measures such as workplace smoking bans and effective efforts to prevent youth from buying tobacco. So far, we're getting nowhere fast with diet, physical activity, and obesity. But maybe we will figure out better ways in the years ahead.
Wednesday, March 21, 2007
I would venture to say that the single most widely studied issue in the sociology of medicine is what we used to call compliance, but which it is now politically correct to call adherence. That means, why do people do what the doctor tells them to do, or not do it, and how can we get them to do it more consistently? There is usually an unspoken assumption that they ought to do what the doctor wants. The shift from "compliance" to "adherence" was intended to acknowledge that the assumption might be incorrect, but it didn't really have any effect on people's thinking about this.
For all the overhyped and overprescribed pills out there, including ones that shouldn't be on the market at all, I will most definitely affirm that there are pills that you would probably want to take under the right circumstances. There is also plenty of standard medical advice that none of my readers will argue with, such as not smoking, drinking less than a fifth of Old Grandad a day, being physically active, etc., and that there is dietary advice which is very important for some people, such as people with diabetes.
Of course, given the inherent biases in funding and in the culture, most of the attention goes to pill popping. It turns out that in the case of long-term medication "regimens,"* as we call them, such as medication for blood pressure control, only about half of the people continue to take the pills consistently as prescribed. In the case of antiretroviral medications for HIV, doctors are obsessed with the problem, because the regimens can be relatively difficult to follow, people with HIV often have particular reasons why it's hard for them to follow a complicated drug regimen, and the standard of adherence doctors believe is necessary is very high (taking 90% or more of recommended doses). Even short courses of medication, particularly antibiotics, are problematic. Many people stop taking antibiotics as soon as they feel better, but doctors believe this increases the risk of letting drug resistant strains of bacteria loose in the world.
I have been studying this problem for a long time, and like my colleagues, I'm sorry to have to say that I still don't know a whole lot about it. Certainly people need to be given the minimum necessary conditions for adherence, which are:
- They have to understand clearly what they are supposed to do (and in the case of most people, probably have some idea of why they are supposed to do it, although there are some who don't really care about that); and
- They have to be able to afford to buy the pills and obtain them with reasonable convenience.
Once we've provided those conditions -- which we very often do not -- it's far from clear why Fred takes the pills and Alice doesn't. Researchers have looked into all sorts of basic facts about people -- gender, age, race and ethnicity, level of formal education, etc. etc. -- and it doesn't show a whole lot. Even if it did, that wouldn't tell us what to do about it. Of course, people are less likely to be adherent when the pills have unpleasant side effects, but see previous sentence. Active drug addicts and alcoholics have trouble taking their pills, but that's a special case.
They've tried all sorts of interventions -- patient education, physician communication strategies, etc. -- but only the most intensive, expensive, and probably unrealistic ones, not far short of having a nurse move in with you, really seem to work.
People just do what works for them, basically. They take the doctor's opinion into account, but if it gets to be too much of a pain in the gazongas, if it reminds them all the time that there is somehow supposedly something wrong with them, if it's costing $15 a month they'd rather spend on lottery tickets, or if they just have a vague idea that it's better not to be taking a lot of pills all the time, they'll take the pills when they feel like it, or not at all. And so far, that's still the bottom line.
How about you? Are you supposed to be taking pills? Are you doing it? Why or why not?
* "Regimen" sounds military to most people, but etymologically it's really derived from "regime," meaning government, and refers to rules or good order. However, the set of available drugs to treat a condition is called the physician's "armamentarium," indeed one of those classic military metaphors in medicine.
Tuesday, March 20, 2007
I just discovered I had that problem again where the blog wasn't displaying properly in IE. Should be fixed now. In case you were wondering, yes, I have been posting for the past few days, please do read all the great stuff you missed, if you missed it, and if it's great.
P.S. -- Get Firefox! Right now! What are you waiting for?
One of the most difficult challenges the current U.S. administration will pose for historians is deciding which of its innumerable crimes against humanity constitutes the greatest outrage. Of course even posing the question may constitute a sin of reductionism - the various outrages are all avatars of the same God of greed and powerlust. It is, after all, loyalty to the billionaires of the oil industry, and contempt for humanity in general, that brought us the campaign of lies that produced the Iraq war, and the even more sustained and elaborate campaign of lies denying global climate change that prevented any serious effort to do anything about it.
Right now the lies are unravelling so fast, and in so many places, that we can't even take it all in. Henry Waxman got done with Valerie Plame on Friday and moved on to carbon dioxide yesterday, but the climate change hearing was scarcely noticed. Here's the money shot:
One example showed how a report originally said the U.S. National Research Council had concluded that "greenhouse gases are accumulating in the atmosphere as a result of human activities, causing surface air temperatures to rise and subsurface ocean temperatures to rise."
Philip Cooney, the oil lobbyist who became chief of staff at the Council on Environmental Quality, changed that to read: "Some activities emit greenhouse gases that directly or indirectly may affect the balance of incoming and outgoing radiation, thereby potentially affecting climate on regional and global scales."
An oil industry lobbyist as chief of staff at the Council on Environmental Quality? Now that's a crime against humanity. I simply cannot understand the depravity of people who not just blithely, but in an aggressive and bullying manner set out to severely damage the world their own grandchildren will live in just to add to their already obscene wealth.
Is that worse than an illegal war of aggression that's killed a few hundred thousand people and left millions homeless? I would say yes. So how do these people compare to, say, Adolf Eichmann?
Update: It turns out I'm not the only person thinking about Nazis in this connection.
A government scientist, under sharp questioning by a federal panel for his outspoken views on global warming, stood by his view today that the Bush administration's information policies smacked of Nazi Germany. James Hansen, director of the Goddard Institute for Space Studies in the National Aeronautics and Space Administration, took particular issue with the administration's rule that a government information officer listen in on his interviews with reporters and its refusal to allow him to be interviewed by National Public Radio.
"This is the United States," Hansen told the House Oversight and Government Reform Committee. "We do have freedom of speech here."
Now, dig Republic Party hack Darrell Issa:
But Rep. Darrell Issa (R-Vista) said it was reasonable for Hansen's employer to ask him not to state views publicly that contradicted administration policy.
"I am concerned that many scientists are increasingly engaging in political advocacy and that some issues of science have become increasingly partisan as some politicians sense that there is a political gain to be found on issues like stem cells, teaching evolution and climate change," Issa said.
Uhh, Darrell, see, that's not how it is. Scientists aren't engaging in political advocacy or being partisan when they teach evolution or study climate change. There's this concept called truth. Yes, yes, it has a liberal bias. You're just going to have to get used to that.
Monday, March 19, 2007
Weapons of Mass Communication
I finally broke down and got cable TV -- yup, for the past 30 years, when I've had a TV at all, I've had rabbit ears sitting on top of it. I haven't been able to watch CNN, Fox News, or for that matter the Daily Show, any of that stuff. But I needed cable to watch the Red Sox, so I had no choice.
So, for better or for worse, I can now experience the alternate realities of the current age of mass representation. I actually watched Valerie Plame Wilson's testimony before Henry Waxman's committee on C-Span, which means that I now have actual grounds for my complaint about the news coverage. Here are some things you would never know if you got your information about this event from the TV news, or for that matter from the New York Times.
Only three Republic Party Congresspersons bothered to show up, whereas the Democratic side of the dais was full. The Chair allotted equal time to all members, but thanks to inexorable laws of arithmetic this meant that the Republic Party only had about 15% of the time. During Democratic questioning of the witness, it was established that:
- VPW was indeed an undercover CIA operative, who had traveled abroad under cover within the past five years, whose employment status was classified information, and who kept her work a secret from everybody but her husband and colleagues.
- Multiple administration officials, in the White House and State Department, including Karl Rove and Richard Cheney, revealed her identity -- classified information -- to people in government and to reporters. (BTW, it is not known who provided the information to Rove or Cheney; that was not established in the Libby trial.)
- The revelation that VPW was a secret agent destroyed ongoing intelligence operations, derailed her career, and put the lives of U.S. intelligence assets at risk. The CIA approved having Congressman Waxman read a statement to the above effect, and Plame Wilson testified to it under oath.
- It was not VPW's idea for her husband, Ambassador Joseph Wilson, to undertake a fact finding mission to Niger. She did not recommend him for the job. It was not nepotism. In fact, she would have preferred that he not go. She was merely asked to convey the message to him that the CIA was interested in having him do it.
Every one of these facts is in direct contradiction of widely repeated Republic Party talking points, including assertions repeatedly made by reporters and pundits on TV talk shows as well as avowed administration apologists.
The Republic members of Congress, in their questioning, established the following: that Joseph Wilson is a Democrat. That Valerie Plame Wilson is a Democrat. That VPW once attended a conference at which her husband spoke about the Iraq war. She was there as his spouse and did not participate. Then they complained that they hadn't had enough time to ask all the important questions they still had. Of course, if a couple more of them had shown up they would have had all the time they wanted. Furthermore, if they had a bunch of zinger questions lined up, why did they use their time to ask a bunch of lame ones?
So, here's how the corporate media covers this hearing. Since reality did not supply "balance," they had to alter reality, giving equal time in the newscasts to Democratic and Republic questioners. Since that would not permit them to enumerate most of the important facts established in the hearing, we get to learn only two: That VPW says she was undercover (and not that the CIA confirms this), and that she is a Democrat. Written reporting was little better, but you don't have to take my word for it, here's Greg Mitchell at Editor & Publisher.
C-Span also carried the rest of the hearing, which the corporate media completely ignored, including the revelation that there had been no internal investigation of this leak by the White House at all! Translation: George W. Bush is a stone liar.
Now, one has to consider the theater of this as well. I doubt that more than 1/20th of 1 percent of the public actually watched it on C-Span, but it did seem more like the movie version than the real thing. I'm not sure how this plays in Peoria, but VPW is -- how should I say this? -- a stone fox, woo woo woo, hubba hubba hubba. Now, that probably boosts her cred as a secret agent woman (Plame, Valerie Plame. Why thank you, I'll have a sweet Manhattan, stirred not shaken, with a cherry), but as a witness, she seemed more like Sharon Stone playing Valerie Plame in the movie version than the actual real person -- even though I admit, she is the actual real person.
Also, there was the slightly deranged looking woman with the pink T-shirt that said "Impeach Bush Now" who stayed in the frame as VPW testified. The CNN Newsroom show, to its credit, did a respectful interview with her, and she is not deranged at all; she's authentic, unbriefed and unpolished, and she did pretty well. Nevertheless, in Peoria she is likely to be perceived as an offensively countercultural 60's refugee. Anyway, she added to the whole feeling of the event as artifice, a prop added by the auteur to bring the controversy raging outside into the hearing room.
The facts elucidated in the hearing ought to be most offensive to right wing super-patriot militarist wing nuts, in fact. They would expect leftists to be the ones happy to see a CIA agent exposed, and they think people who do that sort of thing ought to be shot. But in this case, the world has been turned upside down -- and your Liberal Media are making sure it stays that way.
Saturday, March 17, 2007
Am I missing something here?
Nowadays, it seems, anybody who isn't insane is obliged to appear deranged by ranting furiously about the obvious.
Based on information fully on the public record, including sworn testimony before a grand jury, a criminal trial jury, and Congress, as well as information widely reported in the news media, the putative President of the United States, his chief of staff, and the Vice President, are traitors. They betrayed the country to its enemies, in what they themselves declared to be a time of war, in order to gain political advantage -- specifically, the political advantage of not being held to account for a campaign of lies to the American people, the Congress, and the world, for the purpose of launching an illegal war of aggression, which is officially considered a crime against humanity.
Yesterday's congressional hearings on this matter merited a story starting on the lower right-hand corner of the front page of the Boston Globe, which is progress, since previous coverage has generally been on page A-17. The coverage on NBC news featured a Republican member of Congress accusing Valerie Plame Wilson of being a Democrat, a fact which, being under oath and all, she was forced to acknowledge. Obviously, she just decided to be an undercover CIA agent tracking nuclear weapons programs in Iran and get her cover blown by Karl Rove in order to embarrass the Republican Party. It's all a plot after all.
I shouldn't even bother to add that the so-called president promised a thorough investigation when this first happened and that he would fire anybody involved. Of course there was no investigation, and he has yet to fire himself. But that doesn't seem to be a problem either. Oh, did I mention that the DOJ ordered U.S. Attorneys to undertake malicious prosecutions in order to rig elections, and fired the ones who refused? Well, that's only a minor point.
Now, I realize that life must go on and we probably can't expect everybody to drop whatever else they are doing and descend on the White House with torches and pitchforks. But maybe we should. In this circumstance, we don't have to worry about how to define High Crimes and Misdemeanors, because the Constitution specifically names treason as grounds for impeachment. But that's off the table because there is no fellatio involved. Look, this situation has evolved to the point where I'm starting to question my own sanity. How can it continue? Richard Nixon properly faced impeachment and was forced to resign, and his crimes pale in comparison to these. What the hell is wrong with us?
Friday, March 16, 2007
Sorry for the absence
I was in meetings most of Wednesday, and didn't have a chance to post, then I was disconnected from Your Intertubes for a while as I had a new cable service installed. Whoa, what a difference!
Anyhow, the meeting, in a complicated and peripheral way, concerned the burgeoning Pay for Performance movement in health care, whereby Medicare, Medicaid, and private health plans give providers - mostly hospitals - bonuses for following certain standard, supposedly best procedures with high percentages of their patients. These are things like giving people an aspirin right away when they come into the ED with an apparent heart attack, and giving them a prescription for beta blockers when they leave the hospital, etc.
It sounds like a good idea -- if you're supposed to do it, then why the hell don't you and why not offer a financial incentive to make sure you do? The main trouble is that, at this point, we don't have a lot of evidence linking these supposedly best practices to better patient outcomes. Another problem with all this is that by the time you get to the hospital with some serious condition, it's already too late, and wouldn't it have been better to prevent it? But there are daunting conceptual and practical difficulties when it comes to implementing this sort of system with primary care practices.
For one thing, doctors can offer all the interventions they should to control high blood pressure and cholesterol, prevent diabetes, etc., but the patients have to actually carry them out -- take the pills, lose the weight, and so on. There's no sense writing a prescription for somebody who isn't going to take the stuff, but as a physician you don't want to be penalized in that situation. Doctors have to use judgment and do what's right in specific situations -- formulaic guidelines don't work very well. Another problem is that it is just technically very difficult and expensive to collect the data that would be necessary.
There are great difficulties trying to measure, or even define, quality in health care. But we do want to get our money's worth. I'll have more to say about this in the days ahead.
Wednesday, March 14, 2007
Seriously, though, I don't advocate a ban on DTC ads. Why does Pharma not get to advertise if other businesses do? If you say it's because you have to go through a doc to get an Rx or because people are not smart enough to understand the relevant medicine/science, it's almost like saying 'doctor knows best' or that people cannot be savvy healthcare consumers. If you say it's because Pharma should be spending money on R&D instead of advertising, it suggests they are not autonomous businesses that can spend money however they see fit. I mean, sure the ads are basically crap, but why not just tune them out along with all the annoying ads for cars and whatnot?
There are a couple of issues in there that I'll have to tease apart in order to respond. The first is the criticism of medicine as paternalistic, the traditional "expert model" in which patients are completely passive consumers. Would a ban on prescription drug advertising suggest that we don't think consumers are capable of processing messages skeptically, winnowing out the wheat from the chaff, and participating effectively in decisions about their own treatment?
The second is the question of why pharmaceutical products should be treated any differently from cars or men's cologne or cosmetics or junk food. Most advertising is largely bullshit, so what else is new?
I agree that we should do our best to be informed consumers of health care and to participate in decisions about our own treatment. I would say that in the real world, we more often need to resist our doctors' efforts to give us drugs than to encourage them. But that's an empirical question. The problem with advertising is that it is not neutral information (see above paragraph), it is intended to manipulate our behavior, not to inform us. Barbara Mintzes and colleagues compared physician visits in Sacramento and Vancouver, Canada, where patients had less exposure to prescription drug advertising. (It wasn't a perfect choice of site since people in Vancouver can see U.S. TV.) They conclude:
Patients who requested DTCA drugs were much more likely to receive 1 or more new prescriptions (for requested drugs or alternatives) than those who did not request DTCA drugs (OR 16.9, 95% CI 7.5–38.2). Physicians judged 50.0% of new prescriptions for requested DTCA drugs to be only "possible" or "unlikely" choices for other similar patients, as compared with 12.4% of new prescriptions not requested by patients (p < 0.001).
Interpretation: Our results suggest that more advertising leads to more requests for advertised medicines, and more prescriptions. If DTCA opens a conversation between patients and physicians, that conversation is highly likely to end with a prescription, often despite physician ambivalence about treatment choice.
So, when patients come in "empowered" by DTCA, doctors are likely to go along with their requests, even though half the time, the doctors don't actually think it's a good idea. If the patients had gotten their information from a neutral source, perhaps their requests would be more likely to be appropriate; and, since the requests would be based on rational analysis rather than Pavlovian behavior modification, the patients would be more easily talked out of it if the prescription wasn't really appropriate after all.
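To make concrete what an odds ratio like the study's 16.9 actually measures, here's a minimal sketch in Python. The counts below are invented for illustration -- they are not the Mintzes data -- but the arithmetic is the standard 2x2-table calculation:

```python
# Illustrative only: hypothetical counts, NOT the actual study data.
# Rows: patients who requested an advertised drug vs. those who didn't.
# Columns: received a new prescription vs. did not.

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table: (a/b) / (c/d) = (a*d) / (b*c)."""
    return (a * d) / (b * c)

# Hypothetical: 60 of 80 requesters got a prescription,
# but only 30 of 320 non-requesters did.
requested_rx, requested_no_rx = 60, 20
no_request_rx, no_request_no_rx = 30, 290

or_value = odds_ratio(requested_rx, requested_no_rx,
                      no_request_rx, no_request_no_rx)
print(round(or_value, 1))  # → 29.0: requesters' odds of a script are ~29x higher
```

The point of the ratio form is that it compares the *odds* of a prescription between the two groups, which is why a value like 16.9 signals such a dramatic difference in how the visits end.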
As for the question of why prescription drugs should be treated differently from other products, there is precedent, of course. Tobacco ads have been banned from television by federal law since 1971. Distilled spirits were long kept off TV by a voluntary industry agreement, and there are restrictions (not so you'd know it) on wine and beer advertising.
While drugs obviously can benefit people, all drugs are also poisons. They have risks as well as benefits, and if used by the wrong people in the wrong way, are more likely to be harmful than helpful. That's why many drugs are available by prescription only, precisely because the question of whether any individual ought to take them is complex and requires expert knowledge and judgment. I wish we could all know as much as our doctors but the reason why we go to doctors is that we don't. What we are buying (whether out of pocket or through insurance) is their expertise. If we knew as much as they did, they wouldn't exist. To me, the concept of the informed consumer does not mean that we should try to replace the physician's expertise with our own research, but rather that we should be able to process the physician's technical expertise in light of our own goals, tolerance for discomfort, risk aversion, and values.
That doesn't mean doctors are Godlike or infallible. Indeed, they too are manipulated by drug companies that send attractive young people in flattering clothing around to their offices, buy them fancy meals, and even golf vacations, as well as giving them trinkets and free drugs for their needy patients. Drug ads in medical journals use imagery and emotional manipulation as well as providing information. I think all that should be banned as well. (The free samples have a minor social benefit as long as we have uninsured people, but if they are supposed to be a public service, they should be allocated according to need and delivered without a sales pitch.)
Nevertheless, if we don't ascribe expertise to doctors we have a much more profound problem than DTCA to worry about. Anyway, most current proposals in the U.S. don't call for banning DTCA, but for restrictions, such as not allowing advertising of drugs within 2 years of initial approval, before substantial knowledge about safety and effectiveness in the general population has been acquired; or requiring that they consist only of information presented in a neutral, non-manipulative manner.
Thanks to the Internet, consumers who want to be informed about prescription drugs have much better alternatives than watching drug ads on TV. You can get all the downside news here, from Public Citizen's "Worst Pills" site. They list some drugs, including some which are heavily advertised, that they don't think anybody should take. (Now there's a second opinion for you.) You can get a fair and balanced assessment from the National Library of Medicine here. There are a lot of other sites that offer prescription drug info, but they are mostly supported by advertising, so I'm a little leery.
Anyhow, that's what I think. I'll be happy to hear from dissenters.
Tuesday, March 13, 2007
Cleaning up the desk
Too much to talk about -- much of it concerning developments on issues we've been covering here.
A group of health policy experts convened by the Brandeis University Health Industry Forum says the U.S. needs an agency to evaluate medical procedures, drugs and devices, similar to the UK's National Institute for Health and Clinical Excellence (NICE). I concur.
The International Narcotics Control Board finds that misuse and trafficking in prescription drugs are about to surpass problems with illicit drugs. They are mostly talking about narcotics. Diversion and misuse of narcotics is underreported, they say. Nevertheless, consumption of opioids increased by more than 100% in more than 50 countries since the mid-1990s. In the U.S., that's 7.8 million people abusing prescription drugs in 1992, and 15.1 million in 2003. Fentanyl, hydrocodone, and oxycodone are leading to growing numbers of deaths in North America. We have discussed this previously here and here.
Friend Libby Bradshaw, in the March 7 JAMA, reviews When Illness Goes Public: Celebrity Patients and How we Look at Medicine, by Barron H. Lerner. Summarizing Lerner, Dr. Bradshaw notes that the media narratives of celebrity illnesses demonstrate "how celebrity patients can both inform and misrepresent issues to the public." Benefits may include informing people about diseases and treatments, and illuminating difficult choices and ethical dilemmas. But of course conclusions drawn from anecdotes - single cases - can be misleading.
I was reminded of a woman with HIV who I interviewed a few years back. She had decided to start taking antiretroviral medications because Magic Johnson was taking them. Magic was an advocate of early treatment -- starting on ARVs before any symptoms of HIV disease appeared, including reduced CD4+ cell counts. This is actually probably not the right choice for most people, because of drug side effects and the risk of creating viral drug resistance. Celebrities also are likely to get the best, most expensive care, and they have other advantages that mean they are likely to do better than the average person with a similar condition.
I haven't had a chance to read the book but I would venture to say that a common narrative portrays the ill celebrity, along with medicine and doctors, as heroic and triumphant. Remember the advertisement showing Christopher Reeve rising up and walking? We're still a long, long way from reconnecting severed spinal cords.
Monday, March 12, 2007
Take a pill, be happy!
If you didn't blink, you might have caught media coverage of this study by Dominick Frosch and colleagues about Direct to Consumer Advertising (DTCA) of drugs. Being braver and more stoical than I, Dr. Frosch and his friends couch potatoed for two full weeks and watched all the prescription drug ads during prime time.
You know what the drug companies say -- DTCA is good for you, because it educates you about medical issues. And as we also know, Mighty Morphin' Power Rangers educates your kids. Here's a sample drug ad from the report.
Using black humor, the first 2 frames show "Joe" running through the "Land of No," a grim and deserted urban setting. Joe has lost control over his cholesterol, and the narrator suggests that lifestyle changes alone are not enough to keep him healthy. In the next 2 frames, Joe visits his doctor, who welcomes him approvingly and encourages him to take rosuvastatin. In the final 2 frames, Joe leaves the doctor’s office and enters into sunny suburbia, or the "Land of Success," where his smiling neighbor waves as he walks home to enjoy a picnic with his smiling family.
They have a lot of facts and figures about the content of these ads, and they generally prove that the educational value is negative. The FDA requires the ads to have what the researchers call "rational content," because they have to discuss side effects. But like the car ads that say "80% of Toadmobiles sold since 1995 are still on the road," the rational content is only truthy. (Think about it. What does this really say about the average life span of a Toadmobile?)
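The Toadmobile slogan is worth a moment's arithmetic. Here's a toy sketch (Python, with entirely invented sales figures) showing why "80% of cars sold since 1995 are still on the road" tells you almost nothing about lifespan: if sales have been growing, most of the cars ever sold are young ones that haven't had time to die yet.

```python
# Toy model of the survivorship claim. All numbers are invented.

def pct_still_running(sales_by_year, lifespan, current_year=2007):
    """Percent of all cars sold that are younger than `lifespan` years,
    assuming every car dies at exactly `lifespan`."""
    sold = sum(sales_by_year.values())
    running = sum(n for year, n in sales_by_year.items()
                  if current_year - year < lifespan)
    return 100 * running / sold

# Sales roughly doubling every three years: the fleet skews young.
sales = {1995: 10, 1998: 20, 2001: 40, 2004: 80, 2007: 160}

# Even if every car dies at just 5 years, most "sold since 1995"
# are still on the road, because most were sold recently.
print(round(pct_still_running(sales, lifespan=5), 1))  # → 77.4
```

So a figure near 80% is perfectly compatible with a car that rusts out in five years -- exactly the sense in which the claim is only "truthy."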
Anyway, car ads seldom rely on those kinds of appeals. Instead they tell you that if you buy a Toadmobile, women will fall to their knees and beg you to ravish them. Drug ads work the same way. Take the pill, and you'll feel in control, you'll associate with beautiful people, you'll be respected in the community, you'll enter a wondrous land of enchantment and delight. And as far as keeping yourself healthy by eating right and keeping fit -- well, that can't possibly work.
Can we please ban this crap?
Sunday Sermonette: What you don't know is good for you
While religious fundamentalism has always been with us, it seems resurgent now, at least in the United States. But polls don't show that more people hold fundamentalist doctrines; on the contrary. While Americans remain far more likely to hold irrational beliefs than do Europeans -- disturbingly so, in fact, including a majority who do not believe in the reality of Darwinian evolution -- secularism and, indeed, atheism have been growing in the United States. Nevertheless, atavistic fundamentalist religion has become increasingly assertive, political and uncompromising.
This is a reaction to a newly felt threat from reason, a sign of weakness rather than growing power. It is a symptom of growing discomfort with uncertainty. Now many people are openly challenging faith, and doing so with rationally unassailable evidence and arguments. No matter how fervently one believes and prays and denounces, that vexing, itch-making mosquito of doubt must now always flit about the edge of the true believer's consciousness.
To live by reason, however, is to happily live by doubt. Reason does not, as some people imagine, sort propositions into three categories, false, unproven and true. Rather, ideas are on a continuum, from as surely false as they can be - e.g., the earth is flat, the universe is 6,000 years old - to as surely true as any can be - e.g., the earth revolves around the sun. But the domain of science is neither of these. It is everything in between, the realm of ignorance and uncertainty.
Evaluating scientific claims is not a matter of running them through a truth detector. There is no formula, no single "scientific method," that can assign a degree of certainty to a claim. Rather, weighing scientific evidence requires assessing what kind of evidence it is -- whether observational or experimental -- and its quality as such, including the reliability of observations, the strength of inferences including the probabilities derived from statistical analysis, the cumulative weight of evidence from multiple sources, and the subtractive force of apparently contradictory evidence. Since nobody can be an expert in more than a small portion of scientific endeavor, we must also assess assertions in the rest of the sphere of knowledge based on the general picture we can gain of how they are derived, and the reputation of the people who make them.
Scientific understanding of the world gains credence from coherence. When the pieces from various lines of inquiry fit together, we become more confident in each piece. So, biologists trust the work of geologists because their findings cohere; equally, geologists trust the work of astronomers. It all makes sense as a single picture. When something doesn't seem to fit, we become less certain. Is there something wrong with the odd piece, or with the larger theory with which it seems not to accord? Sometimes these odd pieces trigger a revolution in our understanding of the world; sometimes they just turn out to be a mistake. The erratic and sometimes violent progress of understanding is troubling to people who live by belief and anchor themselves in certainty, and they will claim that it undermines the credibility of science.
On the contrary, it is the foundation of scientific credibility. The willingness to overturn old beliefs, to accept error, to embrace ferment and change, is what makes reason more credible than religion, which cannot encompass falsification. Reason can progress. Religion can only stagnate and rot in place.
Friday, March 09, 2007
More than you wanted to know about drug eluting stents
The hordes who have been hanging on every word of my jihad on behalf of open access scientific publishing will probably consider it good news that this week's NEJM is, in an unprecedented move, largely open access. Yup, you can read two "perspectives" pieces, three research reports, two meta-analyses, and an editorial, all on the subject of the relative long-term safety and efficacy of drug eluting vs. bare metal stents.
The bad news is, it's highly unlikely that you care enough about this subject to take advantage of this exciting, one-time offer. If you do care enough -- perhaps because you or a loved one are considering or have recently undergone a stenting procedure -- what you really need to know probably boils down to less than this avalanche of information. So I find this decision by the NEJM editors rather odd. Usually their weekly open-access piece is an analysis of a policy or ethical issue that does have much broader public interest, and is more accessible to a broad readership.
Anyway, since they obviously think this is a matter of urgent public importance, let me give you my take on it. Stents, for those of you who don't know, are metal devices in the form of what might be called cylindrical baskets - sort of like those novelty "Chinese" handcuffs made of reed - that are inserted into blocked coronary arteries and then expanded to let the blood flow freely. The Prince of Insufficient Light, "Crashcart" Dick Cheney, is the proud owner of two or three of them.
They work fine, but in the end, the plaque often regrows over the stents and the arteries close up again. So, some companies started manufacturing stents with coatings that release compounds which inhibit regrowth of the plaque. In clinical trials, they worked well, albeit only with 9 months of follow-up, so the FDA approved two such devices in 2003 and 2004, with the proviso that the companies follow patients for 5 years to make sure no problems cropped up.
Alas, a conference presentation in 2006 found that after 7-18 months, people with drug eluting stents had worse outcomes than people with plain metal stents -- higher rates of heart attacks and cardiac death. It seems the problem was that blood clots (as opposed to atherosclerotic plaque) were forming at a higher rate on the drug eluting stents. Other studies variously confirmed or contradicted this finding.
After wading through all the conflicting and uncertain data, William Maisel in one of those free perspective articles basically concludes that the drug eluting stents are superior for the situations in which they were first tested -- people with "discrete, previously untreated lesions in native coronary vessels." The problem seems to be that doctors have been using them "off label," in situations such as people who have had heart attacks (I'm being sloppy with the vernacular, we're really talking about a specific kind of heart attack called an Acute Myocardial Infarction, but you don't care), in coronary artery bypass grafts, and in more complicated or serious kinds of blockages. It turns out that 60% of use has been off-label. In these situations, it appears, the drug-eluting stents are riskier.
So all you really needed was Dr. Maisel's article. This turns out to be a very clear example of a pervasive problem in medicine today. Drugs and devices are approved for specific uses in specific populations, based on trials that are limited to those circumstances. Then the manufacturers start selling them for all sorts of other purposes. Surprise, surprise! Bad things happen.
The law says that once a device or a drug is approved, doctors can use it off-label. The manufacturers aren't supposed to market it for off-label uses, but wink wink -- detailers can do so very easily, and subtly, and not get caught. Also, there's nothing illegal about manufacturers doing small, underpowered, short-term trials that appear to support off label use and then publicizing them -- even though they don't provide good enough evidence to get the products approved for those uses. And the manufacturers certainly don't discourage off-label use. There have been a couple of cases of manufacturers getting into trouble for off-label marketing, but they are rare. Another recent example of off-label prescribing which has gotten a lot of attention is giving powerful psych meds to children. That one happens to really outrage me, and I hope to write about it soon.
So here's just one more example of the pervasive corruption in the medical industry. The news you can use here is that if you have coronary artery disease and you are considering stenting, make damn sure your doctor is familiar with these studies and discusses them with you in light of your particular circumstances. Believe me, the FDA is not going to protect you.
Thursday, March 08, 2007
I picked up my NYWT this morning (yup, they've re-earned the "W" with the attempted Whitewatering of Barack Obama, their wailing and gnashing of teeth over Patrick Fitzgerald destroying the free press, and sundry other acts of whorishness, but I digress) to find that "conservatives" have risen up to demand a pardon for Lewis Libby.
So what is "conservative" about lying to Congress, the American people, and the world, systematically and ruthlessly, over nearly 18 months, in order to manipulate the nation into a disastrous, criminal war of aggression? What is "conservative" about deliberately disclosing a legitimate national security secret -- the identity of a covert agent tracking unconventional weapons programs in the Middle East, no less -- thereby destroying an intelligence operation built up over many years, in order to punish a public servant who had done his patriotic duty by refuting one of those "conservative" lies? What is "conservative" about lying in order to obstruct the investigation into this crime? And what is "conservative" about demanding that the president undo the workings of the criminal justice system?
What exactly does the word "conservative" mean nowadays? What does the Republican Party actually stand for? Does it have a coherent ideology, an articulable vision of how the country should be governed? As far as I can tell, it's the party of insanity, the party of believing things that aren't true.
It is now a "conservative" belief that the earth is 6,000 years old, that humans used to ride around on dinosaurs, and that the Grand Canyon was made by Noah's flood.
It is now the "conservative" position that the international community of atmospheric scientists is nothing more than a vast conspiracy to undermine capitalism by fabricating conclusions about the effects of burning fossil fuels.
"Conservatives" believe that a fundamentalist version of Christianity, anchored in the Old Testament, should be the state religion of the United States, and should determine our laws, what is taught in school, and who is eligible for political office.
It is now a "conservative" point of view that the descent of Iraq into violent anarchy, the grinding decline of the U.S. armed forces, the nearly universal hatred and contempt for U.S. actions around the world, and the squandering of $2 trillion to achieve all this, have amounted to a triumphal success that has made us safer.
It is now the "conservative" theory that the President of the United States, merely by asserting the existence of a state of war, becomes not only Commander in Chief of the armed forces but also Commander in Chief of the nation, with unlimited, unaccountable power to ignore laws passed by Congress and to ignore the Constitution, including but not limited to the power to make any person, anywhere in the world, disappear into a secret dungeon, never to be seen again, to be tortured at the president's whim, with no legal recourse; the power to attack any nation on earth on a mere assertion that it may present a danger to the United States at some time in the future; the power to intercept communications, search private homes, even determine what library books people have been reading, without a warrant or judicial review and in total secrecy.
In other words, to be a "conservative" is to be a deranged fascist. A large majority of the people, in spite of the studied refusal of the corporate media to tell them the truth, have now recognized this. They do not want these people to be in power any longer. But the Democrats in Congress are still, apparently, terrified of them.
Richard Cheney and George W. Bush are both criminals. They are dangerous psychopaths. They hate America. They hate you, because you are not of their wealthy, powerful secretive tribe. They hate humanity. They must be impeached and removed from office. That is the only way to restore the Constitution and to create any hope of putting the United States on the right side of history.
Wednesday, March 07, 2007
Man Bites Dog
Since this is a popular topic here, and also, as it turns out, in the medical profession, I thought I should do a bit more with the question of dog bites. Just after I first posted on the subject, BMJ happened to come out with a substantial discussion of the issue. Physicians see dog bites so often, whether in primary practice or the ED, that proper treatment is one of their most important skills. In a clinical review (subscription only) Marina Morgan and John Palmer write:
Bites and maulings by dogs, sometimes fatal, are a worldwide problem and particularly affect children. Every year 250 000 people who have been bitten by dogs attend minor injuries and emergency units in the United Kingdom, and some of them are admitted to hospital for surgical debridement or intravenous antibiotics. . . The “hole and tear” effect—whereby canine teeth anchor the person while other teeth bite, shear, and tear the tissues—results in stretch lacerations, easily piercing immature cranial bones. The biting force of canine jaws varies with the breed, from 310 kPa to nearly 31 790 kPa in specially trained attack dogs. Large wounds, significant devitalisation, and high mortality can result, with the highest mortality in neonates (six times that in toddlers), who are usually bitten by household pets.
Eeew. Trevor Jackson, writing in the same journal in 2005 (and yes, even you common rabble can read this because it's more than six months old) asks, "Is it time to ban dogs as household pets?":
After tobacco, alcohol, and sports utility vehicles, how long will it be before public health experts get serious about the menace of widespread dog ownership? Despite ongoing research into dog bites and zoonoses, the occasional media outcry about pit bull terrier and rottweiler maulings, and legislation such as the United Kingdom's Dangerous Dogs Act of 1991, pet dogs and their owners have mostly been given a rather long leash. And yet it increasingly seems extraordinary to me—considering all the things that the law prevents us from doing—that it is legal for people to keep a potentially dangerous wild animal in their home. Or even, as many postmen and postwomen have discovered to their cost, in their front gardens. . . .
The usual rejoinder to complaints about dog behaviour is that it is the owners, and not their pets, that are to blame—which is precisely why dog ownership should be curbed. We need responsible dog owners, people say. Call me dogmatic, but responsible dog ownership is mostly a contradiction in terms . . .
By now, I can envision readers barking in outrage and whacking their computer monitors with a rolled up newspaper. There is absolutely no way that such a measure would even be proposed by any politician who wanted to avoid impeachment, followed by tarring and feathering and being run out of town on a rail. But that happens to be what I find interesting about this question. Why are we so devoted to Canis lupus familiaris? What is this relationship all about?
In the same issue, June McNicholas and colleagues discuss the broad evidence on pet ownership and human health. Despite some early studies showing benefits, the evidence generally does not support a conclusion that owning pets is associated with measurable reduction in disease. However, these authors invoke a broader definition of health, in which the emotional bonds people have with pets are valuable in and of themselves.
So what is this cross-species relationship all about? I'm not an expert on the subject, but it seems pretty obvious that the symbiosis between humans and wolves -- which is what dogs actually are -- arose in the context of hunting. It is easy to imagine packs of human and canine hunters discovering that they could benefit from cooperation. When humans domesticated food animals, it was a short step from hunting to herding. Today, a small minority of dogs still work as herders, and as hunting partners. Dogs also use their talents in law enforcement to sniff out drugs, bombs, fugitives and bodies, and in private security to guard junkyards and other places where valuables are stored. The United States military, uniquely among nations, uses dogs to torture prisoners. And of course, blind people use dogs for guidance.
However, the vast majority of dogs are simply family members, pampered and generally useless. As McNicholas et al put it, "Companionship—a commonly stated reason for pet ownership—is regarded as theoretically distinct from social support in that it does not offer extrinsic support but provides intrinsic satisfactions, such as shared pleasure in recreation, relaxation, and uncensored spontaneity, all of which add to quality of life." But people most certainly are not interested in inviting wolves into their families. What has happened is that through selective breeding, people have developed a subspecies of wolf that retains juvenile characteristics throughout life. These eternal wolf puppies see us as the adults in their lives, and yield (sometimes reluctantly) to our authority. As pack hunters, humans and wolves share enough of the basics of social interaction that our behavior is mutually interpretable, and we have found a way to live together.
But I suspect that a visitor from Alpha Centauri would find this situation quite surprising. We actually take the quite non-trivial risk that a predatory animal might maim or kill our children, just to keep it around for yucks and cuddles. That's just the way it is -- it is bedrock in our culture, dogs mean a lot to us, and that's all there is to it.
However, people need to understand that all dogs, even the sweetest tempered, can in fact be dangerous. Yet we take them for granted and treat them far too casually. Most authorities recommend that dogs and children not be allowed together unsupervised, but that is completely alien to our usual practice. Morgan and Palmer write, "Generally, children should be taught to treat dogs with respect, avoid direct eye contact, and not tease them. They should be taught not to approach an unfamiliar dog; play with any dog unless under close supervision; run or scream in the presence of a dog; pet a dog without at first letting it sniff you; or disturb a dog that is eating, sleeping, or caring for puppies." In other words, folks, love them all you want, but that means taking them seriously.
*kPa, the kilopascal, is a unit of pressure equivalent to 1,000 newtons per square meter. A newton is a unit of force equal to the force of earth's gravity at the surface on a mass of about 102 grams. An adult of approximately average weight (70 kg) experiences a gravitational force of approximately 700 newtons. 31,790 kPa is a helluva lot of pressure.
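For the numerically inclined, the footnote's unit arithmetic can be checked in a few lines. Note that the tooth contact area below is purely an illustrative guess on my part, not a figure from the article -- it just shows how a pressure in kPa turns into a force in newtons.

```python
# Sanity check of the kPa/newton arithmetic in the footnote.
G = 9.81  # m/s^2, standard gravity

def pressure_to_force(pressure_kpa: float, area_m2: float) -> float:
    """Force in newtons exerted by a given pressure over a given area."""
    return pressure_kpa * 1_000 * area_m2  # 1 kPa = 1,000 N/m^2

# 1 N is the weight of ~102 g at the earth's surface: m = F / g
mass_per_newton_g = 1 / G * 1000          # about 102 g

# A 70 kg adult weighs roughly 700 N
adult_weight_n = 70 * G                   # about 687 N

# 31,790 kPa over an ASSUMED 1 cm^2 tooth contact patch (my guess, for scale)
tooth_area_m2 = 1e-4
bite_force_n = pressure_to_force(31_790, tooth_area_m2)  # 3,179 N

print(round(mass_per_newton_g, 1), round(adult_weight_n), round(bite_force_n))
```

Even on that small assumed contact patch, the force works out to several times an adult's body weight, which is the footnote's point.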
Tuesday, March 06, 2007
The Decider Decides to Screw You Yet Again
On the day when Scooter takes the fall for his boss, a day when the legacy of their lies is one very grim day in Iraq, I'm going to stick to the knitting. All of this, along with the scandal-we've-actually-known-about-for-years-but-are-only-suddenly-worrying-about-now over the abandonment of troops wounded in Iraq; the daily good news/bad news of the yo-yoing stock market; and, most important of all, Britney shaving her head and Anna Nicole still being dead has made it impossible for the newspapers and teevee to tell you much of anything about the Administration's budget proposals. So let's take a look, shall we?
The American Public Health Association summarizes the damage to public health in this press release. The budget was actually released one month ago, Feb. 5. Does anybody remember that? The Decider wants to knock $162 million out of CDC. That's a lot of dough -- enough to keep the Iraq war going for 12 hours. The purpose of invading Iraq, as I understand it, was to keep us safe from those Weapons of Mass Destruction™. Well, while the CDC budget does include an increase for pandemic flu preparedness, it completely eliminates the Preventive Health Services Block Grants to the states, which the states use to meet their most pressing needs. The budget also knocks out anti-obesity programs, cuts preparedness grants to the states, and takes $143 million from childhood immunization programs.
The Health Resources and Services Administration gets cuts to rural health programs and workforce development, along with an increase in Ryan White CARE Act funds -- but one that falls well short of meeting growing demand as more and more people are living with HIV.
According to Kim Krisberg writing in APHA's member newsletter, CDC's budget request notes increased funding since 2002 to address the threat of terrorism, but level funding for chronic disease prevention even as chronic diseases continue to affect more and more people. CDC Director Julie Gerberding told an APHA-sponsored meeting on Feb. 7, in Krisberg's words, that "core funding for infectious disease science is being eroded. CDC is facing a number of new and old health threats, but federal funding isn't coinciding with the expectations being put on the agency" -- except, of course, for increases in "emergency" funding (read "terrorism" preparedness).
Let's be clear here folks. If Al Qaeda suddenly started mounting successful Sept. 11 scale attacks every month from now until Christmas, they would end up killing less than 5% of the Americans who will die in that time from heart disease; a little over 5% of the number who will die from cancer; just 1/5 the number who will die from strokes; fewer than half the number who will die from diabetes; and here's a good one, less than 1/3 the number who will die from accidents. (We aren't supposed to use the word "accidents" in public health but you know what I mean -- everything from wrapping the Porsche turbo Carrera around a tree to slipping in the bathtub.)
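The arithmetic behind that comparison is easy to reproduce. The mortality figures below are approximate US annual death counts from the mid-2000s, rounded for illustration -- they are my assumptions, not necessarily the exact sources behind the percentages above.

```python
# Rough arithmetic for the terrorism-vs-chronic-disease comparison.
deaths_per_attack = 3_000        # roughly the Sept. 11 toll
attacks = 10                     # one per month, March through Christmas
attack_deaths = deaths_per_attack * attacks   # 30,000

annual_deaths = {                # approximate mid-2000s US figures (assumed)
    "heart disease": 650_000,
    "cancer": 560_000,
    "stroke": 145_000,
    "diabetes": 73_000,
    "accidents": 112_000,
}

for cause, n in annual_deaths.items():
    print(f"{cause}: attack deaths would be {attack_deaths / n:.0%} of annual toll")
```

Run the numbers and the ratios come out about where the paragraph puts them: under 5% for heart disease, a bit over 5% for cancer, roughly a fifth for stroke, well under half for diabetes, and under a third for accidents.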
So how about it, Dems in Congress -- will you keep us safe?
Monday, March 05, 2007
Big Brother, M.D.?
As I was preparing to post on something else, I read Ana's comment on the previous post and realized I ought to do this instead. The issue she raises for Switzerland is one that we are facing here too, but with really minimal public awareness or general debate, and that's the issue of the integrated electronic medical record (EMR).
As of now, many hospitals have adopted EMRs. If you go to a community health center or ambulatory care practice affiliated with a major hospital, chances are your doctor already uses an EMR. The doctor probably logs on to the computer while you're sitting there in the examining room, rather than having an administrative assistant pull a paper record (or not get it there in time, just as likely). Along with the EMR, there is probably a computerized order entry system, so your doctor enters your prescriptions in the computer, and it prints them out, while automatically incorporating them in the record, along with the doctor's notes on your visit, lab results, etc.
Now, imagine if all the doctors and all the hospitals in the world had access to the same record, over the Internet. Voila! The problems with polydoctory I discussed below are solved, or nearly so, in the twinkling of an electron. Every doctor you go to can instantly find out about all the other doctors you are seeing, what they have found, and how they are treating you. Furthermore, if you change doctors, or go to a new specialist, no problem -- the record is right there. The software can automatically flag potential drug interactions and contraindications, remind doctors when screening tests are due, alert when test results should be followed up . . .
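To make the interaction-flagging idea concrete, here's a minimal sketch of the kind of check an order-entry system might run. The interaction table, drug names, and function are made up for illustration -- real systems query large curated interaction databases, not a hand-written dictionary.

```python
# Toy sketch of drug-interaction flagging at order entry.
# The interaction table below is illustrative, not clinical guidance.
KNOWN_INTERACTIONS = {
    frozenset({"warfarin", "aspirin"}): "increased bleeding risk",
    frozenset({"lisinopril", "spironolactone"}): "hyperkalemia risk",
}

def check_new_order(current_meds: list[str], new_drug: str) -> list[str]:
    """Return warnings for interactions between a newly ordered drug
    and everything already on the patient's medication list."""
    warnings = []
    for med in current_meds:
        pair = frozenset({med.lower(), new_drug.lower()})
        if pair in KNOWN_INTERACTIONS:
            warnings.append(f"{new_drug} + {med}: {KNOWN_INTERACTIONS[pair]}")
    return warnings

# With a shared record, the check sees prescriptions from *every* provider:
meds_from_all_providers = ["warfarin", "metformin"]
print(check_new_order(meds_from_all_providers, "aspirin"))
```

The point of the integrated record is the input list: today each doctor can only check against the prescriptions she knows about, while a shared EMR would let the same simple check run against everything the patient takes.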
The potential of such a system to improve health care, eliminate waste, reduce medical errors and enhance patient safety is obvious, and the idea is causing a lot of excitement. Most doctors absolutely love it, and why not?
Well, of course, there's always a why not. Technical issues are fairly daunting, and the capital investment needed would be huge. It actually doesn't help that many hospitals have a head start with their own EMR systems, because those don't start out being compatible with each other, and you'd have a huge problem of integrating information across multiple existing systems and platforms. For small practices, even stand-alone EMR systems aren't cost effective, so there would need to be massive subsidies to get all of those family docs on-line.
But then there are the privacy and other human rights issues. I'll give one fairly obvious example. I mentioned below that most primary care docs have never seen a mental health record, but there's a good reason for that. Those records (and substance abuse treatment records) are even more protected than ordinary medical records, by federal legislation that pre-dates HIPAA. Mental health clinicians need to have information about history of incarceration, illegal drug use and other criminal activity up to and including serious violent crimes, suicide attempts, behavioral and personality disorders, and other stuff you really don't want in your medical record.
You might or might not want your primary care doc to know some or all of this, but you don't want every health care provider you'll ever meet to know all about it. You also don't want to take a chance that it could get out beyond the circle of health care providers, and become known to an employer or potential employer, perhaps relatives, insurers -- you get the idea. On the other hand, you certainly want behavioral health providers tied into the EMR system because they do need to have your medical records, and your primary care doc needs to know about your psych meds, at a minimum.
Of course the potential problems go beyond these relatively extreme cases.
So how do you feel about this brave new world? The theory is that the record is yours, it belongs to you. It goes wherever you go, and of course you can see it, even from home, just by entering your username and password . . . uh oh, we know those can be stolen or cracked. Can the problems be solved? Remember, the upside is huge. There are enormous potential benefits. Are they worth the risks?
Friday, March 02, 2007
Polypharmacy is the term we use when people have ridiculous numbers of prescriptions, which creates all sorts of problems -- drug interactions, side effects, difficulty in adherence, cost. It's often hard to sort out what particular drug or combination of drugs might be causing a side effect, how one drug may affect the potency of another (a very common occurrence, because drugs compete for the liver enzyme systems that break them down and remove them from the bloodstream), what's really working . . .
An equally, probably more serious problem these days is that people have multiple doctors. Medicine has become more and more specialized; the systems of medical and so-called behavioral health care (mental health and substance abuse treatment) are largely separate; and now we have the phenomenon of the hospitalist.
It used to be that primary care physicians would have "admitting privileges" at local hospitals. If you needed to be hospitalized, your personal Marcus Welby M.D. would sign you in and visit you while you were there, consulting closely with whatever surgeon was responsible for slicing and dicing you. I remember my own pediatrician handing me a bowl of ice cream after I'd had my tonsils out. No more. The chance that your primary care physician will see you while you're in the hospital, or even have the least clue what is going on there, is nil. Instead, your care will be overseen by a hospitalist -- or probably several, actually, as shifts change -- a doctor who meets you when you are admitted and will never see you again after you are discharged.
Your doctor is supposed to get what's called a discharge summary from the hospital, a document that gives your test results, results of your surgery or other procedures, follow up plans, prescribed medications, etc. But Sunil Kripalani and colleagues, in the new JAMA, find based on a literature review that it just ain't happening. At the first visit with your primary care provider after hospital discharge, the chance that she or he will have that discharge summary is somewhere around 25%. Even four weeks later, it's still not there something like 1/4 to 1/2 of the time. Obviously, this can be very dangerous.
For people with serious comorbidities -- such as the large percentage of people with HIV who have substance abuse disorders or mental health problems -- the lack of communication among providers is a continuing problem. Most medical doctors, as far as I can tell, have never even seen one of their patient's mental health records. (Any M.D.s out there care to comment?) Conversely, mental health and substance abuse treatment providers have great difficulty in keeping up-to-date on their client's medical condition and treatment. Elderly people often go to specialists such as cardiologists, who do not communicate with their primary care doctors, who don't even know what prescriptions their patients have. This can actually kill people.
Thursday, March 01, 2007
Speaking Ill of the Dead
In 1991, Arthur M. Schlesinger, Jr., the distinguished historian and advisor to Democratic presidents, published a little book called The Disuniting of America: Reflections on a Multicultural Society. The book was originally published by Whittle Communications, the people who put commercial TV into American classrooms through the Channel One Program, and who have been trying to privatize and corporatize public education through the Edison Project. I'll discuss it as standing for a prominent current in recent thought, of which it is a notable example.
Schlesinger sees himself as a lonely Horatio, defending the body politic against a rising "cult of ethnicity" that threatens to destroy the United States. Unless the "cult of ethnicity" is conquered, he fears, "we invite the fragmentation of the national community into a quarrelsome spatter of enclaves, ghettos, tribes." Given the tendency of "tribes" to hate each other, the "mixing of peoples" going on now in our shrinking world is "a major problem for the century that lies darkly ahead." His proposed solution is, in essence, that everyone who is here now must assimilate into the dominant, anglophone European settler culture, and give up any claims of separate identity or efforts to maintain distinctive culture.
To be sure, Schlesinger is careful to decry racism and exclusionism by the majority. Recent immigrants and Blacks (he refuses to say "African American") should want to become assimilated into a unified American culture, but those who already consider themselves the "owners" of that culture must accept new members of the club. But these are passing acknowledgments. Schlesinger sees the threat to American unity, not in white racism and ethnocentrism, but in claims and demands being made by representatives of disadvantaged groups, particularly Blacks and Hispanics.
Oddly, Schlesinger does not anywhere define the "cult of ethnicity," though he uses the phrase repeatedly. Ethnicity is, after all, a real phenomenon. How are we to distinguish cultists from legitimate students of ethnicity or proponents of ethnic pride? He identifies the "height of the ethnic rage" as 1974, when "Congress passed the Ethnic Heritage Studies Program Act -- a statute that, by applying the ethnic ideology to all Americans, compromised the historic right of Americans to decide their ethnic identities for themselves. The act ignored those millions of Americans -- surely a majority -- who refused identification with any particular ethnic group."
So, while this dire phenomenon peaked in 1974, and the republic has survived since, the danger is apparently still with us and greater than ever. "[T]oday it threatens to become a counterrevolution against the original theory of America as 'one people,' a common culture, a single nation."
The problem with all this, from my point of view, is that the "original theory of America as 'one people', a common culture" was never anything but a myth. The framers of the constitution, and the early panegyrists to the "new race" of Americans, saw the new race as encompassing only people of Western European origin. The immigrants who came to the new nation with the goal of leaving their old national identities behind, and being assimilated as Americans, were English, French and Dutch. They had no intention of assimilating with Black slaves or the remnants of the Indians.
Later, as waves of immigrants came from Ireland, Central and Eastern Europe, Asia, and Latin America, Schlesinger recognizes full well that most of them did not intend to leave their identities behind. They entered ethnic enclaves where they clung to their language and cultural distinctiveness, with schools often featuring instruction in their native languages. Ultimately, many of their children and grandchildren did leave the enclave and become assimilated, but others did not. Irish-, Italian-, Chinese-, and Polish-American neighborhoods exist to this day in U.S. cities, where not only newcomers but also third and fourth generation descendants of immigrants cling fiercely to their ethnic identities.
Black Americans, of course, whether or not they wanted to assimilate, have not been permitted to do so. Schlesinger finds something artificial and ludicrous in North American Blacks identifying with Africa. But this is not a new creation of the cult of ethnicity, as Schlesinger would have it. Black Americans have always had an emotional identification with Africa. A newspaper called the Afro-American has been published in Baltimore for more than a century. Marcus Garvey's movement in the 1920s drew thousands of Black American demonstrators into the streets in cities throughout the U.S. in celebration of African identity.
There is a very credible body of opinion allied with the anthropologist Melville Herskovits that finds important original African influences in African-American culture today. In any event, the true degree of connection between African-American and African culture is largely irrelevant to the problems Schlesinger addresses. No-one can plausibly deny the existence of one or more Black American sub-cultures, whatever their origin or similarity to African cultures. And the emotional attachment of many Black people to Africa, regardless of whether it reflects a true cultural affinity, is understandable. People are within their rights in having such feelings.
By the way, I must take this opportunity to point out that the Hispanic people of the Southwest are not immigrants. They never crossed the border -- the border crossed them when the United States seized what is now Arizona, New Mexico and California from Mexico in a war of conquest. Through a more complicated chain of events, Texas also became part of the U.S. by conquest, after being part of Mexico. So many illegal immigrants from Mexico are just visiting their family members. To Chicanos, it is the Anglos who are "illegal".
But Schlesinger's fears of the "disuniting of America" don't really center on the existence of distinctive ethnic subcultures. He does not complain about the insularity of the Hasidim or the Amish, the bigotry and exclusionism of Bensonhurst and South Boston, or the arrogance of Anglo-Saxons. Rather, his concerns focus on certain demands being put forward by African-Americans and Latinos.
He complains at length about proponents of "Afro-centric" history. Quite rightly, in my view, he suggests that people who insist on the empirically unsupportable claim that the origins of Western civilization are to be found in sub-Saharan Africa are showing evidence of having colonized minds. There is no reason why African cultures have to be the source of Hellenic civilization in order for them to be worthy of respect and study in their own right.
But Schlesinger is really attacking a straw man (straw person?). The proponents of this version of Afro-centrism are a small fringe group. The debate over Euro-centrism in the curriculum focuses on whether non-European history and culture are to be ignored or included, not on whether Africa was the womb of Europe. The City College of New York's Leonard Jeffries, on whom Schlesinger expends many pages of vitriol, is widely regarded as a racist and a nut by proponents of multiculturalism of all races and ethnicities.
Schlesinger does offer a disclaimer: "Cultural pluralism is not the issue. Nor is the teaching of Afro-American or African history the issue; of course these are legitimate subjects." But in fact these are issues. The public school and college curricula do give short shrift to non-European cultures. Furthermore, traditional and current curricula present a distorted and romanticized view of European and Euro-American history, including a sanitized view or total evasion of Southern slavery, the genocide of the original inhabitants of North America, the oppression of women, U.S. imperialism, and many other subjects. These are the important current debates about the curriculum, not the ideas of Leonard Jeffries.
It is true, as Schlesinger lays out, that some advocates of curricular reform see the presentation of positive versions of the history of oppressed groups as conducive to children's self-esteem. Schlesinger asserts, on no particular grounds, that this is false, but again I think he misses the point.
I conducted a special study of the public schools in Fitchburg, Massachusetts a few years ago, and I do know that Latino, Southeast Asian and African-American students in that city, where Latin American, Southeast Asian and African-American history and literature were not taught, deeply resented the absence in the curriculum of what they saw as their history.
Whatever this may or may not mean for their self-esteem, it certainly affects their feeling of alienation from their school. Ironically, perhaps, it also discourages some of them from learning European and European-American history. The Latino students I spoke with would be more comfortable learning the language and history of their new country if they did not feel that their own heritage was devalued in the process.
On this subject, Schlesinger himself writes bad history. Remember the "Ethnic Heritage Studies Program Act" which I mentioned earlier? It doesn't bear the slightest resemblance to Schlesinger's caricature.
The Act authorizes grants and contracts for educational institutions to develop curricula relating to the history and culture of particular ethnic groups and their contribution to the American heritage. That is all it does. There is nowhere in the statute the slightest indication of "applying the ethnic ideology [whatever that is!] to all Americans, compromising the historic right of Americans to decide their ethnic identities for themselves," as Schlesinger puts it.
Nor does the Act ignore "those millions of Americans - surely a majority - who refused identification with any particular ethnic group." The Act does not enumerate ethnic groups, say who does or does not belong to one, or otherwise label or categorize anyone. The Act does not preclude anyone from applying to develop a curriculum in Anglo-American history or culture. As for Schlesinger's claim that the majority of Americans refuse identification with any particular ethnic group, he should be embarrassed. To Schlesinger, apparently, culture is something that other people have. "Americans", presumably, lack culture or ethnicity.
After all the ink wasted in demolishing Leonard Jeffries, this is Schlesinger's true agenda. He wants to promote his own idealized version of American history, one that will serve his political ends. He writes:
"Above all, history can give a sense of national identity. We don't have to believe that our values are absolutely better than the next fellow's or the next country's, but we have no doubt that they are better for us, reared as we are - and are worth living for and dying by." But more than that, he lets it escape that Western culture is, after all, superior to all others. "Whatever the particular crimes of Europe, that continent is also the source - the unique source - of those liberating ideas of individual liberty, political democracy, the rule of law, human rights and cultural freedom that constitute our most precious legacy and to which most of the world today aspires."
He invokes the case of Salman Rushdie: "What the West saw as an intolerable attack on individual freedom the Middle East saw as a proper punishment for an evildoer who had violated the mores of his group." The other cultures of the world, he tells us, are "collectivist ... in which loyalty to the group overrides personal goals .... There is surely no reason for Western civilization to have guilt trips laid on it by champions of cultures based on despotism, superstition, tribalism and fanaticism .... Certainly the European overlords did little enough to prepare Africa for self-government. But democracy would find it hard in any case to put down roots in a tribalist and patrimonial culture that, long before the West invaded Africa, had sacralized the personal authority of chieftains and ordained the submission of the rest."
It is news to me that democracy cannot take root in a culture that sacralizes the personal authority of chieftains and ordains the submission of the rest. I was taught in school (evidently incorrectly) that until only a few hundred years ago -- and till this century in some countries -- Europe was ruled by kings who claimed divine authority, sanctioned by a church that burned dissenters alive. I have also learned of many instances of "despotism, superstition, tribalism and fanaticism" in Europe in this century. Perhaps Schlesinger has forgotten about Nazism. And isn't Bosnia in Europe?
As for Rushdie and the Moslems, Schlesinger offers a sweeping indictment of diverse nations and cultures stretching from Morocco halfway around the world, based on a single stereotype. The fatwa issued by the late Ayatollah Khomeini against Salman Rushdie is viewed with horror and outrage by people throughout the Moslem world. But Schlesinger apparently feels he is under no obligation to learn anything about the diverse and complex Islamic world in order to write about it.
In fact, contrary to Schlesinger's fears, there is no "human instinct" of tribal hatred. Humans do not have instincts, beyond such elementary ones as sucking and grasping. Humans have culture. They learn to hate and fear each other, or else they learn to accept and value each other.
As a sociologist, I have had the privilege of studying, and personally experiencing, many cultures other than my own. From the point of view of other cultures, American culture is pathological in its individualism. What Schlesinger excoriates as "collectivism" is, to many others, a value given to community and mutual responsibility which is sadly lacking in our own culture. The ideas that we should care for one another and be willing to sacrifice for the good of the community are not ones to which we ought to feel superior.
It is particularly strange and contradictory that Schlesinger holds up tolerance for diversity as one of the supposedly superior values of Western civilization, since the main point of his book is precisely that we should not be tolerant of diversity. I must add that the United States has hardly lived up to this particular ideal with any consistency -- nor, for that matter, to Schlesinger's other great ideal of democracy -- nor are these ideals held in common by Americans today. Where do Pat Robertson, Pat Buchanan, and David Duke find their followers? Among immigrants, Blacks, Native Americans?
An honest American history curriculum would hardly serve to unite Americans, who would see their own ancestors enslaved, nearly exterminated, lynched, impoverished, or perhaps living lives of luxury and ease off the unremitting labor and deaths of millions. This seems not to have occurred to Schlesinger.
The creeds of individual freedom, democracy and diversity indeed have unifying power. They inspire me, though I would temper them with equal recognition of community and responsibility. But they are just that: creeds, not historical reality. Furthermore, these creeds demand that we do precisely what Schlesinger rejects: to value equally and to understand respectfully the diversity of the world's cultures, and to celebrate a diversity of cultures within our own nation. There is no single American culture, no single American ethnic identity, and there never has been one. To long for vanished days when, as Schlesinger claims, we were "one people", is to wish to fall back into a dream.