This time, I mean it literally. Can we please not set the damn clocks back? Mayer Hillman in the new BMJ (sorry, subscription only) argues that we could reap a major public health benefit if we'd leave well enough alone with the time in the fall.
"Research has shown that people are happier, more energetic, and less likely to be sick in the longer and brighter days of summer," he tells us. We can't stop the days from getting shorter, but we can spend more of our waking hours in what daylight there is, and more of them out of doors. By leaving the clocks ahead, we would have more "accessible daylight." Yes, more of us would wake up before dawn, but we spend the next hour or so in the house anyway, showering and shaving and eating breakfast and (for a dwindling few) reading the newspaper, etc. Today, I will leave my office at the end of the workday in the light, but one week from Monday, after we set the clocks back, it will suddenly be deep darkness.
That is always one especially depressing Monday. The result is less physical activity -- many people don't like to go out after dark, or let their children out to play. People will walk less and drive more. We'll all get fatter and sadder. And why? It just seems pointless.
So that's my vote. Keep "daylight saving time" year round.
Friday, October 29, 2010
Thursday, October 28, 2010
Back in the day when I was a co-blogger on the old Effect Measure, I invited people to think about how they would feel if state government took various emergency measures in response to an infectious disease outbreak -- restricting travel, isolating regions, quarantining people, that sort of thing. My post attracted a very weird, antisocial troll whose basic rant was that she was a constitutional law scholar and all this was settled under constitutional law and therefore I was a complete idiot and an arrogant fool even to ask the question.
My post was not about constitutional law at all. I wasn't asking people whether they thought the measures in question were constitutionally permissible, I was asking whether people personally would approve of them, and under what circumstances. I could not get this person to comprehend, or at least to acknowledge, that these were separate questions.
I am reminded of this strange experience by the lawsuits filed by various right wing extremist state Attorneys General against the Patient Protection and Affordable Care Act. Their fans, it should go without saying, don't know anything about the constitution and have no idea why the PPACA would or would not be constitutional. They just don't like it, or think they don't like it, because they don't understand the legislation either.
Sarah Rosenbaum, in the new NEJM, addresses both halves of this question. She says not to worry, the Act is clearly constitutional. Basically, it's a two part question. Congress has the power to regulate individual activity under the Commerce Clause if a) it's part of a "broader regulatory statutory scheme that permissibly regulates interstate commerce" and b) the provision in question -- in this case the individual mandate to obtain health insurance -- is "essential to the Act's larger regulation of the interstate business of health insurance."
Rosenbaum says both halves are a no-brainer. Not being a lawyer or a constitutional law scholar, as I hereby proclaim lest my long-ago troll show up again, I can't evaluate any of that. But I can say that the teabaggers' objections to the mandate are based on a fundamental misunderstanding of the problem of liberty. Liberty does not mean that you, Pemberton B. Throckmorton, can do whatever the hell you want, because you might just do something that deprives Hermione Q. Binglestrock of her liberty. In this specific case, if Pemberton is run over by a motorcycle, and does not have health insurance, Hermione, and all the rest of us, will be forced to pay for his very expensive trauma surgery, whether we like it or not.
This simple and obvious class of problem easily discredits libertarianism before it even gets started. Libertarianism is therefore inevitably arbitrary about exactly which liberties it is for and which ones it is against. Since it is fundamentally a nonsensical philosophy, it is merely an excuse for wealthy and powerful psychopaths such as the Koch brothers to rally dupes in favor of their own dispossession. If the Koch brothers are free to make billions by spewing carbon dioxide into the atmosphere, billions of people will not be free to live on the land they now inhabit.
Liberty involves tradeoffs and competing interests. It has to be managed through some sort of political process in order to share it around with some degree of justice and to enable the commonwealth to prosper. That means you might just have to give something up for the sake of the next person, and government might just have to mediate that. It's not hard to understand.
Wednesday, October 27, 2010
You've heard all about the Large Hadron Collider, and the Human Genome Project got plenty of press, not to mention the Hubble Space Telescope. The corporate media breathlessly hype some purported medical breakthrough every week, which promises to cure a dread disease in a decade or two.
Strange, then, that few people are aware of one of the biggest of big science projects, the National Children's Study. The study was first authorized by Congress in 2000 and has had some ups and downs securing funding since then, but now has in hand $414 million of the more than $2.5 billion (probably more like $5 billion) it will ultimately cost. I expect the reason it doesn't get a lot of attention is that it's epidemiology -- the mission of the study is to understand the environmental determinants of health, including the social environment, as they interact with people's genetic inheritance. It is planned ultimately to follow more than 100,000 people from the womb to age 21, recruited from 105 sites across the country carefully selected to produce a representative sample of the U.S. population.
Or, in their own words:
The National Children’s Study will examine the effects of the environment, as broadly defined to include factors such as air, water, diet, sound, family dynamics, community and cultural influences, and genetics on the growth, development, and health of children across the United States, following them from before birth until age 21 years. The goal of the Study is to improve the health and well-being of children and contribute to understanding the role various factors have on health and disease. Findings from the Study will be made available as the research progresses, making potential benefits known to the public as soon as possible.
Right now the study is in a pilot phase, testing recruitment strategies at about 40 sites. Obviously, you shouldn't hold your breath until they get final results, but findings will be published continually along the way as they emerge. I certainly think it's more than worth it -- this can go a long way toward teaching us how to maintain a healthy population, instead of stepping in after things go wrong and spending trillions of dollars to try to fix them.
But what are the chances of the study surviving Republican hegemony? This money is chump change compared to modest cuts in marginal tax rates for the obscenely wealthy, but I'm not counting on president Rand Paul to pay for it.
George Monbiot is a columnist for The Guardian. Today's bon mot:
Most of these bodies [funded by the Koch brothers] call themselves "free-market thinktanks", but their trick – as (Astro)Turf Wars points out – is to conflate crony capitalism with free enterprise, and free enterprise with personal liberty. Between them they have constructed the philosophy that informs the Tea Party movement: its members mobilise for freedom, unaware that the freedom they demand is freedom for corporations to trample them into the dirt.
Monbiot's weekly columns are archived here.
Tuesday, October 26, 2010
Okay, I didn't, I wasn't even born. However, before these researchers did this study, my colleagues and I already had pretty much the same idea. They were working on the problem of overweight and obesity, and we are working on medication adherence, but the issue is the same. Doctors do talk to their patients about these and other health related behaviors, but their usual approach is to scold, hector and generally threaten people with death.
That doesn't work, and it's not hard to see why. First of all, when somebody starts criticizing you, what are you likely to do? Push back, of course. Adults don't like to be judged, lectured and talked down to, so they're likely just to say, even if silently, screw you. Anyway, patients already know that their doctors think they should take their pills and maintain a healthy weight, and they've heard all about the consequences if they don't. Ergo, if they don't do those things, it isn't because they don't know any better and just repeating the information isn't going to solve the real problem, whatever it may be.
Equally obviously, whatever the real problem may be, it must take the form of conflicting motivations. Yeah, I don't want to get AIDS or diabetes, but there's something about taking the pills or skipping the doughnuts that I also don't like. What the doctor needs to do, instead of yelling at the person, is work with him or her to understand the issues and then see if there isn't a way to resolve those conflicting motivations that's going to be better for the person in the long term.
The method for doing this, developed in the field of alcohol and other drug abuse counseling and well supported by evidence, is called Motivational Interviewing. We have NIH funding right now to teach MI to doctors who provide HIV care, and our hope is that we'll see improvements in their patients' medication adherence as a result. (It's a dual PI project, my esteemed colleague is the contact person.)
The Pollak study is obviously encouraging for us. Long term, if we and others can reinforce, extend and deepen these findings, it will suggest that medical education ought to include a big component of communication and relationship skills training. Actually most people already believe that in principle, they just haven't had enough evidence about exactly what the doctors should be taught. We're finally figuring it out.
Monday, October 25, 2010
My partner in the world-girdling journalistic enterprise Iraq Today is on vacation in the gamma quadrant, so I have an extra blogitory burden this week and may have to scrimp a bit on Stayin' Alive.
Since I have Iraq on the brain, like it or not, let me say something about it. You, like the vast majority of Americans, have probably forgotten all about Iraq, but in March of 2003 the U.S., joined by a significant British force and a few camp followers from hither and yon (Don't forget Poland!) invaded a sovereign nation in what by all definitions and standards in effect up until that time was an illegal war of aggression. The ostensible justification for the crime against humanity was given by then U.S. Secretary of State Colin Powell before the U.N. Security Council. Mainstream European newspapers and broadcast news services reported within two days that every single non-trivial factual assertion in Powell's presentation was demonstrably false; but in the United States, it was universally portrayed as incontrovertible proof that Iraq was amassing Weapons of Mass Destruction™ to give to Al Qaeda to use to massacre Americans.
At the time of the invasion, Iraq was under the compulsion of UN Security Council Resolution 1441, which compelled it to cooperate with UN weapons inspectors. Iraq was doing so. According to head weapons inspector Hans Blix, it would have required only a few more weeks to complete verification that Iraq did not possess banned weapons, but the U.S. ordered the U.N. mission out of Iraq and invaded anyway.
While the U.S. and British administrations had given various rationales for the war, all of them evaporated after the invasion. The Iraq Survey Group the U.S. sent in to find the Weapons of Mass Destruction™ concluded that Iraq had suspended its banned weapons programs in 1991. Iraq had no links, operational or otherwise, with al Qaeda, and in fact al Qaeda and the secular Iraqi regime were enemies -- as were the regime and the Kurdish Islamist organization Ansar al Islam (now called Ansar al Sunna) which George W. Bush frequently invoked as an example of "terrorists" harbored by Saddam Hussein. (The organization was actually based in Iraqi Kurdistan, protected from Saddam Hussein by U.S. aircraft.)
As for spreading "democracy" to the Middle East, democracy does not issue from the barrel of a tank or a 500 pound bomb. To have a democracy, nations must have a political culture in which democracy can flourish. Iraq does not, and its chances got a lot worse as U.S. troops stood idly by while civil society dissolved and mass looting destroyed what was left of Iraq's infrastructure, already badly damaged by a decade of sanctions. The U.S. Viceroy then dissolved Iraq's military and banned all of its former civil servants from employment.
Today, Iran is the most influential foreign power in Iraq, China is the most active foreign nation in exploiting Iraq's oil resources, and Iraq remains physically devastated: without a functioning parliament or an elected government; its military and police forces controlled by sectarian parties close to the Iranian regime that routinely torture prisoners; with millions of refugees both internally displaced and living abroad; hundreds of thousands, if not more than a million, excess deaths from violence, disease, and lack of medical care; severe shortages of electricity, clean water and sanitation services; and daily bombings, assassinations, and mass murders in most of its major cities.
Oh yeah, it's cost the U.S. a couple of trillion dollars, more than 4,000 military lives and tens of thousands of permanently disabling injuries. And done severe damage to the country's image and reputation around the world.
So the obvious question is, "why"? Invading Iraq wasn't just a Republican obsession, by the way. Jonathan Schwarz notes that the Clinton administration was looking for a fake pretext to invade, as the Bush and Blair administrations continued to do. They needed a fake pretext because, duhhh, they didn't have a real pretext. So why was it so important to do?
Inquiring minds still want to know.
Friday, October 22, 2010
I'm referring to your body. No doubt you've heard about this projection from CDC that 1/3 of U.S. adults could have diabetes by 2050. As I always say, if there's one thing that's hard to predict, it's the future. So I wouldn't worry too much about the quantification. But it's a natural born true fact that we have a ridiculously high and growing prevalence of diabetes -- currently 10.7% of all people 20 and older, and over 23% of people 60 and older.
The diagnosis of diabetes, however, is just a signpost for something broader. The risk factors for diabetes are pretty much the same as the risk factors for cardiovascular and cerebrovascular disease (except that tobacco also contributes to the latter), while diabetes itself also compounds heart disease risk. But wait, there's more. Obesity, the major risk factor for diabetes, is also associated with some cancers.
But wait, there's more. I'm afraid I can't give you a link, but Marcus Richards and Carol Brayne in the new BMJ review an emerging paradigm of Alzheimer's disease. The association between the cognitive deficits of Alzheimer's and the neuropathological changes generally thought to be its diagnostic signs -- accumulation of beta amyloid protein and neurofibrillary tangles -- is actually pretty loose. Some people who have the signs on autopsy were cognitively intact at death; others with considerable dementia have limited pathological signs. On the other hand, there is a strong association between Alzheimer's and atherosclerosis and cerebrovascular disease; in many cases Alzheimer's may really be cerebrovascular disease.
And again, the same risk factors are at work: obesity, inactivity, "bad" cholesterol (dyslipidemia) which comes in part from obesity and inactivity and also eating the wrong foods.
I don't know about you but without my fabulous wetware, I got nuthin'. But other complications of diabetes -- blindness, loss of extremities -- don't appeal to me either.
This is in fact a national crisis, as urgent as any that we face. We just can't allow this to happen to us. The nation cannot possibly afford to take care of 100 million people with these awful diseases, none of us wants to be there or see people we care about living with stroke, dementia, heart disease, blindness, immobility . . . But what do you think are the chances of the kind of major national mobilization that would be required to save our lard butts?
Thursday, October 21, 2010
This is a slightly complicated story. A few days ago I got an e-mail promoting a film called "Hot Flash Havoc." This is purportedly one woman's story of the horror, the horror, of menopause and her long search for relief. Finally she found the one doctor who could end her suffering. Okay, here's a link, but I warn you, it's a very slick advertising site with an annoying sound track. I found it interesting that it also seemed to promote a particular physician named Alan Altman, who, as it turns out, also has a very slickly produced web site that promotes his practice and various products.
It turns out that Dr. Altman is a big-time promoter of hormone replacement therapy, and he claims that the Women's Health Initiative (WHI) Study, which concluded that HRT increases the risk of cardiovascular disease and breast cancer, is bogus. He apparently thinks that HRT will keep you young, sexy, horny and healthy and that anybody who says otherwise is part of some sort of conspiracy against womankind.
I thought this all rather odd so I wasn't going to say anything about it, but then along comes longer term follow-up analysis from the WHI, which concludes that HRT is even more dangerous than we used to think.
The WHI was halted after most participants had been receiving HRT for 5.6 years, because it became clear to the investigators that health risks were exceeding benefits, including an elevated risk of invasive breast cancer, cardiovascular disease and stroke, and pulmonary embolism. This was a major shock, because HRT had been heavily promoted for years as a fountain of youth. The elevated risk of breast cancer was in fact anticipated, but HRT was thought to be protective against cardiovascular disease based on observational studies. As the consumption of HRT plummeted in the wake of these findings, so did the population-wide incidence of breast cancer, an association which many people believe is causal.
Nevertheless, some people continued to believe that even though HRT increased the incidence of breast cancer, it resulted in a higher percentage being of a less dangerous variety. Not so, as it turns out. The new findings are:
Results In intention-to-treat analyses including all randomized participants and censoring those not consenting to additional follow-up on March 31, 2005, estrogen plus progestin was associated with more invasive breast cancers compared with placebo (385 cases [0.42% per year] vs 293 cases [0.34% per year]; hazard ratio [HR], 1.25; 95% confidence interval [CI], 1.07-1.46; P = .004). Breast cancers in the estrogen-plus-progestin group were similar in histology and grade to breast cancers in the placebo group but were more likely to be node-positive (81 [23.7%] vs 43 [16.2%], respectively; HR, 1.78; 95% CI, 1.23-2.58; P = .03). There were more deaths directly attributed to breast cancer (25 deaths [0.03% per year] vs 12 deaths [0.01% per year]; HR, 1.96; 95% CI, 1.00-4.04; P = .049) as well as more deaths from all causes occurring after a breast cancer diagnosis (51 deaths [0.05% per year] vs 31 deaths [0.03% per year]; HR, 1.57; 95% CI, 1.01-2.48; P = .045) among women who received estrogen plus progestin compared with women in the placebo group.
Now, it's important to understand here that while relative risk seems pretty substantial, the absolute risk we are talking about is indeed pretty small -- 2 additional deaths per year per 10,000 women. However, I should also point out that over the course of the follow up period the survival curves continued to diverge, so it is reasonable to suppose that over a longer time period the absolute risk will increase. And of course, you also need to remember that even at the observed rate of difference, over ten years we're talking 20 deaths per 10,000, and more over longer periods. This does not include the cardiovascular and other risks, which I don't have handy although I do know they are of comparable magnitude.
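The arithmetic behind that relative-versus-absolute distinction is worth making concrete. A minimal sketch, using the rounded per-year death rates quoted above (note that rounding to one significant digit makes the ratio look larger than the study's reported hazard ratio of 1.96; only the absolute difference is meant to be taken from this):

```python
# Breast cancer death rates from the WHI follow-up, as quoted above:
# 0.03% per year (estrogen plus progestin) vs 0.01% per year (placebo).
treated_rate = 0.0003   # annual death rate, HRT group
placebo_rate = 0.0001   # annual death rate, placebo group

# Absolute risk difference: small in any one year...
absolute_diff = treated_rate - placebo_rate

per_10k_per_year = absolute_diff * 10_000   # extra deaths per 10,000 women per year
# ...but it accumulates: over ten years, if the gap held steady,
per_10k_ten_years = per_10k_per_year * 10

print(per_10k_per_year, per_10k_ten_years)  # 2 per year, ~20 over a decade
```

That is the "2 additional deaths per year per 10,000 women" figure: trivial-sounding as a rate, less so multiplied across the millions of women who were taking HRT.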
As Peter Bach argues in the same issue of JAMA, while many physicians still feel that a short course of HRT around the time of menopause presents little risk, we don't really know this. It's true that the cardiovascular risk does not seem associated with HRT initiated right at menopause, but unfortunately timing of initiation is not related to the other risks.
I'm not personally familiar with the symptoms of menopause, but I do know many women including of course my own mother who have gone through it. From what I could observe, it didn't seem to be hellish enough to justify increasing your risk of dread diseases and early death just to relieve the symptoms. While I do believe that women should make up their own minds, I do not like slickly produced promotion of such a highly dubious product. I also have no reason to believe that the results of the WHI have been in any way misrepresented; nor do I believe it does a disservice to women to tell them the truth about scientific findings.
Wednesday, October 20, 2010
As you might surmise, I don't watch Glenn Beck, but according to some of his critics who I do follow he hawks various products that are supposed to help you survive the collapse of civilization. I'm not sure how you're supposed to microwave the freeze dried lasagna during the tribulations, but whatever.
Beck is of course a professional paranoid and his entire stock in trade is irrational fear. He doesn't specify the nature of the coming apocalypse, but I imagine the idea is that Obama declares the Islamocommunofascisto One World Order, sends in the Mexican Army to seize your guns, followed by the Martyrs of the New Caliphate to impose Sharia law on all of North America, whereupon the Patriot Militias and the Oath Keepers rise up and the economic infrastructure gets wiped out in the crossfire.
I don't think all that is very likely but I do sometimes contemplate the ineffable problem of how much to invest in preparing for low probability, very high impact events. The giant rock from space is sufficiently improbable that I don't lose sleep over it. (Although a Tunguska scale event could happen any time, it is extremely unlikely that it would occur over a densely populated area, and even if it did, the impact would be local.) But there are some fairly horrifying possibilities that people don't like to think about but are worth some consideration.
The prospect of nuclear war, sad to say, has not evaporated. The biggest worry would seem to be events spiraling out of control in the Asian subcontinent, ending in a nuclear exchange between India and Pakistan. The catastrophe would not only be regional. It would have global climatic effects causing crop failures for more than one year. We absolutely must rid the world of nuclear weapons. Let's talk about that, and make it the urgent political issue it must become.
Then there is the perfectly plausible, though unquantifiable probability of a novel pathogen causing a global epidemic and mass die off. HIV proves that we can't necessarily figure out how to control a pathogen quickly. Fortunately, HIV is not easily transmissible, but something that is could indeed produce a new Black Death. This time it wouldn't be limited to a single continent. We plod along with virological and bacteriological research and slowly build public health infrastructure. Maybe we'll be ready for whatever comes along and maybe we won't. There's really no way to say whether we ought to invest more in this particular worry. We should, however, certainly do more to stop the evolution of antibiotic resistance.
Then there are the Yellowstone and Naples supervolcanoes. I won't go into that, you can google it if you're interested. Could another financial crisis cause the economy to grind to a halt and then just sit there in gridlock, with commerce paralyzed? (Think of Kurt Vonnegut's novel Galapagos, if you have read it.)
Notice I don't include terrorism on this list. It's already going on around the world all the time, and it adds up to less death and destruction than ordinary crime. In the march of civilization, it's background noise, except for the immense social amplification it gets. Even a single loose nuke could only destroy one city. Major bummer, but not the time to break out your Survival Seeds.
Global climate change and peak oil are indeed happening. They will produce what Jim Kunstler calls the Long Emergency. Civilization will muddle through with a lot of pain and dislocation, but the grocery store and the monetary economy will still be there, even if you personally are poorer, or perhaps drowned. I'm certainly worried about the ongoing plutocratic destruction of our republic, aided and abetted by Beck and his fellow lunatic fringers, but that's a different category of catastrophe.
The point of this no doubt depressing post is that we can't assume that the basic structure of our lives will be at all similar ten years from now, or even tomorrow, and that any and all long range plans should be viewed with great skepticism. On the other hand, there could be some astonishing technological breakthrough providing unlimited clean energy and everything will turn wonderful, who knows?
So my advice is, we need to do something about the problems we clearly can do something about -- viz. nuclear weapons -- and keep trying to learn more about the problems we can't necessarily fix yet. Meanwhile, keep a week's worth of emergency supplies around if you're the worried type -- ordinary canned beans and a few gallons of water are fine -- and fight to keep your republic. I expect to be over my flu like illness tomorrow whereupon I will return to sanity.
Tuesday, October 19, 2010
I have a bit of a cold, no big deal. According to the National Institute of Allergy and Infectious Diseases (and I don't know where they get this from) there are over 1 billion cases of the common cold in the U.S. each year, which means the average person gets about 3. That is a bit more than a nuisance, since presumably it means a lot of lost productivity and possibly some more serious consequences if people who are groggy from too much Nyquil try to drive a car or operate an excavator.
I came to work today anyway, although I might not generate quite as much dazzling brilliance or baffling bullshit as I do on a typical Tuesday. But naturally, I got to thinking about the FAQs concerning this most commonplace of human afflictions.
Question #1: They do a better job of curing cancer than they do of curing the common cold. Why are scientists so stupid?
Answer: Actually, it doesn't really follow that just because a condition is relatively mild, it ought to be easier to cure. I am also a victim of male pattern baldness, and I have no hope about that. But there are additional reasons. There are more than 200 viruses known to cause symptoms we call a cold; half of all colds are caused by viruses we don't even know about! Some people will say that technically, a cold is a disease caused by a class of viruses called rhinoviruses. But really, since nobody bothers to try to find out what virus you actually have, a cold is whatever seems like a cold. It would probably be pretty easy to come up with a vaccine against almost any given one of these viruses, but why bother? You'd still catch the other 199, plus any of the unknown multitude.
Since viruses are not actually alive, but rather hijack the cellular machinery to make copies of themselves, antiviral drugs generally have to interfere with biological processes that are there for good reasons, ergo they have side effects. It's worth taking antiviral drugs for HIV, for example, but it would not be worth taking such toxic drugs for a mild, self-limiting illness.
Finally, since colds are, as in my case, no big deal, it just isn't worth the huge investment in biomedical research we put into more serious diseases.
Question #2: What should I do about it?
Answer: Not much. None of the over the counter medications marketed for cold symptoms work very well. Suppressing your cough is not actually a good idea -- you cough for a reason, to clear out the gunk. Cold symptoms are mostly the body's response to the infection -- the fever and mucus secretions are part of how your body fights it. Probably best to leave well enough alone, for the most part.
Hot liquids will relieve the symptoms. Ginger tea is good. Get fresh ginger, cut it up, and boil the living shit out of it. Drink it hot, with lemon and honey if you like.
People have thought that various nostrums -- zinc, echinacea, vitamin C -- work against colds but more careful investigation has so far found, so sorry, they do not. So hold onto your wallet.
Whatever you do, don't take antibiotics. Which brings us to . . .
Question #3: What is the worst harm done by the common cold?
Answer: I would say, inappropriate treatment, particularly prescribing of antibiotics for cold symptoms, or use of over the counter antibiotics in countries where they are available. That helps to create antibiotic resistant bacteria (which do not cause colds), which constitute a grave danger to humanity. (I'm not exaggerating.)
Then there is all the money wasted on marginally effective remedies. Oh yeah - Do not, repeat DO NOT give over the counter cold remedies to young children. Sayeth NIH: "However, do not give aspirin to children. And do not give cough medicine to children under four."
Question #4: So, how can we prevent colds?
Answer: While hand washing was greatly overhyped last winter for influenza control (flu is mostly transmitted by aerosols, not by hands or "fomites," inert surfaces on which virus resides), it really works to prevent transmission of cold viruses. If you have a cold, do others a favor and wash your hands frequently. Cover your coughs and sneezes, but not with your hand, obviously, if you can help it. Stay home if you're really symptomatic and spewing viruses everywhere.
Other than that, live with it. It's the human condition.
Monday, October 18, 2010
I spend a lot of my curmudgeon budget trashing the "medical science" that gets published and the stuff doctors do to us as a result, and obviously JI's work backs me up, but I feel a need to be a little bit contrarian to our contrarianism. While it is true that a lot of published findings turn out to be wrong, the good news is that they turn out to be wrong. In other words, as flawed as the scientific enterprise may be, the truth ultimately will out, because it's out there. Sooner or later, it rises up to bite you in the face.
Scientific findings are often wrong because, among other reasons:
- If you test a whole lot of hypotheses, some of them will appear to be true just by coincidence. If you tried it again, the association would not be found. The "p value" which is usually used as a test of statistical significance ignores a profound epistemological fact, Bayes Theorem. Most of what you can imagine is highly unlikely; you must take that into account before believing anything.
- Once a finding is published, it's unlikely that anyone will repeat the experiment, because there's no glory in it. Replications of earlier studies are hard to get published, at least in prestigious journals, and confirmatory studies are equally hard to get funded -- even though proving a well-known result wrong would be a big coup -- so they don't often happen.
- Investigators usually want a particular result. They want their theories to be productive, which means they want their hypotheses to be confirmed. Unconscious bias can affect every step of the research process, from formulating the questions, to the research design, to subject recruitment and the conduct of the intervention, to measurement, to interpretation of results. Fraud is seldom involved. It's just too easy to fool yourself.
- Once an idea gets entrenched in the research and clinical communities, it's tough to dislodge. People get used to thinking and acting on the basis of certain propositions, and knocking them out of people's heads requires at least a 2 x 4.
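To put a number on the first point, here's a minimal sketch of Bayes' theorem applied to research findings. The specific figures (a 10% prior, the standard 5% significance level, 80% power) are my own illustrative assumptions, not from any particular study:

```python
# Probability that a statistically significant finding is actually true,
# via Bayes' theorem. All input numbers are illustrative assumptions.
def prob_finding_is_true(prior, alpha=0.05, power=0.80):
    """prior: fraction of tested hypotheses that are actually true;
    alpha: false positive rate (the usual p < 0.05 threshold);
    power: probability a true effect reaches significance."""
    true_positives = power * prior
    false_positives = alpha * (1 - prior)
    return true_positives / (true_positives + false_positives)

# If only 1 in 10 tested hypotheses is true, a "significant" result
# is still wrong more than a third of the time:
print(round(prob_finding_is_true(0.10), 2))  # 0.64

# If you're dredging long-shot hypotheses (1 in 100 true), most
# "significant" findings are false:
print(round(prob_finding_is_true(0.01), 2))  # 0.14
```

The point is that a p value below 0.05 tells you much less than it seems to when the hypothesis was implausible to begin with.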
Still, when a finding just isn't true, we will know it sooner or later because as it is applied in further studies that build on the idea, or implemented in clinical practice, it will emerge that it just isn't working. Predictions based upon it won't come true, patients won't do any better, somebody will test a better idea and it will succeed spectacularly.
What I naturally fear is that people who read the Atlantic article, or for that matter this blog, will decide that so-called scientific medicine is all BS and you might as well go to the homeopath. Wrong, wrong, wrong. Your doctor might be wrong about some things, but your doctor is right about a hell of a lot more than the homeopath is right about, which is absolutely nothing. Science is self-correcting; homeopathy is not.
So what should you do? Basically, beware of the latest remedy or the supposed big breakthrough. Unless your situation is urgent, it's probably best to wait until a drug has been on the market for a few years before you take it. Be conservative about medical intervention. Less is often more. Make sure your doctor understands the basic statistics behind treatment decision making -- number needed to treat, positive predictive value of a test, absolute versus relative risk. Believe it or not, most of them don't. Do you? If there's a demand for it, I'll discuss these issues here (again).
Friday, October 15, 2010
In the circles where I travel this is well-digested news, but if you're a normal person you may be hearing for the first time about this meta-analysis of trials of a drug marketed as an anti-depressant, called reboxetine. I say "marketed as an anti-depressant" because as it turns out, it is not an anti-depressant.
This is one of those rare occasions on which the U.S. comes out ahead of the EU on drug approvals. Reboxetine was never approved in the United States, but it was sold in Europe. Published data showed it to be the equal of the most popular class of anti-depressants, selective serotonin re-uptake inhibitors such as fluoxetine (Prozac). (Actually that isn't saying much -- as you know if you've been reading along for a while, SSRIs don't actually work very well, and really can only be justified in cases of fairly severe depression. But I digress.)
It turns out that of all the clinical trials that had been conducted on reboxetine, Pfizer had published data on only 26% of patients. After the German equivalent of NICE blackmailed them into giving up the rest of the data, this analysis found reboxetine a) to be completely useless and b) to have a worse side effect profile than SSRIs. (In these trials, SSRIs were only slightly better than placebo, as we would expect by now, but again I digress.)
It is likely, though not publicly known, that the drug was not approved in the U.S. because regulators had access to some of this unpublished data. And since 1997, all clinical trials funded by the U.S. government or sponsored by companies seeking approval for a drug have to be registered with the feds; no more hiding unfavorable results from regulators.
But as extensive commentary in BMJ, from multiple perspectives (off limits to non-subscribers, I'm sorry to say) makes clear, that isn't the only problem. Here's my bullet list, for what it's worth:
- Even if regulators have access to results of unpublished trials, doctors don't. The drug companies can still bias the literature and, as long as their products pass the minimal standards for approval, make them look a lot better than they really are. Indeed . . .
- Previous meta-analyses of reboxetine, based obviously only on published results, made it seem jes' fine. That's not the fault of the analysts (for the most part -- these authors do catch one mistake), but it's inevitable when we don't have access to all the trials.
- Even when we have the results of trials the authors choose to report, we can still be misled by what they do and do not choose to tell us. If you were lucky enough to be around for my statistics primer, you know that if you make a whole lot of comparisons, some associations will appear just by chance. You can always dredge through your data and find that a drug seems to work for some arbitrarily chosen sub-group of patients, or on some endpoint or other. Registered trials have to specify their end points in advance, but publications are often based on end points chosen after the fact -- and this is not always disclosed.
- Trials can be designed in the first place with a big fat thumb on the scale. Often the dose of the comparison drug is deliberately made too small. Or people who drop out of the trial -- perhaps because of side effects, or because the drug wasn't working -- are not properly accounted for. Or the assessment of effects is done by people who aren't really blinded to the treatment. Or whatever. Sometimes these flaws are detectable by critics who have access to the original raw data, but usually nobody does. It's proprietary.
- Drugs are approved and marketed based on short-term studies of carefully selected people. But then they are used over the long term, in the real world with all sorts of people. Far too little investment -- and often none at all -- is made in following up to see what really happens when drugs are widely prescribed.
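On the subgroup-dredging point above: even with honest data, the arithmetic of multiple comparisons guarantees spurious "findings." A one-line sketch (the 20-subgroup figure is my own arbitrary illustration):

```python
# Suppose a drug does nothing at all, and analysts test it in 20
# independent subgroups at the usual p < 0.05 threshold. What is the
# chance that at least one subgroup appears to "respond"?
# (20 subgroups is an arbitrary, illustrative choice.)
alpha, n_subgroups = 0.05, 20
p_at_least_one_false_positive = 1 - (1 - alpha) ** n_subgroups
print(round(p_at_least_one_false_positive, 2))  # 0.64 -- better than a coin flip
```

In other words, a completely useless drug will usually yield at least one publishable-looking subgroup result if you slice the data enough ways. That is why pre-specified endpoints matter.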
I could go on -- that isn't all of it. But the bottom line is, if companies that stand to make big bucks not only pay for the studies, but also control their conduct, analysis of the data, and what gets published, we're going to have these scandals again and again. The public is tired of hearing about them by now and probably just tunes them out, while the misleading TV ads still have them popping the pills with faith in a miracle cure.
Don't get me wrong. There are pharmaceuticals that have been around for a long time, whose safety and effectiveness and the limitations thereof are well understood, and you and your doctor may reliably agree that it makes sense for you to take them. But too often, you really don't know, and there are far too many nasty surprises. No, you can't trust drug companies, their executives, or for that matter the research scientists who take their money, either to actually do research or just to put their names on work they didn't really do and papers they didn't write. And no, the professors don't get in trouble for it. At least not yet.
Thursday, October 14, 2010
I'm not sure what the best short diagnostic label is, but it's a kind of pervasive hubris. We're the Greatest Country on Earth, whatever we do is right even if it's wrong when "they" do it, and above all, we know no limits. Petroleum will never run out, the atmosphere and the oceans will absorb our waste forever, our empire will never decline, our resources are infinite and if you don't have everything you need, it can only be your own damn fault.
And oh yeah -- we will never die.
This is largely behind the aversion to even knowing, let alone making use of the information, about the relative value of medical interventions. Since our resources are infinite, it devalues human life even to ask the question. My colleague Peter Neumann (Neu-mannnnnnnnn!) and Milton Weinstein discuss the practical effect of this psychopathology in the new NEJM. (And yes, this is open access.)
They start by quoting from the health reform act:
The Patient-Centered Outcomes Research Institute . . . shall not develop or employ a dollars per quality adjusted life year (or similar measure that discounts the value of a life because of an individual’s disability) as a threshold to establish what type of health care is cost effective or recommended. The Secretary shall not utilize such an adjusted life year (or such a similar measure) as a threshold to determine coverage, reimbursement, or incentive programs under title XVIII.
They give various arguments as to why this is just silly, mostly centering on the very obvious fact that resources are, in fact, finite, and if we want to get the most benefit from our finite resources we need some way of knowing what gets us the most value for the money we spend. Go ahead and read it.
But I want to cast the argument in terms they don't make entirely explicit. QALYs and similar measures don't really discount the value of a life because of the individual's disability. As I pointed out before, if they did, Stephen Hawking would indeed be dead, since the UK's NICE does indeed use QALYs to authorize treatments by the National Health Service.
You need to understand how these are used in cost utility analysis. They are not applied to individuals to determine whether a given person will get a treatment. I.e., there are no death panels, or anything remotely similar. Rather, they are applied to particular treatments or preventive interventions, to compare their value with each other.
If I happen to have a disability or a chronic health problem, that is completely irrelevant to the question of whether I, as an individual, will be a candidate for a brain transplant or whatever the question may be. What is relevant is the average benefit to a population of people with the given disease who may receive a brain transplant vs., say, cognitive behavioral therapy. If my disability is unaffected by the procedure, it just doesn't enter into the equation. At all. If the intervention makes it worse, or potentially causes some new form of disability, I would want to know that before deciding whether to have it or not.
In fact, as Peter points out, in many situations very sick or disabled people stand to benefit the most from a treatment. If it doesn't extend their life at all, but just makes their lives better, then the only way you can show the value of the treatment is with QALYs. By refusing to adjust life years for quality, you actually end up depriving people with disabilities of potentially valuable treatments.
To take the example of Trig Palin, whose mother (?) frequently invokes him in this context, he is a baby. Therefore any medical intervention that benefits him will have a very high weight because he has a long life expectancy. That he has Down syndrome is completely irrelevant. Obviously any intervention that exacerbated his cognitive challenges would be worth less, but it ought to be, no? But of course we know that anything his mother says is a fortiori completely idiotic.
Now, we must concede, as does Peter, that any metric that compares the value of treatments will make treatments appear less valuable for people who have a short life expectancy, which includes the very old. Given the choice of spending the same scarce dollars to cure Trig of cancer vs. a 99 year old, well, how would you choose? But using QALYs instead of raw years of life actually benefits older people, whose lives cannot be extended forever but whose quality of life may well be improvable.
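A worked example may make the logic concrete. Suppose two treatments for the same condition, with numbers I've invented purely for illustration: one only improves quality of life, the other only extends it. Raw life-years can't even see the first treatment's benefit; QALYs can:

```python
# Cost-utility comparison of two hypothetical treatments.
# Quality weights run from 0 (dead) to 1 (perfect health).
# All figures below are invented for illustration.

def qalys_gained(years, quality_with, quality_without):
    """QALYs gained = years lived x improvement in quality weight."""
    return years * (quality_with - quality_without)

# Treatment A: adds no extra years, but raises quality of life from
# 0.5 to 0.8 for 10 years. Measured in raw life-years, its benefit
# would be zero -- it would look worthless.
gain_a = qalys_gained(10, 0.8, 0.5)   # 3 QALYs
cost_per_qaly_a = 30_000 / gain_a     # $10,000 per QALY

# Treatment B: adds 2 years of life at quality 0.7 (vs. 0, not alive).
gain_b = qalys_gained(2, 0.7, 0.0)    # 1.4 QALYs
cost_per_qaly_b = 70_000 / gain_b     # $50,000 per QALY

print(round(cost_per_qaly_a), round(cost_per_qaly_b))  # 10000 50000
```

Notice that the quality-only treatment comes out as the better buy, which is exactly the point: the QALY is what gives quality-of-life improvements for sick and disabled people any standing in the comparison at all.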
Now if we could only find a way to get this common sense through the concrete skulls of the American public . . .
Wednesday, October 13, 2010
I think this link probably won't work for you, although it should get you to the abstract. I can't tell because I'm sitting at an all powerful computer that has JAMA totally pwn3d. Anyway, I'll tell you what it says.
This is physician Michael Steinman and pharmacist Joseph T. Hanlon telling the story of an apparently real 84 year old man with moderately advanced dementia who showed up in Dr. Steinman's office with prescriptions for 13 different medications and 16 doses per day. His wife pretty much had to deal with all this. Dr. S fairly quickly figured out that he didn't need most of it, in fact most of it was doing more harm than good. He was on pain medications that he had been prescribed after surgery, but he didn't need them any more. Once he stopped them, his walking improved and he stopped falling. He was on heart medication that he also didn't need. In fact after he stopped it he started going to the gym. He was on iron supplements. Not needed.
It turns out that older folks are typically taking at least one unnecessary drug. Furthermore, they are often taking medications that might make sense for younger people but are contraindicated in the elderly. They might do well on an alternative, or just be better off with nothing at all. As I believe I have told y'all, my father was twice -- twice -- given antipsychotic medications while he was in nursing homes with dementia, which left him a drooling vegetable and threatened his life. This was even though my mother had signed an order forbidding them to give him these drugs, which carry a black box warning telling physicians not to prescribe them to elderly people with dementia.
Finally, it doesn't necessarily make sense for people with advanced dementia to be taking drugs for conditions like high cholesterol or diabetes. People and their caregivers need to decide on the goals of treatment and whether there is anything to be gained.
With increasing numbers of prescriptions, not only do we get adverse side effects, a lot of cost, and a lot of hassle, but it also turns out that people are much less likely to follow their scheduled doses -- what we call being "adherent" -- meaning they are less likely to take the pills they ought to be taking.
The mystery is why it's so hard for doctors to figure this out. It requires a major article in JAMA to remind them of something that ought to be a basic part of the repertoire of doctorly skills. Doing this right might not make a huge cost saving but it wouldn't hurt in that department either.
You, as a consumer, need to be very alert to this as well. If you are getting along in years, or have loved ones who are, make sure their doctors review all the medications the person is taking, understand the whys and wherefores of each one, and eliminate any that aren't really necessary or appropriate. It shouldn't be that hard.
Monday, October 11, 2010
It's a destructive cult. Via Joe Romm I come across this National Journal piece pointing out what we already know if we have been paying attention -- "Republicans in this country are coalescing around a uniquely dismissive position on climate change. The GOP is stampeding toward an absolutist rejection of climate science that appears unmatched among major political parties around the globe, even conservative ones."
Of the 20 Republican challengers for Senate seats surveyed by NJ, only one believes that anthropogenic global warming is happening. (At least that's what they say. Who knows what they really believe, but all the Koch money talks.)
You already know that Republicans in general do not believe in evolution, do believe that if very wealthy people have to pay 35% in top marginal tax rates instead of 32% they will stop doing whatever it is they do that supposedly makes the economy hum, believe that unemployment is caused by unemployment insurance, believe that Barack Obama is a Muslim socialist who was born in Kenya, and think that providing health insurance to people in their 50s means the government will murder people in their 70s. I could go on.
I can't imagine why anybody would vote for these lunatics. If enough people do, we're totally, irredeemably screwed. This country faces huge problems that require aggressive, rational action by government. We cannot afford to hand it over to a cult of malignant clowns.
Friday, October 08, 2010
By way of BMJ I came across the latest publication from the International Centre for Science in Drug Control Policy, on cannabis prohibition in the U.S. (Yes, they spell it Centre, they can't help it, they're British.) Did you know that the U.S. drug enforcement budget, adjusted for inflation, increased from $1.5 billion in 1981 to more than $18 billion in 2002? And did you know that from 1990 to 2006, cannabis-related arrests increased from 350,000 to 800,000? Over the same period, seizures increased from 500,000 pounds to two and a half million pounds?
Ahh, but the price fell.
All those people in jail, made into criminals; violent criminal enterprises destroying Mexican society; massive law enforcement resources, diverted. To what end? Didn't we learn anything from prohibition of alcohol, which happens to be a more addictive and more damaging drug than cannabis?
Regulate it, tax it, grow it in the U.S.A. We'll have a more peaceful, law abiding, richer society; a lot fewer people in jail and living with criminal records; actually have better luck restricting access by minors; painlessly increase government revenues and reduce wasteful spending; and everybody from the dirtiest fuckingest hippie on earth to the Cato Institute and Andrew Sullivan will donate to your re-election campaign. And of course, if the people in your state or county believe Jesus wants pot to be illegal, you don't have to go along.
So what are we waiting for?
Thursday, October 07, 2010
I've long been a skeptic about the relative contribution of health care to health and longevity. I haven't changed my mind in the slightest that social determinants of health are ultimately more important - I'd rather not have diabetes or lung cancer or heart disease in the first place than get medical treatment for any of them, obviously. We should spend less on health care and more on other good stuff.
What I am about to tell you doesn't challenge that, in fact it supports it, but it does put health care front and center in the story of the poor health of Americans compared with people in other wealthy countries. Muennig and Glied in Health Affairs (sorry, I'm not sure whether you have full text access because I'm permanently signed in via cookie)* use trends over time to sort out the effect of health care from other determinants in explaining our poor life expectancy.
This is a complicated story so I'll have to be fairly elliptical for this blog post. As the banner at the top of this page says, we spend far more on health care than other countries but we get less for it. As M&G tell us, the U.S. now ranks 49th for life expectancy at birth - a severe decline since the 1950s, and we now spend more than twice the median of other wealthy countries. But some people object that the life expectancy comparisons are flawed because of differing definitions, e.g. fetal deaths might be counted as an infant death. Others attribute the gap to social determinants, such as smoking, obesity, motor vehicle crashes, etc. and argue that it's even possible our higher spending on health care reduces the gap that would otherwise exist.
Well, it turns out that if you just look at 15 year survival of people at ages 45 and 65 -- eliminating the infant mortality problem, getting to ages where medical care becomes more important, and eliminating problems with coding of death for the very old -- the story looks pretty convincing. In 1975 we were already in last place for 15 year survival of both sexes at age 45, but we did pretty well for survival of older people. Our health care spending was above average but not an outlier.
Over the next 30 years survival rates and health care spending went up everywhere, but our performance in both categories grew dramatically worse in comparison to other nations: "By 2005, . . . the United States had become a high outlier in spending and a low outlier in 15 year survival." They find that non-Hispanic whites taken separately did worse in survival gains than people in any other country, so it's not just our racial inequality that drags us down.
They also rule out some of the usual suspects. Smoking rates in the U.S. are actually lower than in comparison countries and also fell faster over the period than in most of them. Yes, we are more likely to be obese than people elsewhere but our obesity rates haven't been growing faster, which would be necessary in order for obesity to explain the trends. And the share of deaths in the U.S. attributable to motor vehicle crashes, while comparatively high, has been falling.
They conclude that maybe we should actually blame high health care spending for the mortality gap:
It is possible that rising US health spending is itself responsible for the observed relative decline in survival. There are three reasons why this might be so. First, as health spending rises, so, too, does the number of people with inadequate health insurance. Notwithstanding the uncertainty surrounding the impact of lacking insurance on the health of the US population, higher spending could be reducing survival by decreasing the number of insured people.
Second, rising health spending may be choking off public funding on more important life-saving programs. Health spending now constitutes a sizable proportion of the federal budget. At current spending levels, investments in public health, education, public safety, safety-net, and community development programs may be more efficient at increasing survival than further investments in medical care.
Finally, unregulated fee-for-service reimbursement and an emphasis on specialty care may contribute to high US health spending, while leading to unneeded procedures and fragmentation of care. Unneeded procedures may be associated with secondary complications. Fragmentation of care leads to poor communication between providers, sometimes conflicting instructions for patients, and higher rates of medical errors. For example, two separate physicians are probably more likely than a single primary care provider to prescribe two incompatible drugs to a single patient.
Talk about your death panels.
*And apologies for the bad link yesterday -- my university's computer network is so powerful that it gets me into JAMA without having to log in or go through our library proxy server, so I didn't know it was closed access. I'll have to find a way to descend among the mortals.
Wednesday, October 06, 2010
Lawrence Gostin, in the new JAMA, discusses the Supreme Court's recent holding that the Second Amendment confers a personal right to own firearms (District of Columbia v Heller), and furthermore that the Second Amendment applies to the states as well as the federal government (McDonald v Chicago). (Sorry, no free full text access, you just get the first 150 words.) While he does not use the term, Gostin's analysis amounts to a total demolition of libertarianism and of right wing philosophical and constitutional arguments more generally.
Gostin points out that the "right to bear arms" is "antithetical to social order and public safety." The empirical facts are that we are shooting ourselves and each other at a rate of 35 per 100,000 population each year, more than 28% of those injuries are fatal, and more than half of all firearm fatalities are suicides. The firearm death rate for children -- defined here as younger than 15 years -- is 12 times higher than the average rate in other industrialized countries. A gun kept in the home is far more likely to end up killing or injuring a family member than it is ever to be pointed at an intruder.
One cannot argue that our liberties are enhanced by a right to gun ownership and gun carrying without sensible regulation or restrictions because when you exercise such a right, you are likely to deprive me of my own right to live in safety and be free of coercion. The putative right is quite different from the others enumerated in the Bill of Rights, which as Gostin puts it are "fundamental to the fulfillment of personal autonomy, dignity, and political equality."
The original intent of the Second Amendment is much contested but only because "gun rights" advocates argue tendentiously. The plain meaning, and evident intention of the amendment, is to protect the states from having their militias disarmed by the federal government. By interpreting the Amendment as a restriction on the rights of the states to regulate gun ownership and use, the Court has turned it upside down. This is entirely typical of conservative judicial philosophy, which argues for states rights when it wants to restrict the federal government -- as in school desegregation, environmental protection, etc. -- and against states rights when it wants to restrict the states, as in this case.
As Gostin cogently argues, the appropriate regime of firearm regulation depends very much on the local social context. The considerations are entirely different in rural areas where substantial numbers of people hunt and criminal violence is comparatively uncommon, than in urban areas where firearm-related recreation or economic uses are essentially absent, and gun-related crime is a major problem. There can hardly be an issue for which state and municipal control is more obviously appropriate and for which democratically elected lawmakers are more appropriate arbiters than judges.
I have a very strong liberty interest in being free to walk around town unarmed in reasonable safety. Alito, Roberts, Scalia, Thomas and Kennedy have deprived me of that most important right. As we know, their true concern is not your liberty, or mine, but the profits of firearms manufacturers. Turning the radical, anti-liberty faction that currently controls the Supreme Court into a minority should be a foremost concern of voters.
Monday, October 04, 2010
Before I get to the enormously important subject of today's post, I'll let the title do double duty by letting you know that I showed up today for my first official day as a new faculty member at a prestiggeeous university and somehow they hadn't entered the fact of my existence into the right computer. As a result, I can't get a badge, including access to my own office; a computer account; or a parking space. Without a computer account, I can't actually do anything. (I'm using my wireless modem and my own portable computer to do this post.) I have articles people have sent me that I need to read, but I can't print them. I can't upload my files from my old job to the network here, and I don't have access to any software. I could do some writing, but what I really need to do at this point is edit some documents and I can't do that.
In the old days, if I needed to read something I would have gone to the library - you know, an actual physical building with books and magazines and shit. No more. I would have written on yellow legal pads and had my secretary type it. There wasn't any software so if I wanted to analyze data I would have shuffled index cards around or done calculations with a pencil. Then it wouldn't matter if I existed in 'the system' or not. They're working on it, but in this modern world, until about 4:00 this afternoon, I'll be a nonperson.
Oh well. A commenter (anonymous -- now don't be shy) wonders what other experiments American mad scientists did on vulnerable or despised people. Actually there have been quite a few. This blog post tells much of the story. After the Nazi scientists who had used concentration camp prisoners as experimental animals were properly tried and convicted, pretty similar outrages continued in the U.S. for a long time. Prisoners were available and easy to recruit. They would be offered points for good behavior in exchange for being infected, poisoned, having their testicles irradiated (yes) or otherwise put at serious risk. Perhaps the most astonishing case is that of the Fernald "School" in Massachusetts, an institution for people with retardation, where children were fed radioactive cornflakes.
In 1977, legislation put an end to these practices and no, they don't continue as far as I know. Every institution that does federally funded research on human subjects must have an Institutional Review Board that assures that research is done ethically. In my experience they are, if anything, overzealous at times. But drug companies often use for-profit IRBs, and it is questionable how careful those are, since they don't want to annoy their customers. They can't get away with anything too blatant, but informed consent forms often read as though they were written by lawyers concerned with covering the asses of the experimenters, and are not really comprehensible to the subjects. There are semi-professional guinea pigs who "volunteer" for research projects continually in order to receive stipends. The stipends are supposed to be compensation for time and trouble, not an incentive, but they are real money and could override desperate people's other considerations. So yes, there are still problems; the point of my previous post is that, nevertheless, things have at least gotten somewhat better.
It's okay to think that sometimes, about some things.
Friday, October 01, 2010
In case you have been feeling sorry for yourself lately in these tough economic times, via Brad Delong I find this useful corrective. Just enter your annual income and see where you stand compared to the world's population. I'm only about the 43 millionth richest person in the world but if you can grasp orders of magnitude you have already noticed that puts me well inside the top 1% of humanity. I probably don't have many readers outside of the top 5% or so (though I wish I had more, of course). Just something to think about.
I also doubt there are a lot of current nicotine addicts among our readers but many state health departments and other toilers for The Good have put together a web site to help you quit. No idea how well it works but we won't know unless people try it. Today is a great day to take that first step to get clean, so go for it.
At first I didn't think I'd even mention this because what can I add? But actually I do want to comment on the grotesque experiment carried out by U.S. government agencies on Guatemalan prisoners and mental patients, whom they intentionally infected with syphilis shortly after World War II. The timing is particularly instructive, since we had just gotten done defeating the Nazi regime and were feeling all morally superior because they were racists who performed horrific medical experiments on Jewish and other prisoners of the "inferior races." Hmm. Of course, as you probably know, that wasn't the last similar occurrence, including the so-called Tuskegee experiment (which was not actually an experiment and was basically pointless).
But my value added on this one is actually hopeful. It's proof that humanity can make moral progress, at least temporarily, and maybe we do ratchet up a notch each time. In response to the Nazi atrocities and other provocations (the Tuskegee syphilis study had not yet been publicly revealed) the assembled nations of the world adopted the Declaration of Helsinki in 1964, which proclaims an absolute requirement to protect the interests, safety and dignity of human research subjects. Since then elaborate rules and mechanisms have been put in place throughout the world to enforce these requirements.
Not to say there aren't violations, and there has been plenty of controversy about clinical trials conducted in poor countries where people are unlikely to benefit because they can't afford the products being tested; and where placebo controls that would be considered unethical in wealthy countries may be used on the grounds that the people couldn't get standard treatment anyway. There are still plenty of legitimate ethical questions about the conduct of human subjects research, but at least we have these arguments and nothing like the Guatemala or Tuskegee studies could happen today. I think.
On a personal note, I'm cleaning out my office today. This will probably be the last time I blog from this location. One more milestone . . .