Monday, February 28, 2005
CDC has been working to combat the HIV epidemic for 15 years now, but it's taken a long time for us all to figure out what actually works. At first, a lot of agencies were funded to just go out and talk to groups about HIV and AIDS. As you may recall, Surgeon General C. Everett Koop, whom Ronald Reagan appointed to office as an exemplar of what was already referred to by the Christian right as "moral values," surprised everyone by mailing out straightforward, factual information about HIV and AIDS to every household in the U.S. Posters went up on subways and buses, there were toll-free 800 numbers to call, people were trained as community HIV educators and they spoke to school assemblies, church congregations, and the Rotary Club. The gay community mobilized and organizations sprang up such as Gay Men's Health Crisis in NY and AIDS Action Committee in Boston. They developed outreach programs to bars, bathhouses, and other "scene" settings.
All of this was necessary but, as it turned out, not sufficient. People need accurate information, but for many people, avoiding HIV infection is not just a question of knowledge. People need skills, they may even need power in social situations that they don't have. For example, women don't always have the ability to insist that their partners use condoms nor do they know what the men may be doing that puts both partners at risk. Gay men can also be in abusive or unequal relationships. Some people have addictions, severe material needs, psychological distress or mental illnesses which also affect their ability to avoid risk behaviors. Most important, overarching everything, is the stigma associated with homosexuality, illicit drug use, and HIV itself, which has made it difficult to talk openly and has also driven many people to avoid disclosing information about themselves which is essential if counselors and educators are to work with them effectively. Behaviors which are driven underground, and pursued furtively, are the most difficult to modify.
Over the years, we have developed effective, evidence-based methods to help people avoid risk of HIV infection, and to help already infected people avoid transmitting the virus or being reinfected. CDC is using this knowledge to target its funding more effectively and to work with community-based organizations providing HIV prevention services to develop and implement interventions that work. A problem is that in order to monitor and evaluate these efforts, and assure that their increasingly limited resources have as much impact as possible, they need to collect information about the programs, their clients, and the outcomes. This presents major challenges for confidentiality and trust. Obviously, if people think an agency is going to report information about them to the federal government, they aren't likely to tell the truth about themselves.
CDC has set up its PEMS system to segregate data agencies need to manage and evaluate their own programs, which includes ways of keeping track of information about their clients which is potentially identifying, from information CDC needs to demonstrate to Congress that the spending is worthwhile (assuming Congress cares), to hold grantees accountable, and to allocate its resources. The question for those of us doing the work is whether we can trust that this information will indeed be inaccessible to the government. As I asked the guy in charge of the program today, I trust you when you say you won't be able to get access to the data, but can you make me believe that Alberto Gonzales won't be able to see it? He said yes, but he's not going to try to prove it to me until Thursday. Meanwhile, I'm still skeptical, as are my colleagues here. We shall see what we shall see.
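The internals of PEMS aren't public in anything I've seen, but the general technique for this kind of data segregation is well established: the agency keeps the identifying information and reports only opaque, one-way codes upstream. Here is a minimal sketch of that idea in Python. Everything in it -- the salt handling, the field names, the record layout -- is my own illustration, not a description of how PEMS actually works.

```python
import hashlib
import secrets

# Hypothetical sketch: an agency replaces client identifiers with a
# salted one-way hash before reporting records to the funder. The salt
# never leaves the agency, so the funder sees stable codes it can count
# and link across reports, but cannot reverse into identities.
# (This illustrates the general technique only, NOT PEMS itself.)

AGENCY_SALT = secrets.token_hex(16)  # generated and kept locally

def deidentify(client_id: str, salt: str = AGENCY_SALT) -> str:
    """Return an opaque code: stable for a given client and salt,
    irreversible without the salt."""
    return hashlib.sha256((salt + client_id).encode("utf-8")).hexdigest()[:16]

# What gets reported upstream: program outcomes keyed to the code only.
record = {
    "client_code": deidentify("jane.doe.1970-01-01"),
    "tested": True,
    "post_test_counseling": True,
}
```

The catch, of course, is exactly the one raised above: the scheme is only as trustworthy as the promise that the salt (and the agency's own identified files) stay out of federal hands.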
Sunday, February 27, 2005
I hope to get some material from the meeting but it's boring, technical stuff. They're trotting out their new data system for HIV counseling and testing. Like everything else the federal government does, it has an acronym, PEMS, Program Evaluation and Monitoring System. There is also a one hour session on how to fill out your expense reimbursement form.
Boring this may be, but obviously timely. The Retrovirus Conference here in the galactic capital city has featured reports about the continuing substantial number of HIV-infected people who don't know their status. CDC is concentrating on "Positive Prevention" as the most effective way to control the epidemic here in the U.S. That means trying to identify every infected person, providing treatment to control their viral loads, and working with them to make sure they don't retransmit the virus. In the case of some people, this just requires a bit of counseling, but for others, it's a substantial undertaking. People may need substance abuse treatment, mental health treatment (the rate of HIV infection is disproportionately high among people with mental illnesses, and vice versa), and stable lives including housing, jobs, etc. All of this comes together with HIV disease control because people need to adhere to their drug regimens, stay sober, stay out of prison, stay out of the sex business or sex trading, not share needles and works, etc. if they are to avoid transmitting the virus. I emphasize that this must not stigmatize people with HIV. There is no way to generalize about them. People with substantial psychological problems or social disadvantages are at increased risk for HIV, as they are for many health problems -- and one consequence of the relationship to social disadvantage is the marked racial and ethnic disparity in infection rates in the U.S. -- but anybody can become HIV infected. The most common way is to have unprotected sex with an infected person -- which could even be your spouse.
CDC funded programs concentrate on outreach to people at high risk, case finding (i.e., giving HIV tests and identifying seropositive people), referring people to services including medical treatment, and partner notification. There are many practical and ethical problems involved in all this, which I will discuss in the days ahead. Such measures, along with more general education and risk reduction aimed at seronegative people, have succeeded in keeping the epidemic at a stable (though still substantial) level in the U.S. But as we know, the epidemic in much of the rest of the world is out of control and represents a growing crisis.
So I hope we can spend the next week concentrating on HIV and some of the many important scientific questions, social problems, and ethical and political debates associated with this extraordinarily socially complicated medical problem.
Saturday, February 26, 2005
On the other hand, they did have problems of their own. It's not that there wasn't enough to eat, at least most of the time. However, it was considerable effort. They had to run after the animals and walk after the plants, climb trees, dig up roots, crack nuts, and carry water or carry themselves and their food to the stream. Meanwhile they were dodging tigers and swatting mosquitoes. So they did have to make sure that the calories they took in exceeded the calories they put out to get them. (They needed a few extra for everything else they did, of course.)
So evolution equipped them with the appropriate preferences. Sugar tasted good, especially to kids who were trying to grow, because it gave them a lot of those scarce calories in a small package. And it came in fruits that were nothing like the bloated sugar bombs of today, produced by millennia of selective breeding. Instead, the fruits had a lot of fiber, so the sugar was absorbed slowly and didn't cause a spike in blood sugar. Fat tasted rich and satisfying for the same reason. Too much salt is bad, but we need a little of it, and it's scarce in natural foods, so salt tasted good as well.
Fast forward to today, in the US of A. Food is so cheap that it barely makes a notch in our budget. McDonald's wants to supersize us because the food costs them less than the packaging and the money they're paying the kid behind the counter. You're a big food tycoon and you want to sell us as much as you can. What do you do? You put in sugar, and salt -- in fact, two of the major categories of food that business analysts pay attention to are called sugary snacks and salty snacks. You want to get people to try it, so you start when they're young and impressionable, hiring cartoon characters to tell them to eat sugar frosted sugar puffs for breakfast, drink sugar water all day, and eat cowfatburgers with melted dairy fat and strips of fried pig fat for lunch. You put salt on fried starch and show 60 second movies of sexy people eating it at parties. All this stuff tastes good, because it pushes the buttons evolution built in to us when eating was hard work, so we eat a lot of it.
And you synthesize trans fats and put them in everything you can. They make crackers and fried starch (e.g., potato chips and fritos) crisp, cake products firm and rich tasting, and best of all, they don't go rancid, so your hohos can sit on the shelf at the 7-11 till you cash in your private social security account and still be good to sell. There's very little fiber in any of this stuff, so all that sugar goes right to our pancreases which squirt out insulin. It makes us feel good.
Instead of spending their days walking across the savannah in search of chips and cookies and burgers, the customers spend their days sitting in front of televisions and computers, or riding in cars. So we get fat. Then we get sick. And those food tycoons get rich.
Friday, February 25, 2005
The new guidelines may be all right, but they are also all long. You can go to Dietary Guidelines to read the report -- all 70 pages of it. JAMA has provided a handy side-by-side table showing recommendations from the old guidelines issued in 2000, and the new guidelines issued this year. Examples:
Old: Let the pyramid guide food choices
New: Consume a variety of nutrient-dense foods and beverages. Follow a balanced eating pattern such as the USDA Food Guide or DASH* Eating Plan.
Old: Choose a diet low in saturated fat and cholesterol and moderate in total fat
New: Keep total fat between 20% and 35% of calories, with most fats coming from sources of polyunsaturated and monounsaturated fats.
* The Dietary Approaches to Stop Hypertension Eating Plan.
To their credit, in spite of industry pressure, USDA and HHS (who are jointly responsible for the report) call for cutting out "added" sugar and trans-fats, and limiting salt to 2300 mg per day. They also tell us to exercise for 60 minutes every day if we want to lose weight.
So far, so good. Now the bad news. First of all, you would need to read the labels on prepared foods, look up the nutritional contents of all your meat, produce, and other ingredients of home-made foods, and enter everything into a spreadsheet in order to figure out if you were following the guidelines. But the pamphlet for consumers they have issued is far less specific. JAMA quotes advocate Marion Nestle as saying "It's left entirely up to individuals to figure out how to read food labels. It's really not easy," noting that the FDA's guide on reading food labels is 10 pages long.
For anybody who eats prepared foods -- including canned vegetables, soups and sauces, baked goods of any kind, frozen cuisine, beverages -- it's impossible. The supermarket shelves are lined with sodium, sugar, and trans fats as far as the eye can see. You'd blow your limit on sodium with one can of soup, on sugar with a bottle of Coke, on trans fats with a handful of crackers or a doughnut.
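The kind of bookkeeping the guidelines implicitly demand -- summing label values across a day and checking them against the limits -- could be sketched like this. The sodium limit of 2300 mg and the "no added trans fat" target come from the guidelines as described above; the added-sugar limit and all the food numbers are purely illustrative assumptions, not real label data.

```python
# Hypothetical sketch: tally a day's nutrition-label values against
# daily limits. Limits: sodium from the 2005 guidelines; trans fat set
# to 0 since the advice is to cut it out entirely; the added-sugar
# figure is an ASSUMED value for illustration only.
DAILY_LIMITS = {"sodium_mg": 2300, "trans_fat_g": 0, "added_sugar_g": 40}

def check_day(foods):
    """Sum each tracked nutrient across the day's foods and report
    (total, over_limit) for each one."""
    totals = {k: 0 for k in DAILY_LIMITS}
    for food in foods:
        for k in totals:
            totals[k] += food.get(k, 0)
    return {k: (totals[k], totals[k] > DAILY_LIMITS[k]) for k in DAILY_LIMITS}

# Illustrative (made-up) label values for the examples in the post:
day = [
    {"name": "canned soup", "sodium_mg": 1800, "added_sugar_g": 6},
    {"name": "cola", "added_sugar_g": 39},
    {"name": "crackers", "sodium_mg": 600, "trans_fat_g": 2},
]
report = check_day(day)
```

Even with made-up numbers, one soup, one soda, and a handful of crackers blow through every limit at once, which is the point: nobody is going to do this arithmetic at the supermarket.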
The fact is, we live in a toxic food environment. It isn't really all that hard to eat well -- just buy fresh fruits and vegetables, lean meats, whole grains, nuts and non-fat dairy products, use olive oil, and go ahead and prepare good meals without using a lot of salt or sugar. If you do that, you won't even have to think about counting, weighing and measuring and you'll do just fine.
But people don't have time, they don't know how any more, and they are bombarded by marketing messages to persuade them to eat great globs of low fiber starch and fatty meat and cheese drenched in salt, sugar and trans fats. It's all so convenient - just send your kids off to school with the Lunchables™ and a Pepsi, and when they get home, order up a bucket of KFC. This stuff is poison. We don't need a 70 page instruction manual, we need either a different way of living, or a food industry that isn't bent on killing its customers.
Today's bonus question: Why do they prefer to poison us? (Answer in tomorrow's edition.)
Thursday, February 24, 2005
If it's jug band music, or rhythm and blues . . .
Kirk Johnson and Reed Abelson in today's NYT tell the remarkable story of the Medicaid reform plan undertaken by former Utah Governor Michael O. Leavitt, now Secretary of Health and Human Services. It seems Mr. Leavitt is pointing to his own state's innovation as a model for the country.
Here's how it works: You extend health insurance to more people. You can do that without spending any more money. How is the miracle accomplished? Medicaid no longer pays for treatment if you actually, well, get sick. Instead, the system "relies on the generosity of doctors and hospitals to provide specialty services free of charge. In doing so, the state has in many ways reframed and reshaped the national debate over Medicaid . . ."
I'll say. To continue: "Everyone here, from state officials to patients, agrees that one big problem with the system is finding a free-of-charge specialist. They do not exactly advertise in the Yellow Pages."
I'll bet you think I'm making this up. Check it out: Model in Utah May Be Future for Medicaid (registration required)
Now I kind of wish I'd stayed down in that hole . . .
Your genial host is a bit eccentric. I live in the city, but I heat mainly with wood. I have a woodlot in deepest, darkest Connecticut but come the end of February, I never seem to have laid in quite enough to get me through the winter. A couple of weeks ago, a solid oak pallet appeared in the schoolyard across the street, just inside the fence. It sat there, apparently abandoned. Three days ago we got about four inches of snow, but you could still see its outline.
I figured, what the heck, obviously nobody wants this thing so I'll break it up for firewood. Plus, hey, I'm doing them a favor and cleaning up their trash, right? At 10:00 pm, I went across the street and picked up the edge of the pallet. Next thing I knew, I was standing at the bottom of a pit, looking up at a rectangle of moonlit sky two feet above my head. It happened so fast I had no awareness of falling. Suddenly, I was just there. After a few moments contemplating my bizarre end, frozen to death in an 8 foot deep pit belonging to the Boston Public Schools, I discovered a ladder built into a wall. I rejoined the world of humanity, pulled the pallet back over the hole, and went home to await the existential revelation and spiritual enlightenment that was bound to come.
So far it hasn't, but coming home from work the next evening I saw the schoolyard full of kids having a snowball fight, and I got to thinking how easy it would be for one of them to slip, dislodge the pallet, and go down to a much harder landing than I enjoyed. Or, they could just have gotten curious and, well, I already knew the rest.
I'm not interested in parsing how culpable I was in my own near-death experience, but I think a jury would have found the School Department more than 50% responsible if someone had been injured or killed. That particular location had always been just a part of the lawn. There must have been a roof to the vault, a foot under the grass. For some reason they had opened it and then just left it in that condition, with no warning sign.
The relevant problem here is that the major means we have to discourage dangerous idiocy is the civil liability system, and it isn't really designed for that purpose, it's designed to compensate injured parties. Lawyers always follow the money, i.e. they go after the deep pockets. In this case it was likely a contractor of some kind who opened the pit, but the school system would likely be a co-defendant in any suit. It's unclear whether the right people, the ones who should have been thinking harder, would be the ones who got the dope slap in this situation. Punitive damages can be added to an award in order to deter people from acting negligently. The Republicans don't like punitive damages and they're trying to limit them, and they are proposing various other strategies to advantage defendants in liability suits.
From a public health point of view, it is true that such after-the-fact feedback is not a very efficient way to encourage other people to be more careful in the future -- people who may never even hear about a specific incident and are unlikely to think about it very hard or recognize that it applies to them. Since tort reform is a major crusade of the people currently in power, we need to ask: Are there problems with the system? Can it be improved? Is the problem really that too many people are winning "frivolous" lawsuits, or that damage awards are excessive? Or should we affirm the importance of civil liability in making us all safer? (And yes, we should all try to be careful for our own sakes as well.)
Wednesday, February 23, 2005
- The Supreme Court has agreed to hear a Bush administration challenge to Oregon's law -- approved by the voters in 1994 -- allowing physicians to prescribe lethal doses of drugs to terminally ill people. In 1997, the Supreme Court ruled unanimously that there was no constitutional "right to die," but that states could permit the practice, and since then, 170 people have taken advantage of the law in Oregon. But John Ashcroft (remember him?) declared upon taking office that doctors who wrote lethal prescriptions were in violation of federal drug laws. Oregon appealed, the lower court ruled in its favor, and now we'll get a definitive ruling. I seem to remember a day when the Republican Party represented itself as a champion of states' rights. . .
- Terri Schiavo is back in the news. As you will recall, this is a severely brain damaged woman who has been on life support for 15 years. Her husband wants to remove the feeding tube, but her parents, fueled by religious fervor and backed by Florida Gov. Jeb Bush, have resisted. The Florida legislature passed a special law, just for her case, allowing Bush to order the feeding tube to remain. The courts ruled the law unconstitutional, but appeals haven't yet ended. They may today.
- Here in the Hub of the Universe, Mass. General Hospital wants to end life support for a woman with Lou Gehrig's Disease who has been on a ventilator for 5 years. Her doctors believe she is "suffering significantly and needlessly," but her daughter refuses to let them pull the plug. The doctors say she is "locked in," completely unable to move or communicate, but her daughter insists she can still "appreciate her family." A judge will have to decide this case as well.
The people in question are not brain dead, of course. They are indisputably alive. The affected people in Oregon are also competent and capable, and wish to make their own decisions. The people in the other two cases can no longer decide for themselves: they are helpless and passive. Others wish to decide for them, one way or the other. Who should decide? What should be the law?
Tuesday, February 22, 2005
Now, I'm not by any means downplaying this problem. It probably will happen that a particularly virulent strain of flu will emerge fairly soon. But like most public health problems, this is far more of a threat to the poor countries than to the rich ones. Thanks to vaccines and improved health care infrastructure, we probably won't experience anything like the fatality rate we did from the 1918 epidemic. A severe flu epidemic would strain health care resources and would be a notable event for us, but it would be far more devastating elsewhere.
The World Health Organization has a very good page packed with info about the influenza threat and the surveillance and response infrastructure it has established. Folks who want to get the straight dope on this should check it out.
WHO Influenza Pandemic Preparedness page
Excerpt: In the past, new strains have generated pandemics causing high death rates and great social disruption. In the 20th century, the greatest influenza pandemic occurred in 1918–1919 and caused an estimated 40–50 million deaths worldwide. Although health care has improved in the last decades, epidemiological models from the Centers for Disease Control and Prevention, Atlanta, USA project that today a pandemic is likely to result in 2 to 7.4 million deaths globally. In high income countries alone, accounting for 15% of the world's population, models project a demand for 134–233 million outpatient visits and 1.5–5.2 million hospital admissions. However, the impact of the next pandemic is likely to be the greatest in low income countries because of different population characteristics and the already strained health care resources.
Monday, February 21, 2005
Unfortunately, if you cast the conflict in those terms, as public discourse generally does, there isn't any common ground. If human life is somehow sacred, inviolable, or infinitely precious, and a fetus is human life, then a pregnant woman has no right to choose to destroy her fetus, any more than I can choose to rid the world of my obnoxious neighbor. To portray the issue as a conflict between a woman's right to autonomy and a human being's right not to be murdered leads to only one possible conclusion.
Politicians and even the most ardent advocates of the right to choose abortion seldom talk about the real issue here. A fetus is indisputably alive, just as an amoeba, a horse, or a tumor are alive; and it is indisputably a form of human life, in the sense that it can be assigned to the human species and only the human species. The question is whether it is, morally, a person, entitled to the respect we accord persons. ("Person," in this case, is used in a specialized sense.) Persons gain respect because they are moral agents -- self-conscious, aware of themselves and their surroundings, capable of planning, choosing, and being morally responsible. We gain these properties gradually, not by virtue of possessing human DNA but by virtue of the development of our central nervous systems and the accumulation of experience, knowledge, and connection with others.
Newborn infants cannot plan, choose, or be morally responsible, but they are at the beginning of awareness of self and the world and clearly they can form connections with others. Late term fetuses undoubtedly have some form of awareness but the earlier in pregnancy, the less they must have, and before they have a well developed frontal cortex they must have none. This is closely related to the moral controversies over maintaining the lives of severely brain damaged individuals, at the other end of life. Is the "human life" of people who cannot communicate, act or perceive infinitely precious, or has their status as persons been so diminished -- as they lack autonomy, awareness, connection with others (except in a purely passive sense) or moral agency -- that it no longer makes sense to preserve their biological life?
It is important to remember that there is not one word anywhere in the Bible about abortion, Old Testament or New, although abortion, and for that matter infanticide, were common in the ancient world. The Catholic Church did not decide that abortion was a sin until the late 19th century, at a time when the major cultural issue was the economic, social and sexual liberation of women, not respect for human life -- of which the Church, historically, had very little indeed. It is apparent that many people in the "Right to Life" camp at least subconsciously hold doubts about the personhood of the fetus, as they support the right to abortion in case of rape. How is the moral status of the fetus affected by whether the mother consented to the act which created it? Obviously it isn't. This shows that their true moral concern is with sexual behavior, not life. Many of the same people also condemn contraception, and homosexuality, which is further evidence that the true issue for many people is whether sexuality can be separated from reproduction, not the value of human life.
But for the many people who are sincerely disturbed by the destruction of what they perceive as human life, I ask two questions:
From where does this value derive? Have you ever really thought about it?
And why are you not even more passionate about the 29,000 young children around the world, already born, who die every day from readily preventable causes -- a holocaust that exceeds the toll (if that's what it is) of abortion in the United States by orders of magnitude?
Sunday, February 20, 2005
There are various reasons why people become beggars. In much of the world, the economy just doesn't offer enough viable niches, and many people beg on the street simply because they have been squeezed out. In the U.S., though, it usually takes major misfortune and some form of disability for people to end up in that position. We had a huge increase in the numbers of homeless street people starting a couple of decades ago, when states unanimously adopted a progressive policy of deinstitutionalizing the severely mentally ill, who until then had been warehoused in horrific, dungeonlike "hospitals." The idea was to create sufficient supportive services -- staffed group housing, day activity programs, supported work environments, intensive outpatient treatment, etc. -- to enable people to live more independently, and be part of the community.
Unfortunately, the states did the first half -- shut down mental hospitals and discharge the people -- without doing very much of the second. Now, innumerable people with severe mental illness live in shelters, on the street, or -- and this is very common -- are in jail. I visited my state's maximum security prison a few years back in the company of a judge, and we met with a group of lifers -- men with long sentences, including life in prison. They were surprisingly accepting of their situation, but it was also very grim. Most of them had started spending time in correctional institutions as teenagers, and they said one thing had definitely changed: the prisons today were full of people who really belonged in the Pine Street Inn. (That's Boston's biggest private shelter.)
The judge said there were two reasons for this: deinstitutionalization of the mentally ill, and determinate sentencing. He no longer had discretion, as a judge, to give mentally ill defendants probation and force them into treatment. Even if he did, there probably wouldn't be adequate options for them anyway, but he would have liked to have tried in many cases. But the legislature wouldn't let him -- it was off to jail with them. These people are very difficult for the prisons to manage, of course. They usually go unmedicated, get no counseling, and just get sicker. This is a national disgrace.
Saturday, February 19, 2005
Some children just won't, or can't do it, to the point where their teachers don't want them in the classroom. They talk when they aren't supposed to, won't sit still, engage in activities other than those assigned. Of course this is harmful to their prospects in life because they don't get their book learning and they find themselves in opposition to authority figures. Eventually this pattern of behavior was given a disease label and, based on the hypothesis that the children's brains weren't making the right chemicals, doctors tried giving them psychoactive drugs. They found that amphetamines and an amphetamine-like drug called Ritalin™ caused at least some of these children to sit more quietly and focus better on the assigned tasks. As with all psychoactive drugs, it is difficult to sort out placebo effects -- perhaps in this case operating as much through parents' and teachers' expectations as through the children's -- from the biological action of the drug.
These drugs happen to be controlled substances which are addictive and very commonly abused. Their mechanism of action is the same as that of Methamphetamine, against which the U.S. government is currently on a major crusade. They are weaker, to be sure, but amphetamine addicts do purchase them on the black market. They have the potential for severe adverse effects.
The U.S. government, through the National Institute of Mental Health and the National Institute on Drug Abuse, maintains that ADHD is a real, specific, biological disease; that these drugs are effective for treating it; and that children with ADHD who receive the drugs do better in later life than children who don't. Some psychiatrists and parent activists dissent quite radically from this position. The radical dissent is that drugging children who don't behave the way we would like them to represents a failure to engage them respectfully as human beings, and that counseling, negotiation, skill building, and changing the school environment to be more accommodating of normal childhood behavior are all better alternatives which get pushed aside in favor of the quick fix of drugging. Many of these people also point to what they consider harmful effects and dangers of the drugs.
The middle position is that the drugs are overprescribed and do represent an easy way out in many cases, but that there are some children who really do have a brain dysfunction which is ameliorated by these drugs. The challenge is to discriminate more accurately, and to come up with the resources to address the problems of children who don't fit in well at school without resorting to drugs.
I find this debate important and illuminating, but I haven't made up my own mind fully between the radical critique and the moderate position. For those who wish to pursue this further, you can get started with the NIDA position, which links to NIMH and other federal agencies. Then you can visit a psychiatrist who is a leading critic of the practice of drugging children. Information about the side effects of Ritalin and other stimulant drugs is here.
Let me know what you think!
Friday, February 18, 2005
However, they do damage enough as it is. The view of the United States as engaged in a "Global War on Terror" which is in fact a war between God and Satan profoundly shapes our foreign policy. Indeed, it appears that Mr. Bush himself believes this, based on many of his public remarks. But Bush has not publicly equated the satanic forces with Islam, as many of his followers do -- including the army general William Boykin, who is in charge of antiterrorist operations. Short of this Manichean view of global affairs, many fundamentalists support Israeli expansionism and have allied themselves with the most violently nationalist elements in Israel because they believe Israeli conquest of all of the historic land of Judea will bring about the apocalypse. As they anticipate the end of time, they see no reason to preserve the terrestrial environment as a life support system.
Regardless of how the current violence and turmoil in the Middle East and elsewhere evolves, they have already succeeded in assuring that a large percentage of American school children will remain ignorant about basic facts that can save their lives, by intimidating teachers into avoiding the subject of evolution. Those children will grow up without essential elements of health literacy, unable to comprehend microbial drug resistance, the etiology of cancer, and other concepts that will enable them to communicate with their physicians, make appropriate decisions about their health care, and properly manage their own treatment. And of course they will not understand the most fundamental and critical issues of public policy, and will be misinformed citizens and voters.
Listen folks: we know more than the people did who made up stories to explain what they could not understand in 2000 BC. I, for one, don't care to go back there.
Thursday, February 17, 2005
It used to be that pneumonia, or a strep or staph infection, or tuberculosis, were often a death sentence, even for young, healthy people. As a matter of fact that's still true for many people in the poorest countries, but it isn't true for most of us today and we have antibiotics to thank. Unfortunately, microbes evolve, and no matter what Jerry Falwell or George W. Bush think about evolution, microbes are doing it right now. They are evolving resistance to the chemicals we use to kill them.
APUA, in 1998, created the Global Advisory on Antibiotic Resistance Data (GAARD), which has issued a report entitled Shadow Epidemic: The Growing Menace of Drug Resistance. They write, "The problem of resistance has insinuated itself into virtually all the infections that strike humankind." This is a global problem that threatens catastrophe for humanity. "Antimicrobial resistance is undermining every clinical and public health program designed to contain infectious diseases worldwide." The human and financial costs of this problem are already staggering, and they're just getting worse.
Why does it happen? Overuse, inappropriate prescribing, failure of patients to properly adhere to prescribed regimens and to finish the prescribed course, poor infection control in hospitals -- all of these contribute. So does the marketing of all sorts of antibiotic-impregnated household products. (Don't buy any of that junk!) One of the biggest problems is the bulk feeding of antibiotics to animals so that they can be raised in crowded, unsanitary conditions. Antibiotics from animal waste end up in rivers and lakes, and enter the natural food chain. But people who don't believe in evolution aren't going to respond to this problem.
Check out the link above, where you can read the GAARD report and a lot more.
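The selection dynamic behind incomplete courses can be sketched as a toy simulation. Everything here is invented for illustration -- the starting counts, the kill rates, and the two subpopulations are assumptions, not clinical data -- but the qualitative outcome is the point: finish the course and the infection clears; quit early and the survivors regrow, skewed toward the hardest-to-kill cells.

```python
def course(dose_days, days=14):
    """Toy antibiotic-course model. Two bacterial subpopulations:
    'susceptible' cells, which a dose kills off quickly, and 'tolerant'
    cells, which survive doses more often. On a day with no dose,
    both populations double. All numbers are made up."""
    susceptible, tolerant = 1_000_000, 100
    for day in range(days):
        if day in dose_days:
            susceptible = int(susceptible * 0.1)  # a dose kills 90% of these
            tolerant = int(tolerant * 0.7)        # but only 30% of these
        else:
            susceptible *= 2                      # no dose: regrowth
            tolerant *= 2
    return susceptible, tolerant

full_course = set(range(14))   # a dose every day, as prescribed
quit_early = set(range(5))     # felt better, stopped after five days

print(course(full_course))  # (0, 0): infection cleared
print(course(quit_early))   # survivors regrow, now mostly tolerant
```

Run the truncated course and the tolerant cells, a tiny minority at the start, end up outnumbering everything else -- selection in miniature.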
Wednesday, February 16, 2005
The corporate media and the chattering classes have consistently misrepresented what the protesters at international trade meetings have been all about. They aren't against international commerce, for crying out loud, and they aren't against "globalization" in the sense that they want to stop the inevitable consequences of telecommunications and jet travel. They are against the creation of international agreements and institutions that erode national sovereignty over natural resources, environmental and worker protections in favor of the rights of what the agreements call "investors," i.e. multinational corporations.
The WTO and regional trade agreements generally require countries to prove that their laws and regulations are the "least restrictive" with regard to trade and are not trade barriers in disguise. Corporations have the right to sue governments based on this burden of proof. Shaffer and colleagues give several examples, a couple of which I'll summarize:
- Under NAFTA, Metalclad Corp. of the U.S. successfully sued Mexico after the state of San Luis Potosi prohibited Metalclad from reopening a toxic waste dump.
- The Canadian corporation Methanex sued the U.S. because California banned the gasoline additive MTBE -- which is a very nasty environmental problem.
- The U.S. invoked the WTO to stop South Africa, Thailand, Brazil and India from producing low-cost HIV medications. (Yup, that condemned people to death, to protect the profits of pharmaceutical companies.)
- The U.S. successfully overturned the European ban on beef treated with artificial hormones.
And it goes on and on. Currently under negotiation are regulations that will remove restrictions on corporate involvement in public hospitals, water and sanitation systems -- yup, the WTO may actually force countries to privatize their national health programs and basic infrastructure. Unfortunately, the movement against the corporate takeover of the world appears to have lost a bit of steam lately. I'm not saying that masked college dropouts breaking windows of McDonald's restaurants have discovered the best way to fight back, but fight back we must, for a democratic, not corporatist, international order.
Tuesday, February 15, 2005
Medical errors and iatrogenesis: The Institute of Medicine undertook a two-year health care quality initiative that issued two major reports in 2000 and 2001. They found that as many as 98,000 Americans die in hospitals every year from preventable errors -- just the tip of the iceberg of medical error, of course -- and they recommended major changes in health care delivery.
Is that one of the top 25 stories of the past 25 years? I would say so!
Racial and ethnic disparities in health and health care: In 2002, another report, "Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care," found that "a consistent body of research demonstrates significant variation in the rates of medical procedures by race, even when insurance status, income, age, and severity of conditions are comparable. This research indicates that U.S. racial and ethnic minorities are less likely to receive even routine medical procedures and experience a lower quality of health services."
Is that news? Maybe not if you're one of the overwhelmingly white, Anglo corps of reporters and editors . . .
Thanks to Susan_USA for nominating gun violence. I might have nominated violence in general. Violence gets more coverage than anything else on your local TV news, of course, but never from a public health perspective. Why is the U.S. such a violent country? Why do we have much higher rates of homicide and assault than all but the most troubled societies? What can we do about it? The coverage is sensationalistic and focuses on law enforcement and draconian punishment as the only solution.
Speechless, RD and DPR bring up some of the various drugs that have turned out to have undisclosed risks. (CNN has Prozac on its list but I suspect this refers to all the over the top hype when Listening to Prozac became a best seller.) This story is about the capture of the FDA by drug companies, aggressive marketing of drugs like the Cox-2 inhibitors and hormone replacement therapy that exaggerated their benefits, even as the drug companies suppressed evidence that they were dangerous and, in the case of HRT, completely useless, while the FDA collaborated with them to keep the truth from the public.
Court brings up the ultimate four bowler, how we pay for all these miracles, how much we pay, who pays, and how the organization and financing of health care affects what we get, and who gets what.
It shouldn't be a surprise, I guess, but these are subjects I've been focusing on here. So I guess we need to keep doing it.
Monday, February 14, 2005
To save on bandwidth, here's the list.
1. Human genome mapping
2. Stem cells
5. Living liver transplant
6. Self-contained artificial heart
8. Seatbelt laws
9. Overweight America
10. Body mass index
12. Low-carb diet
13. Cosmetic surgery boom
14. Gene therapy
15. Anthrax attack
16. West Nile/SARS
17. Prescription drug advertising
18. Tobacco settlement
19. Silicone breast implant settlement
20. Reproductive surgery
21. Laser-eye surgery
22. Mad cow
24. Tylenol tampering
I can think of one or two items that might be missing from this list. Let me know if you spot anything. Warning: they might be four bowlers.
Meanwhile, I've got all the material I need for the next few months . . .
Until the mid 20th Century, the generally accepted relationship between physicians and patients was one of benevolent paternalism. The expertise and wisdom to choose the appropriate treatment of disease resided entirely with the physician. The patient's role was to trust the physician and to follow "doctor's orders." Patients could be said to have consented to treatment if only because, as a practical matter, they had to physically submit to the surgeon's knife or swallow the doctor's potions. However, there was no expectation that the patient would be specifically informed about the physician's theory of the patient's disease, the theoretical basis of the proposed remedy, possible adverse effects, or alternative treatments.
In 1981 the U.S. Congress, believe it or not, commissioned a study of "the ethical and legal implications of the requirements of informed consent" in medical practice. The study proposed an ideal model of medical decision making in which physicians and patients are partners. While physicians possess expertise about diseases and treatments, the patient is the expert on his or her own tolerance for pain and inconvenience, fear of disability or death, and other subjective factors essential to determining the consequences of a treatment choice for the patient's well-being. The report also put forth self-determination as having intrinsic value.
I don’t know whether to credit the Congressional report, but the culture has definitely evolved to the point where physicians are taught that this is the ethical way to practice, and patients generally view themselves as sharing decision making with their doctors. Unfortunately, in this case, perception is not reality. Research has shown that, at least in routine prescribing, most of the required elements of informed consent are usually missing. The doctor says, "Take two of these three times a day," and that's it. Immediately after a medical visit, people don't accurately remember at least half of what the doctor told them anyway. Most important, at least in interviews I have done, most people say they shared the decision about their treatment, but the only real rationale they have for their decision is that the doctor recommended it.
So, in my previous post, I did my best to explain basic facts that are important for people taking antiretroviral medications to know, in the most accessible terms I could find. No doubt a lot of people could do a better job than I did, but still, it's hard! With a decent grasp of basic ideas of modern biology, it's not that hard to understand, but most people don't have that grasp. It's not just of academic importance -- it's essential for all of us as we try to advocate for ourselves and our loved ones with the health care system and our individual practitioners, make the right choices about treatment, and take care of ourselves properly.
We have no hope of ever having a health literate population if people aren't taught the theory of evolution. It's at the heart of biology, and it is the key to understanding viral drug resistance, antibiotic resistance in bacteria, the emergence of new infectious diseases, resistance of insect and crop pests to pesticides and even, surprising as it may seem, cancer. So the forces of darkness whose "moral values" supposedly helped elect the White House Resident want to condemn their own children to ignorance and yup, ill health. Nothing morally valuable about that.
Sunday, February 13, 2005
No, they don't, and that includes people who have lots of education and are good at finance, gardening, cooking, child rearing, chess, music, architecture, 16th Century Chinese literature, and automobile mechanics. What most of us don't know squat about, for some reason, is biology. What we ourselves are. I'd like to lay out a few basic facts which are highly relevant to commonplace medical problems, and then go on to the specific example of HIV which happens to be how this discussion started. Then I'd like to invite readers to let us know if they learned anything, or if it was all completely familiar to them.
First of all, we are made of eukaryotic cells. Those are cells with a nucleus, and they have other important features such as mitochondria and ribosomes. All multicellular organisms -- animals and plants -- are eukaryotes, but bacteria are not. All cells, including bacteria, are enclosed in a waterproof membrane. In eukaryotes, the cell's DNA is located in the nucleus.
DNA is an extremely long molecule that is analogous to a strip of old-fashioned magnetic tape once used to store computer programs. DNA is a long chain of codes corresponding to various molecules called amino acids, which when strung together in their own long chains form molecules called proteins. Proteins have an astonishing variety of properties. Some of them serve as structural elements of the body, others control various chemical reactions, others have essential roles in protecting the body against intruders, etc. Shorter strings of amino acids may serve as signalling molecules between cells and organs of the body. The activity of these various proteins and signalling molecules controls our development from fertilized egg to adult, and our biochemical functioning, including our feelings and thoughts.
Sections of DNA, called genes, contain the code strings defining various proteins and signalling molecules. Some proteins actually control the reading of genes to produce new protein molecules. Others repair damaged DNA. A molecule called messenger RNA is "transcribed" from DNA and then carries the instructions from the nucleus out to the cell's ribosomes, where other molecules called transfer RNA go out and get the required amino acids from the intracellular fluid and assemble them into the correct chains. Complex interactions and feedback loops among the external environment, the internal environment of the cell, and the various proteins and signalling molecules, determine which pieces of DNA will be read and when. All our cells have the same DNA, but different types of cells have different segments of DNA permanently turned off.
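The transcription and translation steps above can be sketched in a few lines of code. The "gene" and the five-codon table are made up for illustration (the real genetic code has 64 codons), and I use the common textbook shortcut of reading off the coding strand, so the mRNA is just the DNA with T swapped for U.

```python
CODON_TABLE = {  # mRNA codon -> amino acid (a tiny subset of the standard code)
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly", "AAA": "Lys", "UAA": "STOP",
}

def transcribe(dna):
    """DNA coding strand -> messenger RNA: same letters, T replaced by U."""
    return dna.replace("T", "U")

def translate(mrna):
    """The ribosome/transfer-RNA step: read the mRNA three letters at a
    time, fetching the amino acid each codon calls for, until a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODON_TABLE[mrna[i:i + 3]]
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

gene = "ATGTTTGGCAAATAA"            # a made-up five-codon "gene"
print(translate(transcribe(gene)))  # ['Met', 'Phe', 'Gly', 'Lys']
```

The real machinery is vastly more elaborate, of course, but the information flow -- DNA to mRNA to a chain of amino acids -- is exactly this.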
Viruses are not really alive. They cannot eat, or grow, or move. They just float around in our bloodstreams. They consist of some genetic material in a little protein package. In most viruses that genetic material is DNA, but in HIV it is RNA.
Cell membranes keep the good stuff inside the cell, and keep bad stuff out, but they have to be able to take in what they need and secrete what they're supposed to, so they have protein structures called receptors embedded in their membranes that act like gateways, letting in what needs to come in. When HIV happens to bump into a receptor called CD4, on a certain kind of white blood cell called a T-lymphocyte, its protein coat has the key, and HIV's RNA gets into the cell. There, part of its RNA encodes an enzyme called reverse transcriptase, which runs the genetic system in reverse and makes DNA based on the code contained in HIV's RNA. When this DNA is read, in turn, it makes the rest of the protein and RNA components of HIV. Some of HIV's proteins actually consist of different pieces cut out of the same chain of amino acids, so another HIV protein, called a protease enzyme, cuts the chains into the correct pieces. Then all the pieces assemble into new virus particles. When there are enough of them, the cell disgorges them. Eventually, its machinery hijacked to make HIV, the cell dies. It is the destruction of these cells that causes the characteristic immunodeficiency of HIV disease.
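The two steps that the main drug classes target can be caricatured in code. The sequences and the "cut after every K" rule below are pure invention for illustration -- real protease cleavage sites are specific short amino-acid motifs -- but the two operations are the right shape.

```python
def reverse_transcribe(viral_rna):
    """Reverse transcriptase runs the usual flow backwards: RNA -> DNA.
    In this coding-strand caricature, that just means U becomes T."""
    return viral_rna.replace("U", "T")

def protease_cleave(polyprotein, cut_after="K"):
    """HIV's protease cuts one long amino-acid chain into working pieces.
    Here the 'cut site' is simply every K, purely for illustration."""
    pieces, current = [], ""
    for amino_acid in polyprotein:
        current += amino_acid
        if amino_acid == cut_after:
            pieces.append(current)
            current = ""
    if current:
        pieces.append(current)
    return pieces

print(reverse_transcribe("AUGCCA"))    # ATGCCA
print(protease_cleave("MAVKGGLKPPT"))  # ['MAVK', 'GGLK', 'PPT']
```

Block either function and the assembly line stalls -- which is exactly what reverse transcriptase inhibitors and protease inhibitors do.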
Whoo. Still with me? Here's where evolution comes in. The process of making DNA from HIV's RNA, and vice versa, doesn't always go perfectly. Sometimes one or more of those amino acids in the various chains is replaced by a different one. Sometimes this doesn't matter much, sometimes it makes the new HIV non-infectious, or less infectious. In the latter case, the progeny of that altered version will die out.
Now suppose we are taking a drug to control HIV. Most of them work by either stopping the action of reverse transcriptase, or stopping the action of protease. Now suppose we take the drugs all the time, on time, so there is always a good level of them in our bloodstream. HIV will scarcely be able to replicate at all, and our disease will be controlled. (Unfortunately, it will not be eradicated -- it will be there lurking in the DNA of some of our cells.)
Suppose we forget a dose one day. The virus will start to replicate. Then we take the next dose. If we're unlucky, somewhere among the millions of virus particles that got made while we were goofing off will be one that happens to have a mutation that makes its reverse transcriptase or protease work even in the presence of the drug. That one virus particle will keep replicating even though we are now taking our medicine again, and in a very short time, it will have billions and then trillions of descendants. We will now be infected with a drug resistant strain of HIV.
That process is called mutation and natural selection, i.e. evolution. It's how we got to be here in the first place to figure all this out.
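The dynamic described above can be captured in a toy simulation. Every number here (population sizes, growth and suppression rates, the mutation rate) is invented for illustration; the point is only the qualitative outcome: with no missed doses, resistant virus never appears, while a few missed doses let a resistant lineage arise and take over.

```python
def simulate(missed_days, days=60):
    """Toy model of HIV under a drug regimen. On a missed-dose day the
    wild-type virus doubles, and a small fraction of the new copies
    carry a resistance mutation. On a dosed day the drug suppresses
    wild-type replication (but a latent reservoir never quite reaches
    zero). Resistant virus grows regardless. All rates are made up."""
    MUTATION_RATE = 1 / 500  # ~1 resistance mutation per 500 new copies
    wild, resistant = 1000, 0
    for day in range(days):
        if day in missed_days:
            new_copies = wild                       # drug level low: doubling
            mutants = int(new_copies * MUTATION_RATE)
            wild += new_copies - mutants
            resistant += mutants
        else:
            wild = max(1, int(wild * 0.9))          # drug suppresses wild-type
        if resistant:
            resistant = int(resistant * 1.5)        # drug can't touch mutants
    return wild, resistant

print(simulate(set()))          # perfect adherence: no resistance, ever
print(simulate({10, 11, 12}))   # three missed days: mutants take over
```

With perfect adherence the mutation never gets a chance to happen, because the virus is barely replicating; with a brief lapse, millions of replications occur, a resistant mutant appears, and from then on the drug is irrelevant to it.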
Saturday, February 12, 2005
I have spoken with many HIV-infected people about their situations. We have discussed their understanding and experience of being HIV-infected, their experiences with physicians and the health care system more broadly, how they made decisions about their own treatment, and how they have implemented those decisions -- for example, how closely they follow the pharmaceutical "regimen," as it is called, prescribed by their doctors.
Most people, it turns out, do not understand HIV in quite the same way that their doctors do. One woman, quite well educated and knowledgeable about many subjects, told me that she couldn't understand what viral load was until "This guy told me that it's how many babies the mother virus is having. When she's having a lot of babies, that's a high viral load." Logically enough, she always referred to the virus as "she." Many people believed that if they didn't take their pills regularly, their bodies would become resistant to them and they would no longer be able to defeat the virus. However, their ideas of what constituted taking the pills regularly were often quite different from what their physicians must have believed.
Of course, their doctors also did not believe that erratic pill consumption would make their patients' bodies resistant to the drugs. They believed, instead, that it would make the virus resistant to the drugs. The difference between drug resistant patients and drug resistant virus is not a quibble, it is of the utmost importance to humanity.
Drug resistant patients may sicken and die. Bad. Drug resistant virus may spread from person, to person, to person. Much worse. People who are already HIV infected, who are worried about becoming a drug resistant person and are therefore taking their pills 100% of the time, on schedule, may presume that since they are already HIV infected, and they are taking their pills, getting infected again is inconsequential. People who are not HIV infected may presume that if they do become infected, they can take the pills, and count on remaining reasonably healthy. The new strain of HIV which people in New York City are worried about is quite likely a result of these mistaken conclusions. I will leave it to the reader to imagine how that came to be, and the possible further ramifications.
The reason so many people have this fundamental misunderstanding is that they do not really understand how viruses replicate, and they do not understand Charles Darwin's theory of evolution, which is the mechanism by which drug resistant virus is created. In order to understand the former, you have to know something about eukaryotic cells, the gene-protein system, and the genetic transcription process involving DNA and RNA. In order to understand the latter -- evolution -- you need only have been provided with a coherent explanation. This coherent explanation would not only make you wiser -- it might one day save your life, and the lives of people you care about.
Friday, February 11, 2005
Thanks to the Rule of Rescue, which prevents us from putting a finite value on human life when, but only when, it is in extremis, many people feel they cannot disconnect their next of kin from those machines so long as they are alive. So we need a way to say that people whose hearts are beating, who are breathing, are dead. The new definition concerns the brain. Brain death, by the way, does not mean that the cells of the brain are all, or even mostly, dead. It means that a part of the brain, the cerebral cortex, is not producing electrical waves.
In spite of what many people say, nobody has ever actually believed that human life is infinitely precious. The same people who say this are frequently all in favor of wars, and they certainly don't spend their last cent, or even necessarily two cents, to save people they don't know. We don't save infinitely precious African children from death by malaria or diarrhea, although we could save dozens of them for the price of a night out. On the other hand, it is impossible to save anyone's life, because we are all mortal. The question is when, not if, death comes.
Death, as a matter of fact, is as precious as life. If not for death, none of us would be here. Evolution cannot take place without reproduction, and a reproducing species would quickly exhaust all of its ecological opportunities but for death. If nobody died, there could be no children. Evolution assures that we will die and indeed, it must select for an optimal lifespan that best supports reproductive success. In other words, as crass as it seems, yes, we must eventually make way for our children, and grandchildren, and great-grandchildren.
All of this has a great deal to do with public health. By far the bulk of investment in medical research, and the bulk of medical spending, is devoted to adding a fairly small amount of life to people who are approaching the end. Inevitably, we have to consider the appropriate limits of this endeavor.
Thursday, February 10, 2005
On February 7, President Bush released his fiscal year 2006 budget proposal. In the proposed budget, the discretionary budget authority of the Department of Health and Human Services (which funds non-entitlement programs and agencies such as CDC, HRSA, FDA and more) would decline by $300 million to $68.9 billion. Within the budget resolution, all of these agencies and programs are contained within Function 550, health discretionary spending.
Although the President's proposed budget includes increases for HIV/AIDS (both global and domestic), community health centers and influenza preparedness, it fails to adequately invest in disease prevention and health promotion activities and cuts funding directed towards state and local bioterrorism preparedness activities. The budget contains proposed cuts in funding for the Centers for Disease Control and Prevention ($550 million), a 64 percent cut for the health professions program under the auspices of the Health Resources and Services Administration (the agency’s budget overall would be cut 12.3%) and eliminates the Preventive Health and Health Services Block Grant and the VERB program, among other proposed cuts in funding.
The Senate and House versions of the FY 2006 budget resolution are likely to propose cuts in discretionary spending, including spending for public health programs at the Centers for Disease Control and Prevention, Health Resources and Services Administration, National Institutes of Health, Agency for Healthcare Research and Quality, Substance Abuse and Mental Health Services Administration, Indian Health Service, the Food and Drug Administration, and Developmental and other Disability Programs within HHS. Therefore, it is of utmost importance that APHA members contact their senators and representative today, voicing their strong support for public health funding, so ultimately funding for these essential agencies and programs is not cut.
You can call your Senators and Representative through the Capitol Hill Switchboard at (202) 224-3121. APHA asks you tell them,
to vote AGAINST all budget resolutions that cut funding for Function 550 health discretionary spending and FOR amendments that increase funding, and our national investment, in public health programs and activities that protect our health; prevent, or treat early, diseases and disabilities; and provide a safety net for the medically underserved. Although the proposed cuts to public health programs will achieve savings in the short term, they will ultimately lead to higher treatment costs in the long term, as well as lower worker productivity.
* The VERB program is the CDC effort to inspire kids to be physically active.
Most reporters are not as blatantly partisan, but their output isn't very different. At best it includes some additional transcription of remarks from a "centrist" Democrat, but no attempt to sort out who is telling the truth. One reason that reporting for the corporate media consists largely of stenography is simply that reporters, as a rule, don't know anything about the issues they cover, and their editors don't care -- their mission is to attract eyeballs that they can sell to advertisers, not to enlighten the public.
Many suns ago, when a mighty big-nosed bubba bestrode DC like a colossus, I attended a grantee meeting in the Imperial City, where the editor of the Washington Post weekly "Health" section spoke to us about how to attract coverage of our programs. With my typical effrontery, I asked her why the media had failed to explain to the masses how the Clinton reform proposal would really work and how it would affect people's finances and health care.
I have to give her credit for honesty. She said that stuff is wonkish and boring. "Reporters bring me stories and I tell them, 'That's a four bowler.'" That referred to her vision of the typical American family (baby Maggie is too young to read the paper, but she includes Bart and Lisa) sitting around the breakfast table, and falling asleep with their faces in the cereal bowls.
Nope, the stories had to be about intriguing medical mysteries, news you can use for a healthy lifestyle, and heroic scientists and doctors conquering death. Of course the universities and drug companies feed them these stories, and they just write down the conquering heroes' boasts.
The example today is a good half page-worth of dead tree in my local bird cage liner about some local sorcerers who have found stem cells that grow into heart tissue. Some day, they will grow new hearts for us all, and heart disease will go the way of appendicitis. There is no real discussion of the actual probability that this might work, and how soon, but more important, there is no mention of why people get heart disease, how we might prevent most of it in the first place, or of what this all will cost, who is going to pay for it, and what happens to people with bad hearts who can't afford it. All that, of course, would make it a four bowler. It might also require the reporter to know something, and to think critically. That is too much to ask.
Wednesday, February 09, 2005
10% of the U.S. population had gone without health insurance for a year or more, when surveyed last year. Naturally, there is major inequality in this, as in all things here in the Land of the Free. 6.3% of white, non-Hispanic respondents had been in that perilous position; 10.5% of black, non-Hispanics; and 27.8% of Hispanics. (I prefer to say Latino, but they don't ask me.)
Now don't forget -- people without health insurance are almost always employed, or live in households with a breadwinner. These folks are not on welfare, they are working poor. Hispanics are by far the most likely to be migrant or seasonal workers, or to have the most marginal jobs, that don't provide them with insurance.
Sure, it's about good public policy, and it's about public health. But it's also about justice. The people who harvest your vegetables, who cut up and package your Perdue chicken breasts, who prepare your restaurant meals, who take care of your grandmother in the nursing home, who clean your office at night while you're home watching the boob tube, and then go to an overnight shift as a security guard -- that's who we're talking about. Those are the people who can't see a doctor.
(BTW, the number who are uninsured at any one time is much higher -- these are just people who had gone at least a year without insurance.)
I say surprisingly shallow because it is a very short wade to the deep end of this particular pool. The Times reporter just interviewed a bunch of shrinks, some of whom thought that calling Ted Bundy and John Wayne Gacy psychopaths inappropriately tends to excuse them, since they were actually just evil but sane (whatever that means, and the question was never even suggested, let alone addressed); versus others who thought that while you can call them evil, for their psychiatrists to say so would be unscientific.
There was no mention in the article that psychiatry is marbled with moral judgments, even far away from questions of anti-social personality, while paradoxically, any science of the mind has a profoundly problematic engagement with morality. Neuroscience and its handmaiden psychiatry have long ago abandoned the Cartesian dualism of mind and body. The mind, and behavior, are just manifestations of a physical system. Every thought, every feeling, every impulse, is represented by patterns of neuronal activity, which can now be observed in a very crude aggregate and, in principle, could be completely described. Current research into addictions and eating disorders has taken a big chunk out of the illusion of free will, and in the view of many, the rest of it will soon melt away like the snows of March.
No-one created himself or herself. Many people with sociopathic spectrum diagnoses had head injuries as children. Many were severely abused or emotionally abandoned. For others, no such dramatic history can be discovered, but we are all the product of hereditary endowment interacting with the physical and social environment as our personalities got wired into our brains. Our brains are part of our bodies, which bathe them in nutrients and hormones, and supply the sensory input and experiential feedback that our brains process back into behavior. And of course we are all part of complexly interacting social systems. People with no diagnosis of psychopathology, when placed in the right social context, will readily commit evil acts.
I personally do not believe that neuroscience can take us completely "beyond good and evil," because moral feelings and judgments are intrinsic to our subjective experience and are essential heuristics for the functioning of society. But the ontological status of the moral categorization of individuals is questionable at best.
Tuesday, February 08, 2005
- Substance Abuse Prevention and Treatment Block Grant, which supports state programs: level funded at $1.78 billion. That is, of course, an effective cut given inflation and the increasing population.
- Center for Substance Abuse Prevention, which makes grants to local organizations and provides technical assistance and capacity building: cut by $15 million, or more than 7% in nominal dollars.
- National Institute on Drug Abuse (NIDA): $4 million increase in a $1 billion budget, far less than the rate of inflation. (Not that my heart bleeds -- NIDA's research agenda is distorted by ideology.)
- National Institute on Alcohol Abuse and Alcoholism: $2 million increase in a $440 million budget, far less than inflation.
- State grants for Safe and Drug Free Schools and Communities: eliminated.
- Alcohol use reduction grants: eliminated.
There were a couple of programs in the area that got funding increases -- the biggest chunk of which is for mandatory drug testing of students.
See my earlier post, Soft on Crime, for a hint about what this is actually going to do for government spending. Hint: prisons cost money.
Consider tuberculosis, which used to be the leading cause of death. But by 1900 deaths from tuberculosis had fallen to less than half their earlier level. No effective treatment existed for the disease until 1948. But the declining trend continued throughout the early 20th Century, and the downtrend showed no evident acceleration when antibiotics were introduced. The TB death rate was already very low by then anyway.
It turns out that people may harbor the tuberculosis organism without becoming ill. Active disease develops only when they are weakened by other causes. Furthermore, TB spreads most readily in conditions of overcrowding and poor ventilation. Hence the tuberculosis pandemic in the 19th Century was a product of the social conditions of the industrial revolution: workers crowded into stifling tenements, working twelve hour shifts, and continually on the edge of starvation, developed active tuberculosis and spread it to others. As workers won the 8 hour day, and better wages, and housing codes were written and enforced, tuberculosis declined. Biomedicine only stepped in to mop up the residue.
There is a famous analogy in public health, the origins of which are lost in the mist of time. Suppose there is a steep cliff in the town, and people are falling off. At the bottom of the cliff are all the caring, compassionate people who make up the medical industry. As the people hit the ground, the medical workers rush in to stanch their bleeding and truck them off to the gleaming new hospital.
Meanwhile, at the top of the cliff, there is no warning sign or fence. Indeed, people from tobacco and food companies are selling them tickets to jump off. Some people, who work in hazardous occupations, are actually being driven toward the cliff by overseers with whips. There are some good souls up there who are trying to stop the people with addictions from marching toward the cliff, but the police are preventing them from helping.
What is the sensible thing to do? Spend more on the doctors and ambulances and hospitals? Or stop squandering all that money and put up a fence? We do the former because we depend on the market: individuals who have already fallen off the cliff will pay (or their insurers will pay) for treatment; but only society, through its government, will ever pay to put up a fence.
In the 20th Century, we began to build some fences. We established the Centers for Disease Control, the Environmental Protection Agency, the Substance Abuse and Mental Health Services Administration. The federal government took an important role in improving the nutrition of children and pregnant women through the WIC program, school lunches, food stamps and other initiatives; and in making housing affordable for poor families through the Section 8 program and public housing (which has been done wrong more often than it has been done right, but we've learned, and lately we've been doing it better).
Comes now the proposed budget of the current administration. In the name of fiscal discipline, and because we must defend freedom by assuring that rich people don't pay taxes and we have the military resources to invade countries that pose no threat to us, in order to get the government off the backs of the people, all of these violations of the Free Market® have got to be scaled back and, logic tells us, ultimately eliminated.
It's great news, however, that we have a president who is strong and resolute. You know where he stands.
Monday, February 07, 2005
I don't know what to say about this, but I find it oddly disturbing. According to Jacques Steinberg in today's NYW*T, ABC is planning a new "reality" show in which a team of physicians goes around the country and finds desperately ill people who can't afford medical care, then they give them the heart surgery or antiretroviral drugs or whatever they may need. Also, they'll give their families free head shrinking, apparently on the presumption that they've already been traumatized by mama's formerly impending demise.
The name of the show will be Miracle Workers. I would presume that they'll cast families with 2 1/2 cute kids and a sad-eyed dog. If you don't have the right pathos appeal, you'll be SOL.
Q: Why do they call them "reality" shows when the whole idea is, la realidad es otra?
*"W" not in original name of newspaper.
- Not only do nothing to fix the supposed fiscal 'crisis' in the system but actually make it worse;
- Result in people receiving far less in total benefits than they would even if nothing were done now to raise revenue or delay retirement, etc.;
- Be even worse for people who are young today, including newborns, than for older workers.
However, the corporate media will not report this, because it is the truth. What they report are the facts. The facts are what the Resident and unnamed administration officials speaking on background say. The truth has a liberal bias, and therefore is inappropriate to report.
It is true that people tend to use more health care and take more pills as they get older. Well, kinda. Children see the doctor frequently and gestating and delivering them is more expensive. But older people do cost more on a continuous basis. However, there is no correlation between the proportion of elderly and health expenditures in the OECD countries. Why not?
As much as 40% of health care expenditures occur in the last two years of life or so, in the process of dying. This is going to happen to everyone at some point. The longer we live, the later it happens, which actually means that we spend less this year and then spend the same amount at some time in the future. If we can succeed in compressing the time between the onset of serious illness and death, total expenditures will actually go down.
So those longer life expectancies in the European countries actually save money! The key is public health -- reduce rates of smoking and obesity, improve people's nutrition, keep folks active.
Now it is true that as more people turn 65, a higher proportion of health care costs in the U.S. will be paid for by Medicare and Medicaid. (Poor elders are eligible for Medicaid and they need it for things Medicare won't pay for, including long-term nursing home care.) That is a public policy issue we need to address. But we can definitely afford it.
In fact, we can easily afford it -- We need universal, comprehensive, single payer national health care. That way, our country's total health care bill will go down, even as we grow older. But we'll see if any prominent politician, or any reporter for the corporate media, says this publicly at any time in 2005.
Sunday, February 06, 2005
The other countries which oppose the protocol? China, Russia, India, Pakistan, Cuba, and Iran. People are free to draw their own conclusions. I already have.
An absurdly major flap here in the Hub of the Universe concerns a couple of laboratory workers at Boston University who got sick a few months ago -- not mortally ill, not permanently injured, not suffering what Alberto Gonzales would consider torturous symptoms such as major organ failure or death -- just plain old sick with a flu-like illness. They got better. Then a few weeks later, for undisclosed reasons, they took antibody tests and it turned out that the batch of tularemia they were working with, which was supposed to be a weakened form, was actually pathogenic and they apparently had been infected. They were supposed to report this to the Department of Public Health but they were a little late getting around to it. The results of this major public health catastrophe included a media frenzy featuring at least two Page 1 stories in the local bird cage liner, and the resignation of BU's infectious disease lab director.
FYI, tularemia is a disease caused by the bacterium Francisella tularensis. It is not contagious from person to person. It's typically spread by ticks, and the natural reservoir is cute little furry animals such as rabbits. It's basically no big deal, but some people have the idea that it could be worked up into a biological weapon, which personally I doubt. Anyhow, this total non-story probably would have gotten far more attention than it deserved in any case, but it got pumped up big time because BU is planning to build a so-called Level 4 biosafety lab -- where else but in the middle of a low-income predominantly Black neighborhood. That's a laboratory for working with the most dangerous pathogens, designed so that it would be very difficult for the bugs to escape. It has been highly controversial, with the local student social action groups joining with community organizations and lefty profs in Mortal Kombat with BU's lobbying and marketing apparatus, which is major league. BU has, for example, taken over entire subway trains with ads touting the beauty and glory of its biosafety lab, and it has the Mayor and City Council securely in its pocket.
The purpose of this endeavor is supposedly to do research on "bioterrorism." I'm not going to take a position on whether the lab would really pose a significant threat to the surrounding community; I like to have some idea of what I'm talking about before I pretend that I do. But I will take the opportunity to deconstruct this entire discussion.
First of all, while one cannot rule out that such research might yield knowledge which is helpful to combatting natural epidemics, it would be a highly inefficient use of resources for that purpose. Most of the research consists of figuring out ways to produce biological weapons, because you have to be able to produce them if you want to study how to counter them. The rest of the world just has to trust us that our intentions in doing this are honorable. Just imagine how the U.S. would react if Iran or, say, Venezuela, was engaged in such research. Could this possibly be considered a Weapons of Mass Destruction Related Program Activity? Just asking.
Second, bioterrorism (or biological warfare, as presumably it would be called if the perpetrator was a powerful, sovereign nation as opposed to a relatively weak, non-state group) is not all that it's cracked up to be. Non-contagious organisms, such as anthrax or tularemia, are not in fact Weapons of Mass Destruction (tm). Colin Powell told the UN Security Council that Saddam possessed enough anthrax spores to kill millions of people. Actually he didn't possess any, but suppose he did have all those anthrax spores? The only way they could kill millions of people is if somebody went around and systematically put precisely measured quantities up millions of people's noses. By that standard, firecrackers are also Weapons of Mass Destruction (tm) because they could kill millions of people using the same procedure. In order to harm people using non-contagious organisms, you have to deliver them to the people's respiratory systems or some other route into the body. If you can do that, you can also hit the people with a bomb or a bullet. Remember that for all the disruption caused by the anthrax mailer, only five people died from that attack. An equally, or even more destructive effect could easily have been achieved by mailing bombs. If Saddam were to have given anthrax spores to Osama bin Laden, it is not clear that al Qaeda would have become notably more dangerous. This is a long, hard wrangle which I won't get into in more detail here.
Contagious diseases could indeed cause massive destruction, but they are not very tempting as weapons because the attacker cannot control where they go, which is around the world including right back at the attacker and the populations he (or maybe she) cares about. In principle, a mad conqueror could immunize his entire population against a novel pathogen, and then release it on the rest of the world, but it is questionable whether research in a biosafety lab has anything to do with preventing such a catastrophe, assuming one considers it a plausible risk to begin with. Biological agents are potentially useful for assassinations, and perhaps for larger scale targeted attacks where the perpetrator wishes to remain concealed (viz. Bhagwan Shree Rajneesh).
There are, however, very real, very dangerous weapons of mass destruction in the world today. The United States owns the largest share of them, followed by Russia, and including our buddies over in Merry Old England and those snooty, brie-eating, wine sipping surrender monkeys. And oh yeah, our best pals Israel and Pakistan. Funny thing -- the Resident said a few months back that "democracies don't manufacture weapons of mass destruction." So it turns out that nuclear explosive devices are no longer considered Weapons of Mass Destruction (tm).
Saturday, February 05, 2005
They want to cut the budget of the Centers for Disease Control and Prevention by 9% -- and that's in nominal dollars, meaning the real cut is greater. Overall cuts to HHS discretionary programs will be 2.4%. Among the programs to be drastically cut are, get this, bioterrorism preparedness programs, as well as chronic disease prevention programs and block grants to the states for urgent health needs. We have yet to see what they propose to do to Medicare and Medicaid.
They are doing this to all of us -- risking our lives, our well being, and the future of our children -- so they and their rich friends won't have to pay taxes, and they can pursue their megalomaniacal delusions of world domination. That's it. No more. It stops, now.
Friday, February 04, 2005
I began to address some of these problems in my earlier posts about the Rule of Rescue. The RoR is a so-called deontological, or rule-based ethic. It says that when we see someone in immediate peril or great distress, we are obliged to help them. This rule, as it turns out, can come into conflict with so-called consequentialist or utilitarian ethics, which say we should behave so as to get the best outcome -- however defined, and that's huge, just as picking your rules is huge under a deontological approach. If the resources we expend on rescue are taken from people who are not in immediate danger, but who will ultimately suffer or die because we deprive them today, we have failed to maximize the general good.
People who write about medical ethics usually finesse this sort of problem by talking about principles, which lie somewhere in between. These are ideas like human rights, justice, freedom, etc. -- grand abstractions that we can try to apply to specific cases. In medical ethics, the current standard is a set of principles taken from Beauchamp and Childress. These are called justice, respect for persons (or autonomy), nonmaleficence, and beneficence. Be fair, let people control their own destinies, don't hurt people, try to help people. Most people think these are terrific ideas, although it is very easy to see how they can come into conflict with each other. And of course, who is to say what is just?
A more basic problem is one that some commenters have raised: where do ethics come from? The Bible? Your preacher? The Buddha? Your personal communion with the Almighty? Your parents? Your favorite teacher? The state legislature, if signed into law by the Governor or approved by a 2/3 majority? Randy Cohen, the official New York Times ethicist? And where the hell does he get his authority from anyway?
When progressives and conservatives argue about the fate of Social Security, and Medicaid, and environmental regulation, and taxation, and foreign aid, and everything else they are arguing about, facts and figures do come into it. But ethics are at the heart of the great divisions of belief in society. Where they come from, what they tell us. For those who are interested in a liberal Christian perspective on ethics, you might want to go to Adventus. I am a humanist, and I would like for there to be an opportunity for believers and non-believers to discuss ethics and truth from both perspectives. Some of that can happen here but I don't want to derail the main subject. Is there interest out there in a new forum of some kind?
- Reduce benefits for wealthy retirees, who don't really need them. Obviously a political non-starter under our current leadership.
- Index future benefits to the Consumer Price Index rather than beneficiaries' pre-retirement earnings. (See note below.)
- Raise the retirement age to 74. Okay for college professors maybe, not so great for bricklayers.
- Discourage people from taking early retirement. Ditto. A study has estimated that 25% of early retirees are too frail to continue their normal work -- and that's without the already scheduled increase in the retirement age to 67.
Andrews has a 24 column inch discussion of these possibilities and why none of them is satisfactory. Then, in inch 25, we get this:
Some Republicans have even gone so far as to suggest the one approach Mr. Bush did not mention in his speech, raising the ceiling on income subject to payroll taxes, which is now about $90,000 a year. The idea appeals to some politicians because only about 6 percent of Americans earn more than $90,000 a year. Imposing Social Security taxes on incomes of up to $200,000 would come close to eliminating the entire deficit.
Mr. Bush has adamantly opposed any increase in payroll taxes. At least for the moment, that idea is off the table.
So that's it. This entire problem could be solved simply by eliminating the cap on payroll taxes. The entire "crisis," the endless obfuscating yammering of elocutionists on TV wearing hairpieces molded from a single piece of plastic, the terabytes of outrage and wonkery sloshing through the blogosphere, the forests laid waste to make the newsprint to fail to explain to the people exactly what the Resident is talking about doing to them -- all of it could be aborted tomorrow just by making corporate lawyers, executives, and stock traders pay the same percentage of their salary in Social Security taxes as a Burger King counterperson. But the only place this idea is even mentioned in public is in the last paragraph of a story on page 15 of the New York Times.
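The arithmetic behind the cap is worth spelling out. Here is a minimal sketch, assuming the 2005-era combined employer-plus-employee Social Security (OASDI) rate of 12.4% and the roughly $90,000 taxable maximum mentioned above; the salaries are purely hypothetical illustrations:

```python
# Sketch of how the payroll tax cap works, using 2005-era parameters.
# The combined employer + employee OASDI rate is 12.4%; wages above the
# cap (about $90,000 in 2005) are not taxed at all.

SS_RATE = 0.124      # combined employer + employee OASDI rate
WAGE_CAP = 90_000    # approximate 2005 taxable maximum

def ss_tax(salary, cap=WAGE_CAP):
    """Social Security tax owed on a salary, given a taxable-wage cap."""
    return SS_RATE * min(salary, cap)

# Hypothetical salaries: a counterperson, a worker right at the cap,
# and a highly paid executive.
for salary in (18_000, 90_000, 500_000):
    tax = ss_tax(salary)
    print(f"${salary:>9,}: tax ${tax:>9,.0f}  effective rate {tax / salary:.1%}")
```

Run as written, the effective rate is a flat 12.4% up to the cap and then falls: the $500,000 earner pays exactly the same dollar amount as the worker at $90,000, which works out to about 2.2% of salary.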
Note: This indexing thing is confusing to a lot of people. Over the years, wages have tended to rise faster than inflation. This is usually interpreted to mean that the U.S. standard of living is increasing, which is actually debatable, and anyway, it hasn't been happening for the past few years. But over the long run, the initial level of SS benefits people get when they retire depends on their lifetime earnings, so it has tended to rise. Once you retire, your benefits are indexed to a measure of inflation, the CPI. The problem with indexing initial benefits to the CPI as well is that, if wages continue to rise faster than prices over the decades, the income drop people experience upon retirement will grow steadily larger. The conventional argument is that they'd still be as well off as retired people are today, even if they're poor compared to their unretired neighbors, but that incorporates some unexamined assumptions about the nature of monetary value. This is deep stuff that we may want to get into later.
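To see the wedge concretely, here is a toy comparison of wage indexing and price indexing of initial benefits. The growth rates are illustrative assumptions, not forecasts:

```python
# Toy illustration of why price-indexing initial benefits cuts them
# relative to current (wage-indexed) law whenever wages outpace prices.
# Both growth rates below are hypothetical assumptions.

WAGE_GROWTH = 0.035   # assumed nominal wage growth per year
INFLATION   = 0.025   # assumed CPI inflation per year
YEARS       = 40      # a worker retiring 40 years from now

wage_indexed  = (1 + WAGE_GROWTH) ** YEARS   # benefit level under current law
price_indexed = (1 + INFLATION) ** YEARS     # benefit level under CPI indexing

print(f"benefit relative to today, wage-indexed:  {wage_indexed:.2f}x")
print(f"benefit relative to today, price-indexed: {price_indexed:.2f}x")
print(f"price-indexed benefit as a share of wage-indexed: "
      f"{price_indexed / wage_indexed:.0%}")
```

With these assumed rates, after 40 years the price-indexed initial benefit is only about two-thirds of the wage-indexed one -- that shortfall is the "cut relative to current law" that the indexing proposal amounts to, and it keeps growing the further out you go.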
Thursday, February 03, 2005
Check it out here (second link down as of now). Then get on the phone to your representative in Congress and tell them that Medicaid needs to be strengthened, not cut.
The reasons why we pay so much more for health care than the other wealthy countries do were reviewed quite comprehensively by Uwe Reinhardt, Peter S. Hussey, and Gerard F. Anderson in Health Affairs last year. I'm afraid I can't give you a link because it's subscription only, but here's the short version.
First, the comparison. This is done using so-called Purchasing Power Parity (PPP) dollars, which is just a fancy way of comparing costs across countries that is thought to be more accurate than using the currency exchange rate. The U.S., in 2001, spent $4,887 per person on health care. Sweden came in second, at $3,322, or 68% of U.S. spending. Canada was fifth, at 57%, and Japan, a wealthy country with excellent health care, spent just 44% as much as we do per person. New Zealand spent 35% as much. Not only that, but the rate of growth in health care spending in the U.S. exceeds the overall rate of economic growth by much more than it does in other countries, so these disparities are just getting larger over time.
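For readers who want to check the percentages, they are just ratios of the per-capita PPP figures. Only the U.S. and Sweden dollar amounts appear above; the other dollar figures in this sketch are back-computed from the quoted percentages, so treat them as approximations:

```python
# Per-capita health spending in PPP dollars, as a share of U.S. spending.
# U.S. and Sweden figures are from the text; the others are back-computed
# from the quoted percentages (approximations only).

us = 4887  # U.S. per-capita health spending, PPP dollars

spending = {
    "Sweden":      3322,         # from the text
    "Canada":      0.57 * us,    # back-computed from "57%"
    "Japan":       0.44 * us,    # back-computed from "44%"
    "New Zealand": 0.35 * us,    # back-computed from "35%"
}

for country, dollars in spending.items():
    print(f"{country:<12} ${dollars:>6,.0f}  = {dollars / us:.0%} of U.S.")
```

The Sweden line confirms the figure in the text: $3,322 / $4,887 rounds to 68%.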
1) We are the richest country, so we spend more on everything. But, our actual spending on health care is still 42% higher than would be predicted just by our overall profligacy, and we aren't getting more for it -- our health status and life expectancy are poor compared to these other rich countries.
2) The other countries have much more concentrated market power on the purchasing side, either single payer systems or multiple payers who work within budgets established by governments (e.g., Germany). By driving a bargain, they get lower prices for drugs, medical devices, etc. Also, there is less overall economic inequality in those countries, so physicians' salaries, like the salaries of business executives and financiers, are not as far out of line with most people's pay as they are here. That's actually not such a bad bargain for doctors, though, because their medical education is free and they don't have to pay back a quarter million dollars in student loans.
3) Administrative complexity. Because of our very complex system, with each provider needing systems for billing dozens of different payers, and the payers having their own overhead and marketing expenses, etc., 24% of total U.S. health care spending was on administrative costs. Note that this is not because of our public insurance programs (Medicare and Medicaid). Administrative expenses for private insurance are 2 1/2 times as high.
4) Finally, and this is a small part of the total and difficult to quantify, there probably is somewhat less use of extremely expensive interventions that have a small payoff in the other countries. Here, as Reinhardt et al write, "neither private health insurers nor . . . Medicare and Medicaid appear to have any explicit guidelines on the maximum price per [Quality Adjusted Life Year]* procured through health care. . . . For low-income Americans without health insurance, there may well be much lower, haphazardly imposed implicit upper limits on the price per QALY that society is willing to pay on their behalf."
Can we afford to keep paying this much? The basic conclusion of Reinhardt and friends is that we're a rich country, the country as a whole can pay it if we want to, but low income people will increasingly either be deprived of health care, or impoverished by paying for it. In the Ownership Society, however, that's just too damn bad for them. They should have gotten rich, like self-made man George W. Bush.
* I discussed QALYs earlier. Basically it's a unit meaning a year of life adjusted for sickness and disability. It gets weird to think about life in dollar terms, but it's unavoidable because that's what spending on health care is supposed to be buying us.
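Since the price-per-QALY idea keeps coming up, here is the basic arithmetic, with entirely made-up numbers:

```python
# Minimal cost-per-QALY calculation. A QALY is a life-year weighted by
# health-related quality of life (1.0 = a year in full health).
# All numbers below are hypothetical, for illustration only.

def qalys(years, quality_weight):
    """QALYs gained: years of life times a quality weight between 0 and 1."""
    return years * quality_weight

def cost_per_qaly(extra_cost, extra_qalys):
    """Incremental cost-effectiveness: dollars per QALY gained."""
    return extra_cost / extra_qalys

# Hypothetical treatment: adds 4 years of life at quality weight 0.75
# (i.e. 3 QALYs), and costs $120,000 more than the alternative.
gained = qalys(4, 0.75)
print(f"QALYs gained: {gained}")
print(f"cost per QALY: ${cost_per_qaly(120_000, gained):,.0f}")
```

Analysts then ask whether the price per QALY -- $40,000 in this toy example -- falls below whatever threshold society is, implicitly or explicitly, willing to pay. The point in the Reinhardt quote is that the U.S. has no such explicit threshold, only haphazard implicit ones for the uninsured.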