Thursday, May 16, 2013
Science and Evidence
This may not be the most entertaining post ever, but it's necessary in order to get on with our story. Clumsy exposition, if you will.
Many people make a distinction between science-based medicine and evidence-based medicine. They're closely related, to be sure, but not quite the same.
Science depends on evidence, and respects evidence. But it does not consist only of evidence. It includes deductions from evidence; hypotheses -- conjectures to be tested; and theories, which are explanations about the causal relationships among phenomena and the unobserved structures that underlie observations.
I'm sure most readers already know that the word "theory" is widely misunderstood, as being synonymous with "hypothesis." It is sometimes casually used in that way, by people who should know better, but I have been trying to discipline myself not to do that. Theories can be conjectural -- some of them also have the status of hypothesis -- but they aren't necessarily. Some of them are very well tested and as certainly true as anything can be, subject to refinement. Often a broader, more embracing theory will swallow up an old one, without exactly falsifying it. For example, Newtonian gravity still works well enough for many applications, but it does break down where conditions are extreme or we need extraordinary precision.
Anyway . . .
There are empirical remedies that seem to work even though we don't know why. Often, alas, they don't work very well, or they don't work for everybody who seems to have the indication, or the balance of good and bad effects is not what we would like it to be. Psychiatric medications are, at best, in this category. People with disabling psychoses generally calm down and have reduced delusions and hallucinations if they take anti-psychotics, but nobody knows why. Randomized controlled trials provide evidence for effectiveness -- along with a lot of terrible side effects -- but there isn't any real scientific understanding of psychosis.
On the other hand, we now have a good understanding of how, say, aspirin works. For millennia willow bark was an empirical remedy; then acetylsalicylic acid was synthesized in the 19th Century; then we figured out -- or rather John Robert Vane did, in 1971 -- that it inhibits the synthesis of cell-signaling molecules called prostaglandins and thromboxanes. The former accounts for the anti-inflammatory and analgesic effects, the latter for the anticoagulation effect. (I think -- I'm not a real doctor.) Anyway, knowing that, we can figure out a whole lot more about aspirin's good and bad effects, and try to find drugs that have more of the good ones and less of the bad ones. (We've made some serious mistakes along the way with that, but that's another story.)
Philosophically, this distinction is very important because the strength of new evidence depends not only on the inherent properties of an observation, such as the design of the experiment that produced it, but also on its prior plausibility. The famous p value is almost universally misunderstood. If we do an experiment and get a p value below .05 for a result which is a priori highly implausible, we cannot conclude that there is a 95% chance the hypothesis is true. There just isn't. The result is likely just a fluke. On the other hand, if we do a trial and get a p value of .2 or .3 for a highly plausible result, the hypothesis is very likely still true -- in fact, we should be more confident that it is true than we were before, even though our observation is officially called "statistically insignificant." That label misleads many people into thinking the study undermined the hypothesis, when it did no such thing.
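To make the point about prior plausibility concrete, here is a minimal sketch in Python of Bayes' theorem applied to a "statistically significant" finding. The specific numbers are illustrative assumptions, not from any real study: a false-positive rate of .05 (the conventional significance threshold) and statistical power of .8.

```python
# Illustrative sketch: what a "significant" result means depends on
# how plausible the hypothesis was beforehand. Assumes a simplified
# screening model with false-positive rate alpha = 0.05 and
# statistical power = 0.8 (both hypothetical, chosen for illustration).

def posterior_true(prior, alpha=0.05, power=0.8):
    """Probability the hypothesis is true given a significant result,
    by Bayes' theorem."""
    true_pos = power * prior          # hypothesis true, result significant
    false_pos = alpha * (1 - prior)   # hypothesis false, significance is a fluke
    return true_pos / (true_pos + false_pos)

# An a priori implausible hypothesis (1% prior): even with p < .05,
# the hypothesis is probably still false.
print(round(posterior_true(0.01), 2))   # 0.14

# An a priori plausible hypothesis (50% prior): the same "significant"
# result is genuinely strong evidence.
print(round(posterior_true(0.50), 2))   # 0.94
```

In other words, the identical p value yields a roughly 14% or a roughly 94% chance the hypothesis is true, depending entirely on the prior -- which is the sense in which evidence can't be evaluated apart from scientific background knowledge.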
A very good example is the Oregon Medicaid experiment. In fact, enrolling in Medicaid almost certainly does ultimately have beneficial biological outcomes for people with diabetes and high blood pressure. Contrary to general interpretations, and in fact to its own authors' stated conclusions, the study did not provide evidence to the contrary.
I'll try to explain further as I go on to discuss evidence.