Tuesday, November 18, 2014
A commenter on my recent post on chronic pain calls attention to Dr. John Sarno. You can read a summary and a critique of Dr. Sarno's ideas here. Superficially, they resemble the current consensus that chronic pain with no corresponding physical lesion is generated in the brain. But there's a big difference.
Sarno has a psychodynamic explanation. He thinks that suppressed anger is translated into low back pain. The linked essay by Todd Hargrove offers good arguments for why that doesn't make sense. But I want to use this example as a jumping-off point for a broader discussion.
It is very common for someone to get an idea that perhaps seems plausible (sometimes it does not, even superficially, viz. homeopathy), and then become very attached to it. Sometimes these people have strong scientific credentials. Nobelist Linus Pauling, with his supposed discovery that large doses of vitamin C are a panacea, is an excellent example. These people may gain large followings. But . . .
There is no credible scientific evidence for the idea. The originator and his (or less often her) followers are convinced that it works, but we never get past anecdotes and testimonials. Many precincts of the Internet devote substantial resources to debunking such claims. I recommend Quackwatch, which has a lot of fun stuff to read, and Science-Based Medicine, a blog by a rotating group of authors that reliably puts up a new essay every day.
I won't repeat here the extensive resources available via those links for understanding how to recognize pseudoscience and the harm it does. But I do want to talk briefly about the whole problem of motivated reasoning.
The originators of quackery probably actually believe in it, most of the time. They aren't consciously lying. But they do, of course, have a lot at stake: their self-image as breakthrough thinkers, the gratitude and loyalty of their devotees, quite likely money, reputation and fame. So their thinking becomes dedicated to strengthening the belief, and the whole world is seen through the lens of confirmation bias.
Unfortunately, the whole world largely works this way. Social psychologists have found that showing true believers evidence that contradicts their beliefs often just strengthens them. Sen. Inhofe says he was at first convinced by the argument that burning fossil fuels is changing the climate, but then he found out how much it would cost to do something about it. So he changed his belief. Creationists don't want to give up the myth of original sin and redemption through the sacrifice of Christ, which makes no sense to begin with, and many of them make their living from the collection plate or the TV ministry. I could go on.
So why am I so sure my own beliefs are generally correct and rational? I suppose I can't be entirely, but I am aware of the problems of confirmation bias and motivated reasoning, so I make a serious effort to continually re-examine my ideas and to insist on standards of evidence for my understanding of the world. The evidence that I do this is that I have frequently changed my opinion about matters large and small.
I grew up going to church and Sunday school and believing in a high-church Protestant version of Christianity. Once I was old enough to think for myself, I concluded it was bunk. I was never a Communist, but as a youth I read romanticized accounts of the Chinese revolution and thought that on balance it was a good thing. I had no appreciation then of the depravity of Maoist tyranny, but I do now. At first I wasn't at all convinced that the official version of the 9/11 attacks was accurate, but I sorted through the evidence and the arguments and ultimately concluded that it is, essentially. (I think it likely that the government has deliberately covered up involvement by Saudi royalty, for reasons of geopolitics. I also find it plausible that the Bush administration security establishment expected something of the kind to happen, without having specific information, and had no interest in preventing it. But that's only a suspicion; I certainly can't assert it.)
In my professional work, I used to think that average people could be empowered to have a mechanistic understanding of their health and treatments that more or less corresponded to the way their doctors think, but now I believe that is unrealistic for most people. What I am striving for now is to understand the nature of the information people really do need, and can use, to make good decisions on behalf of their own health and well-being. I think it has a small component of science and a larger component of other kinds of meaning.
I may well cling to some ideas due to confirmation bias, but I do recommend this exercise to you. Try to take an inventory of important ways in which you have changed your mind over the years. If you can't come up with very many, maybe you should undertake a critical assessment of one or more of your cherished beliefs and see what happens. Can you find reason to doubt? Make a serious effort to talk yourself out of what you think now. Go down that road and see where it leads. You may well decide you were right in the first place, but at least you will have put it to the test.