
Friday, November 11, 2011

Follow up on psychology research

A guest who is planning to get a graduate degree in psychology asked about my drive-by comment that some people consider the whole field of social psychology to be "dodgy." This isn't my specialty and I don't know a whole lot about it, but here's Benedict Carey in the NYT reviewing the issue:

In recent years, psychologists have reported a raft of findings on race biases, brain imaging and even extrasensory perception that have not stood up to scrutiny. Outright fraud may be rare, these experts say, but they contend that Dr. Stapel took advantage of a system that allows researchers to operate in near secrecy and massage data to find what they want to find, without much fear of being challenged.

“The big problem is that the culture is such that researchers spin their work in a way that tells a prettier story than what they really found,” said Jonathan Schooler, a psychologist at the University of California, Santa Barbara. “It’s almost like everyone is on steroids, and to compete you have to take steroids as well.”

I'm afraid I'm not getting the distinction between "outright fraud" and telling a "prettier story than what they really found." When you publish research findings, the only story you are permitted to tell is what you really found. It is true that research reports normally end with a "Discussion" section in which the authors often speculate, going beyond the findings to adduce possible implications or further hypotheses. These can be tendentious, to be sure, but an alert reader can at least spot that, provided the "Results" section is accurate and presents the information needed to interpret the observations correctly. Abstracts are often misleading, as are titles, and that's a problem because many readers never look beyond them.

But Schooler obviously understands that using steroids is against the rules in athletics, and telling a prettier story than what you really found is against the rules in science. The analogy is imperfect, not least because the consequence of using performance-enhancing drugs is that somebody wins a game, which doesn't actually matter; whereas the consequence of falsely reporting on research is that the world is misled, careers and money are spent chasing down the wrong path, and quite often, people -- likely in their role as patients -- are directly harmed. Meanwhile, BMJ has much more this week on the Andrew Wakefield fraud. I don't know whether it's available to the public -- I don't think so, but I can't easily check, because I'm using a computer that has privileges to read BMJ whether I log in or not. It turns out that a) not only did the kids not have autism, they didn't have inflammatory bowel disease either; b) the pathologist whose name was on the paper as a co-author had in fact found them not to have bowel disease but signed onto the paper anyway; and c) the institution -- University College London -- has refused to do any investigation of the whole matter.

Although I have never seen anything but integrity among my own colleagues, recent publicly reported scandals are making me wonder how widespread the corruption of science may be. It's disconcerting, to say the least.

1 comment:

Anonymous said...

Well, in my experience (I’m a psychologist) data massaging of various forms does occur, and social psychology is actually more affected than other branches (clinical psychology, which is very varied, is a case apart). A broader problem is the general way of doing research: e.g., designing studies so that the results are a foregone conclusion; using measurements that are dodgy but accepted because there isn’t really an alternative; using analysis of variance and arbitrary probability cut-offs; and so on. Questionnaire studies are of course sort of the pits; anyone with some experience can obtain almost any result.

Ana
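
To put a number on Ana's point about arbitrary probability cut-offs, here is a minimal sketch. It is my own illustration, not anything from the post or the comment: the fake_t_test helper, the sample sizes, and the data are all invented. It simulates twenty outcome measures on which no real effect exists by construction, then applies the conventional p < .05 threshold to each.

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

def fake_t_test(n=30, reps=1000):
    """Compare two groups drawn from the SAME distribution (the null
    hypothesis is true by construction). Returns an approximate
    two-sided p-value for the difference in means, estimated with a
    permutation test."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    observed = abs(sum(a) / n - sum(b) / n)
    pooled = a + b
    hits = 0
    for _ in range(reps):
        random.shuffle(pooled)
        diff = abs(sum(pooled[:n]) / n - sum(pooled[n:]) / n)
        if diff >= observed:
            hits += 1
    return hits / reps

# Twenty independent "outcome measures" on null data: with a p < .05
# cut-off, roughly one false positive is expected by chance alone.
pvals = [fake_t_test() for _ in range(20)]
significant = sum(1 for p in pvals if p < 0.05)
print(f"{significant} of 20 null comparisons reached p < .05")
```

With twenty tests at the .05 level, the chance of at least one spurious "finding" is about 64 percent; add flexible measurements and stopping rules on top of that, and you can see how almost any result becomes obtainable.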
