Wednesday, November 17, 2010
The Quality of Quality
As Kathy noted on my previous post, there can be a lot to process when you're doing focus groups or any form of qualitative social research. We often hear confidences from our friends and family, have to respond to trauma in our own lives and our loved ones' lives, and live through hard times. Health care and social service providers obviously have to interact with very sick, dying and bereaved people as part of their daily routine. But there is something particular and particularly challenging about hearing tragic stories as a researcher.
I have to do my best not to influence how people present themselves, or steer the data in any direction. Normally, when it's appropriate to offer sympathy or reassurance or advice, that's what sane people do. Sociologists, however, ordinarily have to be very careful not to become a part of the real life of the people we're trying to understand. (There are exceptions, mostly from the anthropological tradition, where investigators enter life worlds. But that's not what I was doing yesterday.)
That doesn't mean we don't care, or that they're just specimens. If people really need help right away, of course we'll try to get it for them, and in a real emergency we'll step out of our role if we have to. In fact, I really have to be in the moment, I really have to empathize and understand, so I can figure out what questions to ask and where the discussion needs to go. But I also have to stay in my role as an observer and data collector. It's very taxing. Fortunately, in the situation I just worked in, the people were supportive of each other -- which is an observation -- so I didn't have to feel so stressed about it.
With the development of cheap electronic digital computing, social science, which had once been a largely descriptive discipline, became much more quantitative, and obsessed with statistical analysis and mathematical models. Qualitative methods fell out of favor. They were widely seen as intellectually inferior, not really scientific, more akin to literary art.
Fortunately we're getting past that. Fancy math is worthless if you don't understand what it is you are counting. We got to the point where our reductionism was squeezing the juice out of life. Human social interaction is different from biological processes because it has a particular kind of meaning for us, one that we don't have very compelling ways of collapsing into a few definite categories with specific, quantifiable relationships.
Because it is easy to turn survey data into mathematical models, survey methods became extremely popular in the social sciences, but as the saying goes, garbage in, garbage out. The investigator decides how to phrase the questions, and what responses to offer. Whether it makes sense to you, or means very much to you, as a respondent, is not the researcher's problem -- and we all know how annoying it can be to respond to a survey that doesn't ask the right questions, or asks them the wrong way, or doesn't let us give the answer we want to give.
On the other hand, just because I talk with some people and I see particular meaning in what they say to me, or to each other, doesn't mean you necessarily have to believe that my insights are the right or most important ones, or that those people are like others. We need ways of cogently defining units of meaning in social interactions and people's social worlds, counting them, and testing their associations with each other. It's a slow process, working back and forth between qualitative induction and quantitative deduction, and the average lay person probably doesn't think we've really gotten anywhere. But we have. A lot of what we have learned, however, people don't want to believe. But that's a problem for another day.
4 comments:
even the "soft sciences" are hard, in the sense of difficult. i am glad someone is doing it.
i know that when i interview people, i end up with MUCH richer information than if i sent a questionnaire, or if i restricted myself to a checklist of questions. one of the benefits of asking open-ended questions, following up on concerns expressed, etc. is that i find out about things that i did not contemplate in advance. people are complex, and so are their experiences and impressions.
this is one of the problems i have with, say, public opinion polls. the set questions may not be the right ones, and they leave out the reasons people might answer a certain way. narrowing things down to a point where it is easy to crunch numbers also precludes a lot of information that is useful. so, for example, i think a poll that addresses the president's popularity is basically uninformative -- we know from other information that some people are opposed to our president because he is of a different party, and/or a different racial background; some because he has not waved his magic wand to get one or another thing fixed already.
i try to understand statistical analyses of problems like the ones you discuss, research, investigate. and your primers on matters statistical have been helpful. but when we get into human behavior -- that is a lot more complicated than a survey of "does this screening procedure produce a better outcome in this population."
And we're also hard in the sense of being reality based and in search of definite facts. I don't consider there to be anything squishy about what I do.
i don't think roger meant to imply squishiness, and i didn't, either.
you are the last person i'd suspect of believing that an anecdote equals evidence in a broad [over a population] sense.