A few weeks ago, Research Access ran an article about cleaning data from panels. One of the methods was to remove all the "extreme" responses on rating scales. If you've ever received a survey with too many matrix rating questions and mindlessly checked all the lowest, middle, or highest ratings -- you know who you are, and you know what I'm talking about.
This put a bug in my mind about those people who never give the highest rating -- or those who consistently give a high rating. In many ways, they are no better than the panelists who mindlessly answered their survey. In other words -- their responses are suspect as valid input for making a decision.
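To make that concrete, here's a rough sketch of how you might flag those patterns in your own rating-scale data. It's a plain-Python illustration only -- the respondent IDs, the 1-to-5 scale, and the thresholds are all made up for the example, and it isn't a description of how Research Access actually cleans panel data:

```python
# Rough illustration of flagging suspect rating patterns.
# Respondent IDs and the 1-to-5 scale are invented for this example.

def flag_suspect_respondents(responses, low=1, high=5):
    """Return a dict of respondent IDs whose ratings look like
    straight-lining (same answer everywhere) or extremes-only answering."""
    flagged = {}
    for respondent_id, ratings in responses.items():
        ratings = list(ratings)
        if not ratings:
            continue
        if len(set(ratings)) == 1:
            # Same answer to every item: classic straight-lining.
            flagged[respondent_id] = "straight-lining"
        elif all(r in (low, high) for r in ratings):
            # Only the scale endpoints used: extreme response style.
            flagged[respondent_id] = "extremes only"
    return flagged

example = {
    "r001": [3, 3, 3, 3, 3, 3],   # mindless straight-liner
    "r002": [1, 5, 1, 5, 5, 1],   # only uses the endpoints
    "r003": [2, 4, 3, 5, 2, 4],   # looks like a real opinion
}
print(flag_suspect_respondents(example))
# -> {'r001': 'straight-lining', 'r002': 'extremes only'}
```

Whether you actually drop the flagged people is a judgment call -- some respondents really do feel the same way about every item -- but at least you know who to look at.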
Then I ran into this article by Jeff Henning and realized that I wasn't the only one thinking about this. In fact, people much smarter than I'll ever be have done loads of research on the topic and have come up with several solutions that I'm going to show you here.
Henning's article also gets into cultural response bias, and you can read more about that there. The following are question types Henning lists that will help you overcome general response style bias:
- Binary scales – As far back as 1946, to minimize response style bias, Lee Joseph Cronbach (of Cronbach's alpha fame) advocated using two-item scales: yes/no, agree/disagree, dissatisfied/satisfied, describes/does not describe. Essentially this treats everyone as Extreme Response Style responders.
- Choose-many questions – Presenting a list of choices and instructing the respondent to "select all that apply" is an economical form of binary scale, prompting respondents to choose the items they agree with, find important, or are dissatisfied with.
- Ranking questions – Another way to avoid traditional response bias is to use ranking scales, where each choice on the scale may be used only once: most important to least important, most satisfactory to least satisfactory, most likely to least likely.
- MaxDiff scaling – Maximum-difference discrete-choice models are a more sophisticated type of ranking question, typically showing attributes four at a time and asking the respondent to select the best and worst attributes from each set: the attributes with the maximum difference. Research has demonstrated that MaxDiff scaling is superior to rating scales for cross-cultural analysis (Steve Cohen & Leopoldo Neira, 2003, "Measuring Preference for Product Benefits Across Countries: Overcoming scale usage bias with Maximum Difference Scaling"). A simple scoring sketch follows this list.
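To give a feel for what MaxDiff data looks like once it comes back, here's a minimal sketch of simple best-minus-worst counting -- the quick-and-dirty summary, not the full discrete-choice model from the Cohen & Neira paper -- with the attribute names and picks invented for the example:

```python
# Count-based MaxDiff scoring: tally "best" picks minus "worst" picks
# per attribute, normalized by how often each attribute was shown.
# Tasks and attribute names below are hypothetical.

from collections import Counter

# Each task: (attributes shown, respondent's best pick, respondent's worst pick)
tasks = [
    (["price", "quality", "support", "speed"], "quality", "support"),
    (["price", "design", "speed", "warranty"], "price", "speed"),
    (["quality", "design", "support", "warranty"], "quality", "design"),
]

best, worst, shown = Counter(), Counter(), Counter()
for attributes, picked_best, picked_worst in tasks:
    shown.update(attributes)
    best[picked_best] += 1
    worst[picked_worst] += 1

# Higher score = picked as "best" more often than "worst", per time shown.
scores = {a: (best[a] - worst[a]) / shown[a] for a in shown}
for attribute, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{attribute:10s} {score:+.2f}")
```

In a real study you'd pool many respondents and fit a proper choice model (multinomial logit or hierarchical Bayes are the usual approaches), but even the counting version shows why the format sidesteps scale-usage bias: nobody gets to park every answer at the top of the scale.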
Lessons Learned: Surveys should provide additional data for your decision -- not MAKE your decision
So it isn't just panels that can "dirty" your data enough to misrepresent how respondents really feel. Response bias comes in as many shapes and forms as there are human opinions. The key point here is that decisions are ultimately made by human beings. Managers and business owners can use survey data as an additional information resource to help them decide -- sort of like the "Ask the Audience" lifeline in "Who Wants to Be a Millionaire" -- but that doesn't mean you shouldn't use your own judgment.