COMMON MISTEAKS MISTAKES IN USING STATISTICS: Spotting and Avoiding Them



More on Wording Questions

Elaboration

1. Ambiguous wording

There are many examples of questions that could be interpreted in different ways. Here are a few.
One question proposed for a survey on disability1 was: "... Do you see things other people don't see or hear things other people don't hear?" The question did not work as intended: non-psychotic respondents answered yes in the cognitive lab, explaining that they were color-blind, had better than 20/20 vision, or had excellent hearing.2 (p. 26)

For some specific types of ambiguities to watch out for, see the Question Wording page of the Research Methods Knowledge Base or the references in note 3 below.

2. Influence of wording and sequencing on answers

Including additional information that may influence answers. One elementary statistics textbook4 gives a case study that illustrates this and a related problem in surveys:

The Washington Post gave surveys to two different random samples of 500 people. Both surveys asked for the respondent's political affiliation. The first sample was asked the question,
 "President Clinton said that the 1975 Public Affairs Act should be repealed. Do you agree or disagree?"
The second sample was asked the following variation of the question:
"The Republicans in Congress said that the 1975 Public Affairs Act should be repealed. Do you agree or disagree?"

In the first group, 36% of the Democrats said "agree"; only 16% of the Republicans agreed. In the second group, 36% of the Republicans but only 19% of the Democrats agreed. This illustrates how additional information included in questions can influence the responses. In some cases, such additional information is deliberately introduced to promote the desired response (deliberate bias), but in many cases it is included naively.

In fact, there was no such thing as the "1975 Public Affairs Act," so the example also illustrates how the results of surveys may be influenced by people's willingness to give an opinion on something they know nothing about. (Note that in each survey, more than half of the respondents expressed an opinion on this nonexistent act.)
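The gap between the two parties' agreement rates in each sample can be checked for statistical significance with a standard two-proportion z-test. The sketch below uses only the published percentages; the subgroup size of 150 Democrats per sample is an invented placeholder, since the article reports only that each sample had 500 respondents, not how many belonged to each party.

```python
from math import sqrt, erf

def two_prop_z(successes1, n1, successes2, n2):
    """Two-sided two-proportion z-test using the pooled standard error."""
    p1, p2 = successes1 / n1, successes2 / n2
    p_pool = (successes1 + successes2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 36% vs. 19% agreement among Democrats across the two
# question wordings, assuming 150 Democrats in each sample of 500.
z, p = two_prop_z(round(0.36 * 150), 150, round(0.19 * 150), 150)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Under these assumed subgroup sizes the wording effect would be highly significant; with smaller subgroups the same percentage gap could fail to reach significance, which is why the actual counts matter.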

Order of questions may influence answers. Bethlehem3 (p. 56) gives an example from a Dutch survey on housing demand. One question asked about respondents' satisfaction with their housing situation; several other questions asked about the presence or absence of various specific amenities in or near the housing. Researchers found that when these specific questions were asked before the question on overall satisfaction, the overall satisfaction rating was lower than when they were asked after it.
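One common remedy for order effects like this is to randomize the position of the overall-satisfaction item across respondents, so that any order effect averages out over the sample rather than shifting every answer in the same direction. A minimal sketch of that design choice follows; the question texts are invented placeholders, not items from the Dutch survey.

```python
import random

# Hypothetical question blocks: an overall-satisfaction item and the block of
# specific-amenity items whose placement may shift the overall answer.
overall = ["How satisfied are you with your housing situation overall?"]
specifics = [
    "Is there a supermarket within walking distance of your home?",
    "Does your home have central heating?",
]

def questionnaire_order(respondent_seed):
    """Return the question list for one respondent, with the overall item
    randomly placed before or after the specific items (seeded per respondent
    so each questionnaire is reproducible)."""
    rng = random.Random(respondent_seed)
    if rng.random() < 0.5:
        return overall + specifics
    return specifics + overall
```

Comparing mean overall-satisfaction ratings between the two orderings then gives a direct estimate of the order effect itself.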


Notes:
1. See http://www.cdc.gov/nchs/nhis/nhis_disability.htm for more information on this survey.
2. Connolly, Michele (2009). "Disability: It's Complicated," Chance, vol. 22, No. 1, pp. 22-27.
3. Bethlehem, Jelke (2009). Applied Survey Methods, Wiley, pp. 43-50.
    Converse, J. M. and S. Presser (1986). Survey Questions: Handcrafting the Standardized Questionnaire, Sage.
    Fowler, F. J. Jr. (1995). Improving Survey Questions: Design and Evaluation, Sage.
    For an account of some of the particular challenges of conducting surveys in the Global South, see Asher, Jana (2010). "Collecting Data in Challenging Settings," Chance, vol. 23, No. 2, pp. 6-13.
4. Utts, Jessica (2005). Seeing Through Statistics, Thomson, p. 40. The source for the case study is
Richard Morin, “What Informed Public Opinion?” Washington Post National Weekly Edition, April 10–16, 1995, p. 36.

Last updated July 31, 2012