COMMON MISTAKES IN USING STATISTICS: Spotting and Avoiding Them
More on Wording Questions
1. Ambiguous wording
There are many examples of questions that could be interpreted
in different ways. Here are a few.
- Implementation of the Americans with Disabilities Act
requires data on the incidence of disabilities. The National Health
Interview Survey on Disability1 (NHIS-D) was designed to
help meet this need for data. Michele Connolly, formerly a statistician
at the U.S. Social Security Administration, has written an article2
describing some of the challenges in developing this and other surveys
on the incidence of disabilities. One example she gives involves the
difficulties in developing a question to address psychosis. One
proposed wording was: "... Do you see things
other people don't see or hear things other people don't hear?" The
question did not work. Non-psychotic respondents answered yes in the
cognitive lab, explaining that they were color-blind, had better than
20/20 vision, or had excellent hearing. (p. 26)
- How many rooms are in your house? (Do you count
bathrooms? What if you have a large hallway that has some furniture in
it? What if you have a dining area that is separated from a living area
by a wide archway but no door? What about a laundry area in the
basement? What if your basement workshop is separated from the laundry
area by the furnace and a small passageway between the furnace and your
worktable?) A better question would
specify which types of rooms are and are not to be counted.
- Do you attend church regularly? (Does attending only at
Christmas and Easter count as regularly? What about "at least once a month"?) A better question would be more specific
-- e.g., "How many times have you attended church in the past month?"
- Do you use drugs? (Does the question refer to
illegal drugs only? To prescription drugs? To over-the-counter drugs?
Does it include alcohol and caffeine?) A better question would clarify which
types of drugs are and are not to be counted.
For some specific types of ambiguities to watch out for, see the Question
Wording page of the Research Methods Knowledge Base or the references below.
2. Influence of wording and sequencing on answers
Questions sometimes include additional information that may influence answers. One elementary
statistics textbook4 gives a case study that illustrates
this and a related problem in surveys:
The Washington Post gave
surveys to two different random samples of 500 people. Both surveys
asked for the respondent's political affiliation. The first sample was
asked the question,
"President Clinton said
that the 1975 Public Affairs Act should be repealed. Do you agree or
disagree?" The second sample was asked the following variation of
the question: "The Republicans in Congress
said that the 1975 Public Affairs Act should be repealed. Do you agree
or disagree?"
In the first group, 36% of the Democrats said "agree"; only 16% of the
Republicans agreed. In the second group, 36% of the Republicans but
only 19% of the Democrats agreed. This illustrates how additional information included in
questions can influence the responses. In some cases, such
additional information is deliberately introduced to promote the
desired response (deliberate bias), but in many cases it is included unintentionally.
In fact, there was no such thing as the "1975 Public Affairs Act," so
the example also illustrates how the
results of surveys may be influenced by people's willingness to give an
opinion on something they know nothing about. (Note that in each
survey, fewer than half of the respondents declined to give an opinion --
that is, most were willing to weigh in on a law that never existed.)
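The size of the wording effect in the Post example can be gauged with a standard two-proportion z test. The sketch below uses the 36% and 19% agreement rates for Democrats reported above; the subgroup size of 200 Democrats per 500-person sample is a hypothetical assumption, since the article does not report the party breakdown of each sample.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Pooled two-proportion z statistic for H0: p1 == p2."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Democrats agreeing with repeal: 36% under the "Clinton" wording,
# 19% under the "Republicans in Congress" wording (from the text).
# The subgroup size n = 200 per sample is a hypothetical assumption.
z = two_proportion_z(0.36, 200, 0.19, 200)
print(round(z, 2))  # |z| > 1.96 would be significant at the 5% level
```

Under these assumed subgroup sizes the difference is far too large to attribute to sampling variation alone, which is why the wording of the question is the natural explanation.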
Order of questions
may influence answers. Bethlehem3 (p. 56) gives an
example from a Dutch survey on housing demand. One question asked about
respondents' satisfaction with their housing situation. Several other
questions asked about the presence or absence of various specific
amenities in or near the housing. Researchers found that when these
specific questions were asked before the question on overall satisfaction, the
overall satisfaction rating was lower than when they were asked after it.
1. See http://www.cdc.gov/nchs/nhis/nhis_disability.htm
for more information on this survey.
2. Connolly, Michele (2009). "Disability: It's Complicated," Chance, vol. 22, No. 1, pp. 22-27.
3. Bethlehem, Jelke (2009). Applied Survey Methods, Wiley, pp. 43-50.
Converse, J.M. and S. Presser (1986). Survey Questions: Handcrafting the
Standardized Questionnaire, Sage.
Fowler, F.J. Jr. (1995). Improving Survey Questions: Design and Evaluation, Sage.
For an account of some of the particular challenges
of conducting surveys in the Global South, see Asher, Jana (2010),
"Collecting Data in Challenging Settings," Chance, vol. 23, No. 2, pp. 6-13.
4. Utts, Jessica (2005). Seeing Through Statistics, Thomson, p. 40. The source for the case
study is Richard Morin, “What Informed Public Opinion?” Washington
Post National Weekly Edition, April 10–16.
Last updated July 31, 2012