
ease in writing?
"Ease in writing" comes from a poem by Alexander Pope, the British poet:
True ease in writing comes from art, not chance,
As those move easiest who have learn'd to dance.
Note he (and I) didn't say "easy writing." But just as dance lessons can help get you around the floor more gracefully, the goal for this newsletter is to share a tip or two to improve your writing.
Writing useful surveys
Did you receive a phone call or email survey to "ask your opinion" during the recent health care debate? Did some of the word choices in the questions or multiple-choice answers drive you nuts?
When done correctly, a survey "helps gather information for better decision-making," said Steve Raabe, president and research director of OpinionWorks.
But beware. The proliferation of online and phone surveys makes this information-gathering temptingly easy. As marketing expert Pat Lovenhart, Lovenhart Research & Consulting, warns, "it's worse to have a bad survey and get deceiving information than no survey at all. You might make bad decisions based on bad data."
Although the writing is only part of what makes or breaks a survey, "good writing is key," said Raabe. "You have to engage respondents and make it interesting and friendly."
Here are experts' suggestions for writing surveys that yield useful results:
- Work backwards: "Before you write, you have to understand the business needs and what you want to do with the information you gather," said Lovenhart. "From there, drill down to specific questions. Is the information we're asking going to help us with the main objective?"
- Let the purpose guide the questions: "Different types of questions lead you to the data you need to analyze and interpret," explained Stephen Rafe, Rapport Communications. (See below for his definitions of six common types of questions.)
- Stay true to the purpose: Only include questions that elicit the information you need. "A common pitfall is there are questions hanging around the organization that are unrelated or irrelevant, but people want to tack them on, as long as you are doing a survey," said Lovenhart.
- Keep it simple: Once you are ready to write, "Rule #1 is to use everyday language," said Raabe. "Eliminate all jargon. Make sure it's easily understandable."
- Lob a few softballs: That's how Raabe suggests easing respondents into the survey. "Proceed from the general to the specific," he explained. "Make sure one question moves to the next in a logical fashion." Lovenhart notes the balance between not being too complex at the beginning of a survey and not waiting too long to ask the most important questions.
- Let engagement determine length: Both Raabe and Lovenhart said there is no "optimal" length for a survey. "It depends on the audience," said Raabe. "For example, an association that doesn't survey its members very often will find that the respondents will tolerate a longer survey."
- Put demographics at the end: Best practice is usually to place these questions (age, income level, etc.) at the end of the survey.
- Always pre-test: You and your team may think the questions are clear, but how do you know for sure? Test the survey, whether conducted online, by phone, in person, or on paper, with people who are as similar to the target respondents as possible, and avoid the common question pitfalls listed below. "Even the best researcher always finds things to change after pretesting a survey," said Lovenhart.
Know the lingo
You've answered, or written, all the types of questions explained below, perhaps without knowing what they're called. You may not be the person who decides which types to use, but you should know their "real" names. Here is how Stephen Rafe, Rapport Communications, defines them for his graduate students at Stratford University:
1. Dichotomous Questions: Especially useful in narrowing down your responses by asking for "yes" or "no." An obvious example might be a survey for men's aftershaves that begins with "Are you a male?"
2. Rank-Order Questions: You ask respondents to tell you the level of their preferences for a number of items. For example, a snack-foods manufacturer concerned with distribution might ask people to number their preferences for several named products. Also called "Ordinal Scale Questions," these are useful when you want general information rather than numerical ratings. In such cases, you could arrange "clusters." For example, when asking people where they prefer to live, you might provide such categories as "rural area," "suburban area," and "urban area."
3. Multiple-Choice Questions: You ask respondents to select one of three or four options. Some surveys "scramble" the choices with other options in other questions to achieve greater accuracy by comparing and contrasting the responses. Or you might provide one "correct" choice and three "incorrect" choices, or the opposite. You might also provide choices that enable respondents to couple two other choices or even choose "all of the above" or "none of the above."
4. Likert Scale Questions: Likert scales (also called "interval scales") help establish the level of importance of an item. A variation is the "Multiple-Choice Matrix Question," which presents the respondent with a group of questions that use the same ordinal scale, as in many restaurant surveys.
5. Semantic Differential-Scale Questions: You provide respondents with terminology (usually related to satisfaction or some other emotion) but ask them to use numbers that express their feelings. For example: "On a scale of 1 to 5, with 1 being 'Disgusting' and 5 being 'Delightful,' how would you rate ___item________?"
6. Open-Ended Questions: Perhaps the most challenging to administer, analyze, and interpret, these questions must be tightly worded so the responses can be sorted according to content, key points, and significance. As a general rule, they are positioned at the end of surveys to enable respondents to express additional comments and suggestions.
© 2007, Stephen C. Rafe. All rights reserved.
Pitfalls to avoid
Rafe also provided four common examples of questions to avoid:
1. Double-barreled or incorrectly compounded: The writer expects the respondent to provide one answer when the question asks for two ("Do you think we are effective in recruiting and retaining members?").
2. Vague: The question doesn't define what the questioner meant, so the respondent's answer is equally vague. Result: responses too subject to interpretation and thus not useful for analysis.
3. Non-actionable: For example, in a survey about specific faculty members, the question was asked, "Do you think that most people believe our school provides an excellent education?" It is not relevant to the survey's purpose, and there is nothing the surveyor can do with the responses.
4. Loaded or leading: "To what extent do you think gas-guzzling automobiles misuse our precious natural resources?" Which brings us back to some of those awful surveys about health care reform.