Tuesday, March 8, 2016

Writing Good Survey Questions (or Why I'm Not the Person You Want Responding to Your Surveys)

As I've said before, we all see the world through the lens of our discipline. And the things that irk us tend to seem silly and overly specific to the outside observer.

This is why, when I receive a link to a survey, I usually don't respond. Of course, others point out that I should be helping my fellow researchers out when I qualify for a survey. But it's hard to overlook the problems with some of these surveys: poorly worded questions, bad response options, and occasionally nonsensical instructions make it very difficult for me to respond without being distracted. And if the survey was especially bad and has a spot for open comments, I let the authors know about some of the mistakes there.

Yes, it's totally obnoxious and I can't help it. So I find it better to just ignore the survey.

I recently received a "survey question" - clearly meant to sell something - rolled my eyes, and immediately decided to write a blog post on writing good survey questions. The question was: "Do you want to be your healthiest self?" Like I said, they were selling something, and rules for good survey questions don't really apply in those situations, because the purpose is not to gather information. But I've seen some similarly worded questions on actual surveys.

So here are some of the mistakes I frequently see in survey questions:
  1. Loaded questions - Loaded questions encourage a person to respond in a certain, usually socially desirable, way. More specifically, they reveal the bias of the researcher. The question above is one such example, but examples abound. For example: Most Americans prefer to purchase products manufactured in the United States. Do you prefer to purchase products manufactured in the United States? The preamble tells respondents what the "right" answer is.

  2. Double-barreled questions - These are questions that ask about two things at once. One of my favorite examples I used when I was teaching research methods came from our course evaluations: The instructor explained concepts in a clear and concise manner. It's definitely possible to be clear but not concise, or concise but not clear. So this question really only works if the instructor is both clear and concise, or neither clear nor concise. If both of these concepts are important to you, you need to have two separate questions.

  3. Ambiguous response options - Let's say I'm surveying researchers and want to know what they use to enter data, and I include response options of: Excel, Access, a spreadsheet program, a database program. If someone uses Excel, they could also select a spreadsheet program. I probably mean that option to cover spreadsheet programs besides Excel, in which case I should say other spreadsheet program instead. You should, of course, list the other major options before offering the other option. I would also recommend including a blank for people to specify the other program, because you'll get people who select other and write in one of the answer options; this way, you can recategorize them later (there's a small sketch of that recoding after this list).

  4. Unspecific timelines - For example: Have you felt sad or depressed? Yes, we all have, even the most mentally healthy of us. This question won't tell you anything about who currently struggles with sadness or depression. Worse yet is a question that asks for the number of times you have done something, like How many times have you used Facebook? Ever? That would be impossible to recall. But add one simple instruction: In the past 7 days, have you felt sad or depressed? or In the past 7 days, how many times have you used Facebook? Now people can recall their feelings or behaviors from a specific window of time, and you're more likely to get useful information.

  5. Asking about sensitive topics without guaranteeing anonymity - If you're going to ask about topics like drug use, sexual activity, etc., you need to make sure respondents feel secure that their answers won't be compromised. Otherwise, they may not respond truthfully, and then what's the point of doing a survey? True, some people aren't shy, and you'll get accurate information from them. But a) you won't know who is being truthful and who is not, and b) your data won't represent your whole population.

  6. Too much or unclear branching - Sometimes you want additional information from people who respond in a certain way, while everyone else can skip to the next question. But if you have to do lots of branching, you risk confusing respondents, who may just throw away the survey in disgust. If you find you have a lot of branches, ask yourself whether they're necessary, and if they are, whether that's because you're really only interested in a certain subgroup. You might consider reformatting the survey - putting all the non-branching questions first and then asking your specific group to fill out the rest - or even focusing on that particular subgroup when sampling. If branching is minimal, find ways to format the survey to make it really clear, like putting a box around the branched questions and/or adding an arrow to show where to go next. (Note: This is really only relevant to paper surveys. Phone or internet surveys handle branching automatically - there's a small sketch of how after this list - so you might explore those options instead.)

  7. Negative wording - It's never not a bad idea.

  8. Trying to cram too many questions on one page - We do this to make a survey look shorter and less burdensome to respondents, and it's a bad idea on all counts. First, if you need to make your survey look shorter, it's probably too long. Second, people are more likely to skip questions that are crammed together, simply because they don't see them.

  9. Asymmetrical response options - I've definitely seen response options along the lines of Strongly Agree, Agree, Disagree. If you have positively worded options, you should try to have an equal number of negatively worded options, such as: Strongly Agree, Agree, Disagree, Strongly Disagree or Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree.

  10. No metric for numerical answers - If you ask a question requiring an open-ended numerical answer, you need to tell respondents what metric they should be using. A question like How far do you live from your primary care doctor's office? followed by a blank could be answered in many ways. You may think people will give you miles, but if they live only 5 blocks away and you don't tell them to think in miles, you'll misinterpret their answer. The same goes for hours and minutes if you're asking about travel time.

  11. Making respondents rank too many things - I've taken surveys where I'm asked to rank a list in order of importance. If that list has more than 5 items, don't make people rank all of them - just ask them to rank the top 5 (and actually, top 3 is best). When an online survey once made me rank 10 items (and wouldn't let me continue until I gave every item a rank), I closed the survey.

  12. Relatedly, too many response options - Psychologist George Miller found that people can keep only about 7 (between 5 and 9) things in their head at once. If people can pick more than one option (a check-all-that-apply question), you should still limit the list, and if you have that many potential options, try splitting them across more than one question. But you shouldn't have more than 7 forced-choice options; otherwise, people might forget the first option before they reach the end of the list.

  13. Open-ended questions when you should use a close-ended question - On one large survey I worked on, the surveyor included an open-ended question for something that could have been summed up with 4 or 5 options plus an other, specify option. As I feared, the open-ended data were a complete disaster and totally unusable. Every once in a while, I think I should go through and try to categorize the responses, because it's an important question, but then I see the breadth of responses and curl up in a ball. (For less hopeless cases, a categorization sketch appears after this list.)
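
As promised under #3, here's a minimal sketch of recategorizing "other, specify" write-ins after data collection. The option names and synonym map are hypothetical; the point is just that a write-in blank lets you recover answers that really belong in your listed options.

# A minimal sketch (hypothetical names) of recoding "other, specify"
# write-ins back into the listed response options.
KNOWN_OPTIONS = {"excel", "access"}

# write-in text -> listed option, for answers that belong in an existing category
RECODE_MAP = {
    "microsoft excel": "excel",
    "ms access": "access",
}

def recode_other(write_in):
    """Return the listed option for a write-in, or keep it as 'other'."""
    cleaned = write_in.strip().lower()
    if cleaned in KNOWN_OPTIONS:
        return cleaned                       # respondent wrote a listed option verbatim
    return RECODE_MAP.get(cleaned, "other")  # map known synonyms; leave the rest alone

print([recode_other(r) for r in ["Excel", "MS Access", "REDCap"]])
# -> ['excel', 'access', 'other']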
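
And here's the branching sketch mentioned under #6. The question IDs and wording are hypothetical; the idea is that electronic surveys store skip logic as data, so respondents never see the branches at all.

# A minimal sketch (hypothetical question IDs) of skip logic stored as data,
# the way phone and internet survey software handles branching.
ORDER = ["q1", "q1a", "q2"]  # default question order

# (question, answer) -> question to jump to; unlisted answers fall through in order
BRANCHES = {("q1", "no"): "q2"}  # 'no' respondents skip the follow-up q1a

def next_question(current, answer):
    if (current, answer) in BRANCHES:
        return BRANCHES[(current, answer)]
    i = ORDER.index(current)
    return ORDER[i + 1] if i + 1 < len(ORDER) else None  # None = end of survey

print(next_question("q1", "no"))   # -> q2
print(next_question("q1", "yes"))  # -> q1a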
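
Finally, the categorization sketch promised under #13. The categories and keywords here are made up, and real coding of open-ended responses usually needs human review - but when the data aren't a complete disaster, even crude keyword matching can shrink the pile.

# A minimal sketch (hypothetical codebook) of collapsing open-ended answers
# into the handful of categories a close-ended question would have offered.
CODEBOOK = {
    "cost": ["afford", "price", "cost", "expensive"],
    "access": ["appointment", "wait", "distance", "hours"],
    "quality": ["rushed", "listen", "bedside"],
}

def code_response(text):
    lowered = text.lower()
    for category, keywords in CODEBOOK.items():
        if any(kw in lowered for kw in keywords):
            return category
    return "other"  # still needs hand review, but the pile is much smaller

answers = ["Couldn't afford the copay", "Three-week wait for an appointment"]
print([code_response(a) for a in answers])
# -> ['cost', 'access']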

Now, in some of the examples above, I gave neutral options. This is a hotly debated topic in survey research. Some think neutral options are a bad idea and prefer to force people to "pick a side." Others think neutral options are necessary, because people may not have an opinion on an issue and may skip the question entirely if no option matches their view. I'm generally a fan of neutral options, but there's really no right or wrong answer here.


The best thing you can do when creating a survey is cognitive testing. This means administering your draft survey to a small number of people one-on-one and asking them to "think out loud" as they read the questions. You may also ask them to paraphrase questions, to check that the wording makes sense and to find better phrasings. This gives you a glimpse of how people approach your survey and whether they're struggling with any of the issues identified above - or with other issues entirely, like unfamiliar terms (jargon) or important topics not assessed by the current questions (which we call "content omission").

As with most topics on this blog, I'm oversimplifying; there are more nuances to survey design than I've discussed here. Beyond the recommendations above, you really should have a survey methodologist on board, especially for very large surveys. Most of these considerations aren't intuitive (and that's okay - you don't have to be an expert in everything!), and designing a good survey is an iterative process, often requiring cognitive testing and even pilot-testing. Find an expert to do that dirty work, and wait for the data to come rolling in!
