Is this sentence a) true or b) false?
The top mistakes people make when designing surveys
There are any number of mathematical techniques for screwing with data, but the best way to skew an output is not by playing with the numbers, but by fudging the questions.
In fact, it is far easier to write a biased survey than an unbiased one. Even professional researchers often write exceptionally poor questions by accident, without malice aforethought. So how do you avoid these pitfalls?
In this article I’m going to run through some of the worst errors I encounter on a regular basis, to help make sure you don’t repeat them (or, if you do, that it’s at least part of a conscious attempt to trick an audience rather than the fruit of incompetence).
1. Schrödinger’s 20 year-old
Picture the classic age question with brackets like 10–20, 20–30, 30–40. If you are exactly 20, what do you tick? This has to be the error I see most frequently, and it’s usually a sign that whoever wrote the survey is not going to be the person analysing it. That immediately makes me wonder why the survey is being done at all. If the same people are not involved in the design and the analysis, the likelihood of getting useful information is low. If you just collect data willy-nilly and feed it to a mathematician in the hope that something good will come out the other end, the result will still be statistics, but they won’t necessarily be of any use to you.
How to fix it: enroll in a primary-level maths class.
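Or, failing that, check your brackets mechanically before the survey goes out. A minimal sketch (the `validate_brackets` helper and the half-open-range convention are mine, not any survey platform’s feature): treat each bracket as a half-open [low, high) range, so every age lands in exactly one bucket, then verify there are no overlaps or gaps.

```python
# Validate that age brackets are mutually exclusive and collectively
# exhaustive. Brackets are half-open [low, high) ranges, so an exact
# boundary age like 20 belongs to exactly one bucket.

def validate_brackets(brackets, minimum=0, maximum=120):
    """Raise ValueError if the brackets overlap or leave gaps."""
    ordered = sorted(brackets)
    if ordered[0][0] > minimum:
        raise ValueError(f"no bracket covers age {minimum}")
    for (lo1, hi1), (lo2, hi2) in zip(ordered, ordered[1:]):
        if hi1 > lo2:
            raise ValueError(f"brackets {lo1}-{hi1} and {lo2}-{hi2} overlap")
        if hi1 < lo2:
            raise ValueError(f"gap between ages {hi1} and {lo2}")
    if ordered[-1][1] <= maximum:
        raise ValueError(f"no bracket covers age {maximum}")

# Each boundary appears once as a high and once as a low: no Schrödinger ages.
validate_brackets([(0, 18), (18, 25), (25, 45), (45, 65), (65, 121)])
```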
2. We insist you try the hotdogs
This Kafkaesque sequence (question 1: did you try the hotdogs? question 2, compulsory for everyone: were they tasty?) is the product of a lack of editing, and another sign that the exercise has been handed to someone who believes that, where data is concerned, quantity beats quality. We know whether people tried the hotdogs and we know whether the hotdogs were tasty. Problem solved, right?
How to fix it: use the logic options offered by your survey platform to make sure that respondents only see question 2 if they answered yes to question 1.
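The branching itself is simple. A minimal sketch of how skip logic works under the hood, assuming a hypothetical engine where each question can declare a `show_if` condition (the question ids and structure here are illustrative, not any real platform’s API):

```python
# Each question optionally declares show_if: a predicate over the
# answers collected so far. Questions without one are always shown.

QUESTIONS = [
    {"id": "tried_hotdogs", "text": "Did you try the hotdogs?",
     "options": ["Yes", "No"]},
    {"id": "hotdogs_tasty", "text": "Were the hotdogs tasty?",
     "options": ["Yes", "No"],
     # Only shown to respondents who answered Yes above.
     "show_if": lambda answers: answers.get("tried_hotdogs") == "Yes"},
]

def visible_questions(answers):
    """Return the questions a respondent should actually see."""
    return [q for q in QUESTIONS
            if q.get("show_if", lambda a: True)(answers)]

# Someone who skipped the hotdogs never sees the follow-up.
print([q["id"] for q in visible_questions({"tried_hotdogs": "No"})])
# Someone who tried them does.
print([q["id"] for q in visible_questions({"tried_hotdogs": "Yes"})])
```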
3. Your opinion is ̶i̶m̶p̶o̶r̶t̶a̶n̶t̶ ̶t̶o̶ ̶u̶s̶ compulsory
I once accidentally ended up being interviewed on Chinese television about the Euro crisis. I know nothing whatsoever about the topic, but nevertheless managed to perform my duty and come up with an eloquent and reasoned opinion lasting at least two and a half minutes. Most survey respondents are under less pressure, however, and will either nope out or answer randomly when forced to take a stand on an issue they don’t care about.
How to fix it: include both an “I don’t know” and an “I don’t care” button. This is valuable data too.
4. We have ways of making you talk
As a linguistic data guy, I spend 90% of my time pushing for greater use of open-ended questions and the other 10% begging people to stop. If you can’t think of a sensible answer to your own “why?”, then your respondents won’t be able to either, and being expected to try will just irritate the hell out of them.
How to fix it: Delete all “why?” boxes unless testing has shown them to produce useful data. If you cannot bring yourself to do this, at least allow respondents to skip them.
5. Do you like and/or dislike this question?
This question does not technically break the rules of logic, but most respondents will not see it that way. Make life easy for them. This will also enable you to collect more useful information.
How to fix it: When you use a preference word (like/dislike, agree/disagree) in the question, make sure it reappears in the answer options. “Do you like or dislike this question?” answered with “I like it”, “I dislike it” or “Neither” is much better than the same question answered with a bare “Yes” or “No”.
6. Just C&P bro it’ll be fine
You would be surprised at how often I see a list of upbeat choices copied and pasted from the “I like X” logic pipeline to the “I don’t like X” logic pipeline, to hilarious effect. If one of these atrocities has slipped into your survey, chances are that it is far too long. If even the designer is getting confused by the logic options and copy-pasting questions, your respondents got bored and gave up long ago.
How to fix it: Firstly, pretest your survey among a small sample of trial respondents. This is recommended as best practice everywhere, but you would be surprised how seldom professionals do it. It costs fifty quid on Mechanical Turk and takes a couple of hours. Try it, you might like it. Secondly, remember that five good questions will provide you with more useful information than fifty bad ones.
7. Take me home, country roads
Whoever designed this question imported a standard list of options from somewhere, and did not take the time to check that it covered every possible respondent. This can happen with tick-box questions too, but it is particularly common with drop-down menus, for obvious reasons. If someone removed San Marino from your website’s country list, would you notice?
How to fix it: There is often no easy fix for this, and you have to rely on the list you were given. Shorter lists can be checked through, however, and mistakes removed. And if your respondents come from all over the world, don’t force them to pick a US state of residence (an extremely common mistake).
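For lists that do have a trustworthy reference, a pre-launch diff takes seconds. A minimal sketch (the reference list here is a hand-typed sample purely for illustration; in practice you would load the full ISO 3166 country list from a maintained source):

```python
# Diff the survey's dropdown list against a trusted reference list
# before launch, so missing or stray entries surface immediately.

REFERENCE = {"France", "Germany", "Italy", "Monaco", "San Marino"}

# The list actually wired into the survey, with one entry quietly missing.
SURVEY_DROPDOWN = {"France", "Germany", "Italy", "Monaco"}

missing = sorted(REFERENCE - SURVEY_DROPDOWN)      # respondents can't pick these
unexpected = sorted(SURVEY_DROPDOWN - REFERENCE)   # these shouldn't be there

print("missing:", missing)
print("unexpected:", unexpected)
```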
8. You can have it all
Ask people to tick everything they would like in a phone, and obviously they will tick the lot. I could have told you as much without running an expensive survey. They also want their phone to be free and to pour them a martini when they get in from work, but knowing this won’t help you improve your business.
How to fix it: Either make people rank their answers by order of importance, or — even better — look at the question from a different angle, and ask them what their deal-breakers are when buying a phone. “I would refuse to buy a phone that didn’t have…”
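The payoff of the deal-breaker framing is that the tallies actually separate: instead of every feature scoring near 100%, you get a ranking you can act on. A minimal sketch with invented response data (the feature names are purely illustrative):

```python
# Tally how often each feature appears in respondents'
# "I would refuse to buy a phone that didn't have..." answers.
from collections import Counter

responses = [
    ["headphone jack", "dual SIM"],
    ["headphone jack"],
    ["expandable storage", "headphone jack"],
    ["dual SIM"],
]

deal_breakers = Counter(f for answer in responses for f in answer)
for feature, count in deal_breakers.most_common():
    print(f"{feature}: {count}/{len(responses)} respondents")
```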
9. Too much information
If you take anything away from this article, make it this: only collect data that is relevant to the result you want to achieve. Unless the company potluck is going to be followed by an orgy, this information is of no use to you. Because data is valuable, there is often a temptation to gather as much as is humanly possible. Resist this at all costs.
How to fix it: For every question you write, ask yourself how you are going to use the data. If the answer is “it might come in handy one day”, delete the question.
10. The old favourite…
Leading questions can be written deliberately, but more often they are simply a result of researchers finding it difficult to believe that their personal preferences may not be universal.
How to fix it: I’m not going to lie, if you’ve already written a leading question, whether by accident or design, there is pretty much nothing I can say here that will talk you out of it.
If you follow these 10 tips, your survey will already be better than 90% of the surveys out there. If not, you can at least see what it’s like to be on the receiving end of a logic-bending survey here.
To contact the authors, please email firstname.lastname@example.org, or visit lexikat.biz.