Too often, survey results are presented in the media as irrefutable truth. We have all wandered around a supermarket and picked up the toothpaste that promised to make our teeth “99% brighter” (allegedly), or made an investment based on an article that told us something big was going to happen in X sector (allegedly).
The problem is that, just like the phrase “dermatologically tested”, those statements don’t mean anything without context.
A savvy businessperson, just like a savvy investor, will want to know where the data came from, how it was collected, and how it was analysed before throwing resources and energy into acting on it. The next time you see numbers being thrown around in the media, here are a few questions that can help you out:
What was the design?
Contrary to what some people believe, surveys are not always the best method of gathering data. They have many advantages and are definitely useful for some problems, but they are not what you turn to if you are trying to find out more about something completely new or innovative. Unless you know exactly what you want to ask your respondents, you can very easily rig the questions so that people end up telling you what you wanted to hear.
Speaking of which…
How many people were surveyed?
A 100% satisfaction rate sounds pretty impressive… until you realise you only surveyed 10 people. Can you still get interesting results? Sure. But if you are going to conduct a small, intimate study, why not run interviews? Or a focus group? Something where people can go off-script and share their ideas more freely?
Surveys are good if you have a large sample size and you need to sift through it quickly. 10 people may still have enjoyed your service, but that does not make them a representative sample of your client population.
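To see why 10 respondents prove so little, it helps to put an uncertainty range around the headline figure. Below is a minimal sketch using the Wilson score interval, a standard way of bounding a proportion that behaves sensibly even at 100%; the numbers are illustrative, not from any real survey:

```python
from math import sqrt

def wilson_lower_bound(successes, n, z=1.96):
    """Lower bound of the Wilson score confidence interval for a proportion.

    z=1.96 corresponds to a ~95% confidence level.
    """
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half

# "100% satisfied" from 10 respondents vs from 1,000 respondents:
print(round(wilson_lower_bound(10, 10), 3))      # ~0.722 -- could plausibly be as low as 72%
print(round(wilson_lower_bound(1000, 1000), 3))  # ~0.996 -- genuinely close to 100%
```

Same perfect score, very different stories: with 10 people, the data is consistent with barely 7 in 10 clients being satisfied; with 1,000, the claim actually holds up.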
Who conducted the research?
A lot of companies use their own in-house data to build surveys, and that is perfectly acceptable, so long as they are transparent about it. When used properly, company data and company-led surveys can be a great tool that shows transparency and a willingness to learn. On the other hand, opacity around who produced the research and how it was conducted could be seen as a red flag.
It is worth repeating that just because a company gathered data around an issue it has a stake in does not automatically mean the results are biased. The food industry can (and should) conduct research on nutrition and its effects on the human body. Banks absolutely should keep an eye on the market and consumer spending in order to provide a good service. It is when an organisation gets fidgety about how it reached a certain result, or where it got certain information, that things begin to look fishy.
Were the results compared against anything else?
Industry growth is pretty exciting news, but one does have to ask oneself, when reading the paper, what the researchers were comparing the current year against. Is it last year? The last three? The last ten? Similarly, the effectiveness of a toothpaste at whitening teeth depends a lot on the baseline the company is using for comparison.
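The choice of baseline can turn the same figure into two very different headlines. A quick sketch with entirely made-up revenue numbers for a fictional company:

```python
# Hypothetical annual revenue (in £m) for a fictional company.
revenue = {2019: 80, 2020: 60, 2021: 70, 2022: 84}

def growth(current, baseline):
    """Percentage change in revenue relative to a chosen baseline year."""
    return round(100 * (revenue[current] - revenue[baseline]) / revenue[baseline], 1)

print(growth(2022, 2021))  # vs last year: 20.0 -- the exciting headline
print(growth(2022, 2019))  # vs three years ago: 5.0 -- rather less dramatic
```

“Up 20% year on year” and “up 5% since 2019” describe the same company; a report that only gives you one of them is choosing the story for you.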
When looking at survey results that purport to show dramatic change, it is worth asking what the situation was to begin with, and then whether the changes observed would have happened anyway. Did the toothpaste company have a control group using their competitors’ product, just to compare results? And, if they did…
What are the results, really?
Let’s return to that phrase, “dermatologically tested”. It seems sound – and desirable, particularly on products like moisturisers and anti-dandruff shampoo. However, there is an obvious loophole in the phrasing: just because something was “tested” does not mean it “passed”, let alone “exceeded expectations”.
Advertising standards agencies place limits on what companies can and cannot claim for marketing purposes, but there are still plenty of ways to present study findings without actually saying anything significant. Describing the study itself is interesting, but at the end of the day, what you want to find out is what people said, and why.
So where does that leave us?
If your company wants to show survey results to demonstrate the effectiveness of a service or a product, that’s great! You should absolutely go for it.
Despite everything above, the principles of reporting data to the general public are actually not that complicated. It boils down to:
Being transparent – don’t hide behind legalese or complicated terminology.
Being realistic – not every study will go to plan, and that is fine!
Being forward-thinking – if you didn’t find something out, tell the reader how you will go about finding it next. Or, better yet, invite them to give you their unbiased opinion. Survey, anyone?
If you are interested in using survey data and feedback to help direct your business growth, get in touch with the LORIC team to find out more!