Introduction

According to the Pew Research Center, the share of bad respondents on a given online panel ranged from 4% to 7%. In most industries this would be considered abysmal, but given falling engagement, weak quality checks and the rapid technology adoption by respondents, it makes a lot of sense. However, this research only screened out the most obvious poor responses, i.e. duplicate accounts, respondents outside the country of the research, and gibberish or contradictory answers.

The reality is that when Grey Matter carried out a more intensive quality assessment (setting aside the obvious conflict of interest in panel companies investigating their own quality control), they found that a shocking 46% of respondents had to be omitted for low quality. A respondent was only excluded after committing four or more issues, which shows just how much panel suppliers are under-analysing their output for clients.

How Did This Happen?

The purpose of this article is not to defame online panel suppliers, nor is it to take the higher ground as a new entrant. Quality control is incredibly difficult: it requires a huge investment in technology and a meticulous strategy for producing data that you can trust. This is not helped by the fact that panels have become commoditised, with no incentive to deliver higher quality.

That being said, researchers are simply not doing enough to address these challenges; ignorance is certainly not bliss. Harmon Research, which conducts over 40,000 interviews per month through a series of online panels, found that:

  • 25% don’t evaluate verbatims for bogus responses beyond obvious gibberish
  • 50% don’t eliminate straightliners (respondents who click “agree strongly” on every statement or rate every brand as “very familiar”)
  • 75% don’t use “red herring” questions (e.g. “Please answer #4 to this question,” or showing four squares with different colors and asking respondents to click on the green square)
  • 90% don’t track respondents’ time spent on individual questions
  • 90% don’t re-ask demographic questions to compare responses
  • 90% don’t go line-by-line through the data, searching for problematic or duplicate responses
  • 95% don’t include fake brands or names in awareness questions
  • 95% don’t evaluate numerical open-ends for bogus answers (e.g. answers of “12345” or “4444” for “How much money did you donate to charity in the last month?”)
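Several of the checks above are straightforward to automate. As a minimal illustrative sketch (all function names and thresholds are assumptions, not any supplier's actual implementation), here is how straightlining, speeding and bogus numeric open-ends might be flagged:

```python
# Illustrative sketch of three of the checks above. Field names and
# thresholds are hypothetical; tune them to your own survey design.

def is_straightliner(ratings):
    """Flag respondents who give the identical rating to every grid item."""
    return len(ratings) > 1 and len(set(ratings)) == 1

def is_speeder(seconds_per_question, floor=2.0):
    """Flag respondents whose median time per question falls below a floor."""
    times = sorted(seconds_per_question)
    mid = len(times) // 2
    median = times[mid] if len(times) % 2 else (times[mid - 1] + times[mid]) / 2
    return median < floor

def is_bogus_number(text):
    """Flag numeric open-ends that are repeated digits or keyboard runs."""
    digits = text.strip()
    if not digits.isdigit():
        return True  # non-numeric answer to a numeric question
    if len(set(digits)) == 1 and len(digits) > 2:
        return True  # e.g. "4444"
    return len(digits) > 3 and digits in "1234567890"  # e.g. "12345"
```

Checks like these cost little to run, which is precisely why the percentages above are so damning.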

Every insights methodology has challenges and quality issues. Good researchers recognise this and do everything they can to reduce the problems and mitigate the challenges without throwing out the methodology. Yet online panels have dropped the ball, and it's time to accept that there is a deep-rooted problem in the industry.

Why Does This Matter?

Unfortunately, these ‘uncaught’ respondents have damaging knock-on effects for the end-client. They fundamentally distort key business decisions, misinform strategic choices, and can lead to the demise of many companies across the world. The only thing worse than waste is being misled, and there has never been a greater need for transparency in the online panel industry.

Do you want to rely on a brand awareness study showing familiarity that's three times as high as it should be? Or a channel study showing a social media platform is 188% more popular than it really is? How about a consumer spending study that reveals spending to be 16X higher than in the real world? Unfortunately, this is the reality of online panels around the world.

How Do We Fix This?

The solution is to make the best of a bad situation, and start implementing the following:

  1. Doubt everything, demand transparency
  2. Take personal responsibility for incorporating data quality into your research process; make the effort to analyse the respondents involved and the data collected
  3. Include quality checks in your survey design; this forces panel providers to screen out bad responses, as you will otherwise find them yourself
  4. If you don’t have the time to do this, find someone you can trust who will demand transparency and quality assurance – what you don’t know will hurt you.
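Point 3 can be as simple as seeding trap questions and fake brands into the questionnaire and scoring each respondent against them. A minimal sketch, assuming a response is a dict of question IDs to answers (all IDs, brand names and the fail threshold are invented for illustration):

```python
# Hypothetical pass/fail screen built from the trap questions described
# earlier: instructed-answer items, a green-square click, and fake brands
# seeded into an awareness list. All identifiers are illustrative.

TRAP_ANSWERS = {
    "q4_attention": "4",         # "Please answer 4 to this question"
    "q9_green_square": "green",  # "Click on the green square"
}
FAKE_BRANDS = {"Brandex", "Novalux"}  # invented names in the awareness list

def failed_checks(response):
    """Count how many quality traps a respondent failed."""
    fails = sum(
        1 for qid, expected in TRAP_ANSWERS.items()
        if response.get(qid) != expected
    )
    if FAKE_BRANDS & set(response.get("brands_aware", [])):
        fails += 1  # claimed awareness of a brand that does not exist
    return fails

def keep(response, max_fails=0):
    """Retain only respondents at or below the allowed failure count."""
    return failed_checks(response) <= max_fails
```

Building the screen into the survey itself, rather than trusting the supplier's filtering, is what makes the transparency demand in points 1 and 4 enforceable.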

Conclusion

Ultimately, what you don’t know can most certainly hurt you. As an industry, it is essential that we tackle these problems head on. We must work together to demand more of both panel suppliers and research buyers to incentivise the time, money and effort necessary to solve these widespread issues.

Key Takeaways

  1. 46% of the respondents you get from online panels are disengaged, fraudulent or low quality
  2. 90-95% of researchers are not doing enough to identify and weed out bad respondents
  3. The resulting bad data you get can have a serious impact on your business decisions
