Online market researchers were conditioned years ago to equate sample vendor selection with data quality vetting. In today’s automated market, however, sample firms face a daily challenge of balancing their own business demands against quality norms. Now is the time for every market-leading research firm to develop its own data quality strategy, one that proactively addresses today’s complex threats.
For 13 years or more, survey data quality in online research was inextricably tied to the quality of the community on the panel side. Industry associations and market leaders maintained clear, public standards for what constituted legitimate sample. Most companies followed the same script, sometimes literally in the form of an intake questionnaire. Common approaches included asking vendors about recruitment methods, address verification norms, and panelist engagement and activity levels. These measures were all “table stakes” for the leading online panel firms. Industry tools like the ESOMAR 28 questions helped more discerning buyers go a level deeper. The general thinking was: if I vet a panel carefully enough, I shouldn’t have any data problems. For years, that approach worked.
Coinciding with the influx of capital from the investment community, online panel firms began using that cash to evolve their business models toward today’s market realities. Massive technology upgrades had a demonstrable impact on every facet of online data collection. Sample prices came down, projects got harder, sample sizes grew, and projects finished faster. While these benefits were realized, more complicated changes occurred in parallel: investment in proprietary panels shrank, source blending became the standard, recruitment standards lowered, and validation measures faded. The net result was a macro-level quality decline and a permanent relaxation of the old norms.
Automation has absolutely helped our industry, but it has come at a cost. In research operations alone, multiple chain reactions are felt, including lost labor hours and lost sleep for researchers who worry whether their data and story will “hold up” to client scrutiny.
It is reasonable and timely to develop a research-side quality strategy that actually complements procurement strategies. Here are a few recommended strategies to consider:
• Make a conscious decision to deploy your core asset, your team, to value-creating tasks (manual data quality reviews don’t have to be one of them).
• Develop a functional understanding of today’s sampling technologies and where the data risk lies.
• Assess current fraud threats in recent projects. Understand what’s driving those.
• Develop a fraud prevention strategy that addresses the malicious attacks now commonplace in today’s surveys.
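To make the fraud-prevention point above concrete, here is a minimal sketch of two widely used in-survey fraud signals: implausibly fast completions (“speeders”) and identical answers across a grid (“straightliners”). The function name, thresholds, and data shape are illustrative assumptions, not any vendor’s actual method.

```python
# Minimal sketch of common in-survey fraud checks.
# Assumes each response has an id, a completion time in seconds,
# and a list of numeric answers from a rating grid.
# Thresholds below are illustrative, not industry standards.

def flag_suspect_responses(responses, min_seconds=120, straightline_min_items=5):
    """Flag respondents who finish implausibly fast ('speeder') or
    give the same answer to every grid item ('straightliner')."""
    flagged = []
    for r in responses:
        reasons = []
        if r["seconds"] < min_seconds:
            reasons.append("speeder")
        answers = r["grid_answers"]
        # A straightliner picks one value for every item in a long grid.
        if len(answers) >= straightline_min_items and len(set(answers)) == 1:
            reasons.append("straightliner")
        if reasons:
            flagged.append((r["id"], reasons))
    return flagged

sample = [
    {"id": "r1", "seconds": 45,  "grid_answers": [3, 3, 3, 3, 3]},
    {"id": "r2", "seconds": 300, "grid_answers": [1, 4, 2, 5, 3]},
    {"id": "r3", "seconds": 400, "grid_answers": [5, 5, 5, 5, 5]},
]
print(flag_suspect_responses(sample))
# → [('r1', ['speeder', 'straightliner']), ('r3', ['straightliner'])]
```

In practice these simple heuristics are only a starting point; a full fraud-prevention strategy would layer in device fingerprinting, duplicate detection, and open-end review.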
I concede this effort will require some education of your teams and exploration of new technologies and processes. However, the decision will pay for itself many times over as your projects move faster, your team is deployed more strategically, and your clients reap the rewards of more accurate insights.
Let us know how we can help.