Blog | July 6, 2020
Online market researchers were conditioned years ago to equate the sample vendor selection process with data quality vetting. In today’s automated market, however, sample firms face the daily challenge of balancing their business demands against standardized data quality practices. Now is the time for every market-leading research firm to develop a data quality strategy that proactively addresses today’s complex threats.
For 13 years or more, data quality in online surveys was tied directly to the quality of the panel community supplying the sample. Industry associations and market leaders published clear standards for what constituted a legitimate sample, and most companies followed the same script, sometimes literally in the form of an intake questionnaire. Common approaches included asking vendors about recruitment methods, address verification norms, and panelist engagement and activity levels. These measures were “table stakes” for the leading online panel firms, and industry tools like the ESOMAR 28 helped the more discerning buyers go even deeper. Generally, the thought process was: if I vet a panel carefully enough, I shouldn’t have any data problems. For years, that approach worked.
As capital flowed in from the investment community, online panel firms used that cash to evolve their business models toward today’s market realities. Massive technology upgrades had a demonstrable impact on every facet of online data collection: sample prices came down, harder projects became feasible, sample sizes grew larger, and projects finished faster. But alongside these benefits came more complicated changes. Investment in proprietary panels shrank; source blending became the standard; recruitment standards lowered; validation measures faded. The net result was a broad decline in quality and a permanent relaxation of the old norms.
Automation has helped our industry, but it has come at a cost. Within research operations alone, the chain reactions include lost labor hours and lost sleep for those who worry whether their data and story will “hold up” to client scrutiny.
It is reasonable and timely to develop a research-side quality strategy that complements your procurement strategy. Here are a few recommendations to consider:
• Make a conscious decision to deploy your core asset, your team, to value-creating tasks (manual data quality reviews don’t have to be one of them).
• Develop a functional understanding of today’s sampling technologies and where the data risk lies.
• Assess the fraud threats, and their drivers, evident in your recent projects.
• Develop a fraud prevention strategy that addresses the malicious attacks now commonplace in online surveys.
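As one illustration of what automating those data quality reviews might look like, the sketch below flags two common fraud signals: “speeders” (implausibly fast completes) and “straight-liners” (identical answers across a grid). The column names and thresholds are hypothetical assumptions for illustration, not an industry standard.

```python
# Hypothetical sketch: flag suspect survey completes before analysis.
# Field names ("duration_seconds", "grid_answers") and the 120-second
# threshold are illustrative assumptions, not a prescribed standard.

def flag_suspect_responses(responses, min_seconds=120):
    """Return IDs of completes that look like speeders or straight-liners."""
    flagged = set()
    for r in responses:
        # Speeders: finished implausibly fast for the questionnaire length.
        if r["duration_seconds"] < min_seconds:
            flagged.add(r["respondent_id"])
        # Straight-liners: the same answer to every question in a grid.
        grid = r["grid_answers"]
        if len(grid) > 1 and len(set(grid)) == 1:
            flagged.add(r["respondent_id"])
    return flagged

completes = [
    {"respondent_id": "a1", "duration_seconds": 95,  "grid_answers": [3, 3, 3, 3]},
    {"respondent_id": "b2", "duration_seconds": 410, "grid_answers": [2, 4, 1, 5]},
    {"respondent_id": "c3", "duration_seconds": 380, "grid_answers": [5, 5, 5, 5]},
]
print(sorted(flag_suspect_responses(completes)))
```

A real strategy would layer checks like these with device fingerprinting and duplicate detection, but even a simple automated screen frees the team from reviewing every complete by hand.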
I concede this effort will require some team education and some exploration of new technologies and processes. However, the decision will pay for itself many times over as your projects move faster, your team is deployed more strategically, and your clients reap the rewards of more accurate insights.
Let us know how we can help.