Part 5: Research Accessibility vs. Research Accuracy

Buying technology can be exciting. You feel as if you're creating a new future, one full of speed and time savings. Believe me, I know. However, a comprehensive evaluation should account for the full scope of what's being replaced.

In DIY Research, many users fall in love with a platform's features and functions but overlook the data quality elements that matter. They opt for simplicity and feel they've discovered a shortcut to meaningful insights at a fraction of the cost. Too often, buyers miss the major component that drives reliable research results: where the data is coming from.

For DIYers, the respondents (collectively known as "sample") are often an add-on feature, an extra click during project setup. The user pours energy into question creation or concept design, while respondent access is treated as a final detail, even though the user carries a host of unexamined expectations about it.

Commonly, DIY buyers assume they're accessing an "online panel" from a brand that's integrated into the tech. However, it's often something very different: not a double-opted-in community, but a third-party API routing to hundreds of other third-party traffic suppliers, none of whom collect even an email address.

In the DIY world, there's a hard truth: the more DIY or automated something is, the less focus the user places on sample design or data quality. The user often makes hefty assumptions while being hypnotized by features. Many sample companies have transformed from respondent-access firms into DIY Insights Platforms precisely to monetize all traffic, free from the friction of accountability. DIY is the ultimate in low scrutiny because most buyers don't even know how to evaluate quality.

Why does this matter? Because DIY Research, which has been great for the growth of the market research industry, has also led to many marketers showing up for strategy meetings ready to make decisions based on highly compromised data. Countless consumer insights departments have seen it happen.

Here are examples of what this risk can look like:

  1. Inflated self-reporting of product interest. Poor sample traffic tends to inflate reported interest in a new idea. Effect: you may be investing in a product that doesn't have the market you project.
  2. Fraud runs wild on DIY platforms. Effect: up to 30% of your traffic may not be real people at all.
  3. Sample bias may be hard-coded into your traffic flow. Effect: you get a false read on the market opportunity because you're only measuring one population group, and one that isn't well defined.

The good news is that DIY doesn't have to mean D-I-R-T-Y. You don't have to make that compromise. You can pair DIY Research Platforms with audience-access solutions that optimize for quality. Check out our "What is Data Quality" blog to learn how we build a plan for "Clean Clicks" into every survey.

Also, if you're a user of DIY Research tools and would like some extra help figuring this all out, we'd love to meet you and see if our technology or talent can prevent some pain in your insights. Visit us at to send us a note.


About OpinionRoute

We deliver accurate data by utilizing our expertise in online survey sampling and proprietary technology solutions to simplify research processes. This enables clients to scale and researchers to stay ahead in a dynamic and competitive market.

Contact Info 216-282-0793

Headquarters: Cleveland, OH

© 2024 OpinionRoute, LLC. | All Rights Reserved.