By: Bob Fawson, EVP Business Operations, Dynata
Today’s highly competitive marketplace demands that market research agencies deliver meaningful insights to their clients at the speed of their business. But speed is not enough; it’s imperative to provide insights that are not only fast but accurate, backed by high-quality data. This means foundational data quality principles – survey design and delivery tactics that emphasize engagement, the latest data fraud prevention practices, and a “fit for purpose” approach – are essential to creating a streamlined approach capable of delivering fast, world-class insights time and time again. Failing to do so risks poor-quality insights, having to repeat the research and a lower return on client delivery.
Let’s examine each of these aspects more closely to understand how they can become part of your foundation for better research, insights and delivery:
Four Essential Survey Design Practices for High-Quality Data
- Ensure your screener has been adequately designed to screen the right audience in and keep the wrong participants out. This simple yet vital step can often be overlooked, but it is fundamental to ensuring you’re reaching the right audience for your clients. To test your screener, look at the demographics of your client’s audience and walk through the screener to double-check that the questions accurately narrow in on those demographics and that there are no loopholes.
- Create an engaging experience by keeping the survey under 15 minutes, since data quality tends to deteriorate beyond that point.
- Design the survey to be compatible across all devices and curate the content to match the audience you are targeting to maximize engagement.
- Use measurement and time periods that won’t overtax participants’ memories or encourage wild guesses that may steer your findings in the wrong direction.
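To illustrate the screener principle above, here is a minimal, hypothetical sketch of how a screener’s qualification logic might be walked through and tested for loopholes. The criteria, field names and function are illustrative assumptions, not part of any Dynata product:

```python
# Hypothetical screener logic: qualify participants whose answers match
# the target audience and screen everyone else out. All criteria below
# are illustrative assumptions, not real study requirements.

TARGET = {
    "age_range": (25, 54),         # target demographic
    "countries": {"US", "CA"},     # in-scope markets
    "owns_category_product": True  # category-usage requirement
}

def qualifies(answers: dict) -> bool:
    """Return True if the participant should be screened in."""
    lo, hi = TARGET["age_range"]
    if not (lo <= answers.get("age", -1) <= hi):
        return False
    if answers.get("country") not in TARGET["countries"]:
        return False
    # Require explicit category usage so off-target or incomplete
    # responses cannot slip through as a loophole.
    return answers.get("owns_category_product") is True

# Walk test cases through the screener to check for loopholes:
print(qualifies({"age": 30, "country": "US", "owns_category_product": True}))  # screened in
print(qualifies({"age": 30, "country": "US"}))  # missing answer -> screened out
```

Walking edge cases through the logic this way (missing answers, out-of-range ages, off-target markets) is the code-level analogue of manually stepping through the screener against your client’s audience profile.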
Streamline Quality Control with Fraud Detection & Prevention Technology
Online sample fraud occurs when a participant provides false information to qualify for a survey to gain a reward. This can occur at multiple stages in the research process, such as at enrollment and within the survey itself.
The following technologies should be considered to protect your research from fraud:
- At enrollment – tools such as Imperium’s new RegGuard® tool, along with Real Mail™, Verity® and RelevantID® controls, ThreatMetrix®, device and IP anomaly and reputation checks, plus open-end engagement tests analyzed via machine learning, use multiple data points to confirm identity and look for unlikely patterns.
- In the survey router – digital fingerprinting, geolocation clues and a second round of the checks used at enrollment confirm identity and identify suspicious behavior.
- Within the survey – encrypted end links, customer-concern feedback links and the Imperium quality score, plus a new quality management platform that evaluates performance and behavior inside the survey.
- Advanced techniques for rare targets – including asking people to describe their job in their own words, leveraging machine learning to sort them into relevant categories and confirming their identity via publicly available information about professional credentials.
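As a rough illustration of one idea behind these checks – flagging repeat entries via a device “fingerprint” – here is a minimal sketch. Real tools such as RelevantID® and ThreatMetrix® combine far more signals; the attributes and function names here are assumptions for illustration only:

```python
# Illustrative sketch of duplicate detection via a simple device
# fingerprint hash. Production fraud-prevention tools combine many
# more signals; the session fields below are assumptions.
import hashlib

def fingerprint(session: dict) -> str:
    """Hash a few device attributes into a stable identifier."""
    raw = "|".join(str(session.get(k, "")) for k in
                   ("user_agent", "screen", "timezone", "ip_subnet"))
    return hashlib.sha256(raw.encode()).hexdigest()

def flag_duplicates(sessions: list) -> list:
    """Return indices of sessions whose fingerprint was already seen."""
    seen, flagged = set(), []
    for i, s in enumerate(sessions):
        fp = fingerprint(s)
        if fp in seen:
            flagged.append(i)  # likely the same device re-entering
        seen.add(fp)
    return flagged

sessions = [
    {"user_agent": "UA1", "screen": "1920x1080", "timezone": "UTC-5", "ip_subnet": "10.0.1"},
    {"user_agent": "UA2", "screen": "1366x768",  "timezone": "UTC+1", "ip_subnet": "10.0.2"},
    {"user_agent": "UA1", "screen": "1920x1080", "timezone": "UTC-5", "ip_subnet": "10.0.1"},
]
print(flag_duplicates(sessions))  # third session matches the first -> [2]
```

A hash match alone is not proof of fraud – households legitimately share devices – which is why the technologies above layer identity confirmation, reputation data and in-survey behavior on top of simple anomaly checks.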
By leveraging these technologies, your business can:
- Automate fraud detection to optimize efficacy and provide faster results
- Power accurate insights for clients via a quality assured process
- Save time, costs and impact on resources
- Increase revenue and maximize return
How a ‘Quality Fit for Purpose’ Approach Can Increase Speed and Avoid Wastage
When defining the scope of a project, quality fit for purpose should be considered to avoid wasting resources on quality controls that may not serve the goals of the project – for example, the speed at which the research needs to be delivered, the client’s budget, and/or the audience. Strict quality controls are best applied to studies that require niche audiences with specialist knowledge rather than broad population studies.
Tradeoffs should also be considered when building a plan to deliver a project on time and within budget. Participants are often reluctant to join surveys with stringent quality controls, in which case a tradeoff should be made. For example, many people won’t share their date of birth, but almost all will share the year they were born. Other tradeoffs include restrictive quality controls vs. reach and representativeness, price vs. source, and precision vs. scale.
How a Partnership with Dynata Will Accelerate Your Business
Selecting the right data partner is essential for delivering accurate insights to your clients and accelerating your business’s growth. A partnership with Dynata will power better research and delivery through streamlined data quality controls and practices – increasing your speed of insights and improving the accuracy of your research.
Dynata is committed to enhancing and investing in data quality technologies and practices to ensure your clients benefit from high-quality insights.
To achieve this, Dynata is taking the following actions:
- Making an incremental investment of $20 million in panel quality and capacity in 2021.
- Offering transparency to help market research agencies understand how sampling works today and identify the best solutions for their clients’ projects.
- Bringing a data-driven approach to sample efficiency and the participant experience and sharing data with the industry.
- Designing more accurate feasibility tools.
- Giving participants more visibility, choice and control in their survey-taking experience.
- Partnering with you and your clients to make a difference in quality outcomes.
- Committing to meet the industry’s demand for reliable data to answer critical questions.
These actions allow us to continually enhance and build upon the high standard of quality, assurance and trust our clients depend on to deliver world-class insights at the changing speed of business.
About the Author
Bob Fawson, EVP, Business Operations, Dynata: Bob oversees Dynata’s data and research business, running the largest first-party panel operation in the industry. Bob is an expert in ensuring global consistency and integrity of data, from its acquisition to its distribution, and has extensive experience managing panel partnerships, data quality, and survey programming.