Online surveys are the future of market research, yet many marketers are not using them to their full potential. Online surveys require a different methodology, a user-centric approach and a keen understanding of technology to succeed where traditional surveys have failed. In this post, we’ll provide an all-encompassing rundown of market research and survey design: why it matters, how to conduct it and the tips you need to do it successfully.
What is market research and why does it matter?
For a business to succeed, it needs to identify and fill a ‘gap in the market’, or know how to get ahead of its competitors. To correctly identify that gap or stay ahead of the curve, a business can’t rely only on instincts and assumptions; it needs objective research that proves its value and tests those assumptions. This is where market research and survey design come into play. Market research is the process of gathering and analysing data relating to a market and/or your product or service, and it is usually done via online surveys. Data from a fully representative sample of the population and/or your target consumer is invaluable to a new or existing business. Market research doesn’t just validate your business plan for investors; it also helps a company better understand its customers, creating opportunities to improve products and services, sharpen marketing, identify new gaps in the market and stand out from the competition.
The Rise of Online Surveys
The biggest change in surveys over the last 20 years has been the drastic decline in response rates to traditional surveys. In 1997, the average response rate to a traditional survey was around 36%. By 2003, it was 25%. By 2014, it was 9%. Now, in 2022, the completion rate for a traditional survey has fallen below 2% on average. In the past, surveys were conducted by knocking on doors with a pen and paper, or by relying on phone calls and transcribing the results. A new age of social media and technology has since spurred the rise of online surveys as a replacement for failing traditional surveys. Companies first moved to surveys delivered through email, but the world has already moved past this to online and mobile alternatives.
The Quality Problem of Online Surveys
Online panels may provide a cheaper, scalable solution compared to traditional survey panels; however, there are issues to be aware of. Ensuring quality responses is the largest challenge for online panel providers: 46% of the respondents you get from online panels are disengaged, fraudulent or low quality, and few researchers do enough to weed out these poor respondents. The market research industry must start demanding more transparency from its online panels, and researchers must take personal responsibility for incorporating data quality into their research process in order to tackle these widespread quality issues. These improvements should come across the board, from the survey design to the user experience to the cleaning of the data.
Improving survey quality
There’s nothing more disheartening in research than finding out the data you’ve gathered is inaccurate or not what you expected. High-quality data is not only accurate and trustworthy but also relevant; data can be reliable yet still mean nothing to a business. Improving data quality increases the reliability of insights, reduces the cost of re-fielding and saves time in an industry already plagued by slow, expensive research processes. Luckily for market researchers, when it comes to survey data there are many simple tactics we can incorporate into the process to avoid poor data quality.
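As an illustrative sketch of one such tactic (the field names and thresholds below are our own assumptions, not an industry standard), a researcher might automatically flag speeders and straight-liners in a set of responses like this:

```python
from statistics import median

def flag_low_quality(responses, min_seconds=None):
    """Return ids of respondents who sped through the survey or straight-lined.

    `responses` is a list of dicts, each with an 'id', a 'duration' in
    seconds and a 'ratings' list of scale answers (names are illustrative).
    """
    durations = [r["duration"] for r in responses]
    # A common heuristic: flag anyone faster than a third of the median time.
    cutoff = min_seconds if min_seconds is not None else median(durations) / 3
    flagged = []
    for r in responses:
        is_speeder = r["duration"] < cutoff
        # Straight-lining: giving the identical answer on every scale question.
        is_straightliner = len(set(r["ratings"])) == 1
        if is_speeder or is_straightliner:
            flagged.append(r["id"])
    return flagged

sample = [
    {"id": "r1", "duration": 300, "ratings": [4, 2, 5, 3]},
    {"id": "r2", "duration": 45,  "ratings": [3, 3, 3, 3]},  # speeder + straight-liner
    {"id": "r3", "duration": 280, "ratings": [5, 4, 4, 2]},
]
print(flag_low_quality(sample))  # → ['r2']
```

Thresholds like these should always be sense-checked against the real distribution of completion times before any responses are discarded.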
How to conduct market research
There are many steps to conducting market research, but the following are the most important:
- Set clear objectives
- Conduct secondary research
- Identify your target audience
- Choose a sample size
- Create your research questions
- Conduct your survey
- Analyse the results
Each of these steps can be broken down further, and each has its own nuances.
How to build effective surveys
In the past, surveys were conducted in person, by knocking on doors with a pen and paper, or over the phone. More recently, companies have moved away from these traditional methods to surveys delivered through email. However, as is often the case in the technology industry, the world has already moved on. The average person in Ireland spends over 4 hours on their smartphone every day; we no longer rely on computers to access the internet, and so online has moved to mobile, completely changing the survey experience for the user.
The first difference that must be factored into an online survey compared to a traditional one is that respondents have drastically lower attention spans. Mobile phones bombard the respondent with constant push notifications, and our attention spans have shrunk as a result, so we cannot expect respondents to stick around and complete a survey longer than 10 minutes. When creating your survey, you need to make sure each question counts and is relevant to your final goal. Ideally, your survey should be between 4 and 8 minutes long and have 20 to 30 questions, depending on their complexity.
Another aspect to focus on when building an effective online survey is logical flow. The survey should have straightforward questions that help each respondent understand their reason for being part of the research. Pique their interest and group similar topics together, making the survey conversational and enjoyable to go through. A golden rule for the survey builder is to keep it simple and place yourself in the shoes of the respondent.
Improving Respondent Experiences
Despite advancements in technology and the failures of traditional surveys, the market research industry has been extremely slow to adapt to online surveys. Even when they are used, they suffer from the same afflictions as traditional surveys. At Bounce, we have identified 4 key problems respondents encounter:
- Email Invitation: The dependence on email invitations among market researchers means that they are missing out on a whole demographic of Millennial and Gen-Z consumers who favour their smartphones over email.
- Profiling Questions & Screening Out: Many researchers ask the same monotonous screening questions at the beginning of every survey. When survey length should be kept to a minimum, spending extra time answering screening questions not relevant to the data being collected is a cardinal sin of research that must change.
- Survey Quality: Surveys are often not designed with respondents in mind, forcing them to suffer through long surveys and complicated questions. A good rule of thumb is to test the survey yourself: if you feel your attention dropping, a respondent’s attention will more than likely drop too. This has an overall effect on data quality and on the decisions you make off the back of that data.
- Incentives & Rewards: Unachievable rewards are detrimental to respondent engagement. Smaller, instantly delivered incentives keep respondents coming back.
For online surveys to be any different from their traditional predecessors, they must prioritize the respondent experience, which is something we champion here at Bounce Insights.
Term to Know
By definition, a consumer panel is a group of individuals selected by a business or organisation to provide input on products and services for research purposes. The activities of consumer panels can range from focus groups to answering surveys and questionnaires. It is fundamentally a standard data collection technique for a business’s market research. The consumers in a panel can represent a cross-section of the population, but more likely, they reflect your target audience. It all depends on a company’s target customer and the purpose of the research.
What makes consumer panels different from posting a survey on your business’s social media or sending it to your mailing list is research quality and avoiding confirmation bias. Using a consumer panel allows you to put questions to a fully nationally representative sample, or to a wider sample of your specific target consumer. Your insights and data will be more reliable, since you’ll have gathered a more rounded response than if you had used your own audience.
Tips For Building Online Surveys
The first thing to factor in, as noted above, is respondents’ drastically lower attention spans: make each question count, keep the survey to roughly 4 to 8 minutes and 20 to 30 questions, and don’t expect anyone to stick around beyond 10 minutes. Our Account Management Team is on hand to help you review your questions and ensure that your survey is meeting your objectives.
Logical flow matters just as much: keep it simple, group similar topics together and place yourself in the shoes of the respondent. Respondents will lose interest if they are continually asked questions that are not relevant to them; this is where skip logic and advanced branching logic become helpful in ensuring that your survey flows correctly.
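As a minimal sketch of how skip logic works (the question ids, text and rules here are invented for illustration), each answer determines which question the respondent sees next, so irrelevant questions are skipped automatically:

```python
# Each rule maps an answer on the current question to the next question id;
# "default" covers any other answer. Question ids and wording are hypothetical.
SURVEY = {
    "q1": {"text": "Do you drink coffee?", "next": {"Yes": "q2", "default": "q3"}},
    "q2": {"text": "How many cups per day?", "next": {"default": "q3"}},
    "q3": {"text": "Any other comments?", "next": {}},
}

def next_question(current_id, answer):
    """Return the id of the next question a respondent should see, or None at the end."""
    rules = SURVEY[current_id]["next"]
    return rules.get(answer, rules.get("default"))

print(next_question("q1", "Yes"))  # → q2 (coffee drinkers get the follow-up)
print(next_question("q1", "No"))   # → q3 (everyone else skips straight ahead)
```

Most survey platforms offer this kind of branching through their builder UI; the point of the sketch is simply that routing is driven by the respondent’s own answers, not a fixed question order.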
Projective techniques, also known as enabling techniques, are methods that can be used by skilled researchers to tap into participants’ deep motivations and attitudes. Qualitative market research has always used projective and enabling techniques for in-depth work. The rationale is to help people surface and discuss things that lie beyond their immediate conscious awareness, yet influence their behaviour. Understanding and identifying these emotional drivers is a difficult and inexact science. However, it can be aided by a variety of projective techniques, such as, but not limited to, word and imagery associations, choice ordering, and grouping.
One of the most powerful tools available to any digital marketer in the 21st century is the push notification. In recent years, consumer psychology has fundamentally changed under the influence of the smartphone, and more specifically the mobile applications that ‘nudge’ and ‘trigger’ repeat engagement via push notifications sent directly to the device. Utilising push notifications helps us gather real-time results as quickly as possible.
For the next time you come across an unfamiliar phrase or concept, we’ve created a glossary of the most commonly used terms in market research!
Overall, it’s important that as society and technology change, research and research methodologies change with them. Here at Bounce, we are making sure to merge traditional research techniques with new and innovative technologies! For more information, please feel free to contact our Account Management team at email@example.com
Market Research Glossary
A - C
Acquiescence bias: This occurs when respondents agree with all questions within the survey, even if they are contradictory.
Ambiguous question: A question with no specific query, meaning a respondent will struggle to answer, not because they are a bad respondent but because it is a bad question.
Answer order bias: Respondents are often biased toward answer options higher on the list in a quantitative survey question. Randomly shuffling the answer options for each respondent reduces this bias.
Closed-ended question: A question that offers a limited selection of answer options to choose from.
Conditional logic: Allows you to create dynamic surveys that change what a respondent sees based on their responses, e.g. a respondent who answers ‘yes’ to a certain question is shown a different follow-up question to a respondent who answered ‘no’.
Consumer panel: A group of individuals selected by a business or organisation to provide input on products and services for research purposes.
Cost per complete: The price you pay per completed survey. This calculation is based on the number of respondents, the targeting criteria and more.
Courtesy bias: A type of response bias that occurs when respondents do not fully state their unhappiness with a service or product, in an attempt to be polite toward the researcher.
Cross-tabulation: Segmenting respondents into subgroups based on their targeting criteria or on specific responses, in order to compare and analyse the results by those subgroups.
Customer-centricity: A way of doing business that fosters a positive customer experience at every stage of the customer journey, building the customer loyalty and satisfaction that lead to long-lasting growth.
D - K
Data cleaning: Removing unqualified, biased or incomplete responses from a survey. This process improves data quality and protects against survey bias.
Date/time question: A question allowing respondents to input a single date/time, or a range, as their response.
Demand bias: Arises from respondents being influenced simply by being part of the research, particularly when they are involved in longitudinal surveys or research communities.
Dichotomous question: A question where there can be only two answers, commonly ‘yes’ or ‘no’.
Double-barrelled question: Occurs when researchers blend two questions into one and then allow for only one answer. They are a form of ambiguous question and will cause inaccurate research.
Email survey: Any survey sent to targeted respondents via email.
Extreme response bias: Occurs when respondents provide extreme answers, whether positive or negative.
Image choice question: A question allowing respondents to select one image from many options. This question type improves respondents’ survey experience and gives a break from answering textual questions. They are ideal for gathering research on branding and advertising choices.
L - O
Leading question: A question using biased language to subtly prompt the respondent to answer in a particular way.
Likert scale: A single choice question that uses a 5-point scale ranging from one extreme attitude to another.
Loaded question: A question that makes assumptions, which the respondent may not be able to answer accurately.
Longitudinal study: Researchers performing a longitudinal study run the same survey many times over short or long periods, to observe how the opinions, behaviours or habits of the same population change over time.
Market Research Online Community (MROC): A closed network of profiled, opted-in research participants who take part in structured and unstructured qualitative research tasks.
Matrix question: A closed-ended question that asks respondents to evaluate one or more row items using the same set of column choices.
Mobile survey: Any survey conducted through a mobile app.
Multiple choice question: A question providing a list of answer options and asking respondents to select all that apply.
Nationally representative sample: A sample that is representative of the national population by 1–3 attributes, generally age, gender and region. In this regard, some component of the sample mirrors the population (based on the census).
Numerical question: A question that requires a numeric answer. For example, researchers may ask how much money you’d potentially pay for a product or service.
Open-ended question: A question requiring respondents to type their response into a text box. Open-ended questions gather qualitative responses, capturing the respondent’s answer in their own words.
P - R
Paper survey: Any survey where the initial dataset is collected using pen and paper rather than electronic devices.
Ranking question: A question which asks respondents to order answer choices by preference.
Projective techniques: Also known as enabling techniques, these are methods that can be used by researchers to tap into respondents’ deep motivations and attitudes.
Qualitative research: Survey questions that aim to gather data which is not easily quantified, such as attitudes, habits and challenges, aiming to understand the ‘why’.
Quantitative research: Research that collects information which can be expressed numerically.
Question order bias: A bias that occurs when the initial questions of your survey influence the answers your respondents give to subsequent questions later in the survey.
Questionnaire: The list of questions you plan to ask your respondents.
Rating scale question: A question displaying a scale of answer options over any range (1 to 3, 1 to 10, etc.).
Respondent: A person within your sample who completes your survey.
Response bias: A general term for when respondents answer questions inaccurately or falsely; it covers a wide range of effects and influences.
Response rate: The number of people who answered the survey divided by the number of people in the sample.
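As a quick worked example (the numbers are made up for illustration), the calculation is simply completes divided by invitations:

```python
def response_rate(completed, sampled):
    """Response rate as a percentage of the people invited to take the survey."""
    return 100 * completed / sampled

# 180 completes from a sample of 1,000 invitations:
print(f"{response_rate(180, 1000):.1f}%")  # → 18.0%
```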
S - Z
Screening question: A question that either qualifies or disqualifies respondents from taking your survey, depending on how they answer; also known as screeners.
Selection bias: Selection bias, or sampling bias, occurs when you only capture responses from a certain segment of your audience, skewing your results.
Single choice question: A question with a list of answer options, from which respondents may choose one answer.
Slider question: A question which allows respondents to select a value on a numerical scale, ideal for responses that require a percentage or monetary value.
Social desirability bias: Linked with acquiescence bias, this occurs because respondents want to be perceived in their best light. Respondents may exaggerate their habits, beliefs and personal preferences to appear more socially attractive, even when surveys are anonymous.
Speeders: Respondents who complete surveys far too fast to have actually read the questions or truly contemplated their answers. Speeders reduce the quality of the survey data and should be removed.
Survey goal: The primary aim of the survey; essentially what the researcher wants to know and why they need a survey. A goal is not strictly measurable and tangible.
Survey objectives: Objectives are more specific and measurable than a survey goal; they break down the steps to take in order to achieve it.
Telephone survey: Also known as CATI, or computer-assisted telephone interviewing, a research method where the researcher surveys respondents over the telephone. Unlike email surveys, researchers collect the data by conducting phone interviews and entering the responses themselves.
Weighting: A type of stratification where quotas are used to weight respondent pools however the researcher would like, e.g. a researcher weights their survey so 40% of respondents are aged 18–24, 30% are aged 25–34 and 30% are aged 35+.