How To Design A Survey

In this blog, we’ll provide an all-encompassing rundown of survey design, including an explanation of why it is important, how to design a survey effectively and all the tips you need to know to do it successfully. So let’s get started!

Below are the steps we follow throughout the survey design process, from concept to completion. You can follow these steps when designing your own survey.

  1. Identify your survey objectives
  2. Choose your audience targeting criteria
  3. Decide on a sample size
  4. Create your research questions
  5. Implement structure and flow
  6. Launch your survey
  7. Analyse your survey results

Identify your survey objectives

Breaking Down Goals and Objectives

Before tackling any objectives, an easy first step is to set a goal for your survey. This is your primary aim for the survey, essentially what you want to know and why you need a survey. A goal is not strictly measurable and tangible. Your objectives for your survey are more specific and they break down the steps to take in order to achieve the survey goal.

Let’s use an example

You are a company that has a large customer base and there are currently no viable opportunities for expanding that customer base into new markets. You are concerned with keeping your existing customers as well as poaching potential customers from your competitors. You want market research to validate your strategy to drive customer loyalty among existing customers.

Your Goal: To understand what drives customer loyalty

Your Objectives:

  1. To determine the percentage of the current customer base that is likely to purchase our product again over the next 6 months.
  2. To assess the level of customer loyalty towards our competitors.
  3. To describe what unique need our products are filling that leads to increased customer loyalty.
  4. To explore marketing factors that influence customer loyalty.

How to set survey goals and objectives?

Sometimes it can be pretty easy to set a survey goal; other times, it requires looking at the bigger picture and asking yourself some questions.

What is the subject matter? This is the general field that you are concerned with. Examples of subject matter include the product your company offers, your brand, customer experience, your competitors, etc.

What do I want to know? This is a pretty straightforward question to ask yourself, and the more specific the answer, the better, e.g. satisfaction with your customer support, the effectiveness of your digital marketing, etc. It is also worth following up with the question “What do I not know?”. Sometimes the answers to these two questions are the same; other times you want to know something without having enough foundational information to actually learn it.

Who should I ask? Do you want to ask existing customers? Is there a specific demographic that you want to hear from? Is there a specific demographic that you don’t want to hear from? Here at Bounce, we can tailor your audience for you. We can target based on the usual criteria, like age, gender or location, but if you only want to ask people who use your brand, or who exercise more than twice a week, we can specifically target those people for you too.

What do I want to be able to accomplish when I have my final results? This is a very useful question to ask yourself when it comes to your objectives. Knowing what you want to achieve at the end of the process helps you identify the practical steps needed for your objectives.

Aim to have two to five objectives for your survey. Have at least one objective that is concerned with quantitative research, e.g. what percentage of our customer base have visited our website. Your objectives can involve qualitative research but be as specific as possible about what you want to know. Objectives should follow the specific, measurable, achievable, relevant and time-bound (SMART) rule.

Why do you need a survey goal and objectives?

Defining the survey goal sharpens the focus on what you are trying to achieve and thus is of great importance to survey design. Survey objectives can also prevent ambiguous questions, which can ruin the reliability and usefulness of a survey. An ambiguous question is defined as one where there is no specific query. Examples of an ambiguous question include a question that could have more than one meaning, asking for several responses, or not clearly defining the subject/object. If a question doesn’t specifically relate to one of your survey objectives, then you need to get rid of it. If you, the researcher, don’t know exactly what you are asking your respondents, then your respondents definitely won’t know how to answer your question.

Objectives also provide a framework for asking the right survey questions. Often, pre-setting objectives creates a template for researchers: depending on the objectives, researchers know how many questions to ask and what type of questions to ask, and can create a better logical flow for the survey. Following your objectives also helps you craft better questions that allow respondents to give higher-quality responses. Furthermore, sticking to your objectives keeps your survey short and concise. Bad survey design is part of the reason that response rates within the industry are so low; by improving your survey design and sticking to your objectives, respondents are more likely to complete the survey and have a positive experience with your brand.

Throughout the survey process, you should keep your goal and objectives in mind. Don’t stray from them, otherwise, you risk getting responses back that don’t actually provide you with the information you need to achieve your goal. Objectives help keep you focused when writing your surveys and when you have finished your survey design, you should feel happy that each question within your survey helps achieve one of your objectives. Most importantly, resist the urge to dive head-first into designing your survey. Your goal and objectives need to be clearly outlined first so that you know exactly what you want out of your survey.

Choose your audience targeting criteria

Different audiences have different characteristics and may answer questions differently. It’s important to narrow your scope to get the most out of your research. Identifying your target audience is essential to creating a good survey. This is why you should think about your target audience before even beginning your survey design. The characteristics of your customer base may be a good guide to who you should target, but first and foremost, your target audience should depend upon your research objectives.

Basic Targeting Criteria

Many panel providers have pre-defined targeting criteria that you can use to send your survey directly to the respondents you want. Understanding these criteria can save you time and money when you create a survey. The range of pre-defined targeting criteria is dependent on the specific panel provider, however, below we go through the basic targeting criteria you should be aware of.

Age & Gender – The most basic targeting is age and gender. You probably have a general sense of these two categories. If your brand’s main customer is young people, targeting only 18 to 35-year-olds helps you save money on survey costs and allows you to focus on the people that matter.

Location – Another common targeting category is location. Not only does location include the county the respondent is living in but it can also include whether they live in a rural, suburban or urban setting which can be an incredibly insightful factor for brands and agencies.

Income & Education Level – Many panel providers, including Bounce Insights, have the ability to target respondents in a certain income range or depending on their education or job level.

Employment & Relationship Status – Similar to income level, employment or relationship status can be extremely useful for certain brands or companies that have niche markets.

Specific Targeting – Outside of basic demographics, many panel providers have more specific behavioural targeting, such as coffee or alcohol drinkers. At Bounce Insights, we have dynamic targeting which keeps an up-to-date profile of every user within our app based on their responses over time.

If you desire a bespoke audience that pre-defined targeting criteria can’t cover, screening questions may be the answer.

Screening Questions

Pre-defined targeting criteria may not always fulfil your research needs, so additional filtering is needed. You can do that using a screening question. A screening question, when worded properly, will disqualify those respondents who may be a fit from a demographic standpoint but aren’t an exact fit for your specific research needs. Screening questions do increase the size of your survey so avoid using multiple screening questions.

An example of a screening question is where your product is only going to be used by a particular group. When an alcohol brand wants to launch a new alcoholic drink, they will only want to put their questions to people who drink. Our platform allows us to capture those people by simply asking them if they drink that brand or type of alcohol.
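To make the mechanics concrete, below is a small, illustrative sketch of how a screening rule might disqualify respondents. It is not the Bounce platform’s implementation; the question, answer options and rule are hypothetical.

```python
# Illustrative screening rule only; not how the Bounce platform implements screening.
# Question text, options and the disqualification rule are hypothetical.

SCREENER = {
    "question": "Which of these drinks have you purchased in the last month?",
    "options": ["Beer", "Wine", "Spirits", "None of the above"],
    "disqualify_if": {"None of the above"},  # non-drinkers exit the survey here
}

def passes_screener(selected_options):
    """Return True if the respondent should continue to the main survey."""
    return not (set(selected_options) & SCREENER["disqualify_if"])

print(passes_screener({"Beer", "Wine"}))       # True  -> continue to the survey
print(passes_screener({"None of the above"}))  # False -> screened out
```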

Decide on a sample size

After defining your targeting, you need to actually settle on the size of your audience. But why is the sample size important? Even though you are interested in the entire target audience, aka the population, you realistically can’t get answers from everyone. Therefore, you take a sample of the population, however, the size of the sample is very important for getting accurate, statistically significant results and running your study successfully.

If your sample is too small, you may include a disproportionate number of individuals who are outliers or anomalies. These skew the results, and you won’t get a fair picture of the whole population. If the sample is too big, the whole study becomes complex, expensive and time-consuming to run, and although the results are more accurate, the benefits don’t outweigh the costs. If you are unsure about sample sizes, there are plenty of online calculators that can guide your decision.
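If you would rather understand the arithmetic behind those calculators, most of them use Cochran’s formula with a finite-population correction. Here is a minimal Python sketch, assuming a 95% confidence level (z = 1.96), a 5% margin of error and the most conservative proportion of 0.5; the population figure is a made-up example.

```python
import math

def sample_size(population, confidence_z=1.96, margin_of_error=0.05, proportion=0.5):
    """Cochran's formula with a finite-population correction."""
    n0 = (confidence_z ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# e.g. a customer base of 20,000 people at 95% confidence and a ±5% margin of error
print(sample_size(20_000))  # roughly 377 respondents
```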

Sample Balancing

An equally important factor in survey building that is often overlooked is sample balancing. Balancing means deciding what proportion of your audience is made up of a specific cohort. For example, you might want 50% of your audience to be located in a certain area, with the other 50% spread across a wider region. However, you should be careful while balancing – it is often best to stick with a nationally representative sample, or as close as you can get to one, unless your research needs are bespoke to a niche set of audiences.
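As a simple illustration of how balancing quotas translate into respondent counts, here is a minimal sketch; the 50/50 split and the total sample size are made-up numbers.

```python
# Made-up quotas for illustration; not the Bounce platform's balancing logic.
total_sample = 500
quotas = {"Dublin": 0.50, "Rest of Ireland": 0.50}  # hypothetical 50/50 split

targets = {segment: round(total_sample * share) for segment, share in quotas.items()}
assert sum(targets.values()) == total_sample  # check the quotas add up to the full sample
print(targets)  # {'Dublin': 250, 'Rest of Ireland': 250}
```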

Create your research questions

Once you have decided your goal and target audience for your survey, it’s time to get writing those questions. Here at Bounce, we offer a helping hand when it comes to survey design. If you wish to contact us, you can do so from our website.

There are many different types of questions you can ask your audience, and we find that the best way to decide what type of question to ask, or how to ask it, is to put yourself in the shoes of the audience themselves. If you were asked this question, how would you answer it? If your mother were asked the same question, how would she answer? The most important thing to remember is that when you ask the right questions, you will get the right answers.

Before we explore the types of questions you can ask on the Bounce platform, we need to understand the difference between closed-ended and open-ended questions.

Closed vs Open-Ended Questions

One of the first lessons for new survey designers is learning the difference between open-ended questions and closed-ended questions. The difference is straightforward: a closed-ended question includes a predefined list of answer options, while an open-ended question asks the respondent to provide an answer in their own words.

Another way to look at this is that closed-ended questions provide quantitative data, while open-ended questions provide qualitative data. There may be some odd cases where a closed-ended question has an open-ended component, e.g. an “other” option in a list of predefined responses that allows users to write their own response. Overall, however, you should be able to easily categorise your survey questions as open-ended or closed-ended.

One of the main advantages of including closed-ended questions in your survey design is the ease of performing statistical analysis. These questions are ideal for calculating statistical data and percentages, as the answer set is known. Furthermore, respondents are more likely to answer a closed-ended question, as they are quick and require less effort.

An ideal questionnaire includes an open-ended question at the end that seeks feedback and/or suggestions for improvement from respondents. By including open-format questions in your questionnaire, you can get insightful and even unexpected suggestions.

The Advantages and the Disadvantages of Open-Ended Questions

Advantages
Open-ended questions can provide more holistic and comprehensive data than closed-ended questions. The data is more diverse and unexpected, as users can put their own mark on their responses, and therefore your research can identify more nuanced takes.

They empower the respondent. Respondents can often feel frustrated and confused by surveys if they have to select responses for closed-ended questions that don’t reflect their actual opinion or experience. Open-ended questions avoid this, allowing respondents to properly articulate their thoughts. Many respondents find this freeing and enjoy using their voice to give feedback and air grievances.

Respondents can provide a lot more data in open-ended questions. This is especially true if there is no limit on the text boxes used in the survey. Some respondents will go on to write long paragraphs, providing a huge amount of understanding. However, this data is qualitative, not quantitative.

Disadvantages
Open-ended questions often take more time and effort for the respondent to complete than closed-ended questions. Firstly, the response is entirely respondent-generated, as opposed to closed-ended questions, where the responses are generated by the survey designer, so respondents have to put more thought into the answer than normal. Additionally, responses take more time to input, since respondents are most likely typing the response out rather than pressing a button. More time spent on questions means longer surveys, which will increase drop-off and decrease your response rate.

Analysing responses to open-ended questions can take more time and effort than regular closed-ended responses. While closed-ended responses can easily and quickly be combined and presented in charts and metrics, open-ended responses either need to be viewed individually or put through a text analysis tool, which carries a higher chance of misinterpretation.

Open-ended questions can be more subjective than closed-ended questions. This has two impacts. Firstly, respondents may give a response that doesn’t actually answer the question. This can be avoided by improving the phrasing of your question and through testing. Secondly, the researcher analysing the data could misinterpret the sentiment the respondent was expressing. This is because open-ended responses are difficult to compare to one another, and it is easy to misidentify the tone and context the respondent was providing.

When should you use Open-Ended Questions?

A general rule of thumb is to use open-ended questions sparingly. As mentioned above, open-ended questions take longer for respondents to answer, and therefore if you are trying to keep your survey short, which all researchers should, you should aim to have as few open-ended questions as possible. Additionally, the fewer open-ended questions you have to analyse, the easier and quicker it will be to analyse the data collected.

There are also two types of open-ended questions: supplementary and freestanding. Supplementary open-ended questions relate to another question, usually a closed-ended one. For example, after the closed-ended question “Please rate our product”, you may follow up with the open-ended question “What was the reasoning behind your rating?”. In deciding whether you want a supplementary open-ended question, ask yourself how much nuance you are getting from the initial question; if you are not satisfied, go with a follow-up open-ended question.

Freestanding questions are open-ended questions that don’t directly relate to another question within the survey. A basic guideline is one freestanding open-ended question per survey, unless you have a strong justification to add more. Strong justification means that you would not be able to answer the question you want to ask without an open-ended question. Sometimes you may have very good reasons for open-ended questions, such as wanting respondents to highlight opportunities that you may have otherwise overlooked. Another reason to use open-ended questions is when you are dealing with complex or in-depth topics and responses.

Tips for using Open-Ended Questions

Analysing responses to open-ended questions can be time-consuming if you don’t know what you are looking for. Viewing 1,000 responses individually is not the best use of your time, so consider using a text analysis tool, such as the word cloud feature on the Bounce Dashboard. Word clouds allow you to get a sense of the most common sentiments, so you can then focus on those themes when reading through the actual responses.
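If you are working outside of a dashboard, a simple word-frequency count gets you most of the way to a word cloud. The sketch below is illustrative only; the responses and the stopword list are made up.

```python
import re
from collections import Counter

# Hypothetical open-text responses; in practice these would come from your survey export.
responses = [
    "Delivery was fast but the packaging was damaged",
    "Great value, fast delivery",
    "Packaging could be better",
]

stopwords = {"the", "was", "but", "be", "could"}
words = [w for r in responses
         for w in re.findall(r"[a-z']+", r.lower())
         if w not in stopwords]

# The most common words point you to the themes worth reading in full.
print(Counter(words).most_common(5))
# [('delivery', 2), ('fast', 2), ('packaging', 2), ('damaged', 1), ('great', 1)]
```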

Before setting any open-ended question, brainstorm possible closed-ended questions that could replace it. Even if you don’t replace it, you can use a closed-ended question to complement the open-ended one.

Ensure your open-ended questions are not double-barrelled questions. A double-barrelled question, i.e. one that asks more than one thing at a time, is a common pitfall when you are trying to reduce the number of open-ended questions by bundling them together. However, this just causes confusion for respondents and reduces the quality of the answers.

Types of questions

Now that you understand the difference between open vs closed questions, below we have listed the types of questions you can ask on the Bounce platform.

Open-text: This will allow the respondent to enter their own answer in their own words.

Dichotomous questions: The most common dichotomous question is the Yes/No question. Dichotomous questions are useful for screening respondents and creating conditional paths in your survey. Additionally, they are quick and easy for respondents to answer (if framed correctly).

Single choice: These are best used when trying to determine a respondent’s primary preference. Like a dichotomous question, they are quick and easy for a respondent to complete, especially on mobile. Single choice questions are also used for Likert scale questions, which we cover in the next section.

Likert Scale Question: This is a single choice question that uses a 5-point scale ranging from one extreme attitude to another. Likert scales are widely used to measure attitudes and opinions with a greater degree of nuance than a simple yes/no question. An example of this would be determining a respondent’s level of agreement: Strongly Agree, Somewhat Agree, Neither Agree nor Disagree, Somewhat Disagree, Strongly Disagree.

Matrix: Matrix questions collect multi-dimensional data. The columns form a scale with opposites at each end, and for each option in the rows, the respondent selects one of the columns. Most often, some form of Likert scale is used in matrix questions, and the rows generally cover subtopics of a particular topic. Matrix questions are useful for reducing the length of your survey by grouping similar questions with similar answer options.

Multiple choice: Multiple choice questions are useful for gathering personal preferences from a respondent. Often, the text of the question includes the line “Select all that apply”.

Preference/Ranking: A ranking question asks respondents to order answer choices by way of preference. This allows you to not only understand how respondents feel about each answer option, but it also helps you understand each one’s relative popularity. Ranking questions are very useful for understanding the most important features/issues to a respondent and how they relate to each other, however they can take longer than the normal Likert or matrix question as respondents need to weigh the options. Therefore, you should only use a ranking question if it is necessary to know what respondents would prioritise.

Rating: These display a scale of answer options from any range (1 to 3, 1 to 10, etc.).

Slider: A slider question allows respondents to input a value on a numerical scale, ideal for responses that require a percentage or monetary value.

Image: These allow respondents to select one image from many options. This question type improves respondents’ survey experience and gives them a break from answering purely textual questions. They are ideal for gathering research on branding and advertising choices.

Date/Time: These question formats allow respondents to input a singular date/time or even a range of date/time as responses.

In-App Browser: This will allow you to link to another website, whether that be a YouTube video or a brand website. This works great for Pre/Post Campaign research.

Net Promoter Score: This is a pre-filled NPS question to ask consumers how likely they are to recommend your brand. An NPS question should only be asked by itself, with the exception of one follow-up question.
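For context, the standard NPS arithmetic is the percentage of promoters (scores of 9–10) minus the percentage of detractors (scores of 0–6). A minimal sketch with made-up scores:

```python
# Standard NPS calculation; the scores below are invented for illustration.
scores = [10, 9, 9, 8, 7, 6, 10, 3, 9, 5]

promoters = sum(s >= 9 for s in scores)   # scores of 9-10
detractors = sum(s <= 6 for s in scores)  # scores of 0-6
nps = round(100 * (promoters - detractors) / len(scores))
print(nps)  # 20
```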

Avoiding Ambiguous Survey Questions

Just as it is important to consider what questions you ask, it is equally important to ensure you are asking those questions in the best way possible. The best way to get great answers is to ask great questions, which is done best by avoiding ambiguity and confusion.

What are ambiguous survey questions?

An ambiguous question is defined as one where there is no specific query. You could also describe an ambiguous question as a confusing question. An ambiguous question is one that a respondent will struggle to answer, not because they are a bad respondent but because it is a bad question. Generally, ambiguous questions are too broad and leave room for interpretation by the respondents. Examples of an ambiguous question include a question that could have more than one meaning, asking for several responses, or not clearly defining the subject/object.

Example of an ambiguous question

“Please rate the speed and quality of our customer service.”

This is an ambiguous question because the respondent is being asked to rate two things – speed and quality. It is very possible that the respondent received speedy customer service but the actual quality of the service was poor, or vice versa. Because of the framing of the question, a respondent will struggle to respond, especially if there is no “other” or text input option. The most likely outcome is an inaccurate, or at least incomplete, response, which means that your survey is gathering misleading data.

How to avoid ambiguous survey questions?

Follow your objectives

If you, the researcher, don’t know exactly what you are asking your respondents, then your respondents definitely won’t know how to answer your question. Therefore, you need to set survey objectives before even thinking about your survey design. Keep these objectives in mind when you are writing your questions; this keeps your questions specific and generally avoids ambiguity.

Be specific

Again, ambiguous questions are usually broad questions, so to avoid that, be specific. Let’s be more specific about how to be specific in your survey questions.

Here are a few tips:

  1. Avoid long, wordy questions that could lose respondent interest.
  2. Don’t use vague words.
  3. Specify the context of the question.
  4. Avoid double negatives at all costs.

However, as much as you might think using jargon, abbreviations or acronyms will keep your questions short and specific, don’t use them! We can guarantee that most of your respondents aren’t familiar with the jargon that you, as a researcher, use. It will just confuse them, so avoid it!

Split your questions up into multiple parts

The way to resolve the issue in the example above is to split the question into multiple parts. If you are asking respondents to rate something, break it down into different areas. So, in the example above, split the question into 1) rate the speed of the service and 2) rate the quality of the service. An added benefit of this is that you get more detailed responses. Don’t be afraid to split up questions even if that means you have more questions. More questions isn’t a bad thing in this case, especially if they are short questions, such as rating questions. A couple of short questions is better than one long, ambiguous question.

Test your survey

Put yourself in the shoes of a respondent and take the survey yourself. Enlist some colleagues or friends who aren’t familiar with the survey to take the survey too. Why is this important? If you, your colleagues and your friends give different types of responses to the same questions, or there are some unexpected responses, then you should probably review the questions. Also you can see if respondents will struggle with the questions. If they do, then you should definitely reword your questions.

Implement structure and flow

Now that you know how to ask the best questions in the right way, it’s time to give your survey structure and flow.

Survey Structure

Choosing survey length

The attention span of respondents has decreased significantly, especially in the age of social media. This means that traditional online surveys have died a quick death, and any survey longer than 10 minutes will see engagement rates fall dramatically. This is even more pronounced on a smartphone, where push notifications drag the respondent in different directions.

Here at Bounce, we believe surveys should be short, clear and direct. Researchers should make every question count and appreciate the time spent by each person within the survey, valuing their time as a priority. The ideal survey length is between 4 and 8 minutes, depending on the complexity of the questions asked – our rule of thumb is that anywhere between 20 and 30 questions is the goal to maximise the return from your research!

Simplicity of structure

A central flaw of most surveys is unnecessary complexity. Researchers have a tendency to over-complicate survey design, spending more time over-thinking the depth of questions, and nowhere near enough time thinking about the people actually filling out the survey. Online surveys should be easy to understand, using language that the respondent can easily comprehend. If possible, use language that the respondent group you are targeting is familiar with, so they are not slowed down by jargon.

In practical terms, an effective online survey should have a logical flow. Always provide options/potential responses where possible, particularly in the beginning, as respondents are familiarising themselves with the topic at hand. Use scales, rather than complex question types, to help respondents measure their feelings and answers accordingly. Simplicity and clarity will always deliver much higher quality in the long-run.

Respondent flow

Survey design is all about flow. The flow dictates the user experience for online surveys, so start by framing the survey with straightforward questions that help each respondent understand their reason for being part of the research. Pique their interest, and group similar topics together, making the survey conversational and enjoyable to go through. In terms of where to start with building a survey, we always recommend starting with what you are trying to prove or find out. What answers are you looking for, and how can you build backwards to the beginning of that research journey?

One of the reasons why response rates have fallen so dramatically in recent decades is that there has been a shift in power towards the consumer, and while other industries have adapted to the personalisation demanded by today’s population, online surveys have not. Market research holds a top-down focus, neglecting the actual people partaking in the research. Ultimately, in pursuit of research perfection, these researchers are finding low quality, low engagement and a disconnection from real people.

Conditional Logic

Conditional logic, also known as “conditional branching” or “skip logic”, is a feature of survey design that changes what question or page a respondent sees next based on how they answer the current question. Using conditional logic, you can create a custom path for each respondent, depending on their response to a specific question. This is a powerful tool for researchers, empowering them to create custom rules and get the right responses without confusing respondents or wasting their time.

How does conditional logic work?

To properly explain conditional logic, let’s use an example:

A brand can ask a group of consumers whether they have purchased one of their products before. If the answer is yes, the survey will jump to a question like “Please rate the quality of this product”. If the answer is no, the survey will jump to a question such as “Why haven’t you purchased X product?” with response options like “Haven’t needed it”, “Didn’t know it existed”, “Was too expensive” and “Other”. After these questions, the survey flow could reconnect so that all respondents get the same questions, or the survey paths could remain separate.
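To make the branching concrete, here is an illustrative sketch of skip logic in Python. It is not how the Bounce platform implements conditional logic; the question ids and answer options are hypothetical.

```python
# Each question maps an answer to the id of the next question; None ends the survey.
survey = {
    "q1": {"text": "Have you purchased product X before?",
           "next": {"Yes": "q2", "No": "q3"}},
    "q2": {"text": "Please rate the quality of this product.",
           "next": {"*": None}},
    "q3": {"text": "Why haven't you purchased product X?",
           "next": {"*": None}},
}

def next_question(current_id, answer):
    """Return the id of the next question, falling back to the '*' branch."""
    branches = survey[current_id]["next"]
    return branches.get(answer, branches.get("*"))

print(next_question("q1", "Yes"))  # q2 -> the rating question
print(next_question("q1", "No"))   # q3 -> the "why not" question
```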

Conditional logic can be used for any research project, whether you’re gathering research for a new advertising campaign, assessing customer satisfaction, or developing a new product.

Advantages of Conditional Logic

Better survey flow:

Survey flow is an important consideration in survey design. Having a consistent structure makes answering questions a lot easier on respondents, as questions follow some theme or logic that respondents can follow easily. Additionally, respondents expect surveys to be interactive and conditional logic is one of the best tools for creating that interaction.

Shorter survey time:

With conditional logic, respondents automatically skip any questions that aren’t relevant. As a respondent only sees the questions that are relevant to them, they don’t waste their time on needless or confusing questions. A researcher should always strive to make their surveys shorter, as respondents don’t like long surveys.

Higher response rates:

A key metric that researchers should keep in mind is the response rate, that is, how many respondents actually complete the survey rather than dropping out. When a survey contains questions that don’t apply to them, respondents are more likely to get irritated and abandon it; conditional logic removes those irrelevant questions and keeps more respondents engaged to the end.

Higher quality:

With higher response rates as well as responses that are more applicable to the survey questions, researchers can expect higher quality research. Often when respondents see questions that don’t apply to them, their response will be a random pick, thus skewing the data. Conditional logic avoids that.

Launch your survey

With your structure & flow in check, you are now ready to launch your survey.

Pre-Launch Testing

Something that you have to understand early with online surveys is that you will rarely get it right the first time. Like everything, it is an iterative process and every survey should be drafted and re-drafted based on the feedback you receive from colleagues, and most importantly test respondents. Be your own biggest critic, and place yourself in the shoes of the respondent to understand how you can leverage them most in the short space of time that you have their attention.

The simplest way of achieving survey validation is pre-launch testing with a sample respondent group. This builds a culture of respondent-centricity, allowing each person to go through the survey and offer you feedback on the length, simplicity and design. In the long-run, this process of continuous feedback and improvement will save you time, money and effort as you fine-tune your survey design.

Timing

A crippling mistake many researchers make is that after weeks of designing a survey, selecting their target audience and going back and forth with the client, they launch their online survey into the abyss without a plan. Timing has become one of the single most important factors for increasing completion rates in online surveys, and an area often neglected by researchers.

The simple fact is that people are incredibly busy, all with differing schedules and a very limited time frame that they may be available to partake in an online survey. Thus, potential respondents must be notified when suits them, through a medium that suits them. Simply put, researchers need to ask their panel exactly what time frames suit them to partake in research, or track engagement based on previous surveys to maximise the timing of each survey.

At Bounce, we recommend keeping your survey completion time below 10 minutes; this ensures that respondents remain engaged throughout the survey.

Launching

Before launching, your Account Manager will also complete a pre-launch test and will ensure that the correct target audience has been selected. Once the survey is launched, your results will be back within the next 24 hours.

Before we send back your results, we will also clean the responses, screening out any that contain poor answers or where the respondent has gone through the survey too quickly.

Analyse your survey results

Perhaps the biggest inhibitor to actionable insight is the clunky way in which data is displayed to the end client. In order to democratise insights, and make them readily available and digestible to people across an organisation, the data must be easy to organise and understand.

Although survey result analysis will vary depending on which survey platform you’re using, we’ll give you an overview of how we analyse results here at Bounce. When designing the Bounce Dashboard, we wanted to make it as intuitive and seamless to navigate as possible. The goal was to bring clarity to the chaos of raw data that many researchers dive into on a daily basis.

We will notify you when your results are ready. Your Account Manager is always on hand if you have any questions.

Audience Breakdown

To reflect the specificity of targeting selected in the ‘audience section’, the overall audience breakdown provides a starting point for understanding your data. This will usually include a full demographic breakdown of your target audience, combined with any additional targeting criteria that may be bespoke to the project, e.g. beer drinkers, sports fans, key decision makers in a household, etc.

Cross-Tabbing & Analysis

Any targeting criteria displayed in the audience breakdown can be segmented in any way you see fit – what this means is you can repopulate the data in your dashboard based on any data captured in the audience breakdown.

To supplement this, you can cross-tabulate the data based on individual responses to any question asked, or a multitude of responses and questions, to generate an entirely new insight report. This allows clients to understand the ‘why’ behind the data, and dig deeper to pull out actionable insights.

For example, if you wanted to analyse the data based on those individuals who stated that “climate change was very important to their friend group”, you simply select that subgroup to analyse. This will update the results for every question with just the answers from those individuals, so you can understand how they responded elsewhere.

This allows you to precisely pull out insight from the same data set, leveraging more from less without having to dive into all the raw data.
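If you prefer to work on an exported CSV rather than in the dashboard, the same subgroup filter and cross-tab can be reproduced with pandas. The column names and values below are hypothetical, not the Bounce export schema.

```python
import pandas as pd

# Hypothetical exported responses; the column names are illustrative only.
df = pd.DataFrame({
    "climate_importance": ["Very important", "Not important", "Very important", "Very important"],
    "purchase_intent":    ["Likely", "Unlikely", "Likely", "Unlikely"],
})

# Filter to the subgroup of interest, then cross-tabulate their answers to another question.
subgroup = df[df["climate_importance"] == "Very important"]
print(pd.crosstab(subgroup["climate_importance"], subgroup["purchase_intent"], normalize="index"))
```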

Individual Graph Types

Our mission was to make insight digestible and deliverable for anyone within an organisation, so that research could transcend departmental barriers. A recurring piece of client feedback was the desire to take individual graphs and plug them straight into presentations and reports, so that insight could be used by multiple teams, in multiple divisions, all with one common goal: to understand their consumers.

This allows data to be acted on quickly and decisively within teams, working in real-time to be customer-centric. We provide flexibility on percentage or absolute figures, with three graph types, to make it as easy as possible to understand and act on this data regardless of your research experience or competencies.

Our dashboard is built for speed and agility, with no compromise on quality or ease of use. This has been a defining capability for our platform with our clients, and we shall continue to learn from our customers to understand how we can consistently deliver on these core traits into the future.

Exporting Data

For deeper analysis, we also provide the ability to export individual questions as a PNG, PDF or CSV file. Entire reports can also be downloaded as a PDF or Excel. These reports can include any audience filtering or cross-tabbing you have selected, so multiple reports and data comparisons can be achieved easily, depending on what the research outcome was for the project.

Final Thoughts

Well done, you made it! You are ready to start designing your own survey.

Our biggest takeaway from survey design is always to remember your respondent, put yourself in their shoes and always make sure you are testing your survey yourself. Always ask yourself, is there an easier way to ask this question? What do I want to get from this question?

It’s impossible to create the perfect survey but hopefully, this overview will help you get close to it!

To get started on the Bounce Dashboard, you can get registered, or if you want to contact us for more information, you can book time to chat with a member of the Bounce Team.
