Is your data reliable?

It's often hard to know whether we can use the audience data we collect to make confident decisions. Can you rely on your data?

Who are you asking?

The first thing to ask is whether you are asking a representative sample of your audiences. This can be hard, because we can't always get to everyone. Online surveys usually go to the booker only, while paper surveys might favour people who have time to hang around and complete them.

Think about how you might reach a really wide range of your audiences:

  • Use multiple collection methods, e.g. QR codes, paper surveys, online surveys, or face-to-face surveys with an iPad in the foyer;
  • Send reminder emails to encourage responses from people who tend not to complete your online surveys;
  • Choose a range of events to survey rather than just one strand of your work;
  • Consider weighting the data you get (give us a shout if you need help with this!), for example down-weighting members who tend to be more likely to respond than your overall audience (see the sketch after this list).
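
To make that last point concrete, here is a minimal sketch of post-stratification weighting in Python. The group names and numbers are illustrative assumptions, not real Indigo Share data, and the exact calculation you need will depend on what you know about your overall audience.

```python
# A minimal sketch of post-stratification weighting.
# All group names and figures below are illustrative assumptions.

# Share of each group in your overall audience (e.g. from box-office data)
audience_share = {"member": 0.20, "non_member": 0.80}

# Share of each group among your survey responses
response_share = {"member": 0.45, "non_member": 0.55}

# Weight = how much each response should count so the weighted sample
# matches the real audience. Members are down-weighted here because
# they responded far more often than their true share.
weights = {
    group: audience_share[group] / response_share[group]
    for group in audience_share
}
print(weights)  # members get a weight below 1, non-members above 1

# Example: weighted average of a 1-5 satisfaction score per response
responses = [("member", 5), ("member", 4), ("non_member", 3), ("non_member", 4)]
weighted_mean = (
    sum(weights[group] * score for group, score in responses)
    / sum(weights[group] for group, _ in responses)
)
print(round(weighted_mean, 2))
```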

Are you asking enough people?

We find people often worry about this too much. In most instances, it's more important to worry about who you're reaching than how many. Most organisations using Indigo Share get more than enough responses to meet statistical standards of reliability.

If you want to find out how many survey responses you need for very robust data, try out our calculators below. The industry standard is often a 95% confidence interval and a 5% margin of error or lower, so start with that as your guide.
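To show where those numbers come from, here is a minimal sketch in Python of the standard sample-size formula for a proportion, with a finite population correction. The function name, z-scores and example audience size are illustrative assumptions; this is not the Indigo Share calculator itself.

```python
import math

def required_sample_size(population, margin_of_error=0.05, confidence=0.95, proportion=0.5):
    """Estimate how many survey responses you need for a given
    confidence level and margin of error, assuming the most
    conservative case (proportion = 0.5) unless told otherwise."""
    # z-scores for common confidence levels
    z_scores = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}
    z = z_scores[confidence]

    # Sample size for an effectively infinite population
    n_infinite = (z ** 2) * proportion * (1 - proportion) / (margin_of_error ** 2)

    # Finite population correction: smaller audiences need fewer responses
    n = n_infinite / (1 + (n_infinite - 1) / population)
    return math.ceil(n)

# Example: an annual audience of 10,000 at 95% confidence, 5% margin of error
print(required_sample_size(10_000))  # roughly 370 responses
```

Note that the required number grows only slowly as your audience gets bigger: even for a very large audience it levels off at around 385 responses for 95% confidence and a 5% margin of error.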

If you're reaching enough people, then you can be confident that your results are a true representation of those you've asked.