Online survey panels are a researcher’s dream – instant, low-cost access to a pre-qualified pool of people willing to take surveys. But is it too good to be true?
That suspicion intensifies when it comes to B2B audiences – are time-poor, well-paid executives really going to spend their spare time filling out surveys for random companies in exchange for, typically, no more than a pound or a dollar?
Yes, online survey panels can be used to access B2B audiences – but choose your panel partner with care...
Online survey panels in B2B – The Experiment
To explore this, we ran an experiment here at Savanta. We created a survey and sent it out to panel respondents who claimed to be IT decision makers. As the responses came in, we were particularly interested in the answers to one question designed to determine whether these panellists really were who they claimed to be.
This question listed a set of IT brands and asked respondents to indicate which were suppliers to their organisation. Only one of these brands was real; the rest were made-up brand names that bore no resemblance to real brands.
The answers were shocking. The vast majority of respondents selected brands that didn’t exist, suggesting that they weren’t IT decision makers at all. More likely, they were consumer respondents or automated bots trying to ‘game the system’ just to get their hands on the incentive payment.
However, all is not lost as this experiment also revealed that some online survey panels do provide reliable access to genuine B2B respondents. These panels had three things in common:
- They specialised in B2B audiences and recruited them in highly targeted ways, e.g. through deals with specialist trade media
- They incentivised respondents in multiple ways, typically coupling a large cash payment with information incentives (e.g. access to exclusive content) and charity donations
- They had stringent quality control processes, most notably checking through sources such as LinkedIn that people had the job and seniority they claimed
So, yes, online survey panels can be used to access B2B audiences, but choose your panel partner with care and, as a safeguard, include a few trick questions that will let you spot questionable respondents.
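If you build trick questions like this into your own surveys, screening the results can be automated. Here’s a minimal sketch in Python – all brand names and respondent data below are invented for illustration, not the brands used in our experiment:

```python
# Minimal sketch of a trick-question screen: flag respondents who claim
# a made-up brand as a supplier. All names here are hypothetical.

REAL_BRANDS = {"RealBrandCo"}  # the single genuine brand on the list
TRAP_BRANDS = {"Nexatech", "Quorivo", "Bluventia"}  # invented trap brands

def is_suspect(selected_brands):
    """Flag a respondent who selected any of the made-up brands."""
    return bool(set(selected_brands) & TRAP_BRANDS)

# Hypothetical answers to "which of these brands supply your organisation?"
responses = {
    "resp_001": ["RealBrandCo"],
    "resp_002": ["Nexatech", "RealBrandCo"],
    "resp_003": ["Quorivo", "Bluventia"],
}

suspects = [rid for rid, brands in responses.items() if is_suspect(brands)]
print(suspects)  # resp_002 and resp_003 picked trap brands
```

Flagged respondents can then be removed from the data set (or routed out of the survey entirely) before any analysis begins.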
How do we get the most out of surveys?
The rise of low-cost online survey platforms such as SurveyMonkey has led to a commensurate rise in poorly designed surveys. That’s dangerous – a bad survey reflects poorly on the brand behind it, and it can lead to decisions being made on dodgy data. So, here are some tips if you’re planning to design your own survey.
First, pause and reflect on why you’re conducting the survey – what’s your goal and what decisions are you going to make? Write this down in a short statement and constantly refer back to it when designing the survey. This will ensure that your survey gathers all the information you need and doesn’t suffer from scope creep.
Then ask yourself another question – to reach this goal, what information do I need to know? Again, write this down as it will form the skeleton for your survey.
Armed with this skeleton, you then need to write a question (or questions) for each information objective. When doing so, keep these common pitfalls in mind:
- Write plainly and clearly, avoiding formal language, jargon and complex sentence structures. This will make the survey more engaging. I find it helps to read each question aloud; if it doesn’t match how I’d speak naturally in conversation, it needs to be revised
- Make sure your question isn’t leading as this will create a biased response. For example, ask “how do you feel about X?” rather than “how positively do you feel about X?” as the latter implies the answer should be positive
- Don’t ask double-barrelled questions. For example, if you ask “how satisfied are you with the way in which your query was dealt with and the outcome?”, some people may struggle to answer because the query was handled well but the outcome was poor. They’ll probably give an answer anyway, focusing on just one of the two aspects. The problem is, you won’t know which one they were referring to
- If you’re using a rating scale, make sure that it’s balanced. For example, a satisfaction scale including ‘extremely satisfied’, ‘very satisfied’, ‘satisfied’, ‘not very satisfied’ and ‘not at all satisfied’ has three positive and two negative options. This will lead to a bias towards the positive
- Make answer option lists as comprehensive as possible (a small number of personal interviews with the target audience is invaluable when forming lists like these, as you can establish all the potential answers up front). It won’t always be possible to create a perfectly comprehensive list, though, so include an ‘other’ option so that people can write in an answer that isn’t on the list
- Always include a ‘don’t know’ option, as otherwise people may guess an answer just to proceed to the next question. That will taint your data set
Now review the survey against the goals you set at the beginning of the design process to check that it will gather all the information you need. Be disciplined when doing so and remove any questions that aren’t essential to reaching your goal, even if they’d be nice to know. This will keep the survey as short as possible (aim for a maximum of 15 questions, which equates to a completion time of around 5–7 minutes) and, in doing so, reduce drop-out rates and enhance data quality (in longer surveys, people answer randomly towards the end just to get through).
Finally, test the survey to check that it flows nicely, isn’t too long and works on mobile devices as well as PCs.