Lessons IBM Learned about Conducting Online Surveys

IBM's e-business Innovation Centre in Toronto ran an online business-to-business survey in 2000 and again in 2001, and after adjusting a number of variables they doubled the survey's response rate in the second year. Here are some of the lessons learned:

Survey length - Keep the survey questionnaire to a minimum. In 2000, the survey had 37 questions. For 2001, IBM condensed the online survey form to 20 questions and greatly reduced the number of mouse clicks required to complete it.

Be honest about how long the survey will really take - The first draft survey would have taken much longer than the promised 10 minutes to complete. IBM did not want to annoy respondents (and so hurt credibility and response rates), so they shortened the survey to ensure it would take no more than the promised 10 minutes.

Provide value, value, value - Online survey response rates increase dramatically when participants gain value from responding. For the 2001 survey, IBM offered multiple relevant incentives: a copy of the final results, additional learnings from the execution, and a contest as a further inducement to respond.

Send the survey mid-week, mid-afternoon - Most email users start their Monday mornings clearing their mailboxes of non-corporate and personal emails. The likelihood of your email being read increases when you send survey invitations mid-week, after 12pm. Other email marketing tactics (e.g. sender and subject line testing) can also contribute to higher response rates.

Use one reminder email survey invitation - A standard practice in online survey execution is sending one reminder email to non-respondents. The reminder generated 15% more responses for IBM (some clients see their responses double thanks to reminders).

Allow for some open-ended questions - Give customers the opportunity to provide open-ended answers instead of selecting just "other". It can be disappointing at the close of a survey to discover a very high "other" response rate: it indicates customer insight that the options in your closed questions failed to capture. To prevent this, add some open-ended questions where respondents can articulate what their "other" answer meant. An added bonus: when writing survey reports, these open-ended responses can be gold for confirming or articulating a finding.