Everything you know about online surveys is wrong. Here’s why.

The internet is in its third generation now, one largely dominated by mobile and social experiences, following the respective "dot com" and "web 2.0" eras. But you wouldn't know it from seeing, or more likely declining to answer, your most recent online survey.

Indeed, customer experience or “voice of customer” feedback has barely changed in over two decades. You know the drill. An ill-timed pop-up window either derails you at the start of a session or pesters you on the way out. In the unlikely event you do accept the invitation, you’ll be presented with a redundant, extraneous, and overwhelming 30–40 question survey.

Which is why, in my 16 years working with many leading online survey platforms, fewer than 1–2% of online surveys ever get answered.

It’s also why researchers, coders, and marketers largely ignore the long-delayed, mishmashed results, which are rarely merged with existing web analytics. All told, the traditional method of collecting feedback delays or even forgoes learning, and produces insights that are impersonal, incomplete, and less actionable.

In other words, consumer surveys are broken. Fixing them is easier than you think.


Less is more—the rise of micro surveys

To change your outlook, you really only need to ask a few good questions. In most cases, just one insight from a single question can optimize the way you buy media, direct prospects, or increase customer engagement. Think calculated quality, not catch-all quantity.

To see results, however, you cannot ask the same passive questions as before. For instance, asking "who" visitors are or "how" you met their needs (the most popular line of questioning) is not nearly as telling as why they’re there, what they believe, and how they’re feeling, especially since web analytics already answers the first two questions through session tracking, demographic data, and abandonment metrics.

In cases where customers don’t easily volunteer their answers (i.e., low-converting surveys), the micro survey might instead address a company hunch or hypothesis through priming. For instance: "Is [insert reason] why you [insert action]?" Not only does this type of questioning lead to more targeted insights, it’s consistent with modern psychology practices.

Of course, when you collect feedback is just as important as how you collect it. Before-and-after surveys are not only outdated, they’re needy and disruptive. A thumbnail-sized survey that matches your style guide, coexists alongside your content marketing, and then elegantly disappears upon completion creates far less friction. When combined with limited but active questioning, this approach can lead to a 10% (or higher) response rate.


The takeaway—how to get 10% response rates

The bigger you are, the harder it becomes to listen to customers. That’s a fact of business.


Image: Example of embedded survey execution – matching styles, look, and feel (highlighted for illustration).

While the intentions of consumer surveys have always been noble, their less-than-2% completion rate has failed you. Micro surveys are the answer, especially "embedded" surveys that tie into existing analytics and ask only one or two critical questions before getting out of the way.


Admittedly, this is a small step in survey evolution, but it’s long overdue.
