6 useless survey questions you need to stop asking

For whatever reason, customer feedback surveys have changed very little (if at all) over the last two decades. They’re long, disruptive, and still ask a lot of lifeless or redundant questions.

So before you create your next survey (hopefully a fast-moving, in-context micro survey from Pulse), consider ditching these questions in favor of smarter ones:

1) Rate your overall experience.

This question certainly has its heart in the right place, but it’s backwards-looking. Instead, you’re better off asking something forward-looking and actionable, such as “Which of these enhancements would most improve your experience?”

2) How did you get here?

Thanks to capable web analytics, you should already know the answer to this question. If you don’t, you’re probably not integrating your survey data with your existing metrics. Either way, stop wasting your customers’ valuable time asking for something you can simply observe. You wouldn’t ask a retail customer what color their shirt is, so don’t ask your web customers what your analytics can already tell you.

3) What country are you in?

Ummm, if you can’t see that from your analytics or survey software, you’ve got bigger problems to solve. Again, focus on questions you don’t have answers to.

4) Ratings on a 10-point scale.

Have you seen one of these before?

If so, you’ve probably never filled one out. The reason: a 10-point scale is far too granular to be meaningful, and it creates extra work for the respondent. Do you really need that many gradations in your feedback? No, you don’t. The extra precision isn’t meaningful or actionable.

Full disclosure: in less enlightened times with a less enlightened vendor, a Pulse Insights co-founder launched this exact question on a major ecommerce website. Live and learn!

5) Anything else that’s observable and not actionable.

Stop making your customers do extra work. Do it for them by connecting Voice of Customer data to your web analytics. It shows respect for their time, and, most importantly, it leaves you with focused questions that yield better response rates.
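To make that concrete, here is a minimal sketch (TypeScript, standard browser APIs only) of what “observe instead of ask” can look like: the referrer, locale, and page context ride along with the survey answer automatically. The endpoint and payload shape are hypothetical illustrations, not a real Pulse Insights API.

```typescript
// Hypothetical payload: one answered question plus context we can observe
// for free, so the survey never has to ask for it.
interface SurveyResponse {
  surveyId: string;
  questionId: string;
  answer: string;
  context: {
    referrer: string;       // "How did you get here?" (observed, not asked)
    locale: string;         // country/language signal (observed, not asked)
    pageUrl: string;
    viewportWidth: number;
  };
}

// Submit the answer along with the observable context (assumes a
// hypothetical /api/survey-responses collection endpoint).
async function submitResponse(
  surveyId: string,
  questionId: string,
  answer: string
): Promise<void> {
  const payload: SurveyResponse = {
    surveyId,
    questionId,
    answer,
    context: {
      referrer: document.referrer,
      locale: navigator.language,
      pageUrl: window.location.href,
      viewportWidth: window.innerWidth,
    },
  };

  await fetch("/api/survey-responses", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
}
```

Server-side, those context fields can be joined to your analytics IDs, which is what lets you skip questions like 2 and 3 entirely.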

6) Would you refer us to a friend? (aka RTF or Net Promoter Score)

Admittedly, this one is controversial. Who wouldn’t want to know if you’re driving customer evangelists?

The problem is that it’s tough to attribute changes in the score to specific causes, so it leads to lots of organizational overhead and bad science. If you work in an organization that tracks this metric, you’ve probably scrambled once or twice to explain a drop: was it a site redesign? Seasonality? A shift in marketing or offers? Downtime? I know I sure have!

We do believe there’s value in tracking this metric, but it’s more important to use micro surveys to learn which things drive RTF scores, and then to focus on improving those things, bottom up in order of priority. Then your rising RTF metric is just a scorecard that quantifies all the wins you’ve used micro surveys to help uncover.