
Need better insights? Stop surveying and start observing

The number of surveys and feedback requests I receive from companies is insane.

Buying a plant at Home Depot prompts a 15-minute experience survey. Getting out of an Uber prompts an immediate rating request. Every software tool I’ve used has asked me how likely I am to refer a friend… It has become too easy to design and launch surveys, which leads to too little planning around what to ask, why, and how often.

Our team has certainly been guilty of this behavior as well. We conduct a couple of email surveys a year, ask for feedback during account calls, and actively track our NPS score.

While I support this customer-centric culture, I also believe that we’re asking many unnecessary questions.

As a result, customers are becoming inundated with annoying requests for feedback. I can’t help but think of a future where people block surveys like they block online ads. In a way, they already are: the response rate of our customer surveys is around 5%. To boost it, we often resort to contests, prizes, and bribes that bias the sample population. In the end, we can’t even trust the data from our surveys.

Therefore, I’m advocating for a less intrusive and more accurate method of gathering customer insights: observing customer behavior. To illustrate this approach, I’m going to analyze three common questions found in customer surveys and show how we can answer them without talking to customers.

“How likely are you to recommend us to a friend?”

I understand the need to know how much users love our tool via the Net Promoter Score (NPS). It helps us evaluate progress on customer satisfaction, and even compare ourselves against other companies.

What I don’t understand is why we need to ask people this question when we can simply track referral rates. Besides, if someone answers 9 or 10 but never actually refers anyone… are they being polite or just lying? Either way, knowing how likely someone is to refer us doesn’t help our business. Actually referring people to our business does.

So instead of surveying NPS, I’d advocate for a referral system that allows customers to refer their friends directly in-app. We can then gauge how likely customers are to refer us based on actual data. There’s a clear difference here: NPS measures a person’s stated likelihood to refer someone (mere words), whereas the referral rate measures the share of people actually doing it (an action). If I remember right, actions speak louder than words.

With this data, we can take the analysis a step further and calculate the referral rate over time by registration cohort (i.e. the percentage of people who registered in a given month and referred friends in month X after registration). That would show us when people are most likely to refer after registering, and when the numbers plateau, indicating an opportunity to remind them of our referral program. Taking action to increase this metric is much more impactful than trying to increase NPS: it directly drives customer acquisition, not just a sentiment score.
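For those who like to see the mechanics, here is a minimal sketch of that cohort calculation in Python with pandas. The file names (users.csv, referrals.csv) and column names (user_id, registered_at, referrer_id, referred_at) are hypothetical placeholders for whatever your product database actually exports.

```python
# A minimal sketch of the cohort referral-rate analysis described above,
# assuming hypothetical exports: users.csv (user_id, registered_at) and
# referrals.csv (referrer_id, referred_at).
import pandas as pd

users = pd.read_csv("users.csv", parse_dates=["registered_at"])
referrals = pd.read_csv("referrals.csv", parse_dates=["referred_at"])

# Tag each user with a registration cohort: the month they signed up.
users["cohort"] = users["registered_at"].dt.to_period("M")

# For each referral, work out how many months after registration it happened.
merged = referrals.merge(users, left_on="referrer_id", right_on="user_id")
merged["months_after"] = (
    (merged["referred_at"].dt.year - merged["registered_at"].dt.year) * 12
    + (merged["referred_at"].dt.month - merged["registered_at"].dt.month)
)

# Count distinct referrers per cohort and month-after-registration...
referrers = (
    merged.groupby(["cohort", "months_after"])["referrer_id"]
    .nunique()
    .unstack(fill_value=0)
)

# ...then divide by cohort size: the share of users registered in month C
# who referred someone in month X after signing up.
cohort_sizes = users.groupby("cohort")["user_id"].nunique()
referral_rate = referrers.div(cohort_sizes, axis=0)

print(referral_rate.round(3))
```

Each row of the resulting table is a registration cohort and each column a month after registration, so a plateau across the columns is exactly the signal to re-promote the referral program.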

But wait, don’t we already have referral systems? Yes, we do. So why do we keep asking the NPS question?

“What would you like to see improved?”

I recently took a flight to Florida, which was delayed and overbooked, after which I got an email asking me for feedback. Boy, did I have a lot of feedback to share… But did I answer the survey? No.

Why? Because I had already spoken to a customer service agent before the flight to make sure my wife and I would get on it, and to a flight attendant about some other issues during the flight. I didn’t feel like repeating myself.

In my opinion, no company that cares about customer happiness should survey customers about how it can improve. Most customers, at least in the USA, proactively complain to customer service. Asking again via a second channel is like saying: “Hey, I don’t remember what feedback you gave our team. In fact, I don’t trust that customer service recorded anything at all. May I ask you to refamiliarize yourself with your frustrations and repeat them to me?”

Do we really want people to think about what frustrates them once more?


In my opinion, surveying customers on how we can improve means that we either don’t have a help desk, or don’t use our help desk data intelligently.

So to gain ideas on how to improve our business, let’s analyze our customer complaints and help desk data first.
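Even a simple aggregation of help desk exports can surface the biggest recurring frustrations. Here is a minimal sketch, assuming a hypothetical tickets.csv export with created_at and category columns; swap in whatever fields your help desk tool actually provides.

```python
# A minimal sketch of mining help desk data for improvement ideas, assuming a
# hypothetical tickets.csv export with created_at and category columns.
import pandas as pd

tickets = pd.read_csv("tickets.csv", parse_dates=["created_at"])

# Which complaint categories come up most often overall?
top_categories = tickets["category"].value_counts().head(10)
print(top_categories)

# Which categories are trending up over recent months?
monthly = (
    tickets.assign(month=tickets["created_at"].dt.to_period("M"))
    .groupby(["month", "category"])
    .size()
    .unstack(fill_value=0)
)
print(monthly.tail(6))
```

A recurring or rising category is a far stronger improvement signal than anything a “What would you like to see improved?” survey will return, because customers already told us, in their own words, at the moment it hurt.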

“What features would you like to see?”

I’ve helped many product managers set up conversations with clients to get ideas for new features. Clients are usually excited to share their thoughts, and most have very specific features in mind. To put their ideas into context, we often resort to further probing: asking customers why they need feature XYZ, how they plan to use it, and how they’d prioritize their wishlist. This usually leads to hour-long conversations during which the client isn’t doing the work they’re paid to do, while we gain only a tiny window into their challenges. Hearing a story is simply not the same as being there. It lacks context.

Instead of all this questioning, I’ve found that visiting clients and observing them use our tool, without disturbing them, is much more insightful. Shadowing users provides critical context around how they use the tool: as part of what process, in combination with what else, when, and so on. This allows me to clearly understand the core challenge a client is facing. More importantly, it gives me ideas for improving how our software is used alongside other tools and in different situations.

If engineers and product managers simply took the time to observe the users they serve in their environment (not some ideal lab setting), or maybe even do what their customers do for a day, the world would function much more effectively.

Allow me to share another example: I recently visited a grocery store that had just installed a new cash register and payment system at every checkout lane. The clerks had a frustrating time using it, leading to long lines. We could blame improper training, or we could ask ourselves how a cash register could be so hard to operate… I’m willing to bet the machine had no issues in the lab setting it was designed in, but that the engineers never tried to use it in a real grocery store themselves. They likely designed the whole thing from indirect customer feedback, which rarely provides enough context to a problem.

I don’t doubt that we can find exceptions to what I’m advocating above. The point stands, however, that we should first see whether we can answer our questions through observation rather than surveys. It yields much more comprehensive and accurate insights, and it doesn’t waste our customers’ time. Actions speak louder than words.


Recommended exercise

Let’s look at all the questions we ask on our customer surveys and ask ourselves: “Can this be replaced with insights from customers’ actual behavior?”


Are you leading a startup team? Get started on the right foot with the Start-up Manager Handbook. And subscribe on the right for new insights every week!
