Drinking Our Own Champagne – How UserIQ Deploys NPS Surveys

For the second installment of the Drinking Our Own Champagne blog series, we will be discussing how we conduct NPS surveys and analyze the results here at UserIQ. Luckily for us, the UserIQ platform has the ability to deploy in-app NPS surveys to targeted customer segments—helping us gain valuable insights from user feedback and keep a pulse on how we’re doing as a customer-success-driven organization.

At UserIQ, we take our NPS campaigns and Voice of the Customer (VoC) initiatives very seriously. Results and feedback from NPS surveys—combined with other VoC programs launched within our application—inform everything from account outreach prioritization, to product roadmap decisions, to strategic company direction. Let’s take a look under the hood at how UserIQ conducts NPS surveys.

Example of UserIQ’s NPS survey results page, which gives both team members and executives an at-a-glance look at their NPS campaign.

The UI Evolution with NPS

We talk with our customers all the time about best practices when deploying campaigns—particularly NPS surveys. But, even for our team at UserIQ, who lives and breathes NPS on a daily basis, there were some things we learned over time through trial and error that helped us hone our own NPS surveys. Best practices are a guide for any organization to follow, but at UserIQ we’ve also learned how to uniquely tailor our NPS strategy so it suits both our and our customers’ needs.

For example, through experimentation and examination of campaign performance data, we found that presenting both the NPS rating scale and the comment field simultaneously reduced users’ response rates. As soon as we broke up the NPS survey into more digestible pieces—showing the rating scale first, then presenting the comment field once the score had been selected—we were able to improve response rates and still capture users’ free form comments. Those early lessons have now been built into the behavior of UserIQ’s NPS surveys.

To highlight another area where our NPS surveys have evolved, we previously used the full, formal NPS question: “On a scale of 0 to 10, how likely are you to recommend UserIQ to a friend or colleague?” While that traditional route is the tried-and-true method that works for a lot of companies, we wanted to change the tone so it felt more casual, friendly, and on-brand for UserIQ. So, we decided to nix the “friend or colleague” part of the question and instead say, “On a scale of 0 to 10, how likely are you to recommend UserIQ?” Additionally, we added a header that asks the customer “How are we doing?” because we consider NPS survey results a referendum on how we’re doing as a company.

In our NPS survey, when the user hovers over and selects a score, the button itself changes color—green for Promoters (9 or 10), yellow for Passives (7 or 8), and red for Detractors (0-6).
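That score-to-category mapping follows the standard NPS breakpoints, so it’s easy to express in code. Here’s a minimal Python sketch of the classification described above (our own illustration, not UserIQ’s actual implementation—the function name and color strings are ours):

```python
def categorize(score: int) -> tuple[str, str]:
    """Map a 0-10 NPS score to its respondent category and button color."""
    if score >= 9:
        return ("Promoter", "green")
    if score >= 7:
        return ("Passive", "yellow")
    return ("Detractor", "red")
```

For example, a user who selects 8 lands in the Passive bucket, while a 9 tips over into Promoter territory.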

“With some of these small language and UI changes we’ve made over the years, we’ve been able to increase our NPS response rates three times over,” explained Lawton Ursrey, UserIQ’s VP of Customer Success and Growth. “Some wording and UI changes may appear to be minor, but these small tweaks are the types of things we work on with our customers’ in-app campaigns all the time.” The more the NPS campaign feels tailored to our customers and true to our own brand style, the more accurate and honest the feedback we get.

The Logistics

We know—both as NPS experts and as a SaaS company that’s been deploying NPS surveys ourselves for years—that consistency is key. We launch NPS surveys every quarter to all of our customers without fail. “We need to see how we’re doing from a benchmarking standpoint, and launching our own quarterly NPS survey is one of the best VoC tools to accomplish that,” Lawton continues. Part of that consistency is in how long we keep our survey “in field” as an active, customer-facing campaign—typically about two weeks, in order to reach our minimum response rate of 25%.

Another logistical aspect to our NPS survey deployment may be surprising to some people—we only show the campaign to a given user a single time. So, if a user dismisses the NPS survey with the “X” on the top right, we won’t show them another NPS survey until next quarter. “We want to keep it light, and not burden our users with an NPS campaign popping up in their face too many times. So, we’ve found that just showing the NPS to each of our users once has yielded the most consistent results over time,” Lawton added.

The Postgame Action

Now, for the most important part of our (and any company’s) NPS activities—what we do with the results. Between the scores themselves and the comments, there is an abundance of customer feedback and information we can glean from NPS surveys. The entire customer-facing organization looks at the NPS results in several ways—including by account, persona, industry, size, feature usage, the list goes on and on. While it is important to review all the findings from an NPS survey, for the team at UserIQ, results typically aren’t a huge surprise. UserIQ’s founder and Chief Product Officer, Aaron Aycock, puts it best: “NPS results tend to be more of a lagging indicator for us about how we’re doing. We can’t rely just on NPS to take care of all of our customer feedback needs, but the survey does confirm a lot of our current ideas about what customers do and don’t like about the platform, and it reinforces the direction we’re heading in.”
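Since NPS is ultimately just the percentage of Promoters minus the percentage of Detractors, the kind of segment-by-segment analysis described above can be sketched in a few lines of Python. This is our own illustration of the math, not UserIQ’s platform code—the function names and the (segment, score) input shape are assumptions for the example:

```python
from collections import defaultdict


def nps(scores):
    """Net Promoter Score: % Promoters (9-10) minus % Detractors (0-6)."""
    if not scores:
        return None
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))


def nps_by_segment(responses):
    """Compute NPS per segment from (segment, score) pairs,
    e.g. segments by persona, industry, or feature usage."""
    buckets = defaultdict(list)
    for segment, score in responses:
        buckets[segment].append(score)
    return {segment: nps(scores) for segment, scores in buckets.items()}
```

So five responses of [10, 10, 9, 8, 6] yield 3 Promoters and 1 Detractor, for an NPS of (3 − 1) / 5 × 100 = 40; grouping the same calculation by persona or industry surfaces the segment-level trends discussed above.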

For that reason, we tend to look closely into anything coming out of the NPS survey that is a surprise to the team. Whether it’s a typical promoter who’s dropped down to a passive or a unique trend that’s happening within a certain subset of our customer accounts, the UserIQ team always looks into anything—good or bad—within our NPS results that catches us off guard.

Additionally, there are certain playbooks that our customer success team walks through for each of the NPS respondent categories. For promoters, the team will typically follow up during their next cadence call and thank the customer for their positive feedback. They might even ask the customer if they’d share their feedback for us in a G2 review, so that other people seeking a customer success platform can read real user reviews online.

Another great aspect of conducting NPS surveys within your application is an increase in passive respondents, who often have interesting product feedback that you may not receive via email-based NPS surveys. “When compared to in-app NPS, emailed NPS surveys tend to generate a more bipolar response. If someone is going to take the time to open an email and click on a survey, it’s a safe bet that they’re either really pleased or really unhappy with their experience,” explained Aaron. It’s critical not to overlook feedback from passive respondents, since those users are on the cusp of becoming promoters and may just need a small nudge to get there.

Lastly, anyone who conducts NPS surveys is bound to get a few detractors. “We call all of our NPS detractors immediately,” says Lawton. “While with passives and promoters, we’ll tend to hold off on reaching out until the CSM’s next cadence call with that customer, we want to make sure we’re addressing any detractor’s concerns right away.” Those prompt calls can often defuse even the most tense situations. At the end of the day, customers want to be heard, and NPS is one of the great ways to solicit that customer feedback.

What’s next?

As we continue to optimize our NPS surveys, UserIQ’s product team continues to work on platform enhancements. There are a few items on our product roadmap that are centered around NPS, so stay tuned to learn about these and the other exciting enhancements we have coming soon!
