Welcome to the User Adoption: Real Talk from the Experts series!
In this video series, we take a deep dive into UserIQ’s 2018 User Adoption and Onboarding Benchmarking Report, based on a survey taken earlier this year by over 450 SaaS leaders. We wanted to know — how are companies addressing user adoption challenges today? How are they measuring success, executing initiatives and maintaining momentum? To help us dig into these key findings, we’re featuring the customer success experts themselves.
Bill Cushard is the Director of Marketing at ServiceRocket. Bill covers the intersection of learning, software adoption, and customer success. His career has focused on helping companies adopt disruptive software through learning, change management, communications, and implementations that help people get the most out of the software. He’s also a writer, releasing “The Art of Agile Marketing: A Practical Roadmap for Implementing Kanban and Scrum in Jira and Confluence” earlier this year.
Find Bill Cushard:
Want to read up on the Top Trends in SaaS User Adoption & Onboarding for 2018?
Download the 2018 SaaS User Adoption & Onboarding Benchmarking Report to see how more than 450 SaaS leaders handle these initiatives today and how your company stacks up.
All right everybody, welcome back to the User Adoption: Real Talk from the Experts series. If you are new to this series, we are doing these videos to talk about the report we released earlier this year, the 2018 User Adoption and Onboarding Benchmarking Report. We surveyed over 450 SaaS leaders because we really wanted to dig into user adoption challenges and how companies are taking on those challenges today. How are they measuring success? How are they executing initiatives? How are they maintaining momentum? We thought, what better way to walk through the findings than to bring on the experts themselves?
So today we have a familiar face with us, Bill Cushard; he’s the Director of Marketing at ServiceRocket, and he’s going to help us answer a few questions we’ve had from the report about onboarding. Before we get into the questions, Bill, would you introduce yourself and talk a little bit about ServiceRocket and your background?
Sure. Hi, Kelsey, it’s a pleasure to be here. Thank you for doing this; I’m honored to be part of it. As Kelsey said, I’m Bill Cushard from ServiceRocket, and ServiceRocket is in the business of helping software companies get their software deployed and adopted. So we’re all about software adoption, and we do that with training, customer education, implementations, and support, and we develop software applications that connect different tools together, like Jira, Salesforce, Workplace, ServiceNow, and so on. There’s a lot of integration work between software so that people have more reasons to use it. That’s what ServiceRocket is all about.
Awesome. Perfect. So one of the big findings from the survey was that only 33% of our respondents had visibility into the onboarding process, which we thought was very interesting considering how much emphasis we put on creating a good onboarding strategy. That being said, our first question based on that is: what are the top three to five metrics that companies should be measuring during the onboarding phase?
Yeah, this is a really fun question. The obvious problem is that the metrics would be different for different companies, but one thing I will say, and I’ll share some examples, is that to simplify this, when you measure a metric matters almost as much as which metric it is. The way we look at it is that the goal of onboarding is to get customers up and running, right? So what does up and running mean? You focus on metrics in the first 30 days, for example, one metric at a time.
So in the first 30 days, or whatever your onboarding timeframe is, you focus on one metric. Then in the next 30 days, or whatever the next time period is, there’s the next metric, and so on. Maybe your onboarding takes two weeks, maybe three months; either way, you want to break it up and focus on one metric at a time. I’ll share two quick examples. One is Workplace by Facebook, which is collaboration software, kind of like Slack, right? A metric that Workplace by Facebook would measure is account activation in the first 30 days. So whoever’s rolling it out and doing the onboarding, the customer success team, the one metric they would follow in the first phase of an onboarding program is account activation: the percentage of total users who activate their account. How far can we get? And they set a goal. Let’s say the goal is that 80% of all employees activate their accounts. Okay? So now everything we do in onboarding is to get people to activate their accounts.
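The activation-rate check Bill describes could be sketched roughly like this. This is a minimal sketch; the event data, employee count, and the 80% goal are illustrative, not taken from any real Workplace rollout:

```python
# Hypothetical sketch: measure account activation rate in the first
# 30 days of onboarding against a goal (e.g. 80% of all employees).
from datetime import date, timedelta

def activation_rate(activations, total_employees, start, window_days=30):
    """Fraction of employees who activated within the onboarding window.

    activations: dict mapping user id -> activation date (made-up format)
    """
    cutoff = start + timedelta(days=window_days)
    activated = sum(1 for d in activations.values() if d <= cutoff)
    return activated / total_employees

# Illustrative data: two of four employees activate inside the window.
activations = {
    "alice": date(2018, 1, 5),
    "bob": date(2018, 1, 20),
    "carol": date(2018, 3, 1),  # activated after the 30-day window
}
rate = activation_rate(activations, total_employees=4, start=date(2018, 1, 1))
print(f"activation rate: {rate:.0%}")  # 2 of 4 within window -> 50%
print("goal met" if rate >= 0.80 else "keep pushing activation")
```

The point of keeping it this small is the same as Bill’s: one number, one goal, one phase.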
Once you get to that goal, you move to some kind of phase two, and now you have a different metric. In the case of Workplace, that might be mobile chat app downloads: of the people who activated, what percentage downloaded the mobile app? So everything we do in phase two of onboarding might be about getting people to download the mobile app; we put up signs in the office, we send out emails, and we do all the things we can to help champions drive that.
Now phase three might be activity, or use of the product: what’s the average weekly number of posts per person, right? In other words, the metrics change at each stage. For a product like Jira, which is project management and bug tracking software, the first metric might be how many customized workflows were created for a customer, because if you create workflows, you’re probably going to start using the product, since you’re customizing it for yourself. That might be the first metric, and maybe the target is five, maybe it’s ten.
The second metric, in phase two, might be: is the customer creating projects for their work in Jira? If customers don’t create projects, that’s a bad sign, so we want to get customers to create X number of projects. You really have to get specific in what you’re measuring with these KPIs. Whatever software you sell, you want to find out what the actions are that people take in your software: download the software, share it, create assets, make projects, activate users. Then the second part is that you want to phase it in, because you can’t measure all the metrics at once, and some metrics matter more today than they will six months from now.
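One way to encode the “one metric per phase” idea, in the spirit of Bill’s Jira example, is an ordered list of phases where each phase has a single metric and target. The phase names, metric names, and targets below are made up for illustration:

```python
# Hypothetical phased onboarding plan: one metric and one target per
# phase, checked in order. Metric names and targets are illustrative.
ONBOARDING_PHASES = [
    {"phase": 1, "metric": "custom_workflows_created", "target": 10},
    {"phase": 2, "metric": "projects_created", "target": 5},
    {"phase": 3, "metric": "avg_weekly_posts_per_user", "target": 3},
]

def current_phase(customer_metrics):
    """Return the first phase whose target the customer has not yet hit,
    or None once every onboarding target is met."""
    for p in ONBOARDING_PHASES:
        if customer_metrics.get(p["metric"], 0) < p["target"]:
            return p
    return None

# Illustrative customer: past phase 1, still working on phase 2.
cust = {"custom_workflows_created": 10, "projects_created": 2}
phase = current_phase(cust)
print(phase["metric"] if phase else "onboarded")  # projects_created
```

The design choice here mirrors the advice: the team only ever looks at the one metric for the phase the customer is currently in.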
Right, that definitely makes sense. I think you make a really great point, too, that you have to take baby steps sometimes, or just split things into phases. People hit the gas pedal and want to hit the ground running, and it’s easy to go all in at once, so it’s good to have that perspective and take a step back. Love that. Since we’re talking about visibility and onboarding: what are some of the key metrics people should consider if they want visibility into user adoption?
Again, the way I look at it is that there is no single metric, because it’s different for all of us. But I will say this: you have to find out what your Moneyball metric is. I know that’s kind of a cliche, but there is a metric, or maybe three things, that your customers do in your software that make them sticky, whatever that core action is. I’ll give you an example without revealing the name of the company. One company we worked with has software where you create content assets, and I’m using that term broadly, right? The more of those you create, the better a customer you probably are, but they actually found that customers who create 33 of those content assets renew at three times the rate of customers that don’t.
I was working with this software company on how to create education programs for their customers. So once they said that to me, I said back to them, “Okay, let’s double-check that. From now on, the only education strategy we do together is going to be helping customers make 33 of those content assets and nothing else. If something doesn’t contribute to helping a customer make 33 of those, we’re not going to do it. That’s how we’re going to prioritize.”
So whatever that number is for you: in the case of Jira it might be getting people to create ten projects; in Salesforce or another CRM it might be getting a customer to create at least 35 opportunities. You have to find that number or that metric and go after that thing. That’s hard to do, because you have to get some data people involved and figure out how it correlates to a renewal rate, or an expansion rate, or a revenue target, because that’s the thing that matters. But once you find that number, it focuses you on what you have to do to help customers get up to speed, or onboarded, or whatever.
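Finding that “33 content assets” threshold is ultimately a segmentation exercise: split customers at a candidate threshold and compare renewal rates on each side. A crude sketch, with entirely made-up customer records, might look like this:

```python
# Hypothetical sketch: compare renewal rates above and below an
# asset-count threshold to test a candidate "Moneyball" metric.
# All customer data below is invented for illustration.
def renewal_rates(customers, threshold=33):
    """Return (rate_at_or_above_threshold, rate_below_threshold).

    Assumes both groups are non-empty; each customer is a dict with
    an "assets" count and a boolean "renewed" flag.
    """
    above = [c for c in customers if c["assets"] >= threshold]
    below = [c for c in customers if c["assets"] < threshold]
    rate = lambda group: sum(c["renewed"] for c in group) / len(group)
    return rate(above), rate(below)

customers = [
    {"assets": 40, "renewed": True},
    {"assets": 35, "renewed": True},
    {"assets": 50, "renewed": False},
    {"assets": 10, "renewed": True},
    {"assets": 5, "renewed": False},
    {"assets": 2, "renewed": False},
]
hi, lo = renewal_rates(customers)
print(f"renewal at/above threshold: {hi:.0%}, below: {lo:.0%}")
```

In practice you would sweep many candidate thresholds over real usage and renewal data; this only shows the shape of the comparison.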
Right. Exactly. I think that’s actually a pretty good transition into our third question, which is: what is the first step to understanding where gaps might be in your measurement process?
The gaps. Well, going back to my previous answer, the gap is that we don’t know what that metric is. We’re sort of guessing, and actually I think it’s okay to guess, but I think when people look at onboarding success, we software companies are not precise enough. In other words, we say we want to get adoption of our software, and then let’s say you double-click on that. You ask, “Okay, what does that mean?” They go, “Well, they’re using it.” “They’re using what?” “Well, they’re using the software.” “Well, what part of the software?” Then that gets into coming up with things to measure, and you have to ask, “Okay, do we track that data today, and can we run a report on it today?” Everyone says, “Well, no,” and so I respond by saying, “Okay, then that can’t be the metric. There’s a gap there.”
So even if you have to guess, let’s say you don’t have an analytics team to figure out what your 33 is, from my previous example. Maybe you have to say that product adoption means the number of times a customer logs in. Everyone can track that; every SaaS company can probably run a report on the number of logins by customer. You might argue, “Well, that doesn’t mean adoption.” Okay, that’s fine, but at least you can measure it, and if you can make the number of logins go up from two to four a day, maybe that’s good. The gap is when you don’t have the ability to know that number, as an objective number, and to run the report. Then it’s sort of meaningless to say the goal of our onboarding is to get more adoption.
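Even the fallback “number of logins” proxy needs an actual report behind it. A minimal version over raw login events, with an invented event format, could look like this:

```python
# Hypothetical fallback report: logins per customer per day from raw
# login events, for when no better adoption metric is available yet.
from collections import Counter

# Each event is (customer_id, user_id); format invented for illustration.
login_events = [
    ("acme", "u1"), ("acme", "u2"), ("acme", "u1"), ("acme", "u3"),
    ("globex", "u9"),
]

def logins_per_customer(events, days=1):
    """Average logins per day for each customer over the reporting window."""
    counts = Counter(cust for cust, _ in events)
    return {cust: n / days for cust, n in counts.items()}

report = logins_per_customer(login_events)
print(report)  # {'acme': 4.0, 'globex': 1.0}
```

It is deliberately crude; the point in the transcript is that a crude, runnable report beats an adoption goal nobody can measure.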
You sort of have to pick that thing. Do they click on the opportunity object in Salesforce? Do they log in more often? Do they share? We have a customer whose product creates visual charts, and they know that the more people share a chart with someone else in their company, the more they renew. So the metric is the average number of shares per week. The gap is not knowing that number for your own product; that’s the gap, and you have to be precise like that.
It’s definitely a learning curve, for sure. But yeah, I think a lot of it also comes with team alignment and bridging those gaps, so I definitely agree. This was incredibly insightful, and I think there were a lot of great takeaways that people watching can put toward their own onboarding and user adoption strategy. So thank you very much for being here; this was awesome. I’m going to put links in the description box to everywhere you can find Bill online, and I’ll also put a link to where you can download the full report. Thank you so much, Bill. We appreciate it.