There is a lot of literature and opinion out there when it comes to evaluating customer sentiment collected from your surveys in a B2B environment. From Net Promoter Score (NPS) to Customer Satisfaction (CSat) to Customer Effort Score (CES), there is almost no end to the information available on how to measure, analyze, and report on these voice of the customer (VoC) metrics. While you likely have an idea which metric is most applicable to your situation, open-ended, free-form commentary is often an under-utilized source for understanding your customer survey results.
In the context of a B2B customer survey, we're referring to feedback gathered in response to open-ended questions (e.g. "Why did you score us the way you did?"). Instinctively, businesses know qualitative feedback is valuable, but analyzing it is more involved than calculating your NPS or CES.
With the growing number of tools available and promises of AI automation (which Satrix Solutions evaluates on a regular basis), is there a solution that can match the accuracy of human-centered analytical techniques when it comes to free-form text?
My answer may surprise you – not yet.
How to Analyze and Manually Code Qualitative Customer Feedback
While the words "human-centered" may evoke fears of manual labor and lost time, here are some tips to help you analyze and code unstructured feedback more efficiently.
Tip 1: Consider your top priorities when analyzing verbatim responses
Time and resources are almost always in short supply, so it's worth revisiting your original intentions for collecting survey feedback. If you have 1,000 comments in response to a single open-ended question, pinpointing the key takeaways from that feedback will take some work; there's really no getting around it. At the very least, you and/or your team may need to read all 1,000 opinions to gain a comprehensive understanding. However, if you are analyzing the results of a recent NPS survey and the original goal was to reduce customer churn, focusing on detractor commentary will likely give you a better sense of the "why" behind their responses and where your most prominent opportunities exist.
Another option is to zero in on the feedback from your highest-value customers. Obviously, we want all our customers to be happy, but losing just one highly profitable account may have a more severe impact on your business than the next 10 combined.
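If your survey results live in a spreadsheet or CSV export, a short script can handle this kind of triage before the reading begins. The sketch below is a minimal illustration in Python; the file name and column names (score, account_value, comment) are assumptions about how your export might be structured, not a prescription.

import pandas as pd

# Assumed survey export with one row per response; file and column names are hypothetical.
responses = pd.read_csv("nps_survey_export.csv")  # columns: account, score, account_value, comment

# Standard NPS convention: detractors score 0-6, passives 7-8, promoters 9-10.
detractors = responses[responses["score"] <= 6]

# Priority 1: detractor commentary, if the goal is reducing churn.
detractor_comments = detractors[["account", "comment"]].dropna(subset=["comment"])

# Priority 2: feedback from the highest-value accounts, regardless of score.
top_accounts = responses.sort_values("account_value", ascending=False).head(50)

print(f"{len(detractor_comments)} detractor comments to read first")
print(top_accounts[["account", "score", "comment"]].to_string(index=False))

Even with this filtering, the comments themselves still need to be read; the script only decides which ones to read first.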
Tip 2: Develop a codebook to evaluate open-ended feedback
In its simplest form, a codebook lists the themes most commonly observed in your open-ended survey feedback. At Satrix Solutions, we rely on our years of experience and client input to develop robust codebooks which then help us quantify the results and identify key takeaways. And although we have tools to help organize the data, reading customer comments and tracking the frequency of theme mentions is a big part of what we need to do. In this way, the codebook is our compass when it comes to navigating customer sentiment through free form text. So, what are some characteristics of a good codebook?
Codebooks are almost always divided into positive and negative themes. The specific themes usually depend on your line of business, but in the world of NPS and/or customer relationship surveys, feedback is typically good, bad, or a mix of both. Occasionally, there are "other" themes that don't necessarily pertain to sentiment, but those tend to be things like platform feature requests, which we also recommend tracking. It's important to note that some themes don't always have a positive or negative converse. For example, customers will make note of mistakes/errors, and while it makes sense to include this theme in your codebook, there isn't really a positive equivalent because customers expect a service with few to no mistakes and are less likely to say as much.
Going a bit deeper, you can also organize your codebook into "buckets," which usually derive from how your customers interact with your business. As an example, we'll build a positive and a negative bucket pertaining to the specific piece of technology or platform a SaaS client has developed. This is how we track things like the number of mentions of the system's ease of use or the perception of the features/functionality available. Customer service is another common bucket, as it impacts nearly all of the businesses Satrix Solutions partners with.
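To make the structure concrete, here is one minimal way a bucketed codebook and the resulting theme tallies could be represented in Python. The buckets, theme names, and comment IDs are illustrative placeholders, and the codes are assigned by a person reading each comment; the script only keeps score.

from collections import Counter

# Hypothetical codebook: buckets of positive and negative themes.
codebook = {
    "Platform": {
        "positive": ["Ease of use", "Strong features/functionality"],
        "negative": ["Difficult to use", "Missing features"],
    },
    "Customer Service": {
        "positive": ["Responsive support", "Knowledgeable team"],
        "negative": ["Slow response times", "Mistakes/errors"],
    },
    "Other": {
        "neutral": ["Feature request"],
    },
}

# Codes assigned while reading each comment (the human-centered part of the work).
coded_comments = [
    {"comment_id": 101, "codes": ["Ease of use", "Feature request"]},
    {"comment_id": 102, "codes": ["Slow response times", "Mistakes/errors"]},
    {"comment_id": 103, "codes": ["Ease of use"]},
]

# Tally how often each theme appears across all coded comments.
theme_counts = Counter(code for entry in coded_comments for code in entry["codes"])
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")

From there, the counts can be rolled up by bucket or split by promoter, passive, and detractor to show which themes are driving sentiment in each group.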
If you're unsure where to begin with your codebook, set aside 20 minutes and read some customer commentary. It shouldn't take long before a few subjects and themes emerge. You can also talk to your front-line staff who interact with customers daily. These individuals are probably the most qualified employees to speak to your customers' priorities and top concerns as they relate to your business's value proposition.
The Benefit of a Human-centric Approach to Open-Ended Feedback
Despite promises of automation and slick platforms, simply reading what your customers took the time to write will arm you with the most valuable insight. Even if I were 100% certain an AI could produce the same quality of reporting I generate with human-centered verbatim coding, I would still be unprepared for my board meeting if I didn't take the time to read the survey commentary. Of course, following the tips outlined above will allow me to do so more thoughtfully, but even then, there are no shortcuts.
In the second half of this blog series, I focus on specific ways to analyze and share your verbatim coding results. Read it here.