DYK # 16 (A): Did you know that the first phase of VoC maturity is all about foundational elements?

We’re back from our (well deserved) Christmas break and it’s time for our monthly DYK series again! For the first two DYKs of 2024 we’ll look into VoC maturity, with # 16 - Did you know that VoC Maturity can be broken down into 2 phases?

This month we’re kicking things off with the first phase, the “foundational” phase, and we’ll wrap up in February with the “tech acceleration” phase.

So let’s get started!

#16 (A) - Did you know that the first phase of VoC maturity is all about foundational elements?

While we focus on VoC specifically today, the same principles apply to most other maturity models.

To enable success, we need to start at the beginning. Technology plays a vital role in maturing your CX practice, but it must always be seen as an enabler, and it can only enable success if the foundations are set right to start with.

So what do I mean by that?

From program design, governance and metrics to insights-to-action frameworks, getting the basics right is the first step to success.

All too often I see organizations that think they’re customer centric because they send lots of surveys asking for feedback. While asking for feedback is a great start, there are lots of different ways to do it, and some work better than others.

Some common pitfalls I’ve observed over the years are:

1. Lack of a centralized approach.

This is a big one. Departmental silos and a culture of “safeguarding” data are a huge challenge for most organizations. Not only is this unfortunately quite common, it also creates huge problems.

You’ve probably seen it before, or perhaps you’re engaging in it yourself: teams or departments that send surveys purely for their own team and use case, with no oversight across the organization of which team sent what survey and when, let alone any sharing of the feedback or insights with anyone else.

Siloed survey work isn’t just a waste of money (you can consolidate your tech with an enterprise-wide approach to VoC); it’s also rather annoying for customers. Without company-wide stand-down rules, a customer may receive several surveys from the same company, potentially within a short timeframe.

When designing a VoC program, you need to consider the entire organization and the customer experience end-to-end. All survey activity needs to be governed centrally, with stand-down rules and with data synthesized and shared across the organization. We may ask questions about a single team or experience, but if you’ve ever read through customer verbatims, you know that customers tell you what they want to say, regardless of what you ask. Chances are, you’re getting feedback that’s actually about something or someone else in the organization. Analyze that data and SHARE IT so you (or someone else) can act on it.
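To make the stand-down idea concrete, here’s a minimal sketch in Python, assuming a hypothetical central contact log; the names and the 90-day window are illustrative, not taken from any particular platform.

```python
from datetime import date, timedelta

# Illustrative policy: no customer gets more than one survey in any
# 90-day window, no matter which team wants to send it.
STAND_DOWN_DAYS = 90

# Hypothetical central contact log: customer id -> date last surveyed.
# In practice this would live in your VoC platform or CDP.
last_surveyed: dict[str, date] = {}

def may_survey(customer_id: str, today: date) -> bool:
    """Central stand-down check every team runs before sending a survey."""
    last = last_surveyed.get(customer_id)
    return last is None or today - last >= timedelta(days=STAND_DOWN_DAYS)

def record_survey(customer_id: str, today: date) -> None:
    """Log the send centrally so other teams' checks see it."""
    last_surveyed[customer_id] = today

record_survey("cust-42", date(2024, 1, 10))
print(may_survey("cust-42", date(2024, 2, 1)))   # False: inside the window
print(may_survey("cust-42", date(2024, 4, 15)))  # True: 96 days later
```

The point of the sketch is the shared log: the check only works if every team writes to, and reads from, the same place.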

This pitfall becomes much more apparent in the case of unsolicited feedback.

2. Metrics not suited for the job and not consistently used across the business.

Metrics are one of my love-hate topics. It’s an incredibly important conversation to have, but all too often organizations put plenty of effort into collecting metrics and not enough into deciding which metrics are the right ones to collect in the first place.

I’m just going to say: NPS.

NPS is a topic of much controversy, and the metric often creates trouble not because of what it is, but because of how we “choose” to use it: incorrectly. The discussion doesn’t start with “What’s the right metric to help us understand CX success?” but with “Let’s use NPS (because hey, that’s what customer centric companies do!) and see what we do with the results”.

Bad call. NPS is a powerful metric, but we can’t use it everywhere, every time.
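For context, the metric itself is simple arithmetic: respondents scoring 9–10 on the “likelihood to recommend” question are promoters, 0–6 are detractors, and NPS is the percentage of promoters minus the percentage of detractors. A minimal sketch:

```python
def nps(scores: list[int]) -> float:
    """Net Promoter Score from 0-10 'likelihood to recommend' answers.

    Promoters score 9-10, detractors 0-6; passives (7-8) count only
    toward the total. NPS = % promoters - % detractors, so the result
    ranges from -100 to +100.
    """
    if not scores:
        raise ValueError("need at least one response")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

print(nps([10, 9, 9, 8, 7, 6, 3, 10]))  # 25.0
```

Nothing in that arithmetic tells you *where* the score makes sense to ask, which is exactly the decision that gets skipped.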

Before you launch any VoC program, take your time and put a VoC strategy together. Link it to your wider CX / Customer Strategy, or, if you don’t have one, link it to your Business Strategy. Select a metric that’s right for you.

Once chosen, apply that metric (or metrics) consistently across all your CX work. You can’t compare apples to apples if you don’t talk apples to apples.

3. Lack of an Insights to Action Framework.

Collecting data and feedback is one thing; what you do with the resulting insights is another. We’ve become really good at collecting any data we can possibly think of, yet analyzing, let alone acting on, that data is something many of us are still struggling with.

An Insights to Action (I2A) framework outlines how to collect the voice of your customers, make sense of that data by converting it into insights and identifying root causes, and enable decision making and action on the feedback. Having a solid understanding, and a framework in place that guides how you share insights to drive action, is crucial.

I’ve said it before and I say it again, there’s no point collecting feedback if you’re not doing anything with it.

An I2A framework also guides your team and company structures. While a centralized approach is great for some activities (e.g., governance, see above), when it comes to sharing insights and enabling action we need to democratize data and data access with a decentralized approach, as the sketch below illustrates.
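Here’s what that routing can look like in miniature; the themes, keywords, and owning teams below are purely illustrative stand-ins for real text analytics and a real org chart.

```python
# Illustrative I2A routing: tag each verbatim with a theme, then hand the
# insight to the team that owns that theme, not just the team that
# happened to run the survey.
THEME_KEYWORDS = {
    "billing": ["invoice", "charge", "refund"],
    "delivery": ["late", "shipping", "courier"],
    "support": ["agent", "wait", "callback"],
}
THEME_OWNERS = {"billing": "finance", "delivery": "logistics", "support": "contact-centre"}

def route(verbatim: str) -> list[tuple[str, str]]:
    """Return (theme, owning team) pairs for one piece of feedback."""
    text = verbatim.lower()
    hits = [(theme, THEME_OWNERS[theme])
            for theme, words in THEME_KEYWORDS.items()
            if any(w in text for w in words)]
    return hits or [("uncategorized", "cx-team")]

print(route("Great agent, but my invoice was wrong and the refund is late"))
# -> [('billing', 'finance'), ('delivery', 'logistics'), ('support', 'contact-centre')]
```

One piece of feedback, three owners: that’s the decentralized part. The survey may have been run by the support team, but the billing and delivery insights still reach the teams who can act on them.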

Getting your team structures right is vital to enabling your CX success and driving a culture of customer centricity.

I hope that was useful.

In the second part of “Did you know that VoC Maturity can be broken down into 2 phases?” in February, we’ll be discussing “Did you know that the second phase of VoC maturity is all about tech acceleration?”.

In the meantime, feel free to reach out should you have any questions, or check out some of our “foundational” DYKs from last year if you’d like to learn more. We have a few to choose from.

Got any questions? Get in touch today!

#VoC #CX #customerexperience #VoCmaturity #technology #metrics #InsightstoAction
