As a researcher, I often catch myself falling prey to the ‘shiny object syndrome.’ I’m excited by the latest bit of data or a cutting-edge hypothesis, constantly thinking about what the next big thing for our members will be.
Which has its merits – of course.
But, revisiting past ideas and refining those – well, that has merit too. A senior leader at a large financial services company once told me, “Lara, I can’t shift my strategy 180 degrees each year. But when you provide insight around common themes and help me fill out another piece in the framework to provide top notch service…that’s what I find huge value in.”
Fair enough. So, we decided to take a look at one of the best known pieces of our work – the Customer Effort Score, or CES.
To be sure, CES is just one part of a much larger body of customer effort work. But, its simplicity and power made it easy to communicate across the customer service world.
A simple question, “How much effort did you personally have to put forth to handle your request?”, asked on a 5-point scale from very low effort (1) to very high effort (5), proves to be an extremely strong predictor of future customer loyalty: 96% of customers who report high-effort experiences become more disloyal in the future, compared with only 9% of those who report low-effort experiences.
It is quick and easy for customers to answer, easy to implement across different service and survey channels, easy to track over time, and correlates with business outcomes.
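Part of that ease comes from how little arithmetic the metric requires. As a hypothetical illustration (the response data and the “high effort” cutoff of 4+ are invented for this example, not part of the published methodology), scoring a batch of 1–5 responses might look like:

```python
# Hypothetical CES responses on the 5-point scale
# (1 = very low effort, 5 = very high effort).
responses = [1, 2, 2, 3, 1, 5, 4, 2, 1, 3]

# Average effort score for the period (lower is better on this scale).
average_ces = sum(responses) / len(responses)

# Share of "high effort" interactions; the 4+ cutoff is an
# illustrative assumption, not a published threshold.
high_effort_share = sum(1 for r in responses if r >= 4) / len(responses)

print(f"Average CES: {average_ces:.2f}")        # → Average CES: 2.40
print(f"High-effort share: {high_effort_share:.0%}")  # → High-effort share: 20%
```

Tracked period over period, a falling average and a shrinking high-effort share would be the trend lines a service leader watches.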
And with 61% of the CEB Customer Contact membership saying they measure customer effort (with another 24% planning to do so in the next 6-12 months), it is clear that effort measurement is more than just a fad.
Over the years, we have partnered with many service organizations to incorporate CES into their survey mechanisms – largely to great success. But, as we rolled it out globally we found three key themes emerging that indicated room for improvement:
- Inconsistent Interpretation: Both the scale (where 1 is ‘good’ and 5 is ‘bad’) and the wording itself are open to misreading, with some customers inverting the scale and others feeling that the question probed whether they had done enough on their own before contacting the company.
- Uneven Global Applicability: “Effort” does not translate neatly into all languages, and in fact carries different connotations across cultures and regions.
- Lack of Benchmarking Capabilities: Because of the above, many companies adapted CES to fit their own customer base, which made cross-company comparisons difficult.
Given these findings, we went back to the drawing board to see if there could be a better CES metric, one that would keep the upsides of the original while closing some of its emerging gaps.
We tested a host of different effort questions – everything from gauging customer expectations to time spent getting an answer – across a sample of nearly 50,000 customers from a range of companies, industries, and regions. And we found one measure that was head-and-shoulders above the rest. In fact, it was nearly 25% more predictive of customer loyalty than the next best metric.
So…drum roll please…here it is – what we affectionately have been calling CES 2.0:
Like its predecessor, it’s a simple question that has all of the same benefits for customers and companies alike.
And, it is 1.8x more predictive of customer loyalty than customer satisfaction (CSAT) measures, and 2.0x more predictive than Net Promoter Score (NPS).