21 years ago, back when Justin, Britney and Christina were all still in the Mickey Mouse Club, I was a young, very inexperienced Marketing Executive intern huddled over a high-powered (486) computer, trying to work out if I was about to make a career-limiting decision.
In a glamorous jet-set location (Runcorn) and working in a trendy, customer-centric and fashion-conscious company (ICI Chemicals and Polymers – purveyors of the finest Chlorine and Sodium Perborate in Europe), the numbers had been crunched and the analysis was complete, but there was a data line that did not fit the trend and it was... worrying.
This was 1993 – a world of Jurassic Park, the Ford Mondeo and the Vauxhall Corsa, with The X-Files just around the corner, and where a massive 0.5% of the UK population had access to the internet. Not that there was much to access; no Internet Explorer (initial release Aug 1995), Google (founded Sept 1998) or Wikipedia (launched Jan 2001).
We also didn’t have sophisticated customer experience measures; Net Promoter Score got going in 2003 (Harvard Business Review), and the Customer Effort Score arrived 7 years after that (HBR 2010). The “best practice” standard measure was Customer Satisfaction – but ICI was ahead of its time, and we had commissioned a huge survey of nearly 100 questions built around a competitor comparison. For each topic the customer was asked to score ICI against our best competitors.
And score us they did – with separate scores on price, product, packaging, product quality, delivery, chemical grades, technical expertise, safety, availability of stock, lead time, clarity of invoice – just about every dimension you could imagine.
It was fantastic – not least because ICI was ranking top vs our competitors on just about every single measure.
That’s not what had me worried. The reason I was reaching the head-in-hands stage of analysis at 8pm on a rainy Wednesday was the last question on the survey.
A free format box, where customers were invited to share “any other comments”.
We didn’t have any advanced natural language processing software, so the verbatims were passed down the chain of command to the lowest rung (me) and I did what marketing executives everywhere have done when handed a pile of customer feedback forms with several hundred verbatim comments – I read them.
The problem: what I was reading did not match what was showing up in the competitor ranking data. Not even a little bit. According to the scores, we were “best in market” and consistently ranked Number 1 or 2 in every measure. Whereas the verbatims were... less positive.
In the end I made the decision, wrote the Executive Summary, submitted the report and, unsurprisingly, found that my internship did not extend into a graduate job. Perhaps I should have picked a less colourful customer verbatim to quote in the summary, but it simply captured the mood and theme of all the customer comments, and blew a hole in all the lovely competitive rankings:
“You are not SO bad that we can’t do business with you”
21 years later, having worked with customer feedback, research and measures across many industry sectors, I am still not convinced that the truth lies in a number, be it NPS, CSat or CES.
Guessing what might be important to your customers, and then asking them to score you on that, doesn’t work. All the scoring questions have the same inherent problem – the company is defining and forcing the agenda, telling the customer what they are allowed to talk about. It doesn’t matter which score you use – NPS, Customer Effort or even Customer Satisfaction – they all have this inbuilt imbalance.
The truth about your service, the real insight into the heart of your customers’ emotions, only shows up when you ask them, and let them talk in their own words about whatever is important to them.
Take a look at our Infographic to find out how to make the most of Customer Feedback by asking at the right points throughout the Customer Journey: