
Customer Experience: More Than A Number

So Many Options

Customer experience has become one of the most polarizing topics discussed in our industry. Whether you are attending a conference or networking with peers, you are almost certain to hear people discuss the pros and cons of the various methods of measuring customer experience. These range from traditional metrics such as Customer Satisfaction and Net Promoter Score (NPS) to newer measures like Customer Effort and the Word of Mouth Index (WoMI). There is also a growing discussion about whether we should abandon surveys altogether and focus on other tools like Speech Analytics. All have merit, yet the sheer volume of options can be overwhelming.

Given all of the different viewpoints on how to measure customer experience, the last thing I want to do is imply that one position is right and the others are wrong. My opinion is that the appropriate measure can vary depending on your specific industry, customer, or business need. Instead of offering another opinion in an already crowded discussion, I want to share what we do and why we do it, because customer experience is the most important metric in Operations at Cars.com.

More than a Number

Before I start, there is one key point that needs to be made. Regardless of the measurement used, it is wrong to think that simply choosing a method to measure customer experience will translate into anything meaningful. Customer experience is not a single number. For your customer experience program to be successful, you must perform significant analysis on the data collected to identify opportunities for improvement, and then take action!

Cars.com’s Customer Experience Process

At Cars.com we have been using WoMI to measure experience both when a consumer or dealer contacts our Customer Care team (our contact center survey) and when we conduct our Dealer relationship survey. There were a number of reasons we switched from NPS to WoMI, the biggest three being:

  1. WoMI allowed us to identify our true detractors so we could better understand our opportunities for improvement. 
  2. We found WoMI to be a more accurate measure of customer loyalty.
  3. WoMI enabled us to more confidently benchmark ourselves against other companies.

If you are interested in more details about WoMI specifically, a good source of information is Innovating Analytics by Larry Freed, the former CEO of ForeSee (who created WoMI).  The book does an excellent job outlining how WoMI works and providing real-world examples of it being used (including Cars.com).
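To make the difference between the two measures concrete, here is a minimal sketch of how NPS and WoMI are commonly calculated from 0-10 survey responses. The scoring thresholds follow the standard definitions described in Freed's book; the data and code are illustrative only, not our production reporting.

```python
def nps(recommend_scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    n = len(recommend_scores)
    promoters = sum(1 for s in recommend_scores if s >= 9)
    detractors = sum(1 for s in recommend_scores if s <= 6)
    return 100 * (promoters - detractors) / n

def womi(recommend_scores, discourage_scores):
    """Word of Mouth Index: % promoters minus % "true detractors" --
    respondents who say they are likely (9-10) to actively discourage others."""
    n = len(recommend_scores)
    promoters = sum(1 for s in recommend_scores if s >= 9)
    true_detractors = sum(1 for s in discourage_scores if s >= 9)
    return 100 * (promoters - true_detractors) / n

# Example: the same respondents can look very different under each measure.
recommend = [10, 9, 9, 8, 7, 6, 5, 3]   # "How likely are you to recommend us?"
discourage = [0, 0, 1, 0, 2, 1, 9, 10]  # "How likely are you to discourage others?"
print(f"NPS:  {nps(recommend):.1f}")    # counts every 0-6 score as a detractor
print(f"WoMI: {womi(recommend, discourage):.1f}")  # counts only active discouragers
```

In this hypothetical sample the two measures diverge because only two of the low recommend scores come from people who would actively discourage others, which is exactly why WoMI helped us identify our true detractors.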

In addition to using WoMI, we found that measuring in two different channels was invaluable. The contact center survey gives us immediate feedback on how we did with a specific interaction and how well we take care of our customers when they need help. Our Dealer relationship survey is conducted quarterly and, because it is not tied to any one interaction or transaction, provides a more thorough understanding of how our Dealers view Cars.com as a whole. It also takes the emotion (good or bad) that immediately follows an interaction out of the equation.

With both of our surveys we have standard processes to share the results throughout our organization while also responding to any immediate needs of our customers. Our contact center survey results are delivered daily to our Customer Care leadership team, as well as to our Speech Analysts (more on them later). If the interaction tied to a survey requires coaching or a customer follow-up, the Speech Analysts coordinate those efforts, which are usually carried out by a Manager or Team Lead. The relationship survey also gives the responding customer the option to request that someone contact them, and our Sales team reaches out within a day of receipt.

The aggregated results for both surveys are reviewed monthly and quarterly to identify larger trends that impact our customers. We work with individual departments to review the data so that we can make improvements.

During this process we also provide recommended actions, document the agreed-upon actions to be taken, set a target completion date, and then follow up until they are completed. This process has initiated changes in everything from core features on our website to the collateral our Sales team uses when meeting with our Dealers.

Once that process is complete the scores and analysis are shared quarterly with our Senior Management Team as a part of our company’s balanced scorecard.  This team includes our VPs and above, all the way through the C-Suite.  The discussion begins with the numeric scores but then highlights key trends and the actions being taken to further improve the customer experience.  It is also an opportunity for us to ask that resources be added or reallocated if a large-scale change is needed.

It’s Okay to Use Multiple Measures

Surveys are not the only tool we use to manage customer experience at Cars.com. Last year we implemented Speech Analytics so we could spot emerging trends and find more ways to improve our service. Our Speech Analysts look to correlate the experience of a customer (good or bad) with keywords or phrases in our speech tool so that we can optimize future calls. This analysis occurs daily as survey results are received.
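As a simplified illustration of that correlation step, the sketch below compares how often a phrase appears in calls from detractors versus promoters. The transcripts, keywords, and thresholds are hypothetical; a real speech analytics platform does this at far larger scale, but the underlying idea is the same.

```python
from collections import Counter

# Hypothetical (transcript, survey score) pairs for calls that received a survey.
calls = [
    ("I was told my listing would be updated and it never was", 2),
    ("thanks, the rep fixed my billing question right away", 10),
    ("still waiting on the billing credit I was promised", 3),
    ("great help updating my dealer profile", 9),
]

keywords = ["billing", "listing", "waiting", "credit"]

detractor_hits, promoter_hits = Counter(), Counter()
for transcript, score in calls:
    bucket = detractor_hits if score <= 6 else promoter_hits
    for kw in keywords:
        if kw in transcript.lower():
            bucket[kw] += 1

# Phrases that cluster in low-scoring calls point to emerging issues worth
# coaching on or escalating before they turn into larger trends.
for kw in keywords:
    print(f"{kw:10s} detractor calls: {detractor_hits[kw]}  promoter calls: {promoter_hits[kw]}")
```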

For the past three years we have also leveraged a monthly case analysis report focused on our inbound contacts. The analysis evaluates both the frequency of issues and the length of time it takes to resolve them. Combining those two data points, we create an index that helps prioritize the process or technology enhancements that will improve both our customer and employee experience.
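As a hypothetical sketch of that index (the issue names, volumes, and weighting below are illustrative, not figures from our actual report), it can be as simple as multiplying how often an issue occurs by how long it takes to resolve, then ranking the results:

```python
issues = [
    # (issue, contacts per month, avg resolution time in minutes)
    ("password reset",       420,  4),
    ("listing photo errors", 150, 22),
    ("billing dispute",       60, 45),
]

# Index = frequency x resolution time, i.e. total handle time the issue consumes.
# The highest-index issues become the first candidates for process or technology fixes.
ranked = sorted(issues, key=lambda row: row[1] * row[2], reverse=True)
for name, freq, minutes in ranked:
    print(f"{name:22s} index = {freq * minutes:6d} minutes/month")
```

Ranked this way, a moderately frequent but slow-to-resolve issue can outrank a very common but quick one, which is the point of combining the two data points rather than looking at volume alone.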

What’s Next?

Looking forward we will continue to explore new methods and technology for measuring customer experience.  However, I do not anticipate significantly changing our current program.  While we may look at new data points, as we did last year when we introduced Speech Analytics, it will be more from the perspective of how the data correlates to our existing measures and how we can refine our program overall.  The reason? We know that the measurement we use is not as important as the improvements we drive with our customer feedback.
