Email Quality Monitoring Case Study: You Don't Know What You Don't Know

Filed in Blog, Call Centers, Customer Service on March 5, 2013

A client of ours has been conducting quarterly studies with us to keep a pulse on their consumer satisfaction scores throughout the year.  Their scores were high and comparable to other companies in our benchmarking database.  However, our client wanted to move the needle from good to great.

The Challenge:  You don’t know what you don’t know

Our client had previously used the customer satisfaction data only as a measurement tool, which was very effective for their needs. Now they were interested in finding out where and how they could improve. The survey research easily gave them the "how" to improve.

In order to find out the "where", we did a deep dive into the verbatim comments, which helped us isolate the leading cause of dissatisfaction. To no one's surprise, it was that consumers felt their questions were not answered. The challenge, however, was finding out why a question went unanswered.

We have found that consumers feel their questions aren’t answered because:

  • The standard response really didn’t address their specific question (a management issue)
  • The wrong standard response was provided (an agent issue)
  • They just weren’t happy with the response even though it was given correctly and with elegance (an unfortunate issue)

The Solution:  Email quality monitoring!

The only way to find the true cause of consumers feeling their questions weren't answered was to go back and conduct email quality monitoring on those cases. Through email quality monitoring, we were able to match each consumer's question with the response provided by the representative. After reviewing all of the cases, we started seeing trends in the data and were able to help develop a clear path to improvement.

The Result:  A square peg will never fit into a round hole

The email quality monitoring told us that the standard responses were just too, well, standard.  Although the consumers were given answers that related to their questions, the responses just didn’t quite fit the exact question being asked.  This also helped explain why many consumers complained via the survey that the company’s response was not personalized, sincere or helpful for their situation.

When discussing the email quality monitoring results with the client, we spoke about the need to personalize these scripts and to create some new templates as well. We realize that it's not practical (or even possible) to create a standard response for every issue that comes up. However, if you have the right agents answering your email, agents who excel at reading comprehension and writing, you can empower them to mix and match responses appropriately.

If consumers don't get an answer to their question, some may contact you again for clarification (which defeats the efficiency you thought you were creating with the standard responses). Most, however, will not bother, and there is a chance they will purchase your competitor's product instead.

Consumer satisfaction research is important, but its value increases when you pay attention not just to overall trends but also to individual cases. By conducting both call and email monitoring on the consumers you survey, you are able not just to measure the customer experience but also to improve it.

Do you conduct email quality monitoring in your call center?  What are some things you found out that you didn’t know before?



Comments (1)


  1. Great perspective you’re offering, Dawn! It’s particularly significant that you point out “the right agents who excel at reading comprehension and writing.” That’s why I created an email skills assessment tool to accurately predict who can choose the best template based on their understanding of the customer’s message. Unlike many writing tests, this one evaluates the reading comprehension piece as well as template selection, and not just grammar. Lots of companies miss that element and waste time training agents who are not likely to represent them well in either email or social media.