Support Operations

Melissa Kovacevic
Quality Calibration: You Say It’s 80, I Say It’s 90...Let’s Call the Whole Thing Off!
Written by Melissa Kovacevic, Mar 20, 2012

This blog originally appeared here, and is reposted with the author's permission. —Ed.

I love visiting contact centers and working with leadership and quality teams. After thirty-plus years in the industry, I will say that I’m still happy with my career choice. I certainly have had a lot of interesting experiences working with all those centers, but nothing compares to some of those knock-down, drag-out calibration sessions pitting supervisor against supervisor, supervisors against quality analysts, supervisor against manager. You get the picture.

As I’ve sat and observed the interactions (the eye-rolling, the near name-calling, the defense of what some participants described to a supervisor as “your pet agent”), I've wondered where the customer experience is in all of this.

When the scores don’t add up, too many calibration sessions become more about “I’m right and you're wrong” finger-pointing than about how the call affected the customer. I’ve even seen some managers sidestep the infighting entirely by scheduling calibration sessions once a quarter or even less often, instead of taking steps to improve them.

In order to have productive (and yes, professional) calibration sessions, we need to set some ground rules. For instance:

  1. Opinions are just that: opinions. Monitoring should be based on facts, not on rating a call highly because “Mary meant well” or “John’s worked here a long time.” Consistency in how we rate agent skills is important.
  2. Listen for the customer’s “moments of truth”: accuracy, timeliness, problem resolution, empathy, listening, and so on. Why did the customer contact us, and did we take care of it? If not, was it the agent’s issue or a policy/procedure that prevented resolution (and needs revision if possible)?
  3. If you don’t agree on the scores, ask why not (a quick pre-session check like the sketch after this list can show where the biggest gaps are). Discuss it rationally, not emotionally, and don’t take the discussion personally.
  4. Make sure everyone understands what your customers expect and need for a positive experience: customer feedback, surveys, comments the customer makes during the call, and CSAT scores (customer satisfaction, though given the calibration infighting it seems fitting that I found out CSAT also stands for Combatshootingandtactics.com). Monitoring may also include checks for sales skills and revenue generation; a satisfied customer means a greater opportunity to sell more. In short: the customer experience.
  5. Repeat #4. The customer rules, and providing what they need and want will determine whether the call was an 80, a 90, or wherever it lands in whatever scoring system you use.
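
If it helps to make the “80 versus 90” gap concrete before anyone starts defending a score, a quick pre-session tally of how far apart the raters landed on the same calls can show which calls are worth the discussion time. The sketch below is only an illustration, not something from the original post; the call IDs, rater names, 0–100 scale, and five-point threshold are all assumptions you would swap for your own.

```python
# Hypothetical example: everyone scores the same calls independently before the
# calibration session; calls with the widest score spread get discussed first.
from statistics import mean, pstdev

# Scores by call, one entry per rater (names, IDs, and the 0-100 scale are made up).
scores_by_call = {
    "call_1001": {"Supervisor A": 80, "Supervisor B": 90, "QA Analyst": 84},
    "call_1002": {"Supervisor A": 72, "Supervisor B": 74, "QA Analyst": 73},
}

SPREAD_THRESHOLD = 5  # points of disagreement worth spending session time on

for call_id, scores in scores_by_call.items():
    values = list(scores.values())
    spread = max(values) - min(values)  # widest gap between any two raters
    status = "discuss first" if spread > SPREAD_THRESHOLD else "roughly aligned"
    print(f"{call_id}: average {mean(values):.1f}, spread {spread}, "
          f"stdev {pstdev(values):.1f} -> {status}")
```

However you slice the numbers, the point of the exercise stays the same as rule #5: the discussion should end at what the customer needed, not at who was “right.”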

This is just some food for thought, and I know there are more great calibration ideas out there. I hope you’ll share some of your calibration stories here or on Twitter or LinkedIn.
