Published: October 29, 2017
Mention “training evaluations,” and most training professionals will pass right on by. Don’t get me wrong: we trainers crave performance feedback and look for ways to improve our work, but many evaluation schemes are so complex and demand so much time and so many resources that they are out of reach for many contact center training departments. So we resort to counting participants, counting training hours, and relying solely on end-of-course “smile sheets” to evaluate our training. I suggest that we can do better.
First of all, end-of-course evaluations or “smile sheets” are helpful for evaluating training, as long as they provide information that is actionable. If the assessment is just a checklist of agree-or-disagree statements, how actionable are the findings? Often they are not. A useful resource for designing actionable smile sheets is Dr. Will Thalheimer’s book, Performance-Focused Smile Sheets: A Radical Rethinking of a Dangerous Art Form.
But let’s go even further back, before the end-of-course surveys and before the course is even held. The key to proper training evaluation is knowing what the business owner is looking for in the people who complete the training. If you haven’t gotten clarity on that, your training is unlikely to be successful. For some training, such as new hire training, you might need to revisit those expectations periodically. For example:
- What are the top 10 call issues, and are new hires trained to address them?
- Are they able to quickly navigate the knowledge management system to find answers (as opposed to memorizing the answers)?
- Can they quickly search and find customer information in your systems?
- Do they follow proper processes for privacy, verification, documentation, and providing resolutions?
- What are the things that are most critical for them to know and do when they first start taking calls, and is training designed to deliver those things?
You might have to go back and tweak the design of your new hire training to make sure that changing expectations are satisfied. Explicitly knowing what your business partners expect from participants after training, and mutually agreeing on how to measure those expectations, is critical both for delivering good training and for identifying which business results will indicate the training’s success. That simply takes a conversation with people who have grown to know and respect your training insights and expertise, which requires building good, professional relationships.
Other sources of useful, reliable information on the success of any training include participants’ quality monitoring results immediately after training. Look specifically at the area(s) where training was supposed to make an impact. Are they doing better? If so, that is a clear indication that the training had the desired impact on participant behavior. Also, talk to their frontline supervisors to see what differences they may or may not notice in the participants. It is incredible how much a frontline supervisor will notice, so making them your partners will provide tons of useful information before and after the training.
I am also a great proponent of doing another participant survey 14-45 days after the training—after the participants have been able to apply what they learned. I keep the survey short (no more than 5-7 questions) and I ask things like:
- Now that you have been doing the job for a little while, how well prepared do you think you were, at the end of classroom training, to put what you learned into practice on the job? (A wide variety of positive, neutral, and negative answers are provided for respondents to choose from.)
- Now that you have been on the floor for a little while, which of the following were true about your course instructor? (A wide variety of positive, neutral, and negative answers are provided for respondents to choose from, with the ability to select more than one answer.)
- After the course, when you began to apply your new knowledge on the job, which of the following supports were in place for you? (This can be helpful for the business owner as well as for the training team.)
- What is something you wish you had learned in training but did not? (Open-ended responses)
- What did you learn in training that has proven to be the most helpful to you on the job? (Open-ended responses)
By surveying training participants after they have had time on the job, you will learn whether they really did apply new or refresher learning, and you will also learn where the training might not be as effective as it could have been. You might still have to dig for specifics, but the post-training survey helps balance out the optimism of end-of-course surveys. It also prompts participants to think back on the training and reminds them to use things they might have let fall by the wayside.
Contact center training departments can evaluate training without having to put a lot of resources into place. If an evaluation is not providing helpful information that makes your training better, it is a waste of time; but evaluation doesn’t have to be so cumbersome that it is impossible to do correctly. Each year, look at how you are evaluating every bit of your training and consider what you can do to get better evaluation data for your training programs. Make sure, too, that you are using existing resources like quality monitoring and frontline supervisors to evaluate and support your training efforts.