ICMI is part of the Informa Tech Division of Informa PLC


Evaluate Training Effectiveness with Learner Feedback

Successful organizations evaluate their training programs to optimize learning transfer from the classroom to the job. You’re probably familiar with commonly used methods to evaluate training effectiveness, such as learner feedback, tests and assessments, role plays and simulations, observation, and customer surveys.

As part of a comprehensive training evaluation approach, these methods form a strategy to determine if a class or curriculum has had the desired impact on employee performance. Some training teams add more detail to their training evaluation yardstick, using metrics like return on investment or strategic impact, but most training managers take a more pragmatic view. They ask, “Is what we are doing in the classroom influencing performance?”

Fueled by that pragmatic mindset, most trainers take the first step toward training evaluation with a post-class survey that asks the learner to provide their perspective on the learning experience. Gathering learner feedback is a quick and inexpensive starting point to gauge what’s happening in the classroom, understand the learner experience, and refine training design and delivery.

Here are five easy-to-implement ideas you can use to boost the effectiveness of learner feedback as an evaluation method:

Gather learner feedback early and often.

Post-course surveys can help improve future training classes, but they don’t provide real-time feedback for real-time adjustments. Instead, add in-progress touchpoints for trainers to solicit learner feedback throughout a learning event. These can include:

  • Daily pulse-checks for multi-day classes
  • Weekly surveys for multi-week curricula
  • Standard questions at transitions between modules

Formative data helps trainers adjust their delivery in real time, and it provides context and an early warning if something is going wrong. It also gives trainers useful information as learners move into tests and assessments.

Craft learner-centered survey questions.

Questions designed to solicit feedback about the learner’s experience shift the focus from critiquing the trainer or the content to sharing unfiltered input about their experience as a learner. It's a subtle difference that can substantially impact the quantity and the quality of feedback offered.

See the difference in these two sets of examples:

  • “The course content was engaging.” (trainer-centered)
  • “I was engaged in the exercises and discussions.” (learner-centered)

and:

  • “The practice exercises were practical and relevant.” (trainer-centered)
  • “After this training, I feel prepared to apply what I learned in my job.” (learner-centered)

Adjust survey questions to pivot away from what the learner thought about the training. Learner likes and dislikes are interesting, but the goal is a sound learning event that results in learning transfer. The most substantive questions focus on the learner experience and on learners’ preparedness to apply what they learned on the job.

Here are some more examples of these learner-centered survey questions:

  • “Was what you learned today relevant to your job?”
  • “Do you anticipate any barriers to applying what you learned on the job?”
  • “What do you think the impact of what you learned today will be when you return to work?”
  • “Rate your confidence in applying the new process on a scale of 1 (least confident) to 5 (most confident).”

Survey questions like these also prompt the learner to reflect more deeply on their learning experience, confirm how the course content applies to their job, and solidify their intention to apply what they learn when they return to work.
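Rating-scale questions like the confidence item above also lend themselves to simple aggregation. As an illustrative sketch only (the class names, record fields, and the 3.5 follow-up threshold are assumptions, not a standard; adapt them to however your survey tool exports data), 1-to-5 confidence ratings could be tallied per class like this:

```python
# Illustrative sketch: summarize 1-5 confidence ratings from post-class surveys.
# Class names, field names, and the 3.5 threshold are hypothetical examples.
from statistics import mean

responses = [
    {"class": "New Hire Week 1", "confidence": 4},
    {"class": "New Hire Week 1", "confidence": 2},
    {"class": "Escalations Refresher", "confidence": 5},
    {"class": "Escalations Refresher", "confidence": 4},
]

# Group the ratings by class.
by_class = {}
for r in responses:
    by_class.setdefault(r["class"], []).append(r["confidence"])

# Report the average per class and flag any class that may need follow-up.
for name, scores in sorted(by_class.items()):
    avg = mean(scores)
    flag = "  <- follow up" if avg < 3.5 else ""
    print(f"{name}: average confidence {avg:.1f}{flag}")
```

Even a lightweight summary like this makes it easier to spot which classes leave learners feeling unprepared before those learners move into assessments.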

Gather learner feedback after learners are on the job.

After a training class, it’s not uncommon to hear, “Class was great. I loved it, thanks!” That kind of feedback is always satisfying, but feedback gathered after the learner has returned to work is more telling than the immediate reaction. A week or two after the training event, reach out one more time and ask:

  • “What have you applied from the class you attended?”
  • “Were you prepared to apply the knowledge you learned on the job?”
  • “What would have made the training experience more effective and practical?”

Clarify what your learner feedback data tells you and your stakeholders.

It’s true that learner feedback doesn’t necessarily reflect whether a course was successful, and it isn’t entirely objective or reliable. But it can provide valuable root-cause data about the critical factors that influence learning and skill transfer.

Did a quarter of the class fail the end-of-class assessment? Are learners unable to use what they learned after a training class? Return to the class evaluations to see how learners perceived the class, whether the barriers they identified were resolved, and how they rated their readiness to apply what they learned. Learner feedback can provide valuable clues when learners cannot demonstrate newly acquired skills or knowledge.

Create an end-to-end process for learner reaction data.

Class is over, and the feedback surveys are in. Now what? In many organizations, that’s it; managers and trainers take a cursory glance at the results and move on to the next class.

If you ask your learners to provide feedback, you are responsible for using it. Compile the results and the comments, and ask yourself the following questions:

  • Which trainers, classes, and formats elicit the most positive reactions? Why?
  • What are the feedback themes and roadblocks to learning transfer?
  • What changes will you make to the course content, format, or delivery based on the feedback?
  • What are the areas where you’ve invested, and it shows? What are the training team’s big wins for the year?

Then share the results, as appropriate, with the contact center every quarter to demonstrate your commitment to being open to feedback, continuous improvement, and growth.

Learner feedback is just the start of a comprehensive training evaluation strategy. While it may not carry the weight that tests, assessments, observations, and other measurement tools do, this easy-to-collect data helps us connect more effectively with our learners, gather in-flight reactions for course correction, and detect early roadblocks.