How do you measure the success of training and education programs? In a decade of building skill development workshops for business leaders, I looked at three things: butts in seats, satisfaction scores, and contract renewal rates. Over time, our team cracked the code to maximize those three metrics.
But I often felt that something was missing. What about IMPACT? How did we know our clients were adopting new behaviors over the long term? Accomplishing new, meaningful results as leaders in their organizations? Having new conversations that led to new ways of working together? Solving old problems that had recurred in some cases for decades?
What happens after the class is over, after the workshop ends, after the eLearning is complete? And, as learning designers, how do we build training experiences that reliably lead to long-term change?
In conversations with our learners months and even years after a workshop, it was usually clear the experience had left an impression. In many cases, participants described having had an epiphany: the workshop was the moment that they began seeing the world, and their role as leaders, differently. But, hearing their stories, it was much less certain that they were carrying the practical lessons and principles they’d learned into their working lives and applying them in their behavior day-to-day.
Shifting the focus to behavior change
In leadership training, participation and satisfaction are essential. Strong participation means the people who need the training take the training. Strong satisfaction (often) means people who take the training walk away inspired to apply what they’ve learned. A training business cannot survive without the ability to deliver both.
The best trainings, though, go one step further and make it easy to translate inspiration into effective day-to-day action… and results.
One of many topics we focused on with our learners was the art of giving feedback. Somewhat counterintuitively, we found that leaders often struggle just as much to deliver positive feedback as they do negative, and that mastering, and regularly practicing, positive feedback can be a game-changer for the morale of their teams, and even organizations.
With positive feedback, we aimed to achieve three behavior changes:
- Changing the frequency of positive feedback from random (when the leader noticed something particularly impressive) to regular (proactively finding something to acknowledge on a daily or weekly basis)
- Changing the feedback they gave from general to specific, so that it reinforced key behaviors, norms, and values
- Changing how they delivered feedback messages from one-size-fits-all to personalized
In each case, the force of habit worked against us. We needed a way to help our participants remember to apply the techniques they’d learned amid the ups and downs of their busy work weeks, as well as to remember the details of those techniques in the moments they needed to apply them.
One way we addressed this challenge was to begin building tools to pair with the lessons our learners most needed and wanted to implement.
1. Pair your learning experience with a tool
For us, tools come in many forms. They could be:
- Talking points
- Conversation planning guides
- Decision scorecards
- Email templates
- Sample agendas
- …and much more
Each tool serves as a practical “job aid”: a short reference and/or starting point that helps a learner apply a concept to a real situation in their work life. Among other formats, we’ve shared tools in paper workbooks handed out during live sessions, in editable PDF documents emailed to participants after a virtual session or eLearning module, and as digital guides that can be personalized and saved in a learning platform and pulled up via smartphone later on.
2. Bridge the gap between the moment of learning and the moment of need
Many learners never take the step to apply what they’ve learned because they don’t pin down a specific time to do so. What they were excited to try during a training session may be completely out of mind when a situation comes along that calls for applying that learning. For busy leaders, simply remembering to go through the exercise of giving regular feedback may be a difficult first step.
A feedback tool needs to help them close that gap.
One way to do that is to present a range of recurring situations where the learner might incorporate feedback, and ask them to commit to one. Will it be during an upcoming 1:1? During a scheduled team meeting? A performance review? A Friday afternoon? A good tool might prompt the learner to schedule (and perhaps even script) that feedback during the course or workshop, while the learning is still fresh in their mind.
3. Isolate what makes the learning most difficult to apply
Think practically about what obstacles stand in the way of applying the learning to a real situation. Is the technique complicated to remember, with many steps required to apply it well? Will the learner have to endure any anxiety or discomfort to do so? Are there common misconceptions or misunderstandings about the situation that make success less likely? Those are the kinds of failure points the tool should help overcome.
It’s critical not to overlook this step when creating a tool. We’d typically spend hours interviewing a series of learners to understand what made applying the concepts most difficult for them in practice, then center our tools on what we discovered.
For giving positive feedback, one difficulty we heard was that the level of detail required for highly impactful feedback felt unnatural to many leaders, and so was fairly uncommon; most erred on the side of enthusiastic but vague praise. It took repeated application of our feedback framework before specificity began to feel “natural”. It was therefore important that learners used the tool to plan several feedback messages, not just one.
4. Break the challenge into parts
Step three typically uncovers a handful of difficulties that are the most likely failure points for learners. The tool should help them work through all of them, one at a time.
Here’s how we might break down the difficulties of giving impactful feedback into three steps:
Step 1: Describe what the other person did in terms of a specific behavior, observed at a specific moment. Instead of “you’ve been doing a great job recently”, you might say “I appreciated what you said when you set the agenda for Monday’s team meeting”.
Step 2: Connect the dots to the impact. Explain why what they did mattered. For instance: “if you hadn’t addressed the concerns of each party so empathically, I don’t think we would have had a productive conversation, and I don’t think we’d be on the road to achieving our goal today. It was a pivotal moment, and you handled it perfectly”.
Step 3: Deliver your message in the right format or setting. Some revel in public recognition, but others prefer a quiet word or a private note. The tool should give a sense of the options available and how to choose the right format for the person you’re acknowledging.
5. Embed the tool into the learner’s flow of work
Usually, one application of the learning is not enough to cement long-term behavior change. That takes repeated practice. While some dedicated students will create their own structures to ensure they practice, it helps immensely if you can grease the skids by putting the tool where they can’t help but think about it in the moments that it can help them.
There are high-tech and low-tech ways to achieve this. Sometimes it’s as simple as posting a printed tool by the door where the learner will notice it when they start their day. Or it could be embedding a feedback tool, for instance, in a performance management platform so that they see a link to it when they fill out a performance evaluation form.
Whether high tech or low, the goal is that the tool gets used enough times that the thought process it maps becomes second nature to the learner, and that they get practice applying the concepts to situations common to their jobs and lives. The best tools get scribbled on, highlighted, and/or wrinkled up (or the digital equivalent).
We found that adding tools that followed the five lessons above made a huge difference in the kinds of stories we heard months and years down the road. By providing a concrete map for applying the lessons we taught to unique situations, and by making it more likely that learners remembered those lessons in the moment of need, we were far more likely to hear that learners had adopted new practices and behaviors, and achieved new results.
Want to learn more about smart ways to measure the success of your education programs? Watch our on-demand webinar on The Consumption Conundrum, and don’t forget to schedule a demo.