Tuesday, 6 February 2018

An Idea for Measuring the Effectiveness of L&D



Is it just me, or is nobody talking about measuring the effectiveness of Learning & Development? 

Don't get me wrong, there are plenty of options for (and discussion about) evaluating attendance and completion (aka: butts in seats), reaction, learning, behaviour, results (Kirkpatrick's Levels 1-4), and even return on investment (or ROI, Phillips' Level 5).

That's all well and good, but it's also strictly transactional. All we're evaluating is a single course, intervention, or program.

Why aren't we measuring the L&D function as a whole?


Measuring the effectiveness of L&D


Consider this situation:
Edwina heads up her company's L&D function. Her team is exceptionally diligent about measuring the training interventions (whether they're courses, job aids, quizzes or something else).
Employees love the training. Level 1 results typically hover around 4.8 out of 5.
Post-training testing shows that people who complete the courses are able to demonstrate the required skills (Level 2). Mastery scores on post-course tests average about 90%-95%.
Follow-up surveys of the employees show that they're applying the skills they learned back on the job (Level 3) and surveys of their managers show that those applications are having an impact on business results (Level 4).
Even better, training has a positive ROI. That is, the benefits of the training are greater than the cost of delivering it. 
Sounds great, right? Frankly, Edwina's doing a better job of measuring training than most of us. Time for a plot twist:
With all those great results, Edwina felt her position was secure, but she struggled to get a "seat at the table". Senior leadership didn't take her (or L&D) seriously.
The problem was that while her team did a great job executing on the training they delivered, they missed out on many opportunities to really support the business. They were too focused on the training they wanted to deliver and on doing things the way they always had.
Eventually, she was let go and replaced by someone more in tune with the needs of the business.
Ouch!

I have an idea of what her problem was. I'm sure you were expecting me to, otherwise I wouldn't have told you that story.

Actually, I think there are a few issues.
  1. Selection bias: the only people her team surveyed were those who completed training, and (for Level 3/4 surveys) their managers.
    What about all the people who don't take her training? What about those who abandon the courses in the middle? Surely they have something important to say - such as why they're not using her services. 
  2. Doing things right versus doing the right things: All her evaluations and metrics do a great job of measuring how well her team does what they do. What if there's a disconnect between what training the business needs and what Edwina's team offers?
  3. It doesn't help with resource planning or allocation. Does Edwina need more people on her team? Is she overstaffed? We can't tell. 

As Peter Drucker wrote, "What gets measured gets managed." So I propose an evaluation of the L&D function. Something we can use to measure how well we're helping the business meet their goals. With that information, we can work on better aligning ourselves to the business.

I suggest we ask every business unit and department head these five questions:
  1. To what extent do you feel L&D has contributed to your business unit's or department's success?
    • L&D is instrumental. We wouldn’t be able to succeed without their help.
    • L&D helps us to succeed.  We could manage without them, but we would be less effective.
    • L&D is irrelevant. They don’t help us meet our goals, but they don’t interfere.
    • L&D gets in the way. They make it more difficult for us to succeed.
    • I didn’t know we had an L&D team. / I don’t know what L&D does.
  2. Is there anything you'd like L&D to start doing?
  3. Is there anything L&D does that you'd like them to stop doing?
  4. What is L&D doing that they should keep doing?
  5. Do you have any other feedback?
The first question serves two purposes. It gets our clients thinking about how we contribute to their success (and how we could contribute going forward). It also gives us feedback on how well we're doing so far.

The remaining questions are optional, but give our clients a chance to weigh in on what's working, what isn't, and provide any suggestions they might have. My hope is that with the first question framing the conversation, the comments will help us get (or stay) on track. Relevance, here we come!

Here's a printable PDF of my original, but I think you'll like the next item better.
I've also set up an editable version of the survey (in Google Docs). I'm sure many of you out there have ideas to make it better, so please mark it up!

I know there's room for improvement here (hence the editable shared file), but wouldn't it be great to make statements like these?

  • 45% of the company's department heads say L&D is instrumental in helping them achieve their goals.
  • A further 35% of the company's department heads say L&D is helping them achieve their goals.
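If you collect the question 1 responses in a spreadsheet or form export, turning them into statements like those takes only a few lines. Here's a minimal sketch in Python; the response labels and the sample data are hypothetical stand-ins for whatever your survey tool exports:

```python
from collections import Counter

# Hypothetical question 1 responses from department heads,
# shortened to one label per answer option in the survey.
responses = [
    "instrumental", "instrumental", "helps",
    "helps", "irrelevant", "instrumental",
    "helps", "in the way", "instrumental",
    "helps", "instrumental", "didn't know",
]

counts = Counter(responses)
total = len(responses)

# Print each option's share of the responses, most common first.
for option, count in counts.most_common():
    print(f"{option}: {count / total:.0%} of department heads")
```

With real data, the same tally could be tracked survey over survey to see whether the "instrumental" share is growing - which is the whole point of the exercise.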
What do you think? Leave a comment, edit the file, go!