
12 Learning Metrics You Can Design Into Your Course Learning Journey

4 Min Read

Traditionally, digital learning has focused on only a few metrics - course completion and score. It's understandable - our learning systems and technology have been designed that way, and, let's face it, they are easy metrics to get our heads around. But what do they really measure? Our ability to nudge people to the end of a course and their ability to answer a few random quiz questions - and little else. What they don't measure is the effectiveness of the learning.

However, our learning systems are finally coming of age, and with this comes the ability to measure more. Technologies like xAPI, Learning Record Stores (LRSs) and integrations with business intelligence tools like Tableau and Power BI now give us the means to measure the things that help us better understand the effectiveness of our learning programs.

But what are those things?

Below we have mapped out 12 metrics you can now easily integrate into your learning journey design and then measure. We have categorised these metrics against the four levels of Kirkpatrick’s model for learning evaluation to show coverage. 
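Almost all of the metrics below are built from the same raw material: xAPI statements recorded in an LRS - "who did what, to what, with what result" records. As a point of reference, here is a minimal sketch of such a statement being sent to an LRS. The endpoint, credentials and activity ID are placeholders, not a real system.

```python
import requests  # third-party HTTP library

# A minimal xAPI statement: who (actor) did what (verb) to what (object).
# The verb IRI is a standard ADL verb; the activity ID is hypothetical.
statement = {
    "actor": {"mbox": "mailto:jo.learner@example.com", "name": "Jo Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/activities/product-module-1",
        "definition": {"name": {"en-US": "Product Module 1"}},
    },
    "result": {"score": {"scaled": 0.85}, "completion": True},
}

# POST the statement to a (placeholder) LRS endpoint.
response = requests.post(
    "https://lrs.example.com/xapi/statements",
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("LRS_KEY", "LRS_SECRET"),  # placeholder credentials
)
response.raise_for_status()
```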

Reaction - how did participants respond to the training?

  • Progress/Drop-Out Points - granular tracking now lets us follow progress through each individual learning object, even within learning modules. This allows us to pinpoint exactly where people falter and drop out of learning experiences.

  • Activity Time - while we have been able to track total learning time for a while, we can now track the time spent on each digital activity, including videos, and see exactly where people stop and pause.

  • Learning NPS - as with any customer experience, measuring how likely participants are to recommend the learning to others can be a great measure of their reaction to it (a sketch of the calculation follows this list). However, this is obviously best used for learning that people have selected for themselves.
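For illustration, here is a minimal sketch of the standard NPS calculation applied to a learning survey question such as "How likely are you to recommend this course to a colleague?" on a 0-10 scale. The sample responses are invented.

```python
def learning_nps(ratings: list[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical responses on a 0-10 "likelihood to recommend" scale.
print(learning_nps([10, 9, 8, 7, 7, 6, 9, 10, 4, 8]))  # -> 20.0
```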


Learning - how much did participants learn from the training?

  • Capability Area Scores (or Topic Scores) - overall scores are only really useful if all the questions in a quiz test the same capability, which is rare. Instead, if you group questions into capability areas (or topics) and calculate a score for each area, you get much deeper insight into where the learning is working and where it is not (see the sketch after this list).

  • Question Attempts - With modern learning analytics it’s relatively easy to look at how many attempts a person takes to answer a question. Measuring this can tell you where the learning may not be very effective. 

  • Common Misconceptions - delving a little deeper, you can look at each individual question and see where things are going well, or not. The temptation here is to look only at which questions are answered right or wrong. However, if you write realistic wrong answers, the most popular wrong answer for each question will reveal the misconception behind it and show you where to make improvements.
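A sketch of how per-topic scoring might work, assuming your analytics tool can export question-level results tagged with a capability area (the tags and results here are invented):

```python
from collections import defaultdict

# Hypothetical export: (capability area, answered correctly?) per question.
results = [
    ("pricing", True), ("pricing", False), ("pricing", True),
    ("compliance", False), ("compliance", False),
    ("product knowledge", True), ("product knowledge", True),
]

totals = defaultdict(lambda: [0, 0])  # area -> [correct, attempted]
for area, correct in results:
    totals[area][1] += 1
    totals[area][0] += int(correct)

for area, (correct, attempted) in totals.items():
    print(f"{area}: {100 * correct / attempted:.0f}%")
# -> pricing: 67%, compliance: 0%, product knowledge: 100%
```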


Behaviour - have participants applied what they have learned?

  • Learner Confidence - here we measure the shift in a person's confidence to apply what they have learned in the real world, by asking the same question before and after the learning. While strictly this doesn't measure application itself, it is a way of getting immediate feedback on the usefulness of the learning to real-world situations. To do this, simply ask a question like "As a result of this learning, how confident are you that you can…" (a sketch of the calculation follows this list).

  • Complex Scenario Results - if you build complex, multi-step, adaptive "real-world" scenarios into your learning and measure the outcome of those scenarios, you can get a realistic measure of the learner's capability to apply what they have learned in the real world.

  • Application Task Quality - most modern learning portals allow you to create learning journeys rather than just events. By building measurable real-world work tasks into the days or weeks after a knowledge-focused learning event, you can measure the learners' ability to actually apply the learning. Simply ask them to complete the task, then have a coach or manager rate the output against a rubric of objective evaluation questions.

  • Action Learning Survey - during a learning event, ask learners to digitally record an action log: things they will do or change in their day-to-day work, and when they will do them. For the actions flagged for the near future, send a follow-up survey to see whether they have done them and whether, on reflection, the learning helped.

  • Performance Surveys - we often survey participants for feedback at the end of a learning event. These are often called "happy sheets" because they only really indicate the learner's contentment with the user experience, not the effectiveness of the learning. Instead, delay this survey until weeks afterwards and ask what they have done with the learning. Better still, survey the learner's manager or coach and ask how the learner's performance in the program area has shifted.
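For the learner-confidence metric above, here is a minimal sketch of computing the shift, assuming you ask the same confidence question on a 1-5 scale before and after the learning (the responses are invented):

```python
from statistics import mean

# Hypothetical responses to the same 1-5 confidence question,
# asked before and after the learning.
before = [2, 3, 2, 4, 3]
after = [4, 4, 3, 5, 4]

shift = mean(after) - mean(before)
print(f"Average confidence shift: {shift:+.1f} points")  # -> +1.2 points
```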

 

Results - what benefits has the organisation experienced as a result of the training?

  • Business Metric Correlations - measuring the impact of learning on real-world business metrics is hard because so many other factors influence business performance, such as sales. Life would be easy if sales of a product went up after everyone completed the product module, but we all know that could happen for many other reasons. Instead, look for correlations: find patterns in your learning data that can be cross-referenced with business metrics. For example, if one group of stores performed much better in the learning than another group facing the same market conditions, and the stores with better learning outcomes also outperform on product sales, you can make a reasonable case for a causal effect.
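A minimal sketch of that cross-referencing step, assuming you have a per-store average learning score and a matching sales-uplift figure (all numbers invented; statistics.correlation requires Python 3.10+). Remember that correlation alone doesn't prove causation - the matched market conditions in the example above are what make the causal case.

```python
from statistics import correlation  # Pearson's r; Python 3.10+

# Hypothetical per-store data for stores with comparable market conditions:
# average learning score vs. sales uplift (%) after the program.
learning_scores = [62, 71, 85, 90, 78, 66]
sales_uplift = [1.2, 2.5, 4.1, 4.8, 3.0, 1.9]

r = correlation(learning_scores, sales_uplift)
print(f"Pearson r = {r:.2f}")  # r close to 1.0 suggests a strong association
```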

Showing a business outcome will always require you to dig a little deeper, but with all this extra learning data now available via xAPI, and with tools like Watershed, Tableau and Power BI, the task of exploring your data has become a whole lot easier.

This is just a selection of the metrics you can measure with the help of modern learning analytics tools. The common thread is that they have all been designed into the learning journey. Whether it's these metrics or others you want to measure, the most important thing is to think about them up front and design them into your learning journey.