Until recently, learning – particularly eLearning – was all about content and interaction, but with advances in learning analytics that approach is rapidly changing. Data now plays an equally important part in both the learning design process and in how the end learner experiences eLearning.
We haven’t been without data before now, but what we could do with that data was fairly limited by the restrictions of standards like SCORM, and not particularly useful to L&D leaders. Tracking scores and completions alone doesn’t give us the insights we need to make decisions, turning eLearning into a box-ticking exercise rather than a way to support and measure the development of meaningful capability.
As L&D leaders, we want to be performance partners to the business, but to truly fulfil that role we need data insights. So, with the advances in learning analytics in mind, what are some of the metrics other than completion and scores that we can now track in our eLearning to measure and support performance and continuous improvement?
1 - Key Learning Performance Indicators (not just Scores)
We traditionally track scores in eLearning, but what do scores really tell us about the learner’s performance and achievement of goals? Well… not much, really. But with the advancement of data analytics, it is now possible to track multiple scores or indicators within a learning experience. These indicators can be aligned to the key business or capability goals of the program and then mapped to questions throughout the experience to give much more meaningful measurements of learning and learner development.
For example, in a sales learning module you might track three key metrics: Customer Empathy, Product Knowledge and Selling Skills. Having a learner’s or cohort’s score broken down by these key indicators is far more useful and relevant when reporting back to the business, and when deciding where and how you might need to improve your program or provide additional follow-up support.
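As a rough illustration of how this mapping works, the sketch below tags each quiz question with an indicator and rolls a learner’s answers up into per-indicator percentages. The question IDs, indicator names and responses are invented for the example, not taken from any particular authoring tool.

```python
from collections import defaultdict

# Hypothetical mapping of quiz questions to performance indicators.
QUESTION_TO_INDICATOR = {
    "q1": "Customer Empathy",
    "q2": "Customer Empathy",
    "q3": "Product Knowledge",
    "q4": "Product Knowledge",
    "q5": "Selling Skills",
}

def indicator_scores(responses):
    """responses: dict of question_id -> 1 (correct) or 0 (incorrect)."""
    totals = defaultdict(lambda: [0, 0])  # indicator -> [correct, answered]
    for qid, correct in responses.items():
        indicator = QUESTION_TO_INDICATOR[qid]
        totals[indicator][0] += correct
        totals[indicator][1] += 1
    # Percentage correct per indicator, rather than one overall score.
    return {ind: round(100 * c / n) for ind, (c, n) in totals.items()}

print(indicator_scores({"q1": 1, "q2": 0, "q3": 1, "q4": 1, "q5": 0}))
# → {'Customer Empathy': 50, 'Product Knowledge': 100, 'Selling Skills': 0}
```

The same roll-up works at cohort level: average each learner’s indicator percentages to see where the whole group needs follow-up support.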
2 - Quiz Question Answers
Though performance indicators give you great summary-level insights into capability, sometimes you need to take a deep dive into the actual answers that your learners have given. Advancements in eLearning analytics mean that you can not only see all of the responses that have been given, but can also break these responses down by attempt and look up the answers to individual questions – all from within SCORM modules.
3 - Common Misconceptions
It’s important not only to track the questions that people have got wrong, but also to understand the misconceptions that lead to the incorrect answers. Getting useful insights into these common misconceptions takes a combination of good learning design and meaningful data analytics. The learning designer must write plausible wrong answers that align to the most common mistakes people make on the job. This then allows you to take corrective action with specific individuals and explain why the answer they chose is not the best way to do things, even though it might seem right.
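One way to surface these patterns is to tag each distractor (wrong answer option) with the misconception it represents, then tally the choices across a cohort. The question IDs, option letters and misconception labels below are hypothetical placeholders.

```python
from collections import Counter

# Each distractor is tagged with the on-the-job misconception it
# represents; these tags are invented for illustration.
DISTRACTOR_TAGS = {
    ("q3", "b"): "confuses premium and standard plans",
    ("q3", "c"): "assumes discounts apply automatically",
    ("q5", "a"): "leads with price instead of need",
}

def common_misconceptions(cohort_answers):
    """cohort_answers: list of (question_id, chosen_option) tuples."""
    tally = Counter()
    for key in cohort_answers:
        if key in DISTRACTOR_TAGS:  # correct answers carry no tag
            tally[DISTRACTOR_TAGS[key]] += 1
    return tally.most_common()

answers = [("q3", "b"), ("q3", "b"), ("q5", "a"), ("q3", "a")]
print(common_misconceptions(answers))
# → [('confuses premium and standard plans', 2), ('leads with price instead of need', 1)]
```

The ranked list tells you which misconception to address first, either in the module itself or in follow-up coaching.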
4 - Self Assessment Results
Self assessments differ from quizzes in that they ask the learner to self-reflect on their work practices rather than answer quiz questions, and are more about benchmarking a learner’s needs than about testing them. Using a similar mapping process to the one used for the learning performance indicators, these benchmarking questions can be matched to indicators, giving you both summary-level reporting and in-depth insights into your learners’ needs that can then be used to hyper-personalise the learner’s journey.
5 - Reflective Questions & Polls
Not all questions need to be quizzes and scores. You can use reflective questions to ask learners to consider their current practice or situation, with no right or wrong answers. This is a great way not only to increase engagement, but also to gather valuable insights into your learners’ current practices and needs. It effectively integrates insight gathering into your eLearning modules, allowing you to better understand your learners and drive continuous improvement.
6 - Confidence Rating
With the power of integrated data analytics added to our eLearning, we can now reconsider how we go about evaluating our learning. One of the most useful evaluation indicators is a shift in learners’ confidence – put simply, does the learner feel confident applying what they have learned in real-world practice? An example in our product training context might be asking the learner “How confident are you in recommending this product to a customer?” and having them rate it on a 5-point Likert scale. Asking this question before and after the learner undertakes a module lets you see any shift in confidence that occurs as a result of the learning.
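The before-and-after comparison can be as simple as averaging the cohort’s pre- and post-module ratings and taking the difference; the ratings below are invented for the example.

```python
def confidence_shift(pre, post):
    """pre/post: lists of 1-5 Likert confidence ratings for a cohort."""
    avg = lambda ratings: sum(ratings) / len(ratings)
    return round(avg(post) - avg(pre), 2)

# Hypothetical cohort: average confidence moves from 2.75 to 4.0.
print(confidence_shift([2, 3, 2, 4], [4, 4, 3, 5]))  # → 1.25
```

A positive shift suggests the module is building confidence; a flat or negative shift flags content worth revisiting.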
7 - NPS and Feedback
Similarly, you can now integrate NPS scores and associated feedback for your training right into your eLearning module, making it easy for the learner to provide feedback and ensuring you are capturing the feedback you need to continuously improve and benchmark your learning against other programs you deliver.
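NPS itself follows a standard formula: on a 0-10 “how likely are you to recommend this training?” scale, ratings of 9-10 count as promoters, 0-6 as detractors, and the score is the percentage-point difference between them. A quick sketch, with made-up ratings:

```python
def nps(ratings):
    """ratings: 0-10 'how likely are you to recommend' responses."""
    promoters = sum(r >= 9 for r in ratings)   # 9s and 10s
    detractors = sum(r <= 6 for r in ratings)  # 0 through 6
    return round(100 * (promoters - detractors) / len(ratings))

print(nps([10, 9, 8, 7, 6, 3, 9]))  # → 14
```

Because the formula is standard, scores are directly comparable across the different programs you deliver.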
8 - Drop-off Rates and Points
With screen-level reporting it’s possible to quickly identify drop-off rates and pinpoint the exact point at which people abandon your learning. You can then revisit that location and make changes to keep your learners engaged and moving towards completion.
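Given per-screen reach counts from your reporting, the drop-off between consecutive screens falls out of a simple comparison. The screen names and counts below are illustrative, not from any specific platform.

```python
def drop_off_by_screen(screen_views):
    """screen_views: ordered list of (screen_name, learners_reached)."""
    results = []
    for (name, reached), (_, nxt) in zip(screen_views, screen_views[1:]):
        lost = reached - nxt
        # Percentage of learners who abandoned after this screen.
        results.append((name, round(100 * lost / reached, 1)))
    return results

views = [("Intro", 200), ("Scenario 1", 180), ("Quiz", 120), ("Summary", 110)]
print(drop_off_by_screen(views))
# → [('Intro', 10.0), ('Scenario 1', 33.3), ('Quiz', 8.3)]
```

In this invented example, the spike after “Scenario 1” is the screen to investigate first.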
9 - Adaptive pathways
Data-driven learning allows you to create powerful multipath and adaptive learning. This in turn lets you report on the exact pathways people take through your learning, so you can see what additional learning people are accessing and monitor its impact.
10 - Where and When stats
Finally, to get deeper insights into how your people access your learning, you should be tracking where they learn (by device, browser, and even geographical location) and when (by time of day, week, month and year). This can help you in planning how you design, schedule and distribute your learning. We sometimes assume that our learners are learning on mobiles on their commute home – but more often than not we are wrong. Knowing this information changes the way you design: maybe a 5-minute mobile bite isn’t the right design, and a longer, desktop-optimised experience could actually be the right one for your audience.
How to track these metrics in your eLearning?
Technologies like xAPI, coupled with a Learning Record Store, can help you track these insights. However, not all organisations can afford the time or budget to implement these solutions. To help with this, a new generation of authoring tools such as PRODUCER allow you to collect these insights on your existing LMS while still using traditional SCORM packages. With tools like PRODUCER and ANALYTICS you don’t need complex IT implementations to reinvent your learning experience; you can just publish and start tracking the insights you need to take your learning to the next level. If you’d like to find out more about how Guroo Producer can help you do more with your learning data, get in touch with us here and set up a meeting to discuss your needs today.