Learning Analytics: A Case for Metric Quality, Not Quantity
Analytics are essential for providing both a microscopic and a macroscopic view of learning populations. Too often, however, we end up drowning in data and important details are lost. For L&D specialists, self-discipline is key to extracting actionable, workable insights from a few vital data points.
Data is key to learning, and recent developments in the range of what can be measured have transformed the L&D industry. The rise of xAPI statements means that analytics can be derived from all forms of learning: formal, informal, online and offline. The list is endless, and so too is the potential.
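As a rough illustration of why xAPI opened up this range, a statement is simply a structured record of who did what: an actor, a verb and an object, with an optional result carrying scores and durations. The learner, activity URI and figures below are invented for the sketch, not drawn from any real system:

```python
# A minimal xAPI statement sketch. The actor, activity URI, score and
# duration are all made-up example values.
statement = {
    "actor": {
        "mbox": "mailto:learner@example.com",
        "name": "Example Learner",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/activities/compliance-module-1",
        "definition": {"name": {"en-US": "Compliance Module 1"}},
    },
    # Optional result block: this is where accuracy and time-on-task live.
    "result": {
        "score": {"scaled": 0.85},
        "duration": "PT25M",  # ISO 8601 duration: 25 minutes
    },
}

print(statement["verb"]["display"]["en-US"])  # -> completed
```

Because the verb can be anything from ‘completed’ to ‘attended’ or ‘discussed’, the same record shape covers formal courses, workshops and informal activity alike.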
However, the sheer volume of data has its downsides. As marketing and customer experience expert Jay Baer famously said: ‘We are surrounded by data, but starved for insights’. Within L&D this rings true. Heads of Compliance, Training and HR, not to mention the C-suite, want succinct, data-driven insights to see whether content is landing and, if it isn’t, they want evidence-based course correction.
This is where learning analytics, the discovery, interpretation, and communication of meaningful patterns in learner data, becomes essential. By being selective, an L&D specialist can focus on a limited number of metrics to analyse performance, predict risks, and assess learning culture.
Time, the first of these metrics, may seem somewhat obvious. However, a nuanced understanding of time to engagement, time within the learning experience, and time on specific learning activities can deliver superb insights.
This is because these three measurements gradually ‘zoom in’. Time to engagement enables analysis of an organisation’s learning culture: the longer the gap between a learner receiving the content and beginning it, the less ingrained learning is in that organisation. Zooming in, time within the experience reflects how engaging the content is; zooming in again, time spent on specific activities shows where an individual learner’s strengths and weaknesses lie.
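These three zoom levels can be sketched with a few timestamps. Every value here, the learner’s event times and the activity names, is a made-up example for illustration only:

```python
from datetime import datetime

# Hypothetical event log for one learner (all times and names invented):
assigned = datetime(2024, 3, 1, 9, 0)     # content assigned to the learner
first_open = datetime(2024, 3, 4, 14, 30)  # learner first opens it
completed = datetime(2024, 3, 4, 15, 10)   # learner finishes
activity_minutes = {"scenario_quiz": 18, "policy_reading": 12, "reflection": 10}

# Zoom level 1: time to engagement (assignment -> first open).
# A long gap suggests learning is not ingrained in the culture.
time_to_engagement = first_open - assigned

# Zoom level 2: time within the experience (first open -> completion).
time_in_experience = completed - first_open

# Zoom level 3: time on specific activities, to spot strengths/weaknesses.
longest_activity = max(activity_minutes, key=activity_minutes.get)

print(time_to_engagement.days)            # -> 3
print(time_in_experience.seconds // 60)   # -> 40
print(longest_activity)                   # -> scenario_quiz
```

Aggregating the first figure across a population speaks to culture; the second to content; the third to the individual.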
Accuracy, the next metric, involves a lot more than just recording right and wrong answers. It is important to compare the training’s results to ‘real world’ data, and to interrogate the content itself. Like all forms of judgement, analytics can fall victim to our biases. Accuracy should therefore be scrutinised objectively and alongside other data, such as learner feedback.
If a task within the learning content has been answered incorrectly, it could suggest a knowledge gap; it could equally reflect unengaging content. And if, for example, the organisation keeps receiving fines for regulation breaches while learners are completing the learning with a high degree of accuracy, the content may need updating.
Following on from accuracy, judgement is an essential metric, especially for compliance learning. This is because a learning intervention, particularly in the highly regulated pharmaceutical and financial services sectors, can never prepare employees for every situation. Instead, the learning measures their ability to make judgement calls when faced with unknown circumstances.
By gathering responses to nuanced questions, L&D specialists can determine the level of judgement within a learning population. They can even predict which real-world situations employees are likely to struggle with, and suggest content to mitigate this.
It is not just the scope of data that has expanded in recent years; so too has its geographical range. With corporate training now being rolled out globally, geography has become an indispensable lens for the L&D specialist.
For instance, if the analytics show that one location is struggling with the content, L&D specialists can support that region with a more accessible or localised offering. Like time, geography is a highly versatile metric: on the one hand, it enables employers to see whether organisational values are being translated across borders; on the other, it can assess an individual’s performance within any given region.
Engagement, the extent to which people are truly absorbed in the experience, is the gateway to learning and a vital metric.
One way to measure engagement is to look at the uptake of optional, challenging ‘stretch’ content. Similarly, analytics can be gleaned from how learners interact with the main content: through sign-ups, drop-offs and their performance with badges and/or leaderboards. The recent influence of social media on learning also means that learner discussions, reviews, feedback and recommendations can be useful in gaining insights into engagement.
It is important to add that all of these metrics are useful by themselves but become more powerful when used together. Through such combinations, actionable insights can be rapidly delivered to key decision makers. Learning analytics become a tool for, rather than an obstacle to, quickly understanding organisational cultures, learning populations and even individual learners.
Obviously, data, dashboards and tech are all important, but hyper-focusing on these elements somewhat misses the point. Instead, humans should be kept at the centre of any planning around learning analytics. With a people-first strategy, L&D specialists can limit their focus to a few key metrics, ask the right questions and ensure that learning translates into actionable, sustainable decision making.
Head of Learning Strategy, Sponge