
Dates and Venue

29 - 30 April 2026 | Excel London


Using Data Analytics to Drive Performance and Impact

  • Event: Learning Technologies UK 25
  • Date: 23 April 2025
  • Speaker: Derek Mitchell, People Analytics Lead, Novo Nordisk
  • Chair: Virginie Chassériau, Training Manager and Coordinator, Cup of Learning
  • Estimated read time: 9 minutes


Quick read summary

This session explored how learning and development teams can use data analytics to move beyond activity tracking and towards measurable performance impact.

It matters now because organisations already hold vast amounts of workforce data, yet most learning decisions are still made with limited evidence.

Readers will gain a practical way to identify learning needs, target interventions more precisely, and measure outcomes that matter to the business, using data they already have.


Why L&D needs to rethink its relationship with data

Learning teams often say they cannot measure impact because learning outcomes are complex. The session challenged this assumption. According to Derek Mitchell, most organisations already collect far more data about employees than they realise, but L&D rarely uses it to inform decisions.

The core argument was simple. Employees generate data continuously through systems they use every day. If learning exists to build skills, change behaviour, influence culture, strengthen networks, and improve performance, then data should be the foundation for how learning priorities are set and evaluated.

The problem is not a lack of data. It is that learning teams tend to look in the wrong places.

 

Where the most useful learning data actually lives

A central message was that the most valuable learning data does not sit inside learning platforms.

Completion rates and course attendance are administratively useful, but they rarely explain performance or behaviour change. The richer signals sit across the organisation, often outside the learning function’s direct control.

Key sources discussed included:

  • HR systems, which hold role history, tenure, seniority, and self-declared skills
  • Employee surveys, which capture sentiment, confidence, stress, and perceived capability
  • Workplace tools, such as email and collaboration platforms, which show patterns of behaviour and connection
  • Social platforms, where culture and informal learning are visible in what people discuss
  • Performance systems, which track operational outcomes over time

Taken together, these sources allow learning teams to understand not just what people complete, but how they work, connect, and perform.

 

Using skills data to identify need and measure movement

The session showed how even simple skills data can be powerful.

Mitchell described asking employees just two questions: what skills they have, and how capable they believe they are. While self-reported data is imperfect, patterns across groups are reliable enough to identify development needs.

This approach supports several learning objectives at once:

  • Targeting learning, by offering content only to those who need it
  • Personalising communication, by explaining why an intervention is relevant
  • Measuring outcomes, by looking for movement in capability before and after learning

Importantly, the goal is not to prove absolute accuracy. The goal is to see change. An increase, a decrease, or a recalibration of confidence can all indicate that learning has occurred.
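The "movement, not accuracy" idea can be sketched in a few lines of Python. This is an illustrative example, not from the session: the ratings below are hypothetical self-scores (1 to 5) from the same group, collected before and after an intervention.

```python
from statistics import mean

# Hypothetical self-ratings (1-5) for one skill, gathered from the
# same group of employees before and after a learning intervention.
before = [2, 3, 2, 1, 3, 2]
after = [3, 4, 3, 2, 3, 4]

# The goal is movement, not absolute accuracy: a meaningful shift in
# the group's average self-rated capability signals that learning occurred.
delta = mean(after) - mean(before)
print(f"mean before: {mean(before):.2f}")
print(f"mean after:  {mean(after):.2f}")
print(f"movement:    {delta:+.2f}")
```

The same comparison works per role or function, which is how clustered gaps and post-intervention movement become visible.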

 

Culture and behaviour are visible in everyday systems

Culture change is often treated as intangible. The session argued it is already visible in the language people use and the way they collaborate.

By analysing comments in internal social platforms, learning teams can identify what people care about, where confusion exists, and which topics are gaining traction. With text analysis supported by AI, large volumes of qualitative data become usable at scale.

This allows L&D to move from reactive delivery to proactive intervention, identifying issues before leaders formally raise them.
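As a hedged sketch of the idea: even before bringing in AI, a simple word count over comments can surface recurring topics. The comments and stopword list here are hypothetical, and real text analysis would be considerably more sophisticated.

```python
import re
from collections import Counter

# Hypothetical comments from an internal social platform.
comments = [
    "Still confused about the new expense process",
    "The expense process training helped a lot",
    "Anyone else struggling with the reporting dashboard?",
    "Loving the new dashboard once you get past setup",
]

# A toy stopword list; a real pipeline would use a proper one.
stopwords = {"the", "a", "an", "with", "about", "and", "new", "else",
             "anyone", "still", "once", "you", "get", "past", "lot"}

# Count topic words across all comments to see what people discuss most.
words = Counter(
    w for c in comments
    for w in re.findall(r"[a-z]+", c.lower())
    if w not in stopwords
)
print(words.most_common(3))
```

Even this crude count points at the expense process and the dashboard as live topics, which is the kind of signal that lets L&D intervene before leaders formally raise an issue.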

 

Network data reveals influence, isolation, and silos

Learning often aims to connect people, especially in leadership and development programmes. Yet teams rarely measure whether networks actually change.

The session demonstrated that network data can be gathered with simple questions, such as asking employees who they turn to for help. From this, learning teams can identify:

  • Highly connected individuals who act as informal hubs
  • Isolated employees who may need support
  • Teams or functions operating in silos

By asking the same questions before and after an intervention, learning teams can see whether networks have grown, diversified, or remained static.
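As an illustration (the names and answers below are hypothetical), the hub-and-isolation analysis can be approximated with a simple count of who gets named in answer to "who do you turn to for help?":

```python
from collections import Counter

# Hypothetical answers to "who do you turn to for help?"
# Each pair is (employee, person they named).
ties = [
    ("ana", "raj"), ("ben", "raj"), ("cho", "raj"),
    ("raj", "ana"), ("dee", "ben"),
]

employees = {"ana", "ben", "cho", "dee", "eli", "raj"}

# Informal hubs are named often; isolated employees are never named
# and never name anyone themselves.
named = Counter(target for _, target in ties)
hubs = [person for person, count in named.items() if count >= 3]
isolated = sorted(employees - set(named) - {src for src, _ in ties})

print("hubs:", hubs)
print("isolated:", isolated)
```

Running the same questions after an intervention and comparing the counts shows whether networks have grown, diversified, or remained static.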

 

Performance change starts with the right question

One of the strongest challenges was directed at how learning teams respond to business requests.

Too often, learning is commissioned without clarity on what performance metric needs to change. Building a solution before understanding the problem makes impact impossible to measure.

The session argued for a firmer stance. If a leader cannot show the metric they want to improve, learning should not proceed. Once a metric is defined, learning teams should work directly with data owners to track it over time, using raw numbers rather than summary percentages.
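A small worked example (with hypothetical numbers) shows why raw numbers matter alongside summary percentages: a percentage can move the "wrong" way while the underlying outcome improves.

```python
# Hypothetical error counts before and after an intervention.
# Percentages alone can mislead when the denominator changes.
before_errors, before_total = 12, 240
after_errors, after_total = 3, 40

pct_before = before_errors / before_total * 100
pct_after = after_errors / after_total * 100

# The error rate rose from 5.0% to 7.5%, yet the raw error count fell
# from 12 to 3 on far less activity - hence the advice to track raw
# numbers rather than summary percentages.
print(f"{pct_before:.1f}% -> {pct_after:.1f}% "
      f"(errors: {before_errors} -> {after_errors})")
```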

 

Ethics, transparency, and trust

The use of behavioural and workplace data raises legitimate concerns. The session addressed this directly.

Mitchell outlined an ethical hierarchy for data use, prioritising benefit to employees alongside business outcomes. Transparency was positioned as essential. People are more likely to accept data use when they understand why it is being collected and how it helps them.

Anonymisation at the analysis level, clear communication, and a focus on employee benefit were presented as non-negotiable foundations.

 

Practical application: turning insight into action

Questions leaders should be asking

  • What specific capability or behaviour are we trying to change?
  • Which existing data already shows this is a problem?
  • Who owns the data we need to track outcomes?

Signals to watch in the organisation

  • Self-reported skill gaps clustering in specific roles or functions
  • Persistent isolation or over-reliance on a small number of individuals
  • Behavioural patterns that indicate burnout or overload

Common pitfalls

  • Treating learning platforms as the primary data source
  • Accepting vague performance problems without evidence
  • Reporting percentages without underlying numbers

What good looks like in practice

Learning teams use a small number of clear data points to identify need, target interventions, and measure movement over time. Decisions are evidence-based, transparent, and aligned with real performance outcomes.

 

Key takeaways

  • Most learning relevant data already exists across the organisation
  • Simple data points can drive targeted, measurable learning decisions
  • Skills, behaviour, culture, networks, and performance are all measurable
  • Learning should not proceed without a defined metric
  • Ethical and transparent data use builds trust and engagement

 

Quote of the session

“The good data does not sit in any L&D tool. It is everywhere.”

Derek Mitchell, People Analytics Lead, Novo Nordisk

 

Final thoughts

This session reframed data analytics not as a specialist capability, but as a core discipline for effective learning leadership. When learning teams use the data already at their disposal, they gain credibility, influence, and the ability to demonstrate real impact. The opportunity is not to collect more data, but to use what already exists with intent.


Speakers

Derek Mitchell, People Analytics Lead, Novo Nordisk. Leads people analytics and insight, with experience applying data across multiple industries to influence behaviour and performance.

Virginie Chassériau, Training Manager and Coordinator, Cup of Learning. Specialises in compliance and leadership training, learning systems, and virtual training methodologies.

