Opinion Piece: L&D and the Correlation-Causation Fallacy

Correlation is a useful statistical technique, but it shouldn’t be used to prove return on investment in L&D. Why? There’s no room for loose threads when showing impact and justifying decisions to executives. 

Let’s talk a little bit about how you can prove causation and why you should never fall for the correlation fallacy in L&D ROI.  

Why are L&D teams under pressure to demonstrate business impact?

Significant money goes into training and development (an average of $1,308 USD per employee), so L&D leaders understand the need to justify spending. Yet most L&D professionals look at ROI solely through the lens of their own role. Think traditional KPIs such as: 

Only then do they use improvements in these metrics to correlate L&D with business performance, which, for the most part, is quite a leap. 

But chat with any business leader and the reality is that this isn't what they look for. What they want from the investment is to move the dial on strategically impactful metrics such as: 

If you can show causation on these kinds of business metrics, you are a truly strategic, impactful L&D function. 

What’s the difference between correlation and causation?

This is where many organisations tend to get lost. Two seemingly related L&D results can be independently influenced by different factors, though you wouldn’t know it just by taking them at face value. Correlation can be easily stated, but causation is both harder to prove and more valuable to the business. 

Correlation defined

Correlation is any statistical relationship or association between two data sets, aka two results that occur at roughly the same time. The key word here is any—meaning those results can be purely coincidental and therefore unrelated. 
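To make that concrete, here's a minimal Python sketch (with entirely made-up series names and numbers) showing how easily two unrelated upward trends produce a strong correlation coefficient:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Two hypothetical monthly series with no causal link, both simply trending up:
course_completions = [120, 135, 150, 170, 185, 200]
office_coffee_orders = [410, 430, 455, 470, 500, 520]

r = pearson(course_completions, office_coffee_orders)
print(round(r, 2))  # near 1.0: a strong correlation, zero causation
```

Any two metrics that happen to drift in the same direction over the same window will score highly here, which is exactly why a high correlation coefficient alone proves nothing.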

Causation defined

Causation is a demonstrable cause-and-effect relationship: A happened, so B occurred. There's also reverse causality, where you think A is causing B when it's actually B leading to A, and third-factor causality, where a sneaky C is driving both A and B. You can even have a cyclical situation where A causes B and B causes A.  
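The third-factor case is easy to simulate. In this hypothetical Python sketch, a hidden C (say, overall team engagement) drives both A and B, so A and B correlate strongly even though neither causes the other; all names and distributions are invented for illustration:

```python
import random
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

rng = random.Random(42)

# Hidden confounder C: overall team engagement (hypothetical).
c = [rng.gauss(0, 1) for _ in range(500)]

# A (course sign-ups) and B (sales numbers) are each driven by C plus
# independent noise; neither has any effect on the other.
a = [ci + rng.gauss(0, 0.5) for ci in c]
b = [ci + rng.gauss(0, 0.5) for ci in c]

print(round(pearson(a, b), 2))  # strongly correlated, purely via the shared C
```

Unless you measure C, A and B look causally linked; once you control for C, the apparent relationship evaporates.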

Why they are not always linked

There’s this idea that correlation implies causation. This is rarely the case. 

Correlation can be easily exploited. If correlation implied causation, you could argue that the divorce rate in Maine, USA, was the direct result of per-capita margarine consumption, or that the number of films Nicolas Cage has appeared in drives visitor numbers at Disneyland Paris.  

You can't see true cause and effect here because these data sets are influenced by entirely different variables. (And we're not going to try to claim the butterfly effect here.) Correlation is better used as a starting point for hypothesis testing, such as: do higher login rates this month directly cause higher completion rates? Causation is the process of proving that logins lead directly to completions.  
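One standard way to test a hypothesis like that is a controlled comparison: nudge a randomly chosen set of teams to log in, leave the rest alone, and check whether the resulting completion gap could plausibly be chance. Here's a minimal permutation-test sketch in Python; the completion rates are entirely hypothetical:

```python
import random

def permutation_p_value(treated, control, n_perm=10_000, seed=0):
    """Two-sided permutation test on the difference in mean outcomes."""
    rng = random.Random(seed)
    observed = sum(treated) / len(treated) - sum(control) / len(control)
    pooled = treated + control
    n = len(treated)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # relabel the data at random
        diff = sum(pooled[:n]) / n - sum(pooled[n:]) / (len(pooled) - n)
        if abs(diff) >= abs(observed):
            hits += 1
    return hits / n_perm  # share of random relabellings at least this extreme

# Hypothetical completion rates (0-1) for teams nudged to log in weekly
# (treated) vs teams left alone (control):
treated = [0.82, 0.75, 0.90, 0.68, 0.85, 0.79, 0.88, 0.72]
control = [0.55, 0.61, 0.48, 0.70, 0.52, 0.66, 0.58, 0.63]

p = permutation_p_value(treated, control)
print(p)  # a small p-value: the gap is unlikely to be pure chance
```

Because assignment was random, a small p-value supports a causal reading of the login nudge, which a raw correlation between logins and completions never could.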

The impact to L&D of getting this wrong

The price for continued funding of the L&D budget is meeting expectations and demonstrating ROI. Assuming learning is driving the business just because financial results moved in parallel won't win favour.  

Falling victim to the correlation fallacy can mean: 

And when C-Suite expectations aren’t met, the L&D budget is cut and any ability to deliver goes out the window, too.  

NB: You can still get causation wrong. High performance isn't necessarily evidence of active learning, and even strong investment in training doesn't guarantee a high-performing company. Hindsight's a wonderful thing, but you need a little foresight to ensure you're using the right data in your analytics.  

The number one way to solidify causation in your L&D efforts

In our experience, learning technologies have bolted learning analytics on retrospectively ever since the very first ones launched, everything from LRSs to LMSs to the so-called LXPs of today.  

This is because they heard the same challenges that we do: 

And hearing these pain points, the LMS market defaulted to reactive analytics. That's still important, and of course we have invested extensively in it, too.  

We just took a different view. We’ve built our product to also look at learning before it happens. 

This is because we both know that L&D does have a strategic impact, but that learning analytics alone won't overcome perception issues. The key is the step before learning happens.  

As L&D professionals, we all have conversations with business departments about what people leaders think their teams need from L&D. But to prove true causation in L&D results, we believe you need to go beyond this and gather data from workforce planning, capability frameworks, and gap analyses, all mapped to business strategy.  

Starting learning right, on a consistent digital platform that then delivers both the learning and the analytics, settles the correlation-versus-causation debate. And for the first time, you truly have the data to show business impact in your hands.  
