Learning Analytics Primer
Navigating the Scope of Disruptive Analytics
Solutions to Deliver Maximum Impact
Mike Rustici
watershedLRS.com
@mikerustici
“Scalable learning is the
new reason for large
organizations to exist.”
John Hagel
Founder
Deloitte Center for the Edge Innovation
“Without data you’re just
another person with an
opinion.”
W. Edwards Deming
Management Systems Legend
“L&D is responsible for improving our most
important asset, our people. If we can
improve L&D, we can improve the world.”
Me
I said that
measurement
survey
14%: not effective at measuring formal learning
43%: not effective at measuring experiential learning
51%: not effective at measuring informal learning
Source: Measuring the ROI of Informal Learning, Brandon Hall, 2015
I want to use analytics to improve learning
Strongly Agree: 58%
Agree: 28%
Neutral: 8%
Disagree: 3%
Strongly Disagree: 3%
LEO & Watershed Survey 2016
The biggest challenge of measuring the impact of learning in my organization is…
No demand: 9%
Too hard: 11%
Cost: 10%
Don't know how to start: 12%
No access to the data: 13%
Competing priorities: 34%
Other: 11%
LEO & Watershed Survey 2016
Global Human Capital Trends 2014: Engaging the 21st-century workforce - A report by Deloitte Consulting LLP and Bersin by Deloitte
Discover
Discover new knowledge through
study, experimentation and
practice.
Digest
Assess the validity of knowledge
and ensure it has a positive
impact.
Distribute
Rapidly disseminate new
knowledge to the entire
organization.
Learning Embedded Everywhere
ask
yourself
Are your learning systems going to get you there?
isn’t it
because…
Small moves, smartly made
“In an exponential world, small moves,
smartly made can set big changes in motion.”
– John Hagel
getting
there
What is the smartest move organizations can make right now to lay a foundation and
set themselves on a path to becoming a scalable learning organization?
What is the biggest enabler of scalable learning?
learning analytics.
Understanding learning analytics: Complexity
measurement
The simple act of
tracking things and
recording values. Can
be passively or
actively collected.
evaluation
The process of trying to
make meaning from the
data measured.
Descriptive analytics.
Does the data mean
something “good” or
“bad”?
advanced
evaluation
When data sets get large
enough we can use
advanced evaluation
techniques to discover
powerful insights. Data
mining, AI, machine
learning, etc.
predictive and
prescriptive
Make predictions and
decisions based on data
and advanced algorithms.
Recommendation engines are the best example in learning.
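To make the prescriptive end of this spectrum concrete, here is a toy sketch (not any particular product's algorithm): it suggests activities based on simple co-occurrence in completion data. The learners, activities, and counts are invented for illustration.

```python
# Toy item-to-item recommender: suggest activities that co-occur with ones a
# learner has already completed. All learners and activities are made up.
from collections import Counter
from itertools import combinations

completions = {                       # learner -> set of completed activities
    "ana":   {"safety-101", "xapi-intro", "sql-basics"},
    "ben":   {"safety-101", "xapi-intro"},
    "carla": {"xapi-intro", "sql-basics", "dashboards"},
}

# Count how often each pair of activities was completed by the same learner.
pair_counts = Counter()
for acts in completions.values():
    for a, b in combinations(sorted(acts), 2):
        pair_counts[(a, b)] += 1

def recommend(learner, top_n=3):
    """Rank activities the learner has not done by co-occurrence with ones they have."""
    done = completions[learner]
    scores = Counter()
    for (a, b), n in pair_counts.items():
        if a in done and b not in done:
            scores[b] += n
        elif b in done and a not in done:
            scores[a] += n
    return scores.most_common(top_n)

print(recommend("ben"))   # [('sql-basics', 3), ('dashboards', 1)]
```

Real recommendation engines use richer signals and models, but the idea is the same: let the accumulated data make the suggestion.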
Understanding Learning Analytics: Categories
Learning Program: understand an overall learning program. Is this initiative helping to meet business objectives?
Learner: understand a learner or group of learners. Ensure organizational readiness and compliance.
Learning Experience: understand more about a specific learning activity. Maximize effectiveness and spot problems.
xAPI &
learning analytics platform
all data
Learning is so much bigger than the
LMS. Track informal and experiential
activities.
xAPI and the LRS present a transformative
technology for learning analytics. Seamlessly
bring all your data together in one platform and
perform real-time analytics tailor-made for L&D.
business impact
Include data about behaviors and performance in your learning analytics.
real time
Create, view, and distribute reports seamlessly in real time.
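For the curious, an xAPI statement is just a small JSON document of the form actor / verb / object, sent to the LRS's statements resource. The sketch below shows roughly what that looks like in Python; the LRS endpoint, credentials, and activity IDs are placeholders, not real Watershed values.

```python
# Minimal sketch of recording one learning experience as an xAPI statement.
# The LRS endpoint, credentials, and activity IDs below are placeholders.
import requests

LRS_ENDPOINT = "https://lrs.example.com/xapi"   # hypothetical LRS base URL
AUTH = ("lrs_key", "lrs_secret")                # placeholder Basic-auth credentials

statement = {
    "actor":  {"mbox": "mailto:learner@example.com", "name": "Sample Learner"},
    "verb":   {"id": "http://adlnet.gov/expapi/verbs/completed",
               "display": {"en-US": "completed"}},
    "object": {"id": "https://example.com/activities/onboarding-module-1",
               "definition": {"name": {"en-US": "Onboarding Module 1"}}},
}

resp = requests.post(f"{LRS_ENDPOINT}/statements",
                     json=statement,
                     headers={"X-Experience-API-Version": "1.0.3"},
                     auth=AUTH)
resp.raise_for_status()   # on success the LRS returns the stored statement ID(s)
print(resp.json())
```

In practice most statements are generated for you by xAPI-enabled content and connectors; hand-written statements like this are mainly useful for custom systems and quick experiments.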
A brief history of Learning Technologies Interoperability
1999: Executive Order 13111
2000: SCORM 1.0 and 1.1
2001: SCORM 1.2
2004: SCORM 2004 (and 2nd Edition)
2006: SCORM 2004 3rd Edition
2009: SCORM 2004 4th Edition
2010: Project Tin Can
2012: xAPI 0.9 and 0.95
2013: Experience API 1.0.0
2014-2016: xAPI 1.0.1, 1.0.2, and 1.0.3
2016: DoDI 1322.26
2017: Enterprise-scale deployments
Five steps to get started with
Learning Analytics
Step 1: Gather your data
Approach 1: Start with what’s easy.
• Use what you already collect.
• Identify what will be easy to send into an LRS.
Approach 2: Start with what’s valuable.
• Prioritize what data you need for your specific purpose.
• Plan to collect what aligns with your end goal.
Gathering methods: Performance Checklist, Connectors, xAPI, CSV Import.
Which method should you use?
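As a rough illustration of the CSV route, the sketch below converts rows of a hypothetical attendance export into xAPI statements. The file name, column names, and IDs are assumptions about your data, and many LRS products offer a native CSV import that avoids writing any code at all.

```python
# Sketch: turn rows of a hypothetical attendance export into xAPI statements.
# The column names (email, name, session_id, session_name, date) are assumptions.
import csv

def row_to_statement(row):
    return {
        "actor":  {"mbox": f"mailto:{row['email']}", "name": row["name"]},
        "verb":   {"id": "http://adlnet.gov/expapi/verbs/attended",
                   "display": {"en-US": "attended"}},
        "object": {"id": f"https://example.com/sessions/{row['session_id']}",
                   "definition": {"name": {"en-US": row["session_name"]}}},
        "timestamp": row["date"],    # ISO 8601, e.g. "2017-03-01T09:00:00Z"
    }

with open("attendance.csv", newline="") as f:
    statements = [row_to_statement(r) for r in csv.DictReader(f)]

# `statements` can now be sent to the LRS, e.g. in one batch POST to /statements.
```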
Step 2: Get to know your data
1. Check the data
2. Identify gaps
3. Take out the trash
4. Go for a data test drive
5. Find your footing
Get to know your data
Check the data.
• Is the data reliable?
• Does it make sense intuitively?
• Do different data sources use similar structures?
• Is data represented in the same way between data sources?
• Are metrics consistent with your expectations?
Get to know your data
Identify gaps.
• Are there learning activities you’re not tracking?
• Are you tracking the correct data about learning activities?
• What data will you need in the future?
Take out the trash.
• Have you found excessive or irrelevant data?
• Is there junk data that you are collecting?
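If your data lands in a flat file at some point, taking out the trash can start with a few lines of pandas. This is only a sketch; the file name and column names are assumptions about how your particular export happens to look.

```python
# Sketch: basic cleanup of a flat statement export before analysis.
# The file name and column names are assumptions about your particular export.
import pandas as pd

df = pd.read_csv("statements_export.csv", parse_dates=["timestamp"])

# Exact duplicates, e.g. a connector that sent the same event twice.
df = df.drop_duplicates(subset=["actor", "verb", "activity", "timestamp"])

# Rows missing the fields every report depends on.
df = df.dropna(subset=["actor", "verb", "activity"])

# Obvious junk, e.g. test accounts that crept into production data.
junk = df["actor"].str.contains("test", case=False, na=False)
print(f"dropping {junk.sum()} suspicious rows out of {len(df)}")
df = df[~junk]
```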
Get to know your data
Go for a data test drive.
• Don’t just look at the raw data.
• Look at reports or visualization with this data.
• Compare new/clean data with historical data.
Find your footing.
• Make sure you’re confident in the data you have before moving
on to step 3 (operationalize).
Step 3: Operationalize your data
Save souls from spreadsheet hell.
• Eliminate manual work and reduce errors.
Make small improvements.
• Automate manual processes.
• Incorporate new information into workflows.
Operationalize your data
Report on additional metrics.
• Go beyond completions, achievements, scores, and attendance.
• Add interesting analytics about your learners or experiences.
• Look at data from different angles (compare departments or teams).
Define Benchmarks
• Define KPIs or benchmarks right away, even if you adjust later.
• Monitor data against these benchmarks from the start.
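A benchmark check does not need fancy tooling to start with. The sketch below compares a handful of invented KPI values against invented targets; real values would come from your LRS or reports, and the metric names here are only examples.

```python
# Sketch: compare a few KPIs against benchmarks and flag the ones that miss.
# The metric names, values, and targets below are all invented.
benchmarks = {
    "compliance_completion_rate":  0.95,   # target: 95% completed on time
    "avg_first_attempt_score":     0.80,
    "new_hire_time_to_ready_days": 30,     # for this one, lower is better
}

current = {
    "compliance_completion_rate":  0.91,
    "avg_first_attempt_score":     0.84,
    "new_hire_time_to_ready_days": 36,
}

lower_is_better = {"new_hire_time_to_ready_days"}

for kpi, target in benchmarks.items():
    value = current[kpi]
    ok = value <= target if kpi in lower_is_better else value >= target
    print(f"{'OK  ' if ok else 'MISS'} {kpi}: {value} (benchmark {target})")
```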
Step 4: Explore your data
Explore surprises & study outliers.
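One low-tech way to find surprises is a simple spread check. The sketch below flags values far from the mean; the course names and durations are made up, and two standard deviations is just a common rule of thumb, not a magic number.

```python
# Sketch: flag values far from the norm, e.g. average completion time per course.
# The course names and minutes below are made up.
from statistics import mean, stdev

avg_minutes = {"course-a": 42, "course-b": 38, "course-c": 45, "course-d": 40,
               "course-e": 44, "course-f": 39, "course-g": 118}

values = list(avg_minutes.values())
mu, sigma = mean(values), stdev(values)

# Two standard deviations is a rough cutoff; tune it to your own data.
outliers = {c: v for c, v in avg_minutes.items() if abs(v - mu) > 2 * sigma}
print(outliers)    # {'course-g': 118} -- a surprise worth a closer look
```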
Explore your data
• Ask more questions.
• Form your own questions, see if your data can answer them.
• Consider the possible answers and resulting actions.
• Evaluate your program.
• Look at the learning program holistically.
• Can you measure the impact on the larger business goal?
Step 5: Build on what you've learned
If you started with what’s easy (in step 1)
• Add metrics that are harder to capture, but more valuable to evaluate.
• Continue to iterate until you’ve identified your most valuable metrics.
If you’ve already identified your most valuable metrics
• Use the Seven Steps of Learning Evaluation to reevaluate metrics.
• Identify if you need new metrics.
The future in practice
Visa University:
A new
ecosystem
A next generation learning
ecosystem brings this all
together.
This example from Visa
shows that the LMS is just
one small part of a modern
learning ecosystem.
analyze informal learning & learning streaks
What they learned:
• How do people choose to learn?
• Search:
• What are people searching for?
• What are they not finding?
• What resources are most popular?
• What resources are not being used (scrap learning)?
• Are people going on learning streaks?
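A learning streak can be computed directly from activity dates. The sketch below finds each learner's longest run of consecutive active days; the learners and dates are invented for illustration, not Visa's data.

```python
# Sketch: a "learning streak" = consecutive calendar days with at least one
# learning event. The learners and dates below are invented, not real data.
from collections import defaultdict
from datetime import date, timedelta

events = [
    ("ana", date(2017, 3, 1)), ("ana", date(2017, 3, 2)), ("ana", date(2017, 3, 2)),
    ("ana", date(2017, 3, 3)), ("ana", date(2017, 3, 7)),
    ("ben", date(2017, 3, 1)), ("ben", date(2017, 3, 5)),
]

active_days = defaultdict(set)
for learner, day in events:
    active_days[learner].add(day)

def longest_streak(days):
    """Length of the longest run of consecutive days in the set."""
    best = 0
    for d in days:
        if d - timedelta(days=1) not in days:          # d starts a run
            run, nxt = 1, d + timedelta(days=1)
            while nxt in days:
                run, nxt = run + 1, nxt + timedelta(days=1)
            best = max(best, run)
    return best

for learner, days in active_days.items():
    print(learner, longest_streak(days))    # ana 3, ben 1
```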
Code Blue
When a patient's heart stops
and needs to be resuscitated.
Key Metrics
• Time to first drug
• Time to defibrillation
• Time to chest compressions
Getting There: Modern Ecosystem
• Mock Code Blue / in-person observation
• Mobile simulation / defibrillator app
• LMS / training lab
• Learning Analytics Platform (Watershed)
Post Hire
International Technology Giant: Customer Service
• Track activity across LMS and
support/ticketing platform
• Monitor customer satisfaction
• Identify how KPI trends
are impacted by other
activity
AT&T: monitor cross-platform training
What they accomplished:
• Delivered compliance training across devices (mobile, desktop)
• Tracked without using an LMS (learners could access from anywhere)
• Gained a detailed understanding of how learners were using the content
• Spotted troublesome areas and fixed problems before too many users were affected
Getting Started
yes you can!
01
Gather your data
Just start collecting data in a common format in a central location. Make it low friction and ensure you have access.
02
Get to know your data
Understand what you have. What is out there, what is reliable, and what is missing? Do some simple evaluations and create baselines.
03
Operationalize your data
Automate your way out of “Excel hell.” Define some interesting metrics and KPIs, and start monitoring them routinely. What trends do you notice?
04
Explore your data
Start to understand not just “what” is happening but “why.” Form questions and see if your data can answer them. Often the best answer is another question. Can you see impacts on the larger business?
05
Experiment with your data
Start a new learning program with analytics in mind. Set up well-controlled experiments and A/B tests to validate a hypothesis (a minimal significance-test sketch follows this list). Create a culture of continuous improvement.
06
Show off your data
We’ve long known that learning is vital; now we can prove it. Go forth and show the world!
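When you run the kind of A/B test mentioned in step 05, a two-proportion z-test is one simple way to judge whether a difference in, say, pass rates is more than chance. The counts below are invented for illustration.

```python
# Sketch: two-proportion z-test for an A/B experiment on pass rates.
# The counts below are invented; group A is the old course, B the new one.
from math import sqrt, erf

def ab_test(pass_a, n_a, pass_b, n_b):
    p_a, p_b = pass_a / n_a, pass_b / n_b
    p_pool = (pass_a + pass_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided
    return p_a, p_b, z, p_value

p_a, p_b, z, p = ab_test(pass_a=142, n_a=200, pass_b=168, n_b=200)
print(f"A: {p_a:.0%}  B: {p_b:.0%}  z = {z:.2f}  p = {p:.4f}")
# A small p-value (e.g. < 0.05) suggests the difference is unlikely to be chance alone.
```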
our
contacts
watershed
210 Gothic Ct
Franklin, TN 37067
+1-844-220-0822
http://watershedlrs.com
@watershedlrs
me
Mike Rustici
Founder and CEO
mike.rustici@watershedlrs.com
@mikerustici


Editor's Notes

  • #6 We’re not very good at this yet
  • #7 But we want to be. Attitudes and organizational priorities need to come along. This is data from a LEO / Watershed survey.
  • #8 But we want to be. Attitudes and organizational priorities need to come along. This is data from a LEO / Watershed survey.
  • #9 Good news: Deloitte identifies L&D as most ready to take advantage of analytics next.
  • #10 So what is “scalable learning”? Scalable learning is not just learning new things that others know, but rather: creating and discovering new knowledge (this is more than just finding a new book to read), assessing its validity, rapidly disseminating it, and ensuring it has a positive performance impact. Discover -> Digest -> Distribute.
  • #11 Is your LMS going to get you there? Fortunately, a new generation of learning tools, models, and ecosystems is emerging. Option: “is your learning org going to get you there?”
  • #12 But the good news is: in an exponential world, small moves, smartly made, can set big changes in motion. Is this a better opening concept?
  • #14 Learning analytics is the big thing that’s now possible. It’s not big and scary; there are many types of learning analytics that range from simple to complex.
  • #18 xAPI is enabling this transition in learning, just like marketing. It’s all about removing friction; removing just a little bit of friction has a big impact (Apple design methodology). The Kirkpatrick model has been around since 1954, but how much of it are we using? Remember, when we asked people why they aren’t doing more learning analytics, one of the chief responses was “it’s too hard.” That is changing: the friction is disappearing, and that will have a profound impact.
  • #19 Can we add a bubble to this slide to show enterprise scale deployments in 2017? And get rid of the DoDi in 2016, I don’t think it happened
  • #21 Ensure your goals align with strategic priorities. Involve all appropriate stakeholders. Reassess the program if goals do not align.
  • #22 Ensure your goals align with strategic priorities. Involve all appropriate stakeholders. Reassess the program if goals do not align.
  • #23 Ensure your goals align with strategic priorities. Involve all appropriate stakeholders. Reassess the program if goals do not align.
  • #24 Ensure your goals align with strategic priorities. Involve all appropriate stakeholders. Reassess the program if goals do not align.
  • #33 Talk about experimenting with the program after this diagram.
  • #34 The next time you begin a new learning program, design it with analytics in mind.
  • #35 The next time you begin a new learning program, design it with analytics in mind.
  • #36 What does this look like in the real world
  • #37 One of the first to deploy and publicize at enterprise scale is Visa. Notice: the LMS is just one small part; inclusion of non-learning tools (SharePoint and SurveyMonkey); a Learning Experience platform as the front end.
  • #38 (Visa example) From a client using SharePoint. Learning experience analytics and basic evaluation (with some text-processing goodness thrown in).
  • #39 It’s important to develop skills during training before performing tasks in the real world. Real-world errors can have significant consequences. The organization needs to be confident of learner’s competency. Skills to be learnt are practical and physical.
  • #40 In medical training, evaluation of training effectiveness is especially important. The real-world tasks clinicians are being trained to perform are high stakes and need to be done right first time. Clinicians can’t practice on real patients while they figure it all out; to do so would put lives at risk. Instead, they must reach competency in the training environment, before applying that training in the real world. Not only do the clinicians have to be competent, but MedStar needs to be confident in their competence. Assessments need to be rigorous and effective so MedStar knows which clinicians are ready. Another unique aspect of medical training is that the skills learnt relate to very practical and physical tasks. Training and assessments, therefore, also need to be practical and physical. While it is important to understand the theory of how the heart works, what really matters is to have the skills to perform the tasks needed to resuscitate the patient.
  • #41 MedStar’s training program is made up of three main elements: The Learning Management System manages learner and organization data, and tracks a record of classroom training attended. A mobile app, called Zoll, allows clinicians to practice the steps involved in defibrillation on their own as often as they want to. In situ mock Code Blue simulations enable clinicians to practice resuscitating a medical dummy as a team.
  • #42 Learning program evaluation – advanced analytics / prescriptive analytics
  • #43 (AT&T example) Where do people bail or get stuck; is this point problematic? Watching people cheating, etc. See the actual experience of the user to help debug: how long, which pages they spend the most time on, which questions they consistently get wrong. Learning experience analytics and learner analytics. Still just basic evaluation.
  • #45 Wrap up and Q&A