7. “Big breakthroughs
happen when what is
suddenly possible meets
what is desperately
necessary.”
--Thomas L. Friedman
http://www.nytimes.com/2012/05/16/opinion/friedman-come-the-revolution.html
-we are the sum of what we've learned.
-systems try to capture and quantify our knowledge
-woefully incomplete
-even within just one context, say your employment at Google, or your tenure at a school, is there a system that captures more than a single digit percentage of what you've learned or what you know?
-no, there isn't,
-why is that?
-problem - lack of a standard, or an API, to tie data from different learning systems together
-standards define and confine an industry
-in learning industry we've been stuck for a decade
SCORM did this to e-learning,
created vibrant diverse marketplace
but now it holds us back
-lack of an interoperability standard has led to an outgrowth of monolithic systems
-that's natural, when systems can't interoperate/communicate, everything needs to happen in one system
-but that's not how learning happens
-these systems haven’t prevented us from learning in innovative ways, they just don’t have visibility into it
I love this quote.
It perfectly describes where eLearning is today
We have so much technological capability to do things that just aren’t incorporated into eLearning
eLearning is 8-10 years behind the state-of-the-art
-learning happens everywhere, not in a system, it can't be confined
-internet age is a utopia for learning, information is everywhere, we are constantly learning now
-you guys know this better than anybody
-we work with professional specialized, industry-leading learning systems every day, but all of these learning systems are dwarfed by two of the world's biggest learning systems created under this roof: Google Search and YouTube.
-These are but two examples of where people really learn, but it is all disconnected.
-Currently if it doesn't happen in the monolithic system we don't have any visibility into it, it can't be tracked.
-A person's knowledge and learning experiences are broad and vast, so much more than just what happens in an LMS, no matter how monolithic that LMS becomes.
-We should never force a learner into an unnatural context or system to gather knowledge.
-a better model is to let learning happen naturally, in the environment and context that learner prefers.
-let learning happen where the learner lives.
-My YouTube story, when I wanted to learn how to weather-strip the windows in my house, I didn't go to an LMS and register for a class, I just pulled out my phone and watched a YouTube video while I was standing in front of the window.
-It was a VERY effective way of learning. It gave me just the information I needed, at the precise moment of need, without taking me away from the job.
-Wouldn't it be nice if that bit of learning was tracked though? Perhaps it's a trivial example, but perhaps someday I'll be working with Habitat for Humanity, and the build supervisor will be looking for somebody qualified to weather-strip the windows.
-learning systems *should* be specialized.
-both for tracking learning and for delivering learning.
-people learn differently. context matters *a lot*.
-organizations care about different things. departments have different metrics and goals. professions have different requirements.
-many organizations fight that, they put tracking above learning efficacy
-but there's something missing. there needs to be a connection.
-how do we bring the data from these systems together?
-how do we get a holistic picture of a person's learning experiences?
-how do we inform one system about relevant learning experiences from another system?
-taken one step farther, how do we connect learning experiences that happen across a person's entire life, across schools, companies, hobbies etc
-our learning environment should consist of series of best of breed services, but they need to be connected.
-they need to be tightly integrated, yet loosely coupled.
-At Rustici Software, we've been thinking this way for years, and it's manifesting itself in how we designed the next industry standard.
-enter the Tin Can API - a common language for talking about the things that people are doing
-short history - a replacement for SCORM; we created it from a research project; reached v1.0 in April; also called xAPI; very early in its evolution, but already seeing rapid adoption and a profound impact on the way people think about learning systems
-remember a standard confines an industry.
-a new standard that opens up new possibilities creates massive new opportunity and is a chance to rethink learning systems from the ground up.
-that is happening right now. it is still very early days, but the trends are unmistakable.
-so what does Tin Can do? best explained with a layers-of-the-onion analogy
-started out mostly with basic requirements, layer 1: mobile, games, simulations, offline, simpler, etc. These were straightforward requirements, easily met with a simple REST API and a flexible data model.
-quick note on how Tin Can works, this is a tech talk after all. It's an API for exchanging a series of statements of the form "I did this". The actor-verb-object construct allows us to capture a broad set of experiences [include a few examples].
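A minimal sketch of one such statement, built as plain JSON in Python. The learner, activity id, and email are invented for illustration; the verb URI follows the ADL verb vocabulary.

```python
# Minimal sketch of a Tin Can (xAPI) actor-verb-object statement,
# built as a plain Python dict. The learner, verb URI, and activity
# id below are illustrative, not from any real system.
import json

def make_statement(email, name, verb_id, verb_label, activity_id):
    """Build one 'I did this' statement in the xAPI data model."""
    return {
        "actor": {"mbox": f"mailto:{email}", "name": name},
        "verb": {"id": verb_id, "display": {"en-US": verb_label}},
        "object": {"objectType": "Activity", "id": activity_id},
    }

# "Tim experienced 'Weather-stripping a window'."
stmt = make_statement(
    "tim@example.com", "Tim",
    "http://adlnet.gov/expapi/verbs/experienced", "experienced",
    "http://example.com/videos/weather-stripping-a-window",
)
print(json.dumps(stmt, indent=2))
```

The point of the construct: any system that can emit a record like this can report learning, whether it's an LMS, a mobile app, or a video player.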
-if we just stop here, it's already a big step forward but this simple construct opens up a broader set of capabilities with profound implications
-layer 2- can track both formal and informal learning, LMS doesn't have to know about it in advance, anything can be content or a learning experience. now we can track it
-layer 3 - the Learning Record Store
-just started using a new word, "LRS"; kind of accidentally invented a new class of enterprise software
Diagram 5 – highlight Content Delivery and tracking from diagram 3
Two core pieces provide most of the utility of the LMS
Diagram 6 – same as Diagram 5, but arrows labeling them “Learning Record Store (LRS)” and “Training Delivery System (TDS)”
With Tin Can we can separate these components, the LRS and TDS
-independent system for recording and analyzing learning experiences.
-inherently open, to facilitate transfer of data across systems.
-causes us to re-think enterprise learning systems.
-let's have our specialized systems. let's let learners learn wherever they want to.
-let tracking take a back seat instead of driving system architecture and more importantly learning/instructional design.
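Under the hood, this separation rests on the LRS's REST Statements resource: systems write statements in and query them back out with filters. A rough sketch of how a client might form those requests; the endpoint URL and credentials are placeholders, while the version header and the `agent`/`verb` query parameters come from the xAPI spec.

```python
# Sketch of talking to an LRS over its REST Statements resource.
# The endpoint and credentials are placeholders; the required
# X-Experience-API-Version header and the agent/verb filters are
# defined by the xAPI spec.
import base64
import json
import urllib.parse

LRS = "https://lrs.example.com/xapi"  # hypothetical LRS endpoint

HEADERS = {
    "X-Experience-API-Version": "1.0.0",
    "Content-Type": "application/json",
    "Authorization": "Basic " + base64.b64encode(b"key:secret").decode(),
}

def statements_url(**filters):
    """URL for querying the Statements resource, e.g. by actor or verb."""
    query = urllib.parse.urlencode(filters)
    return f"{LRS}/statements" + (f"?{query}" if query else "")

# Ask the LRS for everything one learner has "experienced":
url = statements_url(
    agent=json.dumps({"mbox": "mailto:tim@example.com"}),
    verb="http://adlnet.gov/expapi/verbs/experienced",
)
```

Because the interface is just HTTP plus a shared data model, any delivery system can write to any LRS, which is what lets the TDS and LRS live in separate products.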
Introduce the Training Delivery System (TDS)
Diagram 12
An LRS can be used to consolidate data from multiple LMSs.
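One way to picture that consolidation, assuming each LMS can export its activity as xAPI statements. The timestamps and verbs here are invented for illustration.

```python
# Sketch: merging statement streams exported from several LMSs into a
# single timeline, the kind of consolidated view an LRS can provide.
# Timestamps and verbs are invented for illustration.
from itertools import chain

def consolidate(*streams):
    """Merge per-system statement lists, ordered by ISO-8601 timestamp."""
    return sorted(chain.from_iterable(streams),
                  key=lambda s: s["timestamp"])

hr_lms = [{"timestamp": "2013-05-02T10:00:00Z", "verb": "completed"}]
sales_lms = [{"timestamp": "2013-05-01T09:30:00Z", "verb": "attempted"},
             {"timestamp": "2013-05-03T14:00:00Z", "verb": "passed"}]

timeline = consolidate(hr_lms, sales_lms)
```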
let's have a personal data locker.
-layer 4 - compare learning data to performance. wow, what a concept; why haven't we been doing this all along?
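A toy sketch of what layer 4 could look like, assuming we can count each learner's completion statements in the LRS and pull a performance metric from another system. All the numbers here are invented.

```python
# Sketch: lining learning data up against a performance metric.
# Per-learner counts of completed training statements vs. an external
# performance score (e.g. from a CRM). Data is invented.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

completed = [2, 5, 1, 7, 4]      # training completions per sales rep
deals     = [10, 22, 8, 30, 18]  # deals closed, from the CRM
r = pearson(completed, deals)    # close to 1.0 for this invented data
```

Real analysis would be far more careful than a single correlation, but even this shape of question was impossible when the learning data never left the LMS.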
-so what are the implications of all this?
-3 big implications
-biggest change is freeing instructional designers to design the ideal learning experience, using whatever technical (or non-technical) solutions are best for the problem.
-tracking is no longer limiting us.
-we are free to use many modalities, fully realize technology's potential.
-specialization of learning systems by breaking up the monolithic LMS; use best-of-breed components.
-as we understand it, this approach is similar to how Google approaches the world.
-Google is a series of shared services, tied together under a Google umbrella through a Google account.
-They communicate via APIs when necessary but remain independent.
-Allows us to use best of breed as necessary. Swap out Google Video for YouTube. Great example of tightly integrated yet loosely coupled.
-introducing a focus on analytics into learning organizations.
-lots of thought about possibilities for big data in learning
-this doesn't just apply in the corporate world. in fact, it is probably more important for us to start applying technology in the educational sector.
-I have elementary-school-age children. I watch them playing educational games on my phone or on the computer all the time, but their teacher doesn't have any visibility into it.
-how great would it be for Ms McMurry to be able to see that Amelia already knows 3-digit addition so she can focus her precious classroom time on teaching her something more advanced?
-when i think about the possibilities i get very excited.
-where are we now:
-vendors in learning industry jumping on it.
-but most at SCORM parity level.
-such fundamental change introduces big disruptive effect for the industry.
-vendors tangential to the eLearning industry are now entering it as a market
-innovative companies are experimenting with and piloting new capabilities.
-most are focused on the feasibility of an LRS-centric architecture.
-how to make meaning of Tin Can data. how to correlate learning with performance.
-at the same time, introducing new sources of learning data into the fold: mobile, checklists, and simulations are most popular right now.
-still very early stages though, most of first pilots aren't even finished yet.
-key insight - start small, connect narrower pieces, solve many specific problems to discover the patterns and generalities
-these trends are happening, but we still have a lot to prove and implement before this takes hold
-so we're in a very exciting time, leading to much change and opportunity.
-tin can doesn't solve every problem, but it removes so many of the constraints of old that I'm confident we are entering an exciting time.
-there is so much pent up demand, it is exciting to see it finally being satisfied.
Slide - Timing 2 (timeline visual)
-Tin Can was at version 0.9, 0.95 was about to be released
-This project informed some of the evolution of the spec through these early drafts.
-Two types of activity providers, traditional SCORM courses and videos
-Both in a browser and on a mobile device, either connected or disconnected
-All data needed to be centrally aggregated and reliably stored to drive reports that feed into business processes
-Now let’s dive into each of the specific components
Diagram 14 – a real world example
Vandy:
-Using LRS as center of architecture
-Previously many LMSs; transition from the LMS being the center to the LMS being one source of learning data
-Data to come in from many other places
-Independent reporting tool possibilities
-Tying learning outcomes to organizational pillar goals
-telecom – doing compliance training better, measuring actual result in reported incidents
-disaster relief – measuring performance of training programs through after action reviews
-govt health – centralizing data from many LMS systems, awarding badges and measuring training effectiveness through simulation performance
-credit card – looking at sales training and correlating that with data from the CRM to measure its effectiveness