Our technology delivers real value from the data behind learning. We evaluate the data we collect to build a deeper understanding of learners by examining their choices, actions, and results within an online learning system.
This strategy has helped us sharpen our goals for understanding each learner, which in turn lets us provide better content and materials and ultimately cater to every learner's level of need.
With data we can curate and improve content, judge whether it is relevant, efficient, and effective, and decide how to adapt it to the program and to each individual engaged with it.
Some of the most important questions to consider when optimizing the use of this data include:
What does learning truly mean?
How do people most effectively gain knowledge, practices, and skills?
Overview of the Data Process
PHASE 1: Everything Starts with an Open Source Solution
Learning Management Systems try to fill the gap by answering the most important questions. Collecting data through an LMS enables:
Better assessment, by evaluating not only the end result but also how learners get there, looking at:
How many attempts a learner takes
How many times they check their answer before submitting
The duration of time between attempts
By understanding each of these elements, LMSs and learning technologists can implement better learning tools.
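As a concrete illustration, the three assessment signals above can be derived from a raw event log. The event schema and field names here are invented for the sketch, not a specific LMS format:

```python
from datetime import datetime, timedelta

# Hypothetical event records as an LMS might emit them; the schema
# (learner / action / ts) is illustrative only.
events = [
    {"learner": "ada", "action": "check_answer", "ts": "2024-03-01T10:00:00"},
    {"learner": "ada", "action": "submit",       "ts": "2024-03-01T10:02:00"},
    {"learner": "ada", "action": "submit",       "ts": "2024-03-01T10:10:00"},
]

def attempt_metrics(events, learner):
    """Derive per-learner assessment metrics from raw events."""
    mine = [e for e in events if e["learner"] == learner]
    submits = [datetime.fromisoformat(e["ts"]) for e in mine if e["action"] == "submit"]
    checks = sum(1 for e in mine if e["action"] == "check_answer")
    gaps = [b - a for a, b in zip(submits, submits[1:])]
    return {
        "attempts": len(submits),                # how many attempts the learner takes
        "checks_before_submit": checks,          # answer checks before submitting
        "mean_gap": sum(gaps, timedelta()) / len(gaps) if gaps else None,  # time between attempts
    }

print(attempt_metrics(events, "ada"))
```

In practice these metrics would be computed over thousands of events per learner, but the derivation is the same.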
Better collaboration, by providing the technology for discussion and engagement with other learners. Learning is much more effective in groups, where we can share and work together to solve problems. The data gathered from analyzing collaboration and interaction can also better inform how each person most effectively acquires and retains information.
PHASE 2: Adhering to Standards
Using technological best practices and standards allows for the best conversion of data into the information that really matters. For this reason our technology is held to several analytics standards, including:
xAPI: a standard that provides a way to learn exactly what a student does at any given moment while they are learning
Caliper: a standard that provides the means to develop better collaboration tools
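To make this less abstract, here is a minimal xAPI-style statement: every statement records an actor, a verb, and an object, optionally with a result. The learner, course URL, and question ID below are made up for the example:

```python
import json

# A minimal xAPI statement (actor / verb / object); the verb ID comes from
# the ADL verb registry, while the learner and course URLs are invented.
statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Ada"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/answered",
        "display": {"en-US": "answered"},
    },
    "object": {
        "id": "http://example.com/courses/algebra/quiz-1/q3",
        "definition": {"name": {"en-US": "Quiz 1, Question 3"}},
    },
    "result": {"success": True, "duration": "PT45S"},  # ISO 8601 duration
}

print(json.dumps(statement, indent=2))
```

Because every statement shares this shape, activity from different tools can be pooled and analyzed together, which is exactly what makes the standard valuable.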
Commonly Known Collaboration Tools
People are enabled to collaborate on the same document from anywhere in the world.
Groups can communicate and make decisions via dedicated targeted channels from anywhere in the world.
Teams can manage projects, follow team status, and track completion and progress of all elements throughout the process from anywhere in the world.
A variety of users can view a video all at the same time and simultaneously interact with it, comment on it, and ask questions from anywhere in the world.
Consider the difference that everyday collaboration tools have made in your life and imagine the possibilities that the same type of technology can enable for learning.
PHASE 3: Creation of Solutions
Once data is pooled, different labs are created to provide the following solutions to institutions and analysts:
-Static reports summarize data on a scheduled, cyclical basis (e.g. every month)
-Dashboards update in real time to allow constant monitoring of status and learning trends
Data Science Lab
-Based around a data warehouse
-Provides machine learning modules that make predictions about the learning process, find correlations, and predict a user's knowledge and behavior based on previous experience and similar learning patterns
-Profiles are built on each user
For example, a profile might include the following types of conclusions:
-Spends 30 minutes/day learning
-Focuses most on labs and simulations
-Doesn’t like to participate in collaborative discussions
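A sketch of how such a profile could be derived from activity records (the record schema and thresholds are assumptions for illustration):

```python
from collections import Counter

# Illustrative daily-activity records; the schema is an assumption.
activity = [
    {"day": "mon", "minutes": 30, "type": "lab"},
    {"day": "tue", "minutes": 25, "type": "simulation"},
    {"day": "wed", "minutes": 35, "type": "lab"},
    {"day": "wed", "minutes": 0,  "type": "discussion"},
]

def build_profile(records):
    """Aggregate raw activity into the kind of conclusions listed above."""
    days = {r["day"] for r in records}
    minutes_per_day = sum(r["minutes"] for r in records) / len(days)
    by_type = Counter()
    for r in records:
        by_type[r["type"]] += r["minutes"]
    return {
        "minutes_per_day": minutes_per_day,           # e.g. 30 minutes/day
        "focus": by_type.most_common(1)[0][0],        # where time is spent most
        "joins_discussions": by_type["discussion"] > 0,
    }

print(build_profile(activity))
```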
Complex learner and content profiles can also be developed, which works well for slower learners by providing summary breakdowns such as the time needed to complete a module or task. For instance, a module might take 20 minutes for slower learners and 5 minutes for faster learners. These profiles can also analyze and predict success rates, building on the premise that success on a particular question does not guarantee success on an exam. The result is a predicted success percentage on exams based on a learner's unique patterns, actions, and performance. Knowing this allows us as developers and designers, and institutions as providers of content, to offer the assistance necessary to get a learner to where they need to be.
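A toy version of such a success predictor is a logistic score over a few behavioral features. The features, weights, and bias below are invented for illustration; a real model would be fit on historical learner data:

```python
import math

# Invented weights for three behavioural features; in production these
# would come from a model trained on past learner outcomes.
WEIGHTS = {"question_accuracy": 2.0, "module_completion": 1.5, "avg_attempts": -0.5}
BIAS = -1.0

def predicted_success(features):
    """Logistic score: maps weighted features to a probability in (0, 1)."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Two learners with identical per-question accuracy but different habits:
fast = {"question_accuracy": 0.9, "module_completion": 1.0, "avg_attempts": 1.2}
slow = {"question_accuracy": 0.9, "module_completion": 0.4, "avg_attempts": 3.0}

print(f"fast learner: {predicted_success(fast):.0%}")
print(f"slow learner: {predicted_success(slow):.0%}")
```

The point of the example is the premise in the text: the two learners answer questions equally well, yet their predicted exam success differs because the model weighs their broader patterns, not just single-question results.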
PHASE 4: Development of Applications
After the data is broken down in the labs, the insights can be applied to different uses, including:
Recommendation Engines, which use analytics metrics to recommend different types of courses and tailor courses to each type of learner
Learner Relationship Management, where admissions staff can use the data to see a student's real status, interest, and level, then contact them with the right message and a better understanding of who they are (all insights can be sent to the learner relationship management system)
Tracking Tools, which reflect a student's place in the learning process: where they are, where they need to be, and where everyone else is. This measure is not the final goal but can be used at every level
AI Assistant on Demand Tools, which serve as assistants that provide the right content while a student is learning to help them effectively reach their goals (for example, a chatbot that makes suggestions, answers questions, and explains content)
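To give a feel for the first application, here is a minimal content-based recommendation sketch that scores courses by tag overlap with a learner's profile. The course catalogue and tags are invented for the example:

```python
# Invented catalogue: course name -> descriptive tags.
COURSES = {
    "Intro to Python":   {"programming", "beginner", "labs"},
    "Data Analysis 101": {"data", "beginner", "labs"},
    "Advanced Stats":    {"data", "advanced", "theory"},
}

def recommend(learner_tags, catalogue, top_n=2):
    """Rank courses by how many tags they share with the learner's profile."""
    scored = sorted(
        catalogue.items(),
        key=lambda kv: len(kv[1] & learner_tags),  # set-overlap score
        reverse=True,
    )
    return [name for name, _ in scored[:top_n]]

# Tags could come from a learner profile like the one built in Phase 3.
profile_tags = {"labs", "beginner", "data"}
print(recommend(profile_tags, COURSES))
```

Production engines use collaborative filtering or learned embeddings rather than raw tag overlap, but the principle, matching course attributes to learner patterns, is the same.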
“Under the Hood” Basic Breakdown
Beneath the surface of the basic process we have data repositories which are essentially collections of data that serve different purposes. These include the:
- Operational Data Repository: a regular database that captures the connections between the student and the content and contains all the relationships within the data management system
- Content Repository: documents that contain the content itself. It doesn't change often and holds few relationships, mostly text with parameters; a document repository
- Learner Activity Repository: records all events from all learners in a single data lake of huge text files. It extracts the data, transforms it, and loads it into one store
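The extract-transform-load flow feeding the Learner Activity Repository can be sketched in a few lines. The raw line format and field names below are assumptions, not an actual log format:

```python
import json

# Invented pipe-delimited activity lines, standing in for the raw text files.
raw_lines = [
    "2024-03-01T10:00:00|ada|video_play|intro",
    "2024-03-01T10:05:00|bob|quiz_submit|quiz-1",
]

def extract(lines):
    """Extract: parse raw lines into structured events."""
    for line in lines:
        ts, learner, action, obj = line.split("|")
        yield {"ts": ts, "learner": learner, "action": action, "object": obj}

def transform(event):
    """Transform: normalise values so downstream analysis is consistent."""
    event["action"] = event["action"].upper()
    return event

def load(events, sink):
    """Load: append one JSON record per event to the single store."""
    for e in events:
        sink.append(json.dumps(e))

lake = []  # stand-in for the data-lake store
load((transform(e) for e in extract(raw_lines)), lake)
print(len(lake), "events loaded")
```

Real pipelines add batching, schema validation, and error handling, but each stage maps onto the extract, transform, and load steps named above.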
Once data is collected and sorted properly, it can then go through the analysis process which involves:
- Taking the data, making predictions, and sending the results to a data warehouse and on through BI tools
- Business analysts review what is happening in the organization, learn from it, and make informed decisions based around the data
The tools we use in each phase can be tailored to work on any LMS platform (edX, Moodle, Instructure, etc.). Our goal is to provide data analytics tools and applications that can be built on top of any LMS, making data and program optimization accessible to a range of client needs.
So what does all this mean? Developing an online learning program and strategy is not just about having the right content or design, though these are extremely important. Without data, online programs lack some of the main features that elevate them above traditional learning environments. Data allows us to customize the experience for every learner and give them the tools they need to succeed and stay engaged.
What are some other ways you think data can help us better understand learners in the future? Share your thoughts with us on Twitter, Instagram, Facebook, and LinkedIn.
Chief Technology Officer