ATD Blog

A Digital Transformation: From Reporting to Learning Analytics

Thursday, September 28, 2017

Learning and development (L&D) organizations are undergoing a transformation from traditional learning to digital learning. As a result of this shift, learning leaders need to change how they evaluate the impact of their learning programs. L&D has been successful at showing what happened in training; now it needs to show stakeholders what happened after training and to use data to predict what will happen with new learning programs.

L&D organizations need to shift from reporting training metrics to using learning analytics to prove and improve the impact of learning programs on outcomes. Think of training metrics as a newspaper and learning analytics as social media: one shows what happened, while the other shows what is happening now.


In traditional training, L&D has asked learners about the training environment, the temperature of the classroom, and the food served during the course. Meeting learners' basic needs sets them up for success, but the effectiveness of training is determined by what takes place after they leave the classroom. In the new digital world, L&D should collect data about how learners are performing on the job and connect it back to the training course.

For example, after an email security training, one company sent out fake phishing emails to determine if learners would display the right behavior when faced with a possible security breach. The effectiveness of the training course was determined by how many people correctly identified the phishing email, rather than how well the learners liked the classroom experience.
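As a rough illustration of how such a simulation can be scored (the data file and column names below are hypothetical, not the company's actual tooling), the analysis can be as simple as comparing report and click rates for trained versus untrained employees:

```python
# A minimal sketch of scoring a phishing simulation as a training-effectiveness
# metric. The CSV and its columns (employee_id, completed_training,
# reported_phish, clicked_link) are placeholder names for illustration.
import pandas as pd

results = pd.read_csv("phishing_simulation.csv")

# Share of employees who correctly reported the fake phishing email,
# split by whether they completed the security training.
report_rate = results.groupby("completed_training")["reported_phish"].mean()
print(report_rate)

# Click-through rate is the failure-side metric: lower is better.
click_rate = results.groupby("completed_training")["clicked_link"].mean()
print(click_rate)
```

A gap between the trained and untrained groups on these two rates says far more about the course than a classroom satisfaction score does.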


In developing traditional training, we often use a model with steps to analyze, design, develop, implement, and evaluate (ADDIE). We often skip the evaluation step because we move on to the next project, or we don’t take the time at the start to ask about outcomes. In the new digital world, L&D should develop an evaluation strategy at the start for learning programs that are strategic, visible, and costly. Programs that are tied to company goals, affect hundreds or thousands of learners, and take time to develop and deliver need to have clear, measurable outcomes. The evaluation strategy starts with asking questions like, “What would success look like? What will people be doing differently after the training?” These questions and answers will help determine what data to track and collect.

In the past, L&D has focused on the effectiveness of the trainer in the classroom. With digital learning, control and responsibility are now in the hands of the learner, so L&D needs to look closely at learner behaviors within the digital learning experience to predict success. Companies are now using xAPI data to track time spent in the content, the time it takes to complete modules, whom learners interact with, and what content they started but did not finish. These data can then be connected with on-the-job data to predict application after training.
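For readers unfamiliar with xAPI, the sketch below shows the kind of statement a digital learning experience might send to a learning record store (LRS) to capture completion and time spent. The learner, activity ID, LRS endpoint, and credentials are placeholders, not any specific vendor's values:

```python
# A minimal sketch of sending one xAPI statement to an LRS.
# All identifiers below (actor, activity, endpoint, credentials) are placeholders.
import requests

statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Example Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/email-security/module-3",
        "definition": {"name": {"en-US": "Email Security: Module 3"}},
    },
    "result": {
        "completion": True,
        "duration": "PT18M30S",  # ISO 8601 duration: 18 minutes 30 seconds in the module
    },
}

# Post the statement to the LRS (placeholder URL and basic-auth credentials).
response = requests.post(
    "https://lrs.example.com/xapi/statements",
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_user", "lrs_password"),
)
response.raise_for_status()
```

Statements like this one, collected across learners and modules, are the raw material for the behavioral signals described above.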

L&D has developed dashboards based on learning goals, but in the digital world it needs to show how learning programs affect company goals and outcomes. L&D often shares learner feedback, completion rates, and the number of courses developed and delivered each quarter. Rather than sharing that activity with stakeholders, L&D should use KPIs to show how the digital learning experience affects revenue, customers, operations, innovation, and regulatory compliance. To do this, L&D needs to connect with the right people to ensure learning programs are contributing to key outcomes.
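One sketch of what connecting learning to a KPI can look like in practice: joining completion records to a business metric and comparing trained and untrained groups. The file names and columns here are hypothetical, and a simple comparison like this shows an association, not proof of causation:

```python
# A minimal sketch of linking course completions to a business KPI.
# The files and columns (employee_id, completed_on, q3_sales) are hypothetical.
import pandas as pd

completions = pd.read_csv("course_completions.csv")   # employee_id, completed_on
kpis = pd.read_csv("q3_sales_by_employee.csv")         # employee_id, q3_sales

merged = kpis.merge(completions, on="employee_id", how="left")
merged["trained"] = merged["completed_on"].notna()

# Compare the KPI for employees who completed the course vs. those who did not.
print(merged.groupby("trained")["q3_sales"].agg(["mean", "count"]))
```

Even a basic join like this moves the conversation with stakeholders from "how many courses we delivered" to "what changed in the business."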

As companies evolve in the digital world and L&D shifts from traditional training to digital learning, the processes and practices for showing the impact of learning programs must evolve as well.

About the Author

Scott Weersing serves as Senior Impact Analyst at Truist Bank. Scott partners with stakeholders to show the value and impact of learning interventions on outcomes. He has worked with a variety of clients and industries, including manufacturing, healthcare, insurance, and call centers, to link training to business metrics. Scott has a BA in Political Science from UCLA and an MA in Educational Technology from Azusa Pacific University.
