
ATD Blog

Mining for Insights


Thu Dec 14 2023


How a Canadian energy company upskilled its learning team and built a model program evaluation in nine months.

Learning initiatives fuel strategic progress. Leaders need actionable, timely information about program impact to keep their goals on track. However, many organizations lack effective learning evaluation programs that could provide these insights. Of the 556 respondents to TiER1 Performance’s 2023 Learning Trends Report, just 5 percent report that they generally use research-supported evaluation practices, while 64 percent feel unable to do the kinds of learning evaluation that they want to do.


What can help us improve our evaluation processes and create successful learning programs? Let’s learn from a Canadian energy company that developed both the capabilities of its learning team and a suite of powerful evaluation tools in less than nine months.

Breaking Ground

In mid-2022, the learning team at this company approached our company, TiER1, for help with evaluating a key program intended to provide foundational training for shovel supervisors in one of its mines.

To get the most from the partnership, we adopted a hybrid model in which the company’s team could learn from and work alongside Will Thalheimer, PhD, MBA, an internationally recognized learning evaluation expert and author of an award-winning book on performance-focused learner surveys. Through this hybrid model, the company’s team stood to gain three things:

  • Actionable, high-impact insights to improve the pilot course

  • The skills to repeat this process for other courses

  • Valuable experience with this learning evaluation approach to share with others across the company

Digging In

To become full partners in evaluation, the core team spent time building shared values and a common language.

We conducted background research and interviewed key stakeholders, instructors, and recent learners to better understand the needs of shovel supervisors and the goals for the course. Digging in further, the team engaged in a series of six two-hour collaborative workshops to develop foundational skills in designing metrics, studies, surveys, and scenarios and to apply those skills to the project context.


We then developed appropriate metrics and measuring tools, using the Learning Transfer Evaluation Model (LTEM) developed by Will Thalheimer to guide our work. LTEM helps teams make decisions about the kinds of data to collect for a multidimensional picture of a learning program’s effectiveness. For instance, to measure job performance, we identified key behavioral indicators (KBIs) that would show whether course objectives were really being applied on the job.
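For readers unfamiliar with LTEM, the model arranges evidence into eight tiers, from attendance up to the business effects of transfer. As a rough sketch (not the team’s actual tooling), each evaluation instrument can be tagged with the highest tier of evidence it produces. The tier names below follow Thalheimer’s published model; the instrument-to-tier mapping is our illustrative assumption.

```python
# A minimal sketch: tagging evaluation instruments with LTEM tiers.
# Tier names follow Thalheimer's published LTEM; the instrument
# mapping below is a hypothetical example, not the team's actual plan.

LTEM_TIERS = {
    1: "Attendance",
    2: "Activity",
    3: "Learner Perceptions",
    4: "Knowledge",
    5: "Decision-Making Competence",
    6: "Task Competence",
    7: "Transfer",
    8: "Effects of Transfer",
}

# Each instrument is tagged with the highest tier of evidence it produces.
instruments = {
    "performance-focused learner survey": 3,
    "scenario questions": 5,
    "capability checklist": 6,
    "KBI self-evaluation": 7,  # key behavioral indicators applied on the job
}

for name, tier in sorted(instruments.items(), key=lambda kv: kv[1]):
    print(f"Tier {tier} ({LTEM_TIERS[tier]}): {name}")
```

Laying the instruments out this way makes gaps visible at a glance: if everything clusters at Tiers 1 through 3, the evaluation can say little about whether learning transfers to the job.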

We planned a time-series study, comparing data gathered before training and at multiple points afterward to assess the training’s impact on learners (a simplified sketch of this comparison follows the lists below).

What we wanted to measure:

  • Job performance

  • Task competence

  • Decision-making skills

  • Learner perceptions of performance impact

How we measured it:

  • Anonymous self-evaluations of KBIs

  • Capability checklists

  • Scenario questions

  • Performance-focused learner surveys
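To make that design concrete, here is a minimal, hypothetical sketch of the time-series comparison: mean self-ratings on a key behavioral indicator at a pre-training baseline and at two follow-up points. The waves, rating scale, and numbers are invented for illustration and are not the company’s data.

```python
# A hypothetical sketch of the time-series comparison: mean KBI self-ratings
# at baseline (pre-training) and at two follow-up points. All data invented.
from statistics import mean

ratings = {  # 1-5 self-ratings on one key behavioral indicator, per wave
    "baseline":     [2, 3, 2, 3, 2],
    "post_30_days": [3, 4, 3, 4, 3],
    "post_90_days": [4, 4, 3, 4, 4],
}

baseline = mean(ratings["baseline"])
for wave, scores in ratings.items():
    delta = mean(scores) - baseline
    print(f"{wave}: mean={mean(scores):.2f}, change vs. baseline={delta:+.2f}")
```

Repeating the post-training measurement at more than one point is what distinguishes this from a simple pre/post comparison: it shows whether gains hold, grow, or fade over time.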

Developing the evaluation instruments was an iterative process, involving multiple stages of team discussion, review, and revision. For example, we validated scenario questions with both experts and nonexperts. Listening to experts “think aloud” through their answers helped us ensure that the questions assessed decision making based on the intended learning points. Testing with nonexperts helped us identify and remove wording that could make the answers easier to guess.
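One simple way to operationalize that nonexpert check, sketched below under assumed conditions (four-option multiple-choice items, invented data): flag any scenario question that untrained respondents answer correctly well above chance, since that suggests the wording cues the answer.

```python
# A hypothetical screen for guessable scenario questions: flag items that
# nonexperts answer correctly well above chance. Data and thresholds invented.
CHANCE = 0.25      # four answer options per item
THRESHOLD = 0.50   # flag items nonexperts get right half the time or more

nonexpert_correct = {  # item id -> share of nonexperts answering correctly
    "scenario_01": 0.20,
    "scenario_02": 0.65,  # likely guessable; wording may cue the answer
    "scenario_03": 0.30,
}

for item, p_correct in nonexpert_correct.items():
    if p_correct >= THRESHOLD:
        print(f"{item}: {p_correct:.0%} correct among nonexperts "
              f"(chance is {CHANCE:.0%}) -> review wording")
```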


Staying Grounded

The project faced challenges, but the team persevered in excavating insights and expanding its learning evaluation capability.

We ran into obstacles implementing the time-series study as envisioned: the number of learners entering the program dropped significantly just before the study’s planned start. Because the sample size was too small for meaningful data analysis, we ultimately derived the most reliable insights from research-inspired benchmarking based on our initial interviews and observations.

Despite this setback, we still achieved several important outcomes:

  • Identifying strengths and opportunities for improvement in the shovel supervisor training program

  • Developing evaluation tools that can be used in the future for additional mine training programs

  • Informing the learning evaluation strategy for programs at the company

  • Gaining skill and confidence in evaluating learning programs, which the team could transfer to other contexts

Fueling Discovery

Research-based evaluation methods such as LTEM help teams draw insights from learning and performance experiences of all kinds and put those insights to work toward their goals. This project illustrates how collaboration and learning science can combine to create a solution that delivers high performance, develops teams, and drives business results.

Learning evaluation projects come in many shapes and sizes. Talk to TiER1 about your goals, and we can help design the right solution for your team.
