Summer 2017
CTDO Magazine

Data Points

Thursday, June 15, 2017

Don't be daunted by the measurement and evaluation process.

Despite exponential growth in big data and data analysis technology, talent development executives and organizations still struggle with determining the best process and timing for measuring talent development initiatives. In a data-driven world where CEOs focus on return on investment to justify expenditures, demonstrating the value of talent development investments gives learning leaders pause.


Alexander Lemheney, vice president of education for the Lehigh Valley Health Network (LVHN), believes there are several reasons measurement causes angst, despite its importance. For one thing, measurement can take considerable resources, which requires reprioritizing them away from other needs.

"The urgent often takes priority over the important. For instance, right now we're going through several mergers, which requires my team to assess institutional readiness and teach skills, reskilling, and upskilling of those who are joining our system," Lemheney explains. Thus, finding the time to focus on measurement is a challenge for LVHN's department of education, which provides all education and workforce development services for 17,000 employees and 5,000 students dispersed throughout Eastern Pennsylvania.

"While it's essential I can articulate the value of learning in business terms," Lemheney adds, "we are so busy with recent mergers, ongoing workforce development, and new projects that it's difficult to assign resources to measure training outcomes because they are engaged in urgent, high-stakes activities. These same folks are also called upon to innovate our programs and processes while owning responsibility for the urgent business requirements demanded today, including to assess and measure outcomes and business impact. I cannot always take them away from other priorities."

Sometimes the issue is a lack of the necessary measurement skills, coupled with limited time to acquire them. "You need a fusion of biostatistician, psychometrics, qualitative and quantitative researcher, and an understanding of business finance and quality measures. That combination skill set is rare to find. I leverage my high performers and invest in developing their knowledge and skills, task them with measurement, and they end up adding it to their already long list of priorities," Lemheney says.

Experts agree that measuring the right things relative to training is worth the effort. In "The Four Levels of Evaluation—An Update," published in the February 2015 issue of TD at Work, Jim and Wendy Kirkpatrick write that there are three reasons organizations should invest in evaluation:

  • to improve the program
  • to maximize transfer of learning to behavior and subsequent organizational results
  • to demonstrate the value of training to the organization.

Even with its challenges, LVHN successfully measures important programs, demonstrating the value of learning for the network. According to Lemheney, if you are disciplined about having a measurement strategy aligned with your core business goals, you can measure the right components of your work. His recommendations are below.

Count everything you can in your department

The talent development team should know its core functions and establish its base metrics. "We started by counting everything we could, especially metrics that are relatively easy to capture," Lemheney explains. "For instance, program volumes, materials and supplies, productivity time, resource hours on projects, and budget. Once you start capturing these foundational data points, your team starts to think more quantitatively in terms of effort and cost; this gives you information to work from."

He says his team started with simple metrics years ago, when it first decided to implement a consistent measurement strategy. Not only did it give them some practice tracking and monitoring data, but it also gave the department of education objective data to help it make better, data-driven decisions as a learning organization.
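For teams that want to start the same way, the record keeping can be as modest as a spreadsheet or a few lines of code. Below is a minimal sketch of one such record in Python; the field names and the blended hourly rate are assumptions for illustration, not LVHN's actual data model.

```python
# A minimal sketch of capturing foundational program metrics.
# Field names and HOURLY_RATE are assumptions for illustration.
from dataclasses import dataclass

HOURLY_RATE = 45.0  # assumed blended hourly rate for staff time


@dataclass
class ProgramMetrics:
    program: str             # program name
    sessions_delivered: int  # program volume
    materials_cost: float    # materials and supplies, in dollars
    staff_hours: float       # resource hours spent on the project
    budget: float            # allocated budget, in dollars

    def cost_per_session(self) -> float:
        """Rough unit cost: dollars spent divided by sessions delivered."""
        spent = self.materials_cost + self.staff_hours * HOURLY_RATE
        return spent / self.sessions_delivered


# Example: one quarter of invented data for a single program.
onboarding = ProgramMetrics("Clinical Onboarding", sessions_delivered=24,
                            materials_cost=3_200.0, staff_hours=310.0,
                            budget=20_000.0)
print(f"Cost per session: ${onboarding.cost_per_session():,.2f}")
```

Even this much structure is enough to build the quantitative habit Lemheney describes: every program gets the same fields, so effort and cost become comparable across the portfolio.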

Build your evaluation strategy at the beginning of every project

At LVHN, the evaluation strategy is considered during the initial project scope. The team starts with a front-end analysis during scoping that focuses on the business case for the program, including indicators of improvement. At that point, outcomes are identified and measurable project objectives (specific, achievable, relevant, and time-bound) are developed. These so-called "powerful" objectives reflect the outcomes that can drive your evaluation strategy (one way to record them is sketched after the list) and include:

  • reaction objectives—align with the participants gaining value from the program
  • learning objectives—describe the cognitive level of learning required from participation
  • application objectives—describe what participants will do with the new skills or knowledge they will gain
  • impact objectives—provide quantitative measures that are expected to change as a result of participation in the program
  • ROI objectives—set expectations for actual business results.
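One way to keep all five objective types in view from scoping onward is to record them in a single structure, as sketched below. This is a hypothetical schema, and the example objectives are invented for illustration; the article names the objective types but prescribes no format.

```python
# A hypothetical structure for the five objective types; the example
# objectives are invented and are not LVHN's.
from dataclasses import dataclass, field


@dataclass
class ProjectObjectives:
    reaction: list[str] = field(default_factory=list)     # perceived value
    learning: list[str] = field(default_factory=list)     # cognitive level
    application: list[str] = field(default_factory=list)  # on-the-job use
    impact: list[str] = field(default_factory=list)       # measurable change
    roi: list[str] = field(default_factory=list)          # business results


hand_hygiene = ProjectObjectives(
    reaction=["Participants rate the session 4 or higher on a 5-point scale"],
    learning=["Name the five moments for hand hygiene"],
    application=["Perform hand hygiene at each moment during observed care"],
    impact=["Unit infection rate drops 10% within two quarters"],
    roi=["Avoided infection costs exceed total program cost"],
)
```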

Prioritize which projects should include higher levels of measurement

LVHN uses Kirkpatrick's four levels of evaluation as a base, but adds return on investment, as described by Jack and Patti Phillips, as a fifth level. As you progress to higher levels of program evaluation, complexity increases, requiring more time and investment. Organizations must be thoughtful in determining what level to use for which programs. Lemheney and his team carefully prioritize the programs to which they will apply higher levels of evaluation.
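A minimal sketch of the five-level model follows, with an assumed rule of thumb for deciding how far up the levels a program warrants; the thresholds are illustrative, not LVHN policy.

```python
# The five-level model: Kirkpatrick Levels 1-4 plus the Phillips ROI level.
# The prioritization thresholds below are assumptions for illustration.
from enum import Enum


class EvalLevel(Enum):
    REACTION = 1  # Level 1: participant reaction
    LEARNING = 2  # Level 2: knowledge and skill
    BEHAVIOR = 3  # Level 3: on-the-job application
    RESULTS = 4   # Level 4: organizational impact
    ROI = 5       # Level 5: return on investment (Phillips)


def target_level(high_stakes: bool, program_cost: float) -> EvalLevel:
    """Evaluate everything to at least Level 1; escalate for stakes and cost."""
    if high_stakes and program_cost > 100_000:
        return EvalLevel.ROI
    if high_stakes:
        return EvalLevel.BEHAVIOR
    if program_cost > 25_000:
        return EvalLevel.LEARNING
    return EvalLevel.REACTION


print(target_level(high_stakes=True, program_cost=40_000))  # EvalLevel.BEHAVIOR
```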

LVHN uses a standard, electronically delivered survey for Level 1 evaluation for all of its training programs. Lemheney believes that many managers underestimate the value of measuring participant reaction.

"These are more than just smile sheets," he says. "When well-designed, there is rich data that you can get out of your post-session surveys. There's so much you can learn about your content, faculty, facilities, and participants' needs. And, this data is very easy to capture."

Level 1 evaluation provides you with a picture of the participants' perceived value of the training. Generally, this is considered to be quantitative data measured using a simple scale. However, understanding can be further enhanced if qualitative data are gathered as well; open-ended questions can provide a story behind the numbers.

Lemheney recommends developing a simple coding system to categorize and theme the qualitative data to gain more insight. "If criteria are scored at 4.8 on a five-point scale, the qualitative data will help you understand what is actually driving the number. A low score may not actually be a measure of trainer performance, but [indicate] misalignment of the content. Maybe the room was too cold or the image on the screen was difficult to see. You can gain a lot by considering both the qualitative and the quantitative data."
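A coding system like the one Lemheney describes can start as a simple keyword map. The sketch below pairs a mean scale score with keyword-based theming of open-ended comments; the themes, keywords, and responses are invented for illustration.

```python
# Pair a quantitative scale score with keyword-coded qualitative comments.
# Themes, keywords, and survey responses are invented for illustration.
from collections import Counter
from statistics import mean

scores = [5, 4, 5, 3, 4, 5]  # responses on a 5-point scale
comments = [
    "room was too cold",
    "screen hard to see from the back",
    "great facilitator, content felt off-topic for my unit",
]

THEMES = {  # hypothetical coding scheme
    "facilities": ["room", "cold", "screen", "seat"],
    "content": ["content", "topic", "material"],
    "faculty": ["facilitator", "instructor", "trainer"],
}

theme_counts = Counter(
    theme
    for comment in comments
    for theme, keywords in THEMES.items()
    if any(word in comment.lower() for word in keywords)
)

print(f"Mean reaction score: {mean(scores):.2f}")
print(theme_counts)  # Counter({'facilities': 2, 'content': 1, 'faculty': 1})
```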

Level 2 evaluation measures knowledge, which is relatively easy to measure using pre- and post-training assessments. According to Lemheney, LVHN uses a "combination of competency tests, performance assessments, checklists, and simulations. This helps us assess how the learner will apply their skills; react to the various emotionally taxing, high-stakes situations they may run into; and apply their knowledge to novel situations as they take care of patients." His goal is to measure both cognitive learning as well as the ability to apply the knowledge and skills to patient care.
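At its simplest, a pre- and post-training knowledge check reduces to comparing each learner's scores. A minimal sketch with invented data:

```python
# Pre/post knowledge-check comparison; learner IDs and scores are invented.
pre = {"rn_014": 55.0, "rn_022": 70.0, "rn_031": 60.0}   # percent correct
post = {"rn_014": 85.0, "rn_022": 90.0, "rn_031": 78.0}

gains = {learner: post[learner] - pre[learner] for learner in pre}
avg_gain = sum(gains.values()) / len(gains)

print(gains)  # per-learner improvement
print(f"Average gain: {avg_gain:.1f} points")
```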

Use your team's experts to develop cognitive and performance tests


Lemheney stresses that building good assessments and tests is important for getting accurate Level 2 data. "Often, we task our subject matter experts with test development. While your SMEs are highly competent regarding the knowledge, skills, and attitudes that are required for the job, they are not necessarily effective test question or performance checklist developers. This is where the talent development consultant can add tremendous value by partnering and working with the SMEs to develop the right evaluation tools."

In a healthcare environment, it is important to understand how training changes behavior at the point of care (Level 3). Unfortunately, when an organization has a small staff or a small budget, this step often is skipped because it is time consuming, even though classroom-based practice is what drives behavior change in the workplace. According to Lemheney, "Level 3 evaluation is not something we do for each project. But with high-stakes skills, we need to know if the learner can apply what they learned and if they actually do apply it on the job. In order to accomplish this, we use simulations to re-create the situations in which the behavior is expected to be performed." LVHN uses three different types of simulations to ensure that learners pick up the necessary skills.

First, it uses manikin-based simulations. In these, learners, individually or in groups, perform in a room that closely replicates the clinical environment, using manikins and task trainers to practice treatments such as drawing blood, inserting central lines, and other clinical procedures. Trainers observe either in person, over video monitors, or through a one-way mirror.

The second type of simulation is a computer-generated virtual reality environment. The virtual reality simulation acquaints learners with the protocol being studied and the steps involved in various types of treatment. Using this simulation, teachers can monitor critical thinking skills and treatment steps to gauge how much the learner is able to apply.

The third type of simulation uses live actors who are trained to play the role of patients, presenting various types of situations. Lemheney believes this type of learning transfers readily to the workplace. "I believe the more we can offer an experience that is closely aligned to the situation where we expect performance to occur, the easier it is to apply it in the workplace," he explains. "We use these simulations to train undergraduate medical and nursing students in core skills, as well as our licensed employed professionals for high-risk, problem-prone situations. A skill set is considered high risk when the opportunities to encounter the situation are infrequent, creating the potential for problems, and when the consequences of doing it wrong may include adverse outcomes, customer dissatisfaction, equipment damage, and, quite frankly, people getting hurt."

But Lemheney and his team don't assume these high-stakes behaviors transfer. After learners have reached an acceptable level of performance in the simulation, they are monitored on the job: each one is observed against a performance checklist, either by the trainer or a supervisor, to measure on-the-job transfer. In some cases, the team also interviews the customer, who in this case is the patient.
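A performance checklist of this kind reduces naturally to a pass/fail list scored during observation. The steps below are illustrative, not LVHN's actual protocol.

```python
# On-the-job observation against a performance checklist.
# The checklist steps are illustrative, not LVHN's actual protocol.
CENTRAL_LINE_CHECKLIST = [
    "performs hand hygiene",
    "uses full sterile barrier",
    "preps site with antiseptic",
    "confirms line placement",
]


def transfer_score(observed: dict[str, bool]) -> float:
    """Fraction of checklist steps performed correctly on the job."""
    done = sum(observed.get(step, False) for step in CENTRAL_LINE_CHECKLIST)
    return done / len(CENTRAL_LINE_CHECKLIST)


observation = {
    "performs hand hygiene": True,
    "uses full sterile barrier": True,
    "preps site with antiseptic": True,
    "confirms line placement": False,
}
print(f"Transfer: {transfer_score(observation):.0%}")  # Transfer: 75%
```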

Use your data to tell the story

LVHN rarely applies Level 4 evaluation, which measures the impact training has on results, but when it does, the organization studies institutional core data. It examines metrics such as infection rates, patient satisfaction, and other factors that are affected by the skills learned in its courses.

"It's important to build your training around those key activities and behaviors that will move the needle with regard to metrics that matter to your organization. Then, you monitor the result as a key measure of the impact of your training," Lemheney says. He admits that it is difficult to isolate training as the only factor that shapes behavior, but by presenting data from all four levels of evaluation, you can show correlations, and tell a story about the impact and value of learning.


About the Author

Meloney Sallie-Dosunmu, founder and president of Precision Talent International, is a highly skilled facilitator and organization development consultant. She has more than 25 years of experience helping private, public sector, and nonprofit organizations implement training and development initiatives that produce results. Clients rely on Meloney’s ability to diagnose training needs, engage executives in development efforts, design and conduct training, and equip in-house staff to be effective trainers.

She has a sharp understanding of organizational needs, acquired from years of experience as a learning executive in manufacturing, pharmaceutical, distribution, and nonprofit organizations. Her experience includes leadership development, coaching, workshops, and organization development projects for executives, middle managers, technical experts, and frontline leaders and employees.

Meloney has been a member of ATD for more than 30 years. She has served in many leadership roles, including as a chapter president for the Eastern Pennsylvania Chapter, leadership development team member, and national adviser for chapters.
