
Training Measurement Needs a Purpose

Gathering data and conducting an analysis provide information. But for that information to be useful, it must serve a purpose: 

  • To improve the design of the learning experience: Evaluation can help you verify the needs assessment, learning objectives, instructional strategies, target audience, delivery method, quality of delivery, and course content.

  • To determine if the objectives of the learning experience were met and to what extent: The objectives are stated in measurable and specific terms. Evaluation will determine if each stated objective was met. However, it is not enough to know only if the objectives were met; you must know the extent to which they were met. This knowledge will allow you to focus your efforts for content reinforcement and improvement.

  • To determine the adequacy of the content: How can the content be more job related? Was it too advanced or not challenging enough? Does the content support the learning objectives?

  • To assess the effectiveness and appropriateness of the instructional strategies: Case studies, tests, exercises, and other instructional strategies must be relevant to the job and reinforce course content. Does the instructional strategy link to a course objective and content? Is it the right instructional strategy to drive the desired learning or practice? Was there enough instruction and feedback? Does it fit with the organization’s culture? Instructional strategies, when used as part of evaluation, can measure the knowledge, skills, and abilities (KSAs) the learning experience offers.

  • To reinforce learning: Some evaluation methods can reinforce learning. For example, a test or similar performance assessment can focus on content so that content retention is measured and evaluated. The measurement process itself causes the learner to reflect on the content, select the appropriate content area, and use it in the evaluation process.

  • To provide feedback to the facilitator: Did the facilitator know the content? Did the facilitator stay on topic? Did the facilitator provide added depth and value based on personal experience? Was the facilitator credible? Will you use the evaluation information to improve the skills of the facilitator?

  • To determine the appropriate pace and sequence: Do you need to schedule more or less time for the total learning experience or certain parts of the learning? Were some parts of the learning experience covered too quickly or too slowly? Does the flow of the content make sense? Does the sequence follow a building-block approach?

  • To provide feedback to participants on learning: Are the participants learning the course content? Which parts are they not learning? Was there a shift in knowledge and skills? To what extent can the participants demonstrate the desired skills or behavior?

  • To identify which participants are experiencing success in the learning experience: Evaluation can identify which participants are grasping the new knowledge and skills and excelling in their understanding of the content and its use on the job.

  • To determine business impact, cost-benefit ratio, and ROI for the program: What was the shift in the identified business metric? What part of that shift was attributable to the learning experience? Was the benefit to the organization worth the total cost of providing the learning experience? What is the bottom-line value of the course’s impact on the organization? (A worked example follows this list.)

  • To identify the learning used on the job: What part(s) of the learning experience are being used on the job? To what extent?

  • To assess the on-the-job environment to support learning: What environmental factors support or inhibit the use of the new knowledge, skills, abilities, and behaviors on the job? These factors could be management support, tools and equipment, recognition and reward, and so on.

  • To build relationships with management: The evaluation process requires a conversation with management about the business metric, evaluation plan, collection of information, and the communication of results. This continual interaction provides the opportunity to build relationships and add value to the accomplishment of objectives.

  • To decide who should participate in this or future programs: The needs assessment includes an audience analysis. In addition, the evaluation will help determine the extent to which the content applies to a person’s actual job.

  • To gather data for marketing purposes: Positive results can help promote the learning experience to other potential participants. They can also help position the talent development unit as adding value to internal clients.
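
To illustrate the cost-benefit ratio and ROI purpose above, the formulas commonly used in training evaluation (a standard formulation, not specific to this article) are:

Benefit-cost ratio (BCR) = program benefits ÷ program costs
ROI (%) = (program benefits − program costs) ÷ program costs × 100

For example, a program that costs $50,000 and produces $120,000 in measurable benefits would have a BCR of 2.4 and an ROI of 140 percent. (The dollar figures here are purely illustrative.)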

As you can see, there are many purposes of evaluation, and the preceding list is not exhaustive. How is this information used? The evaluator determines the purpose of the evaluation as part of the evaluation plan (to be discussed later). That purpose then drives the decisions to be made, the type of data collection instruments used, the timing, the sources, and the location of the data. 

For advice on how to conduct effective and efficient training evaluation, check out Evaluation Basics, 2nd Edition. Each chapter focuses on a critical aspect of developing and implementing an evaluation plan for a face-to-face or virtual training program. You’ll not only delve into Kirkpatrick’s four levels of evaluation and the methods and instruments you can use, but you’ll also get help effectively communicating results.

About the Author
Don McCain is founder and principal of Performance Advantage Group and has more than 28 years of corporate consulting experience. He has authored three books as well as numerous articles in industry journals. He holds a BS in business administration, an MA in divinity, an MS in business administration, and a PhD in education and HRD from Vanderbilt University.