ATD Blog

A Better Framework for L&D Measurement and Analytics

Monday, January 11, 2021

Too many L&D practitioners struggle to create a robust measurement and reporting strategy that meets their stakeholders’ needs. They don’t know what measures to include, are confused about how to define them, and have no guidance about how to report them. Most practitioners don’t even know where to start. It shouldn’t be this hard.

Our new book, Measurement Demystified: Creating Your L&D Measurement, Analytics, and Reporting Strategy, was written to address this important need. We want to demystify the process by providing an easy framework based on Talent Development Reporting Principles (TDRp) for classifying, selecting, and reporting the most appropriate measures to meet your needs. TDRp is an industry-led effort, begun in 2010, to bring standards and best practices to L&D in particular and HR in general. Over the last 10 years, TDRp has evolved into a holistic, comprehensive framework that provides guidance on all aspects of creating a measurement and reporting strategy.

We believe the starting point for a good strategy is with the users and the reasons to measure. We also believe a framework should be simple and easy to use, so TDRp groups all the possible reasons to measure into four categories. This makes communication about measures much easier because we have a common language and classification scheme, much like accountants have in generally accepted accounting principles (GAAP). The four overarching reasons are to inform, monitor, evaluate and analyze, and manage.

Breakdown of TDRp Assumptions based on Measurement Demystified
Inform means the measures are used to answer questions and identify trends. Monitor implies that the value of the measure will be compared to a threshold to determine whether the value is within an acceptable range. Evaluate and analyze refers to the use of measures to determine whether a program is efficient and effective, or whether there is a relationship among key measures, such as the amount of training and employee engagement scores. Lastly, manage describes the use of measures to set plans, review monthly progress against plans, and take appropriate action to come as close as possible to delivering the promised results by year-end. Upfront discussions with users should identify their reason for measuring, which will be important when selecting the right report for their measures.

The next element in the framework is a classification scheme for the types of measures. Again, our goal is to make it simple to remember and simple to use. We have more than 170 measures just for L&D, so we need a way of grouping them that will be helpful. We suggest three types: efficiency, effectiveness, and outcome. Efficiency measures answer "How much?" with data like the number of participants or costs. Effectiveness measures answer "How well?" with data about participant reaction or application of learning. Outcome measures answer "What is the impact of the training?" through data that quantifies the impact on organizational goals, whether financial, customer, operational, or people related. The classification framework makes it easier to provide guidance in selecting the right measures.
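To make the three-way classification concrete, here is a minimal Python sketch. The measure names and their groupings are illustrative examples only, not the book's full taxonomy of 170-plus measures.

```python
# Hypothetical sketch of the TDRp measure types; the example measures in
# each group are illustrative, not the book's complete list.
MEASURE_TYPES = {
    "efficiency": ["number of participants", "total cost", "courses delivered"],   # "How much?"
    "effectiveness": ["participant reaction", "learning", "application"],          # "How well?"
    "outcome": ["impact on sales", "impact on engagement"],                        # "What impact?"
}

def classify(measure: str) -> str:
    """Return the TDRp measure type for a known measure name."""
    for measure_type, measures in MEASURE_TYPES.items():
        if measure in measures:
            return measure_type
    raise KeyError(f"Unknown measure: {measure}")

print(classify("participant reaction"))  # effectiveness
```

A lookup like this is only a mnemonic for the grouping; in practice the book's definitions and formulas determine where each measure belongs.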


All programs should have at least one efficiency and one effectiveness measure; often, a single program will have several of each. If the program supports a key organization goal, such as increasing sales or employee engagement, then it should also have an outcome measure, such as impact on sales. TDRp provides guidance for measure selection based on the type of program, and the book includes definitions, formulas, and guidance on over 120 measures, including those benchmarked by ATD and recommended by ISO.

Once you select your measures, you need to decide how to share them. TDRp again provides guidance to simplify your decision making. We start with a framework of five types of reports: scorecards, dashboards, program evaluation reports, custom analysis reports, and management reports. Each type of report maps to the reason for measuring; once you have had the upfront discussion with the user and understand their needs, TDRp indicates which report best meets that need.
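As a sketch of how the reason for measuring maps to a report type, here is a hypothetical lookup in Python. The mapping condenses the descriptions in this article and is not an official TDRp artifact.

```python
# Illustrative mapping from the four TDRp reasons for measuring to the five
# report types described in the article; a simplified sketch, not the book's
# decision guidance.
REPORTS_BY_REASON = {
    "inform": ["scorecard", "dashboard"],
    "monitor": ["scorecard with thresholds", "dashboard with thresholds"],
    "evaluate and analyze": ["program evaluation report", "custom analysis report"],
    "manage": ["management report"],
}

def recommend_reports(reason: str) -> list[str]:
    """Suggest report types once the upfront discussion has settled the reason."""
    return REPORTS_BY_REASON[reason]

print(recommend_reports("manage"))  # ['management report']
```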


A scorecard is simply a table of rows and columns in which the rows are the measures and the columns are the time periods (such as months or quarters). Scorecards work well when you want detailed data to answer questions such as the number of participants by month and whether there is a trend in the data. In contrast, a dashboard contains summary data and often has visual components like graphs or bar charts. A dashboard may also be interactive, allowing you to drill down into the data. Scorecards and dashboards are used when the reason for measuring is to inform. Both may also be used when the reason is to monitor, but in that case you would also include color coding or threshold levels to alert you when a measure is within (or outside) an acceptable range.
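The scorecard's rows-by-columns layout can be illustrated in a few lines of Python. The measures and monthly figures below are invented for illustration.

```python
# A minimal scorecard sketch: rows are measures, columns are time periods.
# The measures and figures are hypothetical examples.
months = ["Jan", "Feb", "Mar"]
scorecard = {
    "participants": [120, 135, 150],     # an efficiency measure
    "courses delivered": [8, 9, 11],
}

def trend(values: list) -> str:
    """Crude trend check: compare the last period to the first."""
    return "up" if values[-1] > values[0] else "down or flat"

# Render the table: measure names down the rows, months across the columns.
print(f"{'measure':<20}" + "".join(f"{m:>6}" for m in months))
for name, values in scorecard.items():
    print(f"{name:<20}" + "".join(f"{v:>6}" for v in values))
```

Reading across a row answers "how many participants by month"; comparing the first and last columns is the simplest way to spot a trend.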

A program evaluation report is used when the reason to measure is to evaluate a program for its efficiency and effectiveness. You typically generate a program evaluation report at the conclusion of the program and brief the stakeholders. Similarly, you use a custom analysis report to brief stakeholders on the results of a statistical analysis of the relationships among measures (for example, the extent to which training impacts retention or engagement). The last type of report is the management report, which you should use when the reason to measure is to manage. Management reports require setting specific, measurable plans (targets) for key measures and generating monthly reports that compare year-to-date results against plan and the year-end forecast against plan. These reports have the same look and feel as reports used by sales and manufacturing departments.
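The plan-versus-actual arithmetic behind a management report can be sketched as follows. The numbers and the simple run-rate forecast are hypothetical assumptions for illustration, not the book's method.

```python
# Sketch of the management-report comparison: year-to-date results vs. plan,
# plus a naive run-rate forecast for year-end vs. the annual plan.
# All figures are invented.
def ytd_vs_plan(monthly_actuals: list, monthly_plan: list) -> tuple:
    """Compare year-to-date actuals against the plan for the same months."""
    ytd_actual = sum(monthly_actuals)
    ytd_plan = sum(monthly_plan[: len(monthly_actuals)])
    return ytd_actual, ytd_plan, ytd_actual - ytd_plan

def year_end_forecast(monthly_actuals: list, months_in_year: int = 12) -> float:
    """Project the average month so far to a full-year figure (run rate)."""
    return sum(monthly_actuals) / len(monthly_actuals) * months_in_year

actuals = [100, 110, 120]   # three months of results for some key measure
plan = [105] * 12           # flat monthly plan; annual plan = 1,260

ytd_actual, ytd_plan, variance = ytd_vs_plan(actuals, plan)
forecast = year_end_forecast(actuals)
print(ytd_actual, ytd_plan, variance, forecast)
```

Here the program is 15 units ahead of plan year-to-date, and the run-rate forecast of 1,320 beats the 1,260 annual plan, which is exactly the kind of signal a monthly management report surfaces so you can take action before year-end.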

There are three varieties of management reports:

  • The program report provides detail on learning programs in support of a single organization goal such as increasing sales.
  • The operations report shows data aggregated across all courses for those measures the CLO is managing for improvement.
  • The summary report shows how learning is aligned to the CEO’s goals as well as other important organization needs and includes the key measures for each. This report is shared with the CEO and governing bodies to show the alignment and impact of L&D.

Last, we share all the elements of a measurement and reporting strategy with step-by-step guidance to make it easy for you to create your own. We also include a sample strategy in the appendix you can use as a template. Our hope is that the TDRp framework combined with the detailed guidance and lots of examples will take the mystery out of measurement and make it easy for you to create your own measurement and reporting strategy.

About the Author

David Vance is the Executive Director of the Center for Talent Reporting, a nonprofit organization dedicated to the creation and implementation of standards for human capital reporting and management. He is the former President of Caterpillar University, which he founded in 2001. Until his retirement in January 2007, he was responsible for ensuring that the right education, training, and leadership were provided to achieve corporate goals and efficiently meet the learning needs of Caterpillar and dealer employees. Prior to that position, Dave was Chief Economist and Head of the Business Intelligence Group at Caterpillar Inc., with responsibility for economic outlooks, sales forecasts, market research, competitive analysis, and business information systems.

Dave received his Bachelor of Science degree in political science from M.I.T. in 1974, a Master of Science degree in business administration from Indiana University (South Bend) in 1983, and a Ph.D. in economics from the University of Notre Dame in 1988. He was named 2006 Chief Learning Officer (CLO) of the Year by Chief Learning Officer magazine and 2004 Corporate University Leader of the Year by the International Quality and Productivity Center in its annual CUBIC (Corporate University Best in Class) Awards. Caterpillar was ranked number one in the 2005 ASTD Best Awards and was named Best Overall Corporate University in 2004 by both Corporate University Xchange and the International Quality and Productivity Center.

Dave is a frequent speaker at learning conferences and association meetings. He conducts workshops on managing the learning function and teaches at Bellevue University and the University of Southern Mississippi. His current research focus is the development of Talent Development Reporting Principles (TDRp) for L&D in particular and human capital in general.
He also is a trustee and lead independent director for State Farm Mutual Funds, and an advisory board member for Capital Analytics, Inc. and Knowledge Advisors. His first book, The Business of Learning: How to Manage Corporate Training to Improve Your Bottom Line, was published in October 2010.

About the Author

Peggy Parskey has over 25 years of experience driving strategic change to improve organizational and individual performance. She focuses on team and organizational performance improvement, leveraging performance measurement, change management, and organizational design to ensure sustainable capability. Peggy's firm, Parskey Consulting, focuses on culture, strategy, and change enablement. She has melded her expertise in change with her deep experience in measurement to enable senior leaders to manage their change initiatives through data-informed decision making.

In addition to serving her own clients, Peggy is a Principal Consultant at Explorance, where she provides business development support and delivers strategic consulting services to talent and learning organizations. In this role, she develops talent measurement strategies, integrates measurement into talent processes, develops action-oriented reports, scorecards, and dashboards for clients, and conducts impact studies to demonstrate the linkage between talent programs and business outcomes. For over ten years, Peggy was the Assistant Director at the Center for Talent Reporting (CTR), where she was responsible for credentialing talent practitioners in the execution of Talent Development Reporting Principles. She co-delivers the Measurement Demystified Workshop with David Vance, now offered under the auspices of the ROI Institute.

Peggy has coauthored three books: Learning Analytics: Using Talent Data to Improve Business Outcomes (2nd edition, with John Mattox and Cristina Hall, Kogan Page, 2020), Measurement Demystified (with David Vance, ATD Press, 2020), and Measurement Demystified Field Guide (with David Vance, ATD Press, 2021). She has authored several white papers and book chapters and delivers webinars to help upskill talent practitioners on measurement best practices. Peggy holds a Bachelor of Science degree in mathematics from Simmons College and two master's degrees from the University of Chicago, in statistics and business administration. She is a past president of the Los Angeles Chapter of ATD, having also served as program director and chief financial officer.
