March 2020
TD Magazine

The Overwhelming Task of Communicating L&D's Business Impact

Analyzing and communicating learning's strategic business impact requires a unique skill set that is lacking in TD functions.

Proving the impact of employee development programs and initiatives has always been a nerve-racking prospect for many talent development professionals. The pace of change and the rise of analytics in the workplace have forced TD professionals to take a broader look at which metrics they should be measuring, what the data they capture means, how to align their findings with strategic business goals, and how to communicate the impact of those findings to executives.


A 2019 Association for Talent Development research report, Effective Evaluation: Measuring Learning Programs for Success, highlights the challenges associated with accessing and analyzing data, attracting and retaining TD staff with evaluation skills, and deciding what programs to allocate limited evaluation resources to. ATD surveyed 779 TD professionals and found that while 95 percent of organizations perform some type of evaluation, far fewer have found success using evaluation to meet organizational and learning goals. Among the report's findings:

  • Less than half of organizations (43 percent) use big data—defined as extremely large data sets that are too big or complex for traditional data software to process—as a data source for evaluations. Two of the most common challenges are analyzing and communicating findings from data analytics.
  • Top barriers to conducting learning evaluations are the difficulty of isolating the effects of learning programs, the lack of access to data needed to conduct high-level evaluations, limited time to properly evaluate impact, and the costs of conducting those evaluations.

John Coné, principal at the Eleventh Hour Group, says in the report that to analyze and communicate impact, companies using big data should have a purpose in mind first. "The best use of big data is to give us insights that can inform decisions," he explains. "We should agree on what we are looking for before we analyze. If you start your analysis knowing what you are looking for and what decisions you are trying to inform—even if the answer is not what you expected—communicating the results gets easier."

Conveying impact

Communicating and evaluating impact are two of the 23 capabilities that ATD's new Talent Development Capability Model pinpoints as critical skills for TD professionals. According to the model, effective communication requires knowledge of communication principles and techniques that enable a person to articulate the appropriate message for a particular audience, including:

  • creating messages that are clear, correct, complete, concise, coherent, and courteous
  • understanding the communication process and selecting appropriate communication media
  • recognizing and overcoming communication barriers
  • applying the principles of active listening
  • observing and sending nonverbal messages.

Skills important for conducting evaluations include business acumen, critical thinking, problem solving, project management, and communication.

"I think it is helpful to start with a framework in terms of what you're trying to achieve through the learning and then connecting the dots between the learning event and the business outcome," says Tom Atkinson, president of Atkinson Analytics. "It starts with having a good story to tell in terms of the evidence that shows the training is working."

So, the key for TD professionals is to identify what they are trying to measure and then to recognize to whom they are communicating the data results. Atkinson notes that three different stakeholder groups are interested in the data. "The first and most often-cited are the business leaders," he says. "Their concern is usually understanding the value to the business: What have we spent on the program? What are we getting in return? Is it giving us value?"

The second group comprises the learners and their managers. "They usually want to know: How am I doing? How can I sustain the gains? and How can I improve value, either for myself or the people I manage?" The third group comprises training professionals "who want to generally improve their processes for how we design, develop, and implement training to get the best results," Atkinson notes.

Storytelling is a powerful tool for delivering results. Business consultant Jeff Hiller, founder and president of Hillertime ATX, explains that most people try to impress by reciting facts, but finding a way to take the data and tell a compelling story that includes the what (what TD professionals are trying to say) and the why (why the audience should care) is critical in communicating analytics.

"Know your stuff," he said during the 2019 ATD International Conference & Exposition session "Leading by Numbers: Telling Compelling Stories With Boring Business Data." He added: "Look for variances in the data and find a couple of nuggets to make your point. And avoid words like kind of, seems like, might, and possibly. It is about being confident that you are clear in the what and the why."

Special skill set

Data and analytics are key drivers of organizational performance and TD, which is why they are part of the Talent Development Capability Model. The model's data and analytics capability states that TD professionals should have:

  • knowledge of principles and applications of analytics—for example, big data, predictive modeling, data mining, machine learning, and business intelligence
  • skill in identifying stakeholders' needs, goals, requirements, questions, and objectives to develop a framework or plan for data analysis
  • skill in gathering and organizing data from internal and external sources in logical and practical ways to support retrieval and manipulation
  • skill in analyzing and interpreting results of data analyses to identify patterns, trends, and relationships among variables
  • knowledge of data visualization, including principles, methods, types, and applications—for example, texture and color mapping, data representation, graphs, and word clouds
  • skill in selecting or using data visualization techniques—for example, flow charts, graphs, plots, word clouds, and heat maps
  • knowledge of statistical theory and methods, including the computation, interpretation, and reporting of statistics.

Needing the skill and having the skill are two different things. According to Corporate Learning Week's What's Trending in Corporate Analytics for 2019, talent gaps on L&D teams exist in terms of analytics and data visualization. Nearly one-third of the 100 global L&D leaders surveyed stated that significant challenges remain regarding their ability to directly define the benefits of implementing more robust analytical capabilities; 45 percent of respondents revealed they are still limited in their ability to gather, aggregate, and analyze information, frequently making decisions based on intuition rather than robust data and rigorous analysis.

"Data analysis is a level of expertise that is required, but it is a skill and capability that we, as talent development professionals, don't inherently have," says Kevin M. Yates, a self-proclaimed L&D detective who helps solve what the impact of learning is. "Historically, we have not had a need for analytics, so to expect that we can just somehow pick up that skill, we are being unfair to ourselves. It requires specialized expertise."

As the need for data analysis intensifies, it is important to explore the in-demand skills for this unique role. Some job descriptions for specialists in learning analytics include such qualifications as a high proficiency in analytical skills—for example, critical thinking, information analysis, research, communication, problem solving, and storytelling; expertise in data visualization; and advanced technical expertise in using learning analytics platforms or business intelligence tools, such as Tableau.

"When you take a look at the descriptions, the capabilities, and the performance requirements that align with L&D analysts or data scientists, they don't look anything like what we've seen before for L&D," Yates explains. "Are we really expected to have a data analytics background, along with a curriculum development background, an instructional design background, and a facilitation background? That is an unfair and unrealistic expectation to put on us as L&D professionals."

According to the Corporate Learning Week study, investments in learning analytics are leaders' highest priority over the next two years. But what do those investments look like? Does that mean attracting staff with evaluation skills? The report found that more than half of respondents said it was somewhat or very difficult to attract employees with the competencies to conduct learning program evaluations.

A Thomsons Online Benefits report, Innovation Generation: The Big HR Disconnect 2019/20, shows that more companies are beginning to invest in a dedicated analytics team, and one in nine organizations now have people analytics capabilities within their HR functions.

"I think it is worth considering making the role of analyst a separate role within the learning department," Atkinson suggests. "Most learning and development leaders don't view themselves as very sophisticated in measurement. And when you ask them why, the most common responses are lack of time and resources, not enough hours in the day."

So, what is the answer?

Along with a significant investment in technology, upskilling existing talent on the ins and outs of data analysis and communication is indeed a way to close the skills gap. But that's not the only way.

Many organizations are hiring analysts with specific skills to assist TD professionals in the new world of data and analytics. For smaller TD departments, outsourcing the data collection and analysis is also an option.


Thirty-two percent of the Innovation Generation respondents plan to upskill existing HR or TD team members, while 17 percent will hire external talent and 7 percent will outsource. "As businesses increasingly look to HR teams to supply data-based insights that play a real role in measuring and informing people and business strategy, their value to the organization will also increase," the report notes.

ATD's Effective Evaluation report reveals that just one in three organizations have a dedicated evaluator on staff. "Learning and development staff can develop the skills for evaluation if they are given the time and resources to do it," Atkinson states in the report. "They need to be able to understand the business and be able to have discussions with senior stakeholders about it. What's the business context? What results are you trying to achieve? How does your company deliver value?"

TD professionals need to take advantage of the analysts or experts within their companies who can help them tell the story. "Reach out and partner with your workforce analytics team or your people data team to fill the gap of where the capability is lacking. Data and analytics for L&D just emerged over the past few years, and we are seeing an uptick in organizations hiring specifically for that skill set," Yates explains.

"Measuring impact for learning is hard work and it can be difficult. Difficult doesn't mean impossible. You can show impact for learning," he adds. "Measuring the impact of learning is about investigating what happened. What makes it so uncomfortable is that it is not an exact science. But we can absolutely show whether or not anything at all changed as a result of learning, and we can use data, analytics, and storytelling to do that."

L&D Measurement and Analytics Roles

L&D measurement and evaluation expert Kevin M. Yates has examined job postings to identify new roles focused on measurement, data, and analytics on L&D teams. He found several different titles, including manager of learning impact and measurement; manager of learning research, reporting, and analytics; specialist/senior specialist of learning analytics; and L&D measurement and analytics data scientist.

Yates says these roles focus on measuring efficiency, effectiveness, and outcomes for training and learning. In addition, they're using data and analytics to tell L&D's story and using facts and evidence to show impact, inform decisions, answer questions, and reveal insights. Yates also notes that extensive qualifications, responsibilities, and skills are required for such roles.

Sample responsibilities from such job postings include:

  • Work with the instructional design team to implement design standards for measuring and reporting on the impact of learning experiences.
  • Organize and present learning and operational metrics in a way that enables the organization to reinforce the impact and value of programs with various stakeholders.
  • Identify and implement new methodologies for evaluating the impact of various learning initiatives.
  • Produce metrics and reporting on program participation and compliance.
  • Partner with the global measurement team to adopt common assessment and evaluation processes.
  • Create a learning effectiveness dashboard for project managers to use in client and stakeholder meetings; update dashboards as needed.
  • Create success measures aligned to business strategy.
  • Cultivate an internal network to effectively source metrics and results and maintain an understanding of what we ought to measure now and in the future to advance our value proposition.
  • Identify trends, themes, and correlations in training data; summarize impact and recommend adjustments.
  • Understand the different needs of our team, as well as key stakeholders, and cater analysis and presentation of data accordingly.
  • Establish automated data-collection mechanisms for key performance indicators and key metrics on both our operational and strategic performance.
  • Ensure data integrity and accessibility; help define data governance structure, processes, and standards.
  • Help tell the story of learning impact with data and fact-based insights in a manner that drives inspired action on the right priorities.


Demonstrate extensive abilities and/or a proven record of success as a team leader in:

  • utilizing and applying quantitative, qualitative, and mixed methods research design in the educational and organizational psychology sciences
  • utilizing and applying data analysis tools (for example, exploratory analysis, data mining, regression/predictive techniques, multivariate, cluster, and network analysis) and statistical software (for example, R, SPSS, and SAS) in projects
  • utilizing and applying computer programming skills as applied to data acquisition, analysis, reporting, and visualization (for example, R, Python, JSON, SQL, and d3.js)
  • utilizing and applying data warehousing skills, ETL (extract, transform, and load) processes, structured relational databases, and the combining of disparate data sources
  • utilizing business intelligence tools to develop reports, visualizations, and interactive web-based dashboards (for example, MicroStrategy, Visual Insight dashboards, QlikView, Tableau, and d3.js)
  • implementing xAPI across a complex learning technology ecosystem and partnering with stakeholders and instructional designers to optimize data capture to drive continuous improvement and learning effectiveness
  • evaluating data quality and designing processes to identify and correct data quality issues.

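To make the xAPI qualification above concrete, here is a minimal sketch of the actor-verb-object "statement" that an xAPI-enabled learning ecosystem records to a learning record store. The learner, course ID, and score are placeholders, not details from the article.

```python
import json

# Illustrative xAPI statement: who (actor) did what (verb) to what (object),
# with an optional result. All names and URLs below are placeholders.
statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "A. Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/courses/data-literacy-101",
        "definition": {"name": {"en-US": "Data Literacy 101"}},
    },
    "result": {"score": {"scaled": 0.85}, "success": True},
}

# Statements are sent to a learning record store as JSON.
print(json.dumps(statement, indent=2))
```

Aggregating statements like this one across programs is what lets an analyst move from raw activity records to the trend, correlation, and impact analyses the postings describe.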
About the Author

Paula Ketter is ATD's content strategist. Previously, she served as editor of ATD's periodicals.
