May 2014
TD Magazine

Big Data: A Quick-Start Guide for Learning Practitioners

Thursday, May 8, 2014

Four companies share how they have used learning analytics to show the training function's impact on the business.

Jenny Dearborn
In his December 2013 T+D article, "The Uncharted Territory of Big Learning Data," well-respected thought leader Elliott Masie laid out important high-level theories about the impact of data and analytics on the learning field, and all practitioners should take heed.

Learning leaders also need to know that the big learning data revolution is happening right now. Nearly one-third of global organizations with more than 1,000 employees already are leveraging learning-related big data, according to a recent study by the Institute for Corporate Productivity.

Yet too few examples of success have been shared, and there's little guidance on where to begin. Let's change that. I've compiled practical advice and powerful examples from learning leaders who have successfully harnessed data and analytics to advance learning and development (L&D) and the business.

The end is the beginning

The foundational principle of learning data of any size is to start with desired outcomes. That's why the chapter I co-authored in the ASTD Handbook: The Definitive Reference for Training & Development is titled "Results-Based Evaluation: Keeping the End in Mind."

Starting with the end in mind means that, for instance, to help create effective sales training, begin with target results (faster time to quota for new hires, for example). From there, work backward to identify leading indicators of results (such as opportunities closed in the first three months). Then, go even further back to track the behaviors that will affect the leading indicators (for example, new reps who focus on bestselling products earn quick wins and build momentum). Now you have a solid basis for meaningful training initiatives (measure their success, too).

This is roughly the reverse of most evaluation designs, which march through Kirkpatrick Levels 1 through 4 as if the numbering indicated a priority order rather than a progression of measurement sophistication.

Millions of reasons

There's ample proof that forward-thinking chief learning officers do harness data to measure results, fine-tune initiatives, truly develop teams, and irrefutably have an impact on the business. In fact, millions of dollars in top- and bottom-line impact can be attributed to adopting a numbers-driven mindset.

Let's start with SuccessFactors, now part of SAP. My Cloud Talent Success team drove these accomplishments by starting with the end in mind, using all our data sources, and frankly, outsourcing the analytics expertise:

  • We pinpointed success drivers of top sales performers using data from our customer relationship management system, learning management system, performance management system, and employee records. Now we develop those skills in sales reps, and recruit reps who have them. Previously underperforming reps are now exceeding quota, and planned attrition is practically nil.
  • Understanding the drivers of customer call volume helped our team fine-tune and market existing customer training, which improved use. Customers who completed training had a 94 percent lower call volume and 34 percent higher Net Promoter Score.
  • When we analyzed small average deal sizes, we realized reps weren't selling to senior decision makers. Introducing a course on engaging C-suite executives helped grow average deal size 65 percent post-training.
  • Using data to fine-tune our onboarding helped slash new-hire ramp time by 50 percent and reduce attrition by 80 percent.

Practitioners can develop targeted, precise solutions—surgical learning initiatives, if you will—and get out of the spray-and-pray L&D rut that plagues our profession.

What follows are highly diverse yet uniformly inspiring examples of other learning leaders who have wielded data as an incredibly effective tool.

Accenture: Analytics for analytics experts

Leave it to a company like Accenture to prove that better-trained account teams are more successful. The global consulting and technology giant, with approximately 281,000 employees worldwide, analyzed nearly 1,000 of its largest client projects over three years to determine "people factors" that contribute to their success. Dozens of variables included gender, geography, career level, and even vacation time. Yet training made the difference.

Research revealed that projects with a certain percentage of employees trained during or within 12 months prior were significantly more likely to be successful, and that the more recently trained the team members, the higher the likelihood of success. Sharing these findings with leadership reinvigorated executive support for training and development, and helped to curb the pressures of substituting billable hours for learning hours.

These analytics also led to the creation of predictive "learning analytics dashboards" that measure penetration versus optimal thresholds for 16 learning variables, such as functional training during the contract and soft skill training in the 12 months prior. These data enable management to increase training where needed and help reduce risk.

Accenture has deployed the dashboard to several of its largest global accounts, representing thousands of employees and hundreds of millions in revenue. Impact is being monitored, with promising early results.

"We're converting training data into language our business leaders use," says Dan Bielenberg, director of capability strategy. "For example, 'risk' is a term they know very well. ... Whether it's political instability, a change of client leadership, or a snowstorm, account heads have to manage a wide range of potential risks to client satisfaction and profitability. Now we're able to highlight potential risks related to learning."

Jiffy Lube: To measure is to make it happen

Technical certifications are a contributing factor to Jiffy Lube's success because staff must be certified to perform a service. But with 2,000 fast lube stores, 20,000 employees, and dozens of services, the lack of reliable certification tracking made it impossible for Jiffy Lube to ensure adherence to this standard.

In 2008, the learning team attacked the challenge in three phases. First, they standardized by developing 10 time-based targets (for example, entry-level certification must be completed within 30 days of hire) and percentage goals for typical stores.

Second, they established tracking via a new learning management system, Jiffy Lube University, which introduced a JLU Dashboard Report that monitors certification levels systemwide. Store managers, franchisees, and corporate leaders access a simple color-coded online report that shows completions by individual, store, franchisee, and region: at or above target in green, 50 percent to 99 percent in yellow, and less than 50 percent in red.

Third, they aligned goals. All learning team members and other franchisee-facing personnel took on individual performance goals for increasing certification levels.

"We believe that everyone wants to look good, and will take the necessary action to get into the green," says Ken Barber, who's led learning at Jiffy Lube since 2006. "Making the information available to everyone has helped create awareness and action to not only reach certification goals, but for individuals to reach personal and career goals by taking control of their development."

Indeed, total training hours more than doubled in three years. Today, 76 percent of stores are at 80 percent to 100 percent certification. A recent analysis revealed that the 33 percent of stores at 100 percent certification have average customer sales 9 percent higher than the system average. And the three highest average revenue figures by group are recorded by stores with 100 percent, 90 percent, and 80 percent training certification.

TELUS: Doing it their way

This 42,000-employee Canadian telecom company pioneered blending formal, informal, and social learning as a norm. But this necessitated re-examining measurement and reporting approaches developed for traditional classroom training.

"We needed to ensure we truly captured our activities and the effectiveness of all our initiatives, not just those within a defined location or period," says Dan Pontefract, head of learning and collaboration.

Pontefract and his team created a proprietary measurement system in conjunction with the TELUS Scorecard Governance Committee, a cross-functional HR, finance, and analysis team that helped them define goals, estimate targets, and design a survey that launched in 2011 and is revisited and refined annually.

The AUGER system measures:

  • accessing—clicking, opening, attending
  • usage—viewing, staying, reading, participating
  • grade—knowledge acquisition
  • evaluation—participant's assessment
  • return—resulting performance impact.

To determine return, every quarter TELUS surveys 8,000 randomly selected employees (50 percent managers) about their learning activities from the past 90 days. The percentage of respondents reporting positive impact—on knowledge or skills from formal and informal learning; or knowledge, collaboration, or engagement from social learning—yields return on learning (ROL).

In 2012, average ROL was 74 percent, beating the target by 3 percent and the 2011 actual result by 5 percent. By 2013, that figure had increased 100 basis points to 75 percent. Individual ROL scores by modality are revealing (as are comparisons to baseline): formal, 80 percent (versus 76 percent); informal, 85 percent (versus 78 percent); and social, 59 percent (versus 53 percent).

"Increasing overall return is one thing, but witnessing the equal spread of formal, informal, and social learning—at roughly 40 hours per team member for each category—was a testament to our pervasive learning culture," Pontefract says.

The learning team collects AUGER data via a range of sources. A custom database integrates diverse tracking systems and aggregates input for reporting by individual, department, business unit, and corporation. Any employee can access a wiki-based learning dashboard that displays, for example, investments, assessment results, and analytics, by team, division, region, or even learner age group.

"Data transparency not only holds us accountable to deliver efficient and effective learning, but holds our stakeholders accountable to their teams as well," explains Pontefract. "When any team member in the company can compare aspects such as spend, engagement, performance, and so on, it reinforces the open leadership model we've inculcated across the organization."

The truth: We need training

There is a serious global shortage of data and analytics skills. According to the McKinsey Global Institute, the United States alone faces a shortage of nearly 200,000 workers with deep analytical skills and an even greater shortage (1.5 million) of managers and analysts who can analyze data and make decisions based on their findings.


So whether we build, buy, or outsource data talent, learning leaders must find a way around this dearth of skills. Accenture, for example, is helping to increase metrics acumen and competencies for its learning team with a data and analytics training series. The inaugural October 2013 virtual session covered such topics as calculating waste reduction, measurement sustainability, logic models, results correlation, and causal models.

"We have deep analytics experts," says Bielenberg, "but we need to help the rest of our people use analytics more effectively."

You can change the traditional CLO-senior leader conversation by delivering hard evidence that you're affecting the business. Learning organizations can and must deliver more—more business insight, partnership, and measurable impact. Data can help transform L&D, but only if L&D professionals know how to use it.

Wherever you are on your learning data journey, take another step forward. Big learning data is here, now, and within reach.

A Look Under the Hood

Here’s how SuccessFactors, now part of SAP, successfully used learning data to demonstrate its impact on the business.

Sales Success Driver Analysis

A critical step was analyzing what makes our sales reps succeed. We drew data from four sources to study their influence on sales attainment:

  • customer relationship management (CRM) system: more than 110 sales variables, such as average deal size, win ratio, and sales cycle length
  • learning management system (LMS): courses taken, self-evaluations, timing of training
  • performance management system: manager ratings, goal setting, performance reviews, learning plans
  • employee records: hire date, manager, sales experience, prior domain experience.

We then used advanced statistical techniques, including univariate analysis, regression modeling, and structural equation modeling, to determine key influencers of performance and measure their impact. Lastly, we converted these influencers into key performance indicators and set targets.
Business Impact Analysis

Here we merged three data sources, then analyzed before-and-after performance to determine training impact:

  • CRM—to track pipeline, deal size, win rate, sales
  • LMS—which courses were taken and when, in addition to hire date, manager
  • commissions file—tracks attainment versus quota for each salesperson.

To measure the training initiative’s impact on pipeline, we tracked performance for a certain period before and after a key sales course completion, isolating the impact of seasonality (for example, Q4 usually being the busiest quarter).
Partner Sales Training Business Impact Analysis

We used CRM and LMS data and compared the performance of partners who completed training with partners who did not. Similarly, we tracked the number and value of deals created and won per partner, which captured training impact.

Customer Systems Implementation Training Impact

This involves substantial manual review and analysis of less sophisticated source data. We study WebEx training logs to cull lists of customers who have attended webinars, then merge that data with customer service call logs. We then compare call activity from customers who complete training versus those who do not. Similarly, we merge data from WebEx logs with Net Promoter Score and renewals databases to compare impact.

Diverse Use Cases for Learning Data

Accenture

Data initiative: Predictive "learning analytics dashboards" that demonstrate the impact of training penetration on business success.
Systems/tools/approaches used: Personnel, training, and financial data integrated into a data warehouse; initial findings based on multivariate analysis; dashboards generated using a custom visualization tool.
Learning data impact: These analytics highlight potential risks related to learning in a quantifiable way. Linking training to profitability also reinvigorated executive support for L&D.
Advice for the data novice: "Give your analytics experts room to explore, but stay focused on the decisions that you're supporting and the stakeholders making those decisions."
—Dan Bielenberg, Director of Capability Strategy

Jiffy Lube

Data initiative: Automated tracking and reporting of technician certification completions, required for each type of service performed.
Systems/tools/approaches used: LMS with back-end entry for store managers, franchisees, and Jiffy Lube leadership.
Learning data impact: Near-tripling of completions, helping slash attrition by 33 percent and save franchisees millions of dollars annually; stores with 100 percent completions have higher revenues.
Advice for the data novice: "When the data paints a clear picture of business impact, funding and support for training will never be an issue."
—Ken Barber, Manager of Learning and Development

TELUS

Data initiative: Proprietary measurement system developed with the in-house Scorecard Governance Committee; includes a "return on learning" calculation from employee-reported learning impact.
Systems/tools/approaches used: e.Survey in-house custom survey; a custom database integrates Google Analytics and PiWik (web activity), SAP (course financials, employee data), SkillSoft (e-learning activity), Confluence/Jam/SharePoint/Avaya (ratings, rankings, interactions, views), and SuccessFactors LMS (courses, completions, much more).
Learning data impact: Hard measures of impact both inform senior stakeholders and enable learning teams to fine-tune initiatives and the formal/informal/social balance.
Advice for the data novice: "Play offense—walk into the C-suite office, push the envelope, define the model, hold your ground. And take risks—develop gumption and have courage to innovate."
—Dan Pontefract, Head of Learning and Collaboration

About the Author

Jenny Dearborn is a leading authority on sales enablement and training, with expertise in big data and predictive analytics. As Senior Vice President and Chief Learning Officer for SAP, she designs and drives employee learning and enablement strategy. The National Diversity Council recognized her as one of the 50 Most Powerful Women in Technology in 2014 and 2015, and through the Fortune Most Powerful Women Network, she mentors female entrepreneurs in developing countries on behalf of the U.S. State Department. In 2013, eLearning Magazine named her team the top-performing corporate learning organization in the world.

 
