ATD Blog

Science of Learning 101: The Latest Research on Needs Analysis and Learning Climate

Wednesday, July 22, 2015

In the blog post I wrote last month, I discussed how knowledge is changing so fast in most fields that it’s impossible for most people to train for a job and then do that same job for any length of time. Jobs change as knowledge changes. The implications for the learning and development (L&D) function are overwhelming.

Unfortunately, L&D isn’t helping the situation. Many L&D teams are simply throwing more information at people in their organizations—more PowerPoints, more e-learning. But by doing this, they become part of the problem. Senior leaders don’t understand how learning works; L&D must.

The best way to solve this problem is to use the science of learning to deliver training programs that help the situation, not hurt it. Luckily, we have a great deal of guidance.

What Works?

In “The Science of Training and Development in Organizations: What Matters in Practice,” Eduardo Salas and his fellow authors assert that “decisions about what to train, how to train, and how to implement and evaluate training should be informed by the best information science has to offer.” How could we disagree, especially under the current circumstances? And yet they found that too many training departments have relied on fads, not science.

The authors used meta-analyses (statistical methods for contrasting and combining results from multiple research studies) to determine how to design, deliver, and implement training as effectively as possible.

I thought I’d create a guide to this jam-packed research document and reveal some of the gold nuggets within. I’ll start with some of the definitions used. Then I’ll discuss the initial part of the article, which covers what needs to be in place before training in order for that training to be effective. That’s as far as I’ll get in this month’s post. (Did I mention that the research is jam-packed?)

Definitions

Here are the primary definitions, quoted directly from the paper. Anything within square brackets is my own commentary.

  • Training: planned and systematic activities designed to promote the acquisition of knowledge (i.e., need to know), skills (i.e., need to do), and attitudes (i.e., need to feel). [I’m not sure how much we can, or even should, try to make people feel, but some learning researchers and theorists think we ought to work on changing attitudes.]
  • Learning: a process of acquiring new knowledge and behaviors as a result of practice, study, or experience.
  • Effective training: when trainees are intentionally provided with pedagogically sound opportunities to learn targeted knowledge, skills, and attitudes...through instruction, demonstration, practice, and timely diagnostic feedback about their performance.
  • The goal of training: to create sustainable changes in behavior and cognition so that individuals possess the competencies they need to perform a job.

Before Training Occurs

Figure 1 is Table 3 from the paper, showing pretraining actions that are most critical to improving training effectiveness. I’ll concentrate on a few of these actions for the rest of this article.


Figure 1. Pretraining actions most critical to improving training effectiveness (Table 3 from Salas et al.)

Needs Analysis: What’s Needed and Why

The authors found two steps to be especially critical for pretraining effectiveness: needs analysis and preparation of the learning climate. From my own knowledge of the field, many L&D teams don’t do either of these steps very often or with much precision. Could this be one reason for lack of success? Probably yes. Ever heard the phrase “Measure twice, cut once”? Analysis helps you avoid mistakes before taking action. I know teams are working under deadlines, but they can conduct an analysis of learner needs and the learning climate fairly quickly.

First, the authors clearly explain that it’s important to realize when training is not the right solution. A needs analysis will determine whether a nontraining solution is a better alternative.

Second, they explain, we often ask too few or the wrong questions. Stakeholders (such as jobholders, their supervisors, or subject matter experts) often don’t know what they need. Likewise, they often don’t know the difference between must-know content and content that a jobholder can simply retrieve as needed. The authors remind us that this distinction is of the utmost importance, because people can only process a certain amount of information at a time. If we ask people to memorize or process more information than necessary, it interferes with learning.

You and I know that this situation (too much irrelevant content) is common with learning. It needs to stop. Who will stop it if not us?

Learning Climate: What’s Needed and Why

On the topic of learning climate, the authors found that expectations about training can and do affect learning. They found that trainees need realistic expectations about the training and must see how training will enhance their work.


How supervisors handle training also matters a great deal. If training is to work, supervisors must be involved in a positive way, including preparing people and reinforcing learning objectives. They must support training in a variety of ways, or the training is much less likely to be useful.

Loss of knowledge and skills over time (what the authors call skill decay) is a serious problem, so training should be scheduled close to when the skills will actually be used. When a gap is unavoidable, as it is for infrequently used skills, ongoing proficiency practice is needed. However, the authors remind us that most organizations don’t do this unless it is mandated.

Once again, we know that L&D understands the criticality of this situation, but management doesn’t. Therefore, it is our problem to fix.

Next Steps

Next month, I’ll continue with what the authors found in their meta-analyses of what works before, during, and after training. Please let me know in the comments what you will do with this material, because I very much want to provide you with actionable information.

Further Reading

Salas, E., Tannenbaum, S. I., Kraiger, K., & Smith-Jentsch, K. A. (2012). “The Science of Training and Development in Organizations: What Matters in Practice.” Psychological Science in the Public Interest, 13(2), 74–101.

Salas, E., & Cannon-Bowers, J. A. (2001). “The Science of Training: A Decade of Progress.” Annual Review of Psychology, 52, 471–499. 

Tannenbaum, S. I. (2002). “A Strategic View of Organizational Training and Learning.” In K. Kraiger (Ed.), Creating, Implementing, and Maintaining Effective Training and Development: State-of-the-Art Lessons for Practice (pp. 10–52). San Francisco, CA: Jossey-Bass.

About the Author

Patti Shank, PhD, CPT, is a learning designer and analyst at Learning Peaks, an internationally recognized consulting firm that provides learning and performance consulting. She is an often-requested speaker at training and instructional technology conferences, is quoted frequently in training publications, and is the co-author of Making Sense of Online Learning, editor of The Online Learning Idea Book, co-editor of The E-Learning Handbook, and co-author of Essential Articulate Studio ’09.

Patti was the research director for the eLearning Guild, an award-winning contributing editor for Online Learning Magazine, and her articles are found in eLearning Guild publications, Adobe’s Resource Center, Magna Publication’s Online Classroom, and elsewhere.

Patti completed her PhD at the University of Colorado, Denver, and her interests include interaction design, tools and technologies for interaction, the pragmatics of real world instructional design, and instructional authoring. Her research on new online learners won an EDMEDIA (2002) best research paper award. She is passionate and outspoken about the results needed from instructional design and instruction and engaged in improving instructional design practices and instructional outcomes.
