
Causes of Poor Learning Programs

There are many reasons why organizations settle for boring and ineffective learning programs. A few that come to mind are that organizations:

• fail to manage project risks effectively
• spend too much of a project’s resources and energy on upfront analysis
• spend too little time and energy exploring alternative design options
• focus on content presentation, accuracy, and comprehensiveness (instead of the learning experience)
• fail to involve sponsors, stakeholders, and learners throughout the design process
• yield to design by committee and the opinions of organizationally dominant individuals
• employ outdated methods.

Managing Risks

Traditional training design methods are quite linear, segmented processes that require the completion and approval of work done at each phase before moving forward to the next. Managers must review and approve design and specification documents, for example, before development can commence. These methods seek to limit risks the team faces at each subsequent phase of the process. Approval at the conclusion of each phase releases the team, it is hoped, from any liability should the final product not provide the expected results or meet learners’ needs.

The requirement within these traditional processes to gain explicit approval drives training departments to create designs that are readily communicated through specification documents or storyboards, both of which are notorious for inviting multiple interpretations, no matter how detailed they may be. Designers often go to extreme lengths to avoid ambiguity, yet subsequent developers and approvers frequently envision different final products. Presenting stakeholders with a final product that is not exactly what they thought they approved can cause considerable discord and disruption. Therefore, if a learning event is hard to describe, teams will set the idea aside rather than risk disapproval and failure. Learning experiences with the greatest potential are particularly susceptible to being discarded because they can be difficult to describe clearly, require exploration before they can be evaluated, and are likely to result in extended discussion—all costly risks.

Analysis Paralysis

Design documentation typically begins with the justification of why instruction is being developed. Early in traditional processes, the focus is on complete and accurate analysis of the need and alternative solutions. Regardless of the time and money spent, however, this analysis is often incomplete and inaccurate—almost by necessity, because so much “nonproductive” time can be spent on it, relevant data are hard to acquire, available information is often misleading, and situations change while the analysis is underway. For example, the analysis may be based on ineffective learning programs currently in place. Erroneous and misleading conclusions such as “e-learning doesn’t work for soft skills” can easily be reached when the real problem was with course design rather than delivery medium. Deep analysis is difficult, expensive, and time consuming. When analysis takes a big chunk out of available resources, the resulting learning program may have to be reduced to something very basic and perhaps quite unengaging.

Nothing but the Facts
Many people have grown up with and succeeded in spite of “tell-and-test” instruction—the typical approach of exposing learners to a body of information and then testing for retention. They have listened to years of instructor presentations, read shelves of books and articles, watched hours of videos (or filmstrips), done countless hours of homework, and taken many quizzes and tests. The pattern is simple. Information is supplied over a period of time preceding a quiz or test constructed of multiple-choice questions. Repeat.

The process is nicely manageable. Everyone starts on the same day, finishes on the same day. Grades indicate varying outcomes. And there will be varying outcomes, of course, because not everyone can learn the same amount in the same time. Instructors present an amount of content that can be learned by most. For some learners, it’s not enough; for others, it’s too much.

This is probably not the time or place to assail this “tried-and-true” method, but the obvious variance in results it achieves should raise questions. Does it matter whether a portion of learners fail? Does it matter whether learners can actually do anything better or differently? Does it matter whether learners were productive with their time and are eager to perform? Does it matter that some learners wasted their time because they only needed a small part of the content, if any?

Although it can be harder than one might think to agree on what content to present or even what the “facts” are, a basic pattern of presenting information followed by testing for comprehension or retention is an easy design for approvers to understand. Preparation of the content is as straightforward as it can be. Agreement and approvals can be won in short order. Off we go. Yet you may find learners and instructors thinking, I wish our courses weren’t so boring.

Too Many Facts
To reach consensus with the tell-and-test paradigm described above, we just have to add in all the facts that everyone feels are relevant. The focus becomes being thorough and complete. Exhaustive is a better word. Get it all in there. Legal needs to add more. Sure. Can’t fight legal.

Training designers know that their two biggest problems are likely to be 1) having to deal with too much content—each and every piece of which has a staunch advocate, and 2) getting timely approvals throughout the process. Nobody wants to sign off on the work until it’s done. When it’s done, people want to make changes. The most likely change: Add more content.

What Learners Want and Need
I’ve come to expect it now, but most organizations are certain they know what interests and doesn’t interest their learners. They “know” what their learners know and don’t know. They “know” what learners can and can’t do. They “know” what their learners like and dislike, how they’ll respond, and what they’ll do. They know. We don’t need to waste time involving learners. Besides, they don’t know what they don’t know. We know. They are so certain they know learner needs and preferences that they adamantly refuse to involve learners in the process of designing learning solutions. The problem is that they are rarely—very rarely—correct. It’s a huge mistake not to involve learners in the design of learning programs. Huge.

Anyone Can Do This

Instructional design is a profession. Not everyone gets the important fact that designing good instruction requires considerable knowledge, skill, and practice. A frequent cause of poor learning programs is that individuals are involved who, recalling their many experiences as a student, have strong opinions about what makes good instruction. Simply having been a student makes them no more of an instructional designer than having had their teeth cared for makes them a dentist.

For project leaders to be advocates of professional design, it helps to have examples of both poor and excellent instructional programs. Contrasts between the two can convince project leaders more readily than simply showing excellence. Demonstrating these programs can quell dominant personalities who think that anyone can design training—that putting together a presentation and a document filled with content will cause learners to change their performance—and encourage others to speak up in favor of competent approaches.

It also helps to be something of a psychologist—to be able to draw out opinions from those who just agree to avoid tension and quiet those who insist on voicing their every opinion. Regardless of the leader’s talents and team composition, it helps to have a process that can make alternatives evident and equalize opinions so the goal of just coming to an agreement doesn’t substitute for the goal of creating effective learning experiences.

We Have Our Process

While every project has risks to manage, information to gather, a tendency to present information—too much information—with minimal context, opinionated persons to manage, and a readiness to assume what learners know, need, and want, organizations also tend to believe in and hold tightly to the process they are using. Departures from a familiar process can intimidate and frighten even the most adventurous and those who seem ready to try something different. But many organizations are using outdated methods that, on close inspection, may actually exacerbate common project problems rather than resolve them.

Threatening and disorienting as change can be, it seems there are many reasons to aspire to something better.

This is an excerpt from Chapter 1 of Leaving ADDIE for SAM: An Agile Model for Developing the Best Learning Experiences, an ASTD Press publication.

About the Author
Michael W. Allen is chairman and CEO of Allen Interactions, whose studios build universally acclaimed custom e-learning, provide strategic learning consulting, and train e-learning professionals in collaboration with ASTD. With a PhD in educational psychology from The Ohio State University, Dr. Allen has pioneered multimedia learning technologies, interactive instructional paradigms, and rapid-prototyping processes, bringing each forward into leading corporate enterprises. He is an adjunct associate professor at the University of Minnesota Medical School, a popular conference speaker, and a prolific writer. In May 2011, he received ASTD's Distinguished Contribution to Workplace Learning and Performance Award. In May 2012, Allen was selected by The National Ethnic Coalition of Organizations (NECO) Advisory Committee as a recipient of the 2012 Ellis Island Medal of Honor. Through a proven process and skills born from experience, Allen Interactions delivers exciting interactive solutions that enhance knowledge, skills, and performance. It is one of the most successful and highly regarded providers of custom e-learning, training, and consulting, with studios in Minneapolis, Tampa, and San Francisco.
About the Author
Vice President, Training and Marketing, Allen Interactions  Richard H. Sites is the vice president of training and marketing at Allen Interactions, where he leads the strategic vision of Allen Interactions’s custom development learning services, training and outreach, and authoring system, ZebraZapps. He also oversees the awareness of the company’s advanced design and development approaches created by Michael Allen: CCAF-based design and the SAM process for iterative, collaborative development. Richard has more than 20 years of experience designing and delivering learning solutions to support improved workplace performance for many Fortune 500 companies in both academia and private industry. He has held the positions of vice president, client services and studio executive for Allen Interactions’s Tampa studio. Richard travels the country speaking to groups and organizations on the value of SAM, the importance and power of engaging, performance-changing learning experiences, and other topics related to the design and development of high-quality training. He is the co-author of ATD’s bestseller Leaving ADDIE for SAM and the Leaving ADDIE for SAM Field Guide. Richard holds a doctorate of education from the University of West Florida along with a master of education and a bachelor of business administration. He is also a frequent blogger on the Allen Interactions E-Learning Leadership blog.  