The ADDIE model or some derivative of it provides designers with the necessary structure for designing any curriculum, regardless of the instructional methods employed. Anything from lecture to Web-based training starts from the same fundamentals—the ADDIE model.
In the ADDIE model, analysis is the input for the system; design, development, and evaluation are the process; and implementation is the output. These elements overlap somewhat, depending on the project, and because the system is dynamic, there will be some sharing of duties.
Here is a brief description of each element.
Analysis is the data-gathering element of instructional design. Here instructional designers assemble all the information they can about the project before considering anything else. Decisions about every aspect of the project must eventually be made. The information gathered at this stage is put to use throughout the system, so designers need every relevant scrap of data to ensure the design will be successful.
Design is the blueprinting stage of instructional systems, during which instructional designers create a blueprint containing all the specifications necessary to complete the project. During this stage, instructional designers write the objectives, construct course content, and complete the design plan.
Materials production and pilot testing are the hallmarks of development. At this stage, most nondesigners begin to see progress. Everything from lecture notes to virtual reality is brought from design to deliverable. Before instructional designers move from development to implementation, it is wise for them to do pilot testing to ensure that deliverables do not have to be redeveloped.
Because of the time and expense involved, no one wants to reprint manuals or recode a technology-based project after it goes into implementation. The pilot testing process allows organizations to make any necessary changes to the project before the expenses associated with materials production are incurred. The time and effort expended in pilot testing are well worth it, if for this reason alone. Pilot testing also helps designers feel confident that what they have designed works.
The most familiar of the elements is implementation. At implementation, the design plan meets the learner, and the content is delivered. The evaluation process that most designers and learners are familiar with takes place in this element. Evaluation is used to gauge the degree to which learners meet objectives and facilitators or technologies deliver the project.
Although evaluation is listed last in the ADDIE model, it takes place in every element and surrounds the instructional design process. Evaluation is a constant guard at the gate of failure. The advantages of using an instructional system are numerous, the most important being the ability to design projects quickly and efficiently. Nothing is left to chance or ignored when a designer stays within the framework of the ADDIE or another ISD model. One possible disadvantage is that designers must be familiar with the ISD process.
The Model at Work
The ADDIE model is best put to use as soon as someone in an organization thinks there is a need for a training course. A description of the way an employee at one company applied the ADDIE model follows. The company, which requested anonymity, provides information technology to manage food and beverage operations at ballparks, stadiums, arenas, casinos, and other establishments in the hospitality industry.
Brian J. Reider, the instructional designer, was considering creating a course for installers, support technicians, dealer representatives, and hardware technicians. The course would give these staff members the information they needed to provide the best possible help to their customers.
Here are the steps Reider followed as he applied the ADDIE model to the creation of a course.
Problem: Reider defined the problem as whether creating a course would allow the company to continue providing the best support for the customer. One consideration he noted was that he was not an expert on hardware.
Analysis: During this data-gathering stage, Reider tried to get answers to the following questions:
- Why do we need this course?
- What makes this hardware so different from other hardware that it needs its own course?
- What information needs to be covered in the course?
He used face-to-face interviews to gather information that would be relevant to the course. He interviewed support technicians (software and hardware), installation technicians, hardware technical writers, members of the hardware research and development department, and the director of Hardware Services to help determine what information was necessary for a technician in the field. He also read support cases to identify some of the major problem areas. He concluded that the following topics should be included in the course:
- how this software and hardware are different from others
- basic knowledge of all components (Parts Identification)
- what parts are replaceable
- how to install the replacement parts
- how to convert one model to the newer model
- how to use some basic troubleshooting techniques.
Design: In this blueprinting stage of instructional design, Reider created observable and measurable terminal objectives for the course. The design took into account the need to create an evaluation later in development. Reider had the subject matter experts review the objectives and provide feedback.
From the objectives, he determined that the best delivery method for instruction would be an instructor-led course with extensive hands-on exercises. He created an organizational chart (similar to a course or topic map) so that he had a graphical representation of the topics and subtopics to be discussed. This helped him group and link different topics to one another. It also allowed him to create the necessary enabling objectives.
Development: Materials production and pilot testing are the key elements of this stage. For developing the course, Reider followed the nine events of instruction, which Gagne developed as a sequence for lesson plan design. Reider also maintained contact with some subject matter experts to ensure that the material he was creating was accurate.
Reider was becoming increasingly knowledgeable about the hardware and was actually able to identify, remove, and replace all of the replaceable parts. While the hardware was disassembled, he and other staff members used a digital camera to photograph the different components. They planned to use these photographs for a parts identification job aid on the company's Website. Reider did not create a formal evaluation for the course because of constraints on the course's length and purpose.
Implementation: The course was implemented soon thereafter.
Evaluation: Before the formal implementation of the course, a pilot class was held. The participants in the pilot class were new hires and members of the training department, none of whom had any knowledge of the hardware. At the end of the pilot class, a focus group was held to obtain feedback on the course. The participants also completed a level one evaluation form.
Revisions to the course were made on the basis of responses from the focus group, responses on the level one course evaluation, and feedback from the instructor. Reider reported that he made minimal changes as a result of the level one evaluation and the focus group. The course received great reviews, and the students enjoyed all the hands-on activity. Reider taught the course at various company offices throughout the United States as well as to company employees in Germany and Hong Kong.