ATD Blog

Q&A With Ken Spero: Simulations and Experience-Based Learning

Wednesday, January 28, 2015


Ken Spero is a senior learning strategist. In his role, Ken continues his work in Experience Design to help clients shift the focus from “what” leaders think to “how” they think, using a systems thinking and immersive learning approach to improve strategic thinking, decision making, and performance.


Q: When you talk about simulations, what do you mean?

KS: We all know that experience is the best teacher. However, companies today simply don’t have the time or the wherewithal to allow their people to learn through the “school of hard knocks.” So simulation provides experience and provides it in a context where the participants are forced to think critically and make decisions. Then they have the opportunity to experience the consequences. Simulation is about capturing and deploying experience, primarily personal experience, and that’s where simulation can be the most powerful.

Q: Where is this simulation approach most effectively used in organizations?

KS: When we think about simulations and this idea of experience, we focus on what we refer to as “initiative implementation.” When an organization is implementing an initiative, for instance an ERP system, Six Sigma, or a CRM, the company is requiring its employees to do new things, better or differently, which may include acquiring new skills or changing behaviors. The sweet spot for this kind of simulation is creating both a skill development piece and a behavioral change piece, because the organization is fundamentally changing the way it operates.

Simulation can be a very key tool to help organizations become more competitive, especially in an instance where they’re looking to do something that they’ve never done before. As we think about the way the mind works, we are often challenged with sifting through relevant experiences and leveraging them to determine how best to act.

But what happens when you don’t have any relevant experiences to draw upon? The outcomes can be unpredictable because we don’t have a sense of how to effectively proceed. When using simulation in the context of implementing an initiative, what we are enabling our clients to do is populate the portfolio of experiences that they will be drawing upon. So as they make decisions, they will be able to think more broadly and have an expanded experience-base upon which to rely.

Oftentimes, initiative implementations are very contextual, where just instructing or providing direct learning experience is not necessarily going to help. Issues frequently do not present themselves in exactly the way that they are presented in the training content. Simulations allow you to capture and present those often frustrating “just happens” of life and facilitate faster, more effective decision making.

Q: How do you distinguish this “experiential” approach from other forms of training?

KS: In the realm of training, it’s important to distinguish between instances where instruction is sufficient or where experience-based training is required. When it’s really process-oriented, training can be a very effective tool. However, when the application of skill requires context and depends on the nature of the people with whom you’re interacting, those are the times where experience is going to be more effective as a tool for teaching than just straight-forward instruction.

When we get into providing or “bottling” experience, it allows you to leverage the power of storytelling. A good story can be extremely engaging, and people end up seeing themselves in the story as it plays out. They frequently confess that they “felt” the consequences. Consider the Harry Potter books: seven very lengthy volumes, yet people read them all because they became engaged by a good story, and they clearly recall its significant elements. Similarly, relevant stories set in a familiar business context, with realistic decision points and relevant consequences, find their way into the experience portfolio and are readily retrievable during initiative implementation.


Q: You mentioned bottom-line impact and measurement, which has become the focal point of training for companies across industries. How does using this model make that possible? 

KS: One of the key aspects of simulation is measurement. When you’re going through an experience or simulation, part of the feedback process is going to be a scorecard. When designing a simulation, one of the first steps is to clearly articulate that scorecard. It starts with an orientation toward measurement: the scorecard captures the observable behaviors the client wants participants to demonstrate when implementing the initiative. This makes simulations valuable from the development perspective, because oftentimes clients will not have thought through the initiative from an observable-behavior perspective.

The goal of every initiative is to improve speed, efficiency, or quality; these are the quantifiable elements that are important. But how will that goal manifest in the behavior of the employees who are tasked with doing these things? In designing a simulation, you are forced, in essence, to align and articulate the observable behaviors necessary for initiative success, the ones that should be demonstrated by anyone going through the experience. There is value in being able to articulate those things, because that’s what is going to be measured in the simulation.

Simulation also provides failure as an option. You can actually do things wrong, and that is also a very powerful learning opportunity. You come to understand why you shouldn’t do that, and when faced with this type of situation in real life, you will think twice because of your simulation experience. By starting with the scorecard, the building blocks for ultimately measuring the outcomes that you’re looking for are clearly articulated. By taking it from that perspective and identifying what these observable elements are, one can readily establish benchmarks against which to view and quantify improved efficiency and effectiveness.

One of the key challenges in measurement is often simply understanding the units of measure for what you are looking to produce. The process begins by identifying those elements; from there you can relate them to measurable units and arrive at a quantifiable goal.

Q: What are some examples where you have done this?   

KS: A couple of good examples come from compliance management in the pharmaceutical industry. We worked with Schering-Plough, who decided that compliance is, at its core, a behavioral issue. With so much training already occurring in the compliance arena, the problem is rarely knowledge; oftentimes it’s the behaviors that go with it. For example, a sales rep is out in the field and a doctor asks them to invest in something.

As a salesperson, your first inclination is going to be to say “yes,” and you might not think of the consequences of that response. Simulations allow the organization to model the behaviors they would like their people to demonstrate whenever such situations arise. The feedback that comes from the field is often, “This is the best training I’ve ever gone through.” That sentiment is rare for compliance training, which can be dry and dull; simulation makes it fun.

Q: So for someone unfamiliar with simulation, it may sound very difficult and time consuming to implement. What would you say to people with those concerns? 

KS: This is a key challenge, because it does sound complicated and, in the past, has been costly. One of the things we’ve done is significantly alter the development process, because it requires a different way of thinking. It is actually not that time-consuming or costly to produce; it’s fairly straightforward. The reason it seems daunting is that when you try to fit simulation into traditional instructional design methodologies, it can be very complicated: simulation does not necessarily sit nicely inside the principles and processes used to develop quality classroom interaction or computer-based instruction.


So we have created something we call “Experience Design.” It requires less specific expertise than instructional design because all you really need to know is how the experience itself manifests. If you can describe a “week in the life” of whoever your target audience is, that’s the capability required to put together a simulation. A simulation does not revolve around starting with competencies or learning objectives. It starts out with what needs to be done. 

Let’s say it’s leadership development. If I can describe what a good leader does, then I should be able to observe those learning objectives or competencies naturally in the way the leader executes. If it’s a real experience that’s relevant, I should be able to observe, based on the decisions participants make, the direction they follow in the simulation. I still have to define what the metrics are, but it’s all about the experience, rather than trying to figure out how to make “trust” show up as a learning objective.

We live through these scenarios on a day-to-day basis and our actions should demonstrate whether we get it or not. So the challenge is to articulate the experience which is the starting point rather than try to build things around specific competencies and learning objectives. 

Q: Why is this model worth the cost to companies that have many options in the training space? 

KS: One of the things we’ve found with Experience Design is that simulation is not the answer in and of itself. Simulation is most powerful when it is used as part of a blended approach where you first learn key concepts, and then have a chance to apply them in a real-world setting. 

Initiative implementation is complex: the content, the software, and the way the process itself needs to be implemented all require instruction. Simulation then plays the role of the experiential component, and it makes everything else around it that much better. If you provide content in advance and then provide the simulation, the experience makes the content more effective, and it makes you ready for the next step because you want to know more. It creates an engaged learner who is primed for additional instruction.

If one looks at the literature and case studies that talk about failure to achieve the targeted budget or time requirements, usually it’s not about the content, it’s about the behaviors associated with implementing—whether it’s change management or some other aspect of resisting change. Simulation allows you the opportunity to increase the probability that you will achieve the success that you targeted with an initiative. 

These are not your parents’ simulations; they’re not nearly as expensive or time-consuming to produce. The entry point for using simulation is much lower and the expertise comes from the organization itself.    

For more on simulation-based learning, register for ATD’s Essentials of Experiential Learning and Simulations, facilitated by Ken Spero.  

Note: This post is excerpted from the Future Pharma article, “Solve the Training Catch-22.”   

About the Author

The Association for Talent Development (ATD) is a professional membership organization supporting those who develop the knowledge and skills of employees in organizations around the world. The ATD Staff, along with a worldwide network of volunteers work to empower professionals to develop talent in the workplace.
