Mission success is achieved when lessons learned are shared not just within a project team, but across the entire organization.
A coherent KM strategy and implementation are vital to addressing how knowledge is created, retained, shared, and transferred throughout a federal organization and with its partners and contractors. This requires dynamic, contextual learning that supports the effective transfer and use of knowledge throughout the organization.
NASA's driving motivation concerning knowledge is ultimately mission success. This emphasis is inevitably shaped by complexity in its many forms, such as:
- continuous adjustments in organizational lines of authority and responsibility that alter project risk
- technical complexity across and within multiple systems
- system effects driven by change in terms of people, funding, priorities, support systems, and technological maturity
- interface management of the core and extended team, as well as stakeholders and customers at multiple levels of involvement and engagement.
In this complex environment, there are many definitions and views on what knowledge means and consists of. For NASA it involves unique requirements, solutions, and expertise shared across individuals, teams, projects, programs, mission directorates, and centers.
In the traditional sense, the knowledge being shared can be broken down into codified knowledge (scientific knowledge, engineering and technical knowledge, and business processes) and know-how (techniques, processes, procedures, and craftsmanship). Both are critical for mission success and, in our experience, significant improvements are gained through efforts that focus on the capture and flow of knowledge.
In a nontraditional sense, other types of knowledge play a significant role in projects, such as the social dimension of knowledge that allows individuals and groups to achieve success within the organizational context. A better understanding of the social context of project knowledge can serve as a basis for improved prioritization and a more pragmatic approach to problem solving.
Another example of nontraditional knowledge is the way human cognition is colored by natural biases, such as those formed through experience and culture. NASA is a complex technical organization comprising several divergent domestic and international cultures with different perceptions. Understanding these perceptions is important for the success of NASA's projects, especially since 80 percent of its programs and projects are international in nature.
NASA knowledge services evolution
Defining events often shape organizational change. For example, the Challenger tragedy forced NASA into brutal introspection and action, creating a program and project management initiative designed to promote project management excellence and competency in advance of NASA's needs through improved individual training and development services.
At the time, NASA was still a leader in managing large, expensive, long-duration projects that offered technologically challenging programs, which allowed practitioners to engage in a natural progression of learning. Preparation of talent depended on the amount of time spent to gain professional experience and a reliance on "greybeards"—the previous generation of project talent—serving as mentors, coaches, and guides. Challenger led to individual preparation becoming systematized, codified, and improved.
Another defining event for KM at NASA was the set of 1999 Mars mission failures (Mars Climate Orbiter, Mars Polar Lander, and the Deep Space 2 probes), all managed under a paradigm called "Faster, Better, Cheaper." These failures challenged the status quo as Challenger had, resulting in a focus on shared stories, new policy guidance, and a more disciplined approach, including improved hardware and software testing in science missions that did not involve crew safety issues.
In terms of technical workforce development, NASA realized that projects and programs happen in teams of talented people, not individuals. Team development thus became a focus for the agency.
The Columbia mishap echoed the Challenger disaster. Foam that detached during ascent damaged the wing's leading edge, but NASA did not openly examine technical options. Assumptions were made that were technically indefensible, such as the belief that the leading-edge materials were tougher than the thermal tiles and could not easily be compromised. Communications and interpersonal dynamics proved ineffective, and the Shuttle Program heard but did not listen to engineering and safety concerns.
Again, the agency went through another brutal introspection that drove a relearning of lessons from case studies, new multidiscipline knowledge-sharing forums, and major governance and policy changes. These changes resulted in the creation of the NASA Engineering and Safety Center, a rebalancing of power in NASA governance of technical missions, an emphasis on defining technical authority in mission decisions, and the adoption of mechanisms to improve communications and interpersonal dynamics and to defeat phenomena such as "organizational silence" (the tendency to say or do little despite significant organizational threats). NASA realized it had to take an integrated KM approach to ensure mission success.
KM in NASA today
In 2011, the NASA Aerospace Safety Advisory Panel reported that the agency needed to create a more systematic approach to capturing implicit and explicit knowledge, and recommended the appointment of a formal agency-level chief knowledge officer, supported by a set of appointed chief knowledge officers at each center and mission directorate. NASA concurred with this advice and appointed the recommended personnel in 2012.
This focused the knowledge services effort into the office of the chief engineer, further evolving its functions toward serving as an enterprisewide project management office. This office created a structure responsible for developing and implementing the strategy, policy, standards, workforce development, advanced concepts, mission architecture, integration across program and mission boundaries, and program assessment for overall technical workforce development that supports project and program success at an enterprise level.
Any KM approach for NASA needs to be adaptable and flexible enough to accommodate the varied requirements and cultural characteristics of each center and mission directorate. A federated model was the best fit for the agency, defining the NASA chief knowledge officer and deputy chief knowledge officer as facilitators and champions for agency knowledge services. Through extensive collaboration with all customers and stakeholders in the knowledge process, the NPD 7120 NASA Knowledge Policy for Programs and Projects was rewritten to reflect the fact that NASA had greatly expanded its knowledge activities over the past several years to include a wider array of services than simply archiving lessons learned.
This new policy defines a set of strategic imperatives that target NASA objectives for knowledge and emphasizes the development and implementation of future knowledge initiatives, measures, and metrics:
- In terms of people, sustain and expand the use of the agency's intellectual capital across NASA's enterprises and generations through better networks, alliances, and communities of practice.
- In terms of people, increase collaboration across organizational barriers through promotion of a culture of openness.
- In terms of systems, support the technical workforce in executing NASA's missions efficiently and effectively through lessons learned, mishap reports, and promulgation of best practices.
- In terms of systems, create a marketplace for knowledge that identifies the value of information and aligns practitioner and organizational imperatives through accessible information and user-friendly services.
The knowledge community recognized that knowledge takes many different forms in the agency. Some knowledge can be found through self-service, such as typing a query in a search box and getting answers that point the practitioner in the right direction. This only involves one person at a time, and works with explicit knowledge that does not require much context or personal judgment.
At the other extreme, how do practitioners, for example, learn to make go or no-go launch decisions when faced with data that raise uncertainties in the decision-making process? That kind of tacit knowledge is dependent on context and personal judgment, and requires social interaction between people.
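The self-service end of this spectrum can be illustrated with a minimal sketch. The repository entries and the keyword-overlap scoring below are purely hypothetical, not any actual NASA system; the point is only that explicit knowledge can be retrieved by a single practitioner without social interaction.

```python
# Minimal sketch of self-service retrieval over explicit knowledge:
# rank lessons-learned entries by keyword overlap with a query.
# The entries and the scoring scheme are illustrative, not a NASA system.

def search_lessons(query, lessons):
    """Return lessons ranked by how many query terms each contains."""
    terms = set(query.lower().split())
    scored = []
    for lesson in lessons:
        overlap = len(terms & set(lesson.lower().split()))
        if overlap:
            scored.append((overlap, lesson))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [lesson for _, lesson in scored]

lessons = [
    "Verify unit conversions at every software interface",
    "Test flight software against off-nominal sensor data",
    "Document waiver rationale before design review",
]

# The most relevant lesson surfaces first.
print(search_lessons("software testing sensor data", lessons))
```

A real system would add stemming, synonyms, and metadata filters, but even this toy version shows why such search works only for knowledge that needs little context or personal judgment.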
One of the most striking observations that NASA's knowledge community made in February 2012 was the sheer depth and breadth of activity already under way across the centers and mission directorates. Given the range of possible knowledge activities from self-directed queries to social interactions such as sharing stories with each other, they identified a set of categories that addressed most of the activities taking place across NASA that could be populated on the first-ever Agency Knowledge Map:
- Online tools—Include but are not limited to portals, document repositories, collaboration and sharing sites, and video libraries.
- Search, tag, and taxonomy tools—Dedicated search engine for knowledge (for example, Google Search Appliance) and any initiatives related to metatagging or taxonomy.
- Case studies and publications—Original documents or multimedia case studies that capture project stories and associated lessons learned or best practices.
- Lessons learned/knowledge processes—Any defined process that an organization uses to identify or capture knowledge, lessons learned, or best practices.
- Knowledge networks—Any defined knowledge network (such as a community of practice, expert locator, collaboration activity, and workspaces) designed to enable exchanges and collaboration.
- Social exchanges—Any activities that bring people together in person to share knowledge, such as forums and workshops. The reach of these activities can be multiplied through online tools such as videos and virtual dialogues.
These are by no means the only possible categories, and they are not a perfect fit for every type of knowledge activity. Aware that the perfect is the enemy of the good, the knowledge community used them as a starting point to be modified during iterative reviews.
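A knowledge map built on these categories can be sketched as a simple registry. The center names and activities below are hypothetical, and this is not how NASA's actual Agency Knowledge Map is implemented; it merely shows how activities could be grouped under the six categories.

```python
# Illustrative sketch: the six knowledge-map categories from the text,
# used to group hypothetical (center, category, activity) records.
from collections import defaultdict

CATEGORIES = [
    "Online tools",
    "Search, tag, and taxonomy tools",
    "Case studies and publications",
    "Lessons learned/knowledge processes",
    "Knowledge networks",
    "Social exchanges",
]

def build_knowledge_map(activities):
    """Group (center, category, activity) records by category."""
    knowledge_map = defaultdict(list)
    for center, category, activity in activities:
        if category not in CATEGORIES:
            raise ValueError(f"Unknown category: {category}")
        knowledge_map[category].append((center, activity))
    return dict(knowledge_map)

activities = [
    ("Center A", "Online tools", "video library"),
    ("Center B", "Social exchanges", "knowledge-sharing forum"),
    ("Center A", "Knowledge networks", "systems engineering community of practice"),
]

kmap = build_knowledge_map(activities)
print(sorted(kmap))
```

Rejecting unknown categories mirrors the community's decision to start from a fixed, imperfect set and revise it deliberately rather than letting categories proliferate ad hoc.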
A closing challenge
There are fundamental challenges in creating a knowledge system that is efficient and effective. Consider this story based on reality, but changed to protect the innocent.
A program is undergoing a design review. The program is a complex aerospace mission that is sending a scientific payload on a rocket with a 10-year journey to Uranus. During the program review, questions are raised about engineering mishaps. The discussion centers on methods for addressing the mishaps and how effectively the program captured the lessons and shared them to mitigate future risks.
The team members talk with great excitement about their commitment and approach to knowledge capture and lessons learned. Then a review team member asks, "What is being done to ensure these lessons are being formally and systematically captured and made accessible across the whole organization?" Silence envelops the room. Finally, the program team indicates that there is no system that effectively captures and shares these critical lessons.
We return to the still-unresolved question that began this article: Where does your technical workforce go to find out what they don't know?
This troubling vignette continues to occur today within NASA, even with all of the progress toward developing an integrated KM system. Can such lessons be found in five or 10 years? This issue of access and search is the greatest challenge.
Modern organizations and projects are committed to identifying and sharing critical knowledge and lessons within a team, but it is increasingly hard to find such knowledge across a system of systems and across project boundaries. NASA has learned the power of sharing and learning within our project teams, but we falter when knowledge should travel across larger boundaries. In an increasingly complex and interconnected world, this integration will be the difference between success and failure.
Biases That Accompany Decision-Making Processes
In our interview with Nobel Prize–winning scientist Daniel Kahneman on his recent New York Times bestseller Thinking, Fast and Slow, he clarified how humans address increasing levels of complexity in the project environment through heuristics that can introduce errors into decisions. Some examples follow.
- Availability—making judgments on the probability of events by how easy it is to think of examples and their consequences.
- Substitution—substituting a simple question for a more difficult one.
- Optimism and loss aversion—generating the illusion of control over events and fearing losses more than we value gains.
- Framing—choosing the more attractive alternative if the context in which it is presented is more appealing.
- Sunk-cost—throwing money at failing projects that already have consumed large amounts of resources to avoid regret.
- Mental filter—focusing on one feature of something that influences all subsequent decisions.