
2018 | Book

How Could This Happen?

Managing Errors in Organizations

About this book

The first comprehensive reference work on error management, blending the latest academic thinking with state-of-the-art industry practice on how organizations can learn from mistakes.

Even today, the reality of error management in some organizations is simple: “Don’t make mistakes. And if you do, you’re on your own unless you can blame someone else.” Most organizations have moved on, but their error management is still often centered on quality control, with Six Sigma Black Belts seeking to eradicate errors in pursuit of an unattainable goal of zero.

But the best organizations have gone further. They understand that mistakes happen, be they systemic or human. They have realized that rather than being stigmatized, errors have to be openly discussed, analyzed, and used as a source for learning.

In How Could This Happen? Jan Hagen collects insights from the leading academics in this field – covering the prerequisites for error reporting, such as psychological safety, organizational learning and innovation, safety management systems, and the influence of senior leadership behavior on the reporting climate.

This research is complemented by contributions from practitioners who write about their professional experiences of error management. They provide not only ideas for implementation but also offer an inside view of highly demanding work environments, such as flight operations in the military and operating nuclear submarines.

Every organization makes mistakes. Not every organization learns from them. It’s the job of leaders to create the culture and processes that enable that to happen. Hagen and his team show you how.

Table of Contents

Frontmatter
1. Fast, Slow, and Pause: Understanding Error Management via a Temporal Lens
Abstract
Managing errors in real time requires errors to be reported in a timely manner. Yet silence, or covering errors up, is a more natural tendency for organizational employees. The author uses a temporal lens to shed light on why and how errors are or are not reported. She suggests we begin to think not just about whether or not errors are reported, but also about how fast (or slowly) errors are reported, and when and why error reporting starts and stops. By focusing on the timing, pace, and rhythm of error reporting, as well as on cultural differences in these respects, Lei highlights a possible error paradox: although rapid actions and responses are needed in the heat of the moment, people need to pause, reflect, and explore in order to successfully report on and cope with errors. As Lei demonstrates, fast action or reporting should not necessarily be discouraged, but we should be alert to the side effects of emphasizing speed over analysis.
Zhike Lei
2. Errors and Learning for Safety: Creating Uncertainty As an Underlying Mechanism
Abstract
If learning is to be encouraged, error and the resulting increase in uncertainty need to be permitted, and even actively sought, even though they may collide with an organization’s concern to prove that it is safe. As the author shows, when decisions are made on how uncertainty should best be managed for particular work processes, stability and flexibility requirements need to be analyzed in view of the specific necessities for control and adaptation. The author makes it clear that uncertainty may be beneficial for safety in situations where there is a danger of the over-routinization of behavior due to highly standardized and repetitive task requirements.
Gudela Grote
3. When Silence Is Not Golden
Abstract
Drawing on a survey of nurses and airline personnel, the authors make it clear that if we want to empower others to speak up and minimize errors and incidents, we need to understand personal perceptions of risk in order to mitigate the worries and fears involved. We must demonstrate that the benefits of speaking up are indeed greater than the perceived personal costs involved. The authors show that speaking up has to be encouraged constantly for people at all organizational levels, and it must happen in an environment that is safe and fully conducive to their input. It is leaders, however, who must help create that environment.
Immanuel Barshi, Nadine Bienefeld
4. Executive Perspectives on Strategic Error Management
Abstract
The authors focus on CEOs leading organizations in which grave errors occurred, errors that challenged the organizations' strategies and endangered their survival. As a result, executives and academics have begun to recognize that strategic error management is a critical part of exercising the highest responsibility in an organization. The authors emphasize that, regardless of whether an organizational error occurs at the top or the bottom of the hierarchy, strategic error management refers to all actions that the top executives of an organization take (or fail to take) in order to disconnect latent errors from actual and potentially adverse consequences and to learn from them.
Vincent Giolito, Paul J. Verdin
5. The Strategic Imperative of Psychological Safety and Organizational Error Management
Abstract
Despite considerable discussion in the management literature about the need for flexible strategies and agile learning organizations, many—if not most—large organizations and their strategy processes remain top-down, slow to change, and fraught with obstacles to learning. A “strategy-as-learning” approach is presented that contrasts with the dominant conception of strategy-as-planning. Conceptualizing and practicing the work of organizational strategy as a learning process implies that strategy is about developing good questions and thoughtful hypotheses to be tested through execution. This produces a mode of operating called execution-as-learning.
Strategy-as-learning requires psychological safety, which enables speaking up, dissenting, error reporting, candidly discussing risks, and practicing organizational error management. Without these behaviors, especially at the executive levels, organizations are at risk of experiencing avoidable strategic failures.
Amy C. Edmondson, Paul J. Verdin
6. Learning Failures As the Ultimate Root Causes of Accidents
Abstract
Drawing on their ample experience in high-risk industries, the authors show that a number of major accidents were preceded by warnings raised by people who attempted, unsuccessfully, to alert actors who had the ability to prevent the danger they perceived. The authors demonstrate that, very often, dissenting opinions and whistleblowers were not heard because of cultures in which bad news was not welcome, criticism was frowned upon, or a “shoot the messenger” attitude prevailed. Meanwhile, performance pressures push systems in the direction of failure and lead organizations to reduce their safety margins. As a result, a “new normal” is established, and no significant problems are noticed until it is too late.
Nicolas Dechy, Yves Dien, Eric Marsden, Jean-Marie Rousseau
7. Understanding Safety Management Through Strategic Design, Political, and Cultural Approaches
Abstract
As the author shows, we need more than one lens when discussing the failure or success of error reporting. Looking through the strategic design lens, reporting should be part of everyone’s job description. Examined through the political lens, reporting systems are intertwined with power, status, and relationships, and they give rise to conflicts that have to be addressed. Reporting becomes a cultural habit when people see others reporting and hear these role models tell stories about the impact their reports have had. The author explains that when people see reporting helping to address common problems, it becomes meaningful in a way that goes beyond a job description.
John S. Carroll
8. Errors and Error Management in Biomedical Research
Abstract
The authors put a much-needed focus on the validity of biomedical research results, which has come under scrutiny. However, as they make clear, errors occur frequently and quite naturally, owing to the complexity of the experiments involved. One way of managing these errors is the “Laboratory Critical Incident and Error Reporting System” (LabCIRS), a software tool for recording all incidents anonymously and for analyzing, discussing, and communicating them. It has been adapted from the CIRS used in the clinical world to improve patient safety in complex, fast-paced, and often understaffed settings.
Ulrich Dirnagl, René Bernard
9. Empowerment
Abstract
The author portrays empowerment and open communication as vital elements of error management. Based on his experience as a physician, he shows that the train-the-trainer method has proven a successful approach to sustainably implementing open communication and thereby empowering teams in medical practice. Further steps include sign-out procedures at the end of every surgical procedure, in which technical details and problems as well as communication issues are discussed. Communication problems and misunderstandings are to be tackled before they become chronic and affect the team climate. The author emphasizes that empowering others requires a form of leadership that moves away from a command-and-control style, works through facilitating, coaching, and guiding, and is open to feedback.
Jan Brommundt
10. Open Error Communication in a High-Consequence Industry
Abstract
In their work as health care professionals, both authors have gained vital experience in error management. They draw our attention to the human tendency to focus only on a single reason when dealing with errors. To mitigate this tendency, they show a two-step alternative process. The first step, a “sequence of events analysis,” is conducted immediately after an accident or near miss. This data capture serves to inform the later, second analysis, called a “focused event analysis.” The focused event analysis is a causal analysis study involving all key stakeholders for the purpose of seeking knowledge about the contributing variables and the steps that can be taken to eliminate system vulnerabilities.
Julianne Morath, Mallory Johnson
11. Confidence and Humility
Abstract
The author analyzes the cultural difficulties we face in situations where we should speak up but do not. In a culture in which speaking up is not the norm, doing so means opposing the rules and standing out. It carries the risk that, through our action, we isolate ourselves to such an extent that we are no longer part of the community. Therefore, if we want to encourage people to speak up, one way is to create a culture in which speaking up is part of the norm. In most cases, this will demand systemic changes, which can only happen if those who have the power to build the necessary culture possess the characteristics it takes, namely confidence and humility.
Robert Schroeder
12. Just Culture
Abstract
The author stresses the necessity of creating a just culture in an organization, one in which frontline operators and others are not punished for their actions, omissions, or decisions, but in which gross negligence, willful violations, and destructive acts are not tolerated. The major prerequisites, he states, are trust and a management that walks the talk, communicates its dedication to a fair process in an open and transparent way, and makes sure that there are no inherent incentives for risky behavior. The reporting system itself must be easily accessible, and reports must be followed up expeditiously to address error-producing conditions. These have to be dealt with, even if they require changes that affect organizational processes and the corporate culture itself.
Helmut Kunz
13. Error Management in the German Armed Forces’ Military Aviation
Abstract
Based on his extensive knowledge gained in leading positions in military aviation, the author demonstrates that reaching, and sometimes even exceeding, the limits of human capacity plays an increasingly crucial role in incidents involving modern and highly complex systems. As a result, the author states, we have to realize that people—even when they are qualified personnel—err and cannot prevail in all critical situations. In addition, one person alone will no longer be able to handle the mass of data contained in highly complex systems. Instead, individual experts must turn into communicative team players. This requires that soft as well as hard skills be continuously trained and evaluated.
Peter Klement
14. Crew Resource Management Revisited
Abstract
The author offers a long-overdue, critical evaluation of the application of crew resource management (CRM) in aviation. Three decades ago, CRM was developed to reduce the hierarchy gradient on the flight deck. The aim was to achieve open, factual exchanges of information and thought processes in order to ensure the safe operation of flights. However, initial results from an ongoing research study the author has conducted with others have shown that most of the interviewed pilots portrayed CRM not as a fixed, integrated part of their procedures for increasing safety but rather as an add-on that ranked below carrying out their mission, safety, and standard operating procedures. In other words, either CRM has to be reshaped or the training needs to be intensified.
Jan U. Hagen
15. Error Reporting and Crew Resource Management in the Israeli Air Force
Abstract
The author addresses the lack of insight into the use of crew resource management (CRM) in military aviation. He shows that some features of CRM are not compatible with the goals of military aviation. Whereas in civil aviation safety and risk avoidance are paramount, the crews of military flight missions are expected to account first for the completion of the mission, and hence to take risks. The more important the mission, the more risk the crew is expected to take. The author makes it clear that an adaptation of the tools and language of CRM for the military is therefore essential, one in which the unique characteristics of military aviation are taken into consideration.
Avner Shahal
16. Lessons from a Nuclear Submarine Mishap
Abstract
Drawing on his experience as a former nuclear submarine commander of the US Navy, the author demonstrates that an organizational structure in which a senior executive is directing processes is fundamentally fragile and susceptible to error. In these systems of order and command, even when those lower in the hierarchy are invited to speak up, the fundamental structure of the hierarchy remains the same, with senior personnel making decisions based upon the information they interpret. The author describes a more resilient approach he used, namely, empowering teams and crews so that their leaders do not have to give (error-prone) orders but can rely on the capabilities of those working with them.
L. David Marquet
17. The War on Error: A New and Different Approach to Human Performance
Abstract
As the author states, errors and their consequences are the products of internal and external conditions that can be seen and controlled in advance by people operating in real-time environments, provided they are armed with the right body of knowledge, tools, and techniques. He explains that everybody has a personal fingerprint of error and tends to make the same kinds of mistakes. Once people understand their personal error pattern, error control becomes a matter of learning new information and applying a new set of tools. The author suggests that the phrase “to err is human” should be replaced by “to improve is human.”
Tony Kern
Backmatter
Metadata
Title
How Could This Happen?
Edited by
Prof. Dr. Jan U. Hagen
Copyright Year
2018
Electronic ISBN
978-3-319-76403-0
Print ISBN
978-3-319-76402-3
DOI
https://doi.org/10.1007/978-3-319-76403-0
