Systemic Lapse Theory: A Universal Framework for Understanding Accident Causation
Author: Paul Harrow, Shield Safety: www.myshieldsafety.com
Contact: paulharrow@icloud.com
Abstract
This article proposes the Systemic Lapse Theory, a universal framework asserting that all major catastrophic events are precipitated by systemic lapses manifested in one or more of three critical factors: lack of rigor, lack of human caring, and normalized deviance. These lapses create vulnerabilities that lead to failures across domains such as engineering, aviation, and industrial operations. Through a qualitative analysis of five historical case studies (the 1986 Challenger Space Shuttle disaster, the 2010 Deepwater Horizon oil spill, the 1981 Kansas City Hyatt Regency skywalk collapse, the 1986 Chernobyl nuclear disaster, and the 1979 American Airlines Flight 191 crash), this study demonstrates the theory’s applicability. The Systemic Lapse Theory offers a concise framework for understanding accident causation and developing preventive strategies. Implications for safety management and directions for future research are discussed.
Keywords: Accident causation, Systemic Lapse Theory, lack of rigor, lack of human caring, normalized deviance, safety science
1. Introduction
Major catastrophic events, including engineering failures, aviation disasters, and structural collapses, consistently reveal human and organizational factors as root causes that transcend technical specifics. While models such as Reason’s (1990) Swiss Cheese Model highlight layered system failures, a more focused framework emphasizing systemic lapses is needed. This article introduces the Systemic Lapse Theory, a novel framework developed by the author, positing that all major catastrophic events arise from systemic lapses manifested in three critical factors: lack of rigor, lack of human caring, and normalized deviance. “Systemic” reflects the interconnected failures across organizational, technical, and cultural systems, while “lapse” denotes the critical oversights that precipitate disasters. This study formalizes the Systemic Lapse Theory, tests its universality through case studies, and explores its implications for safety science.
2. Theoretical Framework
The Systemic Lapse Theory is grounded in three interrelated factors posited to underlie major catastrophic events:
2.1 Lack of Rigor
Lack of rigor refers to insufficient thoroughness, discipline, or adherence to standards in planning, execution, or oversight. This includes inadequate risk assessments, incomplete testing, or failure to follow protocols, creating systemic vulnerabilities.
2.2 Lack of Human Caring
Lack of human caring denotes a disregard for the welfare of individuals affected by decisions, such as employees, passengers, or communities. This often manifests in prioritizing efficiency or cost over safety, undermining ethical safeguards.
2.3 Normalized Deviance
Normalized deviance, as defined by Vaughan (1996), occurs when deviations from safety standards become accepted as normal. Over time, these deviations erode safety margins, increasing catastrophic risk.
The Systemic Lapse Theory asserts that these three factors—lack of rigor, lack of human caring, and normalized deviance—are universally present, alone or in combination, in all major catastrophic events, providing a unified framework for analysis.
3. Methodology
This study employs a qualitative case study approach to test the universality of the Systemic Lapse Theory. Five major catastrophic events were selected for their scale, impact, and diversity: the 1986 Challenger Space Shuttle disaster, 2010 Deepwater Horizon oil spill, 1981 Kansas City Hyatt Regency skywalk collapse, 1986 Chernobyl nuclear disaster, and 1979 American Airlines Flight 191 crash. Each case was analyzed to identify the presence of lack of rigor, lack of human caring, and normalized deviance, using primary sources (e.g., official investigation reports) and secondary sources (e.g., peer-reviewed studies). The analysis focused on human and organizational factors, drawing on safety science and organizational behavior literature.
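To make the coding step concrete, the following minimal sketch (in Python) shows one way the presence or absence of each factor could be recorded and tallied across the five cases. It is purely illustrative and assumes a simple binary coding: the factor and case names come from this article, while the data structures and the factor_summary helper are hypothetical conveniences rather than part of the published analysis.

# Illustrative only: a presence/absence coding matrix for the three factors
# across the five cases analyzed in this study. All entries are True because,
# as Sections 4 and 5 report, every case exhibited all three factors.
FACTORS = ("lack_of_rigor", "lack_of_human_caring", "normalized_deviance")
ALL_THREE = {factor: True for factor in FACTORS}

CODING = {
    "Challenger (1986)": dict(ALL_THREE),
    "Deepwater Horizon (2010)": dict(ALL_THREE),
    "Hyatt Regency skywalks (1981)": dict(ALL_THREE),
    "Chernobyl (1986)": dict(ALL_THREE),
    "American Airlines Flight 191 (1979)": dict(ALL_THREE),
}

def factor_summary(coding):
    """Count how many cases exhibit each factor."""
    return {f: sum(case[f] for case in coding.values()) for f in FACTORS}

if __name__ == "__main__":
    for factor, count in factor_summary(CODING).items():
        print(f"{factor}: present in {count} of {len(CODING)} cases")

In this study the judgments themselves were qualitative and source-based; a structure of this kind would serve only as a bookkeeping aid if the case sample were expanded.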
4. Case Studies
4.1 Challenger Space Shuttle Disaster (1986)
The Challenger explosion, which killed all seven crew members, resulted from the failure of O-ring seals in a solid rocket booster field joint during unusually cold launch conditions. Lack of rigor is evident in NASA’s failure to resolve known O-ring vulnerabilities despite engineers’ warnings (Vaughan, 1996). Normalized deviance occurred as O-ring erosion was repeatedly observed on earlier flights but deemed acceptable. Lack of human caring is reflected in prioritizing launch schedules over crew safety.
4.2 Deepwater Horizon Oil Spill (2010)
The Deepwater Horizon disaster, which killed 11 workers and caused environmental devastation, involved multiple failures. Lack of rigor is evident in inadequate well control procedures and safety system testing (National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling, 2011). Normalized deviance manifested in routine bypassing of safety checks. Lack of human caring is seen in prioritizing cost and speed over worker and environmental safety.
4.3 Kansas City Hyatt Regency Skywalk Collapse (1981)
The collapse of two skywalks at the Kansas City Hyatt Regency Hotel killed 114 people and injured more than 200. Lack of rigor occurred in the design and review process, where a critical change to the skywalk support system, one that doubled the load on the fourth-floor box beam connections, was not adequately analyzed (Marshall et al., 1982). Normalized deviance is evident in the acceptance of unverified design changes as standard practice. Lack of human caring is reflected in prioritizing cost and schedule over guest safety.
4.4 Chernobyl Nuclear Disaster (1986)
The Chernobyl explosion resulted from a flawed reactor design and a poorly executed safety test. Lack of rigor is evident in design and testing protocols that failed to address known instabilities (International Nuclear Safety Advisory Group [INSAG], 1986). Normalized deviance occurred through routine violations of safety protocols. Lack of human caring is inferred from the systemic disregard for operator and public welfare.
4.5 American Airlines Flight 191 Crash (1979)
The crash of American Airlines Flight 191, which killed 273 people, followed the separation of the left engine and pylon during takeoff. Lack of rigor is evident in inadequate maintenance procedures: an engine-removal shortcut using a forklift damaged the pylon structure, and the damage went undetected (National Transportation Safety Board [NTSB], 1979). Normalized deviance occurred as such maintenance shortcuts became accepted despite deviating from the manufacturer’s guidelines. Lack of human caring is reflected in prioritizing cost-saving practices over passenger and crew safety.
5. Discussion
The case studies support the Systemic Lapse Theory, with each event exhibiting lack of rigor, lack of human caring, and normalized deviance in combination. The Kansas City Hyatt Regency collapse demonstrates how lapses in design rigor, together with the normalized acceptance of unverified changes, led to catastrophe. Similarly, the American Airlines Flight 191 crash highlights lapses in maintenance rigor, normalized shortcuts, and cost-driven disregard for safety. These findings, alongside the Challenger, Deepwater Horizon, and Chernobyl cases, indicate that the theory applies across diverse domains.
Potential counterexamples, such as natural disasters (e.g., the 2004 Indian Ocean tsunami), were considered. While the triggering events are natural rather than organizational, their catastrophic impacts are often amplified by systemic lapses, such as lack of rigor in preparedness or normalized deviance in the routine acceptance of known risks. Technological failures, such as the 2018–2019 Boeing 737 MAX crashes, further support the theory, with lapses in testing and certification rigor and the normalized acceptance of flight-control software flaws. These observations suggest that the Systemic Lapse Theory has broad applicability.
The theory offers practical implications for safety management. Organizations can address lack of rigor through process audits, foster human caring via empathetic leadership training, and mitigate normalized deviance with monitoring systems. Behavior-based safety (BBS) programs can help identify at-risk behaviors (Geller, 1984), while design thinking can enhance rigor and caring. Compared with Reason’s (1990) model, the Systemic Lapse Theory provides a concise focus on systemic human and organizational lapses.
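As one concrete, and purely hypothetical, illustration of such a monitoring system, the Python sketch below flags a safety requirement as a potential site of normalized deviance once it has been waived repeatedly. The Waiver record, the requirement identifiers, and the threshold of three waivers are assumptions made for the example; this is a sketch of the idea, not an established tool or a prescription from the literature.

# Hypothetical sketch: flag requirements that are repeatedly waived, since
# repetition (rather than any single waiver) is the signature of normalized
# deviance. Record fields, IDs, and the threshold are illustrative assumptions.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Waiver:
    requirement_id: str   # identifier of the safety requirement being waived
    justification: str    # reason recorded at the time of the waiver

def flag_normalized_deviance(waivers, threshold=3):
    """Return requirement IDs waived at least `threshold` times."""
    counts = Counter(w.requirement_id for w in waivers)
    return [req for req, n in counts.items() if n >= threshold]

# Usage: three waivers of the same (hypothetical) requirement trigger a flag.
log = [
    Waiver("SEAL-TEMP-01", "schedule pressure"),
    Waiver("SEAL-TEMP-01", "prior missions unaffected"),
    Waiver("SEAL-TEMP-01", "erosion within experience base"),
]
print(flag_normalized_deviance(log))  # ['SEAL-TEMP-01']

The design choice worth noting is that the check targets repetition: it treats the accumulation of individually reasonable-looking waivers, not any single decision, as the warning sign.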
6. Limitations and Future Research
While the Systemic Lapse Theory is robust across the analyzed cases, its universality requires testing in contexts like cybersecurity or public health crises. The subjective nature of “human caring” poses challenges for quantitative measurement, necessitating indicator development. Future research should explore the relative contributions of the three factors and evaluate interventions targeting them in high-risk industries. Longitudinal studies could assess whether applying the theory reduces accident rates.
7. Conclusion
The Systemic Lapse Theory provides a robust framework for understanding the root causes of major catastrophic events. By asserting that systemic lapses in rigor, human caring, or adherence to safety norms are universally present, the theory offers a clear lens for analyzing and preventing disasters. Its focus on human and organizational factors positions it as a significant contribution to safety science, with potential to guide proactive safety cultures worldwide.
Declaration of Interest
The author declares no conflicts of interest.
References
Columbia Accident Investigation Board. (2003). Report Volume I. National Aeronautics and Space Administration.
Geller, E. S. (1984). The psychology of safety: How to improve behaviors and attitudes on the job. Chilton Book Company.
International Nuclear Safety Advisory Group. (1986). Summary report on the post-accident review meeting on the Chernobyl accident (INSAG-1). International Atomic Energy Agency.
Marshall, R. D., Pfrang, E. O., Leyendecker, E. V., & Woodward, K. A. (1982). Investigation of the Kansas City Hyatt Regency walkways collapse (NBSIR 82-2466). National Bureau of Standards.
National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling. (2011). Deep water: The Gulf oil disaster and the future of offshore drilling. U.S. Government Printing Office.
National Transportation Safety Board. (1979). Aircraft accident report: American Airlines Flight 191, McDonnell Douglas DC-10-10, Chicago, Illinois, May 25, 1979 (NTSB-AAR-79-17).
Reason, J. (1990). Human error. Cambridge University Press.
Shrivastava, P. (1987). Bhopal: Anatomy of a crisis. Ballinger Publishing.
Vaughan, D. (1996). The Challenger launch decision: Risky technology, culture, and deviance at NASA. University of Chicago Press.