Is the Normal Accidents perspective falsifiable?

Antti Silvast
Department of Social Research, University of Helsinki, Helsinki, Finland, and
Ilan Kelman
Center for International Climate and Environmental Research – Oslo (CICERO), Oslo, Norway

Abstract

Purpose – This article is motivated by debates regarding Charles Perrow's Normal Accidents perspective, which describes how technological systems are prone to failure if they have complexity and tight coupling. The purpose of this paper is to explore Normal Accidents conceptually to understand whether or not it might be feasible to disprove it or to find counterexamples.

Design/methodology/approach – The approach taken by this article is to identify and explore assumptions inherent in Normal Accidents which might make the perspective non-falsifiable.

Findings – The findings and discussion cover two principal assumptions inherent in Normal Accidents. First, no past record of the absence of a Normal Accident excludes the possibility of a future Normal Accident. Second, analysis of a Normal Accident is always relative to the selected definition of the system, but a system can potentially be defined so that there was or was not a Normal Accident.

Practical implications – Although the Normal Accidents perspective does not appear to be falsifiable, the perspective should still be taught and considered when designing and operating technological systems. The reason is that, even if Normal Accidents is a truism, it is not accepted as such, meaning that society is setting itself up for continual catastrophic failures of technology.

Originality/value – This article digs beneath the Normal Accidents perspective and its discussants to explore why the perspective is so important yet so rarely implemented in practice.

Keywords Complexity, Coupling, Normal accidents, System accidents, Technology, Systems analysis, Disasters

Paper type Conceptual paper

How normal are "Normal Accidents"?
In 1984, sociologist and organizational scholar Charles Perrow published the book Normal Accidents: Living With High Risk Technologies (Perrow, 1984), which he updated 15 years later (Perrow, 1999). The book drew attention to various human-built systems – including nuclear power plants, airplanes, and ships – consisting of multiple, interrelated components. The book's particular interest was these systems' catastrophic potential. According to Perrow (1984, 1999), system catastrophes stem from two system traits. The first trait is complexity. If a system has many components and they fail in sequence or simultaneously, the consequence could be a catastrophic failure destroying the system. Examples are nuclear power plant explosions and airplane crashes. Such a failure might be unstoppable because of the complexity, since so many components are involved. The failure might also be unexpected, or even entirely inconceivable to the system's designers and operators prior to the disaster, because so much happened to so many components. That, again, is the manifestation of complexity.

The current issue and full text archive of this journal is available at www.emeraldinsight.com/0965-3562.htm

Disaster Prevention and Management, Vol. 22 No. 1, 2013, pp. 7-16. © Emerald Group Publishing Limited, 0965-3562. DOI 10.1108/09653561311301934
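Perrow's complexity argument is qualitative, but its combinatorial core can be illustrated numerically. The following sketch is ours, not Perrow's, and the per-component failure probability is a hypothetical illustrative value; it shows how both the chance that at least one component fails and the number of pairwise component interactions a designer would need to anticipate grow with system size:

```python
from math import comb

def p_any_failure(n, p=0.001):
    """Probability that at least one of n independent components fails,
    given per-period failure probability p (illustrative value)."""
    return 1 - (1 - p) ** n

for n in (10, 100, 1000):
    # comb(n, 2) counts the distinct component pairs whose interactions
    # a designer would have to foresee -- it grows quadratically with n.
    print(n, round(p_any_failure(n), 3), comb(n, 2))
```

Even with independent, individually reliable components, some failure becomes near-certain as the system grows, while the space of possible component interactions outruns what any designer can enumerate – a simplified gloss on why complex-system failures can appear inconceivable in advance.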