From Wikipedia, the free encyclopedia

The Swiss cheese model of accident causation illustrates that, although many layers of defense lie between hazards and accidents, there are flaws in each layer that, if aligned, can allow the accident to occur.

The Swiss cheese model of accident causation is a model used in risk analysis and risk management, including aviation safety, engineering, healthcare, emergency service organizations, and as the principle behind layered security, as used in computer security and defense in depth. It likens human systems to multiple slices of Swiss cheese, each with randomly placed and sized holes, stacked side by side, in which the risk of a threat becoming a reality is mitigated by the differing layers and types of defenses which are “layered” behind each other. In theory, a lapse or weakness in one defense therefore does not allow a risk to materialize, since other defenses also exist (i.e. other slices of cheese) to prevent a single point of failure; an accident can occur only when a hole in one slice aligns with holes in all the other slices of the stack. The model was originally formally propounded by James T. Reason of the University of Manchester, and has since gained widespread acceptance. It is sometimes called the “cumulative act effect”.

Although the Swiss cheese model is respected and considered to be a useful method of relating concepts, it has been subject to criticism that it is used too broadly, and without enough other models or support.[2]

Holes and slices

Emmental cheese with eyes. Each slice will have holes of varying sizes and positions.

In the Swiss cheese model, an organisation’s defenses against failure are modeled as a series of imperfect barriers, represented as slices of cheese, specifically Swiss cheese with holes known as “eyes”, such as Emmental cheese. The holes in the slices represent weaknesses in individual parts of the system and continually vary in size and position across the slices. The system produces failures when holes in all of the slices momentarily align, permitting (in Reason’s words) “a trajectory of accident opportunity”, so that a hazard passes through holes in every slice, leading to a failure.[3][4][5][6]
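The alignment mechanism can be illustrated with a small Monte Carlo sketch. This is an illustration only, not Reason’s own formulation: the function names are ours, and the assumption that every layer fails independently with the same probability is a simplification.

```python
import random

def accident_occurs(num_layers: int, hole_prob: float, rng: random.Random) -> bool:
    # An accident requires the hazard to find a hole in every slice at once.
    return all(rng.random() < hole_prob for _ in range(num_layers))

def estimate_accident_rate(num_layers: int, hole_prob: float,
                           trials: int = 100_000, seed: int = 42) -> float:
    # Estimate how often all layers fail simultaneously.
    rng = random.Random(seed)
    hits = sum(accident_occurs(num_layers, hole_prob, rng) for _ in range(trials))
    return hits / trials

# Under these assumptions, four independent layers that each fail 10% of
# the time give an expected accident rate of 0.1 ** 4 = 0.0001 -- each
# added slice multiplies the residual risk down, which is the intuition
# behind layering defenses.
```

The key design point the sketch captures is that no single slice needs to be perfect: the stack as a whole blocks the hazard unless the holes happen to line up.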

Frosch[7] described Reason’s model in mathematical terms as a model in percolation theory, which he analyses as a Bethe lattice.
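The percolation reading can be sketched mathematically. The formulation below is illustrative and not quoted from Frosch; it assumes independent slices, and the Bethe-lattice threshold is a standard percolation-theory result rather than something stated in the source.

```latex
% With n independent slices, where slice i has a hole (a defensive gap)
% with probability p_i, an accident trajectory exists only if a hole is
% met in every slice:
P(\text{accident}) = \prod_{i=1}^{n} p_i .
% In bond percolation on a Bethe lattice with coordination number z, a
% percolating path first appears at the critical threshold
p_c = \frac{1}{z - 1} .
```

In this reading, keeping each layer’s hole probability below the percolation threshold keeps a complete accident trajectory from forming.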

Active and latent failures

The model includes active and latent failures. Active failures encompass the unsafe acts that can be directly linked to an accident, such as (in the case of aircraft accidents) a navigation error. Latent failures include contributory factors that may lie dormant for days, weeks, or months until they contribute to the accident. Latent failures span the first three domains of failure in Reason’s model.[8]

In the early days of the Swiss cheese model, from the late 1980s to about 1992, attempts were made to combine two theories: James Reason’s multi-layer defence model and Willem Albert Wagenaar’s tripod theory of accident causation. This resulted in a period in which the Swiss cheese diagram was represented with the slices of cheese labelled as active failures, preconditions and latent failures.

These attempts to combine the two theories still cause confusion today. In a more accurate version of the combined model, the slices of cheese represent the barriers, while active failures (now called immediate causes), preconditions, and latent failures (now called underlying causes) are shown as the reasons each barrier has holes in it.


New Zealand’s Swiss cheese model for managing COVID-19[9]

The same framework applies in some areas of healthcare. For example, a latent failure could be the similar packaging of two drugs that are then stored close to each other in a pharmacy. This failure would be a contributory factor in the administration of the wrong drug to a patient. Such research led to the realization that medical error can be the result of “system flaws, not character flaws”, and that greed, ignorance, malice or laziness are not the only causes of error.[10]

The framework has also been applied to a range of other areas.[11] For example, Lubnau, Lubnau, and Okray apply the model to the engineering of firefighting systems, aiming to reduce human errors by “inserting additional layers of cheese into the system”, namely the techniques of Crew Resource Management.[12] Olson and Raz apply the model to improve deception in the methodology of experimental studies, with multiple thin layers of cheese representing subtle components of deception which hide the study hypothesis.[13]



  1. ^ “Revisiting the Swiss cheese model of accidents”. Eurocontrol. October 2006.
  2. ^ Daryl Raymond Smith; David Frazier; L W Reithmaier & James C Miller (2001). Controlling Pilot Error. McGraw-Hill Professional. p. 10. ISBN 0071373187.
  3. ^ Jo. H. Wilson; Andrew Symon; Josephine Williams & John Tingle (2002). Clinical Risk Management in Midwifery: the right to a perfect baby?. Elsevier Health Sciences. pp. 4–6. ISBN 0750628510.
  4. ^ Tim Amos & Peter Snowden (2005). “Risk management”. In Adrian J. B. James; Tim Kendall & Adrian Worrall (eds.). Clinical Governance in Mental Health and Learning Disability Services: A Practical Guide. Gaskell. p. 176. ISBN 1904671128.
  5. ^ Stranks, J. (2007). Human Factors and Behavioural Safety. Butterworth-Heinemann. pp. 130–31. ISBN 9780750681551.
  6. ^ Robert A. Frosch (2006). “Notes toward a theory of the management of vulnerability”. In Philip E Auerswald; Lewis M Branscomb; Todd M La Porte; Erwann Michel-Kerjan (eds.). Seeds of Disaster, Roots of Response: How Private Action Can Reduce Public Vulnerability. Cambridge University Press. p. 88. ISBN 0521857961.
  7. ^ Douglas A. Wiegmann & Scott A. Shappell (2003). A human error approach to aviation accident analysis: the human factors analysis and classification system. Ashgate Publishing. pp. 48–49. ISBN 0754618730.
  8. ^ Wiles, Siouxsie (22 October 2020). “Siouxsie Wiles & Toby Morris: Covid-19 and the Swiss cheese system”. The Spinoff. Retrieved 28 October 2020.
  9. ^ Patricia Hinton-Walker; Gaya Carlton; Lela Holden & Patricia W. Stone (2006-06-30). “The intersection of patient safety and nursing research”. In Joyce J. Fitzpatrick & Patricia Hinton-Walker (eds.). Annual Review of Nursing Research Volume 24: Focus on Patient Safety. Springer Publishing. pp. 8–9. ISBN 0826141366.
  10. ^ Taylor, G. A.; Easter, K. M.; Hegney, R. P. (2004). Enhancing Occupational Safety and Health. Elsevier. pp. 140–41, 147–53, 241–45. ISBN 0750661976.
  11. ^ Thomas Lubnau II; Randy Okray & Thomas Lubnau (2004). Crew Resource Management for the Fire Service. PennWell Books. pp. 20–21. ISBN 1593700067.
  12. ^ Olson, Jay A.; Raz, Amir (2021). “Applying insights from magic to improve deception in research: The Swiss cheese model”. Journal of Experimental Social Psychology. 92: 104053. doi:10.1016/j.jesp.2020.104053. S2CID 228919455.

