All In The Mind
Australian Broadcasting Corporation · 14/02/2026

Can 'normalisation of deviance' help to explain a catastrophe?

This episode is available on podcasts.apple.com. To find out more about the podcast, see Can 'normalisation of deviance' help to explain a catastrophe?.

Below is a short summary and detailed review of this podcast written by FutureFactual:

Normalization of Deviance in High-Risk Systems: Challenger, Columbia, and Lessons for Prevention

Executive snapshot

This episode examines how organizations come to normalize risky shortcuts, gradually treating deviations as standard practice. Anchored by the Challenger and Columbia disasters, it extends to healthcare, firefighting, and other high-stakes domains to reveal the cognitive and organizational forces that push systems toward failure. It closes with strategies for staying vigilant, listening to frontline intuition, and pursuing forward-looking accountability to prevent future catastrophes.

Quotes from the program illuminate the core ideas: the call to stay uneasy about success, the observation that deviations emerge under pressure, and the ultimate aim of incident investigations, which is to prevent recurrence.

Introduction and central question

All in the Mind’s exploration centers on a concept coined to describe how people and organizations drift toward unsafe practices: normalization of deviance. The episode frames this through high-profile disasters and real-world case studies, arguing that in complex, high-risk systems, people do not set out to fail; they seek efficiencies and safety within the constraints of deadlines, budgets, and pressures. The host and guests acknowledge that such deviations are often locally rational, even if they appear harmful in hindsight, and that understanding these dynamics is essential to preventing repeat disasters.

"Normalization of deviance can happen in any industry, but the places where it’s perhaps more likely" - Dr. Nate Sedler, University of Aberdeen.

Normalization of deviance, drift, and local rationality

The discussion builds a framework around three core ideas: normalization of deviance, drift into failure, and local rationality. Normalization describes the slow acceptance of behaviors once deemed deviant, especially when earlier deviations produce no immediate negative outcomes. Drift into failure refers to the gradual descent into unsafe operation as a system and its actors grow complacent, treating each tolerated minor failure as evidence of safety. Local rationality captures the notion that decisions make sense to those inside a given context, shaped by the goals, information, and attention available at the moment. The conversation includes perspectives from Griffith University's Sidney Dekker and the University of Aberdeen's Nate Sedler, who connect these concepts to real-world risks in spaceflight, healthcare, and emergency services.

"Stay uneasy about your success" - Professor Sidney Decker.

Case studies and the breadth of the problem

The Challenger disaster serves as the focal case: engineers had noted O-ring erosion for years, yet the launch proceeded under schedule and budget pressure. Columbia's foam strike and the resulting re-entry damage illustrate how small, known risks accumulate into catastrophic outcomes. The Costa Concordia cruise-ship disaster and UK drug trials are discussed to show that normalization of deviance is not limited to aerospace but spans multiple industries where lives hang in the balance. The episode traces how these events arise from organizational incentives, cognitive biases such as the availability heuristic, and the pressure to perform under constraints.

"The aim of any incident investigation is to prevent the disaster from occurring again in the future" - Sana Qadar.

How to break the drift and improve accountability

To counter drift, the guests advocate several pathways: maintain healthy skepticism about success, cultivate frontline intuition alongside data, and reframe accountability as forward-looking responsibility rather than after-the-fact blame. They argue that systems should stay uneasy about their own safety margins, and that investigations should ask who is affected and what needs to be done next to prevent harm. The conversation also highlights the concept of second victims (those who signed off on decisions and later bore the consequences and guilt), emphasizing learning over punishment as the central aim.

"People don’t come to work to do a bad job" - Nate Sedler, University of Aberdeen.

Closing reflections

The episode closes by stressing the continued relevance of these lessons as organizations undertake ambitious projects, including NASA’s Artemis program. The speakers remind listeners that while rules and procedures exist to prevent harm, adherence is never guaranteed without ongoing vigilance, humility, and a commitment to learning from what goes wrong so it does not happen again.