SH45: It’s obvious why it happened!! (In hindsight)

In this podcast episode, Gareth reflects on the challenges of learning from near-misses, prompted by the recent tragic loss of the Titan submersible. He explores the biases that hinder our ability to analyze and learn from incidents, the complexities of socio-technical systems, and the difficulty of replicating the conditions needed for learning. Drawing parallels with diving, Gareth discusses the dynamic nature of risk, the fallacy of binary safe/unsafe assessments, and the importance of recognizing uncertainty. The episode examines the cognitive biases, heuristics, and psychological factors that influence decision-making, including the sunk cost fallacy, prospect theory, and the local rationality principle, and advocates for a culture of learning, critical debriefs, and the application of human factors principles in diving. Listeners are encouraged to approach incidents with curiosity, suspend judgment, and seek to understand the local rationality of those involved. The loss of the Titan is a poignant reminder of why we must keep learning and improving safety in complex systems, and the episode closes by honoring the lives lost.

Original blog:

https://www.thehumandiver.com/blog/its-obvious-why-it-happened


Links:

How Near-Misses Influence Decision Making Under Risk: A Missed Opportunity for Learning: https://pubsonline.informs.org/doi/10.1287/mnsc.1080.0869

If we want to learn, notice the conditions, not the outcomes: https://www.thehumandiver.com/blog/don-t-just-focus-on-the-errors

AcciMap: https://linkinghub.elsevier.com/retrieve/pii/S000368701730100X

DMAIB report: https://dmaib.com/reports/2021/beaumaiden-grounding-on-18-october-2021/

Implications for hindsight bias: https://www.semanticscholar.org/paper/Perspectives-on-Human-Error%3A-Hindsight-Biases-and-Woods/d913cdeae4e2782881a52e635e06c208b0796aed

If the adverse event occurs in an uncertain or unusual environment, we are likely to judge it more harshly: http://journals.sagepub.com/doi/10.1177/0146167292181012

Five principles behind High-Reliability Organisations (HRO): https://www.high-reliability.org/faqs

Prospective hindsight/Pre-mortems: https://www.thehumandiver.com/blog/how-to-help-correct-the-biases-which-lead-to-poor-decision-making

Red Team Thinking: https://www.redteamthinking.com/

Guy’s blog, Is the Juice Worth the Squeeze?: https://www.thehumandiver.com/blog/is-the-juice-worth-the-squeeze

Doc Deep’s final dive: https://gue.com/blog/i-trained-doc-deep/

Single and Double Loop learning: https://hbr.org/1977/09/double-loop-learning-in-organizations

Columbia Accident Investigation Board: https://govinfo.library.unt.edu/caib/news/report/pdf/vol1/chapters/chapter8.pdf

New ways to learn from the Challenger disaster: https://dx.doi.org/10.1109/aero.2015.7118898

Drop your Tools: http://www.jstor.org/stable/2393722

Availability, Representativeness & Adjustment and Anchoring: https://www2.psych.ubc.ca/~schaller/Psyc590Readings/TverskyKahneman1974.pdf

Trieste record-breaking dive: https://www.usni.org/magazines/proceedings/2020/january/first-deepest-dive

Resources from RF4 presentation: https://bit.ly/rf4-resources

Psychological safety, Tom Geraghty’s site: https://psychsafety.co.uk/

Normal Accidents: https://en.wikipedia.org/wiki/Normal_Accidents



Tags:

English, Decision Making, Gareth Lock, Human Factors, Incident Investigation