SH182: Joining Dots is Easy, Especially If You Know the Outcome

In this episode, we discuss the complexities of learning from mistakes and adverse events in diving and beyond. Using real-world examples, including a technical diving error and a high-profile medical case, we explore how systemic pressures, cognitive biases such as hindsight bias and confirmation bias, and the gap between "work as imagined" and "work as done" influence decisions. We highlight the importance of Just Culture in fostering open discussion and meaningful learning, emphasizing that improving safety means addressing systemic issues, not just individual actions. Join us to rethink how we approach errors and build resilience in high-pressure environments.

Original blog: https://www.thehumandiver.com/blog/joining-dots-is-easy-if-you-know-the-outcome


Links: Last week’s blog: https://www.thehumandiver.com/blog/my-biggest-mistake

HFiD Facebook group: https://www.facebook.com/groups/184882365201810

Some cognitive biases: https://www.thehumandiver.com/blog/from_blaming_to_learning

RaDonda Vaught verdict: https://www.npr.org/sections/health-shots/2022/03/25/1088902487/former-nurse-found-guilty-in-accidental-injection-death-of-75-year-old-patient

Learning from RaDonda Vaught case: https://www.linkedin.com/pulse/reckless-homicide-vanderbilt-just-culture-analysis-david-marx/

The learning line (page 7, section 6): http://sunnyday.mit.edu/16.863/rasmussen-safetyscience.pdf

Learning organisation: https://gue.com/blog/improvement-requires-learning-learning-happens-at-the-organizational-level-too/


Tags:  English, Decision Making, Gareth Lock, Hindsight Bias, Just Culture, Psychological Safety