‘One mistake and you are dead’ – isn’t how accidents normally happen

Jun 03, 2020

This phrase is often used to get people to pay attention to the hazards they face while undertaking a dive, especially a rebreather dive: if you don’t pay attention and you make a mistake, then you are dead. The problem is that accidents don’t happen like this, and as a consequence we start to drift from ‘acceptable’ behaviour.

This blog has been written to complement an article by John Bantin from Undercurrent which highlights the risks involved in diving rebreathers, referencing the documentary I produced, ‘If Only…’ (https://www.thehumandiver.com/ifonly). Please visit the site and read his article https://www.undercurrent.org/blog/2020/06/01/the-tragic-and-un-necessary-death-of-brian-bugge/ as it gets across a number of key points well.

However, I do have an issue with the phrase used in the article and the title of this blog, and I will explain why.


Our brains are wired for efficiency. We look for the most efficient solution to a problem – maybe by taking less time, spending less money, cutting corners in the processes we follow, or paying less attention to a task that has become well practised. This is normal human behaviour. Unfortunately, we often judge whether a task was successful by looking at the outcome, not by looking at how we got there. This is a cognitive bias, a mental shortcut we take to save mental energy: a good outcome means we had a good process, so let’s repeat that process next time (unless we engage in effective debriefing to highlight the drift).

As we erode the standards we have been taught, we erode the safety margins built into the system (the ‘system’ consists of training, equipment, environment, procedures, and humans and their behaviour) until we are operating on a knife-edge – an edge we don’t even know we are standing on. This is more commonly known as ‘normalisation of deviance’, and this video provides a little more insight into the topic.

Normalisation of Deviance - A Brief Explanation from The Human Diver on Vimeo.

This drift or normalisation of deviance doesn’t just happen at the individual diver level; it also happens at the instructor, dive centre and agency levels. Note that if an instructor drifts and teaches the student what the instructor thinks is right but is actually wrong, the student is unlikely to know any better – so is it a student issue if something happens later due to a lack of knowledge/skills? How does the organisation ensure that drift is limited? End-of-course forms are unlikely to have much effect on this because students don’t know what they don’t know.

Now that we’ve covered that drift is normal, let’s look at what we can do to prevent this drift, complacency or normalisation of deviance. As a diving community, we don’t have to reinvent the wheel: high-risk industries already did the hard work in the latter part of the last century, between 1960 and 2000. They developed concepts like human factors, and human and organisational performance, through which they recognised error-producing conditions and developed barriers/defences to prevent accidents from happening.


One of the most well-known models to explain this process is the ‘Swiss Cheese Model’ from Professor James Reason. Reason explained that if we design systems, tools and processes at different levels within a system that prevent an error from occurring, providing ‘defences in depth’, then we reduce the likelihood of an error making it all the way to those at the ‘sharp end’. Normally, ‘sharp end’ refers to workers, but the same concepts apply to divers too. These levels were defined as Organisational, Supervisory and Individual. The Swiss Cheese Model is normally shown as a static image, but this is not how things work in the real world: the holes, created by humans because of their fallibility, move around, and open and close too. This video shows a dynamic version of the Swiss Cheese Model.

Simple Swiss Cheese Model from The Human Diver on Vimeo.

This is a simple, linear model with an obvious cause-and-effect process: something happens back in time which impacts something in the future. We think we can solve this by filling the holes and breaking the chain of events, so that the incident doesn’t happen. Errors/mistakes/violations still exist within the system, but they can’t propagate through to completion and lead to an accident/incident.
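To make ‘defences in depth’ concrete, here is a minimal sketch of the linear model – my illustration, not Reason’s, and the layer count and hole probabilities are invented – showing how the chance of an error reaching the ‘sharp end’ falls as independent layers are added:

```python
import random

def error_propagates(hole_probabilities):
    """Return True if the error slips through the hole in every layer.

    Each layer blocks the error unless its 'hole' happens to be open
    and aligned, modelled here as an independent probability per layer.
    """
    return all(random.random() < p for p in hole_probabilities)

def propagation_rate(hole_probabilities, trials=100_000):
    """Monte Carlo estimate of how often an error reaches the 'sharp end'."""
    hits = sum(error_propagates(hole_probabilities) for _ in range(trials))
    return hits / trials

# Hypothetical hole probabilities for three layers:
# Organisational, Supervisory, Individual.
layers = [0.2, 0.2, 0.2]
print(f"3 layers: {propagation_rate(layers):.4f}")          # ~0.2**3 = 0.008
print(f"4 layers: {propagation_rate(layers + [0.2]):.4f}")  # ~0.2**4 = 0.0016
```

The moving holes in the animation are the catch: these probabilities are not fixed. Drift quietly widens the holes, so each layer blocks less than you assume.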

This model or analogy was valid for a considerable amount of time, but as researchers and industries started to develop ideas on systems thinking and complexity, there was a recognition that you didn’t need a chain of events to have an accident; you could have an emergence of factors leading to an accident. In this case, you might have different pressures or issues coming together, where each on its own wouldn’t have caused an issue. In effect, you create a critical mass, and that is why I created the following two animations. These are my ideas and they could be rubbish academically, but to me, they help demonstrate or explain emergence!

In this second animation, I have created a spherical version of the Swiss Cheese Model, in which each of the layers or shells is a layer in the Swiss Cheese Model, e.g. Organisational Influences, Supervisory Failures, Latent Failures and Active Failures, and at the centre there is a detector or sensor which will ‘explode’ once a critical amount of ‘energy’ falls on it. This animation has a ‘big hole’ in it which allows lots of ‘energy’ to fall on the detector in a short period of time. Note that ‘energy’ can come from different directions and failures.

The Big Hole Model from The Human Diver on Vimeo.

In this final animation, we have something similar. However, in this case, we have lots of little holes allowing ‘energy’ to fall onto the sensor with the same result.

The Little Hole Model from The Human Diver on Vimeo.
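To put rough numbers on the difference between the two animations, here is a toy simulation (all values are invented for illustration) in which ‘energy’ leaks onto the central detector either through one big hole or through many little ones:

```python
import random

CRITICAL_MASS = 100.0  # arbitrary threshold at which the 'detector' explodes

def time_to_critical(hole_sizes, leak_chance=0.3):
    """Count time steps until leaked 'energy' reaches critical mass.

    hole_sizes: the 'energy' each hole lets through when it leaks.
    leak_chance: probability that a given hole leaks on a given step.
    """
    energy, steps = 0.0, 0
    while energy < CRITICAL_MASS:
        steps += 1
        energy += sum(size for size in hole_sizes
                      if random.random() < leak_chance)
    return steps

# One big hole vs. twenty little ones: different routes, same outcome.
print("big hole:    ", time_to_critical([20.0]))      # fills fast
print("little holes:", time_to_critical([0.5] * 20))  # slower, but gets there
```

The big hole reaches critical mass quickly; the little holes take longer but still get there. No single little hole looks dangerous on its own, which is the point about emergence: the detector doesn’t care whether the critical mass arrived through one big failure or an accumulation of small ones.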

If we look back at each of the incidents or adverse events in the three animations, we can identify the critical or causal factors which contributed to the event (hindsight bias). However, in real time, we are dealing with multiple competing goals which erode the safety margins that have been created – again, this is normal human behaviour. We don’t know which of those factors will be the one that finally pushes the system past critical mass. This reminds me of Sidney Dekker’s comment: “Cause is something we construct. It isn’t something you find.”

That doesn’t mean we are all doomed. There are recognised ways of reducing drift at all levels within the system: at the organisational level, at the supervisory level, and for the diver at the ‘sharp end’. The UK Health and Safety Executive and Cranfield University wrote this report, which highlights how accidents/incidents in CCR diving can be reduced, including by teaching divers about human factors and human error.

The tools, procedures and processes which are available are not enough on their own; they need to be employed and checked/audited for effectiveness. If you don’t check on how well an organisation, an instructor or a diver is doing, don’t be surprised when they have drifted a fair way from the standards and you have an accident caused by drift and an accumulation of bad habits/old equipment/old procedures. The accident will likely be a surprise: mistakes will have been made throughout the system without anyone ending up dead… until that critical mass is reached, a point which you cannot predict with 100% accuracy. Note, ‘checked/audited’ also means holding yourself accountable to what you have been taught, using tools like checklists, briefs and debriefs.



Gareth Lock is the owner of The Human Diver, a niche company focused on educating and developing divers, instructors and related teams to be high-performing. If you’d like to deepen your diving experience, consider taking the online introduction course, which will change your attitude towards diving because safety is your perception. Visit the website.