“The root cause of an accident is our imagination” - Nippin Anand
Sep 16, 2021
Many cave courses include accident analysis as a key element of instruction because it shows how and why certain procedures have developed over time and why they are executed in the manner they are. You might therefore think that the headline quote is wrong because we have evidence that something went wrong and it has been corrected. The problem with that approach is that adverse events (accidents, incidents, and near-hits) are not simple, linear events with a single root cause; they are the convergence of multiple factors. Sometimes these factors are not even considered relevant to the activity at hand, but with the benefit of hindsight we can easily see how relevant they were. In addition, our own cognitive biases mean that we tend to focus on an individual and their failure(s) rather than the context in which they were diving and how it made sense for them to do what they did. This doesn’t just relate to diving and the diving world; it is called the fundamental attribution error (or bias) for a reason – it affects all our lives to some extent or other. (See the recent GUE InDepth article.)
Outcomes are a function of…
As an example of the interdependent nature of factors, in 2019 I co-wrote a paper with two cardiothoracic surgeons that examined the rise of human factors in the surgical domain and how this had improved patient safety and associated outcomes. The key thread running through the paper was that the technical skills of the surgical team were not enough, on their own, to deliver successful outcomes in the complex environment of the operating theatre. Rather, outcomes were a function of technical skills, the context in which the operations were taking place, luck and/or randomness, and non-technical skills. The other key point we made was that these factors were not additive in nature, but rather carried some unknown multiplication or weighting factor that influenced the overall outcome. Consequently, focusing on one set of skills, knowledge or factors without considering the whole would lead to flawed mitigations for the failures that may have happened.
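To see why additive and multiplicative views lead to different conclusions, here is a minimal numeric sketch in Python. The factor names follow the article, but the scores and the simple product model are purely illustrative assumptions, not figures from the paper:

```python
# Illustrative only: scores (0.0-1.0) for each factor are invented.
# Technical skills, context, and luck are strong; non-technical skills are weak.
scores = {
    "technical_skills": 0.9,
    "context": 0.9,
    "luck_randomness": 0.9,
    "non_technical_skills": 0.3,
}

# Additive view: one weak factor barely dents the average.
additive = sum(scores.values()) / len(scores)

# Multiplicative view: one weak factor drags the whole outcome down.
multiplicative = 1.0
for value in scores.values():
    multiplicative *= value

print(f"additive outcome:       {additive:.2f}")        # 0.75
print(f"multiplicative outcome: {multiplicative:.2f}")  # 0.22
```

Under the additive view the team still looks fine (0.75), while the multiplicative view (≈0.22) shows how a single weak factor can undermine the whole system, which is why mitigations aimed at only one factor can miss the real risk.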
As I wrote this paper, as with many things in human factors, I started to draw parallels with the diving industry and how successful outcomes have been developed, but more importantly, what happens when it comes to incident and accident analysis as this is where a significant amount of our equipment and skills development comes from.
Let's look at the first layer: technical skills. These relate to the technical aspects of buoyancy control, trim, moving your feet for propulsion, the physical activity of laying line, the sequence needed to execute an effective gas-sharing exit, the maths that goes into gas planning and then reading your SPG while on the dive, or using a stills or video camera in its housing. These skills are developed initially through training and/or experimentation, with the goal that you build some form of muscle memory that allows you to execute the skill when you want or need to, and achieve the results you expect. This is often referred to as “practice makes perfect”, though it should really be “perfect practice makes perfect”, because some form of feedback is needed; otherwise you end up practising the wrong thing.
The next level is context: everything that surrounds us when we undertake a task, and how the task is designed. This is a core theme within the science of human factors, whose goal can be summarised as ‘making it easier to do the right thing and harder to do the wrong thing’. Context doesn’t just mean the physical environment, like being in the water, the amount of visibility we have, or the cave formations and structures we navigate through, but also the cultural, social and peer environments we are part of and interact with. In 1936, Kurt Lewin proposed that our Behaviour is a function of the Environment we are in and the Person themselves: B = f(E, P). As humans we are creatures who like social conformance; we don’t like to be different, and we don’t want to stand out. As such, how our peers behave influences our own behaviour, and the culture we are in shapes what is seen as right or wrong, depending on your perspective.
Context also includes the tasks we have to undertake and how they have been designed. For example, we are more likely to make a mistake when laying line in high flow, whilst operating a scooter, or in limited visibility than in the gin-clear, calm waters of the majority of Mexican caves. Another example would be how gas planning and decompression have evolved over time in light of the incidents and accidents that have previously happened. Finally, we need to consider how easy it is to execute the task as it is written. This might be about how many divers you can take on a training dive and still keep track of when things start to go wrong. Just because it is written in the instructor guide doesn’t mean it is safe for the environment.
As with technical skills development, having some form of feedback is critical, both to understand the context and to improve it.
Randomness and Luck
The third level in the model looks at randomness and luck, something not normally considered when we review our own performance. Were we lucky, or were we good? In the medical paper, we used the example of being allergic to antibiotics. A certain percentage of the population is, but unless you’ve been treated with them, you won’t know. Randomness therefore refers to the population – e.g., 1 in 100,000 people have an allergic reaction to an antibiotic – but if that person is you, the odds are 1:1; you were unlucky. In diving, this could relate to having a patent foramen ovale (PFO), a flow reversal in a cave, a metallurgical failure of a regulator or scooter component, or, as in the case of Steve Bogaerts, a stalactite used as a tie-off collapsing and sinking into the silt! The point here is that there is a level of the unknown that we have to consider in our contingency plans.
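The gap between a population-level rate and an individual exposure can be made concrete with a few lines of Python. The 1-in-100,000 rate is the article's example figure; the 10,000-exposure count is an illustrative assumption:

```python
# Population-level rate from the article's example: 1 in 100,000.
p_event = 1 / 100_000

# Illustrative assumption: 10,000 independent exposures
# (e.g. dives across a community over a season).
n_exposures = 10_000

# Chance that the rare event happens at least once somewhere:
p_at_least_once = 1 - (1 - p_event) ** n_exposures
print(f"P(at least one event across the population): {p_at_least_once:.1%}")  # ~9.5%

# For the individual it happened to, the rate is irrelevant:
# the event either occurred or it didn't - the odds are 1:1.
```

Rare at the individual level does not mean rare at the system level, which is why contingency plans have to cover events that “never happen” to any one diver.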
The final level, and the one I spend most of my time talking about, is non-technical skills (NTS) or, as they are known in the aviation sector, crew resource management (CRM). These skills are made up of decision making, situation awareness, communications, teamwork, leadership & followership, and performance-shaping factors like stress & fatigue. The image shows the framework I have developed to illustrate the interdependence of these specific skills; it is surrounded by psychological safety and a Just Culture as a way of facilitating learning. These skills are the glue that holds the technical skills together: for example, knowing when to change from one finning technique to another, how and when to communicate a change in direction or an ascent during a decompression profile, or undertaking a brief to ensure that everyone has a shared mental model of what is going to happen on the dive.
Due to the constraints of this article, I am only going to cover two skills within the framework below, but you can learn far more in the newly released Essentials of Human Factors in Diving course on The Human Diver website (www.thehumandiver.com) – a 3-hour course broken into 2-5 minute lessons across 12 modules.
Situation awareness is the ability to perceive what is going on, make sense of it now, and then, most importantly, project with a high level of accuracy what will happen in the future. Situation awareness makes up the first part of the decision-making process – gathering information – and research has shown that the majority of errors are made because incorrect information is being fed into a ‘good’ decision-making process. If you have ever studied computing, you might remember the phrase GIGO: Garbage In, Garbage Out. You’ll now understand why developing individual and team situation awareness, and communicating what you know, is so important to successful outcomes.
Decision making is a multi-faceted process. We operate in two ‘modes’, described by Daniel Kahneman and Amos Tversky as System 1 and System 2. System 1 is fast, intuitive, and biased, with a low mental workload associated with it, whereas System 2 is slow, methodical and logical, and takes lots of mental energy. We don’t like spending mental energy, so most of the time we operate in System 1, making use of our excellent pattern-matching skills: we see something, match it to something we’ve previously experienced, and then decide on a course of action. The problem is that if we don’t have a complete match, we make an educated guess. Most of the time it is correct, and if we get a good outcome, this reinforces our mental models because of the biases we have. If we are in an uncertain environment, or we have a critical step to take, then we have to force ourselves into System 2 so we can logically process the problem. This is not easy when there are pressures like time, gas consumption, peer pressure, or beers waiting on the bar!
So why is the root cause of an accident our imagination?
The reason is that we will find what we seek. If we think an accident was due to a technical failure, we will tend to focus on that aspect. If we think it was about poor choices by the diver, we will focus on the decisions the diver made and the rules they violated. If we think it was a medical issue, we will focus on that. Instead, we must look wider than just the proximal causes. The adverse event will have been caused by multiple factors, some of them potentially unrelated to the task at hand. In addition to the different levels in the framework above, there will be tens of influential factors that lead us to make the decisions we do and take the risks we take. As an example, in a recent research project I asked nine divers what they believed influenced divers to make the decisions they did and to take the risks they took; between us, we collected approximately 300 and 150 post-it notes respectively for those two questions. Admittedly, some of those were duplicates, but the point stands: the decisions we make now are based on previous experiences. Note that when looking at someone else’s event, it is their experiences that count, not yours.
So, the next time you read of an adverse event in diving, or elsewhere for that matter, let your imagination wander and consider the system, not just the individual; otherwise you’ll be falling for the fundamental attribution error and will end up with the wrong causes.
www.thehumandiver.com/ifonly - a fatal accident story told through the lens of human factors
Gareth Lock is the owner of The Human Diver, a niche company focused on educating and developing divers, instructors and related teams to be high-performing. If you'd like to deepen your diving experience, consider taking the online introduction course, which will change your attitude towards diving, because safety is your perception. Visit the website to find out more.