“Blame is the enemy of safety” - moving from blaming to learning
Mar 11, 2022
We all make mistakes - we do the wrong thing, thinking it’s the right thing. We all slip at times - doing something that we didn’t intend to do. We all have lapses of memory - forgetting something in the heat of the moment. Everyone breaks rules of some sort during their lives, often because we see the reward or need as outweighing the potential consequences.
Mistakes, slips, and lapses are classified as different types of ‘human error’. They describe the normal variability of our performance, which is influenced by a whole variety of internal and external factors: training, experience, time pressures, social pressures to conform, how easy the task or dive brief is to understand, even the ease with which we can use our dive equipment, like a dive computer. Importantly, we can never know in advance that an error will happen. If we knew before the event, we wouldn’t do that activity, or we would change something to ensure it didn’t.
If I knew I was going to make a mistake, like teaching something wrong in a class, I would have read the instructions or training manual again to refresh myself.
If I knew I was going to make a slip, like cross-clipping a boltsnap on a D-ring and getting it jammed, I would have slowed down and concentrated on that specific activity to the exclusion of other things.
If I knew I was going to forget something, like omitting an o-ring, I would have used a checklist and ticked off the items as I went along.
In each case, we can only really classify something as an ‘error’ after the event, because we use a ‘counterfactual’, i.e., “would have”. Therefore, saying someone made an error and that this contributed to the accident might be technically correct, but it is of almost zero value when it comes to improving our performance and safety. Saying the cause of an accident is human error is about as useful as saying that an apple falling is due to gravity.
They broke the rules, but why?
While violations can be intentional, often the reason is that the risk has been ‘assessed’ as being beneficial to the situation, the project, the dive centre or the training organisation, and so the deviation is seen as worthwhile. This is why in some parts of the safety world violations are called ‘at-risk behaviours’. However, research from Denham Phipps with anaesthetists showed that the conditions which led to rule-breaking had to do with:
- The rule. Who wrote it, what value it added, how likely they were to be caught, and what the consequences would be if things went wrong.
- The anaesthetist. Their attitude, position in the organisation, peer behaviours/social conformance.
- The context. Time pressures, the ease of completing the task while following the rule, and equipment configuration.
The biases that limit learning
When we observe someone else do something that ends up with an adverse event, like getting lost, having a rapid ascent, running out of gas, injuring themselves, injuring a student, or ending up with someone dead, we often fall foul of a number of biases that limit our learning.
- Hindsight bias. The ‘I knew it all along’ effect. We believe that we would have seen this event developing and would have done something about it. It is a bit more complicated than that, so next week’s blog is going to cover hindsight bias in more detail.
- Outcome bias. We judge the quality of the decision by the outcome it produced rather than by what was known at the time. We often don’t think about the experience or knowledge of the diver or instructor involved. We assume that they had a clear head, had the knowledge, had the experience and therefore must have missed the obvious. We don’t think of the context.
- Severity bias. The more severe the outcome, the harsher we judge the situation. A diver who runs out of gas just as they reach the surface will be judged more harshly than someone who surfaces below minimums, who in turn will be judged more harshly than someone who surfaces with the correct amount of gas, even though this last diver didn’t monitor their gauge at all during the dive.
- Fundamental attribution bias. We look at personal or individual factors (lazy, unskilled, poor attitude, wasn’t paying attention…) before we look at external factors (workload, visibility, current, social pressures, quality of equipment, experience…) as the cause. We often want to attribute fault to the individual rather than the context because we use this as a form of self-protection: “If I can focus on them and their lack of competence, and I know they are different to me, then I won’t make those same mistakes.” This might appear simplistic; it is, and at the same time, most biases are simple!
These biases often lead us to blame individuals and their associated personal factors, skills and traits rather than looking at the context as to how it made sense to do what they did. Even when people break ‘rules’, there are often rational reasons based on the context they are in. Stopping at ‘they broke the rules’ during an investigation doesn’t help us dig deeper into the reasons why the rules were broken.
If we consider that these attributions drive how safety is managed in diving, we can see why the recommendations after diving incidents come about: fix the human, because they are the obvious flaw in the system. If we can address the ‘bad apples’, then the safe system we have designed and implemented will continue to be safe. Reflect on this: actions to manage safety derive more from attributions than from actual causes, and human error is not a cause; it is a symptom or indication that there are problems in the wider system.
But what about their behaviour?!
We should recognise that causal attributions relating to an individual and blame are not the same thing. In attribution theory, blame implies that the behaviour was inappropriate, unjustifiable, or intentional. Therefore, while blame is a special case of causal explanation, a person-centred causal attribution of an accident (‘they touched it last’) doesn’t have to involve blame. An individual’s behaviour or actions can be seen as the cause of an accident even when those actions are not attributed to wrongness or an intention to harm.
We often focus on an individual’s actions because we can see them immediately, but we don’t see how situations or organisational problems develop because they take longer to materialise. We should also consider the attribution of agency (‘they dropped the cylinder’ vs ‘the cylinder fell’).
It isn’t wrong to say someone's behaviours/actions contributed to the accident. Humans and their variable performance can be considered causal factors in the events that emerge over time, but to attribute the direct and sole cause to them is massively flawed.
This means that we need to take a different approach to ‘investigating’ diving incidents and accidents if we want to learn. We need to look wider than just the proximal causes, actions and decisions that happened in the previous day or so. We need to look up and out, not down and in. The majority of diving incidents are investigated by three ‘groups’: law enforcement looking for criminal activity, legal teams pursuing lawsuits, and regulators (e.g., OSHA, HSE or WorkSafeXX) checking for compliance. In many cases, these bodies are not interested in true learning; rather, they are looking for non-compliance or the presence of foul play, which then leads to some form of punishment being delivered.
There are other judges out there though! Nearly all (serious) diving incidents end up on social media somehow. Unfortunately, this leads to a position where observers and contributors end up as judge, jury and executioner because the real stories rarely get told and we fall foul of the biases described above. Often it is the instructor, the diver or the ‘agency’ that gets blamed without any real understanding of how the situation developed. If we don't understand the situation, how do we improve things for the future?
Moving from blame to learning
There are a number of general techniques that we can use to help understand what has happened in an event.
Firstly, try to determine local rationality. In hindsight, something might appear irrational. However, if we are able to understand how it made sense for someone to do what they did, or what conditions made that ‘the best’ decision at the time, we are well on our way to reducing blame. We often look backwards, informed by hindsight bias, for 'bad choice' after 'bad choice', falling foul of confirmation bias as we find them. What if we instead tried to understand the 'good choices' people made and how they made sense? What if we considered what pressures they were under, the training they had and the mental models they were using? Put ourselves in their shoes (fins!). If the report we read leads us to think "I would have done that", then it is a good learning report.
Secondly, as soon as you hear counterfactuals (should have, could have, would have, failed to…), look at the reasons behind those decisions. Are we being clouded by our experiences and biases, or do we genuinely understand the issues at hand because we have spoken to the person(s) involved?
Unfortunately, this need to determine local rationality is why fatalities, in my opinion, are not a great way to learn. We can’t ask the most important person how it made sense for them to do what they did. There is also normally a lot of emotion and grief which clouds rational thought too.
Learning from Unintended Outcomes
We have plenty of data highlighting the outcomes from adverse events, but we don’t have much that looks at the conditions, the cognitive biases and non-technical aspects of an event. That is going to change in 2022 when The Human Diver launches the first learning-focused investigation course.
The current outline consists of approximately 2.5 hrs of online pre-learning to get the core theory in place. Then either a two-day face-to-face workshop or a 5 x 3.5 hr online workshop run over two weeks. These workshops will look at how an ‘investigation’ develops, what tasks need to be completed and then provide some practical tools to look at an event from a learning perspective. The output of the investigation leads to learning products that meet the different requirements of divers, instructors, agency staff, legal personnel and regulators. There will be some post-class work required to consolidate the learning too.
The first two classes are expected to run in May 2022: one online (five sessions between 30 Apr and 12 May) and one face-to-face (21-22 May, UK). More details shortly, but if you want to express interest, drop me a line via the contact page.
References
- People or Systems? To blame is human. The fix is to engineer. https://www.academia.edu/527985/People_or_Systems_To_blame_is_human_The_fix_is_to_engineer
- The Field Guide to Understanding Human Error. S. Dekker.
- Chartered Institute of Ergonomics and Human Factors: Learning from Adverse Events https://ergonomics.org.uk/resource/learning-from-adverse-events.html
- US Forest Service - Learning Review Guide. https://www.fs.usda.gov/rmrs/coordinated-response-protocol-learning-review
- "Blame is the Enemy of Safety" from Engineering a Safer World by Nancy Leveson.
Gareth Lock is the owner of The Human Diver, a niche company focused on educating and developing divers, instructors and related teams to be high-performing. If you'd like to deepen your diving experience, consider taking the online introduction course, which will change your attitude towards diving and your perception of safety; to find out more, visit the website.