17 Cognitive Biases which Contribute to Diving Accidents

- Gareth Lock, Jan 08, 2018

Introduction

Humans are subject to a variety of cognitive biases and heuristics (mental shortcuts). These directly impact a diver's decision-making, sometimes resulting in incorrect judgments under certain circumstances. In most settings this is relatively harmless; however, for those operating in high-risk environments, the consequences of an incorrect decision can be critical or deadly, especially if there is only a short period of time in which to detect the error and recover from it. The bias itself alters the diver's perception of reality, changing their understanding of the situation and filtering the true nature of events as they unfold. These cognitive issues are further compounded by physiological factors in diving such as narcosis, reduced colour perception and visibility, and changes in sound transmission underwater.

The effects on human perception

Human perception is a “conscious sensory experience” that combines our senses with our brain's filtering and interpretation of those sensory inputs. Research has revealed a number of common ways in which our brain's perception is modified. While these biases act as filters that can hinder our ability to make accurate decisions, they are also essential for coping with the massive amount of information we have to process in short periods of time. This blog covers this reduction process in more detail. The problem we face in real time in high-risk scenarios is that we are often unaware of this filtering and reality-modification process.

Types of cognitive bias

There are many types of cognitive bias that can influence divers’ safety because they impact risk perception and acceptance. The following are some examples of biases that can be particularly dangerous to divers:

Ambiguity effect

An aspect of decision theory whereby a person is more likely to select an option with an intuitively clear risk over one that seems less certain. This could lead someone to choose a riskier option simply because its risk is better known. For example, a CCR diver staying on the loop when there is a fault in the rebreather rather than bailing out and making an ascent on open circuit.

Anchoring bias

A bias where people make decisions based on a provided data point. For example, if given a baseline of a certain amount of gas as a requirement, or a depth for the dive, this number will be used to determine requirements regardless of whether operational needs might actually require much more, or much less. This might be 'surface with 50 bar/500 psi', but with no understanding of what this number means in terms of cylinder size, depth or breathing rate.
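To show why the anchored number means little on its own, here is a rough sketch of how long a 50 bar reserve actually lasts; the cylinder size, depth and breathing rate below are assumed values chosen purely for illustration, not planning figures.

```python
# Rough sketch: how long does a 50 bar reserve actually last?
# All figures are illustrative assumptions, not planning values.

cylinder_size_l = 12.0    # assumed 12-litre cylinder
reserve_bar = 50.0        # the anchored "surface with 50 bar" figure
sac_l_per_min = 20.0      # assumed surface air consumption rate (litres/min)
depth_m = 30.0            # assumed depth at which the reserve might be needed

ambient_pressure_bar = 1.0 + depth_m / 10.0                  # ~4 bar at 30 m
reserve_gas_l = reserve_bar * cylinder_size_l                # 600 surface litres
consumption_at_depth = sac_l_per_min * ambient_pressure_bar  # ~80 litres/min at 30 m

minutes = reserve_gas_l / consumption_at_depth
print(f"50 bar in a 12 l cylinder lasts about {minutes:.1f} min at {depth_m:.0f} m")
# ~7.5 minutes for one relaxed diver -- far less when stressed or sharing gas,
# which is why the anchored number means little without this context.
```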

 

Attentional bias

Humans pay more attention to things that have an emotional aspect to them. In diving, this could lead to a person making a decision based on a perceived problem rooted in a past experience. For example, if a diver has had a DCI event, or someone close to them has, they might ignore the risk of running low on gas in an attempt to avoid DCI. An incident was recounted to me about a diver who ran out of deco gas because they didn't understand how the 6m stop was managed on their SUUNTO, and were afraid of getting bent despite having been close to 6m for a long time.

Attentional tunnelling

This has been defined as “the allocation of attention to a particular channel of information, diagnostic hypothesis, or task goal, for a duration that is longer than optimal, given the expected cost of neglecting events on other channels, failing to consider other hypotheses, or failing to perform other tasks”. This can be explained more simply using the ‘7 +/- 2 lightbulbs’ model of mental capacity: if a number of those lightbulbs are taken up with basic tasks, the capacity to monitor other tasks is limited, despite the risks of not completing those 'apparently' secondary tasks. An example of this would be shooting UW video and not monitoring pO2, as in the case of Wes Skiles. (The incident was much more complex than I have just explained, but attentional tunnelling was a major contributory factor.)

Automaticity

While not a bias, this refers to the fact that humans who perform tasks repeatedly will eventually learn to perform them automatically - so-called muscle memory. While generally a positive attribute, this can lead to a person automatically performing a function (such as a checklist item) without actually being cognisant of the task itself.  Expectation bias can lead them to assume that the item is correctly configured even if it is not.

Availability heuristic

This describes how people over-estimate the likelihood of an event based on the emotional impact it may have had, or on how much personal experience they have had with that type of event. This can lead to incorrect assessments of risk, with some events being attributed more risk than they warrant, and others not enough. An example might be how much focus is placed on DCI (a pretty rare event) compared to running low on, or out of, gas, which is much more common.

Availability cascade

This is a process whereby something repeated over and over will come to be accepted as fact. An example of this is the misconception that diving nitrox both extends your bottom time AND makes diving safer on the same dive. Rather, minimum decompression times can be extended for the same level of DCI risk, or the DCI risk can be reduced if the minimum decompression time for air at the same depth is used. For example, roughly the same decompression requirement exists for 32% nitrox at 30m for 30 mins as for air at 30m for 20 mins, but you cannot both be safer and have the longer bottom time.
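As a rough illustration of why you cannot claim both benefits at once, the equivalent air depth (EAD) calculation shows how the reduced nitrogen fraction in nitrox can be 'spent' either on time or on margin; the figures below are a simple sketch, not a dive plan.

```python
# Equivalent air depth (EAD) sketch, metric units -- illustrative only,
# not a substitute for proper dive planning tools.

def equivalent_air_depth(depth_m: float, o2_fraction: float) -> float:
    """Depth on air with the same nitrogen partial pressure as the nitrox mix."""
    n2_fraction = 1.0 - o2_fraction
    return (depth_m + 10.0) * (n2_fraction / 0.79) - 10.0

ead = equivalent_air_depth(30.0, 0.32)
print(f"EAD of EAN32 at 30 m is about {ead:.1f} m")  # ~24.4 m
# Plan against ~24 m air limits and you gain bottom time;
# plan against 30 m air limits and you gain a safety margin.
# You cannot claim both benefits on the same dive.
```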

Base rate fallacy

Historically, a lack of data has meant that divers are unable to see large statistical trends. When a person focuses on specific events (which might be non-events) rather than on the probability across the entire set, they ignore the base rate and base their judgments on specifics rather than on general statistical information. This can affect divers if they are not able to accurately assess the risk of certain decisions. Examples in diving could be cardiac risk, or out-of-gas situations. A lack of population non-fatality data makes this bias even more pronounced.
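A minimal sketch of how ignoring the base rate distorts a risk estimate; every number below is hypothetical and chosen only to show the mechanism, not drawn from diving data.

```python
# Hypothetical base-rate sketch: how often is a vivid "warning sign" actually
# followed by a serious incident? All numbers are invented for illustration.

incident_base_rate = 0.001        # assumed: 1 in 1,000 dives ends in a serious incident
p_sign_given_incident = 0.90      # assumed: the sign precedes 90% of incidents
p_sign_given_no_incident = 0.05   # assumed: the sign also appears on 5% of normal dives

# Bayes' theorem: P(incident | sign)
p_sign = (p_sign_given_incident * incident_base_rate
          + p_sign_given_no_incident * (1 - incident_base_rate))
p_incident_given_sign = p_sign_given_incident * incident_base_rate / p_sign

print(f"P(incident | warning sign) = {p_incident_given_sign:.3f}")  # ~0.018
# Even a "90% reliable" sign points to an incident less than 2% of the time
# when the base rate is low -- judging from the vivid case alone, and ignoring
# the base rate, badly overstates (or understates) the real risk.
```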

Confirmation bias

This describes a situation where a person will ignore facts or information that do not conform to their preconceived mental model, and will assume as true any information that does conform to their beliefs. This is very dangerous in diving, where a diver might form an incorrect mental model of their situation and have a very difficult time changing that view, even in the face of new information. An example of this would be a diver who is convinced that their CCR is functioning correctly despite more and more warnings to the contrary. If linked with alarm blindness (where frequent alarms end up being tuned out), this can be a critical problem.

Expectation bias

This might be considered a subset of confirmation bias, but it describes a situation where a person sees the results they expect to see. Because some electronics are unreliable and alarms 'always happen', false positives become routine and the diver comes to expect that the alarm is telling a falsehood. Unfortunately, they have limited ability to cross-check other than bailing out, which normally means the end of the dive.

Optimism bias

As the name suggests, this is a situation where people are overly optimistic about outcomes. It is a common issue in diving, as divers have seen so many bad situations turn out “okay” that the sense of urgency and risk can be reduced when such reduction is not warranted.

Outcome bias

This is the tendency to take outcomes into account when they are irrelevant to the decisions involved. An example in diving would be comparing an incident whereby a cave diver completed a blind-jump and ended the dive successfully with one where the blind-jump led to the death of a diver. Outcome bias can contribute to normalisation of deviance.

Overconfidence effect

As the name suggests, there is a strong tendency for people to overestimate their own abilities or the quality of their own judgments. The Dunning-Kruger effect is a great example of this, and it has fairly obvious implications in diving. Overconfidence often develops because of the positive reinforcement techniques used in training and the want/need to pass students who do not have 'mastery' of the skills needed.

Plan Continuation / Sunk Cost Fallacy

This might be considered a subset of confirmation bias.  There is a strong tendency to continue to pursue the same course of action once a plan has been made, but it may also be influenced by some of the same issues that lead to “sunk cost effect”, where there is a “greater tendency to continue an endeavour once an investment in money, effort, or time has been made”.  Plan continuation bias is the basis of the case study used in the online micro-class in which a diver ran out of gas at 60m and bolted to the surface, which led to his death.

Prospective Memory

This is a common situation where one needs to remember to do something in the future. It can be particularly challenging when faced with distractions of any sort, e.g. a person driving home from work who needs to stop to pick up milk en route. If that person then gets a phone call, there is a high probability that they will forget to stop at the store. In diving, an example would be an instructor who is interrupted by a student while assembling their rebreather, leading to a critical step being missed when they return to their unit to carry on the assembly.

Selective perception

There is a strong bias to view events through the lens of our belief system. This is different from expectation bias in that it generally applies to our perception of information as filtered by our belief system itself, while expectation bias more generally describes situational awareness based on things we expect to happen. Selective perception can lead to an incorrect hypothesis, such as a belief that a DCI event was caused by a specific profile rather than by other factors.

How do we prevent these having a negative impact?

The first step is recognising that cognitive biases exist and can have both a positive and a negative effect on human performance.

Once identified, the hazards of cognitive bias can be mitigated by a variety of means. One thing to bear in mind is that the most effective way to improve safety is to change the system so that human behaviour isn't the only thing that needs to change, e.g. briefing students not to interrupt the instructor while they are assembling their equipment, or providing an electronic checklist on the CCR which monitors system parameters to ensure the checks are executed (a simple sketch of such a checklist gate follows the options below). If the system cannot be changed, then human behaviour and thinking need to be modified. Possible options are:

Training

Whether training is initiated at the individual level through personal development, or via formal programs, it is the only way to actually start the change process needed to address the root behavioural issues that allow biases to have a negative outcome. The first defence against any hazard is to understand that the hazard exists, and training provides that.

Education & Learning

Technical, CCR and cave divers, and instructors, should be trained to understand the types of cognitive bias that exist, when they are likely to occur, and strategies to avoid them, because of the additional risks they face. A comprehensive understanding allows the individual to see how these biases can affect them and their students, and what the implications can be if they are not identified and trapped. Kahneman and Tversky's System 1 and System 2 are used to describe, respectively, intuitive and automatic activity, and the logical, methodical process of problem-solving and decision-making. Understanding what Systems 1 and 2 are, and recognising when to move from System 1 to System 2, is an essential skill which takes practice. Feedback, via debriefs, is used to develop these skills and to understand where biases may have compromised effective decision-making.

Crew Resource Management - Non-Technical Skills

One of the most effective and proven strategies for combating perceptual errors in aviation has been crew resource management (CRM). This is because the aim of CRM is to create a shared mental model between the operating team and the hardware/software they use. If the other individuals or team members are trained to recognise these cognitive biases, they will be much more likely to spot the situation developing in real time, thereby preventing the incident from escalating.
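Picking up the electronic checklist idea mentioned above, here is a hypothetical sketch of a pre-dive checklist 'gate' that refuses to report dive-ready until every item has been confirmed; the item names are invented for illustration and are not taken from any real CCR.

```python
# Hypothetical pre-dive checklist gate: the unit will not report dive-ready
# until every check has been confirmed. Item names are illustrative only.

from dataclasses import dataclass, field

@dataclass
class CheckItem:
    name: str
    done: bool = False

@dataclass
class PreDiveChecklist:
    items: list = field(default_factory=lambda: [
        CheckItem("Positive/negative loop pressure test"),
        CheckItem("O2 cell calibration"),
        CheckItem("Diluent and O2 cylinder pressures recorded"),
        CheckItem("Scrubber packed and timer reset"),
    ])

    def confirm(self, name: str) -> None:
        for item in self.items:
            if item.name == name:
                item.done = True
                return
        raise ValueError(f"Unknown check item: {name}")

    def dive_ready(self) -> bool:
        # The system, not the diver's memory, enforces completeness.
        return all(item.done for item in self.items)

checklist = PreDiveChecklist()
checklist.confirm("O2 cell calibration")
print(checklist.dive_ready())  # False: an interrupted setup cannot be signed off
```

The design point is that the system, rather than the diver's prospective memory, enforces completeness after an interruption.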

Conclusion
           
While published research into the role of cognitive biases in diving incidents and accidents is not readily available, there would have to be something very different about diving if human behaviour in this environment were different from that in aviation, nuclear, rail, healthcare, and oil & gas.

Unfortunately, research has also shown that most people are unaware of these biases and how they impact human performance when certain latent issues converge to create the grounds for an accident or near miss. 

One way in which diving safety could be improved would be to ensure that these topics are part of an accident or incident investigation process rather than being captured under the general topic of ‘human/diver error’.

[Note: how many people counted the biases? There were only 16 listed. Expectation bias at play.]



Gareth Lock is the owner of The Human Diver, a niche company focused on educating and developing divers, instructors and related teams to be high-performing. If you'd like to deepen your diving experience, consider taking the online introduction course, which will change your attitude towards diving because safety is your perception: visit the website.