Risk or Uncertainty in Diving: What’s the difference? Why it matters.

Gareth Lock | Mar 25, 2023

Diving is an activity that takes place in a hazardous environment. We have not evolved to live in the water, nor can we survive for very long underwater without some form of mechanical or technical support (so Darwinism doesn’t apply!). In addition to drowning, we have other hazards to deal with, such as hypoxia (unconsciousness and death due to lack of oxygen), hyperoxia (including oxygen toxicity-induced seizures), hypercapnia (excessive CO2), entanglement, decompression illness, being lost underwater or on the surface, and the list goes on. Fortunately, these events don’t happen very often, but they do happen. Because these hazards could kill us, we have to manage the associated risk, which is traditionally made up of the likelihood of the event occurring and the consequence if the hazard/harm materialises.

The problem with the traditional concept of risk is that we are not managing risks when we are diving; we are managing uncertainties, and there is a difference. The classical distinction in economics is between risk, in which the probabilities of various outcomes are known, and uncertainty, in which even the probabilities associated with events are unknown or unknowable. In diving (or any complex environment, for that matter) we cannot know what the probabilities for an event are. We can make a guess, but we have very poor data to determine what they are likely to be.

“The risk I took was calculated. But man, am I bad at math.”

This quote came up in a presentation about the Plura cave diving accident, in which two divers lost their lives after becoming stuck in a cave. The dive team weren’t unique: as humans, we are not very good at judging risks, even more so when it comes to low-probability/high-consequence events. When faced with a novel situation, we try to reduce the uncertainty (make it more certain) by comparing it with previous memories and experiences, and then ‘choosing’ the course of action that we believe will produce the outcome we expect.

We are making ‘educated’ guesses, and these guesses or gambles are based on three primary heuristics or mental shortcuts: availability bias, representativeness bias, and ‘mirages’.

Availability Bias

Events that are more common, relate to large groups of people, or carry high emotional value will be recalled more readily. In addition, the more vivid something is, the easier it is to recall. Consequently, when vividness and frequency diverge, the availability heuristic leads us to overestimate the frequency of rare but memorable events and underestimate the frequency of duller but more common ones. For example, human error is normal (slips, lapses, mistakes and ‘violations’), but for most people it is only after a catastrophic event that we think about the importance of checklists or the analysis of breathing gas in preventing an adverse event from occurring. Most of the time, the error was caught before it led to a catastrophic event, or the context didn’t create a catastrophic situation, e.g., divers failed to analyse a gas that turned out to contain 50% oxygen but, fortunately, stayed above 21 m, so hyperoxia wasn’t an issue and nothing bad happened. As a consequence, we don’t think about the failures prior to the event, only the adverse outcomes.
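The 50%/21 m example above is simple arithmetic. A minimal sketch of the standard maximum operating depth (MOD) calculation, using the common approximation of 10 m of seawater per bar (the function name and the pO2 limits chosen are illustrative):

```python
# MOD: the depth at which a mix reaches a chosen partial pressure of
# oxygen (pO2). MOD (metres) = (pO2_limit / FO2 - 1) * 10, assuming
# ~1 bar per 10 m of seawater plus 1 bar at the surface.

def mod_metres(fo2: float, po2_limit: float = 1.4) -> float:
    """Depth in metres at which the mix hits the given pO2 limit."""
    return (po2_limit / fo2 - 1.0) * 10.0

# An unanalysed mix that turns out to be 50% oxygen:
print(round(mod_metres(0.50, 1.4), 1))  # 18.0 m at a 1.4 bar working limit
print(round(mod_metres(0.50, 1.6), 1))  # 22.0 m at a 1.6 bar contingency limit
```

This is why staying above roughly 21 m kept the divers in the example inside the more permissive 1.6 bar limit, even though they had no idea that was the case.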


Representativeness Bias

When people try to work out whether something matches a previous class (in psychology, classes can relate to people, objects, tasks, roles, etc.), they use what they already know about the characteristics of objects in those classes to guess which one the test object belongs to. An example I use in my classes is an underwater cameraperson doing an 80 m rebreather dive in poor visibility, strong current, and under high workload. I ask the students to describe what happened. Nearly all of them refer to the cameraperson as ‘he’ – however, the event refers to a story that Becky Kagan-Schott told for ‘Under Pressure’. At a wider level within diving, some see diving as ‘safe’ because of the way it is marketed: blue waters, clear skies, bright fish – not dark green water, poor visibility, strong currents, and dulled colours.

How we estimate the numerical value of something is also affected by the reference point, or anchor, from where we start. There have been numerous case studies of people anchoring their answer on a randomly allocated ‘anchor’, e.g., one generated by spinning a wheel of fortune in their presence. This is why it is important for leaders (which includes instructors) not to speak first when it comes to stating difficult or controversial outcomes/plans that need some form of discussion and consensus – e.g., maximum run-time on a dive, maximum decompression, maximum penetration, or how long to stay in cold water – because the leader’s answer will shape others’ responses. Generally, we tend not to adjust very far from our anchors.



Mirages

This is where we think we have many choices, but we only have one. For example, immediate rewards are much more valuable than delayed rewards of equal or somewhat larger magnitude. Spending money on something shiny for diving now, or saving up for additional training or a memorable trip away? We are often ignorant of our own (emotional) reactions, and so tend to believe that we will value a greater reward to be received later more than a smaller immediate reward. However, as the opportunity to get the immediate reward approaches, its perceived value increases dramatically, thereby reducing the apparent effect of the future (larger) reward. Consequently, the choice between an imminent reward and a delayed one is, for most purposes, a mirage. This research from Ainslie was built upon by Kahneman and Tversky as part of the work that led Kahneman to win the Nobel prize (Tversky had passed away before the award was made).
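Ainslie modelled this effect with hyperbolic discounting, where a reward’s subjective value falls off as 1/(1 + k × delay). A minimal sketch of the preference reversal described above (the amounts, delays, and discount rate k are illustrative, not from the article):

```python
def hyperbolic_value(amount: float, delay_days: float, k: float = 0.1) -> float:
    """Ainslie-style hyperbolic discounting: value decays as 1/(1 + k*delay)."""
    return amount / (1.0 + k * delay_days)

# Choice: a smaller-sooner reward (100 units) vs a larger-later one (150 units,
# 30 days further away). Viewed from far in advance, the larger-later wins...
print(hyperbolic_value(100, 60) > hyperbolic_value(150, 90))  # False

# ...but once the smaller reward is imminent, preference reverses - the mirage.
print(hyperbolic_value(100, 0) > hyperbolic_value(150, 30))   # True
```

The crossover is the point Ainslie highlighted: nothing about the rewards changes, only their distance in time, yet the ‘choice’ flips.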


Kahneman and Tversky showed that people are risk-seeking in some situations and risk-averse in others, rather than being risk-averse in all cases as economists would have predicted. Their research showed that subjects tended to be risk-seeking when faced with losses, and risk-averse when facing gains. All else being equal, in situations of gain people prefer certain options to uncertain ones, and so they choose a sure thing over a gamble even when the expected values are the same. A simple example here would be a divemaster or instructor who has the choice: break the standards of the agency (might get caught) or refuse to break the standards and get fired (definite outcome). See Lanny’s story in Under Pressure about this very point. This isn’t about making gains (making money); it is about avoiding losses (not losing a job).
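The risk-averse-for-gains, risk-seeking-for-losses pattern can be illustrated with the prospect-theory value function. A minimal sketch using the median parameters Tversky and Kahneman estimated in their 1992 follow-up work (alpha = 0.88, lambda = 2.25); the stakes are illustrative:

```python
def pt_value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Prospect-theory value function: concave for gains, convex and
    steeper (loss aversion, factor lam) for losses."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# Gains: a sure 50 beats a 50/50 gamble on 100 (risk-averse), even though
# the expected values are identical.
print(pt_value(50) > 0.5 * pt_value(100))        # True

# Losses: a 50/50 chance of losing 100 feels better than a sure loss of 50
# (risk-seeking), again with identical expected values.
print(0.5 * pt_value(-100) > pt_value(-50))      # True
```

The asymmetry comes from lambda > 1: losses loom larger than gains, which is why the certain loss (getting fired) can push someone towards the gamble (breaking standards).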

The anchor point is important when looking at losses and gains because the effect of gains/losses is not constant. Small changes in probabilities are indistinguishable unless they shift one from uncertainty to certainty (or vice versa). People will thus pay more for a reduction of 1% in their chance of getting a dread disease if this moves them from a 1% chance to a zero percent chance, but they will pay considerably less for this same 1% reduction if it moves them from a 5% chance of getting the disease to a 4% chance.
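The dread-disease example reflects what Tversky and Kahneman called the certainty effect, which they captured with a probability weighting function that over-weights changes near 0% and 100% relative to the same change mid-range. A minimal sketch, assuming their 1992 median parameter gamma = 0.61 (this illustration is mine, not from the article):

```python
def w_tk(p: float, gamma: float = 0.61) -> float:
    """Tversky & Kahneman (1992) probability weighting function."""
    num = p ** gamma
    return num / (num + (1.0 - p) ** gamma) ** (1.0 / gamma)

# The same 1-percentage-point reduction in the chance of the disease feels
# much bigger when it buys certainty (1% -> 0%) than when it does not
# (5% -> 4%).
to_certainty = w_tk(0.01) - w_tk(0.00)
mid_range = w_tk(0.05) - w_tk(0.04)
print(to_certainty > mid_range)  # True
```

With these parameters the jump to certainty carries several times the decision weight of the mid-range reduction, which is why people will pay far more for it.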


These last two points could be used to argue why divers (and people at large) buy shiny stuff rather than invest in education and development – equipment is tangible whereas education is vague. Education and development require effort, and we know how cognitively efficient (lazy) people are!!

Framing and Location/Cultural Effects 

This final point highlights that how we frame a decision or argument can have a huge impact on the decisions made by individuals and groups, as well as on cultural norms. Location doesn’t just refer to geographical location, but also to where you sit organisationally. A divemaster or instructor might recognise the very real risk of injury or litigation because they are exposed to it every day, but a training agency staff member who does not dive much, and doesn’t have to make the constant trade-offs and adaptations needed to remain commercially viable, doesn’t see the importance of dealing with this messy problem. The simple answer is ‘follow the rules’, but the rules don’t always, and can’t always, apply!


How to address these biases?

First off, biases just are. They are often portrayed as negative, but they are also extremely useful. Fundamentally, they reduce our cognitive overhead so we can focus on other elements, especially when we are time-limited (or perceive ourselves to be).

Education is a critical factor, but it only works ahead of time, or if there is an ‘operational pause’ which allows a certain type of thinking (System 2) to be engaged. This pause is needed because once we are in the tunnel of bias, we are unlikely to spot what is going on outside the tunnel until something happens, and then another bias kicks in: hindsight bias. When Daniel Kahneman was asked, “given your knowledge about cognitive biases, are you immune to them?”, he said, “No! I am better prepared to know when they are present, but I will still succumb in certain situations.”


If you are time-pressured, have incomplete information, or are fatigued or stressed (or subject to other performance-shaping factors or error-producing conditions), then you need to slow down. While I am a huge fan of checklists, they are not a panacea. What they do is purposefully slow us down and force us into System 2 behaviours. However, to be effective they have to be well-designed, they have to sit within a supportive cultural and social environment, and their use has to be role-modelled.

The education that The Human Diver provides helps with managing uncertainty, not risk. We don’t have a clue what the numbers are when it comes to managing risk, other than to say they are not worth the paper they are written on, and people won’t use them anyway! A bit like most risk paperwork! However, these numbers provide a comfort blanket, an anchor, to determine whether something is ‘safe’ or not. Safety is a social construct, not an absolute. So is an acceptable level of risk!

Reference: Heimer, C. A. (1988). ‘Social Structure, Psychology, and the Estimation of Risk’. Annual Review of Sociology, 14.

Gareth Lock is the owner of The Human Diver, a niche company focused on educating and developing divers, instructors and related teams to be high-performing. If you’d like to deepen your diving experience, consider taking the online introduction course: it will change your attitude towards diving, because safety is your perception. Visit the website to find out more.