
Four Ways We Talk About 'Human Factors' in Diving

April 29, 2026 · 9 min read

When something goes wrong underwater, "human factors" is usually the first phrase out of somebody's mouth. A fatality report concludes that the cause was "human factors". A training agency revises a procedure because of "the human factor". An instructor trainer delivers a weekend course on "human factors for divers". A systems thinker talks about "human factors in the diving system".

On the surface, these all sound like the same thing. They are not. Steven Shorrock, one of the clearest voices in the wider human factors community, wrote a short series of articles a few years ago pointing out that the phrase is used in at least four very different ways, each carrying quite different assumptions about where problems come from and what we should do about them. His examples were drawn from aviation, air traffic control and healthcare, but the pattern maps almost perfectly onto diving. If we want to move the industry beyond compliance theatre and towards genuine learning, we need to be clear about which version we are actually using.



1. "The Human Factor" — the colloquial shorthand

The first version is the one that appears in press releases, coroners' comments and online forums. It is rarely defined. It usually means something like the person did something they shouldn't have, or failed to do something they should have. In diving, this is the "the diver panicked", "the instructor got complacent", "the buddy failed to check the gas" version of human factors. A diver dies in a cave they weren't trained for, and the verdict is "human factors". A rebreather diver loses consciousness at depth, and the verdict is "human factors". The phrase does a lot of work in those sentences, but almost none of it is useful.

The attraction of "the human factor" is that it seems to point at something solid — a person, with agency, who made choices. And that is not nothing. People are not passive components in a system; they have intentions, goals, responsibility. A humanistic reading of this phrase would say: the diver was a whole person, embedded in a life, with their own history and their own values, not a set of data points. That is worth holding onto.

In practice, though, this version of "human factors" almost always collapses into blame. It frames the diver — usually the one at the sharp end, the one no longer able to answer back — as the unreliable part of an otherwise well-designed activity. And from that framing flows a predictable set of responses: more rules, more signatures on more forms, more emphasis on "personal responsibility", occasional public shaming on social media, and the quiet entry of phrases like "bad apple" into the community's vocabulary. Sometimes those responses are appropriate. Usually they are not, because they treat the person as if the person were the whole system.


2. "Factors of Humans" — what's going on inside the diver

The second version goes down and inwards, into the human being themselves. This is the version most non-technical skills courses are built on. We talk about attention, perception, memory, situation awareness, decision-making, workload, stress, fatigue, narcosis, CO₂ retention, the cognitive effects of cold. We talk about System 1 and System 2 thinking. We talk about slips, lapses and mistakes. We talk about inattentional blindness, and why a diver can swim straight past a guideline jump they've laid themselves. We talk about the startle response, and why a free-flowing second stage at forty metres feels like the end of the world for three seconds before the trained response kicks in.

This is a genuinely useful topic. Decades of cognitive psychology and physiology give us a reasonable picture of what a human being is actually capable of, and where the limits sit. Understanding that our working memory is small, that our attention is selective, that fatigue degrades judgement long before it impacts motor skills — these things help divers and instructors make sense of their own experience without reaching for shame. When a well-trained diver makes a gas-switching error at a deco stop, it is often more helpful to understand it as a predictable consequence of how human attention works under load than as a character flaw.

But a "factors of humans" account, on its own, has real limits. It reduces a person's lived experience to a list of cognitive components, and then often to a score on a questionnaire. Workload becomes a number. Situation awareness becomes a score out of ten. The person's actual experience — what the water looked like, what the team felt like, what the pressure from the operator felt like, what the silt-out sounded like — gets stripped out. And because the concepts stay inside the diver's head, it struggles to explain why the same diver performs beautifully on Tuesday and falls apart on Saturday. The factors changed. The context changed. But a strictly internal account cannot see that.



3. "Factors Affecting Humans" — the world around the diver

The third version of human factors looks outwards. It includes everything that shapes the diver's performance from outside: equipment design, procedures, training standards, supervision, team dynamics, weather, visibility, current, depth, temperature, workload, schedule pressure, surface logistics, commercial pressure, regulatory expectations, cultural norms in the local dive community. The HSE's familiar definition — environmental, organisational and job factors that influence behaviour — sits squarely here.

This is an important step forward, because it shifts attention towards making it easier to do the right thing. It acknowledges that when a diver gets into trouble, the causes usually lie at least partly in the situation they were diving in, not only in the diver themselves. Two identical second stages with no tactile or visual difference between them. A checklist that is completed before the actions are actually done, because it's laminated to the kit bag rather than built into the pre-dive flow. A roster that puts a tired instructor on a third teaching dive of the day below the thermocline. A training standard written for an imagined student who doesn't exist in the real shop. Once you look, these factors are everywhere, and they shape what divers actually do far more powerfully than most exhortations to "be safer" ever will.

The limitation of this view is not that it is wrong, but that it is still mostly linear and mostly reductive. It lists factors as if they were independent ingredients. In real diving, they interact. Pressure from a paying customer interacts with an instructor's fatigue, which interacts with a slightly out-of-trim student, which interacts with a current that wasn't in the brief, which interacts with a team that hasn't dived together before. Picking any one factor and trying to "fix" it often moves the problem somewhere else or creates a new one. A tick-box pre-dive checklist reduces one class of errors and produces another: divers who complete the list without doing the checks. This is not a failure of the checklist. It is a failure to see that factors do not sit still while we intervene on them.


4. Socio-Technical System Interaction — the diving system as a whole

The fourth version is the one that makes the biggest impact on diving and diver safety, and the one that is hardest to communicate! It treats diving not as a collection of divers, kit and rules, but as a socio-technical system — a set of interactions between people, equipment, procedures, organisations, regulators, commercial pressures and the physical environment, playing out across time and across scales. The focus moves from factors to interactions. From what went wrong to how work actually gets done.

In this view, the Linnea Mills case, the Brian Bugge case, the Dylan Harrison case, the double fatality on the Scylla, the triple fatality in Chac Mool and, far less emotively, the quiet near-miss on your last training weekend — none of these are reducible to a single cause or a single person. They are emergent outcomes of a system that, on most days, works well enough to hide the drift. Work-as-imagined by the agency standards writer is not work-as-prescribed by the local instructor's adapted procedure, which is not work-as-done on a cold Sunday morning with a distracted student, which is not work-as-disclosed in the post-dive debrief, which is not work-as-judged by the inquiry panel two years later. All five versions are real, and the gaps between them are where the learning lives.

This kind of human factors is harder to sell because it refuses simple answers. The honest reply to most questions in diving — should I dive this profile? should I train this student? should we run this course? was this accident avoidable? — is "it depends", and then a careful mapping of what it depends on. That is unsatisfying if you want a rule, and essential if you want to perform well in conditions the rule didn't anticipate.

It is also the only version that does justice to both sides of what we claim to care about: system performance and human wellbeing. Production, safety, capacity, learning, meaning, satisfaction, pride in the work. None of those reside inside the diver alone, and none of them sit inside the procedures alone.

So what?

If you are an instructor, a dive leader, or anyone trying to improve how diving is done, it is worth asking, every time the phrase comes up: which human factors are we actually talking about? If the answer is:

  • "the human factor", we are almost certainly about to blame someone.

  • "factors of humans", we are probably about to run a course on decision-making and call it done.

  • "factors affecting humans", we are probably about to change a procedure and hope it holds.

Only when the answer is "how the diving system interacts with itself" are we likely to produce the kind of change that actually lasts — and the kind of learning that makes us, genuinely, better than yesterday.

Systems thinking is not an excuse to scatter responsibility into the ether. People still make choices, and those choices still matter. But choices are made inside systems — context and previous experience shape each 'decision' in the moment — and if we ignore the system, we will keep being surprised by outcomes that were, in retrospect, entirely predictable.

Gareth Lock is the founder of The Human Diver and Human in the System — two organisations built on a single conviction: that most unwanted events in high-risk environments are system failures, not people failures. Through structured courses, immersive simulations, incident investigation, and keynote speaking, he brings frameworks from military aviation and academic human factors research into the practical reality of diving and high-risk industry. His work spans recreational and technical divers learning non-technical skills for the first time, through to senior safety leaders restructuring how their organisations investigate, debrief, and learn. Everything sits under one guiding principle: be better than yesterday.



© 2026 The Human Diver