
What Happens Underwater, Stays Underwater — And That's a Problem
Part 1 of 3: The Problem Space
This is the first in a three-part series drawing on the research behind Storytelling to Learn: What Happens Underwater, Stays Underwater — a mixed-methods MSc thesis completed at Lund University's Division of Risk Management and Societal Safety. This blog covers the problem and the existing literature. Part Two examines the data. Part Three sets out what needs to change.
In 2020, an 18-year-old diver named Linnea Mills entered the water in Glacier National Park overweighted, with her drysuit's low-pressure hose disconnected due to incompatible fittings. She had no functional positive buoyancy. She descended to 40 metres and drowned. Court filings subsequently revealed not just individual failures but compounding systemic weaknesses in instructor oversight, training standards, and the quality management of the dive centre involved. The case was eventually settled out of court.
Her story is not unusual in its structure, only in the degree to which it entered the public record. Across the world, every year, divers experience near-misses, close calls, and adverse events, and most of those stories go nowhere. They are not analysed. They are not shared. They do not become the raw material for learning. They stay underwater.
That is what I set out to explore in the MSc thesis I developed over 2023–2024.
Diving Operates Differently From Other High-Risk Domains
SCUBA diving is a high-risk leisure activity conducted in an environment that does not sustain human life. The decisions made under pressure, literally and figuratively, have real consequences. And yet, unlike military aviation, commercial aviation, healthcare, or nuclear energy, the sports diving industry operates in a largely unregulated space. Training standards are set by agencies that maintain intentional distance from the instructors who deliver that training, which limits accountability and, more importantly, limits learning.
The industry talks a great deal about risk management. But given the near-total absence of reliable data (on numbers of dives conducted, near-misses experienced, or the contextual factors surrounding incidents), what most stakeholders are actually managing is uncertainty, not risk. Uncertainty, as Gigerenzer (2014) notes, is managed not through data but through emotions, heuristics, and cognitive biases: recency, severity, outcome, and hindsight. This is a fundamentally different cognitive mode, and it produces fundamentally different decisions.
The formal incident reporting systems that do exist, those of Divers Alert Network (DAN), the British Sub-Aqua Club (BSAC), and a handful of national bodies and organisations, were developed primarily through the lens of diving and hyperbaric medicine. Their focus tends to be epidemiological: how many deaths, from what proximate causes, leading to what outcomes? Decompression sickness. Equipment failure. Ascent problems. These are not uninformative, but they are incomplete. They almost never reach the systemic, organisational, and cultural factors that made the outcome more likely. The unit of learning is the individual, not the system.
The Aviation Parallel: What Can Be Learned?
In 1975, the Secretary General of the International Air Transport Association told a global conference something that now seems obvious but was, at the time, radical: that describing accidents as "pilot error" was, at best, misleading, and at worst, irresponsible. That watershed moment, combined with the introduction of flight data recorders and cockpit voice recorders, gave aviation the ability to tell context-rich stories following adverse events. This meant that stories could be examined, shared, and learned from without the immediate reflex to assign blame. The research shows that the more context a narrative presents, the less judgement it attracts.
The diving industry, in the view of this research, sits roughly where civil aviation was in the 1970s. The dominant narrative, embedded in the training manuals of PADI, IANTD, NAUI, and SDI/TDI, attributes incidents to poor judgement, overconfidence, lack of awareness, and recklessness: language that places causation firmly with the individual. This is Individual Blame Logic (IBL), and while it satisfies the psychological desire for a clear, simple narrative, it obscures everything that made the situation possible in the first place.
The concept of Work as Imagined versus Work as Done is well established in the safety science literature. Organisations design procedures as if work will unfold in predictable, controlled conditions. What actually happens on a dive boat, in a cave, or at the back of a dive centre at the end of a long day looks quite different. Those deviations and adaptations — normal, human, rational responses to the conditions at hand — are exactly what context-rich stories can illuminate. Without them, the learning stays shallow: single-loop, focusing on proximal issues, rather than double- or triple-loop learning, which looks up and out instead of down and in.
What Is a Context-Rich Story?
For the purposes of this research, a context-rich story was defined as one that does not simply recount the actions immediately before or after an incident, but includes the social, cultural, technical, and environmental factors that were present, and that may have been building for days, weeks, or months prior. These are the performance influencing conditions that make it easier to do the "wrong" thing and harder to do the "right" thing.
The difference in learning value between a simple narrative and a context-rich one is substantial. A simple story says: diver ran low on gas, surfaced rapidly, got decompression sickness. A context-rich story says: diver was a late replacement on the team, equipment checks had been rushed because the boat was running behind schedule, the dive plan had not been revised to reflect a current that was running faster than forecast, the team leader had not dived with this configuration before, and the diver had been reluctant to raise concerns because of the perceived status of their buddy. Those factors, taken together, tell a very different story about causation — one that allows much more meaningful intervention.
Stories without context, as Snowden (2002) observed, are largely meaningless. Context is what allows sense-making to occur. And sense-making is what allows organisations and individuals to improve.
The Literature: What We Know From Other Domains
The academic and practitioner literature on learning from incidents is extensive — in aviation, healthcare, nuclear, construction, and oil and gas. Across these domains, a consistent set of conditions has been identified as necessary for effective incident reporting and learning.
Reason's (2016) model of safety culture describes five interdependent sub-cultures: reporting culture, just culture, informed culture, learning culture, and flexible culture. These are not independent. You cannot have a functioning reporting culture without a just culture to underpin it — because people will not report what they believe will be used against them. You cannot have an informed or learning culture without data flowing in from reporting systems. The five cultures are a system; weakening one weakens all.
The aviation community has embedded this understanding in legislation and practice. NASA's Aviation Safety Reporting System provides reporters with protection from enforcement action if they voluntarily disclose safety concerns, which has generated one of the largest and most valuable safety databases in the world. The European Union's Regulation 376/2014 creates explicit legal protections for reporters in aviation. Nothing equivalent exists in sports diving.
In healthcare, similar barriers to reporting have been well documented. Waring (2005) identified that the fear of blame is frequently cited as the primary obstacle to reporting, but argued that this framing overlooks a range of other cultural factors that are equally influential. In the military, debriefs and learning reviews, from individual sorties to large-scale exercises such as RED FLAG and COPE THUNDER, have normalised the routine examination of what went wrong and why, without the immediate assumption that someone must be sanctioned for it.
The research also draws on the literature concerning how individuals and organisations actually learn from incidents. Drupsteen and Guldenmund (2014) distinguish between the social and cognitive processes involved: individual learning from incidents requires not just exposure to information, but active participation, reflection, and a sense-making environment that supports counterfactual thinking. Organisational learning requires something broader: a willingness to look beyond the immediate event and examine the structural conditions that allowed it to develop.
The literature on just culture is particularly relevant here. Dekker (2017) has argued extensively that the question in a just culture is not simply "who made the error?" but "what made this error likely, given the conditions in which people were working?" The line between acceptable and unacceptable behaviour exists in every just culture, but its location matters less than the process by which it is drawn and, importantly in the diving space, who is involved in drawing it. Too often that line is drawn by social media judgement or by the lawyers.
The Gap and the Research Question
Despite this body of knowledge existing in adjacent domains, very little of it has been applied systematically to sports diving. The author's own 2011 white paper applying the Human Factors Analysis and Classification System (HFACS) to diving incidents was met with resistance from training agencies. Subsequent work applying tools like AcciMap has demonstrated that systemic analysis is both possible and illuminating in the diving domain, but it has not become standard practice.
The research identified a specific and underexplored gap: not just whether stories are being told in diving, but what factors influence whether they are told at all. That is a narrower and more actionable question than "why doesn't diving do incident reporting better?" It is focused on the conditions — social, cultural, organisational, technical, psychological — that shape whether a diver decides to share a context-rich story following an adverse event.
The research question was therefore: What are the factors that influence the telling of context-rich stories that could facilitate learning following an adverse event in sports diving?
The answer — drawn from 676 online survey respondents, four focus groups, two lawyer interviews, and an unsolicited written account from a diver who had witnessed a fatality — is the subject of the second blog.
Part 2: The Data — What Divers, Instructors, Cave Divers, and Lawyers Actually Said (11 April)
Part 3: Learning or Blaming — The Choice the Diving Industry Needs to Make (12 April)
References:
Dekker, S. (2017). Just culture: Restoring trust and accountability in your organization (3rd ed.). CRC Press, Taylor & Francis Group.
Drupsteen, L., & Guldenmund, F. (2014). What is learning: A review of the safety literature to define learning from incidents, accidents and disasters. Journal of Contingencies and Crisis Management, 22(2), 81–96. https://doi.org/10.1111/1468-5973.12039
European Union. (2014). Regulation (EU) No 376/2014 of the European Parliament and of the Council of 3 April 2014 on the reporting, analysis and follow-up of occurrences in civil aviation. Official Journal of the European Union, L 122.
Gigerenzer, G. (2014). Risk savvy. Viking.
Lock, G. (2011). The application of the Human Factors Analysis and Classification System (HFACS) to improve diving safety. https://drive.google.com/file/d/1Iz3qRRyo2NjdiBGbPcRhj14NoCTuuM4/view?usp=share_link
Mills v Gull Dive Center PADI (2022). https://www.scribd.com/document/555406095/Mills-v-Gull-Dive-Center-PADI-2nd-Amended-Complaint
Orlady, H. W., & Orlady, L. M. (2017). Human factors in multi-crew flight operations (1st ed.). Routledge.
Reason, J. (2016). Managing the risks of organizational accidents. Routledge. https://doi.org/10.4324/9781315543543
Snowden, D. (2002). Complex acts of knowing: Paradox and descriptive self-awareness. Journal of Knowledge Management, 6(2), 100–111. https://doi.org/10.1108/13673270210424639
Waring, J. J. (2005). Beyond blame: Cultural barriers to medical incident reporting. Social Science & Medicine, 60(9), 1927–1935. https://doi.org/10.1016/j.socscimed.2004.08.055

