You can't learn from adverse events if you are going to blame

Jun 05, 2022
I recently made a social media post about the impact that punishment can have on learning. It was an extract from a research paper looking at what happens when you change the approach to 'investigations'.
"Contrary to the suggestion in accident reporting literature that punishment is a beneficial component of a just culture, both those who carry out and those who oversee the works believe otherwise. Where punishment is an element, a range of negative behaviours can emerge as a result...A consequence of all of this is future events are "brushed under the carpet" and the organisation experiences minimal learning because of the fear the accident process generates. The removal of punishment from the accident reporting system vastly improved the perception of the process and its intent. Participants believed they could be more honest as they knew they weren't going to get "sent to the gate" because of the information divulged and were "treated like a human"" - Heraghty, Dekker & Rae, 2021.
There were some interesting comments that followed this post regarding accountability: how do you deal with negligent behaviours or intentional sabotage? This is not unexpected given the level of litigation that exists in the modern, Western world.
How else do people 'learn' that these behaviours are unacceptable? Doesn't punishment act as a lesson to others not to do the same? Unfortunately, research shows that punishment doesn't change errant behaviours (McLeod, 2018); what it does is drive people to be more compliant with the rules, even if the rules don't add value.
Being ultra-compliant can lead to situations where the risk is transferred further down the timeline: if diver training is very prescriptive and based on formulaic scenarios, it won't prepare divers to deal with the variability of the real underwater world (the same goes for instructor development and dealing with 'real' students). Furthermore, punishment means that stories of 'normal work', and the continued challenges and conflicts that exist within it, get hidden, so the organisation and the wider (diving) community don't learn.
"The findings of this study show that the use of sanctions of any form as part of the accident analysis process damages the ability of an organisation to learn and to create mutual trust between management and the workforce...The study also raises moral and operational concerns for organisations that continue to use retributive mechanisms to deal with error...The results clearly indicate that organisations need to do more using a science-based approach to ensure their accidents are effectively managed and positive learnings are gained to avoid similar occurrences in the future." - Heraghty, Rae & Dekker, 2020. https://linkinghub.elsevier.com/retrieve/pii/S0925753520300746
Two examples were given in one of the social media posts about where punishment should be applied. Both have severe outcomes, so there will be some form of outcome and severity bias applied, and both will be influenced by hindsight bias.
The first example was the tragic case of Linnea Mills, a young lady who died on a drysuit diving course in Glacier National Park. Her drysuit was not set up correctly and did not have a working LP feed; she was overweighted; the site and time of day were not suitable for her experience level; and the instructors/supervisors didn't appear to have the experience needed to deal with the situation. The criminal case was rejected, and the matter is now the subject of civil litigation. There were numerous failures that led to this adverse event, and also post-event failures when it came to preserving evidence that would have helped identify issues. What has become apparent from the publicly available information is that the training agency knew about sub-standard practices at this dive centre before the fatal dive but didn't appear to do anything about it.
While looking higher than the instructor in the system appears to be good, doing so potentially moves the 'blame' from the instructors involved to the training agency, and blame again limits learning because a defensive approach is taken. What complicates matters for wider learning is the lack of organisational memory. After speaking with a number of training agency staff from different agencies, it appears to be the norm that when something goes wrong, telephone conversations are used to exchange information because these cannot be the target of 'discovery' in the event of a legal case.
If we want to look at organisational issues, what if we asked different questions to find systemic reasons why this event happened in the manner it did?
- Once certified as a diving instructor, the majority of diving instructors and instructor trainers/course directors are not required to undertake any formal, practical requalification process which checks their competencies with 'real' students. Some agencies do have a mandatory requalification process; however, it is not a 'no notice' check, so instructors will likely behave in a manner aligned with standards ('Hawthorne effect'). Therefore, are the current QC/QA processes based on ensuring the competency of the instructors, or about protecting the organisation by having a 'fantasy document' in place and transferring the risk to the instructor via an 'air gap' of responsibility?
- Does the agency have the resources to check the competencies of their instructors as part of understanding 'normal work'? Relying on instructors to self-police when it is known that individuals, teams and organisations will drift towards 'unsafety' or increased risk does not take a HF or systems view of safety. The likelihood of drifting is increased if commercial and workload pressures are present and there is nothing to bring the drift back into place.
- Does the agency have the resources to examine quality control issues to understand the difference between what is in the standards ('work as imagined') and what is actually taught ('work as done')? Does the organisation formally recognise the gap between 'work as imagined' and 'work as done'?
- Do the agencies and diving community at large have a level of psychological safety so that concerns (e.g., near-misses, close calls or sub-standard behaviours) can be raised?
"When speaking of the previous process, workers spoke of sticking to the line, “I didn’t see anything”, so as to not “dob on others” as they didn’t want to see anyone lose their job and themselves labelled as a “pariah” for inadvertently loading the bullets. Punishment was believed to be too often the action without fixing the real problems." https://linkinghub.elsevier.com/retrieve/pii/S092575352100093X
- Do the agencies and diving community at large have a Just Culture in place which allows learning to take place, or is it retributive in nature?
- Do the agencies and diving community employ a learning-based approach to adverse events? Do they have a structured format that can be used to look at accidents, incidents or near-misses?
The second example given was that of the recent grounding of the Socorro Vortex liveaboard in the Galapagos islands at night because there wasn't, allegedly, a night watch in place. Not having an effective watch at night isn't new. One of the contributory factors in the loss of 34 lives aboard the MV Conception was that the crew were unable to detect the fire in time because the night watch wasn't effective. I would hazard a guess that prior to this event, many overnight operations didn't have effective night watch capabilities (and some still might not). Unfortunately, on this occasion, a potential hazard (fire) became a real one, and the risk materialised.
Questions to ask here regarding the Socorro Vortex grounding:
- What were the conditions that led to the night watch not being effective?
- How often did the night watch fall asleep without any issue, if falling asleep was the reason they weren't effective?
- Is there a culture within the liveaboard operators that allows safety issues to be raised? (Psychological safety and Just Culture)
- How much extra would it cost to increase crew numbers so that adequate rest could be maintained, if that was an issue?
- Would the customer base be willing to pay that extra amount?
- How much has COVID-19 impacted the safety of liveaboard operations with financial margins being cut?
- How have previous near-misses been dealt with by company leadership and owners? Are they production/output focused?
Taking a different approach to understanding 'local rationality' can mean that wider issues are brought to the fore. This report from the Danish Maritime Accident Investigation Board looks at a grounding which happened while the Master was asleep while 'on watch' and takes a 'no blame' approach to understand how it made sense for the crew to operate as they did, including much wider systemic factors.
All of the above appears to be sound in a theoretical way. But how does it work in practice, right now?
- Change doesn't happen overnight, especially if there are cultural changes that need to be influenced. It will take time. Changing the language is probably the easiest thing that can be done. Moving from "Who is to blame?", "Why did they do that?" and "What were they thinking?" to "How did it make sense?" or "What did you expect to happen?" provides a very different set of answers. Consider how 'accident investigation' is perceived compared to 'Learning from Unintended Outcomes' or 'Learning Review'.
- Take a wider, systemic view of what has happened when looking at adverse events. Rather than focusing on the proximal causes, look at where else something like this has happened without ending up as an accident.
- Recognise and work to counter the cognitive biases that limit learning. This blog in GUE's InDepth magazine looks at a number of them. Look at the Chartered Institute of Ergonomics and Human Factors' Guide to Learning from Adverse Events for more examples of a changed approach to learning.
- Most errors are committed by good, hardworking people (divers, instructors and training agency staff) trying to do the right thing. Therefore, the traditional focus on identifying who is at fault is a distraction. It is far more productive to identify error-prone or error-producing conditions and settings and to implement systems that prevent these same diving stakeholders from committing errors, catch errors before they cause harm, or mitigate harm from errors that do reach divers, students or staff.
- Consider that "the lack of understanding of the impact of punishment following an accident is a significant risk to organisations. The continued use of punishment to manage safety causes significant damage to both the individual exposed to it and the organisation wielding it." Punishment doesn't have to happen at the organisational level to have an impact: naming and shaming by an online community with an incomplete picture of what happened can seriously limit learning and the willingness to come forward with learning opportunities. Consider the difference between how the 'safety stop' case at Stoney Cove, UK was treated on social media when one part of the story came out, compared to the final version of events reproduced here.
- Ask three questions: Who is hurt? What do they need? Whose obligation is it to meet that need? These come from Sidney Dekker's Restorative Just Culture Checklist.
You can blame, or you can learn, but you can't do both at the same time. The information you need to genuinely learn is often hidden if there is a hint that punishment will be delivered if the 'true story' is provided.
Gareth Lock is the owner of The Human Diver, a niche company focused on educating and developing divers, instructors and related teams to be high-performing. If you'd like to deepen your diving experience, consider taking the online introduction course, which will change your attitude towards diving, because safety is your perception. Visit the website.