Incompetent and Unaware: You don't know what you don't know...

Gareth Lock | Apr 28, 2016

This is a summary of my presentation at TekDiveUSA, as recordings weren't allowed. The interactive presentation was just short of an hour, so this is a bit of a long post!!

The simple premise is that we don't know what we don't know. Even worse, we don't know that we don't know!! This double hit in competency was highlighted by two social psychologists, Dunning and Kruger, when they ran a series of experiments to assess what their subjects thought of their own knowledge. The tests covered logic, humour, and grammar, and showed how prevalent the problem is.

The images below show the issue at hand. Pretty much everyone thought they were above average (scores of 60-70%), but not exceptional. However, if you look at those who scored in the bottom quartile, the lowest scores were in the order of 12-15%, even though they thought they had scored 55-60%. The other noticeable result was that those who were well above average underestimated their own abilities. At the same time, those in the top quartile couldn't understand why the 'incompetent' couldn't do the tasks at hand; they had excessive expectations of others, perhaps fueled by a misplaced confidence in others' abilities.
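To make the 'double hit' concrete, here is a minimal sketch of that calibration gap. The figures are rough numbers pulled from the description above, not the exact data from Kruger & Dunning's 1999 paper.

```python
# Rough, illustrative figures only - not the exact study data.
quartiles = {
    # quartile: (actual percentile, self-estimated percentile)
    "bottom": (13, 58),  # the double hit: poor score, high self-estimate
    "top":    (86, 74),  # strong performers under-estimate themselves
}

for name, (actual, perceived) in quartiles.items():
    gap = perceived - actual
    direction = "over" if gap > 0 else "under"
    print(f"{name} quartile: {direction}-estimates by {abs(gap)} percentile points")
```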

So what has this got to do with diving?

Those divers who sit below the horizontal line have an over-inflated perception of their capabilities, and that includes me as the author. I also have my own perception of what counts as 'already known' information!! It hits us all!!

This over-inflated perception of knowledge and understanding means we create models of the world that are not necessarily correct, because they are based on our own experiences. We need experience to develop credible but creative models of the 'what ifs' we constantly face in diving. This mismatch between reality and our models is known as complacency.

Our brains have a limited capacity to process information: actively, around 7 ± 2 pieces of information; passively, on the order of several thousand bits; yet our senses pick up millions of pieces of information! The brain sheds the excess by filtering out whatever is not perceived to be relevant. Unfortunately, we don't know what was actually relevant until after the event, when something has gone wrong.

Therefore, we have two massive biases working against us when we try to improve our behaviors: outcome bias and hindsight bias.

Important: Watch this video to the end before moving down the page.

CLICK >> WINGSUIT VIDEO

So, the wingsuit flyer made it through the crack. Thoughts? Awesome experience? What if he had gone splat on the cliff face? What would your thoughts have been then? Probably fairly derogatory and negative. Now consider what happens when we read an accident or incident report from diving. How many times do we immediately jump to the conclusion that they were stupid? The problem is that you know the outcome, so you assume they could have spotted what was going to happen and prevented it.

Now watch this video. Every time there is a blank or a fade, ask yourself: what is going to happen next?

CLICK >> DIGGER VIDEO

Did you really think that was going to happen? How many of you would have thought about checking the crucial piece of equipment? Now think about diving. How often do you check EVERYTHING that could prevent an accident or incident? How often do you analyze your gas? Every dive? What about pre-dive checks? Every dive? Have you ever completed a dive and found something that should have been checked and wasn't? Don't worry if you have, you are human!!

What you see is all there is… With hindsight, we start to pick out the information that was relevant because we know what the outcome is. At the time, you don't know what really matters among the millions of pieces of information out there.

As Professor Sidney Dekker puts it in his book The Field Guide to Understanding 'Human Error', illustrated by these two brilliant cartoons.

To improve our decision making we need good situational awareness: the ability to notice (not just see/hear/feel) what is going on around us, process that information to determine what it means to us right now, and then project it into the future. Then we go around the loop again, picking up changes.

This diagram from Endsley's 1995 paper on a theory of SA is quite detailed, but it shows why experience [at the bottom] is so important. We need experience to determine what is relevant at the perception and comprehension stages. Beginners can see and understand something (SA Levels 1 & 2) but are not able to project (SA Level 3). So when someone makes a mistake in a novel environment, it shouldn't be unexpected. As long as they survive, they will probably learn from it and apply it to their next encounter. Training therefore has to be relevant and delivered in the environment in which it will be used.
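To make the three levels concrete, here is a minimal sketch of that loop in Python. The cues and the experience table are invented for illustration; the point is that Level 3 only works when experience links a cue to what it is likely to become.

```python
# Toy rendering of Endsley's three SA levels as a loop.
# The cues and the EXPERIENCE table are hypothetical.

EXPERIENCE = {
    # cue -> what an experienced diver projects it will become
    "breathing rate rising": "stress building; gas consumption will spike",
    "buddy's light waving": "buddy wants attention; possible problem",
}

def sa_loop(cues):
    for cue in cues:                      # Level 1: perceive the cue
        meaning = f"noticed: {cue}"       # Level 2: comprehend what it means now
        projection = EXPERIENCE.get(cue)  # Level 3: projection needs experience
        if projection is None:
            print(f"{meaning} -> novice: no idea where this leads")
        else:
            print(f"{meaning} -> projects to: {projection}")
    # ...then go around the loop again as the situation changes

sa_loop(["breathing rate rising", "strange hiss from a valve"])
```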

Checklists

An easy answer to limiting mistakes in aviation has been the application of checklists. These certainly have their place in improving situational awareness by ensuring that, immediately before an 'operation' is executed, things are as they are supposed to be - they baseline the model. However, for checklists to be effective they have to be easy to use, they need to be no more than 7-9 items long, a team needs to be formed, and communication skills need to be employed to ensure two-way communication takes place. As Ken Catchpole, Professor of Human Factors at the Medical University of South Carolina, stated in his paper 'The Problem with Checklists': "…a checklist reliant on teamwork for success may fail despite all the items being followed, because those team skills were insufficient."
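As a sketch of what those constraints look like in practice, here is a hypothetical challenge-response pre-dive checklist in Python. The items and the nine-item cap are illustrative, not any agency's standard.

```python
MAX_ITEMS = 9  # longer lists tend not to get used

# Hypothetical pre-dive items - illustrative only
PRE_DIVE_CHECKLIST = [
    "Gas analysed and cylinder labelled",
    "Valves fully open",
    "Regulators breathe on both posts",
    "Wing/BCD inflates and holds",
    "Weights secure and ditchable",
    "Computer on, correct gas selected",
    "Dive plan and turn pressure agreed",
]

def run_checklist(items, respond):
    """Read each challenge aloud; `respond` stands in for the buddy's
    spoken reply, so an unconfirmed item fails loudly rather than
    being skimmed over in silence."""
    assert len(items) <= MAX_ITEMS, "checklist too long to be usable"
    for item in items:
        answer = respond(item)  # two-way communication: challenge and response
        if answer.strip().lower() != "checked":
            raise RuntimeError(f"Not confirmed: {item}")
    print("Checklist complete - baseline established.")

# Example with a compliant buddy:
run_checklist(PRE_DIVE_CHECKLIST, respond=lambda item: "checked")
```

Note the design choice: the checklist structures a conversation between two people rather than acting as a solo memory aid, which is exactly the teamwork dependency Catchpole describes.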

Safety has been improved in aviation not directly because of checklists, but because checklists are part of a safety system where crews are taught leadership, communications and assertion skills, teamwork and the limitations of human performance. Checklists provide the structure for effective communication and decision making.

Decision Making

One of the reasons checklists can help is the way we make decisions. Daniel Kahneman, in his book "Thinking, Fast and Slow", describes the two main ways our brains work:

System 1: fast, without apparent rationale, requiring little or no effort.
System 2: logical, thoughtful, slow, requiring effort.

The problem is that for humans to operate at the pace we do, we need to work primarily in System 1. This means we take lots of short-cuts, because that is efficient. The trouble comes when we don't have complete situational awareness (due to perceived irrelevance, distraction, lack of experience, etc.), because we can then make poor decisions as a consequence. Watch this video to see what I mean.

CLICK >> SELECTIVE ATTENTION

The problem with this decision-making process in safety-critical situations is that if the adverse event that could happen doesn't, we create a new baseline for what is 'safe', and not until something bad really does happen do we realize how far we have drifted from the original baseline. This is known as 'normalization of deviance'. Unfortunately, humans are known to migrate towards higher-risk situations, so it shouldn't be unexpected when incidents happen or people break 'rules'. This is why checklists can help.

Given that we only get feedback when something adverse happens, we don't know where the unacceptable boundary is until we fall over the edge. So the best we can do is keep ourselves in the safe area with good teamwork, which keeps us accountable; checklists, which limit deviation; and communication skills, so we can challenge when things aren't going to plan.
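This drift is easy to show with a toy model. The numbers below are invented; the point is only the shape of the problem: every shortcut that goes unpunished quietly becomes the new normal, while the real boundary stays invisible until it is crossed.

```python
import random

random.seed(1)  # reproducible toy run

perceived_safe_limit = 0.0  # where we think the edge of 'safe' is
hard_boundary = 10.0        # the real edge - invisible until crossed

for dive in range(1, 40):
    # each dive pushes slightly past what felt safe last time
    behaviour = perceived_safe_limit + random.uniform(0.0, 1.0)
    if behaviour >= hard_boundary:
        print(f"Dive {dive}: incident, {behaviour:.1f} units from the original baseline")
        break
    # nothing bad happened, so this level quietly becomes the new normal
    perceived_safe_limit = max(perceived_safe_limit, behaviour)
else:
    print("No incident observed - which is not the same as safe")
```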

When you hear a statement like "If you don't follow this checklist (or safety message) you could die", and then you don't get hurt and no friend dies, you reduce the value or weight you apply to that statement, and you start drifting… If you open an aircraft operating manual, it doesn't say on page one "Caution: not following the instructions in this manual could get you killed." I understand the reasons for including such statements (limiting liability), but they are unlikely to have the intended effect.

Safety is an emergent property which comes from valid risk perception and acceptance. However, you can't perceive a credible risk unless you have experience, and you can only accept the risk you can see. This is why experienced divers and engineers sometimes use statements like 'they [meaning the user] have to accept the risk': if you are in the top quartile (top right of the graph at the top) you have a different, more complete view of the risk than someone in the bottom left, who is therefore unable to understand the full risk they are accepting.

Moving Forward

I have mentioned many issues that make our diving less safe. What can we do about it, especially if we are unable to see our own failures or the knowledge and skills we lack?

First off, we need to learn to talk honestly about failure. Failure is normal; that is how we 'grow up'. But we can only talk about it if the community recognizes that we will make mistakes, even obvious ones!! If people open up about how their incident or event occurred, 'shouting' at them for being stupid achieves three things: they won't talk about future events because they are scared; they may conclude they are stupid and can't dive, and leave the sport; and others who are watching won't report their own incidents. Training cannot provide for every eventuality, which means we have to learn from others' mistakes and errors.

In addition, we have to learn about human fallibility and the cognitive (mental) biases we suffer from. Hindsight and counterfactuals are great if you want to see what someone should have done, but they don't help identify why it made sense to them at the time. The vast majority of people have an innate sense of self-preservation, so they don't choose to do something that will end with them dead. They make constant risk assessments and judgements based on their previous experiences and on the behaviors of others, especially those in positions of authority. This is one reason role-model behavior is so important.

CLICK >> LIFT VIDEO

Resources

Dekker's videos on Just Culture - this short course by world expert Sidney Dekker should be considered essential viewing for anyone involved in incidents (investigators, subjects, or those reading about incidents).

For a reading list of additional information, please visit https://www.thehumandiver.com/pages/reading-list.

Human Factors Skills in Diving Online Class - an introduction to human error and the human factors skills (improved situational awareness, decision making, communications, leadership/teamwork/followership, and the effects of stress and fatigue). More than 2.5 hrs of theory with some practical exercises and a detailed case study.

Human Factors Skills in Diving Two-day Class - an in-depth practical and theory classroom-based course using computer-based simulation to develop these skills in a practical manner. It is run in a classroom because it would be impossible to achieve the same knowledge transfer in a diving environment in two days; the interactions, communications, and stress levels are also far more easily controlled in the classroom! The class is currently aimed at tech/cave/CCR divers, Instructor Trainers, Instructors, and divers with an interest in human factors and human performance. Course dates can be found here.



Gareth Lock is the owner of The Human Diver, a niche company focused on educating and developing divers, instructors, and related teams to be high-performing. If you'd like to deepen your diving experience, consider taking the online introduction course; it will change your attitude towards diving, because safety is your perception. Visit the website.