It's the little things that catch you out...
Apr 18, 2016
Humans like to do things efficiently, or as some call it, cutting corners. The problem is that when the configuration changes and the diver cannot recall the 'situation', it bites, and sometimes it ends with a dead diver.
Complacency is often cited as a key factor in diving fatality reports. The problem is that complacency only becomes apparent after the event, when something has occurred that shows us how far we were from the ideal. The mental processes that allow us to operate efficiently are the very same processes that can lead to an accident. Complacency can be summed up as the gap between our perceived model of the world and the reality of the situation. To save effort, humans build models of what they expect to happen around them so that we don't have to process vast amounts of information. If nothing changes, we run that mental model, making assumptions about what will happen in X seconds, minutes, or hours from now. If nothing major happens to bring our focus to the fore, we assume the model was correct. This can have major consequences if something does change and we don't pick it up, either because we weren't looking for it, or because it was below our threshold of detection.
A report released earlier this year showed that a USAF C-130 flight-deck crew had stowed a night-vision goggle storage case in front of the control column to make it easier to conduct an engine-running offload at an airbase in Afghanistan. When the offload was finished, they were still operating in a blacked-out cockpit, so the crew did not see the control restriction and proceeded to take off. The aircraft crashed shortly after take-off, killing all onboard, due to that control restriction and the crew's subsequent execution of an emergency drill for what they believed was a different problem. (The summary is much easier to read and navigate than the main report, and highlights the key issues.)
Another aviation story where experts have made mistakes...but what has this got to do with diving?
Consider a diver who is undertaking a video filming task. They are relatively inexperienced on their CCR (closed-circuit rebreather) and want to reduce the noise of the solenoid firing, which may be picked up by their camera system. In addition, they want their arms clear when operating the video equipment, so they move the handset controllers to the outside of their arms. They need good access to the viewfinder, so they move the HUD (head-up display) away.
Now they have no means of actively monitoring their pO2 within the breathing loop.
That's OK, because the system will inject O2 into the loop when the pO2 drops below a certain level, keeping it sufficient to sustain life. That is, until the O2 cylinder is turned off to reduce the noise on the video...
When we are task-loaded, which only means actively monitoring more than 7 ± 2 items at once, we have a poor appreciation of time, because time is something that has to be actively tracked. Consequently, it would be difficult to keep track of when the O2 needed to be turned back on without looking at a technical monitoring aid, e.g. a HUD or handset.
Unfortunately, the diver drowned when they ascended and the pO2 dropped below life-sustaining levels.
I am sure a significant number of people would immediately go, "How stupid was that? It was obvious it was going to end in disaster." Obviously it wasn't, because most people don't have a death wish...
Hindsight and Complacency
In hindsight, it is easy to see that the risk controls (solenoid and controller, handset displays, HUD) were being removed, and that the risk was likely to increase due to an inability to provide sufficient O2 for life. If the removal of these controls is essential, additional controls are needed, which could include team monitoring of the videographer and, if need be, interrupting the shot to allow O2 to be added to the breathing loop. Much better a lost shot than a dead diver. Better still is not to turn the O2 off at all, and accept that there will be noise which can be removed in post-processing.
We could easily judge this to be complacency, but incidents don't happen in isolation. Divers operate as part of a system which includes the standards frameworks, equipment manufacturers, training organisations, dive centres, instructors, social conformity pressures, authority gradients, task objectives or goals, the physical environment and the diver themselves. The list goes on. By focussing on the diver and their 'complacency', the chances of preventing future accidents like this are very slim. We need to take a wider, systems view of the problem.
Technical issues are relatively easy to solve but need to be reported to identify the failures. Social and commercial drivers are much more difficult.
However, the process can start with briefing and debriefing dives to set the baseline and expectations, including what controls are in place and what lessons can be learned from previous events. We need to create an environment where we can ask about or question someone else's configuration without fear of having our head ripped off, even if they are a superstar in the diving world. This would allow issues to be captured before divers get in the water. An external set of eyes will spot more than the diver's own. Sometimes the question might be rubbish, but better that than missing something essential.
The training provided by the Human Factors Academy covers effective communications including assertion skills, how to maintain situational awareness and recognise the signs when it is being lost, teamwork skills and the stages which every team goes through. This is available online or classroom-based.