With Errors: Aviation Blames the System, the Diving Community Often Blames the Individual
Sep 18, 2017
NTSB Finds ‘Blind Spot’ in SFO Radar After Near-Miss to Aviation’s Greatest Disaster. Reports the Mercury News: “The wayward Air Canada plane that nearly caused an aviation disaster at San Francisco International Airport [on July 7th] dropped off radar displays for 12 seconds in the moments before it approached four fully loaded passenger jets on the taxiway, according to new information released Wednesday from federal aviation officials investigating the incident. A source familiar with the investigation called it a ‘blind spot’ that is a half-mile from the start of Runway 28-Right and Taxiway C.” [The link includes cockpit audio]
In aviation, when a near-miss that could have become a catastrophe takes place, everyone involved, including pilots, airline companies, government agencies, passenger witnesses, air traffic control and airport administration, works together at a federal level through the NTSB and the FAA to find the cause.
One of the reasons multiple parties are involved is that, on average, it takes at least seven mistakes to cause a catastrophic error in aviation, so there is a lot to learn from taking apart a near-miss and looking at context-rich data across different areas. Fortunately, things have moved on, and investigators in the aviation industry now have the help of black box recorders, video surveillance systems, team-based communication protocols and practised simulation to maximise safety effectiveness. They also have a level of legal protection which shields candid information from the judiciary. Indeed, two years ago in the UK, the High Court prevented the release of cockpit video and audio materials to the police and legal teams conducting criminal investigations into two crashes.
Of note, even when a single individual’s actions are the sole contributor to an accident or near-miss, in aviation the ultimate goal goes beyond individual blame to changes in education and training programmes. An individual who has managed to fail at this level exposes a failure of the system somewhere, and that, we all agree, is a good thing to discuss and fix. In the case of Air Canada Flight AC759, it took only a month to expose a systemic failure.
Now consider what happens in the diving industry when someone puts their hand up and says, "We had a close call... here is what happened." Often the account includes missed checks, incorrect assumptions, poor briefing, inadequate assessment of the risks, changes which were obvious after the event... and so on. The armchair pundits start pointing fingers, saying "You should have done this or that," and don't look at the systemic or cultural issues at play. Hindsight bias provides a clarity which is not possible in real time. I have just seen two specific instances of this. One involved a four-person dive team who drifted at depth (70-80 m) and were separated from their chase boat because the skipper didn't see their dSMBs on the surface; they were picked up nearly 10 km from the drop-in point. The second was a fatality in the Far East, where a group of OW divers entered a wreck at 37 m using a single cylinder as their gas source; the inside of the wreck silted out, and one of the divers became lost inside and died.
The result of this online criticism? People stop telling the stories which allow others to learn from their experience. You can't teach everyone everything in a training class: there isn't enough time, nor does the instructor have enough experience. Therefore you have to learn from others' mistakes and near-misses in a manner which explains why decisions were made a certain way, even if that means the 'rules' were broken. The acceptable level of risk is a personal construct... with hindsight you are always better informed than the person was at the time.
The diving community does not have an FAA, a CAA, an EASA or whatever overarching regulatory body exists in a given territory (and I think one would add a level of complexity to a recreational activity that is probably not needed). Furthermore, diving does not have an independent investigative body such as the AAIB or the NTSB, and any investigation that does take place is often protectionist in nature due to the threat of legal action. Indeed, one agency's incident form says "This form is being prepared in the event of litigation" - what are the chances that the 'real' story will come out if it means admitting that rules, guidance or processes have been broken?
Human error is normal. If there are consistent errors by users at the sharp end, these are not individual issues; they are systemic issues. However, to identify where those systemic issues lie, we need to collect data in a manner which allows it to be collated and analysed using a standard framework, one which covers not just the proximal cause but also the systemic factors behind it. Aviation is as safe as it is because the industry has learned to recognise that human error and failure are normal, and to talk about them in a non-judgemental way.
Crucially, you can fire an individual, but if you don't change the system, the failure will continue to happen.
The Human Factors Academy provides human factors, non-technical skills and Just Culture training with the aim of improving diver safety, which generally makes diving both safer and more fun. After all, who wouldn't want to prevent an incident from happening...?