In the mid-1980s, a flight safety film was produced which showed a Royal Air Force pilot walking out to his single-seat Jaguar fighter aircraft for a training sortie. He prepares the aircraft, starts it up and takes off down the runway. Unfortunately, the aircraft suffers an engine failure immediately after take-off. The pilot cannot jettison the external stores, and with a full fuel load and heavy external stores the aircraft does not have enough power to fly on a single engine; it crashes and the pilot is killed. It transpires that during the pre-take-off checks, the pilot forgot to arm the stores jettison system, so even though he tries to jettison the stores, they won't go because a safety system is in place to prevent an inadvertent release. It would be quite easy to blame this highly trained and professional pilot for forgetting to do something which was part of his pre-take-off checklist. However, the Royal Air Force recognised that it takes many things...
One of the key themes I teach is that safety is not necessarily the absence of accidents or incidents, but rather the presence of barriers and defences and the capacity of the system to fail safely. This quote comes from Todd Conklin, a researcher and practitioner working in the safety and human performance industry in the US. The idea is that we develop technical and non-technical skills and design the equipment, procedures and training and manage the environment so that risk is managed at an acceptable level. However, we can't design or manage everything beforehand, so we need to be able to handle those 'odd events' and then share the stories so that others may learn.
In rebreather diving, one of the best ways to ensure that the unit is safe to dive is to make sure that the build and final pre-dive checks are completed. However, even if this is done diligently on the surface including packing the scrubber and then executing the complete set of final checks, things can go...
What does safe mean to you? The dictionary defines safe as “protected from or not exposed to danger or risk; not likely to be harmed or lost”, and in the context of diving, we often think about the physical risks. These can include decompression sickness, animal-induced injuries, separation from the team/boat, entanglement, becoming lost within a cave system, or running out of gas. These are all credible negative outcomes which we should be concerned about. In fact, a number of them appear in the 2008 research paper from DAN (Common causes of open-circuit recreational diving fatalities), which examined triggers, disabling events/injuries and causes of death in diving, so they should definitely be considered as part of our risk management and diving plans.
But what about another form of safety? A form which Google, under Project Aristotle, identified as the key trait of high-performing teams, and without which nothing else really mattered. A form...
When things go wrong, or incidents/accidents happen, it is easy to identify how the problem could have been prevented by applying one of the following phrases: ‘If only they’d done A…’ or ‘They should have done B…’ or ‘They could have done C…’ or ‘I would have done D…’ We do this because we are trying to identify a way in which we could prevent the same thing from happening again in the future.
This is a natural reaction. We are trying to bring order to disorder, and this way of thinking is known as counterfactual reasoning. At its most basic, we believe that if the people involved had taken different actions, then the outcome would have been different. Unfortunately, in doing so we are applying non-existent facts to a story that has already happened in order to tell a different one, one with a happy ending.
Here are a couple of examples of counterfactuals in relation to diving:
Many of my readers will have heard me talk about Professor James Reason's Swiss Cheese Model and how it can be used to show how incidents develop because of holes in the barriers and defences which are put in place to maximise safety.
Professor Reason's research showed that at different levels within a system, there are different barriers or defences present, e.g. organisational, supervisory and individual. However, these defences can have holes in them because the organisations, supervisors and operators are all fallible, and therefore the defences cannot be perfect.
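The layered-defence idea can be sketched numerically. This is a minimal illustration, not something from Reason's work itself: the barrier names and failure probabilities below are hypothetical, and it makes the simplifying assumption that each barrier fails independently. An incident only occurs when the holes in every layer line up at once, which is why several imperfect barriers can still produce a very safe system.

```python
# Hypothetical, illustrative failure probabilities for each layer of
# defence (the "holes" in each slice of cheese). These numbers are
# made up for the sketch, not taken from any real data.
barrier_failure_prob = {
    "organisational": 0.10,
    "supervisory": 0.05,
    "individual": 0.02,
}


def incident_probability(failure_probs):
    """Probability that all barriers fail at the same time,
    assuming each barrier fails independently of the others."""
    p = 1.0
    for prob in failure_probs.values():
        p *= prob
    return p


p = incident_probability(barrier_failure_prob)
print(f"Chance of all holes lining up: {p:.5f}")  # 0.10 * 0.05 * 0.02 = 0.00010
```

Under these made-up numbers, each layer on its own is quite leaky, yet the combined system fails only once in ten thousand exposures. Real barriers are not independent, of course, which is exactly why organisational weaknesses that affect several layers at once are so dangerous.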