ESRC Learning from Incidents Seminar - Research to Practice

- Gareth Lock | human factors, research, safety | Dec 22, 2015

"Keep the system open".

"Immediately close the system".

Two diametrically opposed decisions involving the same equipment, with the same potential impact, but made at different times by different supervisors, and the organisation's staff think both were correct! Why?

I recently attended the 5th Economic and Social Research Council (ESRC) Learning from Incidents seminar, which is being managed by Glasgow Caledonian University and was held at the British Safety Council HQ in London on June 15. The series of seminars is being run to bring together specialists from a range of industrial organisations and companies, along with academic institutions, with the key aim of improving our ability to learn from incidents.

The seminars follow the same format each time: key presentations from invited speakers from around Europe, followed by a discussion period to draw out the key points for that theme. All of the seminars will feed into wider research and will have covered six topics in total.

The theme for this last seminar was Research to Practice with the following topics being presented:

  • Dr Ritva Engeström (University of Helsinki) - Change Laboratory and Developmental Work Research
  • Professor Eve Mitleton-Kelly (London School of Economics) - Addressing complex problems through collaboration: A complexity theory approach
  • Professor Lasse Gerrits (University of Bamberg) - Back to normal: Generating resilience in complex systems
  • Dr Luise Vassie (Independent Consultant and Advisor) - Abstract

What I got out of this event was three-fold:

1. The opportunity to network and discuss key issues across a number of domains with true experts in their fields, thereby expanding my own knowledge, is essential.

2. The opportunity to listen to differing viewpoints from industry and academia, with specific case studies or theories presented showing how problems were solved or how theories were tested. The differing viewpoints are very interesting to me because I sit in both camps: academia (undertaking a part-time PhD) and industry (training and coaching across a number of domains, educating and modifying behaviours in a practical and pragmatic manner).

3. Confirmation that a number of terms we think are common are in fact dominated by the perspective we take, and that until a common language and context are clearly defined, our ability to learn from other domains will be limited. This was especially true when considering what 'incident' actually means!

One aspect which I thought resonated with much of my work is that an incident is heavily context dependent. For example, two diametrically opposed points of view were both considered correct by those involved in, and impacted by, the incident.

In the first case, operational imperative meant that two components in a critical position in an infrastructure setup (in terms of traffic) were not replaced immediately after they were found to have degraded to a critical level. The delay was due to the impact replacement would have on the network (just before morning rush hour), so the plan was to defer it until a quieter period some 12 hrs later in the day.

However, later in the day (10 hrs after the decision had been made), the locally-responsible supervisor took a binary view of the problem: the components must be either safe or unsafe. He therefore reasoned that if they were going to be changed later, they must be unsafe, and called a stop to the traffic on the infrastructure. This unplanned stop took place at the peak of evening rush hour, and the knock-on effects took another 6 hrs to clear before the components could be replaced, rather than the planned 2-3 hrs it would have taken to replace them had a graceful degradation of the system been executed.

In the post-event analysis that took place, individuals within the organisation were asked whether the first supervisor had made a 'correct' decision to defer the shutdown and they mainly replied yes. When the same people were asked later in the investigation whether the second supervisor had made the 'correct' decision in terms of calling an immediate stop, they again mainly agreed that they had.

So here we have the same people looking at similar events (short-notice closure of an infrastructure network) at different times of the day, with each event having the same consequence: major impact on traffic flow through this node. In the first case, the decision appeared to be based on not impacting the wider infrastructure network and the negative service reputation that a closure would cause; the second was potentially predicated on the risk of a catastrophic failure of the components and a disastrous incident.

Which one was correct? You could say 'it depends', but I don't think you can say definitively one way or the other.

Finally, I raised what I think is a key point in this research project: we broadly know what the issues are from lessons identified, and we know that we need to get stakeholders to learn, but before changes can be effected, it is essential that we understand what motivates individuals or organisations to learn, and apply that knowledge accordingly. This will pose a significant challenge because motivators at the organisational level may have a negative impact on the adoption of revised practices, processes or behaviours at the individual level. This subject really needs to be considered if front-line practitioners, at either the individual or the organisational level, are to make a difference.



Gareth Lock is the owner of The Human Diver, a niche company focused on educating and developing divers, instructors and related teams to be high-performing. If you'd like to deepen your diving experience, consider taking the online introduction course, which will change your attitude towards diving, because safety is your perception. Visit the website to find out more.