Domino Theory and EOD “Near Miss” investigations

I’ve blogged before about my view that the EOD community should have and embrace a “near miss” reporting process for incidents that don’t result in death or injury but might have.  In conversation with other figures in the EOD world, I sense that support for such a process is, at last, gaining momentum.

Other industries, such as aviation and the nuclear industry, have such processes and a culture that supports them.   The EOD community (normally) only has such a process when injury or death actually occurs.  My personal view is that this is wrong: it means that, organizationally, we don’t benefit from others’ mistakes. I think it means people die unnecessarily, I think it reflects poorly on our thinking and on the leadership within the community, and I think it is morally untenable.

I know there are contrary views, and I respect them, even if I don’t understand them.  I know that the biggest hurdle is cultural, and that the risk of a career foul if an EOD technician self-reports a failure is significant. That’s a key issue that needs addressing. I know that a near miss system requires a vigorous and effective command and control system to implement and manage it, and leaders, managers, investigators and professionals with deep levels of skill (and pure leadership and moral courage) to run it. I know a “near miss” reporting and investigation system isn’t going to be easy, and will have tricky engagement elements whichever branch of the EOD world it applies to.

Perhaps there are preliminary steps. A training course in accident and near-accident investigation for EOD unit commanders might be a first step.  Ken Falke has developed an excellent concept in the IMPACT program, and that perhaps sets a template that might be used for near misses too.

Everyone has an opinion on this, and many in the community are resistant to such radical concepts – and I do admit that the concept will require radical change.

I don’t think that such a system can be introduced overnight, as a diktat.  I think it needs a lot of thought, and a whole lot of work needs doing first. My understanding is that some of the professional institutes are interested in funding academic work, maybe a PhD, to analyse this more carefully, hopefully resulting in recommendations that would inform future systems – and that would be great. I think there’s a lot of work that can be done to extend the current investigative and lessons-learned processes that apply to injuries and deaths so that they also cover “near misses”.

A “near miss” is simply an accident that occurred but, perhaps by chance, caused no injury. Who can honestly claim that they don’t want to know the causes of that accident, whether it caused an injury or not?  Let me address one point here. It is very rare, I think, that accidents and near misses of any kind are caused by a single point of failure. There’s a sort of “domino” theory of accidents, in which a whole chain of faults or failures often precedes the final step.  As EOD professionals, I think a greater awareness of the context of an incident – of the other dominoes in the chain that resulted in a fatality, an injury or a near miss – is vitally important.  Those other dominoes might involve:

–       Poor quality of training (content, design or delivery)

–       Ineffective tool capability (hence identifying equipment requirements)

–       Procedural inadequacies

–       Unknown technical challenges presented by ordnance or IEDs

–       Poor operator choices, mental or physical condition

–       Poor operational command issues

–       Poor communications

–       Poor intelligence

–       Contributing causes from a whole host of other areas.

So, almost all the time, risks to EOD technicians and operators are complex, and are not simply a direct function of operator capability.  Identifying and characterizing all the other contributing causes of accidents and near accidents can, I think, valuably lead to improved professionalism and save lives.
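The multiplicative logic of that “domino” chain can be made concrete with a toy model. To be clear, this is a sketch of my own for illustration only: the layer names and failure rates below are invented, not real EOD data. Treat each contributing area as an independent defensive layer that fails with some small probability; an accident happens only when every domino in the chain falls.

```python
# Toy "domino" model (illustrative only -- layer names and failure
# rates are invented for the example, not drawn from any EOD data).
LAYER_FAILURE_RATES = {
    "training": 0.05,
    "tooling": 0.03,
    "procedures": 0.04,
    "intelligence": 0.06,
}

def chain_failure_probability(rates):
    """Probability that all independent layers fail together."""
    p = 1.0
    for rate in rates.values():
        p *= rate
    return p

base = chain_failure_probability(LAYER_FAILURE_RATES)

# Fixing just one weak domino (here, halving the hypothetical
# "intelligence" failure rate) halves the overall accident probability
# -- which is why identifying *every* contributing cause, not just the
# operator's final action, matters.
improved = dict(LAYER_FAILURE_RATES, intelligence=0.03)

print(f"{base:.2e}")                                    # prints 3.60e-06
print(f"{chain_failure_probability(improved):.2e}")     # prints 1.80e-06
```

The point of the sketch is not the numbers, which are made up, but the shape of the arithmetic: improving any one domino in the chain reduces the overall risk, so an investigation that stops at the operator leaves most of the chain untouched.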

This is an important and emotive issue, and I’m very open to publishing alternative views from others in the community, as is, with no edit from me. Fire away.


2 Comments

  1. 24th July 2013 / 1:47 am

    Well put, Roger. The need for a near miss reporting system for EOD and other parts of the explosives profession is profound. There is another factor to consider as an obstacle: pride.

    I am a Member of the Institute of Explosives Engineers and have authored a few articles for the Journal. One was the opening article for a new section (some years ago) aptly named "The Learning Curve". The obvious purpose of the section was to allow members to describe lessons learned in explosives related events which, hopefully, could be put in a humorous light for the benefit of all. My article described an incident that occurred while conducting a destruction burn of ammo in which no one was hurt but primarily resulted in red faces all round. Mike Groves wrote a follow on article where he similarly provided a humorous example of demolition activities. Then the series ended, as I understand it, from a lack of articles.

    I know that from an august group such as the Institute more articles on mishaps are available, likely hundreds. It is the nature of the business of doing unusual things. But no one provided articles. There may be other explanations, but I submit to you that pride is the likely cause: not wanting to appear even the least bit ignorant in front of peers.

    This is the type of pressure, and lack of information, that can definitely cause incidents to be repeated unnecessarily.

  2. Peter Lovett
    16th August 2013 / 12:15 am

    An interesting point of view that has much to commend it. Why should people have to re-invent the wheel every time they go to work on an IED?

    My area is aviation, where a reporting culture has long been encouraged, to the point that there is a saying very common among pilots: learn from the mistakes of others, as you won't live long enough to make them all yourself. The study of human factors in aviation is now massive, and research continues to this day as, unfortunately, the most common causes of crashes in aviation are now human.

    There may also be some benefit in another concept that has grown out of aviation crash causation: Prof. James Reason's "Swiss cheese" model. Aviation has attempted to strengthen each element involved in flying; better pilot training, better maintenance, better meteorological information and so on. However, Prof. Reason's model suggests that in each layer there are holes, and when those holes line up the result is an aircraft crash. Although I am not involved in explosives in any way, shape or form, I can imagine that in the ultimate failure of an explosion, a similar line-up of weaknesses will be involved.
