You can contact me at rogercdavies(atsquiggle). If you have a comment and the system won't let you post it, ping me using the @ for (atsquiggle).

This blog has evolved into a review of historical and modern explosive devices, and responses to them. Links are drawn between historical activity and similar activity in the world today. Mostly I focus on what are now called IEDs, but I have a loose personal definition of that and willingly stray into discussions of more traditional munitions, the science and technology behind them, tactical employment and EOD responses. Sometimes it's just about interesting people in one form or another. Comment is welcome and encouraged, but I do monitor it and reserve the right to delete inappropriate stuff. Guest posts are always welcome. Avoid any stuff that makes the enemy's job easier for them.

A note on moral perspectives. Throughout this blog there are descriptions of all sorts of people using IEDs, explosives, or suffering the consequences. Some of the people using IEDs are thought of as heroes by some and terrorists by others. One person's good guy fighting for a cause is another person's evil demon. It's complicated, and history adds another series of filters too. All of us, too, live in a narrative built from however we were brought up, what we were taught and what we learned along the way, rightly or wrongly. So if you sense moral ambivalence, one way or the other, well, I'm guilty and I'm not perfect. By and large, though, I have unapologetic sympathy for those dealing with the devices, whether they be soldiers, cops, or whatever, even those who are part of Nazi or other nasty regimes. That's the cool thing about EOD techs - we don't really care who the enemy is.


Entries in EOD Psychology (13)


More Black Box Thinking

My article here, on the subject of "Black Box Thinking" and other supporting activities, has now been reproduced in a number of professional journals within the EOD community, namely the C-IED Journal and "The Detonator" (the Journal of the International Association of Bomb Technicians and Investigators), and this month will be featured in the Journal of the Institute of Explosive Engineers.

I continue to be struck by the lessons the EOD community can learn from the aviation industry. One of that industry's most high-profile proponents of what Matthew Syed calls Black Box Thinking, Captain Chesley "Sully" Sullenberger, is interviewed here and it's well worth a read. Sully is elsewhere quoted as saying:

"Everything we know in aviation, every rule in the book, every procedure we have, we know because someone died. We have purchased at great cost, lessons bought with blood that we have to preserve as institutional knowledge."

So, a challenge. Can we in the EOD community look in the mirror and say we have valued every drop of blood spilled by our community?  Have we wrung every lesson learned dry, examined our failures and treated them as lessons to be learned by the entire community and thus honoured their sacrifice?  Can we honour and respect those named on our memorials if we don't bother to understand exactly the lessons they provide us? Do we, as a community, have exemplary post incident or post "near-miss" investigative procedures? Do our policies encourage the admission of errors without penalty?  Do we have the hard data to support our policy development or are we still doing it by seat-of-the-pants anecdote?  Are our rank structures and organisational frameworks road-blocking rational self examination?  Is it too difficult? Is it too much to ask?

We too have experience bought in blood, but have we valued it?




Near-Misses - Lessons from WW2, and a danger highlighted

I’m returning once again to the issue of the need for “near-miss” reporting in the EOD community.  My previous post on the subject is here.

I’ve been digging further and found an interesting study done by a famous Cambridge University psychologist, J T MacCurdy in the 1940s.  MacCurdy studied “The Structure of Morale” as it affected the British population and military during the first part of WW2. One subset he looked at were those who experienced the “Blitz” in London in 1940 and 1941.

MacCurdy theorized that people affected by the blitz fell into three categories:

  • Those who were killed.
  • Those who experienced very closely a bomb explosion. They are in the immediate vicinity – they feel the blast, see the destruction, are horrified by the carnage, perhaps they are wounded, but they survive. They are deeply “impressed” in the sense that there is a powerful reinforcement of the natural fear reaction. The psychological experience is of being faced by death, and the fear is reinforced.
  • The next category, whom MacCurdy describes as the “remote-miss” group, are one step further removed. They hear the sirens, they hear the enemy planes, they hear the explosion. But they are unharmed. Psychologically the remote miss is different – the fact that they escaped injury and escaped the devastation lessens fear; the catastrophe was physically remote, and so fear is put to one side. In many, a feeling of invulnerability, and indeed excitement, builds.

Now, while such feelings in the "remote-miss" category were probably instrumental in reducing the effects of the Blitz on the British population at the time, my position is that if you applied those three categories to EOD operators on modern-day EOD operations, the same categories might present themselves.

  • Those operators who are killed. There will, inevitably, be some form of reporting and investigation.
  • Those operators who are injured or actually witness an unintended explosion. Usually there will be some form of investigation.
  • Those operators who experience a near miss: they make a mistake but no explosive event occurs. I think this is analogous to MacCurdy’s “remote-miss” group. Such events are usually not reported or investigated. Not only does the mistake remain unaddressed, but the operator perhaps begins to feel invulnerable, and that can lead to tragedy, of course.

Instinct and a little experience tell me that this psychological phenomenon, an EOD operator feeling invulnerable after a series of "near misses", may be real. I believe that an appropriate near-miss reporting system could mitigate this danger.

Some other broader points about near-miss reporting:

  • In terms of human lives, near misses are cheap, sometimes almost zero-cost, learning tools that can be used to prevent injury and death.
  • The number of near misses, compared to fatalities, is significant. The aviation industry typically finds about 600 near-miss incidents for every fatal incident. That’s a hugely useful data set which the EOD community currently ignores and doesn’t even collect.

I accept that non-punitive reporting is the nub of the challenge facing the EOD community, but:

  • The aviation community has confidential near miss reporting systems. 
  • The US fire and rescue community has an anonymous near-miss reporting system.  
  • There are near-miss reporting systems for nurses and doctors in the healthcare community.
  • The British railways system has had a near miss reporting system since 2006.

If these industries and professions can manage such systems, why can't we?  Are our egos too big?


Domino Theory and EOD "Near Miss" investigations

I’ve blogged before about the view that the EOD community should have and embrace a “near miss” reporting process for incidents that don’t result in death or injury but might have. In conversation with other figures in the EOD world, I sense that the view that such a process should be examined and encouraged is, at last, gaining momentum.

Other industries, such as aviation and nuclear power, have such processes and a culture that supports them. The EOD community (normally) only has such a process when injury or death occurs. My personal view is that this is wrong: it means that, organizationally, we don’t benefit from others’ mistakes. I think it means people die unnecessarily, it reflects poorly on our thinking and on the leadership within the community, and it is morally untenable.

I know there are contrary views, and I respect them, even if I don't understand them. I know that the biggest hurdle is cultural, and that the issue of a career foul if an EOD technician self-reports a failure is significant. That’s a key issue that needs addressing. I know that a near-miss system also requires a vigorous and effective command and control system to manage it, and leaders, managers, investigators and professionals with deep levels of skill (and pure leadership and moral courage) to implement it. I know a “near miss” reporting and investigation system isn’t going to be easy, and will have tricky engagement elements whichever branch of the EOD world it applies to.

Perhaps there are preliminary steps. A training course in accident and near-accident investigation for EOD unit commanders might be a first step. Ken Falke has developed an excellent concept in the IMPACT program, and that perhaps sets a template that might be used for near misses too.

Everyone has an opinion on this and many in the community are resistant to such radical concepts - and I do admit that the concept will require radical change.

I don’t think that such a system can be introduced overnight, as a diktat. It needs a lot of thought, and a whole lot of work needs doing first. My understanding is that some of the professional institutes are interested in funding academic work, maybe a PhD, to analyse this more carefully, hopefully resulting in recommendations that would inform future systems; that would be great. I think there’s a lot of work that can be done to extend the current investigative and lessons-learned processes that apply to injuries and deaths into those that cover “near misses”.

A “near miss” is simply an accident that occurred but, perhaps by chance, caused no injury. Who can honestly claim that they don’t want to know the causes of that accident, whether it caused an injury or not? Let me address one point here. It is very rare, I think, that accidents and near misses of any kind are caused by a single-point failure. There’s a sort of “domino” theory of accidents: often a whole chain of faults or failures precedes the final step. As EOD professionals, I think a greater awareness of the context of a near miss, of the other dominoes in the chain that resulted in a fatality, an accident or a near miss, is vitally important. Those other dominoes might involve:

-       Poor quality of training (content, design or delivery)
-       Ineffective tool capability (hence identifying equipment requirements)
-       Procedural inadequacies
-       Unknown technical challenges presented by ordnance or IEDs
-       Poor operator choices, mental or physical condition
-       Poor operational command issues
-       Poor communications
-       Poor intelligence
-       Contributing causes from a whole host of other areas.

So, almost all of the time, risks to EOD technicians and operators are complex and do not simply reflect operator capability. Identification and characterization of all the other contributing causes of accidents and near accidents can, I think, valuably improve professionalism and save lives.
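To make the domino idea concrete, here is a minimal sketch, in Python and purely illustrative, of what an anonymous near-miss record built on the cause categories above might look like. The schema, names and example reports are my own assumptions, not a description of any existing system; the point is only that once each report carries a chain of contributing causes, simple aggregation starts to reveal systemic dominoes.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from collections import Counter

class Cause(Enum):
    """Contributing-cause categories, taken from the 'dominoes' listed above."""
    TRAINING = auto()           # poor quality of training (content, design or delivery)
    TOOLS = auto()              # ineffective tool capability
    PROCEDURE = auto()          # procedural inadequacies
    TECHNICAL_UNKNOWN = auto()  # unknown technical challenges in the ordnance or IED
    OPERATOR = auto()           # poor operator choices, mental or physical condition
    COMMAND = auto()            # poor operational command issues
    COMMUNICATIONS = auto()     # poor communications
    INTELLIGENCE = auto()       # poor intelligence
    OTHER = auto()              # contributing causes from other areas

@dataclass
class NearMissReport:
    """An anonymous near-miss record: no names, no unit identifiers."""
    scenario: str                # brief, de-identified description of what happened
    causes: list[Cause] = field(default_factory=list)  # the domino chain, in order

def cause_frequencies(reports: list[NearMissReport]) -> Counter:
    """Aggregate contributing causes across reports to spot recurring dominoes."""
    return Counter(c for r in reports for c in r.causes)

# Hypothetical example reports (invented for illustration only)
reports = [
    NearMissReport("Render-safe tool failed to function",
                   [Cause.TOOLS, Cause.PROCEDURE]),
    NearMissReport("Tasking omitted a known secondary-device threat",
                   [Cause.INTELLIGENCE, Cause.COMMUNICATIONS]),
    NearMissReport("SOP applied outside its intended scenario",
                   [Cause.TRAINING, Cause.PROCEDURE]),
]

print(cause_frequencies(reports).most_common(1))  # PROCEDURE appears twice
```

Even three invented reports show the shape of the payoff: the recurring domino here is procedural, not operator error, which is exactly the kind of signal a fatality-only investigation system never collects.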

This is an important and emotive issue, and I’m very open to publishing alternate views from others in the community, as is, with no editing from me. Fire away.


Why don’t we investigate “near misses” on EOD operations?

Interesting article here about the psychology of “near misses”.  It’s human nature to think that a project is a great success even if total disaster was missed by a fraction.

I think there’s an interesting corollary here for EOD operations. Most countries have an investigation system for examining an incident that kills or injures an EOD operator or bomb disposal technician. But if the operator is “lucky” and escapes unscathed there’s often no such investigation… and as the article points out, the individuals involved tend to repeat their potentially disastrous behaviour. To quote from the link: “People don’t learn from a near miss, they just say ‘it worked, let’s do it again.’” OK, there are occasional exceptions, but on a global basis I think the statement that there is no investigation of near misses is true as a generalisation.

I’m fascinated that the FAA has addressed its problems in the area of "near misses" by analyzing the issues and pre-emptively fixing them, such that there has been an 83% drop in fatalities over the past decade.

So… how do we collect and analyse the “near miss” data from EOD operations? (And you know that generally the answer is that we don’t). I think partly there is a culture in bomb techs globally to avoid such activity and partly there are frankly weak oversight structures over most EOD units.  That’s provocative I know but I stand by what I’m saying – argue me back if you wish.

One of the facts quoted in the article is that a risk-analysis firm suggests there are between 50 and 100 “near misses” for every serious accident. Instinctively, I wouldn’t be surprised if that statistic applies equally to EOD operations.

Most incident investigations of EOD casualties work backwards – I think it's time the community turned this on its head and EOD organizations got used to trying to spot the near miss. I don’t doubt that would require a huge cultural shift, and the collection and analysis of a lot of data, but I think it’s needed.




EOD Operators delude themselves

I’m returning now to the issue of the psychology of EOD operators once again. This interesting paper published last year discusses “The Rubicon Theory of War” and is well worth a read. Its focus is more on the strategic issues of war-making and the psychology of politicians, generals and the public. The theory suggests that people are cautious and avoid war when it appears a long way off, but as war gets closer their estimates of the success they will encounter, and of the war's likely result, improve, probably well beyond rational expectation.

I’m wondering if there sometimes exists a more individual, tactical effect on the psychology of individuals in battle, and specifically within the mind of an EOD operator on a planned operation, i.e. one where there is time and planning activity to carry out in advance. Does innate caution change as H-hour gets closer? Is this simply a natural effect of knowing you have thought of everything and your planning is good? Or is it simply a psychologically based falsehood? I suspect it is both, and difficult to tell the two apart. I certainly recall operators who were a little perturbed at the start of planning an operation and who later were much more confident, even when I thought their plan was less than perfect. Also, intriguingly, I never remember feeling too much concern on those planned ops I did myself. But I am now old and addled of mind and it was a long time ago.

Now, there’s an issue here to dig into. As a commander I wanted my operators to be confident. I didn’t want to see over-cautious, unsure, tentative operations. There’s nothing worse for the units the EOD team is supporting than to see their EOD operator being unsure and lacking confidence. On more than one occasion I had to sit down with unit commanding officers and either move an operator or try and protect him from a commander who thought he wasn’t on top of things. But at the same time I wanted that confidence to be justified, and I wanted an operator to put up his hand and say “Hang on, it’s not as straightforward as that,” when it came to pushy commanders asking too much. Once I was even banned from a brigade area (only for a couple of hours!) by a 1-star who thought I was not supporting his aims and, as he called it (wrongly), spreading alarm and despondency amongst senior policemen over a specific operation. It took a few phone calls to close that one out, I can tell you.

The Rubicon Theory suggests there are two mind-sets. The first, “deliberative”, dominates during the pre-decisional phase: it is cautious and tentative, and should prompt detailed planning while various courses of action are considered and compared. This switches to “implemental” once decisions are made. That’s fine too, but the danger to EOD operators is that the implemental mind-set is liable to override new information as it comes in, and tends to assume that planned EOD actions will be successful. I see almost a parallel here with earlier blogs comparing the “thought-through” approach with the “instinctive” approach: the constructed, complex render-safe procedures developed for a planned op versus the SOP, rapid-deployment “drills and skills” approach sometimes needed on higher-tempo operations. As ever, the difficulty is separating the two, and knowing when to use one or the other. If we have SOPs that we automatically use in certain scenarios, by God we should be confident in them.

The authors of the paper describe something interesting. People in implemental mind-sets are much less open and receptive to new information than perhaps they should be. Instead (and this is important) they seek information that supports the choice they have already made. I think that’s something every EOD operator (and intelligence analyst) should avoid, whatever the scenario. Looking back, frankly, it’s a failure I myself was guilty of on certain operations, and I can now recall seeing it in others, although I was probably too dumb to recognise it as such back then. 20/20 hindsight can be a sickening thing. Implemental mind-sets are over-optimistic, and although EOD operations must activate implemental activity at some stage, we need to guard against the weaknesses it generates. I suspect the key, once again, is to recognise when a planned EOD action hasn’t done what you expected and be able to re-think from there. I sense that’s where things can go wrong if not grasped or recognised at that stage, but I won’t give examples for fear of offending the living!

Suffice to say that this paper is a good read for EOD operators – take out the strategic war fighting examples the paper uses and insert your own tactical EOD experiences. It’s startling stuff.