EOD Operators delude themselves

I’m returning to the issue of the psychology of EOD operators. This interesting paper, published last year, discusses “The Rubicon Theory of War” and is well worth a read. Its focus is more on the strategic issues of war-making and the psychology of politicians, generals and the public. The theory suggests that people are cautious and avoid war when it appears a long way off, but as war gets closer their estimates of the successes they will encounter in the war and its likely result improve, probably way beyond rational expectation.

I’m wondering if there is, sometimes, a more individual, tactical effect on the psychology of individuals in battle, and specifically within the mind of an EOD operator on a planned operation, i.e. one where there is time and planning activity to carry out in advance. Does innate caution change as H-hour gets closer? Is this simply a natural effect of knowing you have thought of everything and your planning is good? Or is it simply a psychologically based falsehood? I suspect it is both, and difficult to tell the two apart. I certainly recall operators who were a little perturbed at the beginning of planning an operation and who were much more confident later, even if I thought their plan was less than perfect. Also, intriguingly, I never remember feeling too much concern on those planned ops I did myself. But I am now old and addled of mind and it was a long time ago.

Now, there’s an issue here to dig into. As a commander I wanted my operators to be confident. I didn’t want to see over-cautious, unsure, tentative operations. There’s nothing worse for the units the EOD team is supporting than to see their EOD operator being unsure and lacking confidence. On more than one occasion I had to sit down with unit commanding officers and either move an operator or try and protect him from a commander who thought he wasn’t on top of things. But at the same time I wanted that confidence to be justified, and I wanted an operator to put up his hand and say “Hang on, it’s not as straightforward as that,” when it came to pushy commanders asking too much. Once I was even banned from a brigade area (only for a couple of hours!) by a 1-star who thought I was not supporting his aims and was, as he called it (wrongly), spreading alarm and despondency amongst senior policemen over a specific operation. It took a few phone calls to close that one out, I can tell you.

The Rubicon Theory suggests there are two mind-sets. The first is “deliberative”, which dominates during the pre-decisional phase. It is cautious and tentative and should prompt detailed planning in which various courses of action are considered and compared. This switches to “implemental” once decisions are made. That’s fine too, but the danger to EOD operators is that the implemental mind-set is liable to override new information as it comes in and tends to assume that planned EOD actions will be successful. I see almost a parallel here with earlier blogs comparing the “thought-through” approach with the “instinctive” approach, which matches the constructed, complex render-safe procedures developed for a planned op against the SOP-based, rapid deployment “drills and skills” approach sometimes needed on higher-tempo operations. As ever, the difficulty is separating the two, and knowing when to use one or the other. If we have SOPs that we automatically use in certain scenarios, by God we should be confident in them.

The authors of the paper describe something interesting. People in “implemental” mind-sets are much less open and receptive to new information that perhaps they should pay attention to. Instead (and this is important) they seek information that supports the choice they have already made. I think that’s something every EOD operator (and intel analyst) should avoid doing, whatever the scenario. Looking back, frankly, it’s a failure I myself was guilty of on certain operations, and I can now recall seeing it on others, although I was probably too dumb to see it as such back then. 20/20 hindsight can be a sickening thing. Implemental mind-sets are over-optimistic, and although EOD operations must activate implemental activity at some stage, we need to guard against the weaknesses it generates. I suspect the key, once again, is to recognise when a planned EOD action hasn’t done what you expected and be able to re-think from there. I sense that’s where things can go wrong if not grasped or recognised at that stage, but I won’t give examples for fear of offending the living!

Suffice to say that this paper is a good read for EOD operators – take out the strategic war fighting examples the paper uses and insert your own tactical EOD experiences. It’s startling stuff.

EOD Operators are lazy

EOD Psychology – A technique for forcing System 2 thinking in EOD operations

In previous posts I have discussed some thoughts on EOD psychology inspired by Daniel Kahneman’s book “Thinking Fast and Slow”, where the concepts of “System 1” and “System 2” thinking are explored. In very simple terms, an EOD operator should utilize “System 1” automatic thinking (by the use of drills) to enable speed within a framework of good principles. However, when something goes wrong or when the unexpected happens or appears, EOD operators must switch to “System 2” careful analytical thought. The challenge is that some EOD operators (probably most) struggle with this change. In some cases there is strong evidence to suggest, I believe, that System 2 is not engaged and an operator fools himself into thinking he has carefully thought through the issues and is free from psychological biases when he is not. It’s then that people die.

Links for any reader who wishes to see my earlier posts are here:

http://www.standingwellback.com/home/2009/7/28/bomb-technician-training-and-psychology.html

http://www.standingwellback.com/home/2009/8/5/testing-intuition.html

http://www.standingwellback.com/home/2011/10/9/why-do-bomb-techs-do-stupid-things.html

http://www.standingwellback.com/home/2011/10/11/why-do-bomb-techs-do-stupid-things-part-two.html

http://www.standingwellback.com/home/2011/10/24/bomb-technicians-and-psychology.html

http://www.standingwellback.com/home/2011/11/1/eod-operators-are-not-rational.html

http://www.standingwellback.com/home/2011/11/3/eod-psychology-playing-doctors-and-nurses.html

http://www.standingwellback.com/home/2011/11/4/things-you-cant-do-in-a-bomb-suit.html

To this I’m feeding in concepts of narrative psychology, where humans look to fit unknowns into a narrative, and the danger that we look to fit information into an existing simple narrative rather than develop a new, more complex one. I’m now also drawing on a CIA publication, ‘The Psychology of Intelligence Analysis’ by Richards J. Heuer, Jr., available here under “1999”.

https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/

Although this latter reference concerns the psychology of intelligence matters, I think there is an excellent “read across” to the EOD world.

In particular, Part III of this latter reference examines cognitive biases, where errors in analytical thought are caused by simplified mental strategies. I recommend a wade through this document. In fact, if I had the chance I’d make a read of it mandatory for EOD operators.

One point it makes is that we perceive what we expect to perceive, and we look for indicators that reinforce our preconceptions rather than focusing on indicators which suggest alternative explanations. This is fundamentally true in EOD threat assessments as well as in intelligence analysis. It is this element that I’m currently focusing on. This research has now allowed me to begin to develop some suggested mental tools for EOD operators, and I’d like to humbly present the first one now.

This first tool is one that increases “self-awareness” in a situation where System 2 thinking might be demanded. I am calling the tool the PRE-MORTEM. The tool attempts to force, indeed “shock”, the brain into considering other threats, and makes the individual confront issues that he has ruled out during his threat assessment. Here it is:

On any EOD operation where time allows, after the operator has conducted his threat assessment, he should spend a few short minutes developing an imaginary report. To do this he must imagine he is his immediate superior, writing a report to his fellow operators describing the mistakes the operator made that killed him on this very operation. He can do this by filling in a quick form, which is easily developed, or, if time precludes that, by giving it verbally to his team members in the CP/ICP. The report should focus on errors in judgment that the operator made and the specifics of the threat he had hitherto ignored which resulted in his death. Such activity forces the operator to imagine circumstances that his biases have potentially already set aside or discounted. It may be somewhat morbid, but the morbidity can shock the operator into switching from System 1 to System 2 careful analysis. Remember that the human brain, and by corollary the EOD operator’s brain, is lazy and needs encouraging to spend the effort of switching to System 2.
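
The “quick form” could be nothing more than a handful of fixed prompts the operator has to answer before moving on. Below is a minimal sketch, in Python, of the kind of prompt sheet I have in mind; the wording of the prompts and all the names in it are my own illustration rather than any established format, and a paper version would work just as well.

```python
# A hypothetical "pre-mortem" prompt sheet for an EOD operator at the ICP.
# The prompts and field names below are illustrative only.

from dataclasses import dataclass, field
from datetime import datetime

PROMPTS = [
    "Writing as the operator's immediate superior: what killed him on this task?",
    "Which indicator at the scene did he notice but discount, and why?",
    "Which part of his threat assessment rested on assumption rather than evidence?",
    "What alternative device configuration did he rule out too early?",
    "At what point should he have stopped and re-assessed, and what stopped him?",
]

@dataclass
class PreMortem:
    task_reference: str
    completed_at: str = field(
        default_factory=lambda: datetime.now().isoformat(timespec="minutes")
    )
    answers: dict = field(default_factory=dict)

def run_pre_mortem(task_reference: str) -> PreMortem:
    """Walk the operator through each prompt, requiring an explicit answer."""
    report = PreMortem(task_reference)
    for prompt in PROMPTS:
        # Forcing an answer to every prompt is the point: it makes the operator
        # articulate threats his assessment has already set aside or discounted.
        report.answers[prompt] = input(f"{prompt}\n> ").strip()
    return report

if __name__ == "__main__":
    completed = run_pre_mortem("EXAMPLE-TASK-001")
    print(f"\nPre-mortem for {completed.task_reference} ({completed.completed_at}):")
    for prompt, answer in completed.answers.items():
        print(f"- {prompt}\n  {answer or '(no answer given)'}")
```

The detail of the prompts matters far less than the discipline of answering each one; even thirty seconds against each question is enough to jolt the brain out of System 1.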

I appreciate that on some operations the tempo of the task does not allow time for this – but I certainly think time should be found if something unexpected has occurred. I’d be interested in any feedback on this technique. I have some other mental tools cooking which might be useful to throw into EOD training and I’ll return to those in the future.

EOD Psychology – Playing Doctors and Nurses

OK, I’m beginning to develop more detailed thoughts now as I get into Kahneman’s “Thinking Fast and Slow”. I’d like to explore one aspect now. Firstly, thanks to those engaging with me on a couple of LinkedIn forums on this. Secondly, I want to steal an idea my friend Andy Gibson told me about and merge it with something I’m getting from Kahneman. It is an analogy to “types” of EOD activity and the skill sets needed, which I think match the System 1/System 2 distinction.

The analogy is this. There are two sorts of EOD response, just as there are two sorts of medical response. In the medical world a hugely important role is played by paramedics and nurses. They either go to scenes of accidents and have to make rapid decisions, usually based on SOPs, or they apply straightforward diagnostic skills routinely to patients. Then there are the doctors, surgeons and diagnosticians who play a different medical role. They diagnose, treat and operate where time allows, and the level of “judgment” involved requires a different skill set. At an accident the challenge is keeping the victim alive “now”, for the next few minutes, and huge skill is needed to do that. But a patient suffering an exotic, unknown tropical illness or a complex brain tumour needs different skill sets applied to the problem in a different time frame. So I’m wondering: are these examples of System 1 (the nurse/paramedic) and System 2 (the surgeon/diagnostician)?

And in EOD I think the System 1 nurse/paramedic style of operation exists. This might be assault IED work in high-tempo operations, or routine, predictable, SOP-controlled mine clearance. And the System 2 operation exists for the EOD equivalent of the surgeon/diagnostician. Here the IED may need exotic techniques to render safe, or the nature of the IED may demand a significant diagnostic process, and there is time to undertake that. The implication is that we need to recognise the boundaries, and sometimes that’s really difficult. There is also an important question this raises – should we have different people conduct System 1 and System 2 tasks, or do we train people to cross the boundaries?

A quick aside. One of the diagnostic tool sets for System 2 EOD is a thing called “Tactical Design Analysis” or TDA.  It’s also a skill set used in incident investigation by WIT teams and the like. I am a huge proponent of TDA, but you know what, I’ve never seen it taught systematically on any EOD training course, ever in the world, and one of my few claims is that I’ve knocked around the EOD training world quite extensively.

I’d also like to make it a little clearer that these concepts of System 1 and System 2 are not new – indeed I discussed them on this blog a couple of years ago. But what Kahneman is saying is that sometimes we fool ourselves that we are using System 2 when we are in fact relying on System 1. It’s that error, or bias, that I think is dangerous for bomb techs.

I think (but I’m not yet sure and I need to think some more) that there are distinctive patterns when this fault occurs, and I hope to be able to understand those patterns better and develop some suggested strategies for dealing with them. I don’t want to big this up into the biggest question facing EOD operators – it’s not. But looking back at my experience of EOD – as a student, as an operator, as a commander, as a trainer and, most importantly, as a developer of technical and operational procedures – I’ve always had a concern and difficulty myself at this System 1/System 2 interface, and I think it would benefit from some hard thought.

EOD Operators are not rational

OK, another provocative title. I’m still looking hard at Bomb Squad psychology to understand better how EOD operators assess situations and make decisions in the heady atmosphere of an incident command post/control post. It’s the cognitive processes that I’m focusing on.  The fundamental starting point is that an EOD operation is a puzzle, where the nature of the problem and its solution are not always clear. I fully accept that in many EOD operations the problem and the solution are indeed clear… but it’s when these are not obvious that the challenge is encountered.

My current reading is Daniel Kahneman’s very excellent “Thinking Fast and Slow”. I want time to think slowly about this myself, as some of the concepts are a bit mind-blowing, so again I’ll drip-feed some thoughts in coming weeks. Suffice to say Kahneman is a world-leading psychologist and Nobel prize winner. Firstly, he disrupts what I suspect is the common thinking behind many EOD training systems – that the operator is what psychologists call a “rational agent”. This concept suggests that people behave systematically, making entirely rational, evidence-based, algorithmic decisions, and that the only challenge is to train them to use the correct algorithm. Kahneman points out that this is not the case and makes a convincing argument. People make systematic mistakes and non-logical decisions consistently. In most lives this doesn’t really matter (and he talks extensively about the economic world), but in EOD the consequences are serious.

So Kahneman builds a new model for our cognitive processes and it’s a model that I think is leading me to some specifics as far as EOD is concerned. In broad outline (he says) there are two systems we use as cognitive processes:

System 1 – This is all about how we use instinct based on little data (Malcolm Gladwell’s “Blink”). It is automatic and effortless, and is the product of retained memory and learned patterns of association. It’s all about snap judgments. Sometimes we call it intuition. But we are not good at telling whether it is right or not – if our intuition is right once, we tell ourselves we have “good intuition” and trust our gut every time after that – which in the vernacular is boll0x. Find me one person in the world who doesn’t think they have good intuition. Going back to my earlier posts, I think System 1 tries to fit the limited facts into a “narrative” (any narrative) and wings it from there.

System 2 – This is the difficult one. It considers, evaluates and reasons. Importantly for us, in the EOD context, it needs time, and it needs effort. The key point Kahneman makes is that we believe we are using System 2 when we make our threat assessment and plan our RSP… but actually, more often than not, we are using System 1… and that’s dangerous, because that way we make assumptions on the basis that the narrative “sounds right” (see earlier blogs). So the crucial piece, perhaps, is to “recognise” a System 1 decision at the moment the operator is taking it and encourage a System 2 approach. I’m beginning to see some specific techniques that an EOD operator might use to do this and how they might be implemented in training.

Now, System 1 always functions, no matter how little data we have. I think as EOD operators we “cage” System 1 with some firm SOPs that stop it getting too out of hand. So, for example, within the British community, the SOPs of “one man risk”, “as few approaches as possible”, “always take positive EOD action” and “always have an EOD weapon on hand on a manual approach” cage and constrain some of the weaknesses of System 1 and encourage good practice. System 2 requires willpower, effort and discipline. It’s hard work, intellectually, and most importantly it requires time. As someone famous (Edison, I seem to recall) once said, “There seems to be no limit to which some men will go to avoid the labor of thinking”, and Mark Twain said, “Thinking is the hardest work there is, which is probably the reason so few engage in it”.

I’m intrigued that the time factor is so crucial for System 2, and I wonder if the tempo of EOD operations in Afghanistan is such that the time for a System 2 approach is constrained. I suspect so. As an aside, I’m still a little haunted by some hand-written notes I once read about an EOD operator who died in the 1970s. The operator was a youngish Captain with a good track record of IED disposal in an intense operational theatre. But the tempo of operations increased and the operator was clearly tired. His boss, writing in a small, innocuous exercise book as he had dozens of times before about the performance of operators on their “tour”, was simply bewildered as to why this experienced operator was taking the actions he was taking when the device exploded, blowing him across a field, dead. I’m not haunted by the operator, whom I never knew; I’m haunted by the desperate words his OC used as he expressed his lack of understanding of the actions this man had chosen.

I’m also intrigued by the implications for high-intensity tactical operations. The infantry soldier, boots on the ground, fights a System 1 battle, while the commander and staff officers fight a System 2 battle… and the EOD operator has to fit his actions with both, depending on the circumstances, and I think that causes a dynamic tension that EOD operators have to be trained to cope with.

In future posts I hope to start to develop some possible EOD “specifics” from this System 1/System 2 cognitive process which I’d like to share or have shot down by readers. 

Bomb Technicians and Psychology

In posts a couple of weeks ago I was exploring why bomb technicians make mistakes. To declare my prejudices, I suspect that there are good bomb technicians and bad bomb technicians (delete and insert IEDD operators as you see fit). There are two consequent questions:

  1. How do you identify good from bad during assessment (which may or may not be “on the job”)?
  2. How do you train bomb technicians to avoid mistakes?

There are complex issues here. Firstly, we are talking about life and death decisions, for the bomb technicians but also for the trainers and assessors. And possibly for others, too. Secondly, should we identify optimal technicians/operators before their training? Or weed out the bad ones during training, or at the end of training, or throw them out when their mistakes are made operationally? This leads to further questions, such as how we identify sub-optimal people before training, and whether such filtering is valid. There are also questions about how we test and assess those undergoing training and how we monitor them operationally.

The truth is, I fear, that not enough thought is going into this. Indeed I have not encountered any EOD organization, anywhere on the globe, that has spent enough time on trying to answer these questions.  Anyone care to disagree?

So I think that, as a first step, understanding the psychology of bomb technicians undertaking operations is important. In my humble opinion it is the “threat assessment” phase that is the crucial step, and indeed the step I struggled with most during training. It is also the step that, as a commander, I saw people struggle with most operationally, and I see it still to this day when observing training.

Let’s return to the psychological functions at play at this point in the operator’s thought processes. The psychologist Lee Roy Beach (with whom I have had some recent dialogue) posits six different sorts of “memory” at play here:

Immediate memory – the current focus of attention, the information that the operator receives at the scene of a suspect IED, as he receives it. I think there are interesting and complex issues to explore here about the techniques an operator can use to elicit information from witnesses and the way he asks the questions. I think “organized thinkers” with a good questioning technique can optimize the immediate memory. Too few EOD schools or courses teach questioning technique effectively. I think this is an area where the military can learn hugely from law enforcement – cops are naturally better questioners than soldiers. Are there any studies or theories about questioning technique out there that could be applied to EOD technicians? Immediate memory also plays a key role in threat assessment. In an IED operation the operator is bombarded with data and information via many senses. An ability to “filter” this information and notice what is important is, perhaps, the crucial key.

Retentive memory – these are things learned in the past. They might relate to the types of devices a terrorist group uses, their tactics, and also perhaps how the immediate memories (above) are stored in the brain.

Procedural memory – these are skills that have been learned: how to use a particular tool, how to undertake a particular task. Questioning technique should perhaps be a procedural memory-based skill.

Declarative memory – this is information, any intelligence received before the operation began. It is crucial to threat assessment. I think there is significant room for more work to understand how “intelligence reports” can be designed to optimize their use and understanding by specific users. Declarative memory is important in this idea of “narrative psychology” – a story that “makes sense” and hangs together. A narrative that doesn’t hang together is an indication that the threat assessment is wrong. But it is easy for a poor operator to justify a weak narrative and see what he wants to see, because the alternative is confusion. I think this piece is quite crucial in understanding why things go wrong. A poor operator convinces himself a narrative makes sense when it doesn’t.

Episodic memory – these are portions of the narrative that can be recited as specific episodes – think of them as chapters in a story. The convoy was travelling down the road, in the following order, at about this speed, and they turned left…

Semantic memory – these are the memories that link the episodic knowledge together and link them to general knowledge. Semantic knowledge also deals with hierarchical understandings.

So, to summarize: the building blocks of an IEDD threat assessment are complex and, not only that, they very frequently come together in highly pressured, indeed life and death, situations. To pick out a couple of important elements:

1. Filtering out the important information from a vast bandwidth of sensory perception is tricky. Lee Roy Beach makes a statement that many of us will recognize: “My contention is that people are better at knowing what’s wrong with things than what’s right – better at knowing what they don’t want than what they do. We spot discrepancies faster than congruencies, if only because errors and threats usually reveal themselves in discrepancies from the norm or from the expected. I think that this simple truth has vast implications for systems design and training.” Now this makes sense – we have all heard of ordinary soldiers who are better at spotting the roadside IED because they sense something out of place. The trouble is we need all our EOD operators to have that skill, and some don’t. Can it be trained? Or is it natural?

2. Making a narrative that “makes sense” is a crucial step. Just as crucial is validating that narrative. It is here that I see many problems with the concept of two EOD operators working in harness together. All too often the response from one is not to challenge but to say “Yeah, that sounds good”. There needs to be a more adversarial approach to establishing a valid narrative. Working down to a subordinate No 2 can be tricky indeed, but the No 2 should be encouraged to say “Boss, how does that fit our SOPs?”. Working up to a commander (who may not be present), the EOD operator has to expect his narrative to be challenged and to justify it accordingly. If a course of action can’t be justified on the radio to an experienced boss, it shouldn’t be let out of the incident command post. But two equals working together are rarely (never?) adversarial enough and simply reinforce each other’s poor judgment.

Comments welcome, as ever.
