EOD Operators delude themselves

I’m returning to the issue of the psychology of EOD operators. This interesting paper, published last year, discusses “The Rubicon Theory of War” and is well worth a read. Its focus is more on the strategic issues of war-making and the psychology of politicians, generals and the public. The theory suggests that people are cautious and avoid war when it appears a long way off, but as war gets closer their estimates of the success they will have, and of the war’s likely result, improve — probably well beyond rational expectation.

I’m wondering if there is, sometimes, a more individual, tactical effect on the psychology of individuals in battle, and specifically within the mind of an EOD operator on a planned operation, i.e. one where there is time and planning activity to carry out in advance. Does innate caution change as H-hour gets closer? Is this simply a natural effect of knowing you have thought of everything and your planning is good? Or is it simply a psychologically based falsehood? I suspect it is both, and that the two are difficult to tell apart. I certainly recall operators who were a little perturbed at the beginning of planning an operation and who were much more confident later, even when I thought their plan was less than perfect. Intriguingly, I never remember feeling too much concern on those planned ops I did myself. But I am now old and addled of mind, and it was a long time ago.

Now, there’s an issue here to dig into. As a commander I wanted my operators to be confident. I didn’t want to see over-cautious, unsure, tentative operations. There’s nothing worse for the units the EOD team is supporting than to see their EOD operator being unsure and lacking confidence. On more than one occasion I had to sit down with a unit’s commanding officer and either move an operator or try to protect him from a commander who thought he wasn’t on top of things. But at the same time I wanted that confidence to be justified, and I wanted an operator to put up his hand and say “Hang on, it’s not as straightforward as that,” when pushy commanders asked too much. Once I was even banned from a brigade area (only for a couple of hours!) by a 1-star who thought I was not supporting his aims and, as he put it (wrongly), spreading alarm and despondency amongst senior policemen over a specific operation. It took a few phone calls to close that one out, I can tell you.

The Rubicon Theory suggests there are two mind-sets. The first, “deliberative”, dominates during the pre-decisional phase: it is cautious and tentative, and should prompt detailed planning as various courses of action are considered and compared. This switches to “implemental” once decisions are made. That’s fine too, but the danger to EOD operators is that the implemental mind-set is liable to override new information as it comes in, and tends to assume that planned EOD actions will be successful. I see almost a parallel here with earlier blogs comparing the “thought-through” approach with the “instinctive” approach, matching the carefully constructed render-safe procedures developed for a planned op against the SOP, rapid-deployment “drills and skills” approach sometimes needed on higher-tempo operations. As ever, the difficulty is separating the two, and knowing when to use one or the other. If we have SOPs that we automatically use in certain scenarios, by God we should be confident in them.

The authors of the paper describe something interesting. People in implemental mind-sets are much less open and receptive to new information that they should perhaps pay attention to. Instead (and this is important) they seek information that supports the choice they have already made. That’s something every EOD operator (and intelligence analyst) should avoid doing, whatever the scenario. Looking back, frankly, it’s a failure I myself was guilty of on certain operations, and I can now recall seeing it on others, although I was probably too dumb to see it as such back then. 20/20 hindsight can be a sickening thing. Implemental mind-sets are over-optimistic, and although EOD operations must activate implemental activity at some stage, we need to guard against the weaknesses it generates. I suspect the key, once again, is to recognise when a planned EOD action hasn’t done what you expected and to be able to re-think from there. I sense that’s where things can go wrong if not grasped or recognised at that stage, but I won’t give examples, for fear of offending the living!

Suffice to say that this paper is a good read for EOD operators — take out the strategic war-fighting examples the paper uses and insert your own tactical EOD experiences. It’s startling stuff.


1 Comment

  1. Muso
    19th April 2012 / 11:54 am

    Great article Roger. I'm not an EOD operator but I'd be interested in your thoughts on this:

    There is another reasonably well-accepted logical structure for the influences that govern all human behaviour. This breaks down all human motivations into three influences: intellectual, social and biological.

    I think these are potentially useful to your analysis. In the EOD case, social influences are at the back of the queue — other than the fact that you are doing the job for entirely social reasons, your decision-making isn't influenced by what anyone else thinks — I hope 🙂

    The Rubicon theory fits well with the remaining two aspects, the intellectual and biological drivers. Intellectual and deliberative are essentially describing the same thing, I think you will agree. What I find interesting is that the implemental stage is a correct description of where you are in the process, but it potentially overlooks the fact that, in a high-risk situation, your fight-or-flight instincts are brought to the fore. And if we're going to have a fight then I'd better believe I can win and get the first blow in! So my confidence suddenly ignores everything other than my raw biological instinct for survival (or, more generally, success), and the Rubicon forms, isolating access to other influences on my actions that may be more reasonable.

    When facing the prospect of any serious adverse situation, I will be rational and seek to avoid conflict or danger. But if I get to the point where I cannot avoid that conflict or danger, then I enter a primarily biologically influenced state of mind, my intellectual quotient having been instructed to shut up while we get the hell out of this situation. In this state my confidence appears high but, in truth, it is no longer appropriate to compare rational confidence with my present state of mind. Confidence is not a relevant metric in this condition because I am working on adrenaline-fuelled instinct. In this state my brain has closed its doors to new information, as I don't have any processing capacity for it at this time.

    I appreciate that well-trained EOD operators will not be in a state of high anxiety, and that they will have the knowledge and drills to maintain presence of mind. But is it reasonable to suggest that the relative balance between the intellectual and the instinctive biological states, as the task progresses, is what causes the phenomenon you are describing and leads to the exclusion of additional information?

    If yes then, essentially, the operator simply 'can't help it'. In that case, what are the implications for procedures that would enable an operator to regain intellectual dominance of the situation?
