Bomb Technician Training And Psychology

As I said below, I’m exploring some ideas about optimising certain aspects of bomb technician training, to address what I think are some current weaknesses: a psychological tendency among bomb technicians, during the assessment phase of their operations, to “see what they expect to see” and, without realising it, to fall into an analytical or “confirmation” bias. I must say up front that I’m no trained psychologist and this needs lots more work, but for now I’m being guided by Dan Gardner’s book “Risk” which, at least for me as a layman, explains some of the things I think I’ve seen in bomb tech training and operations over the last twenty years.

It would appear that the brain, when dealing with matters of threat and fear, has two “systems”.

  • System 1 is driven by feeling, emotion, instinct, “gut”. It seems to rely on the “availability heuristic” (the brain judges likelihood by how easily examples come to mind, and it remembers exceptions rather than routine). It drives certain physiological responses, such as an increased heart rate, and can trigger the “fight or flight” response. It is quick, taking seconds or less to come to conclusions. When asked to justify a decision made by System 1, operators can rarely put it into words.
  • System 2 is driven by reason, by the “head” rather than the gut. It uses logic, and it is slow. Its decisions are more “conscious” and can be explained. Sometimes the “head” is lazy and must be pushed to adjust decisions or analysis already made by the “gut”; this happens especially under stress or when rushed.

The problems come because:

a. System 1, although very effective in some situations, can give rise to deeply flawed threat assessments (to mix bomb tech language with psychology language) because there hasn’t been a systematic, objective view of all the evidence. But in some situations speed is needed and there is simply no time for System 2.
b. System 2 is sometimes used to rationalise a decision or assessment already made by System 1, so it too ends up flawed. It can be wrong, and the confirmation bias, the tendency to look for confirmation rather than “dis-confirmation”, can lead bomb techs who may be under significant tactical and operational pressure to make poor threat assessments. System 2 is often stymied because System 1 has already made the key decisions, and System 2 just tags along rationalising them. Once a belief is established, the brain will look to confirm it and discard contrary information without logic.

So, a few implications jump to my mind:

  1. I think that SOPs work well to support decision making in time-sensitive situations. They provide a structure that gives the bomb tech a minimum number of choices to make under time stress. They essentially provide structured decision making that addresses the weaknesses of System 1. But could we make the SOPs better by having a greater understanding of the psychology of the situation?
  2. SOPs also work well when they encourage predetermined actions (“drills”) that cover all the bases, so even if a bomb tech has discarded the threat of, say, a pressure plate booby trap, the SOP still demands that he act as if there may be one (see the sketch after this list).
  3. How do we encourage System 2, the head, to be less lazy and more effective at questioning System 1, gut instincts? Dan Gardner makes a nice analogy here, describing the brain as a car driven by a caveman, with a lazy but bright PhD student in the passenger seat. The bright student is inclined to grab the steering wheel, but sometimes is distracted and just sits back looking out of the window.
  4. Should we think about the ways that bomb techs receive intelligence, to ameliorate the negative influences of the psychological response to fear? Do intelligence analysts in general think about how their analysis affects the psychological decision-making process going on in their readers’ heads? (I think this question is much bigger than the small focus of bomb techs, and perhaps deserves a broader study on its own.) How do we manage the delivery of intelligence in the light of the “availability heuristic” and the Von Restorff effect (where we remember the unusual over the routine), as well as reasoned analysis? System 1 responses seem to be encouraged by images and narrative, even anecdotes (bomb techs are very good at anecdotes!); System 2 by logic, data and numbers. How do we manage this? Fear does some interesting things in this process, and I think EOD scenarios are inherently fearful situations, so they are worth exploring. The brain culls low-risk experiences and memories, but fear “glues” memory. Fearful situations and the resulting memories are more likely to influence System 1 thinking. I wonder if we can “glue” important intelligence with a bit of fear to make it stick better? Or do we risk over-emphasising something that way and encouraging a System 1 response?
  5. It’s clear from studies that System 2 is aided by improved numeracy. Has this sort of research been applied to bomb technicians? I think the answer is no, but some bomb tech training schools have very high academic/scientific training standards, while others do not. Perhaps the output isn’t only what we expect (better technical understanding): the associated higher level of numeracy may also be a hidden factor in effective bomb technician training, because it encourages more effective System 2 analysis.
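
To make point 2 concrete, here is a minimal sketch in Python of how a “drill-based” SOP acts as a check on System 1. Everything in it (the threat names, the drill wording, the function) is my own illustration under stated assumptions, not any real SOP:

```python
# A purely illustrative sketch (my invention, not a real SOP) of the design
# principle in point 2: mandatory drills are executed for every threat class,
# whether or not the operator's assessment flagged that threat.

MANDATORY_DRILLS = {
    "pressure_plate": "approach as if a victim-operated switch is present",
    "command_wire": "search the approach route for command wires",
    "radio_controlled": "keep electronic countermeasures active throughout",
}

def plan_actions(assessed_threats):
    """Build the action list for a task.

    The operator's assessment can add actions, but it can never remove a
    mandatory drill: the SOP acts as a check on a System 1 assessment that
    has prematurely discarded a threat.
    """
    actions = list(MANDATORY_DRILLS.values())  # always executed, regardless of assessment
    for threat in assessed_threats:
        if threat not in MANDATORY_DRILLS:
            actions.append(f"additional measures for assessed threat: {threat}")
    return actions

# Even if the tech's gut has ruled out a pressure plate, its drill remains:
for action in plan_actions(assessed_threats=["command_wire"]):
    print(action)
```

The design choice is the point: the assessment feeds into the plan, but it cannot subtract from it.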

Here’s a great example of the confirmation bias. I’m thinking of a mathematical rule that gives me three initial numbers: 2, 4, 6. You are the bomb tech assessing this situation or “rule”. You have to make a judgement on what the rule is, and you can test your hypothesis by giving me the next three numbers. The chances are, you will give me “8, 10, 12”, and that would fit the rule and be correct… and if you were challenged again you’d probably say “14, 16, 18”, and again, that would fit the rule, wouldn’t it? But actually the rule I had in mind was “any three rising numbers”. You see, your inclination was to see a simple even-number, increase-by-two rule, and you had a bias to confirm that assessment which you couldn’t resist. But if you’d given me, say at the second time of asking, an alternate test of your theory, such as “13, 14, 17”, you’d have properly tested your gut, System 1 solution with an alternate “threat assessment” by using good logical System 2 brain power.
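
Here’s the same point as a minimal sketch in Python (purely illustrative; the function names and probe values are my own). The confirmatory probes fit both the guess and the real rule, so they tell you nothing; only a probe that could break your guess is informative:

```python
# The 2-4-6 task described above. The hidden rule is "any three rising
# numbers"; the tester's System 1 hypothesis is "even numbers increasing
# by two". A probe is informative only when the two rules disagree on it.

def hidden_rule(a, b, c):
    """The rule actually in the setter's mind: any three rising numbers."""
    return a < b < c

def hypothesis(a, b, c):
    """The tester's gut guess: even numbers increasing by two."""
    return a % 2 == 0 and b == a + 2 and c == b + 2

probes = [
    (8, 10, 12),   # confirmatory: fits both the guess and the real rule
    (14, 16, 18),  # confirmatory again: still learns nothing new
    (13, 14, 17),  # disconfirming probe: breaks the guess but not the rule
]

for probe in probes:
    informative = hypothesis(*probe) != hidden_rule(*probe)
    print(probe,
          "guess:", hypothesis(*probe),
          "rule:", hidden_rule(*probe),
          "-> informative" if informative else "-> tells us nothing")
```

Running it shows that only the third probe separates the two rules, which is exactly the System 2 test the paragraph above is asking for.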

I’ll return with more thoughts on this shortly.

Bomb Technician Training

I’ve been thinking quite a lot of late about the training of bomb technicians to deal with IEDs. One of my oldest friends is currently responsible for such training in the British military. Having myself undergone such training in the past (way past!), been a “customer” employing people from that training regime, and for the last ten years been involved in designing and delivering training for bomb technicians in many countries around the world, I not surprisingly have some views, but I also have gaps in my knowledge that I’m trying to fill. I intend to air some aspects of what I’m finding over the coming weeks.

Here are some thoughts for starters:

  • Fundamentally, there are quite a few aspects to such training and many different approaches to it. If truth be told, the thinking behind the design of such training is rarely done at the appropriate depth, in my opinion, and those responsible for policy often don’t understand all the complex issues. That’s a sweeping statement, but I think I can back it up.
  • I think training to be a counter-terrorist bomb tech is a very difficult challenge. Defusing an IED is not a simple task. There are technical challenges, command and leadership challenges, decision-making processes, intelligence requirements, knowledge issues, potentially significant stress and pressure, operationally complex coordination to be done, questioning and interrogation techniques, and the very fact that it is in the interest of your enemy to make the task as damned difficult as he can and hide stuff from you. And the penalty for getting any of those wrong is your life, and maybe others’ too.
  • There are some really interesting psychological issues, rarely addressed in bomb technician training, about how humans make decisions: how some information is processed by “instinct” or gut feel, and how some is processed by reasoning. The two often contradict each other, but I’m not aware of any training course in the world that addresses those psychological issues up front with students. I’m reading a fascinating book by Dan Gardner called “Risk”, about how humans process information regarding risk, that is giving me some new insights I’ll post in coming days. As a heads-up, it’s clear that the psychological process of assessing risk is often flawed.
  • I’ve seen some remarkably competent, consistent bomb technicians and, in truth, some consistently bad ones. I can’t always predict which will be which before a training course starts. But I’d like to. I’ve also seen some really good ones who do make mistakes. For the record, I was a long way from being the best.
  • In many parts of the world such a training course is “attendance only”: no real assessment or pass/fail criteria are applied. I have a problem with that. In other parts of the world there are pass and fail criteria, but these give rise to a few difficult and thorny issues: firstly, the training organisation is put under huge pressure when it is not “passing enough” students; secondly, given the complexity of what is being assessed, the judgement in the end usually comes down to a subjective assessment with no really valid metrics.

I’ll be returning to this subject in more detail in coming days – feel free to pile in with comments now.

ooops… IED attack on BRIMOB headquarters from beyond the grave

In 2005 the Indonesian counter-terrorist unit BRIMOB raided a house where a Jemaah Islamiyah bomb maker was staying. He was killed in the raid. A number of IEDs were recovered to BRIMOB headquarters in East Java for technical exploitation and analysis. Last week a “short circuit” caused one of the devices to detonate, and the resulting fire set off others. Devices recovered from a 2007 raid were also held in the store.

No comment needed really – the report is here.

The dead IED manufacturer, Azhari Husin, is allegedly responsible in part for the Bali bomb, the JW Marriott hotel bomb of 2003, the Australian Embassy bomb of 2004, and now the BRIMOB bomb of 2009, four years after his death. Oops.

Hezbollah’s international capability

The Terror wonk, here, makes some interesting comments about recent Hezbollah threats and current capabilities. Following the comment on the New York CT post below about Hezbollah threats, it makes the point that Hezbollah’s international operations seem to have been limited or contained. It links this in part to careful monitoring of Iranian embassies’ activity, which it cites as an important cog in Hezbollah’s international capability.

As an aside, the blog referenced assumes that Hezbollah’s last major international terrorist attack was the 1996 attack on Khobar Towers in Saudi Arabia. Whilst that is indeed what the Saudi authorities assert, I remain unconvinced. There is little evidence, and in my opinion the attack may just as easily be the responsibility of Sunni militants in Saudi Arabia, who after all have mounted dozens of attacks in the meantime.
