300 people are dead as a result of one disastrous failure of “shared knowledge”: in Baghdad, a bomb detector costing tens of thousands of dollars failed to detect a terrorist bomb. Was this tragedy a failure in technology? No. It was a failure in knowledge. Who claimed, and who believed, that the device really could detect bombs? For a TOK classroom, the fake bomb detector provides teachers with a powerful story, a striking example of some key issues surrounding knowledge, and an exercise in applying critical thinking to claims with evident consequences.
Consider:
- The so-called bomb detector was nothing of the sort: it was a plastic handle with a small antenna.
- The Iraqi officials responsible for issuing this “detector” and hundreds of others to the armed forces had been told repeatedly that it was a fraud, and that the British manufacturer had already been sentenced to a 10-year prison term for the fraud.
- Following this spectacular demonstration of the dangers of trusting the fake devices, the detectors have been banned in Iraq, yet they continue to be used both there and in some other countries.
How did knowledge go so horribly wrong?
The answers are complex and cut across several different areas: advertising, pseudoscience, deliberate fraud, bureaucracy, political maneuvering, corruption, and, perhaps most powerfully for TOK, cognitive glitches.
Treated chronologically, from the invention and marketing of the devices to their continued use, the ways in which this vaunted knowledge went horribly wrong include the following:
1. Language
Those involved in creating and advertising the useless devices knew what they were about. Adapting a kind of toy “golf ball detector” (equally useless), they, like many fraudsters, especially in medicine, co-opted the language of pseudoscience to create an aura of credibility.
- The device (in the most widespread version) was sold using the pretentiously technological name of ADE651 by a company called ATSC.
- Its sellers claimed the device used “programmed substance detection cards” that were tuned to the “frequency” of explosives. The technology was (of course!) based on “nuclear quadrupole resonance (NQR) or nuclear magnetic resonance (NMR)”. Impressive? Of course! Though, surprisingly, the fraudsters didn’t abuse the scientific terms most standard in pseudoscience: quantum, nano, and “energy field”!
- Other versions, sold by other fraudsters to governments in Mexico, Thailand, Egypt and Niger, went by equally fine-sounding names: the “GT200”, “remote substance detector” and the “Alpha 6”.
For TOK students the issue is obvious: how much should we trust technical names and technological terminology that we don’t understand? How can we assess such impressively scientific-sounding language?
2. Credible authority/false premise
The credibility of the devices was enormously enhanced because they were backed by British embassies in Mexico City and Manila, through the Department of Trade and Industry.
Clearly the issue here is complicated for TOK students, and for us all. How do we assess the reliability of knowledge sources in a particular case when, in most cases, those knowledge sources are demonstrably reliable? Amongst other things, of course, we must question the premise that “a knowledge source that is generally reliable must always be reliable.”
3. Skeptics, critical thinking and deduction
In spite of the bafflegab of the language, or, more likely, because of it, many critical thinkers smelled a rat. Eventually, governmental agencies in both the U.S. and the U.K. examined the devices and found them to be nothing, nothing, nothing at all.
But–and here is a good lesson for TOK students–amongst those quickest off the mark were those who applied various versions of critical thinking–or just plain common sense.
From their initial use in Baghdad many years ago, soldiers mocked them, claiming that too much aftershave could set them off.
Others, principally citizen skeptics, largely applying principles of deduction, concluded that the devices couldn’t possibly work. Why? Well, first, the manufacturers (a slightly grand word for a not very grand backyard operation) claimed that the devices could be used to detect just about anything: simply by putting a sensor card in a closed chamber with a fragment of a substance (including illegal ivory and truffles!), the user could then slot the sensor card into the device and, presto, it would supposedly detect that substance.
Second, they claimed the device could be used over distance, through concrete and water–without any scientifically credible explanation of how this could possibly be done.
For TOK students, the problem is a vivid demonstration of the principle popularized by Carl Sagan, “Extraordinary claims require extraordinary evidence.”
4. Tests: induction and the scientific method
Initially, it seems, one skeptical organization, run by retired millionaire magician James Randi, hired an independent company to test the identical American forerunner of the ADE651, the “Quadro”, and demonstrated through testing that the devices could do nothing.
These tests were ignored by the FBI. Later, government testing in the U.S. in 1996, and in the U.K., where the devices were marketed under the name “Mole”, again demonstrated that the devices were completely useless.
In 2010, the British banned their export and subsequently jailed the con artists.
The matter should, of course, have stopped here. TOK students will point out that it would be difficult to argue against the results of this kind of test, where the human element can be removed, leaving only scientifically verifiable–and repeatable–evidence.
Yes? Well, not quite.
5. Tests: ideomotor effect
The fact remains, however, that subsequent tests, principally run in Iraq, seemed to show, against any scientific possibility, that on some occasions the devices did work. How could this be? Putting aside other explanations for now, particularly interesting for TOK students is the principle of the ideomotor effect.
Consider the following incident:
“General Jabiri, meanwhile, challenged an NYT reporter to test the ADE 651, placing a grenade and a machine pistol in plain view in his office. Every time a policeman used it, the wand pointed at the explosives. Every time the reporter used the device, it failed to detect anything. “You need more training,” said the general.”
Are we to conclude that Jabiri was completely wrong-headed? Likely, yes. After all, the explosives were in plain view. But there is another possibility, one that links to subtle behaviour of the human brain.
Popular Mechanics uses the human failures involved in depending on fake bomb detectors to examine the ideomotor effect: involuntary muscle movements that guide a device like a Ouija board “planchette”, supposedly (in the case of the Ouija board) under the guidance of a spirit. (https://en.wikipedia.org/wiki/Planchette) Appropriately, the article is entitled “The Military Pseudoscience That Just Won’t Die”.
After writing about willow twigs supposedly used to detect water underground (“dowsing rods”), and then about the equally useless devices used in the Vietnam War, the journalist concludes with a description of a study at the University of British Columbia on the psychology of the ideomotor effect and its apparent effectiveness:
“If there are cues that you have not noticed consciously, but which have been tagged by your unconscious mind, dowsing may reveal that unconscious insight.
“The British Columbia study results suggest that, in theory, dowsing might provide a marginal edge for detecting IEDs. By harnessing unconscious knowledge, a person might be slightly more likely to spot a bomb and save lives. In that sense, dowsing might be better than nothing, but it’s an idea best left in the past. McCormick was a criminal because he charged up to $60,000 for a device that was no more effective than a pair of old coat hangers. For that price, you can get a real bomb detector that really saves lives.”
6. Reactions to tests and human psychology
In spite of overwhelming scientific evidence that these detectors
- didn’t work (induction) and
- couldn’t possibly work (deduction),
they continue to be used, both in remote parts of Iraq and, amongst other places, in Egypt, where they are supposedly programmed to detect HIV. (Yes, HIV!)
Admittedly, in the particular case of Iraq, it seems that a complex web of bureaucracy, corruption, and power politicking lies behind much of their continued use (at least until the recent ban).
Cognitive biases and logical fallacies
However, perhaps the most important aspect of this whole story for TOK students is what it reveals about the human tendency to cling to irrational behaviour and/or beliefs in spite of irrefutable evidence to the contrary.
Consider the following brief accounts in the light of such cognitive blunders. You might find it useful to look back to the first post of the two-part series we published in this blog in March, “Conspiracy theories, intuition, and critical thinking”. There we treated the following cognitive biases:
- The need for control
- Pattern finding
- Intentionality Bias
- Proportionality Bias
- Perceived Risk
- Confirmation Bias
To this list, you will probably want to add the following to deal with the problems raised by the knowledge claims about the fake bomb detector:
- Motivated reasoning. This form of reasoning takes confirmation bias a step further: it involves not just accepting as true what accords with what we already believe (confirmation bias), but going on to reject as false anything that doesn’t fit with it. Motivated reasoning “drives people to develop elaborate rationalizations to justify holding beliefs that logic and evidence have shown to be wrong. Motivated reasoning responds defensively to contrary evidence, actively discrediting such evidence or its source without logical or evidentiary justification. Clearly, motivated reasoning is emotion driven.” (See more on motivated reasoning.)
- Sunk cost fallacy. When we have already put time or effort into something, we don’t like the idea that we’ve been wrong or stupid, or that we’ve wasted our resources. So even if a procedure, a product, or an idea is increasingly revealed to be faulty, we resist changing our minds. We’ve sunk too much into believing it already! (See further explanation.)
- Fallacy of cherry picking. Lovely metaphor: a tree is covered with cherries (evidence) but we select only the cherries that suit our purposes and ignore all the others. “When only select evidence is presented in order to persuade the audience to accept a position, and evidence that would go against the position is withheld. The stronger the withheld evidence, the more fallacious the argument.” (See further explanation.)
- Texas sharpshooter fallacy. Another good metaphor: a gunman fires random shots at a barn, but then paints a bull’s-eye target around where the bullets just happened to hit most often. It looks as if he’s an excellent shot, and as if he managed deliberately to hit that target! The metaphor applies to accidental clusters of random data (such as coincidences) onto which a pattern is imposed afterwards, while the data that doesn’t fit the pattern is ignored. (See further explanation of how we fall for this one, with some entertaining examples.)
You will also find an overlapping treatment of cognitive biases in the chapter on “Intuition as a way of knowing” in the Theory of Knowledge Course Book, and treatment of further logical fallacies in the book’s inter-chapters.
TOK exercise: applying critical awareness and skills
The four selected articles below provide rich material for identifying human cognitive tendencies and failures in thinking critically. The passages excerpted from them focus attention on particular quirks of the human mind.
Article 1. Caroline Hawley, “The story of the fake bomb detectors”, BBC News, October 3, 2014.
Astonishingly, perhaps, some officials still blame user error or even the weather for the device’s failure to detect explosives.
Gen Saad Ma’an, from Iraq’s Ministry of Interior, accepts the devices are not working “to the required level” but says the country is still waiting to replace them.
“Maybe it’s affected by conditions – by the weather, by how the policemen themselves are using it,” he says.
Article 2. Martin Chulov, “Iraq PM orders removal of British-made fake bomb detectors”, The Guardian, July 4, 2016.
Some security officials were slow to respond to the order, still holding wands on Monday at approaches to the central city and along roads to the airport and the north. The reluctance to acknowledge them as useless was in part centred in having to acknowledge that there are few alternatives to keeping bombers away from Iraq’s towns and cities.
“Sometimes it is better to pretend,” said one senior interior ministry official. “To say that these don’t work says that we don’t have anything better. The people need some sort of reassurance.”
Article 3. Dominic Evans and Saif Hameed, “Fake bomb detectors banned in Baghdad, but still in use from Beirut to Cairo”, Albawaba News, July 28, 2016.
Police captain Raad Shallal, manning a checkpoint near the town of Khalis in Diyala province, said he knew the detector was useless.
“It serves as a scarecrow, more than a real bomb detector,” he added, standing close to a colleague who was checking vehicles with one of the devices.
That theory, that they might deter bombers even if they cannot detect bombs, was lampooned on Iraqi television by satirist Ahmed al-Basheer.
“So it’s a scarecrow,” he said. “This is the right thing to do, use a device that the entire globe knows is not working in order to scare terrorists who live on the same globe we’re on.”
Article 4. Leo Benedictus, “Why are countries still using the fake bomb detectors sold by a convicted British conman?”, The Guardian, June 9, 2014.
Like dowsers, though, many security personnel continued to keep the faith. In 2010, even after McCormick had been charged with fraud, Pakistan’s Airport Security Force admitted to the Dawn newspaper that they were continuing to use a device of their own design that operates on the same principle.
In 2009, the New York Times confronted bomb squad commander Major General Jehad al-Jabiri with evidence of the ADE 651’s fraudulence, yet he insisted that it was effective, saying: “Whether it’s magic or scientific, what I care about is it detects bombs.”
According to Detective Sergeant Steve Mapp, who led the investigation into McCormick, some people’s belief in the ADE 651 is almost unshakeable. As he told Business Week, “In Kenya they said, ‘No, we know about Mr McCormick’s conviction, but we’re really glad we’ve got them – and they do work.’”
Conclusion
Have you noticed the range of dates across which the exposés of this fake bomb detector spread? Given the thorough debunking of the gadget as a bomb detector (and as a detector of golf balls and HIV!) it is astonishing that anyone persists in clinging to belief in its powers.
If this were an isolated incident, then all it would be worth in a TOK class is an anecdote and a laugh – or perhaps just a rueful head shake about the “funny old world” and “funny ol’ human beings.” And I suppose it deserves these anyhow. We need our laughs, and observing human beings certainly provides a dizzying number of rueful head-shakes!
But we can’t just laugh. The failures that this story illustrates are dismally common: failure to evaluate effectively the reliability of our sources of knowledge; readiness to accept knowledge claims with insufficient or dubious evidence, and even at times with evidence to the contrary; cognitive resistance to questioning our beliefs and changing our minds once we’ve accepted a knowledge claim; and the possibly deadly consequences of false conclusions. In fact, we might not feel inclined to laugh at all over failures in critical thinking, given their possible consequences. Our students won’t be in the position (we hope!) of shopping for bomb detectors, but they will make decisions about when to administer or withhold medical treatment (such as vaccinating children), or about how to place their votes.
References
Kelsey Atherton, “How a millionaire sold fake bomb detectors to governments all over the world”, Popular Science, April 24, 2013. http://www.popsci.com/technology/article/2013-04/how-millionaire-sold-fake-bomb-detectors-governments-world
Leo Benedictus, “Why are countries still using the fake bomb detectors sold by a convicted British conman?”, The Guardian, June 9, 2014. https://www.theguardian.com/world/shortcuts/2014/jun/09/fake-bomb-detectors-british-conman-pakistan-karachi-airport
Martin Chulov, “Iraq PM orders removal of British-made fake bomb detectors”, The Guardian, July 4, 2016. https://www.theguardian.com/world/2016/jul/04/iraq-orders-withdrawal-uk-fake-bomb-detectors-baghdad-haidar-al-abadi-james-mccormick
Dominic Evans and Saif Hameed, “Fake bomb detectors banned in Baghdad, but still in use from Beirut to Cairo”, Albawaba News, July 28, 2016. http://www.albawaba.com/news/fake-bomb-detectors-banned-baghdad-still-use-beirut-cairo-866750
Ben Goldacre, “ADE651: wtf?”, Bad Science, Nov 14, 2009. http://www.badscience.net/2009/11/wtf/
David Hambling, “The Military Pseudoscience That Just Won’t Die”, Popular Mechanics, July 6, 2016. http://www.popularmechanics.com/military/research/a21678/dowsing-iraq-bomb-detectors/
Caroline Hawley, “The story of the fake bomb detectors”, BBC News, October 3, 2014. http://www.bbc.com/news/uk-29459896
James Randi, “Iraqi General Turns Down JREF Challenge, Gets Arrested for Bomb Detector Scam”, James Randi Educational Foundation, February 18, 2011. http://archive.randi.org/site/index.php/swift-blog/1217-iraqi-general-turns-down-jref-challenge-gets-arrested-for-bomb-detector-scam.html
“UK conman found guilty of selling fake bomb detectors to Iraq for $40 million”, Skeptical Science, April 24, 2013. http://www.skeptical-science.com/people/skeptics/uk-conman-guilty-selling-fake-bomb-detectors-iraq-40-million/
Image: Checkpoint in Abu T’Shir, Iraq by Todd Frantom [Public domain], via Wikimedia Commons