CRIME & HEURISTICS Part 2
Heuristics; flashbulb memories; terrorism; policing; offender profiling
In the last Crime & Psychology newsletter, we saw that the human mind is sometimes treated as a limited-capacity processor. We also saw how we human beings constantly try to make the most of that limited capacity. In a phrase, that means cognitive heuristics – the kind of mental short-cuts we use to make sense of our worlds, quickly, cheaply, easily (and, often, wrongly).
Why should Crime & Psychology readers care about this? We have only to consider what we expect from our criminal justice system.
We could ask, for instance, why we punish people who commit crime. The immediate, natural response is that such a question answers itself – but the phenomenon of regression to the mean (which you may remember from last week) makes us think twice. Does incarceration, for instance, really accomplish anything? There are many stories, for sure, of criminals emerging from prison fully reformed, ready to walk the straight and narrow. We tend to consider them evidence that punishment achieves some end. Perhaps it does not: perhaps these particular criminals would have ‘learnt their lessons’ anyway.
Frustrated parents who punish their children often notice an improvement in behaviour afterwards. It seems as if the punishment was effective, but maybe it wasn’t. Maybe it was just regression to the mean. Maybe praise for good behaviour would have been more effective than punishment for bad.
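If you'd like to see that effect with your own eyes, here is a minimal sketch in Python (the numbers and the 0–10 'misbehaviour' scale are entirely made up for illustration). Behaviour is modelled as a stable baseline plus day-to-day noise, 'punishment' is triggered by an unusually bad day, and – crucially – the punishment has no effect whatsoever. The apparent improvement shows up anyway.

```python
import random

random.seed(1)

# Minimal sketch of regression to the mean: each person's behaviour on a
# given day is a stable baseline plus random noise. We "punish" only after
# an unusually bad day and then look at the next day -- with no treatment
# effect at all built into the model.
N_PEOPLE = 10_000
BAD_DAY = 8  # arbitrary threshold on a made-up 0-10 "misbehaviour" scale

def misbehaviour(baseline):
    # Baseline tendency plus day-to-day noise, clipped to the 0-10 scale
    return min(10, max(0, baseline + random.gauss(0, 2)))

punished = 0
improved_after_punishment = 0
for _ in range(N_PEOPLE):
    baseline = random.uniform(2, 6)
    day1 = misbehaviour(baseline)
    if day1 >= BAD_DAY:                 # an unusually bad day triggers "punishment"
        punished += 1
        day2 = misbehaviour(baseline)   # punishment has no effect in this model
        if day2 < day1:
            improved_after_punishment += 1

print(f"Punished after a bad day: {punished}")
print(f"Apparently 'improved' the next day: {improved_after_punishment / punished:.0%}")
# The 'improvement' rate comes out very high even though punishment did nothing:
# extreme days are mostly noise, so the next observation drifts back to the mean.
```

That, in a nutshell, is why before-and-after comparisons flatter almost any intervention aimed at people selected for their extreme behaviour.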
Here are some more important cognitive heuristics. Not only can we see their bearing on our subject-matter: thinking about them can help us to think, period:
Falsification of history: Everyone who was alive at the time can remember where they were when JFK was assassinated. At least, 99% of them managed to do so, 14 years after the event. ‘I can still feel the tread on the steps of Emerson Hall’; ‘I was carrying a carton of Viceroy cigarettes, which I dropped’[i]. Or did they? How can we possibly tell? Perhaps these people were nowhere near Emerson Hall; perhaps they were carrying a box of Oreos, or car parts, or condoms. There’s no way to be sure that they are right, however vivid they believe their memories to be. Indeed, a large part of what you and I think we remember – ‘flashbulb memories’ or otherwise – never happened at all. That’s because we reconstruct events in our minds, rather than replaying them on some kind of internal VCR. And what we remember can be quite self-serving. You’ll notice this with sports fans, in particular. Before the big fight, they are unsure who will win. Afterwards, what do they say? ‘I knew it! I knew it all along!’ No, they didn’t. This heuristic helps us feel happier and more confident about our knowledge of, and place in, the world.
In-group out-group bias: We are quite weak, defenceless animals. Trapped in the kind of hostile environment in which human beings evolved, we had to rely on other members of our group, or tribe, to help us survive. To be thrown out of the group meant death. No surprise, then, that human beings tend to be quite sociable, forgiving animals. We cannot afford to lose our social support. Members of our own group appear good, decent, intelligent: they have all kinds of fine, valuable qualities. Members of other groups are quite similar to each other, bad, unpleasant, and quite possibly stupid. The categorisation of people into groups seems automatically to result in a cognitive bias towards our own group and against the others. Just look at the news. A few months ago, I wrote a two-part newsletter about this exact phenomenon. You’ll like it! Check it out here and here.
Illusory correlation: Two unusual events occur close to each other in time or space. We suppose they must belong together. Imagine that you go out to your local supermarket or grocery store and you happen to see a man in a cowboy hat, shoplifting. It is unusual to spot a shoplifter in action, and (at least where I live) not too many men go shopping in cowboy hats. Thereafter, you may feel just the tiniest bit suspicious of men in cowboy hats. Two unusual events went together, and, psychologically, it must seem that there’s a connection. If you live in Texas, you’ll need to supply your own example.
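For the statistically inclined, here is a toy simulation (Python; the base rates are invented) showing why a single vivid pairing tells us nothing. Cowboy hats and shoplifting are generated completely independently, yet the memorable hat-plus-shoplifter cell still turns up – and the conditional rates reveal no association at all.

```python
import random

random.seed(2)

# Toy sketch of illusory correlation: two rare, completely independent
# attributes. The hat/shoplifter cell is the one we notice and remember,
# but the data contain no association at all.
P_HAT = 0.02          # assumed base rate of cowboy hats (made up)
P_SHOPLIFT = 0.01     # assumed base rate of shoplifting (made up)
N = 100_000

counts = {("hat", "steals"): 0, ("hat", "honest"): 0,
          ("no hat", "steals"): 0, ("no hat", "honest"): 0}

for _ in range(N):
    hat = "hat" if random.random() < P_HAT else "no hat"
    steals = "steals" if random.random() < P_SHOPLIFT else "honest"
    counts[(hat, steals)] += 1

for cell, n in counts.items():
    print(cell, n)

# The memorable ("hat", "steals") cell is tiny, but so is what independence
# predicts: N * P_HAT * P_SHOPLIFT = 20. The conditional rates tell the real story:
p_given_hat = counts[("hat", "steals")] / (counts[("hat", "steals")] + counts[("hat", "honest")])
p_given_no_hat = counts[("no hat", "steals")] / (counts[("no hat", "steals")] + counts[("no hat", "honest")])
print(f"P(shoplifts | hat)    = {p_given_hat:.3%}")
print(f"P(shoplifts | no hat) = {p_given_no_hat:.3%}")
# Both rates hover around 1% -- no correlation, just a vivid coincidence.
```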
Hindsight bias: When you look back on a sequence of events, they seem inevitable and predictable. Hence the criticism often levelled at the American government for failing to prevent, first, the attacks on Pearl Harbor that led to US involvement in the Second World War, and, second, attacks on the World Trade Center that led to various kinds of US involvement across the Middle East. There were plenty of clues that both were about to happen, critics claim. That’s correct, of course – but it’s easy to forget, with the benefit of hindsight bias, just how difficult it can be, at the time, to sort the signal from the noise. Why wasn’t Lee Harvey Oswald stopped earlier, when the CIA and FBI had plenty of reason not to trust him? Same reason.
Hindsight bias seems especially pernicious when the outcome was tragic, as it was on September 11.
On July 10, 2001, George Tenet, who was the director of the Central Intelligence Agency, learnt that al-Qaeda was planning a terrorist strike in the USA. He took that information to Condoleezza Rice, who was the National Security Adviser, rather than approaching President George W Bush himself. Here’s what Ben Bradlee of The Washington Post had to say about that: ‘It seems to me elementary that if you’ve got the story that’s going to dominate history, you might as well go right to the President’[ii]. Well, maybe so – but it is only through hindsight bias that you and I know the story was going to dominate history. It is unclear that George Tenet could possibly have known that as early as July. Conspiracy theorists have their own ideas about what happened not just in 2001, but in 1963, as well. You can see how their ideas get traction.
Illusory skill: Oh boy, there’s a lot to be said here. A chap could write a book. Let’s try to keep it bite-sized. Soundbite sized, even: ‘experts aren’t always right’. Also, you don’t hear very often from the ones who are most likely to be right.
What do I mean by that? The psychologist Philip Tetlock gathered together 80 000 (yes, 80 000!) political predictions from experts who made their living that way. He wanted to know how accurate they were. Here’s Daniel Kahneman: ‘people who spend their time, and earn their living, studying a particular topic produce poorer predictions than dart-throwing monkeys who would have distributed their choices evenly over the options’.
Not very encouraging, is it? Never fear, it gets worse. Those who had the most knowledge produced the worst predictions. Why should that be? Well, it seems that advanced knowledge of a domain produces a feeling of expertise, which can lead to unrealistic overconfidence. In fact, beyond a certain point, extra knowledge in an area does not produce extra predictive power (economists talk about ‘marginal utility’, but let’s leave that to them). We reach that point very quickly. You or I can probably make political predictions that are very nearly as good as the experts’. And do you happen to own a dart-throwing monkey?
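To see how an overconfident expert can lose to a dart-throwing monkey, here is a rough sketch (Python; the hit rate and confidence figures are illustrative assumptions, not Tetlock’s data). Each question has three possible outcomes. The ‘expert’ backs a favourite with 90% confidence and is right a bit more often than chance; the ‘monkey’ spreads its probability evenly. Scored with the standard Brier score (lower is better), the monkey usually wins.

```python
import random

random.seed(3)

# Sketch of why a 'dart-throwing monkey' can beat an overconfident expert.
# Three possible outcomes per question; the numbers are illustrative
# assumptions, not Tetlock's data.
N_QUESTIONS = 10_000
N_OPTIONS = 3
EXPERT_HIT_RATE = 0.45    # assumed: the expert's favoured option is right 45% of the time
EXPERT_CONFIDENCE = 0.9   # assumed: but the expert assigns it 90% probability

def brier(probabilities, outcome):
    # Brier score: squared error of the probability vector (lower is better)
    return sum((p - (1.0 if i == outcome else 0.0)) ** 2
               for i, p in enumerate(probabilities))

monkey_probs = [1 / N_OPTIONS] * N_OPTIONS   # spread evenly over the options
expert_probs = [EXPERT_CONFIDENCE] + [(1 - EXPERT_CONFIDENCE) / (N_OPTIONS - 1)] * (N_OPTIONS - 1)

expert_total = monkey_total = 0.0
for _ in range(N_QUESTIONS):
    # The expert bets heavily on option 0; the rest share the leftover 10%
    outcome = 0 if random.random() < EXPERT_HIT_RATE else random.choice([1, 2])
    expert_total += brier(expert_probs, outcome)
    monkey_total += brier(monkey_probs, outcome)

print(f"Expert mean Brier score: {expert_total / N_QUESTIONS:.3f}")
print(f"Monkey mean Brier score: {monkey_total / N_QUESTIONS:.3f}")
# The uniform 'monkey' comes out ahead: being right a little more often than
# chance does not compensate for being wildly overconfident when wrong.
```

The exact numbers depend on the assumptions, of course, but the moral survives tinkering: calibration matters more than the occasional spectacular hit.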
The most famous, and sought-after, forecasters are the ones who make the most exciting predictions. You can see why: television news journalists don’t want to interview bland, boring pundits who say ‘It may be like this, but there’s a 20% probability it may go the other way’. They want pyrotechnics, pin-drop moments, pizzazz – which is another way of saying they want viewers.
There is a tendency on the crime pages to devote rather a lot of attention to the opinions of ‘offender profilers’ (long-time readers will know that I have a certain scepticism about them). The press wants offender profilers to make confident predictions about the nature and background of unknown perpetrators. Some profilers (not all, by any means) are rather too happy to oblige. The investigative psychologist David Canter dealt with this issue at some length in his book, Mapping Murder, when discussing the Washington Sniper case from 2002. It was a unique case. No one had seen anything like it before. How, then, could any responsible scientist make predictions about the personal characteristics of the killers? The answer is that they couldn’t – but that didn’t stop them. Not all of them, anyway.
‘We live,’ Canter wrote, ‘in a world where “expertise” is a substitute for considered discussion. When faced with a difficult problem, such as what the Washington Sniper killings were all about, content producers for newspapers, radio and television fill the vacuum in understanding by quoting anyone who claims some special expertise […] Yet as Marymount University’s Professor of Forensic Psychology, Mary Lindahl, said with some ferocity after the glut of “expert” opinions offered during the hunt for the Washington Sniper, “The public could profile as well as the experts on TV…most were wrong”’[iii]. The picture below shows John Allen Muhammad, one of the two ‘Washington Snipers’.
Action bias: In situations that we are unfamiliar with, it seems better to do something than nothing. That is not necessarily true. Indeed, nothing can often be the optimal strategy. According to Rolf Dobelli, in his book, The Art of Thinking Clearly, novice police officers often make the mistake of intervening too early in a threatening situation, while their more experienced colleagues stand back and wait[iv]. The more experience the officer has, the better they understand that ‘doing something’ sometimes means ‘doing the wrong thing’, thus causing more trouble than you prevent. Don’t just do something, stand there!
False consensus effect: Are you in favour of capital punishment? If so, I bet you think most people are. If not, I bet you think most people are against it. That is because we tend to think of our own opinions as more popular than they really are[v]. The same goes for our interests. Do you enjoy the nifty newsletters you receive every week from Crime & Psychology? Of course you do! Try recruiting a few more readers – you know they’ll love it. If you don’t enjoy them, on the other hand, I’m safe in assuming you’re not reading this anyway.
And with that in mind, let me leave you with a few words of wisdom: If you’d like to support this peerless publication, it’s very easy to do. I’ve providentially provided the means. Simply bop a bright blue button below. Thank you!
All images courtesy of WikiMedia Commons. References provided largely out of academic habit, but also so you can chase up anything you find particularly interesting.
[i] Yarmey AD & Bull III AD: ‘Where were you when President Kennedy was assassinated?’ Bulletin of the Psychonomic Society, 11, 1978, pp133-35
[ii] Ben Bradlee, quoted in Kahneman, Daniel: Thinking, Fast & Slow, Penguin, London, p204
[iii] Canter, David: Mapping Murder – The secrets of geographical profiling, Virgin, 2003, p4
[iv] Dobelli, Rolf: The Art of Thinking Clearly, Sceptre, Great Britain, 2014, p134
[v] Ross, L; Green, D & House, P: ‘The “false consensus effect” – An egocentric bias in social perception & attribution processes’, Journal of Experimental Social Psychology, 13(3), May 1977, pp279-301