Mental shortcuts: A psychological explanation of why psychiatrists overestimate risk

In his 2011 book Thinking, Fast and Slow, Nobel Prize-winning psychologist Daniel Kahneman explains how humans make decisions. He discusses how we use heuristics, otherwise known as “rules of thumb”, to cut corners in our thinking and save time and effort in making judgments. Usually these heuristics work well, but in tricky situations they can fall seriously short of what we need. Estimating the likelihood of a rare event is one such area.

The power of heuristics

Kahneman gives the example of driving around Israel during a time when suicide bombings on buses were relatively common. He knew the hard statistics: over the previous 3 years, a total of 236 people had been killed by bombs, yet over 1.3 million people rode the bus every day in Israel. His chances of coming to harm were incredibly small, but he couldn’t help feeling afraid of buses. Why?

Several heuristics will have interfered with Kahneman’s ability to make a rational judgment here. Firstly, the availability heuristic: the easier it is to bring to mind a previous example of a bus bomb, the more likely you are to think it will happen again. This is how terrorism works. With so much media attention publicising the bombings, there will have been copious opportunities for people to conjure up mental images of bus bombs – scaring them into believing it will recur more frequently than it actually will.

At the extreme, this can grow into an availability cascade, in which a small or non-existent threat is blown out of all proportion because one person voices concern, creating a highly accessible mental image that others use to voice concern, and so on, until everyone is concerned. Mad cow disease is a classic example of this, or electricity pylons giving people cancer. More topically, Romanian immigrants “flooding into the UK” fits into this category nicely too.

Also involved is the representativeness heuristic. How much a bus looks like a typical bus used for suicide bombs will influence how likely people think it is to blow up. If Israel’s buses were a variety of different models, people would be far more scared of the models in which bombs had already been placed, and less scared of the others, even though in reality the model probably wouldn’t have had much impact on the risk.

The third heuristic to interfere is the affect heuristic. Whether we like it or not, how we feel about something plays a big part in how good we think it is for us. We tend to overestimate the benefits of things we like and the risks of things we don’t like. Obviously, vivid images of the aftermath of suicide bombings are going to inspire some pretty serious levels of dislike, so people will view the risk of further bombings as even bigger. Incidentally, we use this technique – making people feel bad about something to increase their estimate of its risk – for good purposes too: putting unpalatable pictures of tumours on cigarette packets, for instance.

This ties in with another cognitive trick we use to save mental effort, called attribute substitution. When we have trouble answering a tough question, like “how likely is this bus to blow up?”, our minds will often substitute an easier question and answer that instead, like “how awful would it be if that bus blew up?”. We then take our answer and match it to an answer of corresponding intensity for the first question. “It would be really awful if that bus blew up” becomes “that bus is really likely to blow up”.

As if those weren’t enough reasons to declare humans flawed at judging risk, it turns out that we’re way keener to avoid a loss than to make a gain. This is called loss aversion. Most people, says Kahneman, will turn down the offer of tossing a coin that would win them $125 or lose them $100. In fact, it’s only when people are offered $200 for winning that coin toss that they start to agree to gamble. We hate to lose, even if it costs us a chance of winning.
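The arithmetic makes the asymmetry plain. A quick sketch of the expected values in Kahneman’s example:

```python
# Expected value of the gamble Kahneman describes: a coin toss that wins
# $125 or loses $100, each with probability 0.5.
win, loss = 125, -100
expected_value = 0.5 * win + 0.5 * loss
print(expected_value)  # 12.5: a positive expectation, yet most people refuse

# People only start accepting once the win reaches about $200 - roughly
# a 2:1 win-to-loss ratio.
expected_value_at_200 = 0.5 * 200 + 0.5 * -100
print(expected_value_at_200)  # 50.0
```

A rational expected-value maximiser would take the first gamble; loss aversion means most of us demand roughly twice the potential loss before we will.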

People are so keen to avoid losing that they’ll make statistically bad decisions in order to avoid it. Take insurance, for example. People will pay way more to insure their belongings than the risk of them being damaged would suggest is reasonable. The more you stand to lose, the greater the amount over the odds you’ll pay to not lose it.
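To illustrate with made-up numbers (the $10,000 valuation and the 1-in-500 risk here are purely hypothetical, chosen only to show the shape of the calculation):

```python
# Hypothetical: belongings worth $10,000 with a 1-in-500 annual chance of
# total loss. The statistically "fair" premium is the expected annual loss.
value = 10_000
p_loss = 1 / 500
fair_premium = p_loss * value
print(fair_premium)  # 20.0 - yet loss-averse buyers will happily pay far more
```

Anything paid above that expected loss (beyond the insurer’s genuine running costs) is, statistically speaking, the price of peace of mind.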

Risk in mental health

So how does all this tie in to mental health? Well, psychiatrists judge risk all the time. At least we think we do.

Let’s take judging the risk of a mental health patient killing themselves or someone else. We often section people because we think the risks of this are so high.

In fact, on paper, the risks of both those outcomes are incredibly small.

In 2009, for example, 84 patients killed themselves during hospital admission or on a period of trial leave, out of a total of approximately 120,000 admissions involving 108,000 people. That’s a suicide rate of around 1 in 1400 admissions. Even if we take the cases in which risk of suicide was the major contributing factor to admission (21%), that’s still only 1 in 285 admissions.
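As a sanity check on that headline figure, here is the arithmetic behind “around 1 in 1400”, using only the 2009 numbers quoted above:

```python
# 84 in-patient suicides across roughly 120,000 admissions in 2009.
suicides = 84
admissions = 120_000
print(round(admissions / suicides))  # 1429, i.e. around 1 in 1400 admissions
```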

Likewise for homicide: in the whole of 2011, a minuscule total of 18 people convicted of homicide had been given a diagnosis of schizophrenia at any point in their life. That’s against a prevalence of schizophrenia of 0.3-0.7% for the entire country, amounting to a risk quoted by one study as 1 in 9090 cases. The same study found that the homicide rate during a first episode of psychosis – often the time when a patient is most ill and most likely to be sectioned – was 1 in 629 patients.
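To put those “one in N” figures on a common scale, here they are converted to percentages (purely a unit conversion of the rates quoted above; note they cover different denominators and time periods, so the comparison is loose):

```python
# The four rates quoted in the text, as "1 in n".
rates = {
    "suicide per admission": 1400,
    "suicide, risk-driven admissions": 285,
    "homicide per person with schizophrenia": 9090,
    "homicide during first-episode psychosis": 629,
}
for label, n in rates.items():
    # Express a 1-in-n rate as a percentage.
    print(f"{label}: 1 in {n} = {100 / n:.3f}%")
```

Even the largest of these works out at about a third of one percent.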

Mental shortcuts

So when we deem someone to be “high risk” – and I do this myself – what do we really mean?

If you take Kahneman’s point of view, it means we’ve used our heuristics to cut corners in our decision making and come up with a heavily biased assessment based on how we feel and our active memories, rather than the facts of the matter.

When we see someone in A+E who feels suicidal, or is so psychotic we feel they might hurt someone, what do we think of? Is it the statistics? No.

We use the availability heuristic to conjure up images of previous patients we’ve had, or even just heard of, who have killed themselves or hurt others. Newspaper headlines like The Sun’s “1,200 killed by mental patients” crop up in our heads.

We use the representativeness heuristic to liken the patient in front of us to what we imagine the typical patient who will kill themselves or someone else to look like – and they usually fit the bill because the image is so vague.

We use our affect heuristic to allow our feelings to judge how risky the situation is – and because there can be no worse feeling than making a wrong decision and seeing someone die because of it, we decide that the risk is high.

Finally, we use our aversion to loss to take more precautions than the statistics deem necessary to ensure what’s important – our patient’s safety.

None of these heuristics would cause too much of a problem if we were dealing with small decisions, or dealing with a moderate decision only once. But that’s not what we do; we make big decisions about bringing people into hospital because they’re “high risk” all the time. Potentially hundreds of people, to prevent just one or two bad outcomes.

Why use risk as a basis for treatment?

On this evidence, I don’t see admitting or treating someone purely for reasons of risk as a reasonable way of practising psychiatry. I certainly don’t see it as a good enough reason for taking away someone’s liberty by sectioning them. If we were any good at forecasting risk there might be a case for it, but as it stands, we’re so poor at predicting bad outcomes that the amount of treating and coercing we do to stop one adverse event is far too great.

Fortunately, almost all the people that are brought into hospital as “high risk” also need to be there for their own health – because they are so unwell. The illness and the risk often come together.

Improving people’s health is a far more rational use for a hospital, and I’m sure that by focusing on treating illnesses, the tiny risks our patients do pose will shrink even further – not that our heuristics would be able to tell.

About Alex Langford
I am a psychiatrist (now an SpR) based in Oxford after 3 years working in South East London. Before I went into psychiatry, I used to be a general medical doctor, and I also have a BSc in psychology. I'm particularly interested in improving the public face of psychiatry, evidence based medicine, teaching and patient rights. Don't mention cricket unless you've got the next fortnight free to discuss it.

2 Responses to Mental shortcuts: A psychological explanation of why psychiatrists overestimate risk

  1. ghost says:

    What happens is that as a service user you very quickly learn to never utter the ‘s’ word. Even when you are desperately trying to fight to stay alive and need to be able to disclose the torment in your head. And inevitably some people will then attempt or complete suicide because there was no safe space to talk out feelings without the very real threat of being locked up.

    Compare the approach and ethos of the Samaritans with crisis workers and psychiatrists. The Samaritans see some of the most desperate people in society, who are in theory at the highest risk. However, they also provide a space that is safer than anything you will find in statutory services, and for 2 reasons: they operate a completely confidential space, and they don’t have the weight of statutory responsibility that has bred the ‘cover your back’ approach that underpins all MH service crisis care.

  2. Chris Jackson says:

    Why do you say “I used to be a medical doctor” before you went into psychiatry; obviously all psychiatrists are ‘medical doctors’ (MBBS or otherwise), so your terminology might confuse the public into thinking that they are not qualified physicians. I think I understand what you’re trying to say though, so why not write instead “I used to work in general practice/hospital medicine” which might be clearer/more accurate?
