5X5 Episode 106

From SGUTranscripts
Availability Heuristic
19th March 2012

Transcript Verified

Skeptical Rogues
S: Steven Novella
R: Rebecca Watson
B: Bob Novella
J: Jay Novella
E: Evan Bernstein

Availability Heuristic

You're listening to the Skeptics' Guide 5x5, five minutes with five skeptics, with Steve, Jay, Rebecca, Bob and Evan.


S: This is the SGU 5X5 and this week we are talking about the availability heuristic. This is a bit of a follow-up to the last episode, where we talked about the representativeness heuristic. Heuristics are quick mental rules of thumb. They are hard-coded by evolution in our brains. They're mental short cuts that may be useful or true much of the time, but they are not strictly logically true, and therefore they may lead us to conclusions or thought processes that are not valid. Most of these, including the availability heuristic, were described by psychologists Amos Tversky and Daniel Kahneman. The availability heuristic is simply this: if we can easily think of an example of something, then we think that that something is likely and important. We use the availability of examples that we can bring to mind as a quick-and-dirty way of assessing how common or how important a phenomenon is. We could take medical cases, for example. If someone makes the claim, for example, that smoking causes cancer and is bad for your health, but our grandfather lived to be 90 years old and smoked all his life, we might say, "well, that can't be true, because I can readily think of an example of somebody I know who smoked for a long time yet lived to a ripe old age." The availability of that example looms large in our mind rather than the statistics. How does smoking affect the chance of developing lung cancer or our longevity? We should be thinking about that question statistically. Instead, we allow ourselves to be unduly influenced by the chance examples that may come to mind.

R: And the media can really mess with our ability to understand statistics in a way that the availability heuristic can explain. For instance, a few years ago, the big news story was that it was the summer of the shark, because apparently there were tons of shark attacks happening; people were scared to go to the beach. It was almost like the year that Jaws came out. In fact, when you looked at the statistics, there were actually fewer shark attacks that summer than in previous summers or some years after that. But the media was putting such a highlight on those stories that it tricked people into thinking that it must be a common occurrence.

S: In fact, if you ask people whether it's more likely to die of a shark attack or of some type of cancer, they might say shark attacks, because there are examples put in front of them in the media, when in fact, even an obscure type of cancer that people don't hear about may statistically be much more likely. Further, there was a study along those same lines, Rebecca, of people who watch soap operas. And they actually believe that a much higher proportion of the population are doctors and lawyers than people who do not regularly watch soap operas.

R: They also believe that a higher percentage of the population are identical twins, one of whom is evil.

S: That's true.

J: (laughs)

B: One of the simplest demonstrations of this heuristic was actually produced by Tversky and Kahneman in the classic study in which they asked people to think of the frequency of words that begin with the letter K, compared to words having K as the third letter. Since it's much easier for most people to think of K words, it skews our perception of the relative frequency of such words. There's also an interesting corollary to the availability heuristic and it demonstrates just how primed people are to fall prey to it. It turns out that just imagining something can make it later seem more likely to happen. For example, most people in the mid-70s when they were asked to imagine Gerald Ford winning the presidency later considered him much more likely to win the election than his opponent. People asked to do the same about Jimmy Carter also subsequently considered him to be the overwhelmingly likely winner.

E: Yeah, Bob; it's putting things in front of people's minds, and they make decisions based on what it was that was just recently seen. And people who are trying to sell things are aware of this, and they take advantage of this heuristic in people. For example, take the lottery. Now, lotteries; they don't try to sell tickets by emphasising the statistical odds that any ticket has of winning, because of course, the odds are very, very much against you winning jackpots. So those who do advertise lotteries, they want the first thing that comes to a person's mind to be winning. That's why places that sell lottery tickets have signs out in front saying, "we had a ten thousand dollar winner here"; "we had a million-dollar winner here". It draws you in; it makes you more likely to actually want to play. If you see signs denoting that there are winners to this game, you're more likely to actually purchase those tickets, rather than thinking about the odds, which put you in a much greater position to actually lose.

J: Another example of the availability heuristic is about bigotry. This one might be difficult to hear, but I'll bet many of these things you've heard before, or they match up with a preconception you have: Have you heard that Jewish people are greedy, or that Italian people are greasy, or that Southern Americans are stupid, or French people are mean, or Germans are strict, or that Russians and Irish people drink too much alcohol, or Gypsies steal and lie? And of course, most of this information basically comes from conversation and comes down from your family, or comes from the society that you live in. Very rarely do we have a lot of first-hand information that reinforces these things. It's really just programming that we get from our childhood.

S: Yeah, that's where confirmation bias kicks in. So, once we have a stereotype or a bigoted perspective, then we look for examples that confirm the stereotype. And because we can think of available examples, partly because they're propagated, because they support the stereotype, we think that therefore it's likely. If we see on the news, for example, people of colour being arrested, then we think, "oh, that's therefore typical; that's representative of that group, because I am exposed to examples of it happening". So, the combination of availability, anecdotes, and confirmation bias does lead to bigotry and stereotyping, even when those stereotypes go against the statistical evidence. That's how we are programmed to evaluate information, and it's incredibly flawed. It leads us to conclusions and to biased outlooks that do not accord with reality.

S: SGU 5x5 is a companion podcast to the Skeptics' Guide to the Universe, a weekly science podcast brought to you by the New England Skeptical Society in association with skepchick.org. For more information on this and other episodes, visit our website at www.theskepticsguide.org. Music is provided by Jake Wilson.

