5X5 Episode 105

Representativeness Heuristic
S: This is the SGU 5X5 and tonight we are talking about the representativeness heuristic. Heuristics are quick rules of thumb that are true most of the time but are not strictly logically correct. So they may serve us well as a first approximation of what's likely to be true, but because they're not strictly logically correct, they can also lead us astray; lead us to conclusions which are not true. The representativeness heuristic, for example, occurs when you assume that an individual or an object is a member of a group because it has the features of that group, but you fail to consider other factors like the so-called base rate; the frequency of that group itself. Let me give you the classic example from the literature. This was first described by Daniel Kahneman and Amos Tversky, and they used this example in the psychological literature. It is the description of Tom W., of whom they write: he is of high intelligence although lacking in true creativity; he has a need for order and clarity and for neat and tidy systems in which every detail finds its appropriate place. His writing is rather dull and mechanical, occasionally enlivened by somewhat corny puns and by flashes of imagination of the sci-fi type. He has a strong drive for competence. He seems to feel little sympathy for other people and does not enjoy interacting with others. Self-centered, he nonetheless has a deep moral sense. Subjects of the study were then asked to guess which field Tom studies at a specific college. Is it most likely that he is in computer science and engineering, the humanities, medicine, or some other speciality, some other major? And most people guessed that he was in either engineering or computer science, because he's representative of those groups. But they failed to consider the percentage of students at the college that belonged to each of those groups; the base rate. What if, for example, 99% of the students are in the humanities? Is it still more likely that Tom is an engineering student? 
No, actually it's more likely that he's in the humanities because of the base rate. So we make that mental error all the time, assuming things belong to a group that they look like or resemble; that we feel they're representative of and not considering the statistics involved.
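The base-rate point can be put in numbers with Bayes' rule. A minimal sketch in Python; the base rates and "fit" percentages below are made up for illustration and are not from the study:

```python
# Hypothetical numbers: 3% of students are in engineering, 80% in the
# humanities, and Tom's description fits 70% of engineers but only 5%
# of humanities students.
p_eng, p_hum = 0.03, 0.80                         # base rates (invented)
p_desc_given_eng, p_desc_given_hum = 0.70, 0.05   # how well Tom "fits"

# Unnormalized posteriors via Bayes' rule: P(group | description)
# is proportional to P(group) * P(description | group).
post_eng = p_eng * p_desc_given_eng   # 0.021
post_hum = p_hum * p_desc_given_hum   # 0.040

print(post_hum > post_eng)  # True: humanities still more likely
```

Even though the description fits engineers fourteen times better, the humanities' much larger base rate wins.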

R: Another example of the representativeness heuristic is the conjunction fallacy, which violates the rule that the probability of two things happening together can't be more than the probability of just one of those things happening alone. So to give you an example, I could tell you that Steve likes being out in nature and he enjoys quiet, solitary activities. And if I told you you have two options: one, Steve is a sports fan, and two, Steve is a sports fan and a bird watcher, which would you say is more likely? A lot of people will immediately identify nature and solitary activities with bird watching and say that the second one is more likely. When, in fact, it can't be, because being both a sports fan and a bird watcher can never be more likely than just being a sports fan. It's much more likely that he's just one of those things rather than both.

S: Right, well the probability of A and B together can't be greater than the probability of A or the probability of B. But we're so compelled by the representativeness of the description that we make that fairly basic mathematical error, an error we can easily see once the problem is stripped of the representativeness: say it in mathematical terms and it's obvious.
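The inequality holds for any concrete numbers you plug in. A tiny sketch; the probabilities here are invented purely for illustration:

```python
# Invented probabilities for illustration.
p_sports_fan = 0.30              # P(A): Steve is a sports fan
p_birdwatcher_given_fan = 0.10   # P(B | A): bird watcher, given sports fan

# P(A and B) = P(A) * P(B | A), which can never exceed P(A),
# since P(B | A) is at most 1.
p_both = p_sports_fan * p_birdwatcher_given_fan

print(round(p_both, 2))          # 0.03
print(p_both <= p_sports_fan)    # True, for any choice of numbers
```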

B: One of my favourite examples of the representativeness heuristic is regression to the mean. This is the tendency of extreme events, like scores or certain behaviours, to return towards more average levels. A great example is the Sports Illustrated Jinx. Many people think that just being on the cover of that magazine causes teams to perform worse. This is actually regression to the mean, misread because of the representativeness heuristic. If a team, for example, gets on the cover of Sports Illustrated, it just kind of makes sense that they're most likely having a really, really good year. So it's just a matter of time before they return to their average level of play.
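The jinx falls out of a quick simulation. A sketch under the assumption that a season's record is true skill plus random luck; all the numbers here (team count, noise levels) are arbitrary:

```python
import random

random.seed(1)

# Each "team" has a fixed true skill; a season's record is skill plus luck.
skills = [random.gauss(0, 1) for _ in range(10_000)]
season1 = [s + random.gauss(0, 1) for s in skills]
season2 = [s + random.gauss(0, 1) for s in skills]

# The "cover teams": the 100 best records in season 1. They were selected
# partly for skill, partly for good luck -- and the luck doesn't repeat.
top = sorted(range(len(skills)), key=lambda i: season1[i], reverse=True)[:100]
avg1 = sum(season1[i] for i in top) / len(top)
avg2 = sum(season2[i] for i in top) / len(top)

print(avg2 < avg1)  # True: the cover teams slide back toward average
```

No jinx is coded anywhere; the decline appears purely because extreme records are selected for lucky noise.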

J: I have one called the gambler's fallacy. And this is one that Bob and I are very aware of, since every time we go to TAM we always throw some money down on blackjack or whatnot, and it's fun, but anyway. So the gambler's fallacy is the belief that the probability of an event actually changes based on what has transpired previously. A good example would be the roll of a die or a coin toss. If you flipped a coin, say, 15 times and got heads every time, you might think that there's less of a chance of heads coming up on the next flip, but the reality is that every flip is independent and has nothing to do with prior flips. And getting back to being in the casino: betting at a roulette table, you may have noticed there are displays at each table that show the last 20 or so outcomes. The casino is using the fact that they know people believe the odds change depending on what's come before, and this entices people to place bets. So you can guarantee that if the casino is using it, it's there for them to make money, and it's not accurate.

S: Yeah, so the representative heuristic is the underlying cognitive bias in the gambler's fallacy, because people think they have a sense of what kind of pattern represents randomness. If you flip six heads in a row, you think, "oh, well, it would represent or seem more random for tails to occur next", but in fact you have to ignore what looks random and go purely on the statistics involved and independent events are not affected by what's happened in the past. So the probability of that next flip is still 50-50; it's not more likely to be tails because that represents what looks like a random sequence.
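That the next flip stays 50-50 even after a streak is easy to check empirically. A quick simulation; the streak length and flip count are arbitrary choices:

```python
import random

random.seed(0)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

# Collect every flip that immediately follows a run of five heads.
after_streak = [flips[i] for i in range(5, len(flips)) if all(flips[i-5:i])]

# If the gambler's fallacy were right, heads would be underrepresented
# here. Independence says the rate stays at roughly one half.
rate = sum(after_streak) / len(after_streak)
print(round(rate, 2))  # very close to 0.5
```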

E: Yeah, all this talk of the representativeness heuristic reminded me of a time, a long time ago, in which I was in the market for purchasing a new car, and there was a particular brand of car in which I didn't have much confidence. But this particular model of this brand of car had all the features I wanted, so I went ahead and bought it brand new. 2,000 miles into driving the car, bam, the engine actually dies, and worse than that, it kind of destroyed itself; tore itself to pieces. I was thinking the whole time, "I knew I shouldn't have done it; I should have paid attention; I knew it", you know, I kind of always had in the back of my mind that something was going to go wrong with this vehicle. And I felt that way the first day, but then I thought about it a little bit further and did a little bit more research into it, and it turns out that the problem I had with the engine on this car is actually a very rare problem, something that really doesn't happen all that often. So I kind of slipped into that heuristic of convincing myself that something was one way to begin with, and then when it happens it's like, "oh well, that figures". But then you look a little bit deeper and it turns out, well, no, this is actually a very rare kind of event, and I probably shouldn't have been so hasty to judge it so harshly.

S: This heuristic also comes up quite a bit in medicine. Physicians have to make diagnoses, often based on a list of signs and symptoms that a patient has, and inexperienced physicians especially fall into the representativeness heuristic all the time. If a patient has features which look like a disease, then they think that that disease is very likely. But they fail to consider how common those diseases are. It's actually much more likely to see an atypical presentation of a very common disease than a classic or typical presentation of an extremely rare disease. We often refer to students or inexperienced doctors making the rare diagnosis because of the representativeness heuristic as chasing a "zebra", referring to Sherlock Holmes' statement to Watson that if you hear the clopping of hooves on a cobblestone street in London, you should think "horse", not "zebra", because horses are common. Even if the clopping sounds like a zebra's clopping, it's still more likely to be a horse.