SGU Episode 522



SGU Episode 522
July 11th 2015
Dragon lizard.jpg
(brief caption for the episode icon)

SGU 521                      SGU 523

Skeptical Rogues
S: Steven Novella

B: Bob Novella

J: Jay Novella

E: Evan Bernstein

Quote of the Week

It is true that that may hold in these things, which is the general root of superstition; namely, that men observe when things hit, and not when they miss; and commit to memory the one, and forget and pass over the other.

Francis Bacon

Links
Download Podcast
Show Notes
Forum Discussion


Introduction[edit]

You're listening to the Skeptics' Guide to the Universe, your escape to reality.

Forgotten Superheroes of Science (2:59)[edit]

  • Charles H. Townes: Nobel Prize-winner and co-inventor of the frickin' laser

News Items[edit]

New Sleep Recommendations (7:10)[edit]

A Logic Lesson (16:51)[edit]

S: All right, hey, did you guys take that puzzle, that science puzzle in the New York Times?

J: Yeah

E: Yes, damn it.

S: How d'ya do?

E: As bad as everyone else.

J: I did very well.

S: Yeah, this is a very interesting puzzle, which I like because it teaches, in a very simple way, a good lesson in critical thinking. So, the puzzle is - and I will provide the link. And if you want to take it before you listen to the rest of this segment, go ahead. I also blogged about it on NeuroLogica. So you could go there, and I link to the puzzle. It's called A Quick Logic Lesson.

Essentially, what the puzzle is, it gives you three digits, and then it says, “The sequence of three numbers follows some rule. You can try to figure out the rule by entering number sequences,” you know, “three numbers into the box. And then we'll tell you whether or not that sequence fits the rule. When you think you have figured out the rule, enter it in the box further below. And then we'll discuss what the real answer is.”

Apparently, seventy percent of people – according to this survey – fall into a very common logical error when figuring out the puzzle. And I know this is something we talked about on the show before, but this is a nice, elegant demonstration. I have given it to a few people, and a lot of them, about half of the people I've asked to do it while I watch, committed this error.

So, actually, the explanation after you give your answer to the puzzle talks about confirmation bias. So that's a good, solid, critical thinking concept to get across to people that

B: It's huge!

S: yeah. You tend to seek out confirming data, rather than negative data. So they explained it as, “People will enter in sequences that fit their hypothesis. And then when they get a positive response, they think, 'Oh, I must be correct, because I got a positive response.'” And they said that that's confirmation bias, 'cause you're not entering sequences designed to get a “no” answer, 'cause people don't like getting a negative response.

That's really only partially correct. And while the lesson they were teaching is legitimate, there's actually a more subtle thought process going on here. There's something specifically called the congruence bias.

Congruence bias comes from when you have a hypothesis to explain some phenomenon, that you tend to only test your own hypothesis, rather than also testing competing hypotheses. That's really, I think, a better description of what's going on here.

So you think, “All right,” thinking in your head, “I think that the rule is, you double the number,” right? So, you start with one, it goes to two, and then four. Two, then it goes to four, and then eight. So you put in a number sequence that follows the rule, and if you get a positive response, it's, “Okay, that confirms my hypothesis.” But the way science really progresses is that you have to do experiments designed to get a negative result, that you try to break your rule. And only if your rule survives every attempt that you have of breaking it, do you get closer and closer to confidence.

You can never really prove your hypothesis is the one rule, and there aren't other versions of it that might also fit it, you know, unless it's something very, very specific like the sequence of prime numbers, or whatever.

In this case, the rule was very simple. The rule was that each number in the sequence is higher than the previous number in the sequence. That's it. They don't even have to be whole numbers. They don't have to be positive numbers.

B: (Laughs) Nice!
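(For illustration only: the rule Steve just described can be written as a one-line check. Below is a minimal Python sketch; the function name fits_rule and the example sequences are illustrative choices, not anything taken from the Times puzzle itself.)

  def fits_rule(seq):
      # The entire hidden rule: each number is strictly greater than the one before.
      # Nothing about doubling, evenness, or whole numbers; fractions and negative
      # numbers qualify too, as long as the values keep increasing.
      return all(a < b for a, b in zip(seq, seq[1:]))

  # Confirming-only tests: each one passes, so a "doubling" hypothesis feels proven.
  print(fits_rule([1, 2, 4]))   # True
  print(fits_rule([2, 4, 8]))   # True

  # Tests designed to break the hypothesis are what actually separate the rules:
  print(fits_rule([1, 2, 3]))   # True  -- not doubling, yet it still fits
  print(fits_rule([5, 3, 1]))   # False -- descending values break the real rule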

S: It's very simple. It's so simple. Again, people had, the problem with that rule, it's actually a challenging rule, because it's really hard to prove, because it's so general. But the problem with that rule is it's designed to maximize the congruence bias, right? Because

E: Right

S: if you come up with any alternate hypothesis that involves increasing values, you'll get a positive response. So it subsumes a lot of other, more specific rules; does that make sense?

E: I saw it as, that you're not, your first guess, you're not trying that. You're not trying to break some rule that you've already preconceived in your mind, and then starting off by starting to break it. You want to establish

S: Yeah!

E: a rule in the first place, and then go from there. So it makes sense that you are gonna put in three numbers that are in ascending order, of some combination.

S: When I went through the test, I just tested it systematically. My first guess was like most people, maybe it's the doubling. So I put that in, and it worked. But then you have to go, “Okay, well, let me see if I can come up with some other rule. I'll just do all even numbers.” And that worked too. It's like, “Oh, okay.” And that broke the first rule. So I said, “Okay, now I have to come up with some other option,” So that I put in just random numbers, and it was negative. So now I have something to work with. It's like, “Okay,” then eventually, I figured out,

E: Right

S: “Maybe it's just ascending.” So then I just did ascending, but like numbers that were odd, and there was like, clearly no other relationship among them. And that also was positive. Yeah, but that's a really important lesson, because, if you are trained to think scientifically, or to think critically, or if you, like, for me, I'm a physician. I do this every day. You know, I generate a differential diagnosis, and I teach students. You can't just test your own idea. You gotta think of all the possible alternatives. You gotta test them too.

So for somebody that has like, a profession like science, or that is dependent upon investigation, going through this sequence of events was intuitive and trivial. But for most people who don't do this, they just tested their own hypothesis. When they got positive confirmation, they thought they were done!

So, this applies in everyday life. You know, we think of things, and then we look for a positive example, and if we find one, we think that that confirms our idea, rather than thinking, “All right, can I turn this on its head? What if my idea is wrong? Then what would I expect to see?” You know what I mean?

E: Right
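(Again as an illustration: the testing sequence Steve describes above can be replayed against the same kind of sketch. Only the sequences chosen to potentially break a hypothesis narrow down which rule is actually in play; the sequences and labels below are illustrative, not a transcript of the puzzle's interface.)

  def fits_rule(seq):  # same one-line check as in the earlier sketch
      return all(a < b for a, b in zip(seq, seq[1:]))

  # Replaying the testing strategy described above:
  tests = [
      ([1, 2, 4],  "doubling"),              # True  -- consistent, but proves little
      ([2, 4, 6],  "all even numbers"),      # True  -- also consistent; rules out plain doubling
      ([9, 1, 6],  "random, unordered"),     # False -- the first disconfirming data point
      ([3, 7, 11], "ascending odd numbers"), # True  -- points toward 'just ascending'
  ]
  for seq, hypothesis in tests:
      print(seq, hypothesis, fits_rule(seq))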

S: So this does then dovetail with confirmation bias, because, but there's so many lessons. What I love about this puzzle is that the more you think about it, the more sort of critical thinking lessons you could derive from it.

So, another one, for example, is that the rule here was very non-specific, right? It was very simple, in that numbers only had to be higher than the previous one. You could also think of that as a very general or non-specific rule, right? As opposed to a very highly specific rule, which, there would be some complicated mathematical formula that would determine the next number in the sequence. And it really would only be one number, one next number in the sequence. That would be a very specific rule.

So you could think about things like non-specific symptoms versus specific symptoms, or anything, any outcome that really isn't specific to any one hypothesis. But if it happens to fit your hypothesis, you interpret it as if it were specific. Does that make sense?

J: Yep

S: Yeah. So, that's a big critical thinking lesson. Again, very critically important for medical students, when I'm teaching them. It's like, yeah, so this person had a rash. And the rash is definitely a sign of that disease, but it's not that specific a sign, 'cause you get rashes with thousands of other things too, right?

So then you want to look for a pattern, you know, a pattern of signs, or of results that fit your hypothesis. And then you try to get more specific. Do the results only fit my hypothesis? And then again, you need to try to break your hypothesis, in order to really, really test it.

So, anyway, these kinds of tests, I think, are really instructive, because they sort of force you to think. And then you can analyze your thought process, which of course is the important thing to learn in all of this. It's how to think in a systematic and logical way. But the harder part is to realize how this applies in your everyday life. This is not, this doesn't just apply to math puzzles. This applies to your thinking about everything. You know, how you make sense of the world. Your political views. You know, when you think, “Oh,” whatever, I talk to people every day who use this kind of bad reasoning to justify their political views, or their ideology. And they don't always step back and say, “Okay, well how specific is it? Does it, what other hypotheses are consistent with this? Does this really establish my hypothesis?”

You have to sort of bake that into your thinking about everything. Otherwise, the default mode is to just, you'll have a set of beliefs that have no relationship to reality, you know what I mean? It's just, whatever beliefs you inherited, or are consistent with your personality, or your culture, your upbringing, whatever, but ... and you'll be absolutely convinced that the evidence supports your beliefs, but it doesn't. Because you're just falling for the congruence bias, combined with confirmation bias. And obviously, there's a host of other biases as well, but those two sort of work in concert to convince you that the evidence proves your hypothesis, even when it doesn't.

Dragon Lizards and Climate (26:35)[edit]

Limits of Phase Change Memory (29:30)[edit]

Who's That Noisy (38:09)[edit]

  • Answer to last week: Philae Probe

(Commercial from 40:09 until 41:41)

  • Segment continues after commercial break

Name That Logical Fallacy (44:10)[edit]

  • Dr. Robert Sears

In a San Jose Mercury News article today about the recently signed vaccination bill, this passage made me wonder what you guys would say was the logical fallacy or fallacies: Dr. Robert Sears, a Capistrano Beach pediatrician known for his unorthodox views on childhood vaccination, pointed out that only 70 of 120 state legislators voted for the bill. “If vaccines were so good that they should be forced on everyone, 120 legislators would have voted yes,” he said.

Questions and Emails[edit]

Question #1: Cruciferous Vegetables and Thyroid (49:02)[edit]

Hi, I was wondering if you could shed some insight into whether or not cruciferous vegetables such as brussels sprouts and broccoli are really actually bad for your thyroid gland or if it is all just overblown hysteria? What is the science behind that?

Question #2: Education and Paranormal Belief (54:24)[edit]

Hi guys, In listening to this week's podcast (7/4/15), I heard it stated that a graduate-level science education is necessary to successfully divest one's self from pseudoscientific belief. I'm not sure how that line is derived, but as one who has only finished high school, I can assure you that I do not buy into the pseudoscience/naturopathy/magical b.s. thinking. Simply possessing basic intellect should be enough to spot garbage statements (that is all that I possess). Just thought I'd mention that not all of us uneducated slobs buy into pseudoscience. Thanks for your time! Aaron Alcott Ames, Iowa

Science or Fiction (1:02:02)[edit]

Item #1: A recent study finds that grey squirrels are capable of solving complex puzzles, rivaling primates.
Item #2: A large genetic analysis finds that increased genetic diversity is associated with being taller and smarter.
Item #3: A new technique allows examiners to extend the window in which the time of death can be accurately determined from 36 hours to 240 hours.

Skeptical Quote of the Week (1:16:50)[edit]

"It is true that that may hold in these things, which is the general root of superstition; namely, that men observe when things hit, and not when they miss; and commit to memory the one, and forget and pass over the other." — Francis Bacon

Announcements (1:19:26)[edit]

  • TAM in Las Vegas. Workshop on how to argue skeptically.

S: The Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information on this and other episodes, please visit our website at theskepticsguide.org, where you will find the show notes as well as links to our blogs, videos, online forum, and other content. You can send us feedback or questions to info@theskepticsguide.org. Also, please consider supporting the SGU by visiting the store page on our website, where you will find merchandise, premium content, and subscription information. Our listeners are what make SGU possible.

