SGU Episode 513

From SGUTranscripts

SGU Episode 513
May 9th 2015

SGU 512                      SGU 514

Skeptical Rogues
S: Steven Novella

B: Bob Novella

J: Jay Novella

E: Evan Bernstein


JG: Julia Galef

Download Podcast
Show Notes
Forum Discussion


  • 10 Years of the SGU.

You're listening to the Skeptics' Guide to the Universe, your escape to reality.

News Items[edit]

Slapping Therapy Death (2:41)[edit]

Black Box Tech (10:02)[edit]

  • A discussion of where technology is leading.

(Commercial at 30:17)

Tesla Home Battery (31:36)[edit]

Podcasting Patent Troll (45:31)[edit]

(Commercial at 57:01)

Interview with Julia Galef (58:17)[edit]

S: All right, we have our next interview coming up.


S: Hi Julia

J, B, and E: Hey!

JG: Hey guys! Whoo! Happy ten year anniversary! I made you guys something. I made you a present. I looked up online what the gift was for a ten year anniversary; and it was tin or aluminum.[1] You know that?

B: I knew that, but I forgot it.

JG: It's not my best work, but I made you this tinfoil hat to celebrate, and also to protect my brain from alien (inaudible)


B: Oh wow.

S: That's excellent.

E: Fashionable and practical. That's great!

JG: Thank you! Thank you! I'll send it to you after the show's over.

J: Yes, feel free to send that to Bob.

S: Yes.


JG: He needs it most, right?

S: Of course.

B: Promise me you'll wear that for this entire interview, though. Oh yeah.

JG: I'm ... why would I not?

J: What a fashion statement she is.

S: So, Julia, you are a fellow skeptical podcaster – Rationally Speaking. How many years are you into, with your podcast now?

JG: It's almost exactly five years. We just celebrated our five year anniversary at NECSS. So, we are exactly half as old as you,

S: Well, this year anyway.

JG: which is kind of cool


JG: Yeah, yeah, this year. We do it bi-weekly. And we've taken a few summer breaks, which I suppose reflects our more academic attitude, that we feel we deserve a summer hiatus.

S: You could call that academic, yeah.

JG: Yeah, you guys are work horses. And I don't know if, maybe some of your audience already knows this, but Massimo and I have been co-hosts from the start. We co-founded the podcast together. And as of this month, Massimo has decided to officially retire, and move on to bigger and better projects, I imagine.

J: I heard that he's too cool for podcasting.

E: Yeah, that's right. Podcasting is...

JG: That's what we think.


JG: Maybe the rest of us got cooler, and …

E: He didn't see the tinfoil hat. Maybe he would have stuck around if he saw the tinfoil hat.

JG: Yeah, he left a little bit too...

S: Maybe he just shot his wad. There's nothing left to talk about …

B: Nothing else to say. He said it all.

JG: I forgot how colorful this podcast is. We try to keep things classy on...

J: Whoa!

JG: No, no shooting of wads over at RS.

S: Well, he did sort of talk about his philosophy, right?

JG: (Inaudible) And our intrepid producer, Benny Pollak, records and edits every episode.

S: Since you bring up changing your mind...

JG: As I often do.

S: We were chatting about that a little bit at NECSS, and I thought it would be a good topic of discussion. What's interesting is that – and I think we've had this experience too – is that we certainly acknowledge that it's necessary to be open to changing one's mind. I think we all feel that we would be perfectly willing to change our mind if confronted with a better argument, or better evidence on any particular point.

And you're like, “Oh, great! Tell me about something about which you have changed your mind.” And I've had that challenge put to me as well. And of course, nothing comes to mind when you get asked that question.

JG: Right.

S: Because I think that, well, you tell me about it. What are your thoughts on changing one's mind? Why is it so hard to think of examples of it happening? Does that mean we're hypocrites, or is it something else?

JG: So, I originally thought it's because we're hypocrites. That was my original hypothesis, because, as you said, I kept having this experience of talking to leaders of the Skeptic movement, proponents of rationality and critical thinking, who would wax rhapsodic about how important it was to change your mind; and how we should all change our minds all the time; and it's not shameful; it's admirable! And then I would ask for an example, and they would go, “Uh....”

And so, originally, I thought, “Yeah, we're all talking the talk; and we're not walking the walk.” And then I started noticing that after that moment, when people failed to produce an example of changing their mind, literally, sometimes ten or fifteen minutes later in the conversation, they would just organically bring up something that their views had shifted on.

I had this experience with Randi, recently. He was one of our more recent guests. He, again, was talking about how everyone should change their mind; and how it's important for the scientific mindset. And I asked him for an example. And he said, “I can't really think of one.” And then ten minutes later, he ended up talking about – well, I guess Massimo brought up the blog post that Randi wrote about global warming. Yeah.

And in that post, Randi had said, “Yeah, you know, it's not really clear to me either way.” And after so much backlash, Randi sort of acknowledged, “Yeah, well, maybe the consensus is stronger than I realized. Maybe I have more of a leadership role, and therefore more moral responsibility to investigate the consensus before I post about my ignorance than I realized.”

And he even shifted his opinion in discussion with us in that podcast episode, when I brought up an analogy to the alleged vaccine-autism connection; that if someone in Randi's position had said, “Well, I just don't know if there's a connection between vaccines and autism. I don't know! I'm not taking a position,” that people would rightly say, “Hey! The evidence is clear. And if you claim that it's not clear to you, then that's actually taking a position.”

And Randi agreed. So, I think that was a great example of him sort of changing his mind on the spot, despite having been unable to produce examples.

B: You caught him in the act of shifting his position.

JG: Yeah! Yeah, it was great, actually. And I really do think it's wonderful, and I told him so at the time. And so I think part of what's going on with our inability to call up examples on the spot, is that mind-changing is usually not a very discrete event, right?

Usually, there's this – this is so meta. I'm sorry you guys, I always do this, but my example of how I used to think that people were hypocrites, and then gradually, I started noticing examples where, in fact, they did have examples of changing their mind. They just didn't realize it, when asked. That was a gradual process where I just had to be on the lookout for examples that contradicted my assumption. And eventually, I started to notice those examples accumulate. And I sort of stepped back, and I was like, “Wait a minute...”

But that was gradual, and so, I think it's not like a “Boom” moment that gets seared into your brain. Like, February 10th, 1992 was the day that I changed my mind about whatever. So, it's sort of harder to do that search query, and come up with examples, because they're not tagged as such.

S: It's the assumption about, how do you define changing your mind?

JG: Right.

S: And I think people think of it as, well I was way over here, and then I had to make a massive shift, and alter my way of thinking in a completely different direction. But that is not how we function. And psychologists have demonstrated this for decades, that essentially we follow a Bayesian approach.

And a Bayesian approach is, we have a certain belief system about whatever. Any fact, any thing. And we gradually update it when new information becomes available. Of course, if you're emotionally invested, you resist that process. You sort of get anchored.

JG: There's some friction there, yeah.

S: It's the motivated reasoning. You sort of anchor yourself to the position. But, if you're not highly emotionally invested in the outcome, we just happily update our beliefs and our thinking and our knowledge about things as new information comes in. And it is a step by step gradual process. What we don't do is take one piece of information, and then shift all the way over to that. We just gradually move in that direction. We add it to our existing beliefs and body of knowledge.
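The step-by-step belief updating Steve describes can be illustrated with a small worked example of Bayes' rule. The probabilities below are purely hypothetical, chosen only to show the mechanics: each piece of evidence nudges the belief rather than flipping it wholesale.

```python
def update(prior, likelihood_if_true, likelihood_if_false):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    numerator = likelihood_if_true * prior
    evidence = numerator + likelihood_if_false * (1 - prior)
    return numerator / evidence

belief = 0.5  # start undecided about some claim
# three pieces of evidence, each twice as likely if the claim is true
for _ in range(3):
    belief = update(belief, 0.8, 0.4)

print(round(belief, 3))  # prints 0.889
```

After three such observations the belief has moved from 0.5 to about 0.89: a substantial shift, but reached gradually, which is why no single moment stands out as "the day I changed my mind."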

So there's this slow shifting. And I realize, if you have changed your mind about anything, if you think about it in a broad sense, it's like, yeah, every day. Every time I read a news item, I'm changing my mind about something. I never thought that the universe was expanding; now I believe it is expanding. I changed my mind on that because new evidence came into play.

B: True.

S: It's not like I believed it wasn't expanding. I just didn't believe it was expanding, if that makes sense. Or medical treatments. Every day, I'm reading new studies, and altering what I think works, or doesn't work. I used to do lots of things that I don't do any more in medicine. So, each one of those is changing my mind about a treatment. A study was done that showed preliminary evidence that maybe it worked. Now we have evidence showing it doesn't work. Okay, I change my practice based upon that new evidence.

B: Part of the problems here, I think, is when somebody asks us that question, our go-to knee-jerk reaction is, you think of skepticism, and of course, ESP, Bigfoot, UFO's, we're not gonna change our mind about any of that, because we know it's all bullshit.

S: But we could. We could...

B: We could give a scenario about how we could change our mind, but we haven't. Ever since I've been an activist, we haven't. But that's, whoa, wait, I didn't change my mind about anything.

S: Because we're dealing with issues that are easy, in a way.

J: So, if there was something fundamental about, say, homeopathy, where we know it's wrong. And the amount of evidence that we would have to be given, and the amount of change that would have to take place, or something brand new coming up that no one's ever thought of. But, as an example, I've been reading quite a bit about the riots that happened recently in the United States.

JG: Yeah.

J: Yeah, the Baltimore riots. And when I first saw the news, without getting too deep into any politics, I had a very strong belief about, “Ah, okay,” I don't wanna say.

S: Initial reaction.

J: I had my initial emotional reaction. Then I'm on Facebook; I'm looking at news. And I read a couple of news items. And I'm like, “Oh, I didn't think about that.” And then I felt a big shift in where I was going with it. And then I found that I had gone to about four or five different positions. It's not even like, it's not black or white. There was all these nuances of different positions that I'm starting to read, until, now, I'm kind of settling in to what I think my assessment is. But it's evolved on a daily basis since it happened.

JG: Right. Evolving might be a word that better describes what happens, and indeed, what should happen, and makes it sort of – and I forget if I was telling you this, Steve – but I think that this confusion over, of this implication of changing your mind as being discrete and sudden is not only leading people to have a hard time calling up examples of having changed their minds, but I think it also prevents them sometimes from evolving their mind the way that they should.

My theory is that when people encounter a new argument or evidence that challenges one of their worldviews, and they think their options are either, “I have to agree that this evidence is correct. And I have to instantly change my mind about this deeply held world view.” Or, “I have to find a reason why this evidence is somehow flawed, so that I can reject it, and not change my mind.” And when it's binary like that, it seems so absurd to have to relinquish an entire worldview because of one piece of evidence.

You feel like you don't have the option of shifting, gradually evolving your beliefs. Then, honestly, the more sensible option really is to reject new evidence and arguments. And I think it's only when the availability of that third way, that gradual evolution is made salient to you, that you're able to do that.

S: Again, slowly evolving your opinion based upon evidence and ideas, is what we do, and it's actually the way to go. But also, suspending beliefs, suspending judgment, saying, there are topics that come up, I'm going to assume that I don't know enough about this to really have a firm opinion. I just feel like I don't know enough about this. So I'm going to not have a strong opinion about this until I learn more. That's legitimate.

I mean, it's a cop-out if you're using that to evade taking an opinion on something you very well have enough information about. But legitimately, I'll give you an example. And again, this comes up a lot, practically week to week for this show. For example, I remember we would occasionally get questions about recycling. And I had to make a decision whether or not to talk about recycling on the show.

And for two years, I decided not to talk about it because I felt I didn't understand it well enough where I felt comfortable discussing it, and unavoidably taking a position on it until I had wrapped my head around it enough.

JG: So, I don't know in what domains you're actively seeking out and open-mindedly considering new evidence. If those domains are mainly skepticism and science and medicine, things that you have a lot of practice thinking well about, then I think it's probably correct not to expect there to be radical shifts, because you're in your element, essentially.

When I look at the things that top scientists and top skeptics are, I think, clearly wrong about (of course, it's all relative), those things are mostly not in those people's areas of expertise.

S: Yeah, exactly.

JG: So, look at the top skeptic heroes. Look at Feynman, and Neil deGrasse Tyson … I can see things that I think they're wrong about. They're just not in physics, say. I think Feynman was wrong about women. I think he was wrong about the extent to which women have the capacity for intelligence and moral worth. I think Neil is wrong about atheism to some degree. We have philosophical disagreements.

S: Yeah, so Neil deGrasse Tyson's a great example of that. He's brilliant in anything in astronomy and physics. But the farther away you get from that, then obviously, the more out of his element he is. So, he has commented on alternative medicine in a way that makes the medical skeptics cringe a little bit.

JG: Yeah.

S: But, to his credit, when we explained to him, it's like, “Well, maybe we weren't expressing that in a way that really captures the essence of what's really going on.” He was open to it, and happy to evolve his thinking about it.

All right, Julia, this has been fantastic. Thank you so much!

JG: Yeah! I enjoyed this!

S: Thanks for spending some time with us.

JG: Thank you guys! Happy ten year anniversary!

J: Thank you so much! We'll see you soon!

JG: Here's to ten more!

J: Absolutely.

B: I want that hat A.S.A.P.!

E: Great hat, yeah, the hat.

Announcement (1:13:00)[edit]

  • TAM coming up. Buy tickets now.

Science or Fiction (1:14:48)[edit]

Item #1: The sound of a TIE fighter engine is actually an elephant call mixed with other sounds.
Item #2: Most actors cast as Storm Troopers needed to be left handed due to the design of their E-11 blaster rifle.
Item #3: Luke Skywalker and his relatives were originally imagined as dwarfs in early drafts.
Item #4: George Lucas originally planned for Yoda to be played by an adorable monkey wearing a mask and carrying a cane.
Item #5: R2D2 is named after a piece of film editor's jargon. It means Reel 2 Dialog 2.
Item #6: During filming of Empire, the actors had a fake script that read, 'Luke, you are your own father.'

Science or Fiction (1:25:34)[edit]

Item #1: The Bates Vision Correction System claims to correct vision by staring in particular compass directions, aligning the eyes with the Earth's magnetic field.
Item #2: Dr. Randell Mills claims that he has a process to make hydrogen atoms shrink into 'hydrinos,' providing a source of free energy.
Item #3: The 'New Chronology' claims that events attributed to ancient Greek, Roman, and Egyptian culture actually occurred in the Middle Ages and that recorded human history began around 800 AD.

10 Hour Special Info (1:31:10)[edit]

  • Many thank you's are given to everyone who helped create the show

S: The Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information on this and other episodes, please visit our website at, where you will find the show notes as well as links to our blogs, videos, online forum, and other content. You can send us feedback or questions to Also, please consider supporting the SGU by visiting the store page on our website, where you will find merchandise, premium content, and subscription information. Our listeners are what make SGU possible.

