SGU Episode 568

From SGUTranscripts

May 28th 2016
Skeptical Rogues
S: Steven Novella
B: Bob Novella
J: Jay Novella
E: Evan Bernstein
C: Cara Santa Maria
RW: Richard Wiseman
H: Hai Ting
M: Matthew Schickele
Quote of the Week
“The fact that a believer is happier than a skeptic is no more to the point than the fact that a drunken man is happier than a sober one. The happiness of credulity is a cheap and dangerous quality.”
George Bernard Shaw
Download Podcast
Show Notes
Forum Topic


You're listening to the Skeptics' Guide to the Universe, your escape to reality.

S: Hello, and welcome to the Skeptics' Guide to the Universe.

(Audience applauds)

S: Today is Saturday, May 14th, 2016; and this is your host, Steven Novella. (Applause) Joining me this week are Bob Novella,

B: Hey everybody.


S: Cara Santa Maria

C: Howdy!


S: Jay Novella,

(Star Wars blaster sound)


S: Evan Bernstein,


S: and we have a special guest, actually, back by popular demand, Richard Wiseman.

(Evan laughs, audience applauds)

RW: It feels like it's been a year. How popular was that?


S: It was building. It was building.

RW: It was building, yes. One person, yes.

Interview With Richard Wiseman ()[edit]

S: We had to get the follow up to the dead dog in my garden story, so we had to bring you back to tell us – you said there was some very interesting follow up.

RW: No, I didn't. (Audience laughs) Last time I was on your show, I mentioned I had a dead dog in my garden. And it's still there. It's still deceased. And it hasn't made any bid for freedom.

S: So it's still dead.

J: That's the update.

RW: Well, the update was – so basically, we live in Edinburgh, in quite an old house, and there's a dead dog in the garden. It was buried in 1830, and quite famously buried, so people were writing about it at the time. And a team of archaeologists contacted us and said, “Can we scan your garden to find out where the dead dog is?” 'Cause they were gonna charge us quite a lot of money.

And then we worked out the easiest thing would be to just dig up the garden and find the dog, if it's actually there, rather than scan the whole garden.

S: Why were they gonna charge you money to let them scan your garden for their archaeological research? I don't – I'm missing something.

RW: They're very entrepreneurial. (Everyone laughs) Very entrepreneurial. And so they go around finding a garden that may have dead dogs, and offer their services as ...

S: Do they say, (British accent) “Scan your garden for you governor?”


RW: They are from Wales.


RW: (British accent) Scan the garden, come on! (Difficult to hear - 2:28) Come on, Mary, and dance so ... rooftops of London. (Impression ends) That's my ... Dick Van Dyke impression. And when I first saw Mary Poppins, with that impression, I thought he just had learning difficulties. I've never heard any accent like that. (Accent again) Come on, Mary!

B: When was the dog buried?

RW: 1830.

B: Wouldn't it just be a black smear in the dirt at this point?

RW: Well, so there was a lot of debate amongst the archaeologists. Some of the ones that were gonna charge us to dig up the garden – scan the garden – suggested there may be something there if it was in a box, or it was a big-boned dog, or it got cold, in which case there might be. But the consensus amongst the archaeologists is exactly what you're saying: there will be nothing there.

J: But how does anybody know that it is there?

B: Historical records?

RW: Oh, yeah. So, the house was owned by Walter Scott, who was a quite famous Scottish writer. So, he loved his dogs, and one day one of his dogs died, and so he takes it out to the garden, and he writes in his journal about burying the dog; and he buries it outside the windows, so when he writes, he knows where the dog is. We've got a gravestone.

J: Oh, okay.

RW: A marker, yeah, that's right.

S: So there's a marker. So you know where the dog is, right?

RW: Yeah, well, they said they were gonna knock off some money from the scan.


S: Because there is ... right under the gravestone.

RW: Yeah, there's roughly ... if I was you, I'd scan where the gravestone is.


RW: I'm no expert, you understand, but that's where I would focus my attention, yeah. So it's been quite a fascinating week, really.

S: I can imagine. So, we all enjoyed your show at NECSS.

RW: Thank you!

C: Yeah!

S: You did a ...

(Audience applauds)

S: You gave a version of your talk, which is fabulous. And then you did ...

RW: I didn't give a version of my talk; I gave my talk.

S: Well, you iterate it though. You tweak it over time, don't you?

RW: Well, yes.


S: I've seen it before, and there was some new bits in there.

RW: One new bit, and I messed up. But other than that, it was perfect.

S: Yeah. Other than the stuff that wasn't perfect, it was perfect.

RW: Yeah.

S: But then, after that, you did a new stage show.

RW: Yes.

S: I don't know how new it is, but we've never seen it before, called, “Experiment?”

RW: Experimental

S: Experimental.

RW: Experimental, which is the show with no performer. And so it takes the audience through a series of psychological games and experiments. Yes.

S: You thought removing yourself from the performance would be an improvement.

RW: Public demand!

S: Yes


RW: I said, “How can I improve the work that I do?” And several people suggested I should leave the stage.


RW: And I misunderstood the tone of their comments, and tried to do the show with no one on stage.

S: Right

RW: And it's a nightmare to run. Because I'm running it from the wings, and I've a very, very complicated PowerPoint set-up, and I'm trying to watch what the audience are doing. And it takes people a ... but it's magnificent. And it's won many, many awards – not really, I'm just making that up.


RW: So yes, I'm very excited about it. And as you know, it has got the balloon popping

S: Yes!

RW: moment in it.

S: Right, it does, except this one didn't.

RW: Yeah, it went a bit wrong.

(Stopped transcribing at 5:37)

What's the Word (14:34)[edit]

S: All right, Cara, you're gonna start us off with a word of the week.

C: Yep. What's the Word? This week, it is autotomy. Not autonomy, but autotomy. Any guesses?

RW: It's obvious.

C: Now you do. We may have talked about it before the show. Autotomy is the reflexive separation of an appendage, or other part of the body. Now, this is most common in invertebrates, like spiders and worms, but of course, we probably know it best from lizards that drop their tails when there's a predator nearby. And that's specifically called caudal autotomy – it's the tail that's being removed.

So, this was first used in 1887, and the roots are Greek. Of course, “auto” means self.

B: Self

C: And “-tomy” (toh-mee) means cutting or removing something, like a hysterectomy, right? We hear it all the time in surgical removal. Therefore, a common synonym for autotomy is “self-amputation.”

B: Ooh.

C: Yeah

J: But it's built in biologically.

C: It's built in biologically, and it is reflexive. I don't think that it's something

E: It has to happen

C: that they can control, yeah.

S: Do any animals have cephalic autotomy?

C: I don't think so. (Audience laughs) Don't think so! But you can cut the head off of a flatworm, or a planarian. It wouldn't fall off on its own, but

S: Yeah

C: you could cut it. And both halves will regenerate, so, that's interesting. It's a fun thing to do with the kids.

E: What's it called when the female praying mantis bites off the head of the male praying mantis?

C: Um...

S: Artifact of research.

(Cara and audience laugh)

S: I think that

C: It doesn't actually happen?

S: That was observed only under the stress of keeping them in a cage, and it wasn't observed in the wild.

B: What?

S: The original research was probably ...

B: You're kidding!

S: an artifact.

J: That needs to go in George's ... by the way.

B: I never got the memo on that.

C: Yeah!

S: Everything everyone knows is probably wrong.

RW: When I was ten, I had one favorite joke that was word-based, which was the teacher saying, “Can you give me a sentence with the word centimeter in it?” And the kid said, “My grandma arrived at the train station, and I was sent to meet her.” (Audience laughs) And it still makes me giggle. Sometimes when you have to give a talk, you stand in the wings, and you try and think of something funny, so you walk on with a little smile on your face.

S: Yes.

RW: And that's the joke I think of.

B: I like that one.

C: Yeah, I like it.

S: I was sent to meet her. That reminds me of the joke, “You ever hear about the guy who ran over himself? Well, he was sick, but he was running out of food, so he called his friend to see if he could run over to the store to buy some food for him. But his friend was busy. So he had to run over himself.”

C: Ah, there you go!

(Audience groans)

(Steve clears his throat)

(Cara and the audience laugh)

J: Should I tell my parrot joke?

S: (Laughs) The pirate joke?

C: Which pirate joke?

S: The pirate joke.

J: I've told you this one before. I have one epic one. So this pirate walks into a bar with a steering wheel attached to his zipper, and the bar-tender's like, “What the hell is that?” The pirate goes, “Arr, it's drivin' me nuts!”

(Lots of laughter)

S: One second delay there.

J: I have another pirate joke.

S: That's okay! That's okay! Better than that – so, actually, helping us transition, or segue, from segment to segment for this episode, we have Hai-Ting and Matthew, who are ...


S: gonna give us a little skeptical interlude.

(Guitar music begins)

H: (Singing) The fact that a believer is happier than a skeptic is no more to the point than the fact that a drunken man is happier than a sober one. The happiness of credulity is a cheap and dangerous quality.

(Guitar strums off-key and stops. Audience laughs)


H: (Speaking normally) And that of course is the great historical skeptic George Bernard Shaw.


News Items[edit]

Watch Pseudoscience (18:50)[edit]

H: (Singing) When we started to take a rigorous, systematic look at nature with methods that control for bias, we found that almost everything we believed about the world was wrong.

(Speaking) And that's a quote from the great contemporary skeptic, Steven Novella.

S: Huh.


S: I thought that sounded familiar. (Chuckles) When you were singing, I'm thinking, “Was that me or Carl Sagan?”

(Audience laughs)

B: (Dismissive) Yeah, yeah.

H: He said something so close to that just a moment ago. We were like, “He just said that again!”

S: Bob, are you ...

M: You put up with a lot of shit.

S: I ... I'm recycling my stuff all the time.

Tabletop Particle Accelerators (27:27)[edit]

GMO Sugar (34:50)[edit]

H: This is a quote from the late, great Perry DeAngelis. (Singing) When you get right down to it, it's why I joined the skeptical movement. When you get right down to it, it's why I joined the skeptical movement.

M: (Singing) When you get right down to it. It's why we joined the skeptical movement.

H: When you get right down to it, it's why I joined the skeptical movement. To make myself immune to anal probes.

(Laughter and applause)

E: Yeah, that's Perry.

S: Yeah


J: I think my favorite thing that Perry said was, it was something about the weather control in China.

E: Yes, yes, yes.

S: Yeah. He said, “Of course China will control the weather. If it doesn't cooperate, they'll have it shot.”


S: Which was true, because they were shooting the clouds with ...

J: Seeds

E: Seeds. Yeah, they were trying ... before the Olympics, they were experimenting on ways to make sure it did not rain during the ...

S: Yeah

J: Oh! That's right!

Facebook News Algorithm Bias (46:55)[edit]

S: Okay, Jay, tell us a little bit about social media algorithms, and how they're destroying the world.

J: So like most things you do on the internet, Facebook is really watching what you do. They're reading your behavior, what you pause on – like if you're on your iPhone, you're scrolling with your thumb, and you pause, and you read something; they know you did that.

B: Whoa.

J: Yeah, I mean, how else do they know what content you like, 'cause a lot of times, Facebook knows we're headline readers.

S: What if you're scratching your ass?

C: Yeah

J: That's why some weird things will show up in your ...


E: Part of the algorithm.

B: That explains this ...

C: Plus or minus ass scratching.

J: And, you know, this is standard operating procedure for reading usage statistics and content. What are you pausing on? What are you clicking on? How long are you staying on there? And Facebook has an algorithm that collects all the data on you, and then distributes content to you depending on what it thinks you want to read.

When I first heard about it, I'm like, “All right, that's cool. It makes sense.” But there's a really big problem with that. So, we're skeptics. As an example, we're scrolling through, we're reading skeptical content, and a lot of times we'll be reading content that agrees with what we already think, right? Like, with what's happening in politics – a perfect example. There might be certain things. I like to read stuff that makes Trump look funny, or whatever.

E: Star Wars

J: Yeah, Star Wars and Trump – and he's gonna be in the next movie.

E: Sith lord.

(Cara laughs)

J: So, what happens is – I was gonna say Al Gore gets involved, but he actually doesn't. Facebook will then push news items to you that it thinks you want. So it's filtering what you see. And it has to because there's just too much content out there. But it becomes like you're in a bubble of the types of content that you want to see, so you're not exposed to other things.

So this isn't like going to a news station or a news app on your phone where you're just going through all different types of news. Let's say, I like the science news, news about movies, then this and that; and you have to still scroll through hundreds of news items to find the ones you're interested in. It's picking what you see.

Why is this bad? Like I just said, 'cause it's picking what you see, and narrowing your scope. And it also gives them the ability to feed you things that they want you to see. And this is an accusation that's come out recently, where Facebook has been accused of actually having a political leaning – where the people that they use to help administer the algorithm are making decisions that are influencing the news items and the content that you're seeing. 'Cause it's not just news items; it's basically everything that you see on Facebook. It's posts – anything that could be identified with key words, they're filtering.

This is why – I have five thousand people on my Facebook page, and I'm not seeing all of them. I'm seeing basically the people that I look at the most come up the most in my feed. I could see the algorithm working, 'cause there's a lot of people on there I never see news items from, or posts from, or anything. I wondered, “Why the hell?” It should be random every day. No, I'm seeing the same forty or fifty people, because it read my behavior at some point – I stopped on them at some point. Some are friends, but a lot of people I don't know, and I'm like, “I don't – what the hell? Why?”

Now it makes perfect sense. So think about this: when a company like Facebook has that kind of control over the content that you read, they're actually shaping what you think. Let that sink in for a second. They actually have some control over what you think. Over enough time, with that practice, they can influence what you think, and there are examples that you could read about online right now.
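(The engagement-based ranking Jay describes – track what a user lingers on, then rank the feed by it – can be sketched as a toy model. This is purely illustrative: Facebook's real algorithm is proprietary and far more complex, and every name and number below is made up.)

```python
# Toy sketch of an engagement-weighted feed ranker (illustrative only).
from collections import defaultdict

def rank_feed(stories, engagement_history):
    """Order stories by the user's past engagement with each topic.

    stories: list of (story_id, topic) tuples
    engagement_history: dict mapping topic -> count of past pauses/clicks
    """
    def score(story):
        _, topic = story
        return engagement_history.get(topic, 0)
    # Highest past engagement first: the user sees more of what they
    # already dwell on, which is the feedback loop behind filter bubbles.
    return sorted(stories, key=score, reverse=True)

history = defaultdict(int, {"conspiracy": 12, "science": 3})
feed = [("a", "politics"), ("b", "conspiracy"), ("c", "science")]
ranked = rank_feed(feed, history)
```

Even this crude version shows the dynamic discussed here: whatever the user engaged with most rises to the top, gets engaged with more, and so rises further next time.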

Something that's going on: there was an upheaval in Brazil about a politician, and the sentiment towards the government. And there's a lot of people in Brazil that believe things about the government because somebody put a post on Facebook about it, it became popular, and then more people started writing about it. This is all bullshit, by the way. It's completely wrong. But people read it, started writing about it themselves, posting that on Facebook, and it just kept going like this and like this. And then you get to the point where the vast majority of the population believes this false information. And it came out of one or two people writing about it.

Talk about a little microcosm example explaining how deadly this is. I think it is a very big problem. And we're not flipping through the newspapers and magazines any more. We're letting companies dictate what we see. What do you think about that?

S: There's a few layers here. One layer is just the algorithm running by itself, without the humans doing anything: does just having an algorithm tend to create echo chambers, or magnify fringe information? One of the problems is that it might magnify conspiracy theories and

J: Well it does! That's what they're saying.

S: Yeah. Those get magnified.

J: This is interesting 'cause I read a lot of conspiracy theories.

S: Yeah

J: They're fascinating, especially if there's videos attached and everything. And I absolutely notice that I get a lot of it now.

S: Yes.

J: And for someone like me, it's great. But

S: If you're a believer ...

J: Yeah, for Joe Regular Guy that doesn't really know, let's say that something caught their interest, or they're reading, “Oh, I remember about the Moon landing hoax. Let me read about that.” A month or two later this person is getting a lot of crap. And it's sinking into their consciousness.

C: Well, and you also remember that these organizations, the Facebooks and the Googles, are selling your marketing profile to large corporations.

S: Yeah

C: So you as a consumer are clicking around all of these different things. That's big money to targeted marketing campaigns. And so now your profile is such that they probably think that you're a conspiracy theory ...

J: You're connected to that content.

C: Yeah, I remember when Google used to have (and journalists would use it all the time) they used to have something called – not analytics, I think maybe it was Google Insights,

J: Yeah

C: and you would go into it. I might be mixing it up with Analytics, but with one of them, you can look up a key term and see how it trends, and it really helps journalists. The other would allow you to see your backdoor profile based on all of your searches. And fundamentally, without fail, every time I checked, it thought that I was a fifty-five to seventy year old man, because I was a science journalist!

J: Oh yeah.

C: And that's the target demographic.

E: Right, no women in science.

C: Everything I read, you know, they always assumed that I was somebody else. It's not accurate!

J: Yeah

C: That's the problem too. There's an underlying assumption that these are perfectly accurate. And they're not!

S: So, that's true. That's just another layer of that. They may not – as you say – be accurate, and how does that bias things? Another layer here, though, is the human level. Facebook has answered this by saying, “Well, you know, we have to have human editors in there managing the algorithm, because it would spit out crazy stuff every now and then.” The algorithm isn't perfect, so you need to have at least a filter in there to make sure that ... the example I gave: at lunch time, you would get articles about lunch, 'cause that's what everyone's talking about.


S: They have to filter out that kind of ...

J: Why is that?

S: noise.

J: Why is that? People post what they had for lunch.

S: I don't know. But as you say, when you put people in the mix, then unconscious bias can come in. So a lot of this is coming out of the accusation, as you were saying, that they were specifically filtering out conservative articles; and that's based on some anonymous information from people who were working there. So that really hasn't been verified. They're denying it. They're saying that these were just necessary ...

E: ... investigate it.

J: Yeah, but give me a break. By the time that happens,

S: Yeah

J: legally, but by the time that anyone has the right to do it, Facebook would have changed it.

S: Right

J: No one would ever know.

S: But an interesting question that came out of this too is, “Does Facebook have a responsibility to be neutral?” Are they a news source, or are they just an aggregator?

J: Both, they're both.

S: 'Cause there are plenty of news aggregators, outlets, on the internet that are not neutral, and make no pretense,

E: Right

S: like the Drudge Report is an obvious one. It's a very popular one. It's a conservative news aggregator, which I look at occasionally. You know, it's interesting because it biases the news in what it chooses to present, right? So, for example, it'll always publish a story about how cold the weather is, you know?

E: Yeah, it's ...

S: It's true, it is cold somewhere at any given time; but there's always a background story about how cold it is,

E: Climate change

S: because they're trying to deny global warming. But just by cherry-picking the news that they report.

J: Well I think outlets like that, they have the right to pick their editorial policy.

S: But should it be transparent?

J: Well, I think it should be transparent, but this Facebook thing though is not a Drudge Report aggregator.

S: Yes.

J: This is very different.

S: I think, something like Facebook – it seems like it is neutral, right? So if Facebook said, “Oh no, we're a news outlet with a liberal editorial policy,” if they were absolutely transparent about that, I think it would be fine. But they're trying to pretend not to be, right? “We just have an algorithm that feeds you what you want to see. We're not biasing it in any ideological way” – but they are. And sometimes it can be really subtle, like the example that was brought up when they discussed this on NPR: if you publish the message, “Go out and vote,” there'll be a certain percentage increase in the people who vote. They could decide who to send that message to. They could say, “We're only gonna send this message to this demographic, who will vote the way we want them to vote.” And

J: That's serious.

S: they could put their thumb on a scale. Do they have a right to do that? Is that a violation somehow of the public trust given what they are?

C: There's probably a constitutional violation in there somewhere.

S: Well, it's scary!

E: They should disclose!

S: Yeah.

C: But the problem is, is it something they can disclose? Is it that nuanced? Is there malfeasance happening within the company? Is this something that, as you talked about, as we talk about all the time, unconscious bias when you are a coder, and you are writing the algorithm. I mean, these things don't write themselves, right? And they have to constantly be updated. So is there unconscious bias? Is it conscious bias?

We see this even in Google! We've had a lot of conversations about Google historically, which I think has to be the epitome of neutrality. It has to be. It's the only way that most people interact with the internet. And there are definitely – as skeptics in the room, you've all put in some sort of skeptical search term, and only gotten Answers in Genesis as your top Google hit

E: Sure

C: or something like that, and you're thinking, “Why would that be what populates first?” Those algorithms are dark algorithms. They're not publicly available. You know, when I was working for a news source – I worked at Huffington Post for a while – they are one of these news sources, like you said, that are very transparent about being a liberal-leaning news aggregator.

And so when I was trying to help them actually put together a science page, I remember they have, within their organization, whole teams of people whose job it is to try to crack Google's algorithm. And every major outlet has this.

They follow the trends of Google's algorithm to figure out how to improve SEO, so that our articles are top-loaded over what

E: Right

C: another news outlet puts out.

S: And outlets like Google and Facebook do have a right to keep their algorithms secret. And they justify that by saying, “If people knew our algorithm, they would scam it. They would scam the algorithm. And therefore it wouldn't work.”

C: Or somebody else could rip off, like, that's what makes them ...

S: Not even that, just that they'll break it.

C: Yeah

S: They'll try to break the algorithm, and that will skew it. And therefore, to keep corporations from skewing our algorithm to their interests, we have to keep it secret.

J: The bottom line is: We should be able to, as an administrator of our own experience, put the parameters in that we're comfortable with.

S: Yeah. With Google, you can do that.

C: But like,

J: Yeah. With Google, you have way more

C: Yeah

J: control. Plus, in Google, you can also wipe out your profile.

S: Yeah

J: A lot of people don't know that, but you can

C: You can go incognito too.

J: That doesn't – incognito, really, that doesn't do much at all.

C: It changes a lot of your search results.

J: No, but still, they're still tracking.

B: I thought incognito prevented an outside party from tracking your

J: No no no.

C: No, I think it keeps you from having cookies.

J: Yeah, it doesn't – you're not anonymous with incognito, not even close.

C: But if you put in a search term incognito, you will get a totally different page than what you usually do

B: Okay

C: because you're not logged into your profile, basically.

RW: Has anyone thought of calling their business “Incognito”? 'Cause you must get loads of hits if that's the case.


RW: I've got two thoughts on this. One is: I now realize why there are so many items in my news feed related to sex with goats.

(Cara and audience laugh)

RW: Quite frankly, I'm embarrassed by that. And so should some of the goats be. They seem to be enjoying it. The other thought is: I don't think it's a problem.

J: You don't think it's a problem?

RW: If I were to find out Fox News says, “Go out and vote,” they're clearly talking to the types of people who will vote for whoever they favor, the Republicans or whatever. It's just another channel, isn't it?

J: Well, I don't disagree, right? But what Steve was saying, I think is the most important fact, about the thumb on the scale concept about, they can choose what you see, who to send it to, like, they can shape things that happen in the real world

RW: But that's what Fox News does.

S: But they're

RW: They send out the items they want to their viewers, and they encourage their viewers to vote.

S: But they're claiming to be a neutral platform.

RW: Yeah

C: So does Fox News! (Laughs)

(Audience laughs)

S: That's different.

C: Fair and

S: They're “fair and balanced,” but Facebook is like, “We're not anything. We're not a news outlet.”

C: It's not. Facebook is not a journalistic outlet. Like, that is a big – right. They're an aggregator.

RW: Well, that's just an educational thing; you just have to say to users, “Of course they are.” They're trying to make money, and they're gonna have sponsors and so on, and then everyone knows that what they're getting is biased – in the same way as, with any news source, you have to balance it by who's telling you.

B: Right

C: So this is a recapitulation of a story that we've been having for quite a while, which is what are the boundaries between ethics and law when it comes to being a trusted source to communicate information to the public. And I think that we've been having this conversation about the transition of journalism to being more commercially based, right?

When it bleeds, it leads. “We're doing this for the advertisers.” This is another example of that. And we know now that we're living kind of in a branded society; and we know that we're making massive trade-offs between privacy and convenience.

J: Yep

C: And if you don't want Facebook – and this is what many people will say – if you don't want Facebook to know all this about you, and sell your profile, don't use Facebook. It is a choice that we make.

RW: When I buy a newspaper, in a sense they know my profile. If it's a left-wing newspaper in the UK, they know that I'm probably a leftie. If it's a right-wing one, then a ... so in a sense, it's only just giving you what you're normally self-selecting anyway.

C: But there was a time when newspapers were neutral.

RW: Oh, that was back in the 1880's ...


J: But still

E: Nah, not even then

J: Richard, I think part of the problem is, like, the world has changed quite a bit. We're all kind of used to having such amazingly strong opinions coming from the media. They are selecting the content we see. There isn't an unbiased, neutral presentation of news. Like, what do we say – who's the news reporter that nobody knew what his leaning was?

S: Walter Cronkite.

J: Walter Cronkite, a little left. Everybody thought that Walter Cronkite believed what they believed, because he never let on what his political leanings were.

C: That's not how we do news any more.

J: I know! I just find that it's,

C: Also

E: It's consumer driven. People want what they want.

S: All right, let's do another transition.

H: This is from one of those aforementioned female scientists, Lisa Randall.

(Music begins.)

H: (Singing) One of the key features distinguishing creativity in science from other forms of creativity is the constraint that ultimately your models have to match reality.

(Laughter, applause)

Blinking (1:03:35)[edit]

H: This is from a little book called Why Does E=mc²? – highly recommended – by Brian Cox and Jeff Forshaw. All of this music, by the way, is composed by Matthew Schickele.



H: (Singing) Even from the right starting point, the route to a deeper understanding of nature is traveled in small steps. Small steps carefully taken. Science is at its heart a modest pursuit, and this modesty is the key to its success. Da da da da-a-a-a da da dum.

(Wordless singing continues)

H: (Singing) Small steps carefully taken.


Science or Fiction (1:14:13)[edit]

(Hosted by Cara)

(Science or Fiction music)
It's time for Science or Fiction

S: And until next week, this is your Skeptic's Guide to the Universe.

S: The Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information on this and other episodes, please visit our website, where you will find the show notes as well as links to our blogs, videos, online forum, and other content. You can also send us feedback or questions. Also, please consider supporting the SGU by visiting the store page on our website, where you will find merchandise, premium content, and subscription information. Our listeners are what make SGU possible.

Today I Learned:[edit]

  • Cara says that when she checks her marketing profile on Google, it thinks that she is an old man because she's a science journalist.

