SGU Episode 895

From SGUTranscripts

SGU Episode 895
September 3rd 2022

Solar panels on roof

SGU 894                      SGU 896

Skeptical Rogues
S: Steven Novella

B: Bob Novella

C: Cara Santa Maria

J: Jay Novella

E: Evan Bernstein

Guest

SY: Dr. Seema Yasmin, British writer & science communicator

Quote of the Week

If you have an effect that nobody can replicate, then your phenomenon fades away. So if you want to have a legacy, then you jolly well better have an effect that replicates.

Susan Fiske, American social psychologist

Links
Download Podcast
Show Notes
Forum Discussion

Introduction, Steve's COVID, Cara in FL

Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.

[00:09.000 --> 00:14.440] Hello and welcome to the Skeptics' Guide to the Universe. Today is Wednesday, August

[00:14.440 --> 00:19.280] 31st, 2022, and this is your host, Steven Novella. Joining me this week are Bob Novella.

[00:19.280 --> 00:20.280] Hey, everybody.

[00:20.280 --> 00:21.280] Cara Santa Maria.

[00:21.280 --> 00:22.280] Howdy.

[00:22.280 --> 00:23.280] Jay Novella.

[00:23.280 --> 00:24.280] Hey, guys.

[00:24.280 --> 00:25.280] And Evan Bernstein.

[00:25.280 --> 00:26.280] Good evening, friends.

[00:26.280 --> 00:29.280] Well, guys, it finally happened.

[00:29.280 --> 00:30.280] Oh, boy.

[00:30.280 --> 00:31.280] What happened?

[00:31.280 --> 00:32.280] It's no longer elite.

[00:32.280 --> 00:33.280] What happened?

[00:33.280 --> 00:34.280] You went on your first date.

[00:34.280 --> 00:35.280] I got COVID.

[00:35.280 --> 00:36.280] Oh.

[00:36.280 --> 00:37.280] Am I the last one standing?

[00:37.280 --> 00:38.280] Why did you do that?

[00:38.280 --> 00:39.280] No.

[00:39.280 --> 00:40.280] Why?

[00:40.280 --> 00:41.280] The Italy trip got me.

[00:41.280 --> 00:42.280] Too many crowds.

[00:42.280 --> 00:43.280] Italy.

[00:43.280 --> 00:48.280] Despite masking everywhere, I was masked the whole time.

[00:48.280 --> 00:51.280] There's only so much you can social distance.

[00:51.280 --> 00:52.280] We did the best we could.

[00:52.280 --> 00:56.280] It was probably the plane ride home, based on the timing.

[00:56.280 --> 01:00.280] I'm at the tail end now because I was getting it just last week.

[01:00.280 --> 01:03.280] Were you feeling it in Italy or when you got home?

[01:03.280 --> 01:04.280] No.

[01:04.280 --> 01:05.280] No.

[01:05.280 --> 01:08.280] I wasn't feeling it until like a day after I got home.

[01:08.280 --> 01:10.280] Well, at least it didn't ruin your holiday.

[01:10.280 --> 01:11.280] Yeah, that's true.

[01:11.280 --> 01:12.280] That's a good thing.

[01:12.280 --> 01:15.280] It started with just a scratchy throat.

[01:15.280 --> 01:17.280] Even to the point, I'm like, is that something real?

[01:17.280 --> 01:19.280] Is the air just drying here?

[01:19.280 --> 01:20.280] Is something going on?

[01:20.280 --> 01:22.280] And then it hit me like a Mack truck.

[01:22.280 --> 01:26.280] Within two hours, clearly I had viremia.

[01:26.280 --> 01:29.280] So the virus is just replicating like mad in my bloodstream.

[01:29.280 --> 01:33.280] So I had the fever, chills, muscle aches, fatigue.

[01:33.280 --> 01:34.280] Were they multiplying?

[01:34.280 --> 01:36.280] They were multiplying, yeah.

[01:36.280 --> 01:39.280] Nice day.

[01:39.280 --> 01:42.280] And then it just turned into a really bad cold.

[01:42.280 --> 01:44.280] I'm still at the tail end.

[01:44.280 --> 01:45.280] Did you die?

[01:45.280 --> 01:46.280] Throughout the show.

[01:46.280 --> 01:47.280] No, I managed not to die, which is good.

[01:47.280 --> 01:48.280] That's good.

[01:48.280 --> 01:50.280] Well, look, let's say it for what it is.

[01:50.280 --> 01:54.280] The vaccine, I know that most people that listen to this show

[01:54.280 --> 01:55.280] already know this.

[01:55.280 --> 01:58.280] But we need to take a moment to recognize

[01:58.280 --> 02:00.280] the miracle of freaking technology,

[02:00.280 --> 02:05.280] which is these vaccines that have saved millions of lives.

[02:05.280 --> 02:08.280] Steve, as much as Omicron isn't as deadly

[02:08.280 --> 02:13.280] as the first couple of versions, you could have died.

[02:13.280 --> 02:16.280] The vaccine turned it into a bad cold, basically.

[02:16.280 --> 02:17.280] Right.

[02:17.280 --> 02:18.280] But that's what I'm getting at.

[02:18.280 --> 02:20.280] You got a sickness that potentially

[02:20.280 --> 02:22.280] could have killed you if it weren't for the vaccine

[02:22.280 --> 02:24.280] that you've been taking over the last two years.

[02:24.280 --> 02:27.280] Yeah, if I didn't know, if this was not COVID,

[02:27.280 --> 02:29.280] I would have just thought it was a bad cold.

[02:29.280 --> 02:31.280] I wouldn't have thought of any different about it.

[02:31.280 --> 02:32.280] You know what I mean?

[02:32.280 --> 02:33.280] Not even flu level.

[02:33.280 --> 02:34.280] I've had the flu.

[02:34.280 --> 02:35.280] Flu is worse than this.

[02:35.280 --> 02:36.280] At least my version of it.

[02:36.280 --> 02:37.280] I know every one is different.

[02:37.280 --> 02:39.280] And you can get a much more severe case.

[02:39.280 --> 02:42.280] What I had was somewhere between a cold and a flu.

[02:42.280 --> 02:43.280] Sounds about right.

[02:43.280 --> 02:45.280] Still sucks, but yeah.

[02:45.280 --> 02:47.280] And there's apparently a very long tail.

[02:47.280 --> 02:49.280] It doesn't just go away.

[02:49.280 --> 02:51.280] It takes a while for people who've had this version.

[02:51.280 --> 02:53.280] Yeah, mine was about three weeks.

[02:53.280 --> 02:54.280] Yeah, it's about three weeks.

[02:54.280 --> 02:55.280] Mine was about three.

[02:55.280 --> 02:57.280] And don't forget, guys, that there is also

[02:57.280 --> 03:00.280] that very distinct possibility that after you feel good

[03:00.280 --> 03:03.280] for a couple of weeks, you'll get hit with incredible fatigue.

[03:03.280 --> 03:05.280] Yeah, I heard that too.

[03:05.280 --> 03:06.280] Oh, yeah.

[03:06.280 --> 03:08.280] Yeah, that got me and my wife like nobody's business.

[03:08.280 --> 03:12.280] One day, I asked my wife, I'm like, are you exhausted?

[03:12.280 --> 03:14.280] Do you feel like somebody unplugged your energy?

[03:14.280 --> 03:15.280] And she's like, yeah.

[03:15.280 --> 03:17.280] Like, it hit us literally on the same day.

[03:17.280 --> 03:18.280] It was crazy.

[03:18.280 --> 03:20.280] Steve, what are the latest recommendations

[03:20.280 --> 03:22.280] about quarantining now when you have it?

[03:22.280 --> 03:23.280] Is it five days?

[03:23.280 --> 03:26.280] Yeah, five days from the time when you test positive,

[03:26.280 --> 03:29.280] assuming you're not febrile, right?

[03:29.280 --> 03:33.280] Although my workplace required seven days.

[03:33.280 --> 03:35.280] I couldn't go into work for seven days.

[03:35.280 --> 03:36.280] I still don't understand that.

[03:36.280 --> 03:37.280] I don't understand.

[03:37.280 --> 03:41.280] Like, if you're testing positive, you have a viral load.

[03:41.280 --> 03:42.280] Yeah.

[03:42.280 --> 03:43.280] You are infectious.

[03:43.280 --> 03:46.280] You should not go anywhere until your viral load

[03:46.280 --> 03:48.280] drops enough that you're testing negative.

[03:48.280 --> 03:51.280] It just makes no sense that it's five days, seven days,

[03:51.280 --> 03:53.280] 10 days, whatever the CDC keeps throwing around.

[03:53.280 --> 03:57.280] If you're still positive, you should not be around people.

[03:57.280 --> 04:00.280] Yeah, but I tell you, it's five days isolation

[04:00.280 --> 04:03.280] and then 10 days of masking and social distancing.

[04:03.280 --> 04:05.280] And those recommendations are based on the fact

[04:05.280 --> 04:08.280] that most people have either had it or are vaccinated by now,

[04:08.280 --> 04:09.280] right?

[04:09.280 --> 04:10.280] Right.

[04:10.280 --> 04:12.280] It's the idea that it's not as bad of an illness anymore

[04:12.280 --> 04:15.280] because it's not a naive population.

[04:15.280 --> 04:18.280] It's an increasingly resistant population.

[04:18.280 --> 04:22.280] And so how much are we going to disrupt society for what,

[04:22.280 --> 04:25.280] for most people, will be a bad cold, right?

[04:25.280 --> 04:26.280] So that's part of the calculation.

[04:26.280 --> 04:30.280] It's not just like 100% preventing transmission.

[04:30.280 --> 04:31.280] It's the trade-off.

[04:31.280 --> 04:35.280] What are we going to give up for reducing the transmission?

[04:35.280 --> 04:38.280] Remember, people forget the whole flattening the curve

[04:38.280 --> 04:39.280] thing.

[04:39.280 --> 04:40.280] Yeah.

[04:40.280 --> 04:43.280] And the point of the really severe restrictions early on

[04:43.280 --> 04:46.280] was to keep hospitals from getting overwhelmed.

[04:46.280 --> 04:50.280] And that's not really an issue anymore because of Paxlovid

[04:50.280 --> 04:52.280] and everything, it's like the hospitalization issue

[04:52.280 --> 04:53.280] is not as severe.

[04:53.280 --> 04:56.280] It's just funny to me that the idea now is sort of like,

[04:56.280 --> 04:58.280] OK, you're better but not completely better.

[04:58.280 --> 05:00.280] You can go back to work.

[05:00.280 --> 05:02.280] And I know that this has always been the way, right?

[05:02.280 --> 05:05.280] People go to work with colds all the time, but they shouldn't.

[05:05.280 --> 05:06.280] They shouldn't.

[05:06.280 --> 05:08.280] I'm always like, why are you here?

[05:08.280 --> 05:09.280] Go home.

[05:09.280 --> 05:10.280] That's never a good idea.

[05:10.280 --> 05:12.280] Yeah, it's worse than that because a lot, especially

[05:12.280 --> 05:15.280] in the United States, a lot of employers

[05:15.280 --> 05:17.280] like don't want you to take sick days.

[05:17.280 --> 05:18.280] I know.

[05:18.280 --> 05:20.280] And it's messed up because, of course, it's clear

[05:20.280 --> 05:23.280] that this is speaking from a position of privilege.

[05:23.280 --> 05:26.280] But when we look at it from a structural perspective,

[05:26.280 --> 05:28.280] sure, yes, from an individual perspective,

[05:28.280 --> 05:30.280] it's absolutely a position of privilege

[05:30.280 --> 05:32.280] to say I don't want to go to work because I'm sick

[05:32.280 --> 05:34.280] or I don't want you to go to work while you're sick

[05:34.280 --> 05:35.280] because I don't want to get sick.

[05:35.280 --> 05:37.280] But from a structural perspective,

[05:37.280 --> 05:40.280] it saves everybody money if people don't go to work

[05:40.280 --> 05:41.280] when they're sick.

[05:41.280 --> 05:43.280] Including the employer because they're not going to now

[05:43.280 --> 05:45.280] have a rash of sickness among their employees.

[05:45.280 --> 05:46.280] Right.

[05:46.280 --> 05:47.280] But here's the other thing.

[05:47.280 --> 05:52.280] I think people are more likely to mask if they have a cold.

[05:52.280 --> 05:54.280] They're more likely to stay at home.

[05:54.280 --> 05:56.280] And also, for me, I did telehealth all week.

[05:56.280 --> 05:58.280] It's not like I couldn't do anything.

[05:58.280 --> 06:00.280] I think people will just work from home if they're sick.

[06:00.280 --> 06:03.280] Yeah, that's how I was after my surgery.

[06:03.280 --> 06:05.280] If you have that kind of job, yeah.

[06:05.280 --> 06:07.280] You didn't even have those advantages, say, 20 years ago.

[06:07.280 --> 06:11.280] So up until only recent times has this been a possibility

[06:11.280 --> 06:14.280] for a good amount of the working population

[06:14.280 --> 06:16.280] to have opportunities to work from home effectively.

[06:16.280 --> 06:17.280] Yeah.

[06:17.280 --> 06:18.280] Frogs.

[06:18.280 --> 06:19.280] True.

[06:19.280 --> 06:20.280] Frogs.

[06:20.280 --> 06:21.280] I just haven't said anything.

[06:21.280 --> 06:23.280] I don't want to.

[06:23.280 --> 06:24.280] No, that's a good word.

[06:24.280 --> 06:25.280] Yeah.

[06:25.280 --> 06:26.280] Bob has no opinion other than frogs.

[06:26.280 --> 06:27.280] Interesting choice.

[06:27.280 --> 06:28.280] I like toadles.

[06:28.280 --> 06:30.280] I like turtles.

[06:30.280 --> 06:32.280] Hey, I have some news.

[06:32.280 --> 06:33.280] Oh, yeah?

[06:33.280 --> 06:34.280] What?

[06:34.280 --> 06:37.280] You're on the show live from Fort Lauderdale, Florida.

[06:37.280 --> 06:38.280] Florida?

[06:38.280 --> 06:40.280] Yeah, you're on the East Coast with us now.

[06:40.280 --> 06:41.280] Whoa.

[06:41.280 --> 06:42.280] Low ride in our time zone.

[06:42.280 --> 06:43.280] Low fame in our time zone.

[06:43.280 --> 06:44.280] Late at night.

[06:44.280 --> 06:45.280] Wow.

[06:45.280 --> 06:46.280] Yeah.

[06:46.280 --> 06:47.280] And I'll tell you, it is hot.

[06:47.280 --> 06:48.280] It is hot outside.

[06:48.280 --> 06:49.280] It is humid.

[06:49.280 --> 06:50.280] But I love it.

[06:50.280 --> 06:52.280] It has been really beautiful.

[06:52.280 --> 06:54.280] So far, the bugs have not eaten me alive.

[06:54.280 --> 06:56.280] I think that's just been luck.

[06:56.280 --> 06:58.280] I'm not going to count those chickens.

[06:58.280 --> 07:00.280] But it's lovely here.

[07:00.280 --> 07:03.280] And the people, I will say, I had forgotten how nice the people

[07:03.280 --> 07:04.280] are in the South.

[07:04.280 --> 07:06.280] They're just so friendly.

[07:06.280 --> 07:10.280] My favorite part so far, I mean, I've only been here a few days,

[07:10.280 --> 07:12.280] is all of the herps.

[07:12.280 --> 07:16.280] There are so many lizards in just different reptiles in Florida.

[07:16.280 --> 07:20.280] You're taking a walk down the street, and there's iguanas everywhere.

[07:20.280 --> 07:22.280] It's so cool.

[07:22.280 --> 07:23.280] Like, huge iguanas.

[07:23.280 --> 07:24.280] I felt like I was in the Galapagos.

[07:24.280 --> 07:25.280] Huge.

[07:25.280 --> 07:26.280] That's, yeah.

[07:26.280 --> 07:27.280] Like, eight, 10 feet.

[07:27.280 --> 07:28.280] Oh, wait.

[07:28.280 --> 07:29.280] That's an alligator.

[07:29.280 --> 07:30.280] Big, big iguanas.

[07:30.280 --> 07:31.280] They got those, too.

[07:31.280 --> 07:32.280] And I haven't seen any alligators yet.

[07:32.280 --> 07:35.280] But by every body of water I've walked by, and there are little bodies

[07:35.280 --> 07:38.280] of water everywhere, like little ponds everywhere.

[07:38.280 --> 07:40.280] There are iguanas kind of surrounding them.

[07:40.280 --> 07:44.280] And there are anoles and lots of small lizards, too, running around

[07:44.280 --> 07:46.280] and, like, geckos and things.

[07:46.280 --> 07:50.280] But the only other place I've seen iguanas in the world is the Galapagos.

[07:50.280 --> 07:51.280] I'm not sure where else they are.

[07:51.280 --> 07:54.280] I also know that Florida has flamingos, and the only other place in the

[07:54.280 --> 07:57.280] world that has flamingos are, I mean, it's multiple countries,

[07:57.280 --> 07:59.280] but several African countries.

[07:59.280 --> 08:05.280] But, yeah, just the ecology here is unique and really exciting,

[08:05.280 --> 08:09.280] and it's kind of a reason that a lot of people really do enjoy Florida.

[08:09.280 --> 08:13.280] There are things about Florida I don't like, but I'm trying to silver line.

[08:13.280 --> 08:15.280] Well, that leads into my question.

[08:15.280 --> 08:17.280] Like, have you seen any sightings?

[08:17.280 --> 08:18.280] Have you had any sightings of...

[08:18.280 --> 08:19.280] Of Florida Man?

[08:19.280 --> 08:20.280] Florida Man?

[08:20.280 --> 08:21.280] Oh, yeah.

[08:21.280 --> 08:24.280] I've seen Florida Man, like, 50 times already.

[08:24.280 --> 08:25.280] Whoa.

[08:25.280 --> 08:26.280] They're all over the place.

[08:26.280 --> 08:28.280] Florida Man is everywhere you look.

[08:28.280 --> 08:31.280] I haven't yet, but there's also so much great stuff here,

[08:31.280 --> 08:33.280] and everybody's just been really friendly and really neighborly,

[08:33.280 --> 08:37.280] and I start my first day of my internship tomorrow.

[08:37.280 --> 08:38.280] Oh, boy.

[08:38.280 --> 08:39.280] So we'll see how that goes.

[08:39.280 --> 08:40.280] Wow, good luck.

Announcements (8:40)

  • 6-hour live stream

[08:40.280 --> 08:45.280] Hey, this is a good time to let our audience know that we have an

[08:45.280 --> 08:47.280] upcoming live stream.

[08:47.280 --> 08:55.280] This is going to be a six-hour SGU live show on September 24th,

[08:55.280 --> 09:01.280] so just three weeks after this show comes out, 12 p.m. to 6 p.m. Eastern time.

[09:01.280 --> 09:02.280] 12 noon.

[09:02.280 --> 09:03.280] 12 noon.

[09:03.280 --> 09:04.280] It's still p.m., right?

[09:04.280 --> 09:05.280] Yes, it is.

[09:05.280 --> 09:06.280] It is p.m.

[09:06.280 --> 09:07.280] No, it's not.

[09:07.280 --> 09:08.280] Yes, it is.

[09:08.280 --> 09:09.280] 12 noon is...

[09:09.280 --> 09:10.280] 12 noon is...

[09:10.280 --> 09:11.280] There is no 12 p.m.

[09:11.280 --> 09:13.280] It makes no logical sense.

[09:13.280 --> 09:14.280] Move on.

[09:14.280 --> 09:15.280] Oh, no, we're doing this again.

[09:15.280 --> 09:16.280] All right, move on.

[09:16.280 --> 09:21.280] Hey, Bob, what time zone is it at the North Pole?

[09:21.280 --> 09:23.280] Jay, don't listen to her.

[09:23.280 --> 09:27.280] If it's 12 p.m. in the North Pole, go ahead.

[09:27.280 --> 09:30.280] 12 noon to 6 p.m. Eastern time.

[09:30.280 --> 09:31.280] Thank you.

[09:31.280 --> 09:33.280] We will be doing a six-hour SGU live stream.

[09:33.280 --> 09:35.280] It's going to be a lot of fun.

[09:35.280 --> 09:41.280] Put that on your calendar, and we'll be letting everybody know how to sign into that.

[09:41.280 --> 09:43.280] It's free and open to the public.

[09:43.280 --> 09:48.280] Well, we're going to be doing a couple of SGU episodes, and we're going to be

[09:48.280 --> 09:51.280] doing a deep dive on the book, the new book coming out.

[09:51.280 --> 09:52.280] Yeah.

[09:52.280 --> 09:58.280] Among other things, and guests, and possibly food being prepared live in Steve's kitchen.

[09:58.280 --> 09:59.280] Possibly.

[09:59.280 --> 10:00.280] I hope so.

[10:00.280 --> 10:01.280] At some point, that will probably happen.

[10:01.280 --> 10:02.280] I'm going to be hungry.

[10:02.280 --> 10:03.280] Yeah.

[10:03.280 --> 10:04.280] Those are always a lot of fun.

[10:04.280 --> 10:05.280] And six hours now is like nothing.

[10:05.280 --> 10:06.280] You know, we could do six hours.

[10:06.280 --> 10:07.280] We're just getting started.

[10:07.280 --> 10:08.280] Six hours.

[10:08.280 --> 10:09.280] Yep.

[10:09.280 --> 10:10.280] Just getting warmed up.

[10:10.280 --> 10:14.280] Just getting warmed up.

[10:14.280 --> 10:19.280] And just one other sort of chatty thing before we get to the news items is we were very anxiously

[10:19.280 --> 10:24.280] following the countdown to the Artemis 1 launch on Monday.

[10:24.280 --> 10:25.280] Oh, boy.

[10:25.280 --> 10:26.280] But it got scrubbed.

[10:26.280 --> 10:27.280] That scrubbed.

[10:27.280 --> 10:31.280] They had a pressure issue in one of the tanks.

[10:31.280 --> 10:32.280] Was it pressure or temperature?

[10:32.280 --> 10:33.280] One of those things.

[10:33.280 --> 10:37.280] And they couldn't fix it in the two-hour launch window that they had.

[10:37.280 --> 10:41.280] So as soon as they realized we're not going to fix this in time, they had to scrub it.

[10:41.280 --> 10:43.280] I read that it could have been a faulty sensor.

[10:43.280 --> 10:45.280] There might not have been an actual problem.

[10:45.280 --> 10:46.280] Right.

[10:46.280 --> 10:47.280] But whatever.

[10:47.280 --> 10:48.280] It doesn't matter.

[10:48.280 --> 10:49.280] Absolutely.

[10:49.280 --> 10:51.280] You can't risk this at this point.

[10:51.280 --> 10:53.280] Do they have another launch date now?

[10:53.280 --> 10:54.280] Friday, I believe.

[10:54.280 --> 10:55.280] Tomorrow.

[10:55.280 --> 10:57.280] Well, it's tomorrow or yesterday.

[10:57.280 --> 10:58.280] Yesterday.

[10:58.280 --> 10:59.280] Yeah.

[10:59.280 --> 11:00.280] Time travel.

[11:00.280 --> 11:01.280] It's in two days or yesterday to begin.

[11:01.280 --> 11:02.280] How will it go yesterday?

[11:02.280 --> 11:03.280] We will find out.

[11:03.280 --> 11:04.280] Tomorrow.

[11:04.280 --> 11:05.280] Stay tuned.

[11:05.280 --> 11:09.280] As you're listening to this, you probably know more than we do as you're recording it.

[11:09.280 --> 11:11.280] We know more than we do right now.

[11:11.280 --> 11:14.280] We'll give an update next week one way or the other.

[11:14.280 --> 11:16.280] I know podcast time.

[11:16.280 --> 11:18.280] It's all timey-wimey.

[11:18.280 --> 11:19.280] Talk about time travel.

[11:19.280 --> 11:20.280] Timey-wimey.

[11:20.280 --> 11:22.280] Boy, try doing a podcast for 18 years.

[11:22.280 --> 11:24.280] All right, let's go on with some news items.

News Items


Hot Summer (11:24)

[11:24.280 --> 11:27.280] Jay, you're going to start by telling us how hot it's been this summer.

[11:27.280 --> 11:28.280] Oh, my gosh.

[11:28.280 --> 11:33.280] Yeah, I just wanted to go over some facts that have come across my desk recently about

[11:33.280 --> 11:38.280] global warming and the temperatures that a lot of us have been experiencing that live

[11:38.280 --> 11:43.280] in this part of the world, right, because I'm in the United States.

[11:43.280 --> 11:46.280] So I'm going to get right to the point.

[11:46.280 --> 11:51.280] This year, even though this past year was hotter than most that we've had recently,

[11:51.280 --> 11:56.280] this summer that we've experienced will be one of the coolest years that we'll have in

[11:56.280 --> 11:57.280] a very long time.

[11:57.280 --> 11:58.280] Think about that.

[11:58.280 --> 11:59.280] That's so depressing, Jay.

[11:59.280 --> 12:00.280] Yeah, it is.

[12:00.280 --> 12:05.280] It's horribly depressing, but it's important to talk about because I know most people that

[12:05.280 --> 12:12.280] listen to this show understand climate change and probably want to see a lot of things done

[12:12.280 --> 12:14.280] to mitigate it as best as we can.

[12:14.280 --> 12:15.280] But we need to think about that.

[12:15.280 --> 12:19.280] This may have been the coldest summer for the rest of our life.

[12:19.280 --> 12:20.280] Yep.

[12:20.280 --> 12:21.280] Right.

[12:21.280 --> 12:22.280] Yeah.

[12:22.280 --> 12:23.280] So hot.

[12:23.280 --> 12:24.280] Yeah.

[12:24.280 --> 12:25.280] Oh, but let's dig in.

[12:25.280 --> 12:26.280] Dry or wet.

[12:26.280 --> 12:27.280] Depending on.

[12:27.280 --> 12:30.280] It'll get worse than it is now, of course, because global warming is getting worse

[12:30.280 --> 12:33.280] because we're still, you know, polluting the atmosphere.

[12:33.280 --> 12:39.280] So this past July in Europe, there was a heat wave where the temperatures went

[12:39.280 --> 12:43.280] over 104 degrees Fahrenheit, or 40 degrees Celsius.

[12:43.280 --> 12:47.280] This literally broke the historical record for high temperatures there.

[12:47.280 --> 12:52.280] China in July hit 105 degrees Fahrenheit, or 40.9 degrees Celsius.

[12:52.280 --> 12:54.280] Here in the United States,

[12:54.280 --> 13:00.280] areas in Texas had two weeks of 100 degrees Fahrenheit, 37.7 degrees Celsius, or more.
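(For reference, not part of the episode: the Fahrenheit/Celsius pairs quoted here follow from the standard conversion C = (F − 32) × 5/9. A quick sketch to check them; the hottest figures round slightly differently than spoken.)

```python
# Sanity-check the Fahrenheit/Celsius pairs quoted in the segment,
# using the standard conversion C = (F - 32) * 5/9.

def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

for f in (104, 105, 100):
    print(f"{f} F = {f_to_c(f):.1f} C")
# 104 F = 40.0 C  (the European heat-wave figure)
# 105 F = 40.6 C  (the China figure; 40.9 C quoted corresponds to ~105.6 F)
# 100 F = 37.8 C  (the Texas figure; quoted as 37.7 in the episode)
```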

[13:00.280 --> 13:06.280] So, you know, we're seeing spots all around the world where, when there was a summer,

[13:06.280 --> 13:13.280] that summer was one of the hottest that has ever been recorded in the history of mankind.

[13:13.280 --> 13:14.280] You know what I mean?

[13:14.280 --> 13:17.280] Like, you know, the last couple of hundred years, say, you know, I'm not going back.

[13:17.280 --> 13:22.280] We don't go back that much farther when it comes to legitimate temperatures.

[13:22.280 --> 13:29.280] So averaging out June and July in the northern hemisphere, it was in the top three hottest

[13:29.280 --> 13:30.280] summers ever.

[13:30.280 --> 13:34.280] Unfortunately, these temperatures are not where the record high temperatures stay.

[13:34.280 --> 13:37.280] They're where these heat levels begin on their rise upward.

[13:37.280 --> 13:39.280] You understand what I'm saying here?

[13:39.280 --> 13:44.280] The heat that we had in the northern hemisphere this year, right, in the past couple of months,

[13:44.280 --> 13:48.280] as an example, they were at one of the highest that's ever been measured.

[13:48.280 --> 13:52.280] But these temperatures are just going to go up from this high temperature.

[13:52.280 --> 13:53.280] Right.

[13:53.280 --> 13:56.280] So, again, what Steve said, this was the coldest summer that you're probably ever going to

[13:56.280 --> 13:58.280] have for the rest of your life.

[13:58.280 --> 14:01.280] We're not just hitting high temperatures, but we're hitting longer time frames that

[14:01.280 --> 14:05.280] temperatures are going to stay high as well, which is the really dangerous part, which

[14:05.280 --> 14:06.280] I'll get into in a second.

[14:06.280 --> 14:12.280] So tracking temperatures since about 1850, we're seeing a bold warming trend since the

[14:12.280 --> 14:14.280] late 80s and early 90s.

[14:14.280 --> 14:18.280] Since the 90s, each decade to come was the hottest one on record.

[14:18.280 --> 14:25.280] So the 90s were the hottest decade of all time, then the aughts were the hottest, the

[14:25.280 --> 14:26.280] teens were the hottest.

[14:26.280 --> 14:31.280] And now moving into the 2020s, it's already lining up to be the hottest decade of all

[14:31.280 --> 14:32.280] time.

[14:32.280 --> 14:33.280] Huh.

[14:33.280 --> 14:34.280] I wonder who predicted that.

[14:34.280 --> 14:35.280] Yes, I know.

[14:35.280 --> 14:37.280] But nobody believed us.

[14:37.280 --> 14:41.280] So this is the trend is what I'd like to say.

[14:41.280 --> 14:42.280] People who believe the scientists.

[14:42.280 --> 14:43.280] Yes.

[14:43.280 --> 14:46.280] Who believe in science and the process of climate science.

[14:46.280 --> 14:50.280] So we all agree it's shockingly obvious what's going on.

[14:50.280 --> 14:54.280] There will still be oddball years and locations on the planet where the average temperature

[14:54.280 --> 14:57.280] could be lower than what you would consider hot.

[14:57.280 --> 14:58.280] Right.

[14:58.280 --> 15:00.280] You know, this is we're talking about global averages here.

[15:00.280 --> 15:04.280] So if you live in some town somewhere out there, then you have a cool summer.

[15:04.280 --> 15:07.280] It doesn't mean that all of global warming is wrong.

[15:07.280 --> 15:08.280] It's just the way weather works.

[15:08.280 --> 15:12.280] But when we look at the temperatures globally, the average trend is temperatures are going

[15:12.280 --> 15:15.280] up, signed, sealed, and delivered.

[15:15.280 --> 15:16.280] That's what's happening.

[15:16.280 --> 15:21.280] So going with global projections and the use of historical temperatures, what we're seeing

[15:21.280 --> 15:27.280] is that summer temperatures are increasing from four to seven degrees, depending on where

[15:27.280 --> 15:28.280] you live.

[15:28.280 --> 15:33.280] And this four to seven degree rise is being realized right now.

[15:33.280 --> 15:38.280] So this four to seven degrees is a projection that goes to 2050.

[15:38.280 --> 15:43.280] But, you know, we're already seeing temperatures on an average increase going up.

[15:43.280 --> 15:49.280] But by 2050, through all the ways that we have to project into the future, the temperature

[15:49.280 --> 15:54.280] could go up as much as seven degrees where you are, wherever you are right now.

[15:54.280 --> 15:57.280] And that is a lot for the summer, I'm saying, not your winter.

[15:57.280 --> 16:02.280] I'm just saying in the summer, your hottest temperature can go up to seven degrees.

[16:02.280 --> 16:05.280] So as global warming increases, summer heat waves will be more common.

[16:05.280 --> 16:09.280] They'll last longer and it gets worse the closer to the equator you get.

[16:09.280 --> 16:12.280] Now, Kara, I'm going to use you as an example.

[16:12.280 --> 16:17.280] So we're in the United States, almost half of all Americans now experience at least three

[16:17.280 --> 16:21.280] consecutive days of 100 degrees Fahrenheit (37.7 degrees Celsius)

[16:21.280 --> 16:22.280] or hotter in a row.

[16:22.280 --> 16:23.280] Right.

[16:23.280 --> 16:24.280] That's what just happened this summer.

[16:24.280 --> 16:28.280] That's the average experience of somebody living in the United States.

[16:28.280 --> 16:30.280] Now, keep in mind, this is the average.

[16:30.280 --> 16:38.280] If you are in Florida, Kara, you could have 70 consecutive days of 100 degrees or 37.7.

[16:38.280 --> 16:39.280] Yes.

[16:39.280 --> 16:40.280] Get it.

[16:40.280 --> 16:41.280] Get it.

[16:41.280 --> 16:44.280] Everybody keeps telling me it's going to get better next month and then it'll be beautiful

[16:44.280 --> 16:46.280] for like six months straight.

[16:46.280 --> 16:50.280] And that average number is going to go up as the years go by.

[16:50.280 --> 16:52.280] You know, you see the case that I'm building here.

[16:52.280 --> 16:55.280] We need to take action as quickly as possible.

[16:55.280 --> 16:58.280] Right. You know, we're seeing some activity happening with the U.S.

[16:58.280 --> 16:59.280] government right now.

[16:59.280 --> 17:05.280] Biden just allocated a good amount of funds to help with global warming mitigation,

[17:05.280 --> 17:07.280] though nowhere near where it should be.

[17:07.280 --> 17:11.280] It's literally a drop in the bucket of where we need to go.

[17:11.280 --> 17:14.280] But at least it's starting and hopefully it'll be sustained.

[17:14.280 --> 17:19.280] Now, as expected, overnight low temperatures are also increasing.

[17:19.280 --> 17:23.280] This is troubling because many people don't have access to air conditioning.

[17:23.280 --> 17:29.280] And this means that people will be living for extended periods of time in high temperatures

[17:29.280 --> 17:31.280] with no nighttime recovery period.

[17:31.280 --> 17:33.280] And this is not a good thing.

[17:33.280 --> 17:39.280] It's troubling because, you know, the extended heat exposure is, of course, dangerous to both

[17:39.280 --> 17:41.280] your physical and your mental health.

[17:41.280 --> 17:44.280] It's going to cost globally.

[17:44.280 --> 17:50.280] It's going to be trillions of dollars in medical attention that people are going to need just

[17:50.280 --> 17:52.280] because of global warming.

[17:52.280 --> 17:54.280] I saw a video recently.

[17:54.280 --> 17:56.280] I was very sad to see this.

[17:56.280 --> 18:00.280] But there was a ton of elderly people in China and they're all sitting in an air conditioned

[18:00.280 --> 18:04.280] supermarket to get out of the heat because they had no other way to cool off.

[18:04.280 --> 18:09.280] And, Jay, this is also another positive feedback loop for climate change, because think about

[18:09.280 --> 18:16.280] it, as temperatures warm, there are more and more locations on the planet where you basically can't exist without

[18:16.280 --> 18:17.280] air conditioning.

[18:17.280 --> 18:22.280] And people are going to be running their air conditioners more, working harder to get

[18:22.280 --> 18:27.280] the temperature from a higher point down to a comfortable zone for more days of the year.

[18:27.280 --> 18:29.280] That's going to use up a lot of electricity.

[18:29.280 --> 18:33.280] That electricity is going to be coming in part from burning coal or other fossil fuels

in places that haven't completely switched over.

[18:36.280 --> 18:42.280] And so that's, again, another positive feedback worsening climate change.

[18:42.280 --> 18:43.280] Definitely.

[18:43.280 --> 18:48.280] I wonder if we're going to see underground, more underground homes, because I remember

reading that, you know, if you're under the ground, it's like a consistent temperature.

[18:54.280 --> 18:58.280] Once you get past a certain depth, it's pretty consistent, like 58 degrees.

[18:58.280 --> 19:01.280] That's my number that I've read as well, Evan.

[19:01.280 --> 19:07.280] And so 58 degrees, it takes much less energy to go from 58 to a comfort level than to go

[19:07.280 --> 19:14.280] from, you know, 20 or 30 Fahrenheit, or 90 or 100 Fahrenheit, up or down to your comfort

[19:14.280 --> 19:15.280] zone.

[19:15.280 --> 19:18.280] So I wonder if we're going to see a lot of that, because that seems like a good idea

[19:18.280 --> 19:19.280] now.
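Bob's energy argument can be sketched with a toy steady-state model in which heating or cooling load is proportional to the indoor-outdoor temperature difference. This is a simplification (real loads depend on insulation, humidity, and equipment efficiency), and the comfort target and outdoor temperatures below are illustrative numbers, not figures from the episode:

```python
# Toy model: heating/cooling load taken as proportional to the
# temperature difference between indoors and the surroundings.
# All numbers are illustrative, not real HVAC engineering.

COMFORT_F = 70  # assumed target indoor temperature (deg F)

def relative_load(ambient_f, comfort_f=COMFORT_F):
    """Load in arbitrary units, proportional to |delta T|."""
    return abs(comfort_f - ambient_f)

underground = relative_load(58)      # constant ground temperature
cold_winter_day = relative_load(25)  # above-ground, winter
hot_summer_day = relative_load(95)   # above-ground, summer

print(underground, cold_winter_day, hot_summer_day)  # 12 45 25
print(cold_winter_day / underground)                 # 3.75
```

Under these assumptions, heating from the ground's steady 58 degrees takes a few times less energy than heating from a 25-degree winter day, which is the intuition behind both the underground-home idea and geothermal heat pumps.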

[19:19.280 --> 19:25.280] Well, Bob, a way to capitalize on the continuous temperature underground is to use geothermal.

[19:25.280 --> 19:31.280] I was going to put geothermal in my house, but I ran into a snag because it takes eight

[19:31.280 --> 19:35.280] months for them to actually go through everything that needs to be done.

[19:35.280 --> 19:39.280] There's lots of permitting and they have to drill a well and do a lot of heavy lifting

[19:39.280 --> 19:40.280] to do it.

[19:40.280 --> 19:45.280] It's much more expensive to do, but in the long run, God, I wish I did it.

[19:45.280 --> 19:46.280] I really do.

[19:46.280 --> 19:50.280] I wish I could have, but I couldn't go another summer with my children in an unconditioned

[19:50.280 --> 19:51.280] house.

[19:51.280 --> 19:52.280] So I had to go.

[19:52.280 --> 19:54.280] I went with a gas furnace.

[19:54.280 --> 19:59.280] But anyway, you know, if you're looking to get new equipment for your house, consider

[19:59.280 --> 20:00.280] geothermal.

[20:00.280 --> 20:02.280] I really am impressed with how it works.

[20:02.280 --> 20:03.280] Yeah.

[20:03.280 --> 20:04.280] Now here's another problem.

[20:04.280 --> 20:08.280] Current energy grids are not built to handle significant increases in electrical capacity.

[20:08.280 --> 20:09.280] Straight up.

[20:09.280 --> 20:10.280] That's it.

[20:10.280 --> 20:11.280] That's the way it is.

[20:11.280 --> 20:13.280] Most countries are not ready.

[20:13.280 --> 20:19.280] They are not equipped to handle the increase in electrical use and electricity production.

[20:19.280 --> 20:24.280] So it's too late to completely avoid rising temperatures, but it's very important and

[20:24.280 --> 20:28.280] relevant to say we do, however, have control over how bad it gets and how well we deal

[20:28.280 --> 20:30.280] with the heat when it does come.

[20:30.280 --> 20:36.280] And we can mitigate a lot of future damage that hasn't happened yet but is going to take

[20:36.280 --> 20:37.280] place if we do nothing.

[20:37.280 --> 20:38.280] Right.

[20:38.280 --> 20:42.280] So, you know, we need to vote for people who care about this.

[20:42.280 --> 20:47.280] And everybody needs to become an activist when it comes to global warming.

[20:47.280 --> 20:50.280] Obviously, this is a story that we will continue to follow.

[20:50.280 --> 20:56.280] But, you know, we've been doing it for 17 years and pretty much exactly what the scientists

[20:56.280 --> 20:59.280] were saying was going to happen 17 years ago is happening now.

[20:59.280 --> 21:00.280] You know what I mean?

[21:00.280 --> 21:01.280] It's, if anything, worse.

[21:01.280 --> 21:02.280] Yeah.

[21:02.280 --> 21:04.280] If anything, it's getting a little bit worse.

Tear Down This Paywall (21:04)

[21:04.280 --> 21:05.280] All right.

[21:05.280 --> 21:10.280] Kara, tell us about getting rid of paywalls for scientific studies.

[21:10.280 --> 21:19.280] The White House Office of Science and Technology Policy, also called the OSTP, released new

[21:19.280 --> 21:27.280] guidance several days ago on August 25th to make federally funded research freely available.

[21:27.280 --> 21:30.280] The funny thing is the press release says "without delay."

[21:30.280 --> 21:34.280] But I think what they mean is that in the future, when federally funded research is

[21:34.280 --> 21:38.280] published, it will need to be freely available without delay.

[21:38.280 --> 21:43.280] There will be a delay in this policy being enacted, but hopefully it's a short one.

[21:43.280 --> 21:48.280] So basically, during the Obama era, there was a rule.

[21:48.280 --> 21:57.280] It was announced in 2013 to try to basically make it so that research that our tax dollars

[21:57.280 --> 22:03.280] pay for is, you know, available to us to read.

[22:03.280 --> 22:04.280] Nice.

[22:04.280 --> 22:05.280] Yeah, to learn about.

[22:05.280 --> 22:09.280] So seems pretty simple and straightforward, but there were too many loopholes in that

[22:09.280 --> 22:11.280] 2013 policy.

[22:11.280 --> 22:17.280] And what ended up happening kind of across the board was a one year embargo.

[22:17.280 --> 22:22.280] And so what we'll often find now, and sometimes it's worse than this, but we'll often find

[22:22.280 --> 22:29.280] is that if an organization, whether it's a public university or a research

[22:29.280 --> 22:36.280] facility, receives federal funding, meaning our tax funds are diverted

[22:36.280 --> 22:40.280] towards this research and utilized in part or in whole to help fund

[22:40.280 --> 22:46.280] that research, then it was required that, okay, this needs to be readily available.

[22:46.280 --> 22:52.280] But the journals basically, which are for profit, required that there be a one year

[22:52.280 --> 22:57.280] embargo, meaning after one year, it would be open and freely available.

[22:57.280 --> 23:02.280] But before then, you'd have to subscribe to the journal to be able to access it.

[23:02.280 --> 23:07.280] And anybody who's kind of worked in academia, or maybe you haven't, but you've wanted to

[23:07.280 --> 23:14.280] access a journal article, or an entire journal before in order to, I don't know, read about

[23:14.280 --> 23:18.280] something, look something up, you pretty quickly realize that if you aren't affiliated with

[23:18.280 --> 23:26.280] an organization that has a subscription, you're going to be paying dozens to hundreds

[23:26.280 --> 23:31.280] of dollars out of pocket for a single article, and upwards of thousands

[23:31.280 --> 23:36.280] of dollars out of pocket if you want to subscribe to some journals.

[23:36.280 --> 23:43.280] And you think about these institutional subscriptions, and they are big, big, big moneymakers for

[23:43.280 --> 23:44.280] these journals.

[23:44.280 --> 23:50.280] Now, the publishers argue that this is necessary to keep the industry alive, that it's an economic

[23:50.280 --> 23:55.280] imperative, that it also keeps the editors doing what they need to do, which is peer

[23:55.280 --> 23:56.280] review.

[23:56.280 --> 24:00.280] But of course, the editors and the peer reviewers, and many of the scientists themselves

[24:00.280 --> 24:06.280] who are publishing in these journals, argue that this is a volunteer job.

[24:06.280 --> 24:09.280] It's fundamentally important for science to move forward.

[24:09.280 --> 24:12.280] And we kind of need to see the whole system overhauled, right?

[24:12.280 --> 24:20.280] Like the entire publishing system within academia needs to be overhauled in order to allow for

[24:20.280 --> 24:24.280] free and unfettered access to this information.

[24:24.280 --> 24:28.280] But as a start, basically, the OSTP is saying a few things.

[24:28.280 --> 24:33.280] They're saying, number one, if your research is funded federally, it will need to not be

[24:33.280 --> 24:36.280] behind a paywall.

[24:36.280 --> 24:39.280] Plain and simple, there cannot be a paywall.

[24:39.280 --> 24:45.280] And that must happen immediately upon publication, not after a 12-month embargo.

[24:45.280 --> 24:50.280] They're also saying that it needs to be formatted in a relatively consistent way, especially

[24:50.280 --> 24:56.280] one that allows it to be readable by screen reading software so that it can be easily

[24:56.280 --> 24:57.280] searched, cataloged.

[24:57.280 --> 25:02.280] Also, all of the metadata needs to be there, especially we're talking funding sources,

[25:02.280 --> 25:09.280] where are the authors' affiliations, so that when we read these articles, we know immediately

[25:09.280 --> 25:12.280] how they were paid for and who was involved in doing them.

[25:12.280 --> 25:17.280] They also require that the data be freely available.

[25:17.280 --> 25:22.280] Because very often, you'll read a journal article and it'll be a summary of the data.

[25:22.280 --> 25:27.280] But if there are actual data sets that accompany it, which there should be, those as supplements

[25:27.280 --> 25:33.280] also need to be freely available, unless there is a legitimate reason for them not to be.

[25:33.280 --> 25:39.280] Like it would be unethical to publish them, or it would be like a security breach to publish them.

[25:39.280 --> 25:44.280] And the interesting thing is, too, the guidance isn't just talking about peer-reviewed journals.

[25:44.280 --> 25:47.280] It's talking about all scholarly publications.

[25:47.280 --> 25:51.280] So they also are including now conference proceedings and book chapters.

[25:51.280 --> 25:57.280] So anything that's considered a scholarly publication, if it was funded by basically

[25:57.280 --> 26:01.280] your tax dollars, you should be able to read it and you should be able to see where your

[26:01.280 --> 26:03.280] tax dollars are going.

[26:03.280 --> 26:07.280] There's a couple of kind of timelines here.

[26:07.280 --> 26:19.280] Basically, they are asking that all of this be instituted at the very latest by December 31st, 2025.

[26:19.280 --> 26:26.280] But that by the middle of next year, 2023, every organization, so we're talking like

[26:26.280 --> 26:33.280] academic organizations, universities, research labs, agencies that spend more than $100 million

[26:33.280 --> 26:41.280] on research, have to have a plan within the next six months for how they're going to increase

[26:41.280 --> 26:43.280] public access.

[26:43.280 --> 26:49.280] And I think you get a little bit longer if you are a smaller agency, obviously, because

[26:49.280 --> 26:52.280] your funding isn't there to be able to do that.

[26:52.280 --> 26:54.280] The thing here, yes, it sounds simple.

[26:54.280 --> 26:58.280] Like we are paying for the science, we should have access to the science.

[26:58.280 --> 27:02.280] But a big part of this push is really about equity, right?

[27:02.280 --> 27:08.280] So not only is the OSTP saying you have to abide by these guidelines, but they're also saying

[27:08.280 --> 27:12.280] we want a statement from you of how you're going to make your science more equitable.

[27:12.280 --> 27:17.280] Like we want to know how people are going to be able to access this freely so that it's not going

[27:17.280 --> 27:21.280] to stay within the hands of the people who are continuing to push it out, but that we're going

[27:21.280 --> 27:27.280] to see that this information is spread out amongst industry, amongst individuals, and amongst

[27:27.280 --> 27:33.280] different institutions so we can improve innovation and not stifle it.

[27:33.280 --> 27:38.280] Because as long as this is behind a paywall, as long as research is behind a paywall, and only

[27:38.280 --> 27:43.280] the elite or those with specific subscriptions or those with specific affiliations can access it,

[27:43.280 --> 27:49.280] that's only going to serve to stifle innovation and communication.

[27:49.280 --> 27:54.280] And so the idea here is why are we failing ourselves simply for economic gains?

[27:54.280 --> 28:00.280] And for economic gains, for a very specific and narrow industry, a very specific and narrow few.

[28:00.280 --> 28:09.280] This is ultimately, from a grander economic perspective, going to improve all of our lives.

[28:09.280 --> 28:13.280] Yeah, I mean, it can be a very tricky question because you have to have some business model,

[28:13.280 --> 28:17.280] whatever that is, whether it's 100 percent government-supported, pay-to-play.

[28:17.280 --> 28:23.280] But then is that business? I mean, I think it's a little confusing to use the word business model.

[28:23.280 --> 28:24.280] Well, sure it is.

[28:24.280 --> 28:26.280] Well, it's not corporate.

[28:26.280 --> 28:31.280] It's not a corporate model, but it's a business model, meaning that you have to pay for the work that gets done.

[28:31.280 --> 28:33.280] Right, and we're paying for it with our tax dollars.

[28:33.280 --> 28:37.280] Yeah, I know that. But in terms of publication.

[28:37.280 --> 28:43.280] Right. So open access, there's kind of a long debate over how to do open access.

[28:43.280 --> 28:45.280] Some people pay for it at the top end, right?

[28:45.280 --> 28:51.280] They say, OK, I'm going to pay when I submit this journal article, and then that's going to cover

[28:51.280 --> 28:58.280] the fees that are involved. But we have to remember that that usually is more the case in not-for-profit journals.

[28:58.280 --> 29:01.280] But a lot of these journals are actually for-profit journals.

[29:01.280 --> 29:05.280] So there are people making money on the back end, and that's not necessary.

[29:05.280 --> 29:08.280] There are downsides with every model, basically.

[29:08.280 --> 29:14.280] But interestingly, it's like they're using the best of the worst and the worst of the best right now.

[29:14.280 --> 29:21.280] There are private journals that require inordinate fees to subscribe, yet they still expect their editors

[29:21.280 --> 29:24.280] and their peer reviewers to do it on a volunteer basis.

[29:24.280 --> 29:25.280] Yeah, welcome to academia.

[29:25.280 --> 29:27.280] And it's kind of like, wait a minute.

[29:27.280 --> 29:30.280] Yeah, that's not an uncommon academic model.

[29:30.280 --> 29:32.280] So you should be privileged to be doing this for us.

[29:32.280 --> 29:33.280] Right.

[29:33.280 --> 29:35.280] Thank you.

[29:35.280 --> 29:39.280] You get academic credit for it, right, so you don't get money. That's basically how it works.

[29:39.280 --> 29:45.280] But yeah, I mean, I do think that just changing one thing may not be the answer,

[29:45.280 --> 29:49.280] that we do need to reimagine from the ground up how we're doing this.

[29:49.280 --> 29:50.280] Completely agree.

[29:50.280 --> 29:55.280] And also, a lot of people are already saying, listen, this was supposed to be groundbreaking in 2013,

[29:55.280 --> 30:01.280] and they immediately found a loophole and just figured out how to not allow this to be free and unfettered.

[30:01.280 --> 30:03.280] Probably something similar is going to happen.

[30:03.280 --> 30:05.280] Let's do it for reals now.

[30:05.280 --> 30:07.280] Yeah, let's stay on top of it.

[30:07.280 --> 30:08.280] All right. Thanks, Kara.

Volcano Catastrophe (30:09)

[30:08.280 --> 30:12.280] Bob, tell us how likely it is that we're all going to be wiped out by a volcano.

[30:12.280 --> 30:15.280] Oh, Christ. Now we've got to worry about this?

[30:15.280 --> 30:16.280] Yep.

[30:16.280 --> 30:17.280] Yep.

[30:17.280 --> 30:19.280] Yep.

[30:19.280 --> 30:22.280] That's my news item. Yep.

[30:22.280 --> 30:26.280] All right, guys, volcanoes in the news in a big way this week.

[30:26.280 --> 30:32.280] Some scientists are now saying that the risk of a major volcanic eruption is higher than we've commonly believed,

[30:32.280 --> 30:37.280] and we are unprepared at a level that they describe as reckless.

[30:37.280 --> 30:38.280] Oh, my gosh.

[30:38.280 --> 30:42.280] But when has humanity ever been prepared for a volcanic catastrophe?

[30:42.280 --> 30:45.280] Well, yeah, that's one of the interesting parts of this.

[30:45.280 --> 30:47.280] What can we do?

[30:47.280 --> 30:52.280] I recommend the article published in a recent issue of Nature.

[30:52.280 --> 30:54.280] Check it out. Fascinating stuff.

[30:54.280 --> 31:02.280] This was written by experts from the University of Cambridge's Center for the Study of Existential Risk.

[31:02.280 --> 31:03.280] Oh, boy.

[31:03.280 --> 31:04.280] And the University of Birmingham.

[31:04.280 --> 31:10.280] Question, who remembers the Tonga volcano eruption from this past January 2022?

[31:10.280 --> 31:11.280] Oh, yeah.

[31:11.280 --> 31:14.280] I remembered it, but, you know, barely.

[31:14.280 --> 31:20.280] I remember, oh, this was like the biggest one that we've ever like officially recorded.

[31:20.280 --> 31:22.280] And then my memory ran out.

[31:22.280 --> 31:23.280] It's like, well, what else?

[31:23.280 --> 31:26.280] I don't really remember much else about it.

[31:26.280 --> 31:31.280] So this was a volcano called Hunga Tonga-Hunga Ha'apai,

[31:31.280 --> 31:35.280] and it erupted last January 15th in the Polynesian Kingdom of Tonga.

[31:35.280 --> 31:38.280] So, like I said, I barely really remember much of the details,

[31:38.280 --> 31:42.280] but the experts liken it to an asteroid that just misses the Earth.

[31:42.280 --> 31:47.280] And they say it should be a wake-up call, a wake-up call, this volcanic eruption.

[31:47.280 --> 31:48.280] Why?

[31:48.280 --> 31:54.280] Well, OK, first, the damage from this eruption was locally pretty bad.

[31:54.280 --> 31:57.280] Ash fell over hundreds of kilometers.

[31:57.280 --> 32:02.280] Tsunamis, it caused tsunamis that reached as far as Japan and North and South America.

[32:02.280 --> 32:04.280] Tonga's submarine cables were broken,

[32:04.280 --> 32:08.280] disabling their communication with the outside world for, I think, for days.

[32:08.280 --> 32:12.280] And so, yeah, so locally, if you were in that area, it was not fun,

[32:12.280 --> 32:14.280] but it could have been worse.

[32:14.280 --> 32:16.280] It could have easily been worse.

[32:16.280 --> 32:21.280] First off, if the eruption had lasted more than 11 hours, it could have been a lot worse.

[32:21.280 --> 32:24.280] Or if it had happened somewhere else,

[32:24.280 --> 32:28.280] say a more densely populated area near global infrastructure,

[32:28.280 --> 32:32.280] like electricity grids or vital shipping lanes, it could have been much worse.

[32:32.280 --> 32:37.280] So much worse that we would probably all remember the eruption in detail

[32:37.280 --> 32:41.280] because we would have been talking about it all the time ever since January

[32:41.280 --> 32:45.280] because the impacts, the nasty impacts on global supply chains,

[32:45.280 --> 32:47.280] food resources, climate, you know,

[32:47.280 --> 32:51.280] those things that are already not great for other non-volcanic reasons,

[32:51.280 --> 32:53.280] they would have just been made worse.

[32:53.280 --> 32:55.280] So you may think, like I did,

[32:55.280 --> 32:58.280] that the odds of a really bad volcanic eruption are low, right?

[32:58.280 --> 33:01.280] I mean, my thinking was, oh, yeah, a super volcano,

[33:01.280 --> 33:05.280] the last thing I heard is that it's unlikely for many thousands of years.

[33:05.280 --> 33:10.280] Now, maybe for that level of super volcano, but something a little bit less,

[33:10.280 --> 33:15.280] though also very devastating, is apparently much more likely.

[33:15.280 --> 33:18.280] At this point, you might want to fast forward to Steve's much happier talk

[33:18.280 --> 33:20.280] on solar power, just saying.

[33:20.280 --> 33:24.280] The latest thinking on this is coming from the data from recent ice cores,

[33:24.280 --> 33:28.280] which paint a much more grim picture than I expected.

[33:28.280 --> 33:32.280] These scientists now say that the chance of an eruption this century

[33:32.280 --> 33:38.280] that is 10 to 100 times greater than Tonga is one in six.

[33:38.280 --> 33:40.280] One in six.

[33:40.280 --> 33:45.280] That puts it at a level that I think needs to be taken very, very seriously.
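To put that one-in-six figure in perspective, it can be converted into a rough per-year probability, assuming a constant, independent annual risk and roughly 78 years left in the century. Both of those assumptions are mine for illustration, not from the Nature piece:

```python
P_CENTURY = 1 / 6  # chance of a magnitude-7-scale eruption this century (per the article)
YEARS_LEFT = 78    # assumed remaining years of the century (2022-2100)

# If each year independently carries probability p, then
# 1 - (1 - p) ** YEARS_LEFT = P_CENTURY, so solving for p:
p_annual = 1 - (1 - P_CENTURY) ** (1 / YEARS_LEFT)

print(f"~{p_annual:.4%} per year")  # roughly 0.23% per year
```

A fraction of a percent per year sounds small, but compounded over decades it accumulates to that one-in-six die roll.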

[33:45.280 --> 33:49.280] So Tonga was like a five on the volcanic explosivity index.

[33:49.280 --> 33:52.280] They're saying that a magnitude seven, so what they're saying here is that

[33:52.280 --> 33:58.280] a magnitude seven, 10 to 100 times worse than Tonga, can easily happen

[33:58.280 --> 34:01.280] within the next few generations, three or four generations, say.

[34:01.280 --> 34:02.280] I mean, look it up.

[34:02.280 --> 34:06.280] Magnitude seven is described in Wikipedia as super colossal,

[34:06.280 --> 34:09.280] super colossal, and the scale only goes to eight.

[34:09.280 --> 34:11.280] There is no nine.

[34:11.280 --> 34:16.280] It ends with eight, and eight, by the way, is described as mega colossal.

[34:16.280 --> 34:18.280] You don't even want to go there.

[34:18.280 --> 34:22.280] I think that's probably in the realm of super volcano.

[34:22.280 --> 34:27.280] But these sixes and sevens are not good either, and they're incredibly likely.
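The "10 to 100 times" range follows from the explosivity index being roughly logarithmic in ejecta volume, assuming the magnitudes discussed here map onto the standard VEI table. The minimum volumes below are the commonly published approximate thresholds, used here only to show the ratio:

```python
# Approximate minimum ejecta volume (km^3) per VEI level, from the
# commonly published VEI table; above VEI 4 each step is roughly 10x.
VEI_MIN_EJECTA_KM3 = {4: 0.1, 5: 1, 6: 10, 7: 100, 8: 1000}

tonga = VEI_MIN_EJECTA_KM3[5]       # Tonga was roughly a VEI 5
magnitude7 = VEI_MIN_EJECTA_KM3[7]  # the eruption class the article warns about

print(magnitude7 / tonga)  # 100.0
```

A VEI 7 starts at about 100 km^3 of ejecta, while a VEI 5 spans roughly 1 to 10 km^3, which is where the 10-to-100-fold comparison comes from.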

[34:27.280 --> 34:31.280] Now, the impact of such a large-scale eruption on the climate

[34:31.280 --> 34:34.280] would actually be similar to the disastrous effects

[34:34.280 --> 34:38.280] of a good-sized asteroid or comet hitting the Earth.

[34:38.280 --> 34:41.280] They can be comparable in many instances.

[34:41.280 --> 34:45.280] But a major and distressing difference between these two nightmare scenarios,

[34:45.280 --> 34:49.280] though, is the effort and money that's being used to mitigate the damage.

[34:49.280 --> 34:54.280] So hundreds of millions of dollars flow annually towards planetary defense

[34:54.280 --> 34:57.280] to deal with the inevitable deadly impact.

[34:57.280 --> 35:00.280] We're going to be hit by a comet or an asteroid.

[35:00.280 --> 35:04.280] It's going to happen, and we're spending hundreds of millions of dollars

[35:04.280 --> 35:07.280] globally, more than that, worldwide every year.

[35:07.280 --> 35:12.280] NASA itself is spending $300 million, and they should, by the way.

[35:12.280 --> 35:14.280] I'm not begrudging that.

[35:14.280 --> 35:19.280] They should be spending this and more for this type of asteroid and comet research.

[35:19.280 --> 35:22.280] Like, $300 million they're spending on their DART mission

[35:22.280 --> 35:27.280] that's going to test the possibility of deflecting future asteroids away from the Earth.

[35:27.280 --> 35:31.280] And that's great. I love it. I've been advocating that for decades.

[35:31.280 --> 35:35.280] That is an existential threat for a civilization

[35:35.280 --> 35:38.280] that we can actually do something about, these asteroids and comets.

[35:38.280 --> 35:41.280] We should put even more money into it.

[35:41.280 --> 35:46.280] But there is no similar coordination or investment that exists

[35:46.280 --> 35:52.280] to mitigate large eruptions, even though they are hundreds of times more likely.

[35:52.280 --> 35:57.280] This type of volcanic eruption is hundreds of times more likely than being hit

[35:57.280 --> 36:03.280] by an asteroid or a comet of similar devastating consequences.

[36:03.280 --> 36:07.280] It's much more likely. That's like not wearing a seatbelt,

[36:07.280 --> 36:11.280] but being deathly afraid of bears, even though you've never seen one in the wild

[36:11.280 --> 36:13.280] where you live your entire life.

[36:13.280 --> 36:15.280] Talking to you, mom.

[36:15.280 --> 36:20.280] So it's really, it's kind of perverse.

[36:20.280 --> 36:25.280] If these studies are correct and their estimates are correct,

[36:25.280 --> 36:29.280] we've really got to be taking this super seriously, I think, at this point.

[36:29.280 --> 36:32.280] I mean, come on, one in six?

[36:32.280 --> 36:35.280] You roll a six-sided die for the rest of this century?

[36:35.280 --> 36:37.280] We're going to get hit with like a seven?

[36:37.280 --> 36:40.280] So the researchers in the Nature article say,

[36:40.280 --> 36:45.280] we call for increased attention to and coordination in research

[36:45.280 --> 36:48.280] aimed at forecasting, preparedness, and mitigation.

[36:48.280 --> 36:50.280] So what can be done though, Evan, right?

[36:50.280 --> 36:52.280] You said it. Well, what the hell are we going to do?

[36:52.280 --> 36:55.280] So here are some of the things, just a few of the things that they propose

[36:55.280 --> 36:56.280] that can be done.

[36:56.280 --> 37:02.280] Step one seems to be that we need to pinpoint the risks themselves.

[37:02.280 --> 37:06.280] We actually don't even know where a lot of these active volcanoes are.

[37:06.280 --> 37:10.280] Many of the volcanoes that are potentially active are unknown,

[37:10.280 --> 37:14.280] and we need to find them, especially the ones, the critical ones,

[37:14.280 --> 37:16.280] that could impact critical global infrastructure.

[37:16.280 --> 37:21.280] These are the ones we need to find and do some of these other ideas,

[37:21.280 --> 37:25.280] like we need to improve monitoring on the ground and in space.

[37:25.280 --> 37:27.280] So here's a surprising statistic.

[37:27.280 --> 37:31.280] Only one quarter of the eruptions since 1950

have even been monitored by something like a seismograph, 25% of them.

[37:36.280 --> 37:38.280] The other ones, they erupted,

[37:38.280 --> 37:41.280] and we had no really hard data on what was happening.

[37:41.280 --> 37:46.280] And only a third of that data was actually entered into the Global Database

[37:46.280 --> 37:51.280] for Volcanic Unrest, which I kind of love that name,

[37:51.280 --> 37:54.280] the Global Database for Volcanic Unrest.

[37:54.280 --> 37:56.280] So that's disappointing.

[37:56.280 --> 38:01.280] We're not even doing a good job monitoring even now, necessarily,

[38:01.280 --> 38:05.280] and we don't even know where all these potentially active volcanoes are.

[38:05.280 --> 38:10.280] So this would obviously provide better advanced warning of eruptions,

[38:10.280 --> 38:13.280] and those warnings could be greatly improved when combined

[38:13.280 --> 38:16.280] with new types of analyses that are being done with, guess what,

artificial intelligence, which could make our predictions even better.

[38:20.280 --> 38:22.280] So this one was my favorite.

[38:22.280 --> 38:26.280] This is the best one that I'll cover for their specific recommendations.

[38:26.280 --> 38:30.280] We need to do more research into volcanic geoengineering.

[38:30.280 --> 38:33.280] So what's the worst part of the biggest eruptions?

[38:33.280 --> 38:36.280] It's not really the explosivity necessarily, right?

[38:36.280 --> 38:40.280] But the sulfur aerosols that get injected into the stratosphere.

[38:40.280 --> 38:42.280] You don't want stuff in the stratosphere like that

[38:42.280 --> 38:44.280] because there's no real weather up there.

[38:44.280 --> 38:48.280] Things that are put in there aren't going to get rained out.

[38:48.280 --> 38:51.280] They could stay there for weeks, months, or years.

[38:51.280 --> 38:54.280] So instead, it stays there and it blocks the sun

[38:54.280 --> 38:56.280] and can abruptly cool the Earth.

[38:56.280 --> 38:59.280] So maybe we should look forward to this so that the global warming

[38:59.280 --> 39:01.280] will be cooled by this volcanic eruption.

[39:01.280 --> 39:04.280] Are you saying it becomes a doomsday shroud?

[39:04.280 --> 39:07.280] Well, it depends.

[39:07.280 --> 39:11.280] I mean, sure, if it's big enough, we get a good seven or an eight.

[39:11.280 --> 39:14.280] Sure, this is like we're talking worst-case scenarios

[39:14.280 --> 39:16.280] are not pleasant for that.

[39:16.280 --> 39:18.280] So they're saying that research needs to be conducted

[39:18.280 --> 39:22.280] to look into what can be done to minimize a volcanic winter.

[39:22.280 --> 39:26.280] How could we actually ameliorate this effect, this volcanic winter?

[39:26.280 --> 39:29.280] Now, the scientists say one option that's worth exploring

[39:29.280 --> 39:33.280] is using something like short-lived hydrofluorocarbons,

[39:33.280 --> 39:37.280] which can have a warming effect to counteract the cooling impact

[39:37.280 --> 39:39.280] of these sulfates.

[39:39.280 --> 39:44.280] Something like that may be worth investigating.

[39:44.280 --> 39:47.280] And they would say, absolutely, it's worth investigating

[39:47.280 --> 39:49.280] something like that to deal with this.

[39:49.280 --> 39:52.280] So even more dramatic and beneficial is the possibility

[39:52.280 --> 39:55.280] of actually interacting with the magma bodies themselves

[39:55.280 --> 39:57.280] in the crust.

[39:57.280 --> 40:01.280] In 2024, researchers are planning to actually drill into a magma pocket

[40:01.280 --> 40:05.280] to create a long-term magma observatory, if you will,

[40:05.280 --> 40:07.280] to help improve predictions.

[40:07.280 --> 40:09.280] Actually going down, drilling into the crust deep enough

[40:09.280 --> 40:12.280] where you could have contact with some of these areas.

[40:12.280 --> 40:13.280] Oh, my gosh.

[40:13.280 --> 40:16.280] And the equipment can withstand that heat and pressure?

[40:16.280 --> 40:19.280] Well, I think it probably depends on a lot of variables.

[40:19.280 --> 40:22.280] But if it's in the crust, we could probably get there

[40:22.280 --> 40:25.280] one way or the other, or at least try.

[40:25.280 --> 40:28.280] We're not going to get down into the mantle, not anytime soon.

[40:28.280 --> 40:30.280] But this is all in the crust.

[40:30.280 --> 40:36.280] Even bolder and cooler, in a sense, is the idea to research methods

[40:36.280 --> 40:40.280] to manipulate the magma itself or the nearby rocks

[40:40.280 --> 40:43.280] to potentially reduce the explosivity.

[40:43.280 --> 40:45.280] Think about that.

[40:45.280 --> 40:49.280] Having the technology, and I don't think we're near there yet,

[40:49.280 --> 40:53.280] to turn a magnitude 7 into a 6 or a 5.

[40:53.280 --> 40:56.280] That kind of sounds to me, my first thought was,

[40:56.280 --> 40:59.280] it sounds like a job for Q to do something like that.

[40:59.280 --> 41:02.280] But there is actually funding from the European Research Council

[41:02.280 --> 41:06.280] for a project called magma outgassing during eruptions

[41:06.280 --> 41:08.280] and geothermal exploration.

[41:08.280 --> 41:10.280] So that one was really fascinating.

[41:10.280 --> 41:14.280] If we could actually manipulate it somehow in the ground,

[41:14.280 --> 41:17.280] I'm not sure even how they would do it, except the name of this project

[41:17.280 --> 41:20.280] is actually very telling, magma outgassing.

[41:20.280 --> 41:24.280] If you can make some strategic holes in the crust

[41:24.280 --> 41:27.280] to actually do some outgassing, and that would obviously, right,

[41:27.280 --> 41:30.280] that could potentially for sure reduce this explosivity.

[41:30.280 --> 41:33.280] So maybe that wouldn't be as hard as it potentially seems,

[41:33.280 --> 41:35.280] although we'll have to see the research on that.

[41:35.280 --> 41:37.280] All right, so I will end with the scientists' plea

[41:37.280 --> 41:40.280] at the end of their Nature article, which was really good.

[41:40.280 --> 41:43.280] They said, whether scientists should conduct any volcano engineering,

[41:43.280 --> 41:46.280] which has obvious risks, is a matter for debate,

[41:46.280 --> 41:49.280] but such a debate requires rigorous theoretical

[41:49.280 --> 41:51.280] and experimental research to underpin it.

[41:51.280 --> 41:54.280] In our view, the lack of investment, planning, and resources

[41:54.280 --> 41:58.280] to respond to big eruptions is reckless.

[41:58.280 --> 42:02.280] Will humanity learn from volcanology's near miss at Tonga,

[42:02.280 --> 42:05.280] or will a large magnitude eruption be the next

[42:05.280 --> 42:08.280] planet-disrupting event to catch the world unawares

[42:08.280 --> 42:10.280] after the pandemic?

[42:10.280 --> 42:12.280] Discussions must start now.

[42:12.280 --> 42:15.280] Wow, wow, I mean, that's just like, holy crap,

[42:15.280 --> 42:19.280] here's something else we've got to worry about, Jesus.

[42:19.280 --> 42:22.280] What will happen next, a coronal mass ejection

[42:22.280 --> 42:25.280] or a massive volcanic eruption?

[42:25.280 --> 42:27.280] I'm betting on coronal mass ejection.

[42:27.280 --> 42:30.280] Well, probably, yeah, just whatever would be next in line.

[42:30.280 --> 42:34.280] Yeah, I think a CME followed by the volcanic eruption.

[42:34.280 --> 42:36.280] Yeah, I think a CME will trigger a volcanic eruption.

[42:36.280 --> 42:37.280] It's going to be a double tap.

[42:37.280 --> 42:39.280] Oh, the one-two punch, yeah.

[42:39.280 --> 42:42.280] Evan, how much were you resisting saying liquid hot magma

[42:42.280 --> 42:44.280] that entire time?

[42:44.280 --> 42:46.280] I know Jay was.

[42:46.280 --> 42:48.280] Jay was thinking hot pockets.

[42:48.280 --> 42:50.280] No, I was thinking hot magma.

[42:50.280 --> 42:53.280] I mean, Dr. Evil ruined several words for me.

[42:53.280 --> 42:55.280] He said magma.

[42:55.280 --> 42:57.280] Bob said magma so many times, all I could see,

[42:57.280 --> 43:00.280] liquid hot magma.

[43:00.280 --> 43:01.280] What words, Jay?

[43:01.280 --> 43:02.280] What words did he ruin?

[43:02.280 --> 43:03.280] And lasers.

[43:03.280 --> 43:06.280] Laser.

[43:06.280 --> 43:08.280] Well, everyone, we're going to take a quick break

[43:08.280 --> 43:11.280] from our show to talk about our sponsor this week, BetterHelp.

[43:11.280 --> 43:13.280] You know, we all face problems in life,

[43:13.280 --> 43:15.280] and it's pretty easy to get overwhelmed.

[43:15.280 --> 43:18.280] And therapy can help you learn how to actually solve problems

[43:18.280 --> 43:21.280] that you're dealing with so you could deal with them on your own.

[43:21.280 --> 43:24.280] The therapist will teach you evidence-based approaches

[43:24.280 --> 43:26.280] that can help you deal with your problems.

[43:26.280 --> 43:29.280] And that's a good suggestion because recently, I mean,

[43:29.280 --> 43:32.280] in recent years, I started seeing a therapist,

[43:32.280 --> 43:35.280] an online therapist during the COVID years.

[43:35.280 --> 43:39.280] I did so because I felt like in certain ways, my moods were changing

[43:39.280 --> 43:42.280] or I felt I had little control at times

[43:42.280 --> 43:44.280] over some of my emotions and my moods.

[43:44.280 --> 43:49.280] And I really wanted to try to figure out what might be going on with myself.

[43:49.280 --> 43:50.280] Are these normal?

[43:50.280 --> 43:52.280] Are they unusual?

[43:52.280 --> 43:57.280] And it takes a professional to help me suss out those kinds of thoughts

[43:57.280 --> 44:00.280] and emotions that I really hadn't experienced before in life.

[44:00.280 --> 44:02.280] It has been of great benefit to me.

[44:02.280 --> 44:04.280] When you want to be a better problem solver,

[44:04.280 --> 44:05.280] therapy can get you there.

[44:05.280 --> 44:10.280] Visit betterhelp.com/sgu today to get 10% off your first month.

[44:10.280 --> 44:15.280] That's better-H-E-L-P.com/sgu.

[44:15.280 --> 44:17.280] All right, guys, let's get back to the show.

Solar Energy Update (44:19)

[44:17.280 --> 44:21.280] All right, it's time for an update about solar energy,

[44:21.280 --> 44:23.280] as Bob said previously.

[44:23.280 --> 44:24.280] Solar power.

[44:24.280 --> 44:25.280] Here comes the sun.

[44:25.280 --> 44:30.280] Right now, solar photovoltaic energy is the cheapest form of energy

[44:30.280 --> 44:32.280] ever devised by humanity.

[44:32.280 --> 44:37.280] It's the cheapest way to make electricity of any method ever devised.

[44:37.280 --> 44:39.280] And it's getting cheaper.

[44:39.280 --> 44:43.280] Between 2010 and 2020,

[44:43.280 --> 44:49.280] how much do you think the cost of solar power measured per capacity,

[44:49.280 --> 44:52.280] like per watt, like the cost per watt?

[44:52.280 --> 44:53.280] How much has that dropped?

[44:53.280 --> 44:55.280] And how much did scientists think it was going to drop?

[44:55.280 --> 45:00.280] So a study looked at more than 2,900 predictions

[45:00.280 --> 45:04.280] of how much solar power is going to decrease between 2010 and 2020.

[45:04.280 --> 45:08.280] Their average prediction was 2.6% annually,

[45:08.280 --> 45:13.280] and not a single one predicted a greater than 6% reduction.

[45:13.280 --> 45:17.280] What do you think the actual annual reduction in cost was

[45:17.280 --> 45:19.280] between 2010 and 2020?

[45:19.280 --> 45:20.280] 10,000%.

[45:20.280 --> 45:21.280] 10.

[45:21.280 --> 45:22.280] 10.

[45:22.280 --> 45:23.280] 15%.

[45:23.280 --> 45:24.280] Wow.

[45:24.280 --> 45:25.280] 15%.

[45:25.280 --> 45:28.280] That's more than twice the highest estimate,

[45:28.280 --> 45:31.280] and it's five times the average estimate.

[45:31.280 --> 45:33.280] And this was a short-term prediction,

[45:33.280 --> 45:38.280] which people famously screw up egregiously by overpredicting.

[45:38.280 --> 45:39.280] Wow.

[45:39.280 --> 45:41.280] And they massively underpredicted it.
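The gap between prediction and reality is bigger than it sounds, because annual declines compound. A quick sketch, using the 2.6% and 15% annual figures quoted above (the decade-long totals are derived from them, not figures from the episode):

```python
# Compare the predicted vs. actual annual decline in solar cost per watt,
# compounded over the decade from 2010 to 2020.
def total_decline(annual_rate: float, years: int = 10) -> float:
    """Fraction of the starting cost shaved off after compounding."""
    return 1 - (1 - annual_rate) ** years

predicted = total_decline(0.026)  # average expert prediction: 2.6%/yr
actual = total_decline(0.15)      # observed decline: ~15%/yr

print(f"predicted drop over the decade: {predicted:.0%}")  # ~23%
print(f"actual drop over the decade:    {actual:.0%}")     # ~80%
```

So a 15% annual decline wipes out roughly 80% of the cost in ten years, versus the roughly 23% implied by the average prediction.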

[45:41.280 --> 45:46.280] And did they identify the ways in which it was underpredicted?

[45:46.280 --> 45:47.280] In other words...

[45:47.280 --> 45:49.280] It's hard to say why it was underpredicted.

[45:49.280 --> 45:50.280] People suck at it.

[45:50.280 --> 45:52.280] But the more interesting question is,

[45:52.280 --> 45:55.280] why did solar prices plummet so much?

[45:55.280 --> 45:58.280] And we're talking mainly about silicon here.

[45:58.280 --> 46:00.280] We'll get to that in a second, the different types.

[46:00.280 --> 46:04.280] Essentially, every single piece of the chain got cheaper.

[46:04.280 --> 46:10.280] We got cheaper sources of raw materials, cheaper manufacturing.

[46:10.280 --> 46:15.280] When you scale up and mass-produce the solar panels,

[46:15.280 --> 46:17.280] they became much cheaper to produce.

[46:17.280 --> 46:18.280] Economies of scale.

[46:18.280 --> 46:20.280] The sun got bigger and hotter.

[46:20.280 --> 46:21.280] Economies of scale.

[46:21.280 --> 46:25.280] Greater technology, greater efficiency of the solar panels themselves.

[46:25.280 --> 46:30.280] But also, actually, most of the cost for rooftop solar is the installation,

[46:30.280 --> 46:32.280] and that just got more efficient as well.

[46:32.280 --> 46:33.280] Plus, they last longer,

[46:33.280 --> 46:38.280] so they were valuable in terms of their amortized benefit.

[46:38.280 --> 46:42.280] Some predictions go even as high as over 40 years, right, Steve?

[46:42.280 --> 46:43.280] Really?

[46:43.280 --> 46:44.280] Some of them.

[46:44.280 --> 46:47.280] You don't get to the 80% level until 40 years?

[46:47.280 --> 46:51.280] No, 40 to 50 is the high end of the predictions,

[46:51.280 --> 46:54.280] but even those, some experts think are conservative,

[46:54.280 --> 46:59.280] and those panels that are rated for 40 to 50 years in the real world

[46:59.280 --> 47:01.280] may last for up to 100 years.

[47:01.280 --> 47:02.280] Wow, man.

[47:02.280 --> 47:06.280] But 20 is the industry standard, at least 20,

[47:06.280 --> 47:10.280] but most are in the 20 to, like, 30 to 35 range,

[47:10.280 --> 47:13.280] but really, the good ones are like 40 to 50,

[47:13.280 --> 47:14.280] and that may be conservative.

[47:14.280 --> 47:18.280] Steve, are we talking about where the effectiveness goes from 100% down to 80%?

[47:18.280 --> 47:25.280] Yeah, it maintains more than 80% of its efficiency for that period of time.

[47:25.280 --> 47:28.280] So this is for silicon solar panels.

[47:28.280 --> 47:33.280] Right now, the industry standard in terms of efficiency is around 20%, right?

[47:33.280 --> 47:38.280] So if you buy silicon crystal solar photovoltaic panels right now,

[47:38.280 --> 47:41.280] they're going to be rated for something like 30 years, 35 years.

[47:41.280 --> 47:44.280] They'll be about 20% efficiency,

[47:44.280 --> 47:55.280] and the cost per watt of capacity for rooftop solar is $2.65 per watt.

[47:55.280 --> 48:00.280] For utility-scale solar, it's $0.89 per watt.

[48:00.280 --> 48:01.280] It's a lot cheaper.

[48:01.280 --> 48:07.280] Again, because most of that cost is bolting them into your roof, right?
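Those per-watt install prices can be turned into a rough cost of electricity. The $2.65 and $0.89 figures below are the ones quoted above; the roughly 1,700 equivalent full-sun hours per year and 30-year panel life are illustrative assumptions, not figures from the episode:

```python
# Rough amortized electricity cost from installed cost per watt.
# Assumes ~1,700 equivalent full-sun hours/year and a 30-year panel life
# (illustrative assumptions), and ignores maintenance and financing.
FULL_SUN_HOURS_PER_YEAR = 1700
LIFESPAN_YEARS = 30

def cents_per_kwh(install_cost_per_watt: float) -> float:
    lifetime_kwh_per_watt = FULL_SUN_HOURS_PER_YEAR * LIFESPAN_YEARS / 1000
    return 100 * install_cost_per_watt / lifetime_kwh_per_watt

print(f"rooftop ($2.65/W): {cents_per_kwh(2.65):.1f} cents/kWh")  # ~5.2
print(f"utility ($0.89/W): {cents_per_kwh(0.89):.1f} cents/kWh")  # ~1.7
```

Under those assumptions, rooftop works out to roughly 5 cents per kWh and utility scale to under 2 cents.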

[48:07.280 --> 48:11.280] But silicon is not the only game in town.

[48:11.280 --> 48:19.280] There are three other categories of solar panels that I want to talk about.

[48:19.280 --> 48:26.280] There's perovskite, organic, and then what I'm calling the wildcards.

[48:26.280 --> 48:32.280] There's some research out there that, like, if any of those research hits,

[48:32.280 --> 48:37.280] if they actually manage to convert that into an industrial process, all bets are off.

[48:37.280 --> 48:39.280] We'll talk about them less.

[48:39.280 --> 48:40.280] All right.

Perovskite panels (48:20)

[48:40.280 --> 48:43.280] So I think I've mentioned these different types of solar panels before on the show,

[48:43.280 --> 48:46.280] but it's good to just give an update of where is everything right now,

[48:46.280 --> 48:49.280] because there's been a lot of recent advances.

[48:49.280 --> 48:54.280] And I'll remind you, when we first started this podcast back in 2005,

[48:54.280 --> 48:59.280] silicon photovoltaics were around 10%, 11% efficient, right?

[48:59.280 --> 49:01.280] And obviously, they were much more expensive.

[49:01.280 --> 49:03.280] Even the sun was 11% back then.

[49:03.280 --> 49:09.280] Yeah, and they were struggling to have a 20-year, they were like 15 to 20-year lifespan.

[49:09.280 --> 49:11.280] So they've dramatically improved since then,

[49:11.280 --> 49:15.280] and through just steady, continuous, incremental improvement.

[49:15.280 --> 49:19.280] Where are we right now with perovskite?

[49:19.280 --> 49:23.280] So perovskite is a type of chemical crystal, right?

[49:23.280 --> 49:27.280] It's compounds that follow a chemical formula of ABX3,

[49:27.280 --> 49:33.280] where A and B are cations and X is an anion that binds to both A and B, right?

[49:33.280 --> 49:37.280] That's a perovskite crystal if you follow that chemical formula.

[49:37.280 --> 49:39.280] Why are we so interested in them?

[49:39.280 --> 49:45.280] Well, many researchers think that perovskite photovoltaic cells will be the successor of silicon.

[49:45.280 --> 49:46.280] Really?

[49:46.280 --> 49:48.280] And yes, for two main reasons.

[49:48.280 --> 49:51.280] The big one is economics.

[49:51.280 --> 49:56.280] One of the big costs of manufacturing silicon photovoltaic cells

[49:56.280 --> 50:04.280] is that they require very high temperatures, 3,000 degrees Fahrenheit for manufacture.

[50:04.280 --> 50:07.280] Perovskite can be manufactured at room temperature.

[50:07.280 --> 50:10.280] That's huge in terms of the cost of production.

[50:10.280 --> 50:15.280] And of course, carbon efficiency, not just cost effectiveness,

[50:15.280 --> 50:17.280] but energy efficiency, carbon efficiency, because it costs a lot.

[50:17.280 --> 50:19.280] Sure, it's cleaner.

[50:19.280 --> 50:24.280] It costs a lot of energy to manufacture things at 3,000 degrees Fahrenheit.

[50:24.280 --> 50:25.280] So that's one main reason.

[50:25.280 --> 50:32.280] The other one is that the theoretical upper limit of efficiency of conversion of sunlight into electricity is higher.

[50:32.280 --> 50:36.280] The theoretical upper limit for silicon is 29 percent,

[50:36.280 --> 50:40.280] although already there's research saying, oh, we could cheat it this way by doing whatever,

[50:40.280 --> 50:42.280] get up to 30 or 31, whatever percent.

[50:42.280 --> 50:44.280] But, you know, let's just say 29 percent.

[50:44.280 --> 50:48.280] For perovskite, it's more like the upper 30s.

[50:48.280 --> 50:49.280] Nice.

[50:49.280 --> 50:56.280] Yeah, the limit of silicon and probably anything you could do to cheat the silicon to a higher percentage,

[50:56.280 --> 50:58.280] you could probably do it for the perovskite.

[50:58.280 --> 51:00.280] So what, 37, 38?

[51:00.280 --> 51:02.280] Yeah, maybe 40 percent.

[51:02.280 --> 51:04.280] Wow, that would be amazing.

[51:04.280 --> 51:07.280] So those are the two big advantages, right?

[51:07.280 --> 51:13.280] Cheaper to manufacture because of the lower temperature and a higher potential conversion efficiency than silicon.

[51:13.280 --> 51:15.280] But there's a big downside.

[51:15.280 --> 51:16.280] Do you guys remember what it is?

[51:16.280 --> 51:18.280] Yes, it causes cancer.

[51:18.280 --> 51:21.280] No, the lifespan.

[51:21.280 --> 51:23.280] Lifespan, I would think.

[51:23.280 --> 51:31.280] When they first made perovskite crystals, they only survived for seconds before they broke down.

[51:31.280 --> 51:33.280] That's a problem.

[51:33.280 --> 51:35.280] And now we got it up to minutes.

[51:35.280 --> 51:39.280] Then they got it up to minutes and then hours and days.

[51:39.280 --> 51:46.280] And now they're running at about, well, so a couple of months ago,

[51:46.280 --> 51:53.280] researchers published a study where they had a perovskite that was stable up to 51,000 hours of continuous sunlight,

[51:53.280 --> 52:02.280] which they say would translate in real-world application to 30 years, which is now beyond the 20-year industry standard.

[52:02.280 --> 52:03.280] That's great.

[52:03.280 --> 52:04.280] I know.

[52:04.280 --> 52:06.280] You got me all nervous and upset.

[52:06.280 --> 52:07.280] But this is very new.

[52:07.280 --> 52:08.280] 15 years.

[52:08.280 --> 52:09.280] This is new.

[52:09.280 --> 52:11.280] It's still a problem.

[52:11.280 --> 52:13.280] But this is in the lab.

[52:13.280 --> 52:16.280] Will it translate into manufacturing in the real world?

[52:16.280 --> 52:17.280] We will see.
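For readers who want to check the 51,000-hours-to-30-years conversion: the 51,000 hours and 30 years are the figures described above, and the implied average sun exposure is the derived quantity:

```python
# Sanity-check the conversion of 51,000 hours of continuous sunlight
# into a ~30-year real-world lifespan by computing the implied exposure.
STABLE_HOURS = 51_000
CLAIMED_YEARS = 30

hours_per_year = STABLE_HOURS / CLAIMED_YEARS  # equivalent sun-hours per year
hours_per_day = hours_per_year / 365

print(f"{hours_per_year:.0f} sun-hours/year, about {hours_per_day:.1f} per day")
```

That's about 1,700 sun-hours per year, or roughly 4.7 hours of full sun per day, which is a plausible real-world average for a sunny site.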

[52:17.280 --> 52:24.280] They did it by essentially making a cap, like a two-dimensional cap layer to go over it to basically protect it.

[52:24.280 --> 52:26.280] And it worked, the two-dimensional cap layer.

[52:26.280 --> 52:28.280] It's a thin film, only several molecules thick.

[52:28.280 --> 52:33.280] So there's been no evidence of SEF, spontaneous existence failure?

[52:33.280 --> 52:35.280] No, no SEF.

[52:35.280 --> 52:38.280] Which show is that from?

[52:38.280 --> 52:40.280] Douglas Adams, I think that's what he called it.

[52:40.280 --> 52:41.280] Oh, yes, of course.

[52:41.280 --> 52:44.280] So this is like new in the lab kind of science.

[52:44.280 --> 52:50.280] We have to see how long it will take and if it will fully translate into industrial production.

[52:50.280 --> 52:56.280] But again, if you recall, we talked about all these news items like for batteries and photovoltaic cells,

[52:56.280 --> 52:59.280] and you never know which ones are going to pan out.

[52:59.280 --> 53:06.280] Over time, the things we were talking about in 2010 are the solar panels of 2020.

[53:06.280 --> 53:07.280] You know what I mean?

[53:07.280 --> 53:14.280] So the things we're talking about now by 2030 probably are going to be our solar panels.

[53:14.280 --> 53:16.280] Hope so.

[53:16.280 --> 53:23.280] Steve, so the 30-year perovskite, what efficiency is that at right now?

[53:23.280 --> 53:27.280] That was at 14.9% efficiency.

[53:27.280 --> 53:30.280] Still not ready for primetime at all.

[53:30.280 --> 53:37.280] 15% is in the commercial range, as commercial silicon was at 15% to 17% conversion.

[53:37.280 --> 53:40.280] So that's a little bit less than the top-of-the-line silicon.

[53:40.280 --> 53:42.280] Why use it now?

[53:42.280 --> 53:44.280] There's multiple different kinds of silicon.

[53:44.280 --> 53:45.280] It's cheaper.

[53:45.280 --> 53:49.280] It's cheaper, thinner, more flexible.

[53:49.280 --> 53:57.280] Would you pay twice as much to go from 17% to 20% efficiency?

[53:57.280 --> 53:58.280] It's not worth it.

[53:58.280 --> 54:01.280] But the thing is that we're not mass-producing them yet.

[54:01.280 --> 54:05.280] So until we mass-produce them, we can't really compare apples to apples.

[54:05.280 --> 54:08.280] But this is within the commercial range.

[54:08.280 --> 54:10.280] I think that's a way to say it.

[54:10.280 --> 54:13.280] It could end up being cost-effective.

[54:13.280 --> 54:19.280] It has to go through that same process that silicon went through, getting it so much cheaper.

[54:19.280 --> 54:21.280] So I think that perovskite, if I had to guess,

[54:21.280 --> 54:27.280] I would say they would be the solar panels that are going to be industry standard maybe in the 2030s.

[54:27.280 --> 54:33.280] And we're going to be looking at higher efficiencies, much lower production costs, et cetera.

[54:33.280 --> 54:34.280] All right.

Organic panels (54:33)

[54:34.280 --> 54:39.280] The next type of solar panel is organic, organic solar panels.

[54:39.280 --> 54:42.280] These are essentially plastics.

[54:42.280 --> 54:43.280] They're only for vegans?

[54:43.280 --> 54:44.280] How does that work?

[54:44.280 --> 54:45.280] They're polymers.

[54:45.280 --> 54:48.280] They're organic in the chemical sense that they contain carbon.

[54:48.280 --> 54:56.280] They're carbon-based polymers, essentially a form of plastic that can be made into ink-like substances

[54:56.280 --> 55:00.280] which can then literally be printed onto a substrate.

[55:00.280 --> 55:04.280] We could print solar panels, and you could print them on a rigid substrate

[55:04.280 --> 55:07.280] or you could print them on a flexible substrate,

[55:07.280 --> 55:11.280] and then you could have a thin, flexible solar panel, basically.

[55:11.280 --> 55:14.280] You could essentially roll it out like a carpet on your roof.

[55:14.280 --> 55:15.280] Yeah, right.

[55:15.280 --> 55:16.280] That would be awesome.

[55:16.280 --> 55:19.280] And so installation is much cheaper.

[55:19.280 --> 55:26.280] Overall, from what I'm reading, the overall cost is about half of silicon.

[55:26.280 --> 55:29.280] But more expensive than perovskite.

[55:29.280 --> 55:30.280] No, I don't think so.

[55:30.280 --> 55:31.280] I don't know.

[55:31.280 --> 55:35.280] I haven't seen a head-to-head between those two, so I'm not really sure.

[55:35.280 --> 55:38.280] Again, we don't really know until we start mass producing.

[55:38.280 --> 55:39.280] Yeah, yeah, yeah.

[55:39.280 --> 55:40.280] Good deal.

[55:40.280 --> 55:43.280] But theoretically, it should all be cheaper.

[55:43.280 --> 55:48.280] They're predicting, yeah, it'll start at about half as much, half the cost,

[55:48.280 --> 55:52.280] but then with mass production, it should get down to a quarter of the cost,

[55:52.280 --> 55:54.280] 25% of the cost of silicon.

[55:54.280 --> 55:59.280] But what's the limiting factor with organic cells?

[55:59.280 --> 56:01.280] There's two, actually.

[56:01.280 --> 56:06.280] One is they're at the 11% efficiency range right now.

[56:06.280 --> 56:11.280] So that's where the silicon was in 2005, right?

[56:11.280 --> 56:15.280] So it's interesting to think, though, that the current state

[56:15.280 --> 56:22.280] of organic photovoltaics is better than the state of silicon in 2005

[56:22.280 --> 56:25.280] when solar was really starting to take off.

[56:25.280 --> 56:28.280] It's only because it can't compete, really,

[56:28.280 --> 56:34.280] with the current advanced silicon photovoltaics head-to-head.

[56:34.280 --> 56:38.280] But we're just starting to see mass production now,

[56:38.280 --> 56:44.280] like companies producing commercial-scale organic solar panels.

[56:44.280 --> 56:45.280] So we'll see.

[56:45.280 --> 56:47.280] We have to just follow this closely.

[56:47.280 --> 56:50.280] In three, four, five years, we're going to have a really good idea

[56:50.280 --> 56:52.280] about how that industry is shaping up.

[56:52.280 --> 56:55.280] And there's no reason to think that the incremental improvements

[56:55.280 --> 56:59.280] will closely track silicon or even...

[56:59.280 --> 57:01.280] But they might, so...

[57:01.280 --> 57:02.280] Or organic.

[57:02.280 --> 57:05.280] They might, but they could be slower or faster.

[57:05.280 --> 57:07.280] Well, listen to this.

[57:07.280 --> 57:12.280] So what we're seeing in the lab is always sort of predicting

[57:12.280 --> 57:15.280] what we're going to be seeing commercially a few years down the road.

[57:15.280 --> 57:20.280] Already in the lab, researchers have achieved 18% conversion efficiency

[57:20.280 --> 57:22.280] with organic solar cells.

[57:22.280 --> 57:24.280] Again, that's in the commercial range.

[57:24.280 --> 57:28.280] That's almost as good as present-day silicon.

[57:28.280 --> 57:30.280] 20% versus 18% is nothing.

[57:30.280 --> 57:37.280] And again, at half the cost, getting down to 25% with mass production.

[57:37.280 --> 57:40.280] However, the other problem is, again,

[57:40.280 --> 57:42.280] lifespan is not as good as silicon.

[57:42.280 --> 57:47.280] Right now, for what's being cranked out, it's about 10 years.

[57:47.280 --> 57:52.280] Five to 10 years, sort of 10 years is optimistic.

[57:52.280 --> 57:54.280] Sort of a narrow scope of use in that regard.

[57:54.280 --> 57:55.280] But again, it's so cheap.

[57:55.280 --> 57:57.280] Buy it again in 10 years.

[57:57.280 --> 57:58.280] Exactly.

[57:58.280 --> 58:01.280] You have to consider the lifetime cost.

[58:01.280 --> 58:02.280] Exactly.
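The lifetime-cost point can be made concrete. A sketch comparing dollars per watt per year of service, using the $2.65/W silicon price and 30-year lifespan quoted earlier, and the half-cost and quarter-cost organic scenarios with a roughly 10-year lifespan from this discussion:

```python
# Amortized cost in dollars per watt per year of service.
# Figures from the discussion: silicon at $2.65/W lasting ~30 years;
# organic at half (eventually a quarter) of silicon's cost, lasting ~10 years.
def cost_per_watt_year(cost_per_watt: float, lifespan_years: float) -> float:
    return cost_per_watt / lifespan_years

silicon = cost_per_watt_year(2.65, 30)              # ~$0.088/W/yr
organic_half = cost_per_watt_year(2.65 / 2, 10)     # ~$0.133/W/yr
organic_quarter = cost_per_watt_year(2.65 / 4, 10)  # ~$0.066/W/yr

print(f"silicon: {silicon:.3f}  organic@half: {organic_half:.3f}  "
      f"organic@quarter: {organic_quarter:.3f}  ($/W/yr)")
```

Note that at half the up-front cost, a 10-year organic panel is actually more expensive per year of service than silicon; it only pulls ahead once mass production gets it down toward a quarter of the cost.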

[58:02.280 --> 58:06.280] But there's already, again, laboratory studies showing that...

[58:06.280 --> 58:11.280] Well, one researcher said that they can alter the junction,

[58:11.280 --> 58:16.280] the organic junction, so that it could survive for 27,000 years.

[58:16.280 --> 58:18.280] It basically doesn't break down.

[58:18.280 --> 58:19.280] What?

[58:19.280 --> 58:23.280] But again, I don't know how that will translate into commercial.

[58:23.280 --> 58:25.280] Even if they did half of that.

[58:25.280 --> 58:26.280] Yeah.

[58:26.280 --> 58:27.280] 13,000 years.

[58:27.280 --> 58:29.280] But Evan, you hit upon it.

[58:29.280 --> 58:31.280] It's not 2005, right?

[58:31.280 --> 58:33.280] It's 2022.

[58:33.280 --> 58:38.280] So given the realities of the market, how are organic solar cells

[58:38.280 --> 58:41.280] going to break into the market?

[58:41.280 --> 58:44.280] Do you want to put 11% efficient panels on your roof?

[58:44.280 --> 58:48.280] You're not going to be able to get up to covering 100% of your electricity.

[58:48.280 --> 58:52.280] It's probably worth the extra expense just to get the silicon.

[58:52.280 --> 58:57.280] However, there are some applications for which it may be ideal.

[58:57.280 --> 59:02.280] For example, rooftop solar on electric vehicles.

[59:02.280 --> 59:04.280] So think about that.

[59:04.280 --> 59:07.280] Some car companies are already doing this.

[59:07.280 --> 59:10.280] So they're thin, they're flexible, they're light,

[59:10.280 --> 59:12.280] so you can put them on the roof of a car.

[59:12.280 --> 59:15.280] And the 10-year lifespan is not that big a deal.

[59:15.280 --> 59:18.280] That's about what the lifespan of your car is.

[59:18.280 --> 59:20.280] It's not a home.

[59:20.280 --> 59:24.280] Plus, they would be useful for small devices, for remote devices,

[59:24.280 --> 59:27.280] for things with batteries where you don't want to have to go in there

[59:27.280 --> 59:30.280] and change the battery, wearable electronics.

[59:30.280 --> 59:32.280] So again, thin, flexible, light.

[59:32.280 --> 59:34.280] Think about those features.

[59:34.280 --> 59:39.280] So probably what's going to happen is there's going to be these parallel markets

[59:39.280 --> 59:43.280] where organic solar cells are going to fill a different niche

[59:43.280 --> 59:46.280] than the crystal panels,

[59:46.280 --> 59:50.280] silicon now, maybe perovskite in 5 to 10 years.

[59:50.280 --> 59:54.280] And that will bootstrap the organic solar cell industry

[59:54.280 --> 59:56.280] by filling these other applications,

[59:56.280 --> 01:00:01.280] putting it where you basically can't put silicon because it's too heavy or whatever.

[01:00:01.280 --> 01:00:03.280] And you can also, again, because it's so cheap,

[01:00:03.280 --> 01:00:05.280] you could basically paint your whole house in it.

[01:00:05.280 --> 01:00:09.280] This is the stuff where, if you're going to paint your house

[01:00:09.280 --> 01:00:12.280] in photovoltaics, it's going to be organic.

[01:00:12.280 --> 01:00:16.280] Or this is the one where you can have your windows be collecting energy.

[01:00:16.280 --> 01:00:20.280] The other advantage to organic is that it can produce electricity

[01:00:20.280 --> 01:00:25.280] at really low light levels, including indoor light levels.

[01:00:25.280 --> 01:00:28.280] Whereas silicon cannot do that.

[01:00:28.280 --> 01:00:29.280] And you need the sun.

[01:00:29.280 --> 01:00:31.280] Yeah, broader applications.

[01:00:31.280 --> 01:00:37.280] So just given the steady drumbeat of scientific advances,

[01:00:37.280 --> 01:00:42.280] solar is just going to be unbelievable over the next 20 years.

[01:00:42.280 --> 01:00:44.280] It's already the cheapest form of energy.

[01:00:44.280 --> 01:00:47.280] It's just going to get ridiculously cheap.

[01:00:47.280 --> 01:00:51.280] The only real limiting factor is the intermittency

[01:00:51.280 --> 01:00:54.280] in terms of providing energy for the grid.

[01:00:54.280 --> 01:00:57.280] That's where batteries come in or other forms of grid storage.

[01:00:57.280 --> 01:01:01.280] If you could marry solar to grid storage, we're basically done.

[01:01:01.280 --> 01:01:04.280] Although some people have argued, I don't totally buy this,

[01:01:04.280 --> 01:01:09.280] that you have to have a shitload of overcapacity and distribution

[01:01:09.280 --> 01:01:13.280] as one way to overcome the intermittency without grid storage.

[01:01:13.280 --> 01:01:16.280] And that makes it less and less cost-effective.

[01:01:16.280 --> 01:01:20.280] But they say, yeah, but it's going to get so freaking cheap it doesn't matter.

[01:01:20.280 --> 01:01:21.280] It just doesn't matter.

[01:01:21.280 --> 01:01:23.280] You could put it freaking everywhere.

[01:01:23.280 --> 01:01:24.280] It's just going to be so cheap.

[01:01:24.280 --> 01:01:25.280] So that may be true.

[01:01:25.280 --> 01:01:29.280] So it becomes hard to predict what the grid's going to look like in 10 to 20 years

[01:01:29.280 --> 01:01:35.280] because everything's got to be compared to this plummeting price of solar.

"Wildcards" panels (1:01:37)

[01:01:35.280 --> 01:01:40.280] All right, very quickly before I end, the wild cards.

[01:01:40.280 --> 01:01:47.280] So I want to talk about a company called NovaSolix.

[01:01:47.280 --> 01:01:53.280] And they're using basically RFID technology.

[01:01:53.280 --> 01:01:57.280] You guys familiar with the concept of a rectenna?

[01:01:57.280 --> 01:01:58.280] Yeah.

[01:01:58.280 --> 01:01:59.280] No, what is that?

[01:01:59.280 --> 01:02:03.280] So a rectenna, a rectifying antenna, is basically an antenna that captures electromagnetic radiation

[01:02:03.280 --> 01:02:04.280] and turns it into electricity.

[01:02:04.280 --> 01:02:09.280] So if your RFID tag does it with radio frequency, it's radio frequency ID,

[01:02:09.280 --> 01:02:11.280] and that actually powers the tag itself.

[01:02:11.280 --> 01:02:19.280] Well, what if you could make rectifying antennas that are tuned to the full visible spectrum of light put out by the sun?

[01:02:19.280 --> 01:02:21.280] Have they cracked that nut?

[01:02:21.280 --> 01:02:23.280] Well, a company claims that they have.

[01:02:23.280 --> 01:02:24.280] Whoa.

[01:02:24.280 --> 01:02:28.280] The company is NovaSolix, and they claim that they have cracked this.

[01:02:28.280 --> 01:02:36.280] They use carbon nanotubes because they are the correct size to capture that frequency of light.

[01:02:36.280 --> 01:02:44.280] And of course, you can have a ton of them on a photovoltaic wafer because they're so tiny, right?

[01:02:44.280 --> 01:02:45.280] They're nanoscale.

[01:02:45.280 --> 01:02:49.280] They claim theoretical efficiency of, guess how high?

[01:02:49.280 --> 01:02:50.280] 50%.

[01:02:50.280 --> 01:02:51.280] 1.7.

[01:02:51.280 --> 01:02:52.280] 90%.

[01:02:52.280 --> 01:02:53.280] Wow.

[01:02:53.280 --> 01:02:54.280] Oh, no.

[01:02:54.280 --> 01:02:55.280] Come on.

[01:02:55.280 --> 01:02:56.280] They're not there now.

[01:02:56.280 --> 01:02:59.280] That's the theoretical potential of this technology.

[01:02:59.280 --> 01:03:02.280] 90% is what they're claiming they can get to.

[01:03:02.280 --> 01:03:08.280] They're saying they're already like in the 40s in terms of their percent efficiency, 45%.

[01:03:08.280 --> 01:03:10.280] What about construction costs over that?

[01:03:10.280 --> 01:03:11.280] Yeah.

[01:03:11.280 --> 01:03:14.280] So again, until they're mass producing these things, who knows?

[01:03:14.280 --> 01:03:17.280] It uses carbon nanotubes, but they're small, right?

[01:03:17.280 --> 01:03:18.280] So that's fine.

[01:03:18.280 --> 01:03:24.280] We could mass produce lots of small carbon nanotubes as long as we don't have to weave them together into a giant cable.

[01:03:24.280 --> 01:03:25.280] Yeah.

[01:03:25.280 --> 01:03:26.280] How perfect do they have to be?

[01:03:26.280 --> 01:03:30.280] Some application of nanotubes like that have to be pristine.

[01:03:30.280 --> 01:03:31.280] Yeah.

[01:03:31.280 --> 01:03:32.280] They're not structural.

[01:03:32.280 --> 01:03:34.280] I don't think they have to be pristine.

[01:03:34.280 --> 01:03:35.280] Yeah.

[01:03:35.280 --> 01:03:36.280] Well, that's good.

[01:03:36.280 --> 01:03:37.280] Not structural.

[01:03:37.280 --> 01:03:38.280] Wow, 90%.

[01:03:38.280 --> 01:03:45.280] I mean, that's so above and beyond anything else that it would be worth it for niche applications

[01:03:45.280 --> 01:03:51.280] where they would even spend a lot of money on it because they need that efficiency.

[01:03:51.280 --> 01:03:55.280] So they're claiming proof of concept at 43% efficiency.

[01:03:55.280 --> 01:03:59.280] They're claiming – so even if they don't get beyond 43%, anything between that

[01:03:59.280 --> 01:04:01.280] and 90 is just gravy, in my opinion.

[01:04:01.280 --> 01:04:05.280] But also, they say that they predict that the production costs – and again, they're

[01:04:05.280 --> 01:04:09.280] just hyping themselves, so it's hard to say until they do it – but that it actually

[01:04:09.280 --> 01:04:13.280] would be much cheaper than current production.

[01:04:13.280 --> 01:04:16.280] They're claiming 10 cents per watt, remember?

[01:04:16.280 --> 01:04:17.280] Damn.

[01:04:17.280 --> 01:04:19.280] We're at 90 cents now?

[01:04:19.280 --> 01:04:21.280] Current solar is –

[01:04:21.280 --> 01:04:22.280] 89 cents?

[01:04:22.280 --> 01:04:27.280] $2.65 per watt on rooftop, 89 cents utility scale.

[01:04:27.280 --> 01:04:33.280] So they would say if we use the utility scale, 89 cents down to 10 cents, it's almost like

[01:04:33.280 --> 01:04:34.280] 10% of the cost of –

[01:04:34.280 --> 01:04:36.280] My gosh, it all sounds too good.

[01:04:36.280 --> 01:04:39.280] Right, but there's nothing about it that's pseudoscience.

[01:04:39.280 --> 01:04:40.280] It's just – yeah, it all –

[01:04:40.280 --> 01:04:41.280] Engineering, man.

[01:04:41.280 --> 01:04:42.280] It's all engineering.

[01:04:42.280 --> 01:04:45.280] It's all getting it to work in a pragmatic way.

[01:04:45.280 --> 01:04:46.280] It's all pragmatic.

[01:04:46.280 --> 01:04:51.280] Again, proof of concept in the lab means the science works, but that does not always translate

[01:04:51.280 --> 01:04:52.280] to the industry.

[01:04:52.280 --> 01:04:53.280] Real world, yeah.

[01:04:53.280 --> 01:04:58.280] But we do know that enough of them do that we make steady progress.

[01:04:58.280 --> 01:05:00.280] But imagine if this hits, right?

[01:05:00.280 --> 01:05:01.280] If this actually works, and suddenly –

[01:05:01.280 --> 01:05:03.280] Nothing like a so-called –

[01:05:03.280 --> 01:05:04.280] Yeah, suddenly we have –

[01:05:04.280 --> 01:05:05.280] Some random leap.

[01:05:05.280 --> 01:05:08.280] 90% efficient panels at 10 cents per watt.

[01:05:08.280 --> 01:05:09.280] It's then –

[01:05:09.280 --> 01:05:10.280] You're powering the planet.

[01:05:10.280 --> 01:05:11.280] It's a different world.

[01:05:11.280 --> 01:05:12.280] It's a different world.

[01:05:12.280 --> 01:05:15.280] You power the planet at that point, basically.

[01:05:15.280 --> 01:05:18.280] I assume they could be flexible as well.

[01:05:18.280 --> 01:05:19.280] Yeah, yeah, yeah.

[01:05:19.280 --> 01:05:21.280] I mean, what wouldn't you want to power with this?

[01:05:21.280 --> 01:05:22.280] Because they're not crystals.

[01:05:22.280 --> 01:05:23.280] Exactly.

[01:05:23.280 --> 01:05:25.280] I mean, it blows everything else out the door.

[01:05:25.280 --> 01:05:26.280] Everything.

[01:05:26.280 --> 01:05:27.280] Right.

[01:05:27.280 --> 01:05:28.280] So who knows?

[01:05:28.280 --> 01:05:29.280] Yeah.

[01:05:29.280 --> 01:05:30.280] Okay.

[01:05:30.280 --> 01:05:31.280] Well, it's nice.

[01:05:31.280 --> 01:05:32.280] If that will pay off.

[01:05:32.280 --> 01:05:33.280] Wow.

[01:05:33.280 --> 01:05:37.960] But the thing is, all of these proof of concepts in the lab things that we've been talking

[01:05:37.960 --> 01:05:41.280] about over the years have slowly been trickling into the market.

[01:05:41.280 --> 01:05:42.280] Yeah.

[01:05:42.280 --> 01:05:43.280] That's right.

[01:05:43.280 --> 01:05:44.280] They're coming around.

[01:05:44.280 --> 01:05:45.280] So here's the thing.

[01:05:45.280 --> 01:05:50.840] With solar, the baseline is we're going to have further incremental improvements, right?

[01:05:50.840 --> 01:05:54.340] The next 20 years are probably going to be similar to the last 20 years.

[01:05:54.340 --> 01:06:00.880] So by the time we get to 2030 or 2040, solar is going to be ridiculously cheap and powerful,

[01:06:00.880 --> 01:06:03.000] even without any breakthroughs.

[01:06:03.000 --> 01:06:08.060] But it's at least a coin flip that we may also get a massive breakthrough that puts

[01:06:08.060 --> 01:06:09.560] it on even a different level.

[01:06:09.560 --> 01:06:11.920] But even without it, it's still going to be off the hook.

[01:06:11.920 --> 01:06:12.920] Yeah.

[01:06:12.920 --> 01:06:13.920] Yeah.

[01:06:13.920 --> 01:06:19.080] So all of our projections about the future assume no further advances in solar power,

[01:06:19.080 --> 01:06:21.560] but that's really not realistic.

[01:06:21.560 --> 01:06:27.520] So whenever we predict what things are going to be like in 2030 or 2040, we have

[01:06:27.520 --> 01:06:33.640] to include at least the incremental advances in solar power, because it's almost guaranteed

[01:06:33.640 --> 01:06:34.640] at this point.

[01:06:34.640 --> 01:06:35.640] Yeah.

[01:06:35.640 --> 01:06:36.640] I mean, there's no reason to think that we're going to hit some wall.

[01:06:36.640 --> 01:06:42.720] What's the world going to be like with a 90% efficiency that's off the hook?

[01:06:42.720 --> 01:06:43.720] What does that mean?

[01:06:43.720 --> 01:06:49.920] I mean, how much of your roof would even need to be covered if we had 90%?

[01:06:49.920 --> 01:06:53.360] But yeah, we would just cover, I mean, I don't know why we wouldn't just do it.

[01:06:53.360 --> 01:06:55.720] It's not how cheap it is, right?

[01:06:55.720 --> 01:06:59.120] But imagine if you could pay for all of your electricity.

[01:06:59.120 --> 01:07:04.960] Instead of now spending between $40,000 and $60,000, it costs you $5,000.

[01:07:04.960 --> 01:07:06.760] That's a completely different world, again, right?

[01:07:06.760 --> 01:07:07.760] Yeah.

[01:07:07.760 --> 01:07:08.760] Yeah.

[01:07:08.760 --> 01:07:09.760] Go to the bank right now and pull it out.

[01:07:09.760 --> 01:07:10.760] Who can afford it?

[01:07:10.760 --> 01:07:12.720] What house wouldn't have it?

[01:07:12.720 --> 01:07:13.720] Every house would have it.

[01:07:13.720 --> 01:07:14.720] Every one.

[01:07:14.720 --> 01:07:18.720] And then all your electricity is covered for free for 20 years, 30 years, you know?

[01:07:18.720 --> 01:07:19.720] Right.

[01:07:19.720 --> 01:07:20.720] Yeah.

[01:07:20.720 --> 01:07:21.720] All right.

[01:07:21.720 --> 01:07:22.720] We're getting there.

[01:07:22.720 --> 01:07:23.720] Let's see.

[01:07:23.720 --> 01:07:24.720] We're getting there.

[01:07:24.720 --> 01:07:26.720] It'll be interesting to follow in the next five to 10 years.

[01:07:26.720 --> 01:07:27.720] Yeah.

[01:07:27.720 --> 01:07:28.720] Cool, cool.

[01:07:28.720 --> 01:07:34.360] Pretty much, we've been saying politics has essentially failed, though I think we're starting

[01:07:34.360 --> 01:07:36.000] to turn a corner now.

[01:07:36.000 --> 01:07:38.840] But in the background, scientific advance is going...

[01:07:38.840 --> 01:07:43.260] If there's any hope of solving global warming, it's going to be this slow background advance

[01:07:43.260 --> 01:07:44.840] in the science in the background.

[01:07:44.840 --> 01:07:45.840] And it's happening.

[01:07:45.840 --> 01:07:46.840] And solar, I think, is...

[01:07:46.840 --> 01:07:48.400] And politics will eventually, hopefully, catch up.

[01:07:48.400 --> 01:07:49.400] Yeah, hopefully.

[01:07:49.400 --> 01:07:50.400] But it usually lags.

[01:07:50.400 --> 01:07:51.680] It usually lags.

[01:07:51.680 --> 01:08:02.320] Which it is.

Interview with Dr. Seema Yasmin (1:07:53)

[01:08:02.320 --> 01:08:04.480] Joining us now is Dr. Seema Yasmin.

[01:08:04.480 --> 01:08:06.520] Seema, welcome to The Skeptics' Guide.

[01:08:06.520 --> 01:08:07.520] Thank you.

[01:08:07.520 --> 01:08:08.520] Thanks for having me.

[01:08:08.520 --> 01:08:13.640] And you have a book coming out available September 20th called What the Fact?

[01:08:13.640 --> 01:08:16.600] Finding the Truth in All the Noise.

[01:08:16.600 --> 01:08:17.840] Tell us what the book's about.

[01:08:17.840 --> 01:08:20.320] I am very excited about this book.

[01:08:20.320 --> 01:08:26.600] It's my first time writing for a younger audience, although it's really a book for everyone.

[01:08:26.600 --> 01:08:31.760] And I think it's a book that a lot of people need right now, because What the Fact is a navigation

guide

[01:08:32.760 --> 01:08:40.480] to spotting and surviving all the misinformation and disinformation and malinformation, all

[01:08:40.480 --> 01:08:48.080] the different kinds of false information that are flying around our information ecosystem.

[01:08:48.080 --> 01:08:54.200] So the book uses loads of anecdotes, very wild and absurd and hard to believe stories

[01:08:54.200 --> 01:09:00.000] to kind of really show how even those of us who think we're highly intelligent and highly

[01:09:00.000 --> 01:09:04.160] educated can spot a lie from a mile away.

[01:09:04.160 --> 01:09:10.960] Actually, the way that our brains are wired, we are all susceptible to falling for bunk.

[01:09:10.960 --> 01:09:16.440] And this book is arming you with the tools to prevent you from being duped.

[01:09:16.440 --> 01:09:19.640] Tell us what the difference is between misinformation and disinformation.

[01:09:19.640 --> 01:09:24.640] Yeah, I mean, they're definitely words that get bandied about a lot at the moment, especially

[01:09:24.640 --> 01:09:26.200] since the beginning of COVID.

[01:09:26.200 --> 01:09:28.940] But there is a difference between them.

[01:09:28.940 --> 01:09:34.880] So misinformation is false information, but it's false information that's spread

[01:09:34.880 --> 01:09:37.380] without any bad intention.

[01:09:37.380 --> 01:09:41.300] And often it's spread by somebody who doesn't actually realize that what they're telling

[01:09:41.300 --> 01:09:43.200] you is false.

[01:09:43.200 --> 01:09:49.560] So it might be like a friend saying, Hey, I heard that if you gargle with this particular

[01:09:49.560 --> 01:09:52.160] mouthwash, you won't get COVID.

[01:09:52.160 --> 01:09:58.140] Now they don't know that that's bunk and they're not saying it to harm you in any way, which

[01:09:58.140 --> 01:10:04.280] is different from disinformation, which is also false information, but false information

[01:10:04.280 --> 01:10:09.520] that is spread with this deliberate intent to cause harm and the people spreading it

[01:10:09.520 --> 01:10:11.680] know that it's false information.

[01:10:11.680 --> 01:10:16.480] So it could be like the bots and the trolls that we see on social media that we've now

[01:10:16.480 --> 01:10:22.680] linked back to some Russian agencies that are deliberately trying to cause mayhem and

[01:10:22.680 --> 01:10:27.740] panic, whether it's spreading falsehoods about elections or about the pandemic or any other

[01:10:27.740 --> 01:10:28.740] kind of topic.

[01:10:28.740 --> 01:10:32.800] Is there a difference in the information itself or just the intent, or even if it's

[01:10:32.800 --> 01:10:33.800] just a tendency?

[01:10:33.800 --> 01:10:39.200] Because I would suspect that disinformation is more crafted and curated, not just organic,

[01:10:39.200 --> 01:10:40.200] right?

[01:10:40.200 --> 01:10:41.360] Yeah, exactly.

[01:10:41.360 --> 01:10:46.040] They can look similar, but disinformation is really interesting in that you can kind of

[01:10:46.040 --> 01:10:50.000] follow the trail, follow the crumbs back to its origins.

[01:10:50.000 --> 01:10:51.780] Like where did this come from?

[01:10:51.780 --> 01:10:59.360] This idea, for example, just picking one random example that if you get the COVID vaccine,

[01:10:59.360 --> 01:11:04.600] it will give you magnetic superpowers, for example, if you remember that particular piece

[01:11:04.600 --> 01:11:05.600] of bunk.

[01:11:05.600 --> 01:11:09.120] I mean, at one point there were so many rumors spreading about the COVID vaccine that it

[01:11:09.120 --> 01:11:15.440] was really hard to keep track, but sometimes it starts off as like a wild rumor, but other

[01:11:15.440 --> 01:11:22.720] times it came from a bad actor, perhaps a foreign government, a group that has a nefarious

[01:11:22.720 --> 01:11:27.760] agenda and is out to dupe people, out to cause havoc.

[01:11:27.760 --> 01:11:33.740] And we've seen this play out, like I said, not just in an anti-vaccine sense, but also,

[01:11:33.740 --> 01:11:39.840] you know, disinformation that was crafted really specifically to cause lots of distrust

[01:11:39.840 --> 01:11:45.420] around the Black Lives Matter movement or to get people not voting for a particular

[01:11:45.420 --> 01:11:48.840] presidential candidate in the US election.

[01:11:48.840 --> 01:11:54.120] So yeah, disinformation, often when you trace it back to its roots, it was very carefully

[01:11:54.120 --> 01:11:58.360] crafted to cause a particular kind of chaos.

[01:11:58.360 --> 01:12:05.360] Yeah, that I would imagine would be much more nefarious because if it's crafted, like people

[01:12:05.360 --> 01:12:11.600] learn over time what buttons to push, like how to make the information spread more effectively

[01:12:11.600 --> 01:12:14.800] and to have more of an effect on their intended target, right?

[01:12:14.800 --> 01:12:16.800] That's a craft, disinformation.

[01:12:16.800 --> 01:12:18.120] Absolutely.

[01:12:18.120 --> 01:12:21.360] There's basically a disinformation playbook.

[01:12:21.360 --> 01:12:24.720] And you know, nowadays we're all saying like, oh, fake news, fake news, there's so much

[01:12:24.720 --> 01:12:29.280] fake news out here and it's all the fault of the internet, social media.

[01:12:29.280 --> 01:12:30.840] Some of that is true, right?

[01:12:30.840 --> 01:12:36.680] Having this worldwide web definitely allows you to accelerate the transmission of falsehoods

[01:12:36.680 --> 01:12:37.680] globally.

[01:12:37.680 --> 01:12:46.000] But when you take disinformation apart, even nowadays, you see how it was constructed.

[01:12:46.000 --> 01:12:52.600] It's straight out of, for example, the KGB's disinformation playbook that was

[01:12:52.600 --> 01:12:58.640] deployed during the Soviet Union's information warfare, basically.

[01:12:58.640 --> 01:13:04.680] So it's the same crafty tactics that in and of themselves are not high tech.

[01:13:04.680 --> 01:13:10.260] They're things like have a falsehood, but have a kernel of truth in it that makes it

[01:13:10.260 --> 01:13:11.880] seem believable.

[01:13:11.880 --> 01:13:18.480] For example, when you craft a falsehood, word it so that it's really like emotionally triggering,

[01:13:18.480 --> 01:13:20.180] that will help it go viral.

[01:13:20.180 --> 01:13:24.000] So none of that stuff, none of those techniques are actually that new.

[01:13:24.000 --> 01:13:26.120] They are really, really old.

[01:13:26.120 --> 01:13:28.920] It's just like the dark side of clickbait, basically.

[01:13:28.920 --> 01:13:29.920] Basically, yeah.

[01:13:29.920 --> 01:13:30.920] And I think it's fascinating.

[01:13:30.920 --> 01:13:33.840] You say clickbait, that's such a word of the time.

[01:13:33.840 --> 01:13:40.460] We click buttons nowadays to spread this information and disinformation, but you can trace it back

[01:13:40.460 --> 01:13:46.120] decades ago, sometimes even hundreds of years ago to know that humans have always known

[01:13:46.120 --> 01:13:48.520] how to dupe other humans.

[01:13:48.520 --> 01:13:54.840] We just keep getting better at it and technology has really armed us to be able to send those

[01:13:54.840 --> 01:13:59.200] falsehoods flying around the world a lot faster.

[01:13:59.200 --> 01:14:05.400] When I talk to people online who don't come from the same place that I'm coming from,

[01:14:05.400 --> 01:14:09.840] people who would believe in something completely different than what

[01:14:09.840 --> 01:14:15.080] I would typically believe in, and today it's usually around politics, it feels like there

[01:14:15.080 --> 01:14:19.960] is an impenetrable wall between what I think and what they think.

[01:14:19.960 --> 01:14:23.360] And I have found absolutely no way to breach that.

[01:14:23.360 --> 01:14:29.560] Do you have any idea how we will see our way through this or what the short-term future

[01:14:29.560 --> 01:14:30.560] might be?

[01:14:30.560 --> 01:14:31.560] I do, I do.

[01:14:31.560 --> 01:14:36.860] And I can just imagine so many people listening to this right now shaking their heads or nodding

[01:14:36.860 --> 01:14:42.320] their heads like, yes, I've been in that situation, whether it was around an election, whether

[01:14:42.320 --> 01:14:46.200] it was around COVID vaccines, whether it was around monkeypox.

[01:14:46.200 --> 01:14:53.680] But yes, I am very hopeful because as much false information as there is out there, as

[01:14:53.680 --> 01:15:01.880] many bad actors as there are out there, there's so much evidence now, like real good evidence

[01:15:01.880 --> 01:15:05.880] bases for how to disagree effectively.

[01:15:05.880 --> 01:15:08.440] And I just think we're not having enough of those conversations.

[01:15:08.440 --> 01:15:12.320] Of course we get around the water cooler and we're like, oh, I have this uncle and he just

[01:15:12.320 --> 01:15:16.560] won't get vaccinated or he just believes this about the election that it was a fraud, for

[01:15:16.560 --> 01:15:17.560] example.

[01:15:17.560 --> 01:15:23.360] But there is information out there about how to have an effective disagreement.

[01:15:23.360 --> 01:15:26.460] And I detail this in What the Fact.

[01:15:26.460 --> 01:15:31.300] And I didn't expect that when I was beginning to write the book that I would include that

[01:15:31.300 --> 01:15:35.820] in it because I was like, oh, you know, I research misinformation and disinformation.

[01:15:35.820 --> 01:15:38.140] So this book's going to be all about that.

[01:15:38.140 --> 01:15:43.320] And then I got to a point where it was really natural to include in What the Fact a whole

[01:15:43.320 --> 01:15:49.520] chapter about, well, what do you do when you've spotted a falsehood, but your mom or your

[01:15:49.520 --> 01:15:54.140] spouse or your brother is like, no, you're wrong.

[01:15:54.140 --> 01:15:59.220] And they're just digging their heels in deeper into the falsehood.

[01:15:59.220 --> 01:16:04.680] And so there are even scripts in this book to kind of show you like how a conversation

[01:16:04.680 --> 01:16:11.280] can slowly move towards helping someone understand their biases or helping them look at the falsehood

[01:16:11.280 --> 01:16:17.240] that they believe in and really just showing them like, here's why this is not factual.

[01:16:17.240 --> 01:16:18.680] So that is hope.

[01:16:18.680 --> 01:16:24.440] And there's evidence and lots of research being done now to help us spot the lies out

[01:16:24.440 --> 01:16:28.340] there and to have these effective conflicts, if you like.

[01:16:28.340 --> 01:16:32.160] Do you distinguish between disinformation and gaslighting?

[01:16:32.160 --> 01:16:37.080] Because again, to me, that is like another level when there's not only like one lie or

[01:16:37.080 --> 01:16:42.560] a couple of lies, but a such an overwhelming campaign that you're actually creating an

[01:16:42.560 --> 01:16:44.820] alternate reality for somebody.

[01:16:44.820 --> 01:16:46.200] Do I create a distinction?

[01:16:46.200 --> 01:16:53.560] I think that these different strategies work really well together to dupe us, to have us

[01:16:53.560 --> 01:17:00.720] falling for bunk and then to further polarize us so that where we feel safest is in a really

[01:17:00.720 --> 01:17:07.400] comfortable echo chamber where everyone who's around us also confirms our belief, whether

[01:17:07.400 --> 01:17:14.440] that belief is factual, whether that belief is completely absurd and conspiratorial.

[01:17:14.440 --> 01:17:22.280] So I think that gaslighting often works alongside disinformation to not just deepen our belief

[01:17:22.280 --> 01:17:28.200] in falsehoods, but to cause those deeper rifts between groups in society that just keep us

[01:17:28.200 --> 01:17:32.520] separated and keep us really deep in our disagreements.

[01:17:32.520 --> 01:17:39.360] So from what you're saying, it seems that also on the good side, there's basically two,

[01:17:39.360 --> 01:17:44.360] I think I would break it up into two different approaches in terms of combating misinformation.

[01:17:44.360 --> 01:17:50.680] There's protecting yourself, like being able to recognize misinformation and disinformation,

[01:17:50.680 --> 01:17:55.000] and getting through to somebody else who maybe is being victimized by it.

[01:17:55.000 --> 01:17:56.000] Let's start with the first one.

[01:17:56.000 --> 01:18:02.480] What are your key takeaways for how to protect yourself from disinformation?

[01:18:02.480 --> 01:18:08.320] So even that first stage, I would divide into two things, two phases.

[01:18:08.320 --> 01:18:13.800] And one is learning, as What the Fact outlines and helps you figure out, what those red

[01:18:13.800 --> 01:18:20.360] flags are, like, wait a second, I'm not just going to automatically absorb this news story.

[01:18:20.360 --> 01:18:23.400] I'm going to ask a few questions first because I don't know, I'm starting to think this

[01:18:23.400 --> 01:18:24.400] might be suspect.

[01:18:24.400 --> 01:18:29.840] So there are red flags that you can look for there, the red flags of virality, if you like,

[01:18:29.840 --> 01:18:35.000] the fact that something is emotionally triggering, the fact that something can't

[01:18:35.000 --> 01:18:41.200] be easily reinforced by other similar news articles, those kinds of red flags in and

[01:18:41.200 --> 01:18:45.080] of themselves are really important to learn and look out for, just be aware that people use

[01:18:45.080 --> 01:18:52.400] those particular strategies to really strengthen the efficacy of a falsehood flying around.

[01:18:52.400 --> 01:18:57.160] But then the other thing is kind of like, I think even this deeper self work, which

[01:18:57.160 --> 01:19:02.180] is also in What the Fact; there's a whole chapter about our brains and why it is that, again,

[01:19:02.180 --> 01:19:05.600] even if you think you're super duper educated and you're super skeptical, like whatever

[01:19:05.600 --> 01:19:11.040] you think of yourself, you have a human brain and therefore your human brain is susceptible

[01:19:11.040 --> 01:19:14.160] to falling for bunk a lot of the time.

[01:19:14.160 --> 01:19:18.820] And so what I do in What the Fact is kind of help us understand that susceptibility

[01:19:18.820 --> 01:19:23.240] so that we're not prideful all the time and just like, nope, I know what to believe or

[01:19:23.240 --> 01:19:25.880] I don't know what to, oh, I know what not to believe.

[01:19:25.880 --> 01:19:30.480] But actually if you understand a little bit more about how cognitive biases and heuristics,

[01:19:30.480 --> 01:19:36.880] which are like mental shortcuts work, then you can kind of be more aware of them.

[01:19:36.880 --> 01:19:41.280] And also I kind of go into detail about this idea that, look, to get to where you have

[01:19:41.280 --> 01:19:45.120] gotten in life, you've had to believe in certain things.

[01:19:45.120 --> 01:19:50.520] You've had to even have this belief that because the sun rose today, it will rise again tomorrow,

[01:19:50.520 --> 01:19:51.520] for example.

[01:19:51.520 --> 01:19:59.080] But actually it's a really good idea for most, if not all of your beliefs to have an

[01:19:59.080 --> 01:20:03.400] open mind about them and to be very open to challenging your beliefs.

[01:20:03.400 --> 01:20:06.780] It doesn't mean you have to knock them down and crush them all the time.

[01:20:06.780 --> 01:20:12.320] It means that whether it's a belief in genetically modified organisms or whether it's a belief

[01:20:12.320 --> 01:20:18.160] in climate change, be open to challenging and testing that belief from time to time.

[01:20:18.160 --> 01:20:22.000] And then also what I say is instead of being like an on off switch, like, yes, I believe,

[01:20:22.000 --> 01:20:28.640] no, I don't believe, actually is any belief ever 100%, should it be 100%, is that safe?

[01:20:28.640 --> 01:20:34.880] What about instead of that, if we assigned levels of credence or levels of strength to

[01:20:34.880 --> 01:20:39.160] our belief and then we keep an open mind, we take in more evidence every now and again,

[01:20:39.160 --> 01:20:43.640] we're willing to be challenged, and then we perhaps reassign a different level of credence

[01:20:43.640 --> 01:20:44.720] to that belief.

[01:20:44.720 --> 01:20:48.220] So all of these are things that kind of, you know, the new language around this is developing

[01:20:48.220 --> 01:20:54.200] mental immunity, so you can kind of develop this intellectual resistance, if you like,

[01:20:54.200 --> 01:20:55.200] to falling for falsehood.

[01:20:55.200 --> 01:20:57.920] So all of these things can work together.

[01:20:57.920 --> 01:21:06.440] I was wondering, doctor, what was the most surprising thing that came out of your research in preparing

[01:21:06.440 --> 01:21:07.440] for this book?

[01:21:07.440 --> 01:21:10.560] Is there something that absolutely took you by surprise?

[01:21:10.560 --> 01:21:16.720] You were totally shocked when you came across it, and is it in the book?

[01:21:16.720 --> 01:21:20.840] Yes, and it's like a little footnote in the book because it was something I discovered

[01:21:20.840 --> 01:21:24.680] towards the end stages of writing What the Fact.

[01:21:24.680 --> 01:21:30.320] So last year, I was invited to speak at the Vatican because the Pope is very concerned

[01:21:30.320 --> 01:21:34.200] about this post-truth society that we live in.

[01:21:34.200 --> 01:21:37.820] And people probably know Pope Francis has been very worried for a long time about the

[01:21:37.820 --> 01:21:44.240] existential threats of climate change, and then became really worried about people thinking

[01:21:44.240 --> 01:21:48.240] that climate change was a hoax and it wasn't even a real problem.

[01:21:48.240 --> 01:21:53.600] When COVID happened, the Pope and the Vatican kind of became also worried about the anti-science

[01:21:53.600 --> 01:21:56.640] movement kind of at large, you know, in general.

[01:21:56.640 --> 01:22:01.840] So I go to give my talk about my particular research that I do on all of this, but there

[01:22:01.840 --> 01:22:04.780] was another researcher there.

[01:22:04.780 --> 01:22:11.000] Something he said blew my mind and I was like, I have to put this in the book because I want

[01:22:11.000 --> 01:22:12.400] everyone's minds to be blown.

[01:22:12.400 --> 01:22:17.400] I really want us to question like, why we humans believe the things we believe and how

[01:22:17.400 --> 01:22:19.960] we form our beliefs.

[01:22:19.960 --> 01:22:24.360] So here's what he said, and mind you, I'm writing this book that's all about, hey, be

[01:22:24.360 --> 01:22:26.400] aware of these red flags, right?

[01:22:26.400 --> 01:22:30.720] Be aware of how susceptible we are to sometimes falling for lies.

[01:22:30.720 --> 01:22:38.160] And this researcher talked about the former president, President Trump, and said, on average,

[01:22:38.160 --> 01:22:42.880] the president told about 22 lies per day during his presidency, right?

[01:22:42.880 --> 01:22:46.840] So I mean, I think a lot of us are aware of that statistic because it's shocking in and

[01:22:46.840 --> 01:22:55.160] of itself, but then he said, telling the kind of lies that he told, which were really obvious

[01:22:55.160 --> 01:23:01.800] lies was a political strategy, and it's not a new one.

[01:23:01.800 --> 01:23:05.320] It's a really old strategy.

[01:23:05.320 --> 01:23:12.680] And what this strategy does is it signals from a leader to swathes of the public, I

[01:23:12.680 --> 01:23:14.240] am lying.

[01:23:14.240 --> 01:23:21.120] You can tell I'm lying because this is such a blatant lie, however, the fact that I am

[01:23:21.120 --> 01:23:28.760] telling such blatant lies signals to you that I don't give a damn about the status quo.

[01:23:28.760 --> 01:23:36.040] And actually my lying signals to you that I am authentic and genuine.

[01:23:36.040 --> 01:23:40.920] And so this is why many of us would be watching press conferences, for example, and be like,

[01:23:40.920 --> 01:23:43.720] wait, no, you just said black is white and white is black.

[01:23:43.720 --> 01:23:45.400] Like, you know, that's not true.

[01:23:45.400 --> 01:23:50.760] And others would watch the same press conference and be like, yeah, he's speaking for me.

[01:23:50.760 --> 01:23:53.340] He represents me.

[01:23:53.340 --> 01:23:59.100] And so this phenomenon is now on a page in What the Fact because I thought it's really

[01:23:59.100 --> 01:24:03.520] helpful for us to understand not just how to spot lies, not just how to understand our

[01:24:03.520 --> 01:24:11.000] susceptibility to lies, but actually how lies can be weaponized, even when people know that

[01:24:11.000 --> 01:24:13.200] the lie is a lie.

[01:24:13.200 --> 01:24:17.040] So that was definitely something that kind of like floored me for a while.

[01:24:17.040 --> 01:24:18.520] It's powerful, very powerful.

[01:24:18.520 --> 01:24:19.520] Yeah.

[01:24:19.520 --> 01:24:25.480] Seema, are there any sources of misinformation that are worth noting, you know, like an extreme

[01:24:25.480 --> 01:24:28.320] source of misinformation that a lot of people don't know about?

[01:24:28.320 --> 01:24:29.960] There are so many.

[01:24:29.960 --> 01:24:33.400] I think, you know, the interesting thing is when you think about misinformation, right,

[01:24:33.400 --> 01:24:37.760] so specifically information that's false, but the person telling you doesn't know it's

[01:24:37.760 --> 01:24:42.160] false and they're not saying a falsehood to you to dupe you.

[01:24:42.160 --> 01:24:46.480] They really think they're actually trying to help you. So you have to kind of

[01:24:46.480 --> 01:24:51.720] consider whether we're more susceptible to that at times than to disinformation,

[01:24:51.720 --> 01:24:54.800] because the misinformation is likely to come from a source that's close to you and somebody

[01:24:54.800 --> 01:24:56.440] that you know.

[01:24:56.440 --> 01:25:01.760] And I think that's important to remember because it reminds us that even what we understand

[01:25:01.760 --> 01:25:07.880] to be truthful, this whole construct of truth is actually tribal.

[01:25:07.880 --> 01:25:12.440] And where we like to think that, hey, I'm a scientist, I'm like a really rational, logical

[01:25:12.440 --> 01:25:17.520] person, actually your beliefs aren't always about being rational.

[01:25:17.520 --> 01:25:22.160] Your beliefs are often about having a sense of belonging to a community.

[01:25:22.160 --> 01:25:29.400] So even when you turn on the TV or you crack open a book to get news, to get information,

[01:25:29.400 --> 01:25:33.780] you're often going to a source where you already have a deep seated understanding, a subconscious

[01:25:33.780 --> 01:25:37.920] understanding that the source you're going to is just going to reaffirm your worldview.

[01:25:37.920 --> 01:25:40.300] It's not going to shock you too much.

[01:25:40.300 --> 01:25:45.760] And that's kind of the lovely feeling, the comfort of echo chambers, but also the danger

[01:25:45.760 --> 01:25:46.760] of echo chambers.

[01:25:46.760 --> 01:25:51.440] And that's why, even though it's really uncomfortable and the world is already so uncomfortable,

[01:25:51.440 --> 01:25:55.680] it's so important to get out of your echo chamber and be exposed to different

[01:25:55.680 --> 01:25:56.680] sources of information.

[01:25:56.680 --> 01:25:57.680] All right.

[01:25:57.680 --> 01:25:59.440] Well, Seema, thank you so much for joining us.

[01:25:59.440 --> 01:26:03.040] Your book is What the Fact?!, available September 20th.

[01:26:03.040 --> 01:26:05.240] You can pre-order now.

[01:26:05.240 --> 01:26:09.780] So are you working on a new project now or are you all focused on your latest book?

[01:26:09.780 --> 01:26:15.140] I'm working on my first novel, but also on promoting What the Fact?!

[01:26:15.140 --> 01:26:18.400] We have lots of school visits lined up, which is really fun.

[01:26:18.400 --> 01:26:21.240] It's my first time writing a book for teenagers.

[01:26:21.240 --> 01:26:26.600] And so actually even recording the audio book was the most fun I've had recording an audio

[01:26:26.600 --> 01:26:32.160] book because we got to be a little silly and I got to do silly sound effects and stuff.

[01:26:32.160 --> 01:26:35.020] So yeah, the novel I'm working on is also a young adult novel.

[01:26:35.020 --> 01:26:37.080] So I think this might be my jam now.

[01:26:37.080 --> 01:26:40.080] Bored of the adults.

[01:26:40.080 --> 01:26:43.080] Kids are more fun, yeah.

[01:26:43.080 --> 01:26:44.080] Exactly.

[01:26:44.080 --> 01:26:45.080] Exactly.

[01:26:45.080 --> 01:26:46.080] Well, thanks again.

[01:26:46.080 --> 01:26:47.280] It was great talking with you.

[01:26:47.280 --> 01:26:48.280] Thanks all of you.

[01:26:48.280 --> 01:26:49.280] Take care.

[01:26:49.280 --> 01:26:50.280] Thanks, Seema.

Who's That Noisy? (1:26:50)



[01:26:50.280 --> 01:26:51.280] All right, Jay.

[01:26:51.280 --> 01:26:52.280] It's Who's That Noisy Time.

[01:26:52.280 --> 01:26:56.240] Okay, guys, last week I played this noisy.

[01:26:56.240 --> 01:26:57.240] Somebody here has to know that.

[01:26:57.240 --> 01:26:58.800] That's from a video game, isn't it?

[01:26:58.800 --> 01:26:59.800] All right.

[01:26:59.800 --> 01:27:02.280] Well, let me get into what the listeners thought.

[01:27:02.280 --> 01:27:06.400] So a listener named Chris Ford wrote in and said, is that the sound from the TV series

[01:27:06.400 --> 01:27:07.520] In Search Of?

[01:27:07.520 --> 01:27:08.520] The new one or the old one?

[01:27:08.520 --> 01:27:09.520] No, it sounds familiar.

[01:27:09.520 --> 01:27:11.640] I don't think it is, but it does sound a little familiar.

[01:27:11.640 --> 01:27:16.520] Well, it's an incorrect answer, but there is definitely something about that in the

[01:27:16.520 --> 01:27:19.960] original In Search Of theme, I bet you.

[01:27:19.960 --> 01:27:24.200] In the back of my head, I can kind of hear what I think Chris is hearing, but someone

[01:27:24.200 --> 01:27:26.160] will have to go figure that out.

[01:27:26.160 --> 01:27:31.960] Another listener named John Karabik wrote in and said, he said, LTL sometimes correspondent.

[01:27:31.960 --> 01:27:32.960] What does that mean?

[01:27:32.960 --> 01:27:33.960] LTL?

[01:27:33.960 --> 01:27:34.960] LTL.

[01:27:34.960 --> 01:27:35.960] Long term, long time listener.

[01:27:35.960 --> 01:27:36.960] Long term, long time listener.

[01:27:36.960 --> 01:27:37.960] Sometimes correspondent.

[01:27:37.960 --> 01:27:38.960] There you go.

[01:27:38.960 --> 01:27:39.960] Thank you, Bob.

[01:27:39.960 --> 01:27:44.560] That sounded like the handheld Mattel Coleco football game wind tone to me.

[01:27:44.560 --> 01:27:49.960] Been 40-ish years, and if it's not that, it's a similar late 70s, early 80s handheld.

[01:27:49.960 --> 01:27:55.560] Okay, so do you guys know what the Mattel Coleco football game sound is?

[01:27:55.560 --> 01:27:56.560] I have heard of it.

[01:27:56.560 --> 01:27:57.560] Jay, I was going to say that.

[01:27:57.560 --> 01:27:59.760] I remember that, but I didn't think it was that.

[01:27:59.760 --> 01:28:04.240] All right, well, let me play that for you.

[01:28:04.240 --> 01:28:06.320] Here's that sound that he was referring to.

[01:28:06.320 --> 01:28:08.320] Do you hear that?

[01:28:08.320 --> 01:28:09.320] Yeah.

[01:28:09.320 --> 01:28:10.320] Yeah, that's it.

[01:28:10.320 --> 01:28:11.320] Here it is again.

[01:28:11.320 --> 01:28:12.320] Okay.

[01:28:12.320 --> 01:28:18.400] Now, did you know that that sound was played in a very popular song?

[01:28:18.400 --> 01:28:20.880] I think we're going back to the 80s, okay?

[01:28:20.880 --> 01:28:23.360] This is the... Well, I'm not going to tell you who it is.

[01:28:23.360 --> 01:28:27.840] You guys should tell me what the song is and who wrote it, but here's the song that has

[01:28:27.840 --> 01:28:35.000] that sound in it.

[01:28:35.000 --> 01:28:36.000] Supertramp?

[01:28:36.000 --> 01:28:37.000] It's Supertramp.

[01:28:37.000 --> 01:28:38.000] Oh, yeah.

[01:28:38.000 --> 01:28:39.000] It does have it.

[01:28:39.000 --> 01:28:40.000] Did you hear that?

[01:28:40.000 --> 01:28:41.000] I did hear it.

[01:28:41.000 --> 01:28:42.000] That's The Logical Song.

[01:28:42.000 --> 01:28:43.000] That's correct, Evan.

[01:28:43.000 --> 01:28:44.000] Good job, man.

[01:28:44.000 --> 01:28:48.520] I remember hearing that song and hearing that sound and realizing what it was like the first

[01:28:48.520 --> 01:28:51.080] time I heard that song when I was a child.

[01:28:51.080 --> 01:28:52.080] Like an Easter egg.

[01:28:52.080 --> 01:28:53.080] That's cool.

[01:28:53.080 --> 01:28:54.080] Totally, yeah.

[01:28:54.080 --> 01:28:56.880] I just think it was very cleverly used in the song as well.

[01:28:56.880 --> 01:29:00.680] It actually fits even though it's like clearly from a video game.

[01:29:00.680 --> 01:29:02.080] I just find that fascinating.

[01:29:02.080 --> 01:29:07.660] So no, it is not the handheld Mattel ColecoVision football or Coleco football.

[01:29:07.660 --> 01:29:09.760] So many of us own those back in the day.

[01:29:09.760 --> 01:29:10.760] Oh, yeah.

[01:29:10.760 --> 01:29:11.760] I know I had one.

[01:29:11.760 --> 01:29:12.760] I still have one.

[01:29:12.760 --> 01:29:17.680] So I have a... And the bottom line is almost everybody that listens to Who's That Noisy

[01:29:17.680 --> 01:29:18.960] knew what this was.

[01:29:18.960 --> 01:29:19.960] What?

[01:29:19.960 --> 01:29:20.960] Except for all of you.

[01:29:20.960 --> 01:29:21.960] Yeah, except for us.

[01:29:21.960 --> 01:29:22.960] Except for you guys.

[01:29:22.960 --> 01:29:26.440] So Evan, you were the closest because you said, is it a video game?

[01:29:26.440 --> 01:29:28.880] And the answer is, yes, it's from a video game.

[01:29:28.880 --> 01:29:31.160] I will play it for you again.

[01:29:31.160 --> 01:29:35.840] Oh, that's Zelda, isn't it Zelda?

[01:29:35.840 --> 01:29:36.840] That is Zelda.

[01:29:36.840 --> 01:29:37.840] Correct.

[01:29:37.840 --> 01:29:38.840] This is from The Legend of Zelda.

[01:29:38.840 --> 01:29:44.200] This is the sound that you hear when you solve a puzzle or pick up the sword or something.

[01:29:44.200 --> 01:29:50.080] Something like that, yeah, but it's definitely when you solve a puzzle

[01:29:50.080 --> 01:29:56.000] of some sort and you know, it's one of those sounds that has been in the game I think from

[01:29:56.000 --> 01:29:57.000] the beginning.

[01:29:57.000 --> 01:29:59.440] I'm pretty sure it was from the beginning.

[01:29:59.440 --> 01:30:01.640] Bottom line is I love that sound.

[01:30:01.640 --> 01:30:09.240] I love the sound design of a lot of old school games and Zelda is one of them because the

[01:30:09.240 --> 01:30:13.800] soundscape that they've come up with for that series of games is fantastic.

[01:30:13.800 --> 01:30:20.500] I mean, most of the Zelda games have a wonderful soundtrack with a lot of these retro sounds

[01:30:20.500 --> 01:30:22.480] that they've been using for decades.

[01:30:22.480 --> 01:30:26.960] If you're interested in playing a really fun game, it's a younger person's game, but it's

[01:30:26.960 --> 01:30:29.080] still fantastic as an adult to play.

[01:30:29.080 --> 01:30:30.220] They're all great.

[01:30:30.220 --> 01:30:33.680] I have a couple of favorites, but there's a ton of them and they're all different.

[01:30:33.680 --> 01:30:35.360] Anyway, that was this week's puzzle.

[01:30:35.360 --> 01:30:38.240] So that was originally sent in by me.

[01:30:38.240 --> 01:30:43.320] If you remember, I came up with this one because when I don't find a noisy that I like, I will

[01:30:43.320 --> 01:30:47.360] defer to probably a video game sound because there's an infinite number of those and I'm

[01:30:47.360 --> 01:30:49.460] fascinated by all of them.

New Noisy (1:30:50)

[01:30:49.460 --> 01:30:54.940] I do have a new noisy this week. This noisy was sent in by a listener named Joshua Twilley,

[01:30:54.940 --> 01:31:08.840] and here is that sound.

[male voice, perhaps shushing in an Asian language]

[01:31:08.840 --> 01:31:12.120] That goes on for about 48 seconds.

[01:31:12.120 --> 01:31:16.160] I would like to know very specifically, what is that that we're hearing?

[01:31:16.160 --> 01:31:20.600] Give me any details that you can come up with and you can send that information to me at

[01:31:20.600 --> 01:31:26.000] WTN@theskepticsguide.org, and don't forget to send me any cool noises that you heard

[01:31:26.000 --> 01:31:27.000] this week.

More Announcements (1:31:27)

[01:31:27.000 --> 01:31:29.280] Steve, yeah, there's events.

[01:31:29.280 --> 01:31:30.280] You said one already.

[01:31:30.280 --> 01:31:34.560] On September 24th, we have a six-hour live stream.

[01:31:34.560 --> 01:31:36.320] We're going to be talking about our new book.

[01:31:36.320 --> 01:31:39.960] We're going to be doing a couple of SGU episodes.

[01:31:39.960 --> 01:31:44.520] These will all be live and then we will definitely have other things going on.

[01:31:44.520 --> 01:31:48.320] Very likely that we'll have some guests and other fun things happening.

[01:31:48.320 --> 01:31:51.240] So we would really love it if you can join us at that live stream.

[01:31:51.240 --> 01:31:56.600] You can get details about this event on our events page at theskepticsguide.org.

[01:31:56.600 --> 01:31:58.560] So please do join us for that.

[01:31:58.560 --> 01:32:02.280] Even though NECSS is over, Steve, you can still buy tickets for NECSS because it's all

[01:32:02.280 --> 01:32:03.280] recorded.

[01:32:03.280 --> 01:32:04.760] It's all still amazingly relevant.

[01:32:04.760 --> 01:32:08.060] We had wonderful speakers and we had a fantastic keynote.

[01:32:08.060 --> 01:32:11.100] We had Bill Nye speaking to David Copperfield.

[01:32:11.100 --> 01:32:17.640] So if you're interested, you can also go to NECSS.org and get tickets for that.

[01:32:17.640 --> 01:32:23.740] Now there are events, Steve, that we have planned in Arizona in December of this year.

[01:32:23.740 --> 01:32:28.880] If you would like to see us put on one of our two different types of shows that we do,

[01:32:28.880 --> 01:32:30.760] then you should go to our events page.

[01:32:30.760 --> 01:32:33.520] It's theskepticsguide.org forward slash events.

[01:32:33.520 --> 01:32:38.740] We are putting on a live SGU podcast recording where you're in the audience listening to

[01:32:38.740 --> 01:32:40.860] us record the show.

[01:32:40.860 --> 01:32:46.600] And then we have added an extra hour of time to spend with the audience to have fun, hang

[01:32:46.600 --> 01:32:51.000] out, talk, and do a few different things that we've been coming up with that we don't want

[01:32:51.000 --> 01:32:52.000] to reveal.

[01:32:52.000 --> 01:32:53.000] But it's going to be a lot of fun.

[01:32:53.000 --> 01:32:57.120] We're calling it the Private Show Plus because of all the extra time that we'll have to hang

[01:32:57.120 --> 01:32:58.160] out and talk.

[01:32:58.160 --> 01:33:00.080] So please do join us for one of those shows.

[01:33:00.080 --> 01:33:04.080] We'll be doing one of those each, one in Phoenix and one in Tucson.

[01:33:04.080 --> 01:33:08.760] And then we will also be doing A Skeptical Extravaganza of Special Significance.

[01:33:08.760 --> 01:33:11.720] We'll be doing one of those in Phoenix and one of those in Tucson as well.

[01:33:11.720 --> 01:33:14.240] So whichever city you're closer to, please buy tickets for those.

[01:33:14.240 --> 01:33:18.880] You can get all of this information on our website, theskepticsguide.org forward slash

[01:33:18.880 --> 01:33:19.880] events.

[01:33:19.880 --> 01:33:20.880] All right.

[01:33:20.880 --> 01:33:21.880] Thanks, Jay.

Science or Fiction (1:33:24)

Theme: Common Myths

Item #1: The notion that humans have as many hair follicles as chimpanzees, on average, is false, with our closest cousins having 2-3 times as many as humans.[5]
Item #2: Bagpipes do not have their origin in Scotland, but are rather an ancient instrument. In fact, Nero was more likely to have played the bagpipes than the fiddle while Rome burned (although that is also a myth).[6]
Item #3: Contrary to common lore, the QWERTY keyboard layout was not created to limit jamming but rather was designed for convenience.[7]

Answer Item
Fiction: Chimps: 2-3x more follicles
Science: Bagpipes not Scottish in origin
Science: Qwerty made for convenience
Host Result
Steve: win
Rogue Guess
Evan: Qwerty made for convenience
Bob: Chimps: 2-3x more follicles
Cara: Chimps: 2-3x more follicles
Jay: Chimps: 2-3x more follicles

Voice-over: It's time for Science or Fiction.

Evan's Response

Bob's Response

Cara's Response

Jay's Response

Steve Explains Item #2

Steve Explains Item #1

Steve Explains Item #3

[01:33:21.880 --> 01:33:26.780] Well, guys, let's go on with science or fiction.

[01:33:26.780 --> 01:33:35.880] It's time for science or fiction.

[01:33:35.880 --> 01:33:40.400] Each week, I come up with three science news items or facts, two real and one fake.

[01:33:40.400 --> 01:33:45.160] Then I challenge my panel of skeptics to tell me which one is the fake.

[01:33:45.160 --> 01:33:46.880] There's a theme this week.

[01:33:46.880 --> 01:33:50.200] The theme is common myths.

[01:33:50.200 --> 01:33:53.200] Again, don't be confused.

[01:33:53.200 --> 01:33:57.080] As usual, these statements are just either true or false as stated.

[01:33:57.080 --> 01:34:00.080] I'm going to get so screwed on this one.

[01:34:00.080 --> 01:34:01.080] Yeah, that's right.

[01:34:01.080 --> 01:34:03.480] We get all twisted pretzel-like with this one.

[01:34:03.480 --> 01:34:05.280] Just even forget the theme.

[01:34:05.280 --> 01:34:07.600] Just tell me if these statements are true or false.

[01:34:07.600 --> 01:34:10.640] Now you said forget it, now I'm going to remember it.

[01:34:10.640 --> 01:34:13.240] Don't push the button, Evan.

[01:34:13.240 --> 01:34:14.240] Here we go.

[01:34:14.240 --> 01:34:21.120] Item number one, the notion that humans have as many hair follicles as chimpanzees on average

[01:34:21.120 --> 01:34:28.480] is false, with our closest cousins having two to three times as many as humans.

[01:34:28.480 --> 01:34:34.880] Number two, bagpipes do not have their origin in Scotland, but are rather an ancient instrument.

[01:34:34.880 --> 01:34:40.000] In fact, Nero was more likely to play the bagpipes than the fiddle while Rome burned,

[01:34:40.000 --> 01:34:42.560] although that is also a myth.

[01:34:42.560 --> 01:34:48.080] And item number three, contrary to common lore, the QWERTY keyboard layout was not created

[01:34:48.080 --> 01:34:52.200] to limit jamming, but rather was designed for convenience.

[01:34:52.200 --> 01:34:55.040] Evan, go first.

[01:34:55.040 --> 01:35:00.800] Two of these three are kind of a little familiar to me, and one of them I've never heard before.

[01:35:00.800 --> 01:35:05.080] The one I've never heard before is the notion that humans have as many hair follicles as

[01:35:05.080 --> 01:35:06.080] chimpanzees.

[01:35:06.080 --> 01:35:08.600] I've never, sorry, never heard that before.

[01:35:08.600 --> 01:35:14.640] I don't know if that helps me in this particular game or not.

[01:35:14.640 --> 01:35:27.120] I suppose it would make sense that the chimpanzees would have more hair follicles.

[01:35:27.120 --> 01:35:34.040] So I'm guessing that that one is, as you stated, Steve, is going to be science, our

[01:35:34.040 --> 01:35:38.600] closest cousins having two to three times as many as humans.

[01:35:38.600 --> 01:35:41.200] I don't see a problem with that.

[01:35:41.200 --> 01:35:45.280] The bagpipes, oh boy.

[01:35:45.280 --> 01:35:54.240] So in a way it kind of makes sense that they would be older than Scotland.

[01:35:54.240 --> 01:35:59.840] Scotland's obviously famous for its bagpipes, but the fact that, were they the first culture

[01:35:59.840 --> 01:36:04.240] or society or group of people to, or did they invent the bagpipe?

[01:36:04.240 --> 01:36:07.080] Yeah, I'm kind of thinking no.

[01:36:07.080 --> 01:36:11.120] There were probably other things, maybe not as we understand the bagpipes today, some

[01:36:11.120 --> 01:36:14.120] other versions of it that took place before that.

[01:36:14.120 --> 01:36:18.420] I don't know about the Nero part of this, but that one seems all right to me, which

[01:36:18.420 --> 01:36:22.140] kind of leaves me with the QWERTY keyboard.

[01:36:22.140 --> 01:36:29.760] The QWERTY keyboard layout, I think, happened in the late 1800s.

[01:36:29.760 --> 01:36:35.900] I don't know that it was designed for convenience specifically, but I also can't remember why

[01:36:35.900 --> 01:36:38.040] it was laid out the way it was.

[01:36:38.040 --> 01:36:44.920] It was not the first design, I do know that, but it got adopted and, I don't know, to limit

[01:36:44.920 --> 01:36:45.920] jamming.

[01:36:45.920 --> 01:36:51.800] I don't think that one's right as far as being designed for convenience.

[01:36:51.800 --> 01:36:54.320] So I'm going to say the QWERTY keyboard one is the fiction.

[01:36:54.320 --> 01:36:55.320] Okay, Bob.

[01:36:55.320 --> 01:37:03.120] I mean, I've known some very hirsute people, and, you know, I can't imagine anything

[01:37:03.120 --> 01:37:06.800] having two to three times as much hair as Richard Howells.

[01:37:06.800 --> 01:37:07.800] Wow.

[01:37:07.800 --> 01:37:11.000] You actually called the person out by name.

[01:37:11.000 --> 01:37:12.000] Oh, yeah.

[01:37:12.000 --> 01:37:13.000] Hell yeah, you did.

[01:37:13.000 --> 01:37:14.000] Okay.

[01:37:14.000 --> 01:37:15.000] Yeah.

[01:37:15.000 --> 01:37:16.400] My sister's, like one of my sister's boyfriends.

[01:37:16.400 --> 01:37:18.960] He's also the hairiest person I've ever personally met.

[01:37:18.960 --> 01:37:19.960] Right.

[01:37:19.960 --> 01:37:22.080] I'm sure he owns that.

[01:37:22.080 --> 01:37:23.080] He knows that.

[01:37:23.080 --> 01:37:24.080] Yeah.

[01:37:24.080 --> 01:37:25.080] He's a good guy.

[01:37:25.080 --> 01:37:26.080] It's not new to him.

[01:37:26.080 --> 01:37:27.880] My sister's boyfriend, like many decades ago.

[01:37:27.880 --> 01:37:29.880] Not more hair than Harry McCallison.

[01:37:29.880 --> 01:37:30.880] Good guy.

[01:37:30.880 --> 01:37:31.880] Hey, Rich, how's it going, bro?

[01:37:31.880 --> 01:37:36.560] Yeah, I've just read that and believed it for so long that I'm going to choose to continue

[01:37:36.560 --> 01:37:37.560] believing it.

[01:37:37.560 --> 01:37:38.560] Bagpipes.

[01:37:38.560 --> 01:37:41.600] What the hell do I know about the origins of bagpipes?

[01:37:41.600 --> 01:37:42.600] Whatever.

[01:37:42.600 --> 01:37:44.400] I'll say that's science.

[01:37:44.400 --> 01:37:49.280] And all right, so what did I say about one?

[01:37:49.280 --> 01:37:50.280] I don't know.

[01:37:50.280 --> 01:37:51.280] I'm not really sure.

[01:37:51.280 --> 01:37:52.280] He's nebulous.

[01:37:52.280 --> 01:37:54.000] Basically, he called it fiction.

[01:37:54.000 --> 01:37:56.720] I think this makes sense to me.

[01:37:56.720 --> 01:37:58.760] This makes sense to me.

[01:37:58.760 --> 01:37:59.920] I'll say no more.

[01:37:59.920 --> 01:38:02.520] You do have to say which one you think is the fiction, though, unequivocally.

[01:38:02.520 --> 01:38:03.520] Please declare it.

[01:38:03.520 --> 01:38:04.520] No, I don't want to.

[01:38:04.520 --> 01:38:05.520] No, I already did.

[01:38:05.520 --> 01:38:06.520] Chimpanzees.

[01:38:06.520 --> 01:38:07.520] Chimpanzees.

[01:38:07.520 --> 01:38:08.520] Fiction.

[01:38:08.520 --> 01:38:09.520] Okay.

[01:38:09.520 --> 01:38:10.520] Kara.

[01:38:10.520 --> 01:38:11.520] Okay.

[01:38:11.520 --> 01:38:14.560] I have no idea what just happened, so I'm just going to try and go through this with

[01:38:14.560 --> 01:38:15.560] you guys.

[01:38:15.560 --> 01:38:16.560] Okay.

[01:38:16.560 --> 01:38:21.160] Evan says QWERTY is the fiction, Bob says chimps is the fiction, and I have to make

[01:38:21.160 --> 01:38:24.800] sure that I'm not double-negative-ing, negative.

[01:38:24.800 --> 01:38:32.360] So basically, humans have as many hair follicles as chimps.

[01:38:32.360 --> 01:38:39.600] The notion that it's false is the item, which means that it would be true that they do would

[01:38:39.600 --> 01:38:41.680] be if that one was the fiction.

[01:38:41.680 --> 01:38:42.680] I have no idea what you just said.

[01:38:42.680 --> 01:38:45.960] That's a heck of a way of saying that, Kara.

[01:38:45.960 --> 01:38:46.960] You're making it more complicated.

[01:38:46.960 --> 01:38:47.960] This just says—

[01:38:47.960 --> 01:38:48.960] No, I'm not.

[01:38:48.960 --> 01:38:49.960] You made it complicated.

[01:38:49.960 --> 01:38:50.960] So the notion—

[01:38:50.960 --> 01:38:55.920] Chimpanzees have two to three times as many hair follicles as humans.

[01:38:55.920 --> 01:38:56.920] That's the item.

[01:38:56.920 --> 01:38:59.080] No, the item says that that's—oh, okay.

[01:38:59.080 --> 01:39:01.240] The fact that they have the same is false.

[01:39:01.240 --> 01:39:02.240] I see.

[01:39:02.240 --> 01:39:03.240] I think that that's false.

[01:39:03.240 --> 01:39:06.560] I think they have the same number because somebody with hirsutism, I don't think they

[01:39:06.560 --> 01:39:08.360] have more follicles.

[01:39:08.360 --> 01:39:11.120] I think they just grow hair out of those follicles.

[01:39:11.120 --> 01:39:14.000] I think we are apes.

[01:39:14.000 --> 01:39:18.280] The fact that we have—and some people are hairier than others—there's a follicle there

[01:39:18.280 --> 01:39:22.920] even if there's no hair growing out of it, so I think that that one's the fiction, plain

[01:39:22.920 --> 01:39:23.920] and simple.

[01:39:23.920 --> 01:39:24.920] Okay, and Jay.

[01:39:24.920 --> 01:39:29.160] Yeah, I mean, I think we have probably relatively the same number of hair follicles as

[01:39:29.160 --> 01:39:31.960] chimps, so I think that one is wrong.

[01:39:31.960 --> 01:39:36.000] Okay, so you all agree on number two, so we'll start there.

[01:39:36.000 --> 01:39:40.920] Bagpipes do not have their origin in Scotland but are rather an ancient instrument.

[01:39:40.920 --> 01:39:45.440] In fact, Nero was more likely to play the bagpipes than the fiddle while Rome burned,

[01:39:45.440 --> 01:39:47.600] although that is also a myth.

[01:39:47.600 --> 01:39:52.800] You guys all think this one is science, correct as stated, and this one is science.

[01:39:52.800 --> 01:39:53.800] Yeah!

[01:39:53.800 --> 01:39:54.800] Good, good, good.

[01:39:54.800 --> 01:39:57.240] Yeah, bagpipes are thousands of years old.

[01:39:57.240 --> 01:39:58.240] Wow.

[01:39:58.240 --> 01:39:59.320] They were in ancient Rome.

[01:39:59.320 --> 01:40:03.160] Nero actually had a coin minted with a picture of him playing the bagpipes.

[01:40:03.160 --> 01:40:04.160] Oh, jeez.

[01:40:04.160 --> 01:40:05.160] At first, nothing.

[01:40:05.160 --> 01:40:09.640] But is there any direct evidence?

[01:40:09.640 --> 01:40:16.800] Before Roman culture, in the Middle East and in Asia, there were bagpipe-like instruments,

[01:40:16.800 --> 01:40:21.880] or maybe more pan-flute-type instruments.

[01:40:21.880 --> 01:40:26.200] Rome was probably the first to bring it all together, where you had the bag

[01:40:26.200 --> 01:40:28.720] that you blow into and pipes that made the noise.

[01:40:28.720 --> 01:40:33.000] They called their version of that the tibia utricularis.

[01:40:33.000 --> 01:40:38.000] The tibia utricularis, that's the Roman bagpipe, but it's basically a bagpipe.

[01:40:38.000 --> 01:40:39.800] I mean, it's the same instrument.

[01:40:39.800 --> 01:40:43.360] Obviously, there's always going to be incremental changes over time, exactly how it's made,

[01:40:43.360 --> 01:40:44.920] whatever, but it was bagpipes.

[01:40:44.920 --> 01:40:47.940] Yeah, ancient instrument.

[01:40:47.940 --> 01:40:50.600] Not only not Scottish, but like thousands of years old.

[01:40:50.600 --> 01:40:51.600] Cool.

[01:40:51.600 --> 01:40:53.120] All right, let's go back to number one.

[01:40:53.120 --> 01:40:57.300] The notion that humans have as many hair follicles as chimpanzees on average is false, with our

[01:40:57.300 --> 01:41:01.280] closest cousins having two to three times as many as humans.

[01:41:01.280 --> 01:41:06.240] Jay, Kara, and Bob all think this one is the fiction, Evan, you think this one is science,

[01:41:06.240 --> 01:41:09.480] but this one is the fiction.

[01:41:09.480 --> 01:41:10.480] Yeah, baby.

[01:41:10.480 --> 01:41:13.480] Because, yes, humans and chimps have the same number of hair follicles.

[01:41:13.480 --> 01:41:15.960] That's what I get for not being misinformed, I tell you.

[01:41:15.960 --> 01:41:21.800] Humans and chimps do have the same number of hair follicles, although less than monkeys.

[01:41:21.800 --> 01:41:22.800] Fewer.

[01:41:22.800 --> 01:41:23.800] Fewer than monkeys.

[01:41:23.800 --> 01:41:24.800] Not surprised.

[01:41:24.800 --> 01:41:25.800] Yeah.

[01:41:25.800 --> 01:41:35.000] So apes just overall have fewer hair follicles for their size than do our monkey cousins, and

[01:41:35.000 --> 01:41:37.200] humans and chimps are the same.

[01:41:37.200 --> 01:41:41.200] The difference is, as you say, it's the type of hair, not the number of follicles.

[01:41:41.200 --> 01:41:42.760] We have the very thin.

[01:41:42.760 --> 01:41:44.560] It's got a name, too.

[01:41:44.560 --> 01:41:45.560] So human.

[01:41:45.560 --> 01:41:46.560] Vellus.

[01:41:46.560 --> 01:41:47.560] Vellus.

[01:41:47.560 --> 01:41:48.560] It's vellus.

[01:41:48.560 --> 01:41:49.560] Peach fuzz.

[01:41:49.560 --> 01:41:50.560] Yeah, it's vellus hair.

[01:41:50.560 --> 01:41:51.560] Yeah.

[01:41:51.560 --> 01:41:53.800] Of course, on our heads and our pubes and under our arms we have the

[01:41:53.800 --> 01:41:57.920] denser, thicker, longer hair, but we have the vellus hair everywhere else.

[01:41:57.920 --> 01:42:01.440] Well, and some people have it all over their chest and their back.

[01:42:01.440 --> 01:42:02.440] Yeah.

[01:42:02.440 --> 01:42:06.880] So it's highly variable, and it's an interesting evolutionary

[01:42:06.880 --> 01:42:12.280] question as to why humans are different from all other primates, specifically apes, in

[01:42:12.280 --> 01:42:13.280] this regard.

[01:42:13.280 --> 01:42:14.280] What was it for?

[01:42:14.280 --> 01:42:18.360] Was it because we didn't need it because we started wearing clothes?

[01:42:18.360 --> 01:42:21.840] Was it because of parasites?

[01:42:21.840 --> 01:42:22.840] Was it because of sweating?

[01:42:22.840 --> 01:42:23.840] Temperature regulation?

[01:42:23.840 --> 01:42:27.520] Got tired of picking the lice off of each other.

[01:42:27.520 --> 01:42:30.800] Because we also have different, we also have more sweat glands.

[01:42:30.800 --> 01:42:34.640] So yeah, what's primary and what's secondary is hard to say.

[01:42:34.640 --> 01:42:35.640] What was driving what.

[01:42:35.640 --> 01:42:36.640] All right.

[01:42:36.640 --> 01:42:41.160] All this means that contrary to common lore, the QWERTY keyboard layout was not created

[01:42:41.160 --> 01:42:45.320] to limit jamming, but rather was designed for convenience is science.

[01:42:45.320 --> 01:42:48.600] Steve, let me say what my brain dredged up.

[01:42:48.600 --> 01:42:53.640] I believe this because from what I remember from a long time ago, the convenience was

[01:42:53.640 --> 01:42:58.560] that when the people were selling the typewriters, they could spell the word typewriter on the

[01:42:58.560 --> 01:42:59.560] top row.

[01:42:59.560 --> 01:43:00.560] Didn't have to think about it.

[01:43:00.560 --> 01:43:01.560] That's total bullshit.

[01:43:01.560 --> 01:43:02.560] Yeah.

[01:43:02.560 --> 01:43:03.560] That sounds...

[01:43:03.560 --> 01:43:04.560] That's bullshit.

[01:43:04.560 --> 01:43:07.440] Well, it still helped me.

[01:43:07.440 --> 01:43:08.440] So I...

[01:43:08.440 --> 01:43:09.440] You can, though.

[01:43:09.440 --> 01:43:10.440] That is...

[01:43:10.440 --> 01:43:11.440] So what did you learn?

[01:43:11.440 --> 01:43:12.960] There's no historical evidence for that.

[01:43:12.960 --> 01:43:14.960] It doesn't really even make that much sense.

[01:43:14.960 --> 01:43:15.960] Sure it does.

[01:43:15.960 --> 01:43:18.440] They didn't have to know how to type, right?

[01:43:18.440 --> 01:43:21.160] They could just look at all the letters on the top row and not have to worry about any

[01:43:21.160 --> 01:43:22.160] of the other rows.

[01:43:22.160 --> 01:43:23.160] That was just made up.

[01:43:23.160 --> 01:43:24.160] That's silly.

[01:43:24.160 --> 01:43:25.160] There's no history for that.

[01:43:25.160 --> 01:43:28.680] So I did leave out a little detail, though.

[01:43:28.680 --> 01:43:35.000] The QWERTY keyboard was designed for ease and convenience of use, for efficiency of use.

[01:43:35.000 --> 01:43:38.400] But the question is, for who?

[01:43:38.400 --> 01:43:39.400] Not efficiency.

[01:43:39.400 --> 01:43:40.400] Yeah, for who?

[01:43:40.400 --> 01:43:41.400] Yeah, efficiency.

[01:43:41.400 --> 01:43:42.400] But for who?

[01:43:42.400 --> 01:43:44.080] Dvorak is much more efficient.

[01:43:44.080 --> 01:43:47.200] Is it for people who type with one finger, Steve?

[01:43:47.200 --> 01:43:48.800] Oh, what in the heck?

[01:43:48.800 --> 01:43:50.520] What preceded the typewriter?

[01:43:50.520 --> 01:43:52.240] It's what were they using it for?

[01:43:52.240 --> 01:43:56.880] What were the early adopters using a typewriter for?

[01:43:56.880 --> 01:43:59.400] What did people type before there were typewriters?

[01:43:59.400 --> 01:44:01.920] They didn't.

[01:44:01.920 --> 01:44:04.600] But there was another layout to the keyboard, actually.

[01:44:04.600 --> 01:44:05.600] There were other layouts.

[01:44:05.600 --> 01:44:06.600] It was like...

[01:44:06.600 --> 01:44:07.600] What did they use it for?

[01:44:07.600 --> 01:44:08.600] What do you call that?

[01:44:08.600 --> 01:44:09.600] Morse code?

[01:44:09.600 --> 01:44:10.600] Morse code.

[01:44:10.600 --> 01:44:11.600] Does it have something to do with that?

[01:44:11.600 --> 01:44:12.600] Telegraph.

[01:44:12.600 --> 01:44:13.600] Telegraphs.

[01:44:13.600 --> 01:44:15.400] That's what it was for.

[01:44:15.400 --> 01:44:21.240] The QWERTY keyboard was optimally laid out for telegraph operators, right?

[01:44:21.240 --> 01:44:22.240] What?

[01:44:22.240 --> 01:44:23.240] How?

[01:44:23.240 --> 01:44:27.000] It was designed for use with telegraphs because they had their own little shorthand for Morse

[01:44:27.000 --> 01:44:28.000] code.

[01:44:29.000 --> 01:44:30.000] Wow.

[01:44:30.000 --> 01:44:31.000] Interesting.

[01:44:31.000 --> 01:44:32.000] Yeah.

[01:44:32.000 --> 01:44:38.920] It was optimized for them, and it makes sense for the limited set of letters that they used

[01:44:38.920 --> 01:44:41.560] for their code.

[01:44:41.560 --> 01:44:42.560] They were the early adopters.

[01:44:42.560 --> 01:44:43.560] Right?

[01:44:43.560 --> 01:44:44.560] Isn't that interesting?

[01:44:44.560 --> 01:44:45.560] Yeah.

[01:44:45.560 --> 01:44:46.560] It's cool.

[01:44:46.560 --> 01:44:50.760] But then once it became solidified, then it was inertia at that point.

[01:44:50.760 --> 01:44:52.400] Nobody wanted to change it.

[01:44:52.400 --> 01:44:53.400] Yeah.

[01:44:53.400 --> 01:44:54.400] It's done, man.

[01:44:54.400 --> 01:44:55.400] I'm not going to relearn it.

[01:44:55.400 --> 01:45:03.680] There were interesting precursors to the QWERTY layout that were kind

[01:45:03.680 --> 01:45:10.360] of all over the place, but they pretty quickly settled on the QWERTY keyboard with a couple

[01:45:10.360 --> 01:45:11.360] of tweaks.

[01:45:11.360 --> 01:45:13.960] At first, the period was in the middle of the upper row.

[01:45:13.960 --> 01:45:16.660] There were some weird changes.

[01:45:16.660 --> 01:45:24.360] But they did swap around letters, too, like they swapped the C and the X to make

[01:45:24.360 --> 01:45:28.840] the C an easier letter to use, since the X didn't have to be as easy to use.

[01:45:28.840 --> 01:45:29.840] Yeah.

[01:45:29.840 --> 01:45:34.240] But a QWERTY keyboard, from what I've read, someone typing on a QWERTY

[01:45:34.240 --> 01:45:40.880] keyboard will never beat the best Dvorak typist in terms of just pure, raw speed.

[01:45:40.880 --> 01:45:41.880] Yeah.

[01:45:41.880 --> 01:45:46.320] If you were designing it de novo, like without any historical contingency, you would use

[01:45:46.320 --> 01:45:51.080] something more like a Dvorak keyboard, where all your power fingers

[01:45:51.080 --> 01:45:57.360] are going to be used the most often, and you're not likely to use the same hand over and over

[01:45:57.360 --> 01:46:02.240] and over again, instead going back and forth between your two hands.

[01:46:02.240 --> 01:46:11.560] So the QWERTY is not optimal for typing words, but it's reasonable, just not optimal.

[01:46:11.560 --> 01:46:12.560] Yeah, it works.

[01:46:12.560 --> 01:46:13.560] It works.

[01:46:13.560 --> 01:46:17.160] It's good enough that it's not worth relearning how to type.

[01:46:17.160 --> 01:46:21.320] We need that voice recognition software to start working.

[01:46:21.320 --> 01:46:22.320] Where's that?

[01:46:22.320 --> 01:46:23.320] It works.

[01:46:23.320 --> 01:46:25.520] Google's voice recognition works pretty damn good.

[01:46:25.520 --> 01:46:26.520] It's pretty good.

[01:46:26.520 --> 01:46:28.320] Then why are we still typing on QWERTYs?

[01:46:28.320 --> 01:46:29.320] Case closed.

[01:46:29.320 --> 01:46:31.600] I like dictating on my phone whenever I can.

[01:46:31.600 --> 01:46:32.600] All right.

[01:46:32.600 --> 01:46:33.600] Good job, everyone.

Skeptical Quote of the Week (1:46:35)

If you have an effect that nobody can replicate, then your phenomenon fades away. So if you want to have a legacy, then you jolly well better have an effect that replicates.
Susan Fiske, American social psychologist


[01:46:33.600 --> 01:46:38.320] Evan, give us a quote.

[01:46:38.320 --> 01:46:43.280] If you have an effect that nobody can replicate, then your phenomenon fades away.

[01:46:43.280 --> 01:46:49.760] So if you want to have a legacy, then you jolly well better have an effect that replicates.

[01:46:49.760 --> 01:46:54.520] Susan Fiske, professor of psychology and public affairs from the Department of Psychology,

[01:46:54.520 --> 01:46:56.000] Princeton University.

[01:46:56.000 --> 01:46:58.360] Yes, replication is king.

[01:46:58.360 --> 01:46:59.360] Absolutely.

[01:46:59.360 --> 01:47:00.360] Absolutely.

[01:47:00.360 --> 01:47:01.360] So important.

[01:47:01.360 --> 01:47:02.360] Phenomenon.

[01:47:02.360 --> 01:47:03.360] Phenomenon.

[01:47:03.360 --> 01:47:08.200] Yeah, because if it doesn't replicate, it's probably not real.

[01:47:08.200 --> 01:47:09.200] Right.

[01:47:09.200 --> 01:47:10.200] It's error.

[01:47:10.200 --> 01:47:11.200] Error.

[01:47:11.200 --> 01:47:12.200] Error.

[01:47:12.200 --> 01:47:15.760] So we need to know whether or not it's error, I guess, is a better way to put that.

[01:47:15.760 --> 01:47:19.480] I don't know why pseudoscientists can't grasp that concept.

[01:47:19.480 --> 01:47:20.480] They don't want to.

[01:47:20.480 --> 01:47:21.480] Because they don't want to.

[01:47:21.480 --> 01:47:22.480] Because they don't want to.

[01:47:22.480 --> 01:47:23.480] Yeah.

[01:47:23.480 --> 01:47:25.800] They're working backwards from the desired outcome.

Signoff/Announcements

S: —and until next week, this is your Skeptics' Guide to the Universe.

S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.


Today I Learned

  • Fact/Description, possibly with an article reference[8]
  • Fact/Description
  • Fact/Description

Notes

References

Vocabulary

