SGU Episode 896
September 10th 2022
depiction of Chicxulub meteor

SGU 895                      SGU 897

Skeptical Rogues
S: Steven Novella

B: Bob Novella

C: Cara Santa Maria

J: Jay Novella

E: Evan Bernstein

Quote of the Week

A good ghost story may hold entertainment and even cultural value, but the popular portrayal of pseudoscientific practices as science may be detracting from efforts to cultivate a scientifically literate public.

Michael Knees, American engineering psychologist

Links
Download Podcast
Show Notes
Forum Discussion

Introduction, another Artemis launch scrubbed

Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.

[00:09.840 --> 00:13.440] Hello and welcome to the Skeptics Guide to the Universe. Today is Wednesday,

[00:13.440 --> 00:17.360] September 7th, 2022, and this is your host, Steven Novella.

[00:17.360 --> 00:19.840] Joining me this week are Bob Novella. Hey, everybody.

[00:19.840 --> 00:21.760] Cara Santa Maria. Howdy.

[00:21.760 --> 00:23.280] Jay Novella. Hey, guys.

[00:23.280 --> 00:26.400] And Evan Bernstein. Good evening, everyone.

[00:26.400 --> 00:32.400] So we had this scrubbing of the second launch date for the Artemis 1.

[00:32.400 --> 00:35.120] Why do they keep doing that to us? Frustrating.

[00:35.120 --> 00:39.280] Yeah, so, I mean, the first, you know, this was supposed to fly in 2017.

[00:39.280 --> 00:44.720] This is now a five-year rolling delay in terms of getting this thing off the ground.

[00:44.720 --> 00:49.200] But yeah, so on last Monday or Tuesday, I think it was Monday,

[00:49.200 --> 00:51.120] they were going to try to do a launch.

[00:51.120 --> 00:55.040] They had a temperature problem in the engines,

[00:55.040 --> 00:58.720] and then it turned out they couldn't fix it within the launch window,

[00:58.720 --> 01:01.440] so they had to scrub. Turned out it was a faulty sensor.

[01:01.440 --> 01:04.880] Everything was fine, but whatever, one faulty sensor scrubs a launch.

[01:05.760 --> 01:08.640] So they rescheduled it for Saturday, and then on Saturday,

[01:08.640 --> 01:10.400] they had actually a more serious problem.

[01:10.400 --> 01:12.640] I'm not sure why they didn't have the same problem on Monday.

[01:12.640 --> 01:16.800] They had a hydrogen leak from the liquid hydrogen fueling.

[01:16.800 --> 01:19.120] Now, this is a serious problem because...

[01:19.120 --> 01:21.600] Yeah. You don't f*** around with hydrogen, man.

[01:21.600 --> 01:26.560] If it gets too, if the percentage of hydrogen outside the tank gets too high,

[01:26.560 --> 01:29.520] there's a chance that it could explode, you know, when the ship takes off,

[01:29.520 --> 01:30.880] which would be bad, right?

[01:30.880 --> 01:33.840] You don't want the explosion to be happening outside of the tank.

[01:33.840 --> 01:35.920] Oh, what do they call that? There's a name for that.

[01:35.920 --> 01:37.120] Catastrophic failure?

[01:37.120 --> 01:37.760] No, no, no.

[01:37.760 --> 01:39.280] No, you're right, Bob. There is a name for that.

[01:40.240 --> 01:43.600] And it's hilarious. Explosive disassembly or something.

[01:43.600 --> 01:44.320] Yeah, okay.

[01:44.320 --> 01:46.560] That's so scary.

[01:46.560 --> 01:48.960] Let's disassemble it with explosives, yay.

[01:48.960 --> 01:52.640] So this is interesting. So they had to scrub that because they couldn't fix that in time.

[01:52.640 --> 01:54.160] They tried a couple of things to, like,

[01:54.160 --> 01:57.040] change the temperature to get the seals to work, but it didn't work.

[01:57.040 --> 02:01.200] They could potentially fix this problem on the launch pad,

[02:01.200 --> 02:03.760] but by the time they could do that,

[02:03.760 --> 02:10.640] the batteries that are needed for the abort system to work would have to be recycled.

[02:10.640 --> 02:17.920] So they have to bring the ship back to the building just to swap out the abort batteries.

[02:17.920 --> 02:20.240] But of course, while it's there, they'll fix everything.

[02:20.240 --> 02:21.840] And they got to reset everything.

[02:21.840 --> 02:23.520] It's like outside the window.

[02:23.520 --> 02:26.880] So now it's like you're keeping all these plates spinning, you know?

[02:26.880 --> 02:29.360] And if you don't get it to fly within a certain amount of time,

[02:29.360 --> 02:32.960] you got to bring it back and reset everything, you know, and then try again.

[02:32.960 --> 02:36.800] So now the earliest, they haven't set a new launch date yet as of this recording,

[02:36.800 --> 02:38.560] but the earliest would be mid-October.

[02:38.560 --> 02:39.600] It would be like six weeks.

[02:39.600 --> 02:41.360] October 2023.

[02:41.360 --> 02:42.480] Yeah, 2022.

[02:42.480 --> 02:43.680] Oh, okay.

[02:43.680 --> 02:48.160] We did talk about it briefly during the live show, Jay.

[02:48.160 --> 02:51.440] You brought up the fact that you've heard some criticism.

[02:51.440 --> 02:54.160] So I did a deeper dive on it because I've heard some criticism too,

[02:54.160 --> 02:55.520] and I wanted to know where that was.

[02:55.520 --> 02:59.440] The bottom line is that it's just really expensive, you know?

[02:59.440 --> 03:04.160] They're spending, you know, $150 billion to get this thing up.

[03:04.160 --> 03:11.680] It's going to cost a billion dollars or $2 billion a launch just for the launch costs themselves.

[03:11.680 --> 03:14.000] If you amortize the development cost,

[03:14.000 --> 03:17.440] it's going to be between four and five billion dollars per launch,

[03:18.400 --> 03:22.720] and they only have the infrastructure to launch one a year.

[03:22.720 --> 03:24.080] That's all we're going to get out of it.

[03:24.080 --> 03:26.080] One launch a year, and at the end of the day,

[03:26.080 --> 03:29.840] it's probably going to be at like four to five billion dollars per launch.
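
A rough way to see where that per-launch figure comes from (an illustrative back-of-the-envelope, not an official costing): the effective cost of a flight is roughly the marginal launch cost plus the development cost spread over the number of flights it supports,

$\text{cost per launch} \approx C_\text{launch} + \dfrac{C_\text{development}}{N_\text{flights}}$

With a marginal cost of roughly $2 billion per flight, spreading the program's development spending over the small number of flights planned is what pushes the effective figure toward the $4-5 billion cited.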

[03:29.840 --> 03:32.160] So that's mainly where the criticism is coming from.

[03:32.160 --> 03:32.880] It's expensive.

[03:33.520 --> 03:37.120] It's not really going to be able to do that many launches.

[03:37.120 --> 03:40.480] But you got to keep in mind that you go back to 2011

[03:40.480 --> 03:42.960] when they canceled the Constellation program,

[03:42.960 --> 03:47.360] which is the predecessor to the Space Launch System, the SLS,

[03:47.360 --> 03:51.280] and also that was the end of the life of the space shuttle.

[03:51.280 --> 03:54.640] So we had no, basically, no rockets to go up.

[03:54.640 --> 04:00.160] So at that time, the Obama administration basically made a bargain with NASA.

[04:00.160 --> 04:06.480] They said, okay, we will fund the SLS program for deep space,

[04:06.480 --> 04:10.800] but you are going to contract out low Earth orbit to private industry.

[04:10.800 --> 04:11.680] So that's what they did.

[04:12.480 --> 04:15.920] And that's where SpaceX comes from and like Blue Origin, all these companies.

[04:15.920 --> 04:17.040] So that worked out really well.

[04:17.040 --> 04:21.120] The low Earth orbit, you know, and SpaceX worked out tremendously well,

[04:21.120 --> 04:25.440] but they're kind of hobbled with this really over budget, delayed,

[04:25.440 --> 04:29.680] really expensive SLS, you know, heavy launch system.

[04:30.400 --> 04:34.400] And, you know, now looking back 11 years later,

[04:34.400 --> 04:37.360] it's like, you know, there's nothing innovative about it.

[04:37.360 --> 04:45.040] It's not reusable, you know, and SpaceX has basically completely leapfrogged over it.

[04:45.040 --> 04:47.600] So I think that's where a lot of the criticism comes from.

[04:47.600 --> 04:50.560] But still, here we are, you know, it's going to get us to the moon.

[04:50.560 --> 04:53.760] You also have to keep in mind that at the other end of the spectrum,

[04:54.400 --> 04:56.800] the Artemis program, not the SLS,

[04:56.800 --> 05:00.640] but the Artemis program was originally planned for 2028.

[05:00.640 --> 05:04.720] Well, to the moon, right, to be back on the moon in 2028.

[05:04.720 --> 05:08.160] That's Artemis mission, not the SLS system, right?

[05:08.160 --> 05:09.280] So not the rocket.

[05:09.280 --> 05:15.520] But the Artemis mission was moved up from 2028 to 2024 by the Trump administration.

[05:15.520 --> 05:18.400] And then it's now pushed back to 2025.

[05:18.400 --> 05:20.880] That's still three years ahead of schedule.

[05:20.880 --> 05:22.080] Of original schedule, yes.

[05:22.080 --> 05:23.120] Original schedule.

[05:23.120 --> 05:26.320] And nobody ever thought that the 2024 thing was realistic.

[05:26.320 --> 05:28.640] NASA was like, this is just not going to be like, OK, sure, right.

[05:28.640 --> 05:32.560] But they knew politically it sounded good, but never going to happen.

[05:32.560 --> 05:35.120] So, all right, we're still on track to get back to the moon

[05:35.120 --> 05:36.720] by the middle of this decade.

[05:36.720 --> 05:39.360] And hopefully, you know, the SLS will work out.

[05:39.360 --> 05:40.720] Artemis will launch.

[05:40.720 --> 05:43.760] Obviously, I'd rather have them scrub for six weeks

[05:43.760 --> 05:45.200] than have the thing blow up on the pad.

[05:45.200 --> 05:46.800] That would be a disaster.

[05:46.800 --> 05:47.840] My gosh.

[05:47.840 --> 05:51.760] What I do think is that NASA should already be planning

[05:51.760 --> 05:53.760] the successor of the SLS, though.

[05:54.960 --> 05:55.200] Right.

[05:55.200 --> 05:55.920] I mean, they shouldn't.

[05:55.920 --> 05:59.040] Well, the SLS is expensive to fly.

[05:59.040 --> 06:01.600] And it's like, you know, it's not reusable.

[06:01.600 --> 06:03.440] It's not efficient or whatever.

[06:04.000 --> 06:06.000] They should probably just contract out, you know,

[06:06.000 --> 06:09.360] to the private space industry now to develop the next thing

[06:09.360 --> 06:12.320] that's going to be able to get to the moon and to Mars

[06:12.960 --> 06:14.720] and not try to do it themselves.

[06:14.720 --> 06:15.440] You know what I mean?

[06:16.000 --> 06:16.560] Yeah.

[06:16.560 --> 06:19.760] Yeah, I mean, that's a really hard thing to predict, Steve.

[06:19.760 --> 06:22.800] You know, first of all, we don't know how well the SLS is going to work.

[06:22.800 --> 06:26.640] It seems like private industry is going to work out better

[06:26.640 --> 06:28.880] than NASA owning their own rockets at this point.

[06:28.880 --> 06:29.920] Don't you agree?

[06:29.920 --> 06:32.160] I mean, for low Earth orbit, it's worked out really well.

[06:32.800 --> 06:34.720] You know, that was sort of the division of labor.

[06:34.720 --> 06:37.120] They would let private industry handle low Earth orbit

[06:37.120 --> 06:39.040] and then NASA will do deep space, right?

[06:39.040 --> 06:40.720] Go back to the moon and then eventually Mars.

[06:41.280 --> 06:45.360] Orion, which is NASA's capsule, that is the only spaceship

[06:45.360 --> 06:47.760] that can get to the, you know, to the moon now, right?

[06:47.760 --> 06:49.120] That can do deep space missions.

[06:49.120 --> 06:51.680] It's rated for 21 days.

[06:51.680 --> 06:54.240] It's long enough to get to the moon and back, you know what I mean?

[06:54.240 --> 06:56.480] So the Dragon module can't do it?

[06:56.480 --> 06:59.360] Well, according to NASA, it's the only one that's rated for,

[06:59.360 --> 07:00.720] like, moon missions at this point.

[07:00.720 --> 07:05.200] So they would, not that you, you know, I'm sure you could get the Dragon capsule

[07:05.200 --> 07:09.280] or a version of it to the point where it would be rated for deep space,

[07:09.280 --> 07:10.480] but it isn't right now.

[07:11.200 --> 07:15.040] But again, they gave the contract to SpaceX, remember, for the lunar lander

[07:15.040 --> 07:20.240] and Musk wants to convert the Starship into a lunar lander.

[07:20.240 --> 07:22.240] Yeah, that's still on.

[07:22.240 --> 07:23.760] Which is, like, weird in a way.

[07:24.640 --> 07:28.080] Would that ship, Steve, leave from Earth or would it stay?

[07:28.080 --> 07:29.120] Well, it'd have to, right?

[07:29.120 --> 07:31.680] We're going to build it on Earth, send it to the moon,

[07:31.680 --> 07:34.560] and then it's going to land on, that's the ship that's going to land on the moon.

[07:34.560 --> 07:36.560] But, you know, I think we talked about it at the time,

[07:36.560 --> 07:39.360] it's like, yeah, but it's going all the way to the moon.

[07:39.360 --> 07:41.760] Why don't you just make that your moon ship, you know what I mean?

[07:41.760 --> 07:46.800] Like, why are you going to take the SLS to the moon, then hop on over into the Starship

[07:46.800 --> 07:49.120] to go down, to land down on the moon?

[07:49.120 --> 07:49.680] I don't know.

[07:49.680 --> 07:51.680] I don't know exactly how that's going to work.

[07:51.680 --> 07:56.320] So, okay, so it is that way, that ship is going to basically ferry people

[07:56.320 --> 08:00.400] from low moon orbit to the surface.

[08:00.400 --> 08:01.440] Yes, that's right.

[08:01.440 --> 08:04.480] And it stays out there and they just refuel it and keep reusing it.

[08:04.480 --> 08:05.520] I guess so.

[08:05.520 --> 08:08.560] Steve, I'm hoping that the next thing that will be developed

[08:08.560 --> 08:14.720] will be a deep space nuclear rocket, because they're developing nuclear rockets for cislunar.

[08:14.720 --> 08:17.920] Now, they won't be really rated for beyond cislunar, right?

[08:17.920 --> 08:20.960] They really won't be designed to go beyond the moon.

[08:20.960 --> 08:25.680] But, and this is why NASA is working with them on this, once they have it,

[08:25.680 --> 08:29.040] then the homework, you know, the foundational homework will be done,

[08:29.040 --> 08:32.880] and then NASA could take that and then extend it and then make it,

[08:32.880 --> 08:34.640] you know, for a much deeper space.

[08:34.640 --> 08:36.320] So that's my hope.

[08:36.320 --> 08:40.400] That's my hope. The question is, is it going to be the next gen deep space,

[08:40.400 --> 08:41.920] or is it going to be the one after that?

[08:42.560 --> 08:48.560] Well, maybe just like let private companies handle just the heavy lift rockets

[08:48.560 --> 08:49.600] that get you to the moon.

[08:50.240 --> 08:54.560] And NASA just completely focuses on developing nuclear rockets.

[08:54.560 --> 08:57.120] Yeah, shit man, I'd be, I'm all for that.

[08:57.120 --> 08:58.400] Because that's the next thing we need.

[08:58.400 --> 09:01.680] And chemical rockets are just so inefficient, you know,

[09:01.680 --> 09:04.560] like it's just not the way to get to Mars and back.

[09:04.560 --> 09:11.760] No, anything beyond the moon, and chemical rockets are just going to be marginalized.

[09:11.760 --> 09:14.080] I mean, of course, now I'm thinking much deeper into the future,

[09:14.080 --> 09:16.880] but as we, as the decades and centuries accrue,

[09:17.760 --> 09:21.360] chemical is really going to be just like maybe for Earth launch.

[09:21.360 --> 09:22.160] And that's it.

[09:22.160 --> 09:25.120] Getting out of Earth's gravity well, that's pretty much going to be it.

[09:25.120 --> 09:28.640] Right. But that's, you know, who knows how long that's going to take,

[09:29.600 --> 09:32.880] you know, when chemical no longer has any role in deep space,

[09:32.880 --> 09:35.600] because, you know, long distance rocket equation says,

[09:35.600 --> 09:37.360] screw you chemical rockets.
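
For context, the rocket equation being invoked here is the Tsiolkovsky rocket equation, which relates the velocity change a rocket can achieve to its exhaust velocity and mass ratio:

$\Delta v = v_e \ln\!\left(\dfrac{m_0}{m_f}\right)$

Because the propellant requirement grows exponentially with the desired $\Delta v$, and chemical exhaust velocities top out around 4.5 km/s, chemical propulsion quickly becomes impractical for high-$\Delta v$ trips beyond the Moon.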

[09:37.360 --> 09:38.080] Yeah.

[09:38.080 --> 09:38.640] Yeah.

[09:38.640 --> 09:40.720] And then, and then eventually fusion.

[09:40.720 --> 09:44.480] Once we get to fusion, then we're, that's the, that's the game.

[09:44.480 --> 09:45.440] Started man, that's good.

[09:45.440 --> 09:45.920] Yeah.

[09:45.920 --> 09:46.800] And what's interesting is-

[09:46.800 --> 09:49.200] Especially the hydrogen proton-proton fusion engine.

[09:49.200 --> 09:54.640] Once we develop fusion engines, that's going to be our engines forever.

[09:54.640 --> 10:00.080] Like there's the probability that anything will replace it is so remote.

[10:00.080 --> 10:05.200] Like we don't know if it will ever happen and if it does, it will be in the distant far future.

[10:05.200 --> 10:05.600] Right.

[10:05.600 --> 10:08.560] So that's the brass ring right there.

[10:08.560 --> 10:11.360] Well, for reaction rockets, yes.

[10:11.360 --> 10:16.080] I think that's going to be it for quite, for potentially centuries.

[10:16.080 --> 10:18.560] And you could do an amazing amount of things-

[10:18.560 --> 10:19.520] I think thousands of years.

[10:19.520 --> 10:21.920] With, with, that's silly.

[10:21.920 --> 10:22.960] Technically centuries too.

[10:22.960 --> 10:23.840] But that's, yeah.

[10:23.840 --> 10:24.960] But that's, yeah.

[10:24.960 --> 10:28.320] I mean, even the best we can do with that type of reaction rocket,

[10:28.320 --> 10:31.120] say a fusion hydrogen proton-proton, which is really efficient,

[10:31.120 --> 10:34.960] like say 11%, 11% speed of light exhaust velocity.

[10:34.960 --> 10:40.400] That is, you could still do, you know, 20% the speed of light with that type of rocket.

[10:40.400 --> 10:45.040] And if you don't care about cargo at all, you can get that rocket up to 50% the speed of light.

[10:45.680 --> 10:49.520] But then cargo of course becomes literally a millionth of the payload,

[10:49.520 --> 10:52.720] but still 10%, 20% the speed of light with a super advanced-

[10:52.720 --> 10:58.560] Give it a bob, you add, add a little bit of light sails and then that'll get you.

[10:58.560 --> 10:58.880] Yes.

[10:58.880 --> 10:59.680] That'll get you there.

[10:59.680 --> 11:01.840] So that's going to be light sails and fusion.

[11:01.840 --> 11:02.720] That's going to be space travel.

[11:02.720 --> 11:06.880] That seems to be, I think that's pretty much where we're going for centuries.

[11:06.880 --> 11:10.960] Unless an ASI, artificial super intelligence, rises and then all bets are off.

[11:10.960 --> 11:16.880] But even then, he or she would be constrained to, to the physics, to physics as we know it.

[11:16.880 --> 11:19.280] And even, even, you know, the ASI might say,

[11:19.280 --> 11:22.640] damn man, this is the best I could do, but it's still going to be cool.

[11:22.640 --> 11:24.560] Yeah. It's almost as if we wrote a whole book about it.

[11:24.560 --> 11:24.720] Yeah.

[11:26.560 --> 11:30.560] It's almost as if I just did a deep dive research on it because I talked about it at Dragon Con.

[11:31.200 --> 11:31.680] Dragon Con.

[11:31.680 --> 11:32.480] How was Dragon Con?

[11:33.040 --> 11:33.920] It was great.

[11:33.920 --> 11:39.360] Liz and I went for the first time in three years and I know you guys were just so wicked jealous.

[11:39.360 --> 11:40.000] It was great.

[11:40.000 --> 11:40.560] Totally.

[11:40.560 --> 11:42.160] It was pretty much as we remember it.

[11:42.160 --> 11:45.840] Amazing costumes, amazing fun, lots of people.

[11:45.840 --> 11:50.160] And pretty much, I was double masked for like four days in a row

[11:50.160 --> 11:55.680] and I took a, took a test today and totally clean, no, totally negative.

[11:55.680 --> 11:59.360] So I think I totally, you know, got away with it totally.

[12:00.400 --> 12:01.040] I did a talk.

[12:01.040 --> 12:04.480] I called the science, I called, I called the science panel guys and I'm like,

[12:04.480 --> 12:07.840] I want to do the future of rockets.

[12:07.840 --> 12:10.880] And they'd made a panel with like five guys and I was one of them.

[12:10.880 --> 12:12.320] And I just went off.

[12:12.320 --> 12:14.480] I did a deep dive for weeks.

[12:14.480 --> 12:17.840] For weeks I did a deep dive just to refresh my memory and all the research that I had

[12:17.840 --> 12:20.880] done for the chapter of the book about future rockets.

[12:20.880 --> 12:21.920] And I got it down, man.

[12:21.920 --> 12:26.880] I made an awesome bullet list of all the top, the top things that I needed to keep straight

[12:26.880 --> 12:27.440] in my head.

[12:27.440 --> 12:29.680] And it was so much fun to research.

[12:29.680 --> 12:34.000] And there was a great panel, great panel, great fellow panelists with me.

[12:34.000 --> 12:36.960] They were all very knowledgeable and it was great.

[12:36.960 --> 12:38.880] But also I did some skeptical stuff.

[12:38.880 --> 12:40.240] I talked about the two books.

[12:40.240 --> 12:45.760] I did a, I did a one man show on stage on the skeptical track and I was like, oh boy,

[12:45.760 --> 12:46.960] this is scary.

[12:46.960 --> 12:47.840] But it was fine.

[12:47.840 --> 12:48.640] It was fine.

[12:48.640 --> 12:52.160] I just, I just went off on the books and then I started talking about rockets again.

[12:52.160 --> 12:52.960] And then that was it.

[12:52.960 --> 12:54.400] I was in my happy place.

[12:54.960 --> 12:56.560] And, uh, totally great.

[12:56.560 --> 13:00.160] Bob, totally utterly, absolutely.

[13:00.160 --> 13:04.560] Indubitably. Your solo talk was basically like a pared-down news item for Bob.

[13:04.560 --> 13:06.080] Yeah, that's basically what it was.

[13:06.880 --> 13:07.520] It was great.

[13:07.520 --> 13:13.760] And, uh, so many, as usual, so many great costumes, the talent on display at Dragon

[13:13.760 --> 13:19.840] Con blows me away every time I go and I'm determined next year to have an awesome homemade

[13:19.840 --> 13:21.520] costume, which I didn't have this year.

[13:22.320 --> 13:22.480] Yeah.

[13:22.480 --> 13:24.240] We haven't been, I've been what, in four years now.

[13:24.240 --> 13:26.480] It'll be, we're definitely going to make a plan to go next year.

[13:27.120 --> 13:27.440] Yeah.

[13:27.440 --> 13:28.800] I mean, we were fine.

[13:28.800 --> 13:30.480] Pandemic willing, but I hopefully will.

[13:30.480 --> 13:30.720] Yeah.

[13:30.720 --> 13:31.200] It's time.

[13:31.200 --> 13:33.360] I mean, as long as things are good, we gotta go.

[13:33.360 --> 13:36.960] We were surrounded at times by thousands of people.

[13:36.960 --> 13:39.840] And at a couple of times I was like, this is uncomfortable.

[13:40.560 --> 13:44.080] But I had my double masks, you know, I held my breath a lot.

[13:44.640 --> 13:46.000] And it, and I'm fine.

[13:46.000 --> 13:48.720] Both Liz and I are both, you know, totally testing negative.

[13:48.720 --> 13:50.400] And it's been many, it's been days.

[13:50.400 --> 13:51.200] So it's doable.

[13:51.200 --> 13:53.760] Just, you know, you just, you know, you could take it easy.

[13:53.760 --> 13:57.840] You don't have to go into the big shoulder to shoulder crowds, um, you know?

[13:57.840 --> 13:58.320] And, uh, it's totally doable.

[13:58.320 --> 14:00.400] How about the, uh, the merch room?

[14:00.400 --> 14:00.800] Oh yeah.

[14:00.800 --> 14:02.640] That was, that was, you know, it was Christmas.

[14:02.640 --> 14:04.080] I'm, I'm walking towards it.

[14:04.080 --> 14:04.240] Yeah.

[14:04.240 --> 14:04.720] But how was it?

[14:04.720 --> 14:06.320] Was there a shoulder to shoulder in there?

[14:06.320 --> 14:07.200] No, no.

[14:07.200 --> 14:10.400] The first day, the first day it opened where I was like waiting for it.

[14:10.400 --> 14:13.360] And it was, it was, there's four floors, as you know.

[14:13.360 --> 14:17.680] And, uh, it was not shoulder to shoulder craziness at that, that the first few hours that I was

[14:17.680 --> 14:18.400] there.

[14:18.400 --> 14:20.080] And, uh, so that's, so that was fine too.

[14:20.080 --> 14:21.680] I was worried about that as well.

[14:21.680 --> 14:26.160] By the way, one last detail I've got to mention about the Orion capsule is that it's not just

[14:26.160 --> 14:27.920] that it's rated for 21 days.

[14:27.920 --> 14:32.800] When you come back from the moon, you reenter the atmosphere much faster than when you,

[14:32.800 --> 14:35.520] than when you're just coming down from low earth orbit.

[14:35.520 --> 14:39.360] And so the capsule has to be rated for high speed reentry.

[14:39.360 --> 14:39.680] Yeah.

[14:39.680 --> 14:43.360] And I think the, the Orion capsule is the only one that could do that.

[14:43.360 --> 14:48.400] So like the dragon capsule would really need to be redesigned or refitted to be a high

[14:48.400 --> 14:49.360] speed reentry.

[14:49.360 --> 14:51.520] That's yeah, that's, yeah, that's major.

[14:51.520 --> 14:53.200] You're not going to just slap on duct tape.

[14:53.200 --> 14:56.160] That's like a major event, major redesign.

[14:56.160 --> 14:56.960] I'm sure they could.

[14:56.960 --> 14:57.360] Yeah.

[14:57.360 --> 14:57.520] Yeah.

[14:57.520 --> 14:58.880] But I'm sure they could do it if they wanted to.

[14:58.880 --> 15:02.480] All right, Bob, um, you're going to start us off with a quickie.

[15:02.480 --> 15:05.200] You're going to tell us about Frank Drake.

[15:05.200 --> 15:06.320] Thank you, Steve.

Quickie with Bob: Frank Drake (15:00)

  • Frank Drake passes away [link_URL TITLE][1]

[15:06.320 --> 15:08.800] Hello and welcome to your Quickie with Bob.

[15:08.800 --> 15:12.800] No need to gird your loins for this one, Kara, but you may need a hanky.

[15:12.800 --> 15:17.600] We lost astrophysicist Frank Drake this past September 2nd, 2022.

[15:17.600 --> 15:18.880] He was 92.

[15:18.880 --> 15:19.680] Good run.

[15:19.680 --> 15:20.400] Nice run.

[15:20.400 --> 15:20.880] Good run.

[15:20.880 --> 15:24.960] We always say that when you're, like, you know, in your eighties or nineties, but 92 is great.

[15:24.960 --> 15:30.240] I would, I would pay a good chunk of money right now if you can guarantee me, um, 92.

[15:30.240 --> 15:35.600] So he is most famous, of course, for his 1961 Drake equation.

[15:35.600 --> 15:39.920] I would love to have an equation named after me like that. The equation attempts to determine

[15:39.920 --> 15:44.560] the number of technological civilizations in our galaxy whose signals we could detect.

[15:45.200 --> 15:46.800] We talked about it on the show many times.

[15:46.800 --> 15:49.120] I won't go into any more detail on the equation itself.

[15:49.120 --> 15:51.120] We all know it by heart.
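
For reference, the equation in its usual form is

$N = R_* \cdot f_p \cdot n_e \cdot f_l \cdot f_i \cdot f_c \cdot L$

where $R_*$ is the rate of star formation in the galaxy, $f_p$ the fraction of stars with planets, $n_e$ the number of potentially habitable planets per such system, $f_l$, $f_i$, and $f_c$ the fractions of those that develop life, intelligence, and detectable technology, and $L$ the length of time a civilization releases detectable signals.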

[15:51.120 --> 15:52.080] He did this now.

[15:52.080 --> 15:58.160] He did this after doing the very first modern SETI experiment in 1960 called Project

[15:58.160 --> 16:04.960] Ozma, using a radio telescope to examine the stars Tau Ceti and Epsilon Eridani, two names

[16:04.960 --> 16:06.560] of stars that I absolutely love.

[16:06.560 --> 16:09.040] I think it's what the Star Trek vibe or whatever.

[16:09.040 --> 16:10.240] I just love those names.

[16:10.240 --> 16:12.320] To me, they're just so science fictiony.

[16:12.320 --> 16:18.080] Now he used a part of the spectrum called the water hole, which is awesome on many levels

[16:18.080 --> 16:20.720] because it's near the hydrogen spectral lines.

[16:20.720 --> 16:21.360] Get it?

[16:21.360 --> 16:26.400] And it's also that, that part of the spectrum, the electromagnetic spectrum that's especially

[16:26.400 --> 16:31.600] quiet, and he reasoned that other intelligences would realize that as well and that it would

[16:31.600 --> 16:35.360] be a really good, efficient frequency to communicate over.

[16:36.000 --> 16:42.480] Now that experiment took two months and $2,000 in new equipment, and he essentially created

[16:42.480 --> 16:47.280] a new field by doing that SETI, the search for extraterrestrial intelligence.

[16:47.280 --> 16:51.760] From what I could tell, he didn't come up with that Drake equation to necessarily determine

[16:51.760 --> 16:56.320] the number of aliens that are out there, but as a way to stimulate discussions at the first

[16:56.320 --> 17:02.560] SETI meeting, because he was asked, hey, dude, because he became famous the year after he

[17:02.560 --> 17:02.960] did this.

[17:02.960 --> 17:09.120] He became well known the world over, and he was asked to host this SETI conference,

[17:09.120 --> 17:10.560] the very first SETI conference.

[17:11.360 --> 17:17.280] He then came up with the Drake equation for that to stimulate discussions and thinking.

[17:17.280 --> 17:21.680] Drake's passion for astronomy and the possibility of life out there began when he was eight

[17:21.680 --> 17:26.000] years old, imagining alien Earths scattered across the night sky,

[17:26.000 --> 17:31.680] after his dad told him there were many worlds out there in space. And that was in 1938,

[17:31.680 --> 17:32.160] by the way.

[17:32.160 --> 17:32.960] Good on you, dad.

[17:33.760 --> 17:37.520] Seth Shostak said of him, Drake was never an impatient listener.

[17:37.520 --> 17:40.720] He was, to my mind, one of the last nice guys around.

[17:40.720 --> 17:44.800] He was never moody, never angry, and he didn't show the slightest annoyance if you walked

[17:44.800 --> 17:48.320] into his office and took his attention away from whatever he was doing.

[17:48.320 --> 17:53.920] And I read that over and over, people who had known him, that he was such a great, great

[17:53.920 --> 17:54.480] guy.

[17:54.480 --> 18:00.080] And I'll end with a quote from Nadia Drake, Frank's daughter: "A titan in life, Dad leaves

[18:00.080 --> 18:01.600] a titanic absence."

[18:02.720 --> 18:04.800] This was your sad quickie with Bob.

[18:04.800 --> 18:05.120] Thank you.

[18:06.000 --> 18:07.920] Yeah, it's always like a bittersweet, right?

[18:07.920 --> 18:14.000] It is sad to lose a giant like Frank Drake, but you're happy that he lived a long life.

[18:14.000 --> 18:16.560] He was relevant to the end.

[18:17.200 --> 18:23.440] Yeah, I mean, for centuries, as we're looking, and I think we'll never stop searching for

[18:23.440 --> 18:29.840] a life out there, his name will, and he will be in the thoughts of all the other big explorers

[18:29.840 --> 18:32.880] that haven't even been born yet that will be looking to the stars.

[18:32.880 --> 18:35.120] Bob, you and I are going to have to come up with our own equation.

[18:35.120 --> 18:36.400] It'll be the Novella equation.

[18:36.400 --> 18:37.040] Yes, yes.

[18:37.040 --> 18:40.480] How about the probability that AI will wipe out human civilization?

[18:40.480 --> 18:41.920] Yes, all right.

[18:41.920 --> 18:42.640] Done.

[18:42.640 --> 18:43.440] We will do this.

[18:43.440 --> 18:44.800] That's too good not to happen.

[18:44.800 --> 18:45.040] All right.

[18:45.040 --> 18:45.840] All right.

[18:45.840 --> 18:46.640] Well, we'll work on it.

[18:46.640 --> 18:47.520] We'll come up with some other ideas.

[18:47.520 --> 18:48.880] That's horrible.

[18:48.880 --> 18:49.760] That's horrible.

News Items


Follow-up on the MOXIE instrument on Mars (18:51)

  • [link_URL TITLE][2]

[18:50.720 --> 18:51.440] All right, Jay.

[18:51.440 --> 18:55.520] We're actually kind of on an astronomy space theme.

[18:56.480 --> 19:02.320] You're going to tell us, give us a follow up on the MOXIE instrument on Mars.

[19:02.320 --> 19:04.240] Yeah, do you guys remember MOXIE?

[19:04.240 --> 19:05.040] The oxygen thing?

[19:05.040 --> 19:06.640] MOXIE is creating oxygen on Mars right now.

[19:06.640 --> 19:07.040] Yeah.

[19:07.040 --> 19:08.000] Oh, okay.

[19:08.000 --> 19:08.800] Now I remember.

[19:08.800 --> 19:12.160] It's one of my favorite things about Perseverance.

[19:12.160 --> 19:17.760] So just to go through the basics so you guys understand, it's totally worth talking about

[19:17.760 --> 19:19.840] again because it's this fascinating technology.

[19:19.840 --> 19:25.920] It's an instrument about the size of a lunchbox that is connected to the Perseverance rover.

[19:25.920 --> 19:30.240] It happens to live in the front right side of Perseverance.

[19:30.240 --> 19:36.240] And its job is to take in the Martian atmosphere, which is 96% death, right?

[19:36.240 --> 19:38.640] It's 96% carbon dioxide.

[19:38.640 --> 19:41.840] And what it does is it strips the carbon atom away from the oxygen atoms.

[19:41.840 --> 19:43.200] I'll get into more detail about that.

[19:44.240 --> 19:48.640] And they're testing it and it's gone amazingly well.

[19:48.640 --> 19:50.480] So let me get into some details.

[19:50.480 --> 19:56.240] So first of all, MOXIE stands for Mars Oxygen In-Situ Resource Utilization Experiment.

[19:56.880 --> 20:00.320] And it couldn't be, the name is perfect for what this little bugger does.

[20:00.320 --> 20:02.880] So details about how it works.

[20:02.880 --> 20:08.560] So it takes in the Martian air and it filters it to remove any contaminants that happen

[20:08.560 --> 20:11.440] to be in there and dust particles, dirt and all that crap.

[20:11.440 --> 20:15.760] The air is then pressurized and then it's fed into something called the Solid Oxide

[20:15.760 --> 20:16.640] Electrolyzer.

[20:16.640 --> 20:19.280] That totally sounds like, what was that?

[20:19.280 --> 20:20.480] The encabulator, Steve?

[20:21.040 --> 20:22.720] Yeah, the turbo encabulator.

[20:22.720 --> 20:23.220] Yeah.

[20:23.840 --> 20:29.120] So the Solid Oxide Electrolyzer, this is an instrument that electrochemically splits the

[20:29.120 --> 20:33.120] carbon dioxide molecule into oxygen ions and carbon monoxide.

[20:33.680 --> 20:40.880] The oxygen ions are separated from the carbon monoxide and then they combine them to make

[20:40.880 --> 20:43.920] molecular oxygen, which is essentially just oxygen.
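
The net chemistry of that solid oxide electrolysis step (roughly: at the cathode, carbon dioxide is reduced to carbon monoxide and oxide ions, and at the anode the oxide ions give up electrons to form oxygen gas) works out to

$2\,\mathrm{CO_2} \rightarrow 2\,\mathrm{CO} + \mathrm{O_2}$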

[20:43.920 --> 20:47.440] MOXIE then measures how much oxygen it creates, right?

[20:47.440 --> 20:54.000] So as it has created this batch, it measures how much it has just created and it also checks

[20:54.000 --> 20:56.720] its purity before it releases it back into the atmosphere.

[20:56.720 --> 21:00.320] And that's what MOXIE is doing right now, is just spitting this stuff right back out

[21:00.320 --> 21:01.200] into the atmosphere.

[21:01.920 --> 21:04.640] The single unit weighs, how much do you guys think this thing weighs?

[21:04.640 --> 21:05.140] Four pounds?

[21:06.160 --> 21:07.280] 21 kilos.

[21:07.280 --> 21:10.080] Not bad, 37 pounds or 17 kilos.

[21:10.080 --> 21:14.720] And now it does the work of a small tree, if you can believe that.

[21:14.720 --> 21:15.280] The unit was-

[21:15.280 --> 21:15.780] How small?

[21:16.640 --> 21:17.520] I'll get into detail.

[21:17.520 --> 21:21.760] A small tree, meaning I would say anything that's probably below 10 feet, like it's

[21:21.760 --> 21:23.520] a non-mature tree.

[21:24.160 --> 21:29.040] The unit was first turned on in February 2021 and every test they ran worked perfectly.

[21:29.040 --> 21:33.920] They ran seven experimental runs at different conditions, which now that I think about it,

[21:33.920 --> 21:39.360] of course they had to test it under different conditions because Mars can be so variable.

[21:39.360 --> 21:44.160] They first have to warm up MOXIE for a few hours because it's cold.

[21:44.160 --> 21:47.600] And then they run it for about an hour and then they shut it down.

[21:47.600 --> 21:49.680] And that's them just running a cycle test on it.

[21:50.800 --> 21:54.880] And what they do is they run it during the day, then they tested it at night, then they

[21:54.880 --> 22:00.640] tested it in different seasons because the temperature and the air pressure, the density

[22:00.640 --> 22:05.360] and the overall air temperature can vary a lot, like 100 degree shifts in temperature

[22:05.360 --> 22:08.080] depending on the season and time of day and all that.

[22:08.080 --> 22:13.600] So they haven't tested it during dawn and dusk because there are significant temperature

[22:13.600 --> 22:17.200] changes that happen during those times and they just want to like do preliminary testing

[22:17.200 --> 22:19.280] and then they're going to get into the more advanced testing.

[22:19.280 --> 22:25.120] But so far, every scenario that they put it through, it worked fantastically well.

[22:25.120 --> 22:28.880] It produces six grams of oxygen per hour.

[22:29.440 --> 22:32.720] So this is equal to, as I said, a small tree on Earth.

[22:32.720 --> 22:38.320] MOXIE is the first, it's the first thing that we've put on another planet that does what

[22:38.320 --> 22:38.800] guys?

[22:38.800 --> 22:41.280] Creates oxygen.

[22:41.280 --> 22:46.640] Well, more importantly, it's the first thing that ever used local resources and manufactured

[22:46.640 --> 22:48.400] them into something that's usable.

[22:48.400 --> 22:48.880] That's cool.

[22:50.000 --> 22:50.720] Very cool.

[22:50.720 --> 22:53.040] That is like, that is a milestone here.

[22:53.040 --> 22:57.680] It's incredibly useful because it could save, I'm going to start off by saying millions,

[22:57.680 --> 23:01.840] but after hearing Steve talk about how expensive these missions are, it could save billions

[23:01.840 --> 23:07.600] of dollars or more quintillions in cost to ship oxygen to Mars, right?

[23:07.600 --> 23:11.680] Think about it, because we'd have to frequently ship a lot of oxygen to Mars.

[23:11.680 --> 23:13.600] That stuff is heavy, by the way.

[23:13.600 --> 23:14.880] Sorry, the launch is late, guys.

[23:14.880 --> 23:16.080] Hold your breath for a week.

[23:16.080 --> 23:20.400] The current version of MOXIE was made small deliberately so it could fit on Perseverance

[23:20.400 --> 23:26.080] and it wasn't built to run continuously, but in its current form, it has proven to be very

[23:26.080 --> 23:29.840] efficient, which is very important because it won't use a lot of energy and it's reliable.

[23:29.840 --> 23:35.600] The next big test for MOXIE is to run it when the atmosphere is at its densest and they

[23:35.600 --> 23:37.920] plan to run it for as long as possible now.

[23:37.920 --> 23:40.960] They're just going to let that little bugger keep chugging along and just see what happens

[23:40.960 --> 23:43.520] to it because that'll teach us more about what to do.

[23:43.520 --> 23:48.160] Since MOXIE has to be turned on and then it has to be heated up and then they turn it

[23:48.160 --> 23:50.720] off, it goes through something called thermal stress, right?

[23:50.720 --> 23:54.240] Temperature goes up and the metal and parts expand and do what they're going to do and

[23:54.240 --> 23:56.320] then when it cools off, it shrinks back down.

[23:56.320 --> 24:01.600] Now since MOXIE is able to handle thermal stress, the researchers say that a new larger

[24:01.600 --> 24:06.720] system, MOXIE on steroids, would be able to last a very long time since it won't be

[24:06.720 --> 24:12.240] experiencing anywhere near the number of thermal stresses that MOXIE has already proven to

[24:12.240 --> 24:12.960] go through.

[24:12.960 --> 24:18.880] I know they've only tested it seven times, but that's a lot and they could turn on the

[24:18.880 --> 24:23.680] larger version of it and it may never turn off until its end of life cycle.

[24:23.680 --> 24:24.480] It just does what it does.

[24:24.480 --> 24:25.520] Just let it run, yeah.

[24:25.520 --> 24:30.560] Yeah, the larger version of MOXIE could be placed on Mars way before we put humans there.

[24:30.560 --> 24:32.800] It could be producing oxygen for a long time.

[24:32.800 --> 24:35.600] There could be a whole cache of oxygen ready to go.

[24:36.560 --> 24:42.720] The new unit, of course, they want it to run continuously and it could make approximately

[24:43.360 --> 24:46.400] several hundred trees worth of oxygen per day.

[24:46.400 --> 24:47.120] Per day?

[24:47.120 --> 24:47.600] Yes.

[24:47.600 --> 24:48.160] Not bad.

[24:48.160 --> 24:48.640] How many?

[24:48.640 --> 24:49.440] How much is that?

[24:49.440 --> 24:53.920] As a point of reference, Kara, I'm going to tell you a single person needs about seven

[24:53.920 --> 24:56.240] to eight trees worth of oxygen a day.

[24:56.240 --> 24:57.600] Oh, damn.

[24:57.600 --> 24:58.800] That's a lot.

[24:58.800 --> 24:59.040] Yep.

[24:59.600 --> 25:05.120] But, you know, if you do the math, you know, several hundred trees divided by eight turns

[25:05.120 --> 25:09.200] into quite a good crew of people there that the machine could keep alive.
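
As a rough worked example with the figures mentioned (purely illustrative numbers): if a scaled-up unit produces something like 300 trees' worth of oxygen a day and a person needs about 8 trees' worth per day, then

$300 \div 8 \approx 37\ \text{people}$

so a single full-size unit could, in principle, cover the breathing oxygen for a crew of a few dozen.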

[25:09.200 --> 25:12.240] And who says that they don't put two or three MOXIE machines?

[25:12.240 --> 25:13.840] Yeah, they want some redundancy.

[25:14.720 --> 25:18.160] The great thing about oxygen is what, first, it keeps us alive.

[25:18.720 --> 25:23.040] And the second great thing is that it, of course, can be used as fuel because we need

[25:23.040 --> 25:24.800] fuel to get off the surface of Mars.

[25:24.800 --> 25:29.600] And oxygen is a primary component in fuel, you know, in chemical fuel.

[25:29.600 --> 25:32.560] So thank you, MOXIE, for working.

[25:32.560 --> 25:35.680] So the carbon monoxide is useful, too.

[25:35.680 --> 25:36.000] Oh, yeah.

[25:36.000 --> 25:38.640] Don't discount the carbon monoxide; it's a high-energy molecule.

[25:38.640 --> 25:42.880] And that's feedstock for things like hydrocarbons.

[25:42.880 --> 25:44.640] So all that you need is hydrogen.

[25:44.640 --> 25:49.360] If we could get a source of hydrogen on Mars, then you can combine the hydrogen with the

[25:49.360 --> 25:51.120] carbon monoxide to make methane.

[25:51.120 --> 25:55.120] The hydrogen, obviously, itself could be burned with the oxygen as rocket fuel.
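
The chemistry being sketched here, roughly: carbon monoxide from the electrolysis step can be methanated with hydrogen, and hydrogen also burns directly with oxygen,

$\mathrm{CO} + 3\,\mathrm{H_2} \rightarrow \mathrm{CH_4} + \mathrm{H_2O} \qquad\qquad 2\,\mathrm{H_2} + \mathrm{O_2} \rightarrow 2\,\mathrm{H_2O}$

which is why a local source of hydrogen (for example, from Martian water) is the missing ingredient for making propellant on site.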

[25:55.920 --> 25:58.000] And there are sources of hydrogen on Mars.

[25:58.000 --> 26:03.360] There's a lot of water on Mars, and not all of it is in drinkable form.

[26:03.360 --> 26:09.360] There are like what they call perchlorate brines, which is a lot of hydroxyl groups,

[26:09.360 --> 26:13.920] a lot of water type, you know, molecular groups in there.

[26:13.920 --> 26:14.880] Water-ish.

[26:14.880 --> 26:18.880] Yeah, well, hydrogen and oxygen, but it's not necessarily drinkable water.

[26:18.880 --> 26:23.840] But you get that you split the hydrogen off, you have pure hydrogen, you have more oxygen,

[26:23.840 --> 26:26.800] you could make fuel, you have oxygen to burn with the fuel.

[26:26.800 --> 26:30.560] We definitely are going to need to be able to make all of our fuel for the return trip

[26:30.560 --> 26:31.600] locally on Mars.

[26:31.600 --> 26:33.280] You can't carry all that crap with you.

[26:33.280 --> 26:36.000] Yeah, rocket equation will kill you if you try to do that.

[26:36.000 --> 26:42.400] So and then if and then if we could find a source of nitrogen on Mars, then we also have

[26:42.400 --> 26:45.600] our fertilizer to grow our own food there.

[26:45.600 --> 26:50.160] And there is nitrogen on Mars already fixed in the form of nitrates.

[26:50.160 --> 26:56.160] So, yeah, the bottom line is pretty much we have everything we need on Mars, you know,

[26:56.160 --> 26:57.840] for food, oxygen and water.

[26:57.840 --> 26:59.600] Except for the hamburger molecules.

[26:59.600 --> 27:05.280] Well, yeah, but you just grow the food and then you raise the animals and slaughter them.

[27:05.280 --> 27:08.080] And then you have your hamburger.

[27:08.080 --> 27:10.080] Thank you, Dr. Strangelove.

[27:10.080 --> 27:12.880] Just make some lab-grown meat, that'd be fine.

[27:12.880 --> 27:14.400] Yeah, there you go, lab-grown meat.

Do people like you more if you talk more or talk less? (27:16)

  • [link_URL TITLE][3]

[27:14.400 --> 27:21.120] All right, Kara, tell us, do people like you more if you talk more or talk less?

[27:21.120 --> 27:22.480] Let's get into it.

[27:22.480 --> 27:23.480] Let's talk about it.

[27:23.480 --> 27:27.360] When you say you, do you mean someone specifically or people in general?

[27:27.360 --> 27:28.360] People in general.

[27:28.360 --> 27:29.360] Yeah.

[27:29.360 --> 27:33.440] So I guess that is the important question, and that is the question that some researchers

[27:33.440 --> 27:41.120] from a little place called Harvard and the University of Virginia wanted to know.

[27:41.120 --> 27:45.880] So they have a new empirical paper that was published in Personality and Social Psychology

[27:45.880 --> 27:49.600] Bulletin called Speak Up!

[27:49.600 --> 27:53.000] Mistaken beliefs about how much to talk in conversations.

[27:53.000 --> 27:57.080] And as the title implies, I probably shouldn't have said that out loud, very often people

[27:57.080 --> 28:02.160] make judgments that are not reflective of reality about how much they should speak based

[28:02.160 --> 28:04.800] on what their outcome goals are.

[28:04.800 --> 28:12.080] So they wanted to know if somebody wants to be liked, how much do they think they should

[28:12.080 --> 28:13.480] talk in a conversation?

[28:13.480 --> 28:19.200] If somebody wants to be or to seem important, how much do they think they should talk in

[28:19.200 --> 28:20.360] a conversation?

[28:20.360 --> 28:25.200] And finally, if somebody just wants to enjoy the conversation, how much do they think they

[28:25.200 --> 28:26.200] should talk?

[28:26.200 --> 28:30.000] And they used a couple of different paradigms to look at this, like most psychology experiments

[28:30.000 --> 28:33.280] they kind of ran it a few different ways to ask different questions.

[28:33.280 --> 28:37.800] So first, I guess I'm curious from all of you, what do you think across those three

[28:37.800 --> 28:38.800] different parameters?

[28:38.800 --> 28:44.080] If somebody wants to be liked, what percentage of time do you think that they will think

[28:44.080 --> 28:45.080] that they should talk?

[28:45.080 --> 28:49.760] Do they think or should they do this in a one on one conversation?

[28:49.760 --> 28:50.760] What's the environment?

[28:50.760 --> 28:51.760] One on one.

[28:51.760 --> 28:52.760] So these are dyads.

[28:52.760 --> 28:53.760] Yeah.

[28:53.760 --> 28:57.720] So with a partner, how much and again, not how much should they talk to be liked?

[28:57.720 --> 29:00.720] How much do they think they should talk to be liked?

[29:00.720 --> 29:01.720] 30%.

[29:01.720 --> 29:08.160] Well, no, are minutes, are we measuring, are we? 50% of the time? 31%? You think 50%? You think

[29:08.160 --> 29:16.600] 30%? Definitely lower than 50%, probably 30%. So Evan says 30%.

[29:16.600 --> 29:17.600] Okay.

[29:17.600 --> 29:18.600] So it's interesting.

[29:18.600 --> 29:23.480] Evan, you said before you said 30%, you said definitely lower than 50%.

[29:23.480 --> 29:24.800] That's what the researchers thought as well.

[29:24.800 --> 29:30.920] They were like, okay, people whose goal is to be liked very often, and this bears out

[29:30.920 --> 29:34.720] in the literature, very often think they should talk less than half the time.

[29:34.720 --> 29:39.800] Cause it's a, it's a gesture of politeness and courtesy in a, in a cultural sense.

[29:39.800 --> 29:42.720] People love to talk about themselves.

[29:42.720 --> 29:44.360] And they actually have a name for this.

[29:44.360 --> 29:46.160] They call it the reticence bias.

[29:46.160 --> 29:47.160] Reticence.

[29:47.160 --> 29:48.160] Very good.

[29:48.160 --> 29:49.160] Yeah.

[29:49.160 --> 29:53.920] As they say, the reticence bias we suggest is rooted in the fact that people lack confidence

[29:53.920 --> 29:56.000] in their conversational abilities.

[29:56.000 --> 30:00.520] They go on to talk about social anxiety, and also, um, that people are hard

[30:00.520 --> 30:05.160] on themselves and that they don't get a lot of valid external feedback.

[30:05.160 --> 30:10.000] Um, but generally speaking, people think that if they listen more than they talk, they're

[30:10.000 --> 30:11.880] going to be liked more.

[30:11.880 --> 30:13.060] Here's the rub.

[30:13.060 --> 30:15.980] That's not the case.

[30:15.980 --> 30:22.480] So, second paradigm: what if people's goal is not to be liked, but to seem interesting?

[30:22.480 --> 30:23.480] What do you think?

[30:23.480 --> 30:25.560] Well, then I would think talk more, right?

[30:25.560 --> 30:26.560] You would think that, right?

[30:26.560 --> 30:28.280] Then they would want to talk more.

[30:28.280 --> 30:32.000] And that's exactly how it bore out with their, with their study.

[30:32.000 --> 30:37.760] People when told that the goal was to seem interesting, did talk more in the dyad because

[30:37.760 --> 30:40.240] they thought, or they, they selected that option.

[30:40.240 --> 30:44.120] I should talk more in the dyad because that will make me seem more interesting.

[30:44.120 --> 30:50.080] And then finally, if they wanted to just enjoy themselves in the conversation, what do you

[30:50.080 --> 30:51.080] think?

[30:51.080 --> 30:52.080] Don't talk at all.

[30:52.080 --> 30:53.080] Even.

[30:53.080 --> 30:54.080] Even.

[30:54.080 --> 30:55.080] Yeah.

[30:55.080 --> 30:56.080] 50 50.

[30:56.080 --> 30:57.080] So.

[30:57.080 --> 30:58.520] That's the most important question.

[30:58.520 --> 31:06.200] But what if your goal is to get laid? That probably is the most important.

[31:06.200 --> 31:07.560] That is their follow up study.

[31:07.560 --> 31:09.840] Well, you got to do all of them though.

[31:09.840 --> 31:10.840] You got to be liked.

[31:10.840 --> 31:15.480] You got to be interesting, and do it all at the same time.

[31:15.480 --> 31:16.480] And enjoy yourself.

[31:16.480 --> 31:17.480] Yeah.

[31:17.480 --> 31:19.400] So actually, interestingly, something will bear out from that.

[31:19.400 --> 31:24.120] So basically they, um, they did a couple of studies where they were forecasting.

[31:24.120 --> 31:27.320] So they, you know, this is very classic psychology study.

[31:27.320 --> 31:32.120] So they used Mechanical Turk and Qualtrics and they asked some questions.

[31:32.120 --> 31:34.880] Let me go to study one.

[31:34.880 --> 31:41.520] And they asked some questions where they imagined having a conversation with different conversational

[31:41.520 --> 31:43.320] prompts.

[31:43.320 --> 31:47.000] And then they were asked, you know, how much should they, how much did they think they

[31:47.000 --> 31:48.000] should talk?

[31:48.000 --> 31:50.240] Because remember, this is not about whether or not they did.

[31:50.240 --> 31:54.360] It's about their self kind of assessment of should they talk more, talk less.

[31:54.360 --> 31:55.840] And those were the outcomes, right?

[31:55.840 --> 32:03.520] I think it was, uh, let's see if the goal was to be liked, people said on average that

[32:03.520 --> 32:12.520] they should speak 43% of the time, whereas their partner should speak 56% of the time.

[32:12.520 --> 32:16.440] If the goal was to be interesting, they said they themselves should speak about 57 and

[32:16.440 --> 32:20.320] a half percent of the time, whereas their partner should speak 42% of the time.

[32:20.320 --> 32:24.740] And if the goal was to enjoy themselves, it was right around 50, 50.

[32:24.740 --> 32:30.960] And then they repeated the study and found very similar outcomes.

[32:30.960 --> 32:35.520] And then they actually kind of forced the hand and made people talk a certain amount

[32:35.520 --> 32:36.520] of time.

[32:36.520 --> 32:37.520] Right?

[32:37.520 --> 32:38.680] Like, of course that would be the followup study.

[32:38.680 --> 32:42.400] It was a little bit arbitrary and a little bit fake because, like, a computer

[32:42.400 --> 32:45.560] would cue them and say, talk now, now you talk, now you talk.

[32:45.560 --> 32:51.120] But it actually measured it out at these certain points, like 30%, 40%, 50%, 60% or 70% of

[32:51.120 --> 32:52.120] the time.

[32:52.120 --> 32:53.800] What do you think happened to that?

[32:53.800 --> 32:54.840] It worked.

[32:54.840 --> 32:55.840] What worked?

[32:55.840 --> 33:02.360] They were, they achieved the goal of like, you know, enjoying themselves more or being

[33:02.360 --> 33:03.360] liked more.

[33:03.360 --> 33:04.360] Interesting.

[33:04.360 --> 33:05.360] It worked.

[33:05.360 --> 33:06.360] Yeah.

[33:06.360 --> 33:10.680] So, so basically that the outcomes would have followed the intentions.

[33:10.680 --> 33:14.200] Like if they thought they wanted to be liked and they spoke less, then they were liked more.

[33:14.200 --> 33:16.160] No, the opposite.

[33:16.160 --> 33:17.160] They were wrong.

[33:17.160 --> 33:18.160] The opposite happened.

[33:18.160 --> 33:19.160] Yeah.

[33:19.160 --> 33:20.160] Their initial assessment was wrong.

[33:20.160 --> 33:21.160] It's kind of neither.

[33:21.160 --> 33:27.240] It sort of doesn't follow the most clear pattern, but it looks like it's a little bit bimodal.

[33:27.240 --> 33:32.760] The more important outcome that the researchers point to is something that they call, they

[33:32.760 --> 33:34.320] call it halo ignorance.

[33:34.320 --> 33:37.520] And to understand halo ignorance, you have to first understand the halo effect.

Halo Effect (33:27)

[33:37.520 --> 33:40.720] Do you guys, have you ever heard of the halo effect in social psychology?

[33:40.720 --> 33:41.720] Okay.

[33:41.720 --> 33:42.720] It's pretty interesting.

[33:42.720 --> 33:49.120] I'm going to describe it because I had experience demonstrating it when I worked on Brain Games.

[33:49.120 --> 33:54.480] So this was a really fun episode that we did where I got to go to this like fake art gallery

[33:54.480 --> 33:59.760] and I played the gallerist and opposite me was Colin Hanks.

[33:59.760 --> 34:01.200] You guys know Colin Hanks, right?

[34:01.200 --> 34:02.200] Tom Hanks' son.

[34:02.200 --> 34:03.200] Colin.

[34:03.200 --> 34:04.200] Yes.

[34:04.200 --> 34:05.200] Oh, with all the tattoos.

[34:05.200 --> 34:06.200] He was in The Offer.

[34:06.200 --> 34:07.200] He was in The Offer.

[34:07.200 --> 34:08.200] No, he doesn't have a lot of tattoos.

[34:08.200 --> 34:09.200] Oh, maybe the other son.

[34:09.200 --> 34:12.200] No, he's like a very famous actor who looks a lot like Tom Hanks.

[34:12.200 --> 34:13.200] I see.

[34:13.200 --> 34:14.200] Yeah.

[34:14.200 --> 34:15.200] Yeah.

[34:15.200 --> 34:16.200] And has been in, yeah, he was in Band of Brothers.

[34:16.200 --> 34:17.200] He was in a ton of stuff.

[34:17.200 --> 34:23.760] So he played himself in one paradigm where he had all, we set the art gallery up with

[34:23.760 --> 34:27.360] a bunch of art and in one paradigm he was like, Hey, I'm Colin Hanks.

[34:27.360 --> 34:29.160] I've decided to try my hand at art.

[34:29.160 --> 34:32.320] I'm really interested in you guys', you know, feedback, blah, blah, blah.

[34:32.320 --> 34:36.980] And then we got their, their feedback, not to his face where they talked about whether

[34:36.980 --> 34:39.060] or not they enjoyed his art.

[34:39.060 --> 34:44.100] And then in another paradigm, we dressed him up as this alter ego named Giannis Patch.

[34:44.100 --> 34:48.280] And he had like a soul patch and was dressed really weird and was like super aggressive

[34:48.280 --> 34:51.320] and like not nice to anybody.

[34:51.320 --> 34:54.600] Like he was very holier than thou and just played this awful character that nobody got

[34:54.600 --> 34:55.600] along with.

[34:55.600 --> 34:56.600] And what do you think happened?

[34:56.600 --> 35:01.360] They liked Colin Hanks' art and they hated Giannis Patch's art, even though it was the

[35:01.360 --> 35:02.360] same art.

[35:02.360 --> 35:03.360] Wow.

[35:03.360 --> 35:04.360] Right.

[35:04.360 --> 35:10.840] So the halo effect really is this idea that there are global features that come together.

[35:10.840 --> 35:14.040] So if you like the person, you like the person's art.

[35:14.040 --> 35:16.000] If you dislike the person, you dislike the person's art.

[35:16.000 --> 35:17.960] Those things aren't independent of each other.

[35:17.960 --> 35:21.120] And so what the researchers here are saying is that there's, there's probably something

[35:21.120 --> 35:22.400] called halo ignorance.

[35:22.400 --> 35:24.760] Most people aren't aware of the halo effect.

[35:24.760 --> 35:29.920] And so they don't realize that it's not so simple that I want to be liked, but I want

[35:29.920 --> 35:33.360] to seem, I want to be liked here and I want to seem important here.

[35:33.360 --> 35:35.780] They're not mutually exclusive.

[35:35.780 --> 35:40.600] Generally speaking, if somebody in a conversation has a positive effect on somebody, you're

[35:40.600 --> 35:42.640] going to like them and find them interesting.

[35:42.640 --> 35:47.400] If they have a negative effect, you're going to dislike them and find them disinteresting.

[35:47.400 --> 35:48.400] But most people-

[35:48.400 --> 35:50.400] And your breath smells too.

[35:50.400 --> 35:51.400] Exactly.

[35:51.400 --> 35:52.400] Okay.

[35:52.400 --> 35:58.140] So most people are ignorant of the halo effect, which is probably why they estimate that they

[35:58.140 --> 36:01.860] have to speak differently for different outcomes.

[36:01.860 --> 36:07.740] But the truth of the matter is that basically the big takeaway, because the, the final study

[36:07.740 --> 36:09.860] outcomes are a little all over the place.

[36:09.860 --> 36:10.860] They're not super clean.

[36:10.860 --> 36:14.080] Like the less you talk, the less you're liked, the more you talk, the more you're liked, it's

[36:14.080 --> 36:15.420] not really that clean.

[36:15.420 --> 36:23.360] It's basically that generally speaking, if you're in the 30% or 40% group, you're kind

[36:23.360 --> 36:26.080] of not liked or interesting.

[36:26.080 --> 36:31.160] Like if you don't talk that much, you don't get this good feedback.

[36:31.160 --> 36:36.880] So most people think I need to be quiet and be a good listener in a dyad and then I'm

[36:36.880 --> 36:38.600] going to be more well liked.

[36:38.600 --> 36:41.660] But the truth is that's, that doesn't bear out.

[36:41.660 --> 36:47.820] If you talk way too much, we start to see diminishing returns, so it's sort of somewhere

[36:47.820 --> 36:48.820] in the middle.

[36:48.820 --> 36:54.560] Kara, the halo effect reminds me of the Oscars because I, right?

[36:54.560 --> 37:00.820] I always had the feeling that like the, the movie that wins like best art direction.

[37:00.820 --> 37:04.800] Was that really the movie that had the best art direction or was that just the favorite

[37:04.800 --> 37:07.360] movie that was nominated for the, for best art?

[37:07.360 --> 37:08.360] Yeah.

[37:08.360 --> 37:09.360] What was it?

[37:09.360 --> 37:13.200] Was it one of the eight movies that got nominated for everything because there's like only certain

[37:13.200 --> 37:15.200] movies that were quote, Oscar worthy.

[37:15.200 --> 37:19.660] Did that movie really have the best editing and costuming and all the other little technical

[37:19.660 --> 37:22.760] things just because it was a popular movie?

[37:22.760 --> 37:23.760] You don't, yeah.

[37:23.760 --> 37:24.760] It just boggles the imagination.

[37:24.760 --> 37:25.760] Completely agree.

[37:25.760 --> 37:26.880] Completely agree.

[37:26.880 --> 37:35.400] I had a friend once who used to call the Oscars rich people prom, and never was there a truer statement.

[37:35.400 --> 37:42.840] The difference between a person being perceived as wanting to be liked versus that same person

[37:42.840 --> 37:46.800] being perceived as either polite or courteous.

[37:46.800 --> 37:50.240] Are we talking, are we splitting hairs, or are they two very different things?

[37:50.240 --> 37:51.240] It wasn't the paradigm.

[37:51.240 --> 37:52.240] So I can't tell you that.

[37:52.240 --> 37:53.240] You know what I mean?

[37:53.240 --> 37:54.840] They did not ask that question.

[37:54.840 --> 37:57.240] They asked three very simple questions.

[37:57.240 --> 37:58.240] Okay.

[37:58.240 --> 38:01.840] Do you want to be like, you know, if you want to be liked, how often will you talk?

[38:01.840 --> 38:03.880] If you want to seem interesting, how often will you talk?

[38:03.880 --> 38:05.720] If you want to enjoy the conversation, how often will you talk?

[38:05.720 --> 38:12.720] But I wonder if a person can confuse the desire to be liked with the desire to be accommodating

[38:12.720 --> 38:17.800] or you know, courteous to the other person by speaking less.

[38:17.800 --> 38:23.520] Is it, you know, can that be, can those two things be confused and therefore that's why

[38:23.520 --> 38:28.080] people would think that, yeah, I should probably talk less, not so much to be liked, but just

[38:28.080 --> 38:29.680] to show consideration.

[38:29.680 --> 38:30.680] Right.

[38:30.680 --> 38:33.840] But, but you're basically talking about constructs, right?

[38:33.840 --> 38:36.560] Like these aren't, these aren't actual things that exist.

[38:36.560 --> 38:39.880] Being liked is not specifically different from being courteous.

[38:39.880 --> 38:43.920] It's all in how you construct that reality, how you interpret it.

[38:43.920 --> 38:48.240] So if your definition of being liked is that you're very courteous, that's how you're

[38:48.240 --> 38:50.240] going to view that when you do this study.

[38:50.240 --> 38:51.240] Okay.

[38:51.240 --> 38:52.240] All right.

[38:52.240 --> 38:54.600] So you can totally interpret that differently.

[38:54.600 --> 38:56.680] And that's the interesting thing about psychology.

[38:56.680 --> 39:01.480] It's why we have to always operationally define everything because nothing, there's no like

[39:01.480 --> 39:04.960] fundamental truth to the idea of being liked.

[39:04.960 --> 39:05.960] It's how we define it.

[39:05.960 --> 39:06.960] Right.

[39:06.960 --> 39:07.960] Right.

[39:07.960 --> 39:08.960] Yes.

[39:08.960 --> 39:09.960] It does seem nebulous.

[39:09.960 --> 39:10.960] Yeah.

[39:10.960 --> 39:12.440] And there's probably, sorry, there's probably a ton of crossover there.

[39:12.440 --> 39:17.640] If I want to seem likable, I might have different factors that if I were to ask, if I were to

[39:17.640 --> 39:22.400] sit down in a separate study and the researchers were to say, what are the five features of

[39:22.400 --> 39:24.920] likability that are most important?

[39:24.920 --> 39:27.960] My list might be different than your list, but I wouldn't be surprised if there was a

[39:27.960 --> 39:30.680] lot of crossover among people.

[39:30.680 --> 39:35.680] And so courteousness is probably one of those things.

[39:35.680 --> 39:37.200] So I wouldn't say it's a conflation.

[39:37.200 --> 39:40.120] I would say it's part of it.

[39:40.120 --> 39:42.880] And the United States, is this a US study?

[39:42.880 --> 39:43.880] Yeah, absolutely.

[39:43.880 --> 39:44.880] This is a US study.

[39:44.880 --> 39:48.640] So we've got to remember that there's always a massive culture bias in these kinds of studies.

[39:48.640 --> 39:53.160] And this really only applies to the group of college students that they were looking

[39:53.160 --> 39:54.160] at.

[39:54.160 --> 39:59.680] But they do, like any good paper, cite the available research literature, show areas

[39:59.680 --> 40:05.120] where this reinforces things that have already been studied and basically move the needle

[40:05.120 --> 40:08.840] that much more because there is a body of literature around this.

[40:08.840 --> 40:14.000] But so the cool thing is basically the two big outcomes here are, one, that the reticence bias seems

[40:14.000 --> 40:15.640] to exist.

[40:15.640 --> 40:20.400] People generally think they are reticent to speak because they think that they will be

[40:20.400 --> 40:23.200] liked less when they speak more.

[40:23.200 --> 40:24.600] And that does not bear out.

[40:24.600 --> 40:29.240] The truth is you're actually liked more if you speak more up to a point.

[40:29.240 --> 40:30.320] And yeah, up to a point.

[40:30.320 --> 40:35.920] And then the other one, yeah, they did show that, but it's actually not a huge

[40:35.920 --> 40:40.160] effect. It does start to have diminishing returns at the 70% mark.

[40:40.160 --> 40:42.000] And they didn't look at anything over 70%.

[40:42.000 --> 40:46.240] They didn't look at the extremes like 90% of the time.

[40:46.240 --> 40:54.800] But interestingly, being interesting plateaus more. So basically, people are liked less, found less interesting,

[40:54.800 --> 40:59.120] and enjoy themselves less when they don't speak that much.

[40:59.120 --> 41:03.920] It all kind of peaks around the 50% so about when you're even and then you see a diminishing

[41:03.920 --> 41:08.920] return that's the heaviest on enjoyment.

[41:08.920 --> 41:15.920] So basically, you're seeing that there is a sort of bimodal distribution, but the researchers

[41:15.920 --> 41:19.960] make a kind of blanket statement, which I think is kind of a good statement, which is

[41:19.960 --> 41:22.840] that all things being equal, you should talk.

[41:22.840 --> 41:27.560] Because all things being equal, you're actually going to have a better outcome from talking

[41:27.560 --> 41:29.700] more than talking less.

[41:29.700 --> 41:34.440] And that flies in the face of most people's preconceived notions, which is often the case,

[41:34.440 --> 41:37.940] you know, people usually misjudge.

[41:37.940 --> 41:44.120] It's interesting how bad we are at intuitively like thinking about how our behavior affects

[41:44.120 --> 41:48.120] our relationships, you know, people do things that have the opposite effect of what they

[41:48.120 --> 41:49.120] want.

[41:49.120 --> 41:50.120] Absolutely.

[41:50.120 --> 41:51.120] Yeah.

[41:51.120 --> 41:52.120] People shoot themselves in the foot all the time.

[41:52.120 --> 41:57.040] It's also I mean, that's a fundamental part of exactly what Jay mentioned in the ad this

[41:57.040 --> 42:00.020] week of cognitive behavioral therapy.

[42:00.020 --> 42:06.360] It's recognizing these biases and recognizing all of the times that we we act in a way that

[42:06.360 --> 42:11.480] we think is going to achieve a goal when actually it gives us the opposite outcome or an outcome

[42:11.480 --> 42:13.160] that we weren't looking for.

[42:13.160 --> 42:14.160] Right.

[42:14.160 --> 42:17.320] That's why studies like this are important.

[42:17.320 --> 42:18.320] All right.

[42:18.320 --> 42:19.320] Thanks, Kara.

[42:19.320 --> 42:22.400] Well, everyone, we're going to take a quick break from our show to talk about our sponsor

[42:22.400 --> 42:23.880] this week, Wondrium.

[42:23.880 --> 42:26.600] Guys, we talk about therapy a lot on the show.

[42:26.600 --> 42:31.740] And I found a course on Wondrium that is called cognitive behavioral therapy techniques for

[42:31.740 --> 42:33.240] retraining your brain.

[42:33.240 --> 42:37.600] So it goes over the foundations of cognitive behavioral therapy, which, by the way, is

[42:37.600 --> 42:42.820] basically one of if not the best technique that you can learn in therapy to help yourself

[42:42.820 --> 42:46.080] get out of your anxiety and your depression.

[42:46.080 --> 42:48.780] They talk about setting your own goals.

[42:48.780 --> 42:50.800] They talk about dealing with stress.

[42:50.800 --> 42:54.760] They talk about dealing with anxiety and fear, how to treat your depression.

[42:54.760 --> 42:57.800] They have 24 chapters in this course.

[42:57.800 --> 42:59.760] And I really do recommend this course.

[42:59.760 --> 43:03.360] And get this, Wondrium will help you learn about pretty much anything you're curious

[43:03.360 --> 43:08.240] about from history to science to language to travel, even learning how to cook.

[43:08.240 --> 43:13.440] You get unlimited access to thousands of hours of trustworthy audio and video courses, documentaries,

[43:13.440 --> 43:15.520] tutorials and so much more.

[43:15.520 --> 43:19.840] And you can learn wherever and whenever that works for you because of the flexibility of

[43:19.840 --> 43:20.840] the platform.

[43:20.840 --> 43:25.640] We highly recommend signing up for Wondrium and Wondrium is offering our listeners a free

[43:25.640 --> 43:27.820] month of unlimited access.

[43:27.820 --> 43:31.120] Sign up today through our special URL to get this offer.

[43:31.120 --> 43:34.200] Go to Wondrium.com slash skeptics.

[43:34.200 --> 43:42.520] Again, that's W-O-N-D-R-I-U-M.com slash skeptics.

[43:42.520 --> 43:44.720] All right, guys, let's get back to the show.

EMDR (Eye Movement Desensitization and Reprocessing) (43:45)

  • [link_URL TITLE][4]

[43:44.720 --> 43:47.160] Guys, what do you know about EMDR?

[43:47.160 --> 43:49.520] Oh, EMDR.

[43:49.520 --> 43:50.520] That's a new boy band.

[43:50.520 --> 43:51.520] I know a lot about that.

[43:51.520 --> 43:52.520] South Korea.

[43:52.520 --> 43:55.520] Electromagnetic Dynamic Resonance?

[43:55.520 --> 43:56.520] What?

[43:56.520 --> 44:01.560] Eye movement and desensitization and reprocessing.

[44:01.560 --> 44:04.080] Eye movement, desensitization and reprocessing.

[44:04.080 --> 44:05.080] Oh, yeah, that too.

[44:05.080 --> 44:06.080] Yeah.

[44:06.080 --> 44:07.080] All right.

[44:07.080 --> 44:10.600] So, Kara, do you know when it was developed and how?

[44:10.600 --> 44:12.280] Maybe by the VA.

[44:12.280 --> 44:13.280] Probably not.

[44:13.280 --> 44:14.280] But they use it.

[44:14.280 --> 44:21.720] In 1987, a PhD psychologist, Francine Shapiro, was walking through the park when she realized

[44:21.720 --> 44:29.920] that her eye movements, looking around at the trees, reduced her anxiety and depression.

[44:29.920 --> 44:31.480] And that was it.

[44:31.480 --> 44:38.000] EMDR was born, a single subjective personal observation.

[44:38.000 --> 44:44.800] She said, maybe the eye movement itself is making me feel less anxious and depressed.

[44:44.800 --> 44:45.800] And that was it.

[44:45.800 --> 44:46.880] That's like the entire basis of it.

[44:46.880 --> 44:53.120] It did not come out of any basic science or any neuroscience or any thinking about how

[44:53.120 --> 44:55.080] the brain works or how depression works or anything.

[44:55.080 --> 44:58.080] It was just that naked observation.

[44:58.080 --> 45:05.360] Now that is reminiscent of a lot of medical pseudosciences where a single quirky observation

[45:05.360 --> 45:11.480] is the entire basis of the whole thing, like iridology and chiropractic, et cetera.

[45:11.480 --> 45:12.800] Yeah, so that was the origin.

[45:12.800 --> 45:14.200] Doesn't mean it doesn't work.

[45:14.200 --> 45:18.000] Is it possible that she made an observation that actually was based on reality?

[45:18.000 --> 45:20.520] Well, that's how science often is started.

[45:20.520 --> 45:21.520] Yeah, it's okay.

[45:21.520 --> 45:25.200] You make an observation, then you test it.

[45:25.200 --> 45:30.760] It's fine as a method of hypothesis generation, but not hypothesis testing, right?

[45:30.760 --> 45:35.040] You can't conclude that it's real because you made an anecdotal observation.

[45:35.040 --> 45:36.400] All right.

[45:36.400 --> 45:39.600] So that was, what, 35 years ago.

[45:39.600 --> 45:45.560] So there's been 35 years of research into EMDR, mainly for PTSD, post-traumatic stress

[45:45.560 --> 45:46.560] disorder.

[45:46.560 --> 45:51.360] That's the most common thing that it is used for and studied for, but also pretty much

[45:51.360 --> 45:52.360] everything.

[45:52.360 --> 45:56.920] It's been also studied for anxiety and depression and other things as well.

[45:56.920 --> 45:59.440] So what's the idea behind it?

[45:59.440 --> 46:05.080] Again, there really isn't anything very compelling in terms of what's the putative mechanism.

[46:05.080 --> 46:10.920] There have been probably dozens, hundreds of proposed possible mechanisms, but it all

[46:10.920 --> 46:15.560] is some version of, oh, you're sort of forcing the right half of the brain to communicate

[46:15.560 --> 46:20.280] with the left half, and there's something going on there, and you're rewiring the brain.

[46:20.280 --> 46:26.360] That's the reprocessing, the connection between the memory and the emotional feeling.

[46:26.360 --> 46:34.440] But it's all this made-up, hand-waving, very, I think, neurologically naive kind of statements,

[46:34.440 --> 46:38.400] and there's really no science behind it.

[46:38.400 --> 46:44.160] But again, irrelevant to the core question, or at least not irrelevant, but yeah, does

[46:44.160 --> 46:45.160] it work?

[46:45.160 --> 46:50.840] It sort of, it does impact how we address that question, but you could have sufficient

[46:50.840 --> 46:54.400] evidence that it works even if you don't know what the mechanism is and even if it was based

[46:54.400 --> 46:56.700] on a quirky observation.

[46:56.700 --> 47:02.320] So what's the research been into EMDR over the last 35 years?

[47:02.320 --> 47:08.360] So from where I'm sitting from like a psychologist in training who reads the APA literature,

[47:08.360 --> 47:13.260] blah, blah, blah, and I'll give you my, I'm going to approach this really quickly with

[47:13.260 --> 47:18.000] my skeptical hat, but also with my psychologist hat, and also I am in the process of co-editing

[47:18.000 --> 47:21.120] a volume about pseudoscience and therapy, so it's informed by that, too.

[47:21.120 --> 47:22.900] I've read some good chapters on this.

[47:22.900 --> 47:30.120] Number one, the evidence base shows that people who get EMDR have better outcomes than people

[47:30.120 --> 47:32.280] who don't get therapy.

[47:32.280 --> 47:36.040] It also shows that they have sometimes better outcomes than people who get certain types

[47:36.040 --> 47:42.680] of therapy, but from where I'm sitting and a lot of people who, I think it was Rosen

[47:42.680 --> 47:44.860] who dubbed this a purple hat therapy.

[47:44.860 --> 47:46.400] Did you come across that statement?

[47:46.400 --> 47:47.400] Yeah.

[47:47.400 --> 47:48.400] Yeah.

[47:48.400 --> 47:49.400] I love this, right?

[47:49.400 --> 47:55.800] The idea is, is it the EMDR or is it the fact that they're learning anxiety-reduction

[47:55.800 --> 47:56.800] skills?

[47:56.800 --> 48:01.420] Like if you taught somebody how to reduce their anxiety while driving and then you gave

[48:01.420 --> 48:05.600] them a purple hat and said, wear this while driving, then they finish driving and they

[48:05.600 --> 48:07.320] go, oh my God, I wasn't anxious.

[48:07.320 --> 48:09.200] Was it because of the purple hat?

[48:09.200 --> 48:10.200] Yeah.

[48:10.200 --> 48:12.600] So that's exactly what's going on here.

[48:12.600 --> 48:18.520] So Scott Lilienfeld, who was a skeptical psychologist, yeah, passed away a couple

[48:18.520 --> 48:19.520] years ago.

[48:19.520 --> 48:20.520] Yeah.

[48:20.520 --> 48:21.520] Yeah, unfortunately.

[48:21.520 --> 48:22.520] Died young, really tragic.

[48:22.520 --> 48:23.520] But great guy.

[48:23.520 --> 48:29.400] He did a very good review of the literature on EMDR a few years ago, and I liked the way

[48:29.400 --> 48:30.400] he summarized it.

[48:30.400 --> 48:35.080] All right, so first we could ask, does EMDR work better, this is like for PTSD, does it

[48:35.080 --> 48:38.840] work better than nothing, right, than no therapy?

[48:38.840 --> 48:40.520] And the answer is like, yeah, it clearly does.

[48:40.520 --> 48:47.760] But remember, EMDR is you're moving your eyes while you're doing essentially exposure therapy,

[48:47.760 --> 48:49.600] imaginal exposure therapy.

[48:49.600 --> 48:57.000] So it's not, in fact, Shapiro, you know, tried doing the eye movements by itself and it didn't

[48:57.000 --> 48:58.640] work at all.

[48:58.640 --> 49:03.320] So she had to combine it with, basically with cognitive behavioral therapy, and so now it

[49:03.320 --> 49:04.320] works.

[49:04.320 --> 49:08.360] When you combine it with this already established effective psychological treatment, it quote

[49:08.360 --> 49:09.360] unquote works.

[49:09.360 --> 49:10.360] All right, but he said-

[49:10.360 --> 49:13.560] How can you possibly have an effect size that's large enough to differentiate?

[49:13.560 --> 49:21.160] Yeah, so he said, in the literature, you compare EMDR to no therapy at all, and of course it

[49:21.160 --> 49:23.160] works, compared to nothing.

[49:23.160 --> 49:29.300] You can also compare it to an intervention, but the intervention itself is not effective.

[49:29.300 --> 49:35.000] So for example, it's often compared to passive listening, where you just have a therapist

[49:35.000 --> 49:40.640] going, mm-hmm, yeah, tell me more, mm-hmm, you know, not doing any therapy, just listening,

[49:40.640 --> 49:42.240] which isn't really an effective treatment.

[49:42.240 --> 49:43.720] And yeah, is it better than that?

[49:43.720 --> 49:44.720] Sure.

[49:44.720 --> 49:50.320] Is it better than doing the exact same thing, but without the eye movement?

[49:50.320 --> 49:52.560] No, it's not better than doing it.

[49:52.560 --> 49:58.040] And if you adequately control it, where you're doing, say, a fixed gaze therapy, as opposed

[49:58.040 --> 50:02.280] to eye movement therapy, but otherwise, cognitively, you're doing the exact same thing.

[50:02.280 --> 50:08.920] You're imagining the stressor, and also imagining yourself in a safe space at the same time,

[50:08.920 --> 50:09.920] whatever.

[50:09.920 --> 50:14.260] If you're going through the cognitive therapy, it's exactly the same.

[50:14.260 --> 50:21.600] So the EMDR, again, my purple hat phrase that I like to use is part of this nutritious

[50:21.600 --> 50:22.600] breakfast, right?

[50:22.600 --> 50:25.080] So you remember those commercials?

[50:25.080 --> 50:26.640] When served with a nutritious breakfast.

[50:26.640 --> 50:29.560] Yeah, so it was like selling Pop-Tarts or something.

[50:29.560 --> 50:34.480] It's like, this Danish is part of this, and they show a nutritious breakfast with orange

[50:34.480 --> 50:41.200] juice and whatever, and this tumor is part of a healthy body.

[50:41.200 --> 50:42.200] It's irrelevant.

[50:42.200 --> 50:45.000] It's an irrelevant part of this nutritious breakfast, but that's the thing.

[50:45.000 --> 50:51.500] It's a completely irrelevant superficial component of a treatment that is already established

[50:51.500 --> 50:55.400] as being effective, and it doesn't appear to add anything.

[50:55.400 --> 51:00.280] And so, and again, another critic who, again, I like this statement that I've applied this

[51:00.280 --> 51:06.600] to many other things, like, what is unique about EMDR doesn't work, and what works about

[51:06.600 --> 51:09.040] EMDR is not unique, right?

[51:09.040 --> 51:10.040] So it's like-

[51:10.040 --> 51:11.040] You said that a lot about chiropractic.

[51:11.040 --> 51:12.040] Chiropractic, yeah.

[51:12.040 --> 51:16.600] What chiropractors do that works isn't unique to chiropractors, and what is unique to chiropractors

[51:16.600 --> 51:17.600] doesn't work.

[51:17.600 --> 51:20.280] But in any case, so that's the case of EMDR.

[51:20.280 --> 51:24.120] It's essentially unnecessary, but here's the thing.

[51:24.120 --> 51:27.160] It's massively popular within psychotherapy.

[51:27.160 --> 51:29.080] It's so popular.

[51:29.080 --> 51:30.960] It's so frustrating.

[51:30.960 --> 51:40.640] It's so frustrating, because the APA has basically said it's an evidence-based treatment, because

[51:40.640 --> 51:44.420] there is some evidence to support it, but it's poor evidence.

[51:44.420 --> 51:47.800] So that is an indictment of EBM, right?

[51:47.800 --> 51:51.120] In my opinion, that's why it's not a science-based medicine treatment.

[51:51.120 --> 51:56.280] It may be an EBM treatment, but that's only because, and then even then I think it fails

[51:56.280 --> 52:03.000] the EBM standard, but essentially people exploit the weaknesses in evidence-based medicine

[52:03.000 --> 52:08.320] to say things like EMDR is quote-unquote evidence-based, because there is clinical evidence to show

[52:08.320 --> 52:14.520] that it quote-unquote works, but only when not comparing it to an adequate control, not

[52:14.520 --> 52:16.920] as an isolated variable.

[52:16.920 --> 52:18.760] It's exactly like acupuncture.

[52:18.760 --> 52:23.120] It only works when you're not isolating what acupuncture is, you know, the variable that

[52:23.120 --> 52:24.120] is acupuncture.

[52:24.120 --> 52:27.720] And do you know what the saddest part of all of this is, and it's a part we don't often

[52:27.720 --> 52:33.800] talk about on the show, is that the variable that works the most of almost any therapeutic

[52:33.800 --> 52:36.400] variable is relationship.

[52:36.400 --> 52:41.000] Yeah, it's the therapeutic relationship, yeah.

[52:41.000 --> 52:42.480] Here's the other way to look at this.

[52:42.480 --> 52:46.960] So EMDR, the evidence is really crappy.

[52:46.960 --> 52:51.360] I was recently reading a systematic review and they said they looked at like all of the

[52:51.360 --> 52:58.400] randomized controlled trials over whatever, you know, since forever, and the total number

[52:58.400 --> 53:02.960] of subjects in all of the studies that they were able to add into a meta-analysis was

[53:02.960 --> 53:03.960] like 420.

[53:03.960 --> 53:04.960] That's it?

[53:04.960 --> 53:05.960] That's it.

[53:05.960 --> 53:06.960] Oh, yeah.

[53:06.960 --> 53:07.960] That's terrible.

[53:07.960 --> 53:08.960] It's terrible.

[53:08.960 --> 53:16.840] Like, 35 years in, there should be thousands of patients in the meta-analysis of EMDR.

[53:16.840 --> 53:18.960] And then most of the studies are crap too.

[53:18.960 --> 53:22.740] Most of the studies are not well done; they're pragmatic studies, or they're not well controlled,

[53:22.740 --> 53:26.360] or they're just comparing it to no intervention or to inadequate intervention.

[53:26.360 --> 53:30.080] So the thing is they're doing studies that are like pragmatic studies that are not designed

[53:30.080 --> 53:33.760] to test whether or not it works, right?

[53:33.760 --> 53:37.280] They're not really doing efficacy trials and when they do, it doesn't work.

[53:37.280 --> 53:39.500] And they just sort of gloss over that.

[53:39.500 --> 53:43.480] And so a lot of people, when I talk about this kind of thing, like EMDR specifically

[53:43.480 --> 53:47.800] or similar things, they say, well, what's the problem, what's the harm?

[53:47.800 --> 53:53.420] It's gimmicky, it's superficial, but people feel that it works, it feels better, you know.

[53:53.420 --> 53:55.160] It waters down the entire system.

[53:55.160 --> 54:00.400] It makes me a less trustworthy practitioner because my field says this works.

[54:00.400 --> 54:01.400] Exactly.

[54:01.400 --> 54:06.880] So there is a lot of harm when you have a profession endorsing essentially pseudoscience

[54:06.880 --> 54:12.400] or at least these, you know, pop, you know, popular, what would you call it, like pop

[54:12.400 --> 54:18.960] psych or pop-whatever sort of bizarre or superficial notions of how the brain works.

[54:18.960 --> 54:19.960] Marketing scam.

[54:19.960 --> 54:20.960] Yeah.

[54:20.960 --> 54:27.200] So this basically completely feeds into snake oil industry, completely feeds into that.

[54:27.200 --> 54:30.560] Also it is a massive distraction.

[54:30.560 --> 54:35.080] There is a feedback loop between clinical studies and basic science research, right?

[54:35.080 --> 54:40.160] So if people are making up these weird, you know, notions about what's happening in the

[54:40.160 --> 54:46.680] brain, and then they say, and that's how EMDR works, then they then they falsely conclude

[54:46.680 --> 54:50.680] that EMDR works because of studies that don't show that it works, but they're misinterpreting

[54:50.680 --> 54:51.680] it.

[54:51.680 --> 54:56.520] Then they say that supports these bizarre neuroscience ideas that I have about how it's

[54:56.520 --> 54:57.680] working, right?

[54:57.680 --> 55:00.800] That's like saying, oh, we know qi exists because acupuncture works.

[55:00.800 --> 55:04.440] It's like, well, no, acupuncture doesn't work and there's no reason to think that qi

[55:04.440 --> 55:05.440] exists.

[55:05.440 --> 55:06.440] It's the same kind of thing.

[55:06.440 --> 55:08.700] It poisons the whole research paradigm.

[55:08.700 --> 55:14.960] And so, you know, mental health practice has a hard enough time grinding forward with really

[55:14.960 --> 55:17.980] rigorous science based modalities.

[55:17.980 --> 55:20.600] This kind of thing just makes it harder.

[55:20.600 --> 55:27.300] It's like throwing dirt in the gears of trying to move the whole field forward.

[55:27.300 --> 55:31.480] They really do need to be able to, plus they also need to recognize the research does not

[55:31.480 --> 55:33.840] show that this is a real phenomenon.

[55:33.840 --> 55:37.300] And if they think that it does, they don't know how to do or interpret research.

[55:37.300 --> 55:39.660] And that's the real problem.

[55:39.660 --> 55:46.560] The real problem here is that it is exposing a real problem in the understanding

[55:46.560 --> 55:48.600] of how clinical science works.

[55:48.600 --> 55:52.880] And in the discipline that I would argue that needs to understand it the best because it

[55:52.880 --> 55:54.140] is the hardest.

[55:54.140 --> 55:59.480] It is really hard to do good research when your outcomes are so subjective and so complicated

[55:59.480 --> 56:00.480] and interdependent.

[56:00.480 --> 56:06.240] It's almost like good psychology research, really good psychology research is some of

[56:06.240 --> 56:07.240] the best research.

[56:07.240 --> 56:08.240] Yes, totally.

[56:08.240 --> 56:12.900] Like we have such a good handle on sophisticated statistics because we're looking for tiny

[56:12.900 --> 56:16.960] outcome measures and we're looking for controlling variables.

[56:16.960 --> 56:21.220] It's not that hard to take a bunch of cloned animals and drop something in the water of

[56:21.220 --> 56:22.220] half of them.

[56:22.220 --> 56:23.220] Yeah.

[56:23.220 --> 56:26.000] Or send an electron through a detector a trillion times or whatever.

[56:26.000 --> 56:27.000] Exactly.

[56:27.000 --> 56:28.000] And I'm not saying that laboratory work isn't hard.

[56:28.000 --> 56:29.000] It's really freaking hard.

[56:29.000 --> 56:30.000] I did it for years.

[56:30.000 --> 56:33.560] But what I'm saying is you have to get real creative when you're working with human subjects

[56:33.560 --> 56:38.040] and when you're working, like you said, with these more subjective outcomes, you have to be that much more

[56:38.040 --> 56:39.040] rigorous.

[56:39.040 --> 56:40.040] Yeah, exactly.

[56:40.040 --> 56:42.680] There's a lot of unique challenges to doing this kind of research.

[56:42.680 --> 56:44.640] It has to be all the more rigorous.

[56:44.640 --> 56:46.680] And that's the EMDR.

[56:46.680 --> 56:51.880] The fact that it's able to thrive within this community, this profession is a problem.

[56:51.880 --> 56:56.000] And it's a big flashing light that we need to... A lot of celebrities have taken on to this

[56:56.000 --> 56:57.000] as well.

[56:57.000 --> 56:58.560] And that helps perpetuate the problem.

[56:58.560 --> 56:59.560] Yeah.

[56:59.560 --> 57:00.560] Yeah.

[57:00.560 --> 57:06.320] And sometimes these treatments do proliferate within sort of the fringe professionals.

[57:06.320 --> 57:09.960] But EMDR is really thriving within the mainstream of the profession.

[57:09.960 --> 57:10.960] I know.

[57:10.960 --> 57:11.960] It's really scary.

[57:11.960 --> 57:12.960] That's a real problem.

[57:12.960 --> 57:13.960] All right.

[57:13.960 --> 57:14.960] Let's move on.

How meteorites may have created the continents (57:14)

  • [link_URL TITLE][5]

[57:14.960 --> 57:18.840] Bob, tell us about how meteorites may have created the continents.

[57:18.840 --> 57:19.840] What's going on there?

[57:19.840 --> 57:20.840] Oh, yeah, right?

[57:20.840 --> 57:25.840] Scientists have revealed the best evidence yet that the Earth's continents formed from

[57:25.840 --> 57:29.260] meteorite impacts billions of years ago.

[57:29.260 --> 57:34.280] This is from the Nature paper called Giant Impacts and the Origin and Evolution of Continents

[57:34.280 --> 57:40.440] by Dr. Tim Johnson from Curtin School of Earth and Planetary Sciences in Australia.

[57:40.440 --> 57:41.440] Okay.

[57:41.440 --> 57:47.840] So now this idea or speculation that the force of meteorite impacts essentially planted the

[57:47.840 --> 57:52.560] seeds of Earth's continents, if you will, it's been bandied about for decades and nobody

[57:52.560 --> 57:53.560] ever told me.

[57:53.560 --> 57:54.620] I never heard of this.

[57:54.620 --> 57:57.040] But there was no good proof for it.

[57:57.040 --> 58:01.280] You know, just nothing really solid to back it up until now, apparently.

[58:01.280 --> 58:07.240] So this evidence comes from Pilbara Craton in Western Australia.

[58:07.240 --> 58:12.200] Now this area of Western Australia is one of the only two places on the Earth that has

[58:12.200 --> 58:18.220] pristine Earth crust from the Archean Epoch 3.6 to 2.7 billion years ago.

[58:18.220 --> 58:22.520] So this is the oldest crust on the planet that we have identified.

[58:22.520 --> 58:26.680] And it was during this ancient time that rocks themselves first formed at the beginning of

[58:26.680 --> 58:27.720] the Archean.

[58:27.720 --> 58:33.600] And when that period ended, almost three quarters of Earth's crust had been formed during that

[58:33.600 --> 58:34.600] time.

[58:34.600 --> 58:39.600] Now this is a very iron rich rock and it started forming before there was even any atmospheric

[58:39.600 --> 58:43.000] oxygen, before there was even life on Earth.

[58:43.000 --> 58:47.440] And later rocks from that time, though, have evidence of some of the earliest life ever

[58:47.440 --> 58:52.880] found 3.45 billion year old fossil colonies of microbial cyanobacteria.

[58:52.880 --> 58:59.040] Yeah, so a lot of important foundational things happening during the Archean Epoch.

[58:59.040 --> 59:04.700] Now the smoking gun in the rocks was the tiny crystals of the mineral zircon in the ancient

[59:04.700 --> 59:08.880] crust that show the scientists evidence of the ancient and huge meteorite impacts.

[59:08.880 --> 59:14.800] Dr. Johnson said, studying the composition of oxygen isotopes in these zircon crystals

[59:14.800 --> 59:19.280] reveal the top down process, starting with the melting of rocks near the surface and

[59:19.280 --> 59:24.640] progressing deeper, consistent with the geological effect of giant meteorite impacts.

[59:24.640 --> 59:28.520] Our research provides the first solid evidence that the process that ultimately formed the

[59:28.520 --> 59:32.040] continents began with giant meteorite impacts.

[59:32.040 --> 59:33.720] So that this is amazing.

[59:33.720 --> 59:39.100] I always thought that continents themselves just were formed purely from internal processes

[59:39.100 --> 59:43.000] of the Earth, you know, the formation and cooling of the Earth.

[59:43.000 --> 59:49.200] It's really it's interesting to think that, you know, that at least Earth perhaps needed

[59:49.200 --> 59:52.920] these meteorites to hit to really get this process going.

[59:52.920 --> 59:57.160] So does that lessen the possibility that life like Earth life can exist?

[59:57.160 --> 59:58.160] I don't know.

[59:58.160 --> 59:59.440] It kind of seems like it.

[59:59.440 --> 01:00:03.160] But then again, how common are those kinds of meteorite impacts?

[01:00:03.160 --> 01:00:07.600] So speaking of the impacts, they happened during an interesting period of Earth's early

[01:00:07.600 --> 01:00:10.880] history called the Late Heavy Bombardment.

[01:00:10.880 --> 01:00:14.760] You guys heard of that from 4.1 to 3.8 billion years ago.

[01:00:14.760 --> 01:00:19.480] Now the Late Heavy Bombardment is distinguished from the very early light bombardment and

[01:00:19.480 --> 01:00:23.780] especially the on time mediocre bombardment.

[01:00:23.780 --> 01:00:27.280] So please ignore that last sentence entirely.

[01:00:27.280 --> 01:00:33.000] So it was during the millions-of-years-long Late Heavy Bombardment that an

[01:00:33.000 --> 01:00:38.560] unusually large number of asteroids and meteoroids terrorized the inner solar system.

[01:00:38.560 --> 01:00:40.960] Now, of course, there's no evidence on the Earth of that, right?

[01:00:40.960 --> 01:00:43.920] Because of weathering and plate tectonics.

[01:00:43.920 --> 01:00:49.120] But the moon and Mercury still bear signs from billions of years ago of that bombardment

[01:00:49.120 --> 01:00:51.680] that are discernible even today.

[01:00:51.680 --> 01:00:52.680] So does this matter?

[01:00:52.680 --> 01:00:54.080] Well, it's science.

[01:00:54.080 --> 01:00:56.000] Of course it matters, right, Steve?

[01:00:56.000 --> 01:01:03.120] More specifically, continents are pretty important, even from a purely selfish, human centric

[01:01:03.120 --> 01:01:04.120] point of view.

[01:01:04.120 --> 01:01:09.760] Look, not only do we live on them, but 98 percent of the world's biomass is terrestrial.

[01:01:09.760 --> 01:01:15.920] So yeah, there's a lot going on and critically important, but not just for biomass.

[01:01:15.920 --> 01:01:21.680] On this point, Dr. Johnson chimes in and he says, not least, the continents host critical

[01:01:21.680 --> 01:01:27.840] metals such as lithium, tin and nickel, commodities that are essential to the emerging green technologies

[01:01:27.840 --> 01:01:31.480] needed to fulfill our obligation to mitigate climate change.

[01:01:31.480 --> 01:01:32.480] OK.

[01:01:32.480 --> 01:01:36.640] So as Dr. Johnson finishes his thought with this, he says these mineral deposits are the

[01:01:36.640 --> 01:01:42.720] end result of a process known as crustal differentiation, which began with the formation of the earliest

[01:01:42.720 --> 01:01:46.800] land masses of which Pilbara Craton is just one of many.

[01:01:46.800 --> 01:01:48.960] Now in the future, what's going to happen?

[01:01:48.960 --> 01:01:53.480] The future of this research is probably, as you might imagine, find stronger evidence,

[01:01:53.480 --> 01:01:54.480] right?

[01:01:54.480 --> 01:01:55.480] More evidence.

[01:01:55.480 --> 01:01:58.840] So what they're going to probably do is they're going to look at other ancient crusts, probably

[01:01:58.840 --> 01:02:04.640] the other oldest piece of crust from the Archean epoch that's in Africa, but I think other

[01:02:04.640 --> 01:02:08.720] crusts that are maybe not quite as old, but still quite old.

[01:02:08.720 --> 01:02:12.840] And they're going to apply their models to those crusts and see if

[01:02:12.840 --> 01:02:15.280] their models fit there as well.

[01:02:15.280 --> 01:02:21.240] And then the time may come in the not-too-distant future where it's common

[01:02:21.240 --> 01:02:27.040] knowledge and the consensus that the Earth's continents were created pretty much

[01:02:27.040 --> 01:02:31.360] directly because of meteorite impacts billions of years ago.

[01:02:31.360 --> 01:02:34.360] And who knows what Earth would be like if that didn't happen?

[01:02:34.360 --> 01:02:35.360] But really interesting.

[01:02:35.360 --> 01:02:36.360] Didn't know this.

[01:02:36.360 --> 01:02:40.680] Yeah, it's one of those interesting things, like we're not really sure how or why the continents

[01:02:40.680 --> 01:02:41.680] exist.

[01:02:41.680 --> 01:02:42.680] Yeah.

[01:02:42.680 --> 01:02:43.680] I mean, mysterious.

[01:02:43.680 --> 01:02:44.680] Yeah.

[01:02:44.680 --> 01:02:45.680] To a certain extent.

[01:02:45.680 --> 01:02:46.680] Yeah.

[01:02:46.680 --> 01:02:47.680] I didn't realize it.

[01:02:47.680 --> 01:02:48.680] All right.

[01:02:48.680 --> 01:02:49.680] Thanks, Bob.

[01:02:49.680 --> 01:02:50.680] Sure, man.

Special Segment: Death by Pseudoscience (1:02:47)

[01:02:50.680 --> 01:02:51.680] Evan, you're going to give us another installment of Death by Pseudoscience.

[01:02:51.680 --> 01:02:52.680] Indeed.

[01:02:52.680 --> 01:02:53.680] Yep.

[01:02:53.680 --> 01:02:57.440] So supplements and herbal remedies back in the headlines.

[01:02:57.440 --> 01:03:02.400] It was reported recently that the wife of a U.S. congressman died with the cause of

[01:03:02.400 --> 01:03:08.560] death being and this is a quote from the autopsy report, dehydration due to gastroenteritis

[01:03:08.560 --> 01:03:16.480] due to adverse effects of white mulberry leaf ingestion and gastroenteritis, am I pronouncing

[01:03:16.480 --> 01:03:17.480] that right?

[01:03:17.480 --> 01:03:18.480] Gastroenteritis.

[01:03:18.480 --> 01:03:19.480] Yeah, that's right.

[01:03:19.480 --> 01:03:22.780] Is an inflammation of the stomach and intestines.

[01:03:22.780 --> 01:03:28.120] Her name was Lori McClintock, 61 year old wife of U.S. Congressman Tom McClintock from

[01:03:28.120 --> 01:03:29.660] California.

[01:03:29.660 --> 01:03:32.400] This happened back in December of twenty twenty one.

[01:03:32.400 --> 01:03:36.840] Lori was found unresponsive in her locked residence by her husband, Tom.

[01:03:36.840 --> 01:03:41.480] The day prior, she had complaints of an upset stomach and the results of the autopsy were

[01:03:41.480 --> 01:03:44.400] finally just reported last week.

[01:03:44.400 --> 01:03:50.040] And this was an autopsy with toxicology testing that confirmed the cause of death. In her stomach,

[01:03:50.040 --> 01:03:56.260] they found a partially intact white mulberry leaf, and people who take white mulberry leaves

[01:03:56.260 --> 01:04:02.080] sometimes or more often brew them into a tea, you know, they'll drink the tea and perhaps

[01:04:02.080 --> 01:04:07.020] in the process they'll consume the liquid, of course, but some fragments or dregs go

[01:04:07.020 --> 01:04:08.460] in there as well.

[01:04:08.460 --> 01:04:12.640] Now it wasn't clear from the autopsy report whether she drank tea with white mulberry

[01:04:12.640 --> 01:04:18.420] leaves or ate fresh or dried leaves or took it as a dietary supplement containing the

[01:04:18.420 --> 01:04:20.160] leaf.

[01:04:20.160 --> 01:04:23.440] Didn't get into that level of detail in that particular report.

[01:04:23.440 --> 01:04:27.980] But independent lab tests ordered by the coroner's office showed that the body had elevated levels

[01:04:27.980 --> 01:04:33.800] of nitrogen, sodium and creatinine, all signs of dehydration, according to three pathologists

[01:04:33.800 --> 01:04:36.440] who reviewed the coroner's documents.

[01:04:36.440 --> 01:04:43.780] Dr. DeMichelle Depree, who is a retired forensic pathologist and a former medical examiner

[01:04:43.780 --> 01:04:46.860] in South Carolina, also reviewed the documents.

[01:04:46.860 --> 01:04:51.160] She said white mulberry leaves do tend to cause dehydration, and part of the

[01:04:51.160 --> 01:04:55.940] uses for that can be to help some people lose weight, mostly through fluid loss, which in

[01:04:55.940 --> 01:04:57.760] this case was excessive.

[01:04:57.760 --> 01:05:01.920] Allison Colwell, who's a curator with the UC Davis Center for Plant Diversity at the

[01:05:01.920 --> 01:05:08.120] University of California, Davis, wrote a letter in December of 2021, about a week after Lori

[01:05:08.120 --> 01:05:12.880] had died, to the Sacramento County coroner's office and said, in comparing

[01:05:12.880 --> 01:05:17.520] this leaf fragment to fresh leaves and to our extensive library of pressed specimens,

[01:05:17.520 --> 01:05:21.960] we determined that this leaf fragment is a match to Morus alba, the white mulberry.

[01:05:21.960 --> 01:05:26.020] So it was definitely confirmed that that is what they found in her stomach.

[01:05:26.020 --> 01:05:32.400] Now, right on the heels of this report came the arguments against the findings; the proponents

[01:05:32.400 --> 01:05:36.800] of herbs and supplements all questioned these findings.

[01:05:36.800 --> 01:05:43.800] For example, Daniel Fabricant, CEO and president of the Natural Products Association, who represents

[01:05:43.800 --> 01:05:48.680] the dietary supplements industry and oversaw dietary supplements at the FDA during the

[01:05:48.680 --> 01:05:53.800] Obama administration, questioned whether her death was related to this supplement.

[01:05:53.800 --> 01:05:55.440] He said, it's completely speculative.

[01:05:55.440 --> 01:05:57.380] He said, there's a science to this.

[01:05:57.380 --> 01:06:00.280] It's not just what the coroner feels.

[01:06:00.280 --> 01:06:03.440] People unfortunately pass from dehydration every day and there's lots of different reasons

[01:06:03.440 --> 01:06:05.340] and lots of different causes.

[01:06:05.340 --> 01:06:10.960] Another person, Pieter Cohen, a doctor, associate professor at Harvard Medical School in Boston

[01:06:10.960 --> 01:06:15.080] who leads the dietary supplement research program at Cambridge.

[01:06:15.080 --> 01:06:18.400] After reading the actual report, which clearly states that it's the cause, I still don't

[01:06:18.400 --> 01:06:20.960] believe that's an accurate description of what happened.

[01:06:20.960 --> 01:06:24.840] If you eat any leaves, they'll upset your stomach, but they will not kill you.

[01:06:24.840 --> 01:06:28.160] And here's another one, Bill Gurley, who's the principal scientist with the National

[01:06:28.160 --> 01:06:33.720] Center for Natural Products Research at the University of Mississippi School of Pharmacy

[01:06:33.720 --> 01:06:36.280] called the coroner's report bizarre.

[01:06:36.280 --> 01:06:41.360] He said he's looked at a lot of botanical toxicology reports over the years and in the

[01:06:41.360 --> 01:06:45.640] pantheon of those reports, I would say Mulberry ranks right up there as among the top three

[01:06:45.640 --> 01:06:49.600] or four safest botanicals that you will ever run across.

[01:06:49.600 --> 01:06:55.100] So you definitely had the people coming out sort of in defense of the industry because

[01:06:55.100 --> 01:07:00.720] as you can imagine, there's also a lot of people who were after this got reported, were

[01:07:00.720 --> 01:07:06.720] bringing up the fact that, and bringing attention to quite correctly, that the supplement industry

[01:07:06.720 --> 01:07:10.440] in the United States is woefully under regulated.

[01:07:10.440 --> 01:07:15.840] And we're talking about, they said it's grown to a $54 billion annual industry

[01:07:15.840 --> 01:07:19.120] in the United States alone.

[01:07:19.120 --> 01:07:27.540] And with all of that going on, the lack of oversight and the lack of agencies

[01:07:27.540 --> 01:07:33.460] ensuring that these products are safe is really a serious problem.

[01:07:33.460 --> 01:07:35.760] There are some laws that are being proposed.

[01:07:35.760 --> 01:07:42.240] In fact, there is one in particular that was raised in recent months, Senator Richard Durbin

[01:07:42.240 --> 01:07:46.120] from Illinois is co-sponsoring a bill with a Republican senator.

[01:07:46.120 --> 01:07:52.320] I'm sorry, I don't have his name handy, but he said he's introducing legislation to strengthen

[01:07:52.320 --> 01:07:55.160] the oversight of the dietary supplement industry.

[01:07:55.160 --> 01:07:58.560] It would require manufacturers to register with the FDA and provide a public list of

[01:07:58.560 --> 01:08:03.400] ingredients in their products, which are provisions that are backed by the Council for Responsible

[01:08:03.400 --> 01:08:04.400] Nutrition.

[01:08:04.400 --> 01:08:08.320] And he goes into a description about all of that.

[01:08:08.320 --> 01:08:14.240] So it sort of brought all this to the forefront, this particular incident and a lot of news

[01:08:14.240 --> 01:08:18.080] reporting and commentary from both sides of the argument here.

[01:08:18.080 --> 01:08:23.860] Yeah, I mean, I think the takeaway for the average person is that supplements are drugs.

[01:08:23.860 --> 01:08:27.680] They're not like this magically safe because they're quote unquote natural things.

[01:08:27.680 --> 01:08:31.960] And the mulberry leaves are known to cause GI symptoms.

[01:08:31.960 --> 01:08:34.680] They can cause constipation, bloating, et cetera.

[01:08:34.680 --> 01:08:40.240] I think this level of reaction may be unusual, but not implausible.

[01:08:40.240 --> 01:08:42.320] Maybe she had an allergic reaction to it.

[01:08:42.320 --> 01:08:48.080] It's hard to say, but they shouldn't be considered to be magically safe because they're natural.

[01:08:48.080 --> 01:08:50.320] They are drugs.

[01:08:50.320 --> 01:08:55.800] Mulberry in particular has a very nasty taste to it, which is an evolutionary signal that

[01:08:55.800 --> 01:08:57.600] it's poison.

[01:08:57.600 --> 01:09:00.680] So asparagus is poison?

[01:09:00.680 --> 01:09:01.680] Everything is poison.

[01:09:01.680 --> 01:09:07.960] Pretty much every plant that we eat, plants make poisons to protect themselves and we

[01:09:07.960 --> 01:09:14.480] bred plants, cultivated plants to not protect themselves so that we can eat them.

[01:09:14.480 --> 01:09:18.020] And then that's why we have to carefully take care of them because otherwise, you ever wonder

[01:09:18.020 --> 01:09:21.440] why weeds grow so easily in your garden and it's hard to keep your plants alive?

[01:09:21.440 --> 01:09:24.440] That's because weeds are poisonous and they protect themselves and the plants that we

[01:09:24.440 --> 01:09:27.600] eat are vulnerable because we bred them to be vulnerable.

[01:09:27.600 --> 01:09:32.920] But in any case, the other thing is not only are they drugs, they're just massively understudied

[01:09:32.920 --> 01:09:34.440] and poorly regulated.

[01:09:34.440 --> 01:09:37.600] So these kinds of things absolutely can happen.

[01:09:37.600 --> 01:09:40.960] Not to mention the interactions with existing medications that otherwise a doctor might

[01:09:40.960 --> 01:09:41.960] be prescribing.

[01:09:41.960 --> 01:09:42.960] Exactly.

[01:09:42.960 --> 01:09:45.780] They can have drug interactions, absolutely.

[01:09:45.780 --> 01:09:48.000] And then a lot of them do, and we know that they do.

[01:09:48.000 --> 01:09:52.760] So you're just taking a poorly regulated, dirty drug about which we don't have a lot

[01:09:52.760 --> 01:09:54.340] of information.

[01:09:54.340 --> 01:09:58.340] The probability that's going to help you is vanishingly small compared to the probability

[01:09:58.340 --> 01:10:03.780] that it's going to hurt you, although the most likely outcome is probably nothing.

[01:10:03.780 --> 01:10:08.200] The biggest thing about them is that they have very low bioavailability.

[01:10:08.200 --> 01:10:11.280] You're probably going to get like a tummy ache is probably the most common thing that's

[01:10:11.280 --> 01:10:12.940] going to happen to you.

[01:10:12.940 --> 01:10:17.640] But it's a roll of the dice, and again, chances are pretty low that it's going to actually

[01:10:17.640 --> 01:10:19.680] be helpful.

[01:10:19.680 --> 01:10:26.220] And using ancient anecdotal evidence is not very reassuring.

[01:10:26.220 --> 01:10:31.060] The things that have obvious effects we already are exploiting and have turned into regulated

[01:10:31.060 --> 01:10:32.920] drugs.

[01:10:32.920 --> 01:10:37.840] And everything else is basically just plants.

[01:10:37.840 --> 01:10:42.480] Anyway, it's a good reminder of those basic things.

Who's That Noisy? (1:10:42)

J: ... similar to English's "Buffalo buffalo Buffalo buffalo buffalo [+ 3 'buffalos']"

...

C: (sing-song) Homonymy![note 1]

New Noisy (1:14:49)

[musical boings and dings]

J: ... If you think you know the answer or you have a cool Noisy you heard this week, you can email me at WTN@theskepticsguide.org.

[01:10:42.480 --> 01:10:44.840] All right, Jay, it's who's that noisy time.

[01:10:44.840 --> 01:10:59.360] All right, guys, last week I played this noisy.

[01:10:59.360 --> 01:11:03.760] And it goes on and on and on and on.

[01:11:03.760 --> 01:11:08.920] And that's a case where if the person speaks more, actually, they'll be more liked.

[01:11:08.920 --> 01:11:09.920] Exactly.

[01:11:09.920 --> 01:11:13.200] So yeah, it's good that it goes on like that.

[01:11:13.200 --> 01:11:14.200] Do you guys have any idea?

[01:11:14.200 --> 01:11:16.200] I really have no idea what this is.

[01:11:16.200 --> 01:11:17.200] No, I don't.

[01:11:17.200 --> 01:11:19.960] It's weird, whatever it is.

[01:11:19.960 --> 01:11:20.960] It's weird.

[01:11:20.960 --> 01:11:21.960] It is definitely weird.

[01:11:21.960 --> 01:11:26.120] So a listener named Tom Patterson wrote in and said, hello, fellow skeptics.

[01:11:26.120 --> 01:11:31.960] This noisy is the effect of putting a nonsense phrase made of actual words into Google Translate.

[01:11:31.960 --> 01:11:35.800] The result sounds silly because the words all have similar sound when translated.

[01:11:35.800 --> 01:11:37.880] So that's not correct.

[01:11:37.880 --> 01:11:43.880] And I would think that it isn't Google Translate because how would you be playing it?

[01:11:43.880 --> 01:11:46.680] Like, what is the mechanism to play what you've translated?

[01:11:46.680 --> 01:11:47.680] Do you know what I mean?

[01:11:47.680 --> 01:11:51.520] Like, you can put something in and get a nonsense response out of Google Translate, but how

[01:11:51.520 --> 01:11:52.760] would you vocalize that?

[01:11:52.760 --> 01:11:55.640] I wonder if you could vocalize Google Translate.

[01:11:55.640 --> 01:11:59.720] Another listener named Brian Jackson wrote in and said, I'm fairly certain that what

[01:11:59.720 --> 01:12:05.880] is being said in the noisy is a specific Mandarin Chinese tongue twister similar to English's

[01:12:05.880 --> 01:12:08.560] buffalo, buffalo, buffalo, buffalo, buffalo.

[01:12:08.560 --> 01:12:12.480] However, I have absolutely no idea what is producing the sound.

[01:12:12.480 --> 01:12:16.040] So that is a definite step in the right direction.

[01:12:16.040 --> 01:12:20.040] I have another guest here from a listener named Matt Nichols that said, love you guys,

[01:12:20.040 --> 01:12:21.040] love the show.

[01:12:21.040 --> 01:12:24.640] This week's noisy is definitely a villager from Minecraft.

[01:12:24.640 --> 01:12:28.820] The sound is a little off, so I'll assume it's poisoned or being hit by a zombie.

[01:12:28.820 --> 01:12:33.840] So without a doubt, that is indeed what the villagers of Minecraft sound like, but that's

[01:12:33.840 --> 01:12:34.960] not what the noisy is.

[01:12:34.960 --> 01:12:38.120] If you know Minecraft, you know that that is what they sound like, which is kind of

[01:12:38.120 --> 01:12:39.120] funny.

[01:12:39.120 --> 01:12:45.200] I had a couple of people get it. Well, first off, about four or five dozen people answered

[01:12:45.200 --> 01:12:46.520] correctly today.

[01:12:46.520 --> 01:12:47.720] I can't believe it.

[01:12:47.720 --> 01:12:49.320] I never heard this thing before.

[01:12:49.320 --> 01:12:50.720] I had no idea what it was.

[01:12:50.720 --> 01:12:55.040] I had literally zero information in my head about it, but tons of people recognize this

[01:12:55.040 --> 01:12:57.360] right out and got it perfectly correct.

[01:12:57.360 --> 01:13:02.900] So the first person that got it right named Chubby from Romania said, hello Jay, this

[01:13:02.900 --> 01:13:08.000] week's noisy is the Chinese tongue-twisting poem called The Lion Eating Poet in the Stone

[01:13:08.000 --> 01:13:09.000] Den.

[01:13:09.000 --> 01:13:15.280] And he says it is a playful tongue twister based on Chinese homonymy.

[01:13:15.280 --> 01:13:16.280] I like it.

[01:13:16.280 --> 01:13:17.280] I like it.

[01:13:17.280 --> 01:13:18.280] That's good.

[01:13:18.280 --> 01:13:19.280] Homonymy.

[01:13:19.280 --> 01:13:23.000] It is a playful tongue twister based on Chinese homonymy and tonality.

[01:13:23.000 --> 01:13:24.080] So that is correct.

[01:13:24.080 --> 01:13:26.240] There's a lot more details that I can give you guys.

[01:13:26.240 --> 01:13:32.600] Another listener wrote in named Colin Flahive, and he's an American who lives in Southwest

[01:13:32.600 --> 01:13:33.600] China.

[01:13:33.600 --> 01:13:36.480] He says he listens to the show every Sunday morning.

[01:13:36.480 --> 01:13:38.520] And he also guessed correctly this week.

[01:13:38.520 --> 01:13:41.280] So let me give you some details about what this thing is.

[01:13:41.280 --> 01:13:45.180] The person who sent it in, Joshua Twilley, wrote, Dear Jay, what you are hearing in this

[01:13:45.180 --> 01:13:49.820] audio clip is a poem called Lion Eating Poet in the Stone Den written by Chinese linguist

[01:13:49.820 --> 01:13:51.440] Yuen Ren Chao.

[01:13:51.440 --> 01:13:55.680] What makes this poem unique is that every single one of the Mandarin characters used

[01:13:55.680 --> 01:14:00.120] in its composition is a homophone pronounced shi.

[01:14:00.120 --> 01:14:04.720] The only variation in the sound of the characters used is their tones.

[01:14:04.720 --> 01:14:09.160] While tones are very important to any tonal language in identifying the meaning of different

[01:14:09.160 --> 01:14:13.960] words, deciphering the meaning of this poem by sound alone is nearly impossible.

[01:14:13.960 --> 01:14:18.480] If you played this recording to a native speaker, they would find it incomprehensible.

[01:14:18.480 --> 01:14:21.040] Yet written down, the poem is coherent.

[01:14:21.040 --> 01:14:22.040] Oh, cool.

[01:14:22.040 --> 01:14:23.040] Yeah, very cool.

[01:14:23.040 --> 01:14:27.140] Like, you know, he also included the poem itself.

[01:14:27.140 --> 01:14:28.800] But let me play it for you again.

[01:14:28.800 --> 01:14:38.480] This is what the poem sounds like when read.

[01:14:38.480 --> 01:14:41.720] So anyway, that's crazy stuff, man.

[01:14:41.720 --> 01:14:45.560] A poem with one word all pronounced differently.

[01:14:45.560 --> 01:14:46.560] I love it.

[01:14:46.560 --> 01:14:49.080] That is a who's that noisy if I ever heard one.
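
For readers curious how the homophony described above works in practice, here is a minimal sketch in Python, assuming the third-party pypinyin package is installed (pip install pypinyin); it romanizes the poem's Chinese title, 施氏食狮史, and shows every character coming out as the same syllable "shi", differing only in tone.

  # Romanize the title of "Lion-Eating Poet in the Stone Den" to show the homophony.
  from pypinyin import pinyin, Style  # third-party pinyin library (assumption: installed)

  title = "施氏食狮史"
  # Style.TONE3 appends the tone as a trailing digit, e.g. 'shi1', 'shi4'
  syllables = [s[0] for s in pinyin(title, style=Style.TONE3)]
  print(syllables)
  # Strip the tone digit: every character is the bare syllable "shi"
  assert all(s.rstrip("12345") == "shi" for s in syllables)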

[01:14:49.080 --> 01:14:51.460] I have a new noisy for you guys this week.

[01:14:51.460 --> 01:14:56.600] This noisy was sent in by a listener named Anthony from Edmonton.

[01:14:56.600 --> 01:15:12.460] And here it is.

[01:15:12.460 --> 01:15:14.440] So I would like you to be very specific.

[01:15:14.440 --> 01:15:16.440] Yeah, it's very cool sound.

[01:15:16.440 --> 01:15:17.680] Be very specific.

[01:15:17.680 --> 01:15:21.560] I will only accept perfectly correct answers on this one.

[01:15:21.560 --> 01:15:24.720] If you think you know the answer or you have a cool noisy you heard this week, you can

[01:15:24.720 --> 01:15:28.040] email me at WTN@theskepticsguide.org.

Announcements (1:15:29)

[01:15:28.040 --> 01:15:29.040] Steve.

[01:15:29.040 --> 01:15:30.040] Yes, brother.

[01:15:30.040 --> 01:15:38.600] We have six hours, six hours of SGU content coming up for free on September 24th, 12 p.m.

[01:15:38.600 --> 01:15:41.980] Eastern to 6 p.m. Eastern time.

[01:15:41.980 --> 01:15:43.520] That is a Saturday.

[01:15:43.520 --> 01:15:44.520] We will be talking.

[01:15:44.520 --> 01:15:45.520] We will.

[01:15:45.520 --> 01:15:46.520] A few things are happening.

[01:15:46.520 --> 01:15:49.520] One, we are going to introduce everybody to our new book.

[01:15:49.520 --> 01:15:53.000] The name of the book is The Skeptics' Guide to the Future.

[01:15:53.000 --> 01:15:56.360] You know, you must have heard us talk about this before, but if you haven't, the book

[01:15:56.360 --> 01:16:00.520] is about the technologies that exist today that are important to us today.

[01:16:00.520 --> 01:16:01.720] But where did they come from?

[01:16:01.720 --> 01:16:02.720] How did they start?

[01:16:02.720 --> 01:16:06.640] What's the history of those technologies and where do we think that they'll be going in

[01:16:06.640 --> 01:16:07.640] the future?

[01:16:07.640 --> 01:16:09.560] It was a ton of fun to write.

[01:16:09.560 --> 01:16:12.300] And if you want to learn more about the book, please join us.

[01:16:12.300 --> 01:16:13.860] Something cool will be happening on that day.

[01:16:13.860 --> 01:16:20.180] We will be giving away a signed copy of the book and, you know, quite a bit of swag all

[01:16:20.180 --> 01:16:22.100] going to one person.

[01:16:22.100 --> 01:16:24.780] You have a chance to win if you listen to the live stream.

[01:16:24.780 --> 01:16:28.800] We will be giving out a secret password that we will say during the live stream that you

[01:16:28.800 --> 01:16:33.600] can enter in to enter yourself into the free signed book giveaway.

[01:16:33.600 --> 01:16:35.140] We don't do this that often.

[01:16:35.140 --> 01:16:36.140] It's a big deal.

[01:16:36.140 --> 01:16:38.240] I hope you join us for it.

[01:16:38.240 --> 01:16:41.060] We're really going to enjoy doing that six hours, guys.

[01:16:41.060 --> 01:16:42.060] We could do it.

[01:16:42.060 --> 01:16:43.060] We've done so much more.

[01:16:43.060 --> 01:16:46.440] We'll also be doing two SGU episodes.

[01:16:46.440 --> 01:16:49.520] It'll provide the content for two episodes and we're going to organize it that way.

[01:16:49.520 --> 01:16:53.420] So there'll be like two intros to science or fiction, et cetera, et cetera.

[01:16:53.420 --> 01:16:54.920] A couple more things.

[01:16:54.920 --> 01:16:59.040] We have shows happening in Arizona in December.

[01:16:59.040 --> 01:17:05.960] So December 15th to December 17th, we will be doing four different shows, you know, two

[01:17:05.960 --> 01:17:06.960] private shows.

[01:17:06.960 --> 01:17:10.640] This is SGU private recording where you can listen to us do the show live and then we

[01:17:10.640 --> 01:17:15.140] spend an extra hour with the audience doing some fun stuff.

[01:17:15.140 --> 01:17:21.600] So one of those will be happening in Phoenix on Thursday, the 15th of December.

[01:17:21.600 --> 01:17:27.080] And we will be doing another one in Tucson at noon on Saturday, the 17th.

[01:17:27.080 --> 01:17:30.020] And then we also will be doing two extravaganzas.

[01:17:30.020 --> 01:17:31.440] These are our stage shows.

[01:17:31.440 --> 01:17:33.440] This is very different than the private recording.

[01:17:33.440 --> 01:17:39.200] This is a completely different thing where we basically do a lot of improv stand up and

[01:17:39.200 --> 01:17:42.280] we discuss how your brain can fool you.

[01:17:42.280 --> 01:17:46.300] So it's mixed in throughout this whole show, we teach you about how your brain fools you

[01:17:46.300 --> 01:17:49.880] and how you can't trust what you see and hear.

[01:17:49.880 --> 01:17:51.740] That is called the extravaganza.

[01:17:51.740 --> 01:17:56.680] And you can go to theskepticsguide.org/events for all the details on these

[01:17:56.680 --> 01:18:00.200] four shows, two in Phoenix, two in Tucson.

[01:18:00.200 --> 01:18:01.200] Join us.

[01:18:01.200 --> 01:18:02.200] One more thing.

[01:18:02.200 --> 01:18:06.460] We are adding some benefits to our higher Patreon levels.

[01:18:06.460 --> 01:18:12.920] The two highest levels, the Medikog and the Luxotic for existing and new members, we will

[01:18:12.920 --> 01:18:19.760] be sending you a signed and personalized hard copy of our new book, The Skeptics' Guide to

[01:18:19.760 --> 01:18:20.960] the Future.

[01:18:20.960 --> 01:18:23.600] So keep an eye out for that as well.

Science or Fiction (1:18:27)

Theme: Social Psychology

Item #1: A recent study finds that positive fortune-telling results in increased financial risk-taking for men but not for women.[6]
Item #2: A study of 5-years-olds finds that they perceive overweight people to be happier than thin people.[7]
Item #3: A study of college students finds that mask-wearing does not impair social interactions.[8]

Answer Item
Fiction Overweight happier than thin
Science Risk-taking men vs. women
Science Mask-wearing impairs not
Host Result
Steve win
Rogue Guess
Bob Mask-wearing impairs not
Jay Mask-wearing impairs not
Evan Overweight happier than thin
Cara Overweight happier than thin

Voice-over: It's time for Science or Fiction.

Bob's Response

Jay's Response

Evan's Response

Cara's Response

Steve Explains Item #1

Steve Explains Item #2

Steve Explains Item #3

[01:18:23.600 --> 01:18:29.800] All right, everyone, let's go on with science or fiction.

[01:18:29.800 --> 01:18:39.680] It's time for science or fiction.

[01:18:39.680 --> 01:18:43.880] Each week I come up with three science news items, two real and one fake, and then I challenge

[01:18:43.880 --> 01:18:46.760] my panel of skeptics to tell me which one is the fake.

[01:18:46.760 --> 01:18:48.920] You have a theme this week.

[01:18:48.920 --> 01:18:52.480] It's three news items, but they all happen to cluster within a theme.

[01:18:52.480 --> 01:18:54.640] The theme is social psychology.

[01:18:54.640 --> 01:18:56.640] Oh, no.

[01:18:56.640 --> 01:18:57.640] Okay.

[01:18:57.640 --> 01:19:00.240] Are you guys ready?

[01:19:00.240 --> 01:19:01.240] Yeah.

[01:19:01.240 --> 01:19:02.240] All right.

[01:19:02.240 --> 01:19:07.800] Item number one, a recent study finds that positive fortune telling results in increased

[01:19:07.800 --> 01:19:12.440] financial risk taking for men, but not for women.

[01:19:12.440 --> 01:19:17.000] Item number two, a study of five year olds finds that they perceive overweight people

[01:19:17.000 --> 01:19:20.580] to be happier than thin people.

[01:19:20.580 --> 01:19:25.920] And item number three, a study of college students finds that mask wearing does not

[01:19:25.920 --> 01:19:28.680] impair social interactions.

[01:19:28.680 --> 01:19:29.680] Bob go first.

[01:19:29.680 --> 01:19:30.680] No.

[01:19:30.680 --> 01:19:31.680] Okay.

[01:19:31.680 --> 01:19:37.480] So a recent study finds that positive fortune telling results in increased financial risk

[01:19:37.480 --> 01:19:39.680] taking for men, but not for women.

[01:19:39.680 --> 01:19:41.160] Yeah, that just sounds right.

[01:19:41.160 --> 01:19:43.760] Yeah, you know, men are stupid.

[01:19:43.760 --> 01:19:49.640] A study of five year olds find that they perceive overweight people to be happier than thin

[01:19:49.640 --> 01:19:50.640] people.

[01:19:50.640 --> 01:19:51.640] Yeah, that sounds right too.

[01:19:51.640 --> 01:19:52.640] All right.

[01:19:52.640 --> 01:19:55.560] A study of college students find that mask wearing does not impair social interactions.

[01:19:55.560 --> 01:19:57.480] Yeah, that sounds right too.

[01:19:57.480 --> 01:19:59.360] So I'll say that three is fiction.

[01:19:59.360 --> 01:20:00.360] Thank you very much.

[01:20:00.360 --> 01:20:01.360] The mask wearing.

[01:20:01.360 --> 01:20:02.360] That's it.

[01:20:02.360 --> 01:20:03.360] Yes.

[01:20:03.360 --> 01:20:04.360] Yeah.

[01:20:04.360 --> 01:20:05.360] Wow.

[01:20:05.360 --> 01:20:06.360] That's how you did it.

[01:20:06.360 --> 01:20:07.360] Yeah.

[01:20:07.360 --> 01:20:08.360] Bingo, bingo.

[01:20:08.360 --> 01:20:09.360] This one's fiction.

[01:20:09.360 --> 01:20:10.360] All right, Jay.

[01:20:10.360 --> 01:20:11.360] All right.

[01:20:11.360 --> 01:20:13.420] The first one, a recent study finds positive fortune telling results in increased financial

[01:20:13.420 --> 01:20:15.160] risk taking for men, but not for women.

[01:20:15.160 --> 01:20:17.160] I don't even understand this one.

[01:20:17.160 --> 01:20:18.960] I'll tell you.

[01:20:18.960 --> 01:20:25.080] So if you go to a fortune teller and they give you a reading that's positive, like good

[01:20:25.080 --> 01:20:28.560] things are going to happen in your life as opposed to a neutral or a negative reading,

[01:20:28.560 --> 01:20:29.560] right?

[01:20:29.560 --> 01:20:34.880] Then after that, you're more likely to take increased financial risk, like you will bet

[01:20:34.880 --> 01:20:37.080] more money or invest more money or whatever.

[01:20:37.080 --> 01:20:38.080] Roger that.

[01:20:38.080 --> 01:20:41.360] So the men are duped more, I guess, than the women is what it seems like.

[01:20:41.360 --> 01:20:43.160] Or maybe just women are more risk averse.

[01:20:43.160 --> 01:20:44.160] I don't know.

[01:20:44.160 --> 01:20:45.160] All right.

[01:20:45.160 --> 01:20:47.480] Next one, a study of five year olds finds that they perceive overweight people to be

[01:20:47.480 --> 01:20:49.040] happier than thin people.

[01:20:49.040 --> 01:20:52.360] I can't possibly think that that is true.

[01:20:52.360 --> 01:20:53.800] That is so odd.

[01:20:53.800 --> 01:20:55.360] How overweight, Steve?

[01:20:55.360 --> 01:20:58.200] Well overweight is a very specific medical definition.

[01:20:58.200 --> 01:21:01.680] Like there is a BMI range that is considered, quote unquote, overweight.

[01:21:01.680 --> 01:21:03.560] Can you give me an idea?

[01:21:03.560 --> 01:21:04.560] So overweight but not obese.

[01:21:04.560 --> 01:21:05.560] Well, it depends.

[01:21:05.560 --> 01:21:06.560] Yeah, then there's obese.

[01:21:06.560 --> 01:21:09.560] So there's like thin, normal, overweight, obese, right?

[01:21:09.560 --> 01:21:12.680] I forget what the exact numbers are and it's different for men and women, but you know,

[01:21:12.680 --> 01:21:16.240] it's like if you're 20, 30 pounds over, you're probably in the overweight category.

[01:21:16.240 --> 01:21:19.360] Yeah, it depends on your height and all that.
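
As an aside on the BMI ranges Steve is referring to, here is a minimal sketch in Python of the standard adult calculation (weight in kilograms divided by height in meters squared) with the commonly cited WHO category cutoffs; the function name and the example numbers are illustrative only.

  def bmi_category(weight_kg: float, height_m: float) -> str:
      # BMI = mass (kg) / height (m) squared
      bmi = weight_kg / height_m ** 2
      if bmi < 18.5:
          return "underweight"
      if bmi < 25:
          return "normal"
      if bmi < 30:
          return "overweight"
      return "obese"

  print(bmi_category(82, 1.75))  # BMI is about 26.8, which lands in the overweight band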

[01:21:19.360 --> 01:21:23.360] The last one here is college students find that mask wearing does not impair social

[01:21:23.360 --> 01:21:24.360] interactions.

[01:21:24.360 --> 01:21:28.240] I mean, I would think that's the opposite.

[01:21:28.240 --> 01:21:29.480] Which one did you pick, Bob?

[01:21:29.480 --> 01:21:31.740] The right one.

[01:21:31.740 --> 01:21:32.740] That one?

[01:21:32.740 --> 01:21:33.740] Yeah, I did.

[01:21:33.740 --> 01:21:34.740] Yeah, I'm leaning to that one.

[01:21:34.740 --> 01:21:36.600] I'll take that one as the fake.

[01:21:36.600 --> 01:21:37.600] Okay, Evan.

[01:21:37.600 --> 01:21:42.160] All right, positive fortune telling results in increased financial risk taking for men,

[01:21:42.160 --> 01:21:43.160] but not for women.

[01:21:43.160 --> 01:21:51.240] You know, I think the thing here is that we, oh, stereotype women, unfortunately, as the

[01:21:51.240 --> 01:21:58.480] ones who, you know, would be probably more swayed by the results of a fortune teller

[01:21:58.480 --> 01:21:59.480] rather than a man.

[01:21:59.480 --> 01:22:01.760] But this is basically, you know, kind of saying the opposite.

[01:22:01.760 --> 01:22:07.560] I'm not trying to be sexist here or anything, but in general, it's my perception of it.

[01:22:07.560 --> 01:22:10.000] So maybe that's why this one could be the fiction.

[01:22:10.000 --> 01:22:11.700] It could be a flip flop here.

[01:22:11.700 --> 01:22:12.960] Maybe not.

[01:22:12.960 --> 01:22:18.560] The five year olds perceiving overweight people to be happier than thin people like, you know,

[01:22:18.560 --> 01:22:24.080] like a Santa Claus kind of effect, maybe the jolly old, you know, elf kind of thing.

[01:22:24.080 --> 01:22:27.600] That's all that's coming to my mind here with this one.

[01:22:27.600 --> 01:22:31.520] I have no, why would you, it's interesting that someone would even think to study this.

[01:22:31.520 --> 01:22:34.400] How do they come up with these ideas?

[01:22:34.400 --> 01:22:40.480] And then the last one, college students specifically find that mask wearing does not impair social

[01:22:40.480 --> 01:22:42.000] interactions.

[01:22:42.000 --> 01:22:47.080] I think all three are fiction, but the most fictiony of them, I'll say the five year old

[01:22:47.080 --> 01:22:49.920] one, I think.

[01:22:49.920 --> 01:22:55.640] It's the one I have the least understanding of even why this would be studied.

[01:22:55.640 --> 01:22:58.600] So just got no vibe for this one.

[01:22:58.600 --> 01:22:59.600] I'll say that's the fiction.

[01:22:59.600 --> 01:23:00.600] All right.

[01:23:00.600 --> 01:23:01.600] And Cara?

[01:23:01.600 --> 01:23:02.600] It's got no vibe.

[01:23:02.600 --> 01:23:07.440] I really think they could all go either way.

[01:23:07.440 --> 01:23:11.160] Pretty much like all social psychology studies.

[01:23:11.160 --> 01:23:12.160] Exactly.

[01:23:12.160 --> 01:23:13.160] Yeah.

[01:23:13.160 --> 01:23:20.000] It says, okay, maybe women might be more credulous, but then on the flip side of that, as Bob

[01:23:20.000 --> 01:23:25.480] pointed out, maybe men are already taking higher risks.

[01:23:25.480 --> 01:23:28.640] That's not what you thought I was going to say.

[01:23:28.640 --> 01:23:32.400] Maybe men are, you know, women are already more risk averse and that's just exacerbated

[01:23:32.400 --> 01:23:33.400] by this.

[01:23:33.400 --> 01:23:34.400] You know what I mean?

[01:23:34.400 --> 01:23:35.920] So it could go either way.

[01:23:35.920 --> 01:23:37.720] The five year old, same thing.

[01:23:37.720 --> 01:23:44.160] It could be the like, you know, jolly St. Nick effect, but it could also be, I think,

[01:23:44.160 --> 01:23:47.880] some internalized sizeism.

[01:23:47.880 --> 01:23:51.120] And then the last one, it's a study of college students, Evan, because they're all studies

[01:23:51.120 --> 01:23:53.520] of college students because they're easy to study.

[01:23:53.520 --> 01:23:54.520] Psychologists are studying.

[01:23:54.520 --> 01:23:55.520] They're available, I suppose.

[01:23:55.520 --> 01:23:56.520] Yeah, they're in their class.

[01:23:56.520 --> 01:23:57.520] College student bias.

[01:23:57.520 --> 01:23:58.520] Right.

[01:23:58.520 --> 01:24:02.360] Mask wearing does not impair social interactions.

[01:24:02.360 --> 01:24:06.880] This one's tough because it's like on what measure and is it, you know, is it a significant

[01:24:06.880 --> 01:24:08.800] difference and how, you know, what's the outcome?

[01:24:08.800 --> 01:24:18.560] I hope this one is science because as a therapist, we wear masks and I don't want my own therapy

[01:24:18.560 --> 01:24:23.520] sessions to be dramatically affected by the fact that we are wearing masks.

[01:24:23.520 --> 01:24:29.920] So and I would hope that it would be the psychologists themselves who studied that and said, okay,

[01:24:29.920 --> 01:24:34.440] we should probably, you know, when we're measuring the different types of risk, the risk of COVID

[01:24:34.440 --> 01:24:37.240] is higher than the risk of, like, negative outcomes from wearing masks.

[01:24:37.240 --> 01:24:38.800] So I'm going to say that one's science.

[01:24:38.800 --> 01:24:44.280] So it's really between the first two, all the, everybody said that the masks was fiction

[01:24:44.280 --> 01:24:45.520] except for Evan, right?

[01:24:45.520 --> 01:24:47.320] You said the five year olds was fiction.

[01:24:47.320 --> 01:24:48.320] Okay.

[01:24:48.320 --> 01:24:49.320] Yeah, the five year olds.

[01:24:49.320 --> 01:24:53.320] So nobody said the risk taking, I could go out on that limb and spread it out, but I

[01:24:53.320 --> 01:24:55.380] think I'm leaning towards the five year olds too.

[01:24:55.380 --> 01:24:59.000] Something about this one tells me that that may have been the case like a hundred years

[01:24:59.000 --> 01:25:04.800] ago, but that that trend has reversed very similar to the doll studies, like super young

[01:25:04.800 --> 01:25:10.080] children will say that like the black doll is less likable than the white doll.

[01:25:10.080 --> 01:25:14.660] Even black children, there are these horrid stereotypes that become internalized when

[01:25:14.660 --> 01:25:16.200] we're very, very young.

[01:25:16.200 --> 01:25:21.600] And I wouldn't be surprised if this is like a, like a thin bias and it was the opposite.

[01:25:21.600 --> 01:25:25.840] So I'm going to go with Evan and say that it's the five year old study that's a fiction.

[01:25:25.840 --> 01:25:26.840] All right.

[01:25:26.840 --> 01:25:27.840] So you all agree with number one.

[01:25:27.840 --> 01:25:28.840] So we'll start there.

[01:25:28.840 --> 01:25:33.120] A recent study finds that positive fortune telling results in increased financial

[01:25:33.120 --> 01:25:35.120] risk taking for men, but not women.

[01:25:35.120 --> 01:25:39.780] You all think this one is science and this one is science.

[01:25:39.780 --> 01:25:41.780] You're all safe so far.

[01:25:41.780 --> 01:25:45.120] Part of the reason why I chose this one is because they did, the study was fairly rigorous.

[01:25:45.120 --> 01:25:48.600] They actually did three separate independent studies.

[01:25:48.600 --> 01:25:54.800] They used real online gambling games, but in a laboratory setting.

[01:25:54.800 --> 01:26:02.160] And they, they basically had subjects, men and women get a fortune telling that was either

[01:26:02.160 --> 01:26:03.920] positive, neutral, or negative.

[01:26:03.920 --> 01:26:08.160] And then they measured their behavior in the online gambling.

[01:26:08.160 --> 01:26:13.160] In all three studies, there was a pretty robust effect where the men had increased risk taking.

[01:26:13.160 --> 01:26:16.240] They made larger bets, you know, riskier bets.

[01:26:16.240 --> 01:26:20.200] For women it was mixed, but when they did a meta analysis across the three studies,

[01:26:20.200 --> 01:26:21.320] there was zero effect.

[01:26:21.320 --> 01:26:23.840] So basically there was no effect for the women.

[01:26:23.840 --> 01:26:27.920] So again, they said there might be a small effect because it was positive or whatever.

[01:26:27.920 --> 01:26:30.920] But basically it was negative across all three studies.

[01:26:30.920 --> 01:26:34.660] So yeah, interesting, you know, and again, is that because the, so, you know, you think

[01:26:34.660 --> 01:26:37.240] people feel good that good things are going to happen.

[01:26:37.240 --> 01:26:41.640] They feel positive because even if they don't believe in it, just the idea that somebody

[01:26:41.640 --> 01:26:45.680] told them good things are going to happen, it made them take more risks because they,

[01:26:45.680 --> 01:26:49.040] they guess they felt that their probability of winning was going to go up.

[01:26:49.040 --> 01:26:54.960] Now the question is, did we not see that effect in women because women did, didn't have the

[01:26:54.960 --> 01:26:59.560] effect, or was it canceled out by just being more risk averse at baseline?

[01:26:59.560 --> 01:27:00.560] Exactly, yeah.

[01:27:00.560 --> 01:27:04.000] Was it, was it, then this study couldn't answer that question, but it is interesting to think

[01:27:04.000 --> 01:27:05.000] about.
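
For readers unfamiliar with the pooling Steve mentions, here is a minimal sketch in Python of a simple fixed-effect (inverse-variance weighted) meta-analysis; the per-study effect sizes and standard errors below are made-up illustrative numbers, not data from the study being discussed.

  def fixed_effect_meta(effects, std_errors):
      # Each study is weighted by the inverse of its variance (1 / SE^2)
      weights = [1 / se ** 2 for se in std_errors]
      pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
      pooled_se = (1 / sum(weights)) ** 0.5
      return pooled, pooled_se

  # Hypothetical effects for the women subgroup across three studies:
  effect, se = fixed_effect_meta([0.05, -0.03, 0.01], [0.04, 0.05, 0.04])
  print(round(effect, 3), round(se, 3))  # pooled effect close to zero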

[01:27:05.000 --> 01:27:06.000] All right, let's go on to number two.

[01:27:06.000 --> 01:27:10.160] A study of five-year-olds finds that they perceive overweight people to be happier than

[01:27:10.160 --> 01:27:11.160] thin people.

[01:27:11.160 --> 01:27:15.640] Evan and Cara, you think this one is the fiction, Bob and Jay, you think this one is science.

[01:27:15.640 --> 01:27:20.880] So I guess the question is, Cara, you, I think, zeroed in on the actual question.

[01:27:20.880 --> 01:27:22.600] Is it the jolly effect, right?

[01:27:22.600 --> 01:27:29.000] Do they think that overweight people are jolly or is there an attractiveness bias here even

[01:27:29.000 --> 01:27:33.440] in, even in, even in five-year-olds?

[01:27:33.440 --> 01:27:38.520] And this one is the fiction, you guys are right.

[01:27:38.520 --> 01:27:39.520] Doesn't surprise.

[01:27:39.520 --> 01:27:41.400] It's so sad, but it was the opposite.

[01:27:41.400 --> 01:27:48.040] They thought that the thinner people were, were happier and more likable, you know, than

[01:27:48.040 --> 01:27:49.880] the overweight people.

[01:27:49.880 --> 01:27:54.720] And they did interpret this as an overall attractiveness bias, that we

[01:27:54.720 --> 01:27:59.080] think that attractive people are just, they're better people, they're more interesting, they're

[01:27:59.080 --> 01:28:01.480] more likable, they're happier, et cetera.

[01:28:01.480 --> 01:28:07.760] And that even at five-year-old level, that bias has already culturally seeped in into

[01:28:07.760 --> 01:28:08.760] that.

[01:28:08.760 --> 01:28:09.760] Yeah.

[01:28:09.760 --> 01:28:12.480] And all of the stuff reinforcing that.

[01:28:12.480 --> 01:28:13.480] Yeah.

[01:28:13.480 --> 01:28:17.240] This means that a study of college students finds that mask wearing does not impair social

[01:28:17.240 --> 01:28:19.560] interactions is science.

[01:28:19.560 --> 01:28:27.040] This was a convenience sample of college students and in this paradigm, they basically instructed

[01:28:27.040 --> 01:28:31.080] students to like go into the library, find somebody at random and then talk to them.

[01:28:31.080 --> 01:28:35.740] And then for whatever, a half an hour or so, and then they rate the interaction.

[01:28:35.740 --> 01:28:39.720] And then for half of them, they had them wear like a hat, glasses and a mask.

[01:28:39.720 --> 01:28:42.980] And the other half, they didn't do anything.

[01:28:42.980 --> 01:28:48.360] And there was no effect on any of the measures for the social interaction.

[01:28:48.360 --> 01:28:55.320] They measured the ease of the interaction, authenticity, friendliness, mood, discomfort,

[01:28:55.320 --> 01:28:56.920] and interestingness of the interaction.

[01:28:56.920 --> 01:29:00.000] There was no measurable effect between the two groups.

[01:29:00.000 --> 01:29:06.000] So they found basically there was no impairment of social interaction while wearing the mask

[01:29:06.000 --> 01:29:11.200] within obviously the paradigm of this study and of course within college students.

[01:29:11.200 --> 01:29:12.200] So interesting.

[01:29:12.200 --> 01:29:16.560] Obviously, you know, with a social psychology study like this, no one study is definitive

[01:29:16.560 --> 01:29:19.560] or is the final word on any phenomenon like this.

[01:29:19.560 --> 01:29:23.280] But I thought that was an interesting way to look at that question.

[01:29:23.280 --> 01:29:24.520] All right.

[01:29:24.520 --> 01:29:26.080] So Cara and Evan, good job.

[01:29:26.080 --> 01:29:27.080] Hey.

[01:29:27.080 --> 01:29:28.080] Thanks.

[01:29:28.080 --> 01:29:29.080] Yay, Evan.

Skeptical Quote of the Week (1:29:29)

A good ghost story may hold entertainment and even cultural value, but the popular portrayal of pseudoscientific practices as science may be detracting from efforts to cultivate a scientifically literate public.
Micheal Knees, engineering psychologist

Signoff

S: —and until next week, this is your Skeptics' Guide to the Universe.

S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.


Today I Learned

  • Fact/Description, possibly with an article reference[9]
  • Fact/Description
  • Fact/Description

Notes

  1. The emailer uses the wrong word, homonymy, here. The preceding wikilink goes to the disambiguation entry for "Homophony"; the Wiktionary entry shows that "homophony" is the word the emailer should have used.

References

  1. [url_from_news_item_show_notes PUBLICATION: TITLE]
  2. [url_from_news_item_show_notes PUBLICATION: TITLE]
  3. [url_from_news_item_show_notes PUBLICATION: TITLE]
  4. [url_from_news_item_show_notes PUBLICATION: TITLE]
  5. [url_from_news_item_show_notes PUBLICATION: TITLE]
  6. [url_from_SoF_show_notes PUBLICATION: TITLE]
  7. [url_from_SoF_show_notes PUBLICATION: TITLE]
  8. [url_from_SoF_show_notes PUBLICATION: TITLE]
  9. [url_for_TIL publication: title]

Vocabulary

