SGU Episode 896




September 10th 2022

[Episode image: depiction of the Chicxulub meteor]

SGU 895                      SGU 897

Skeptical Rogues
S: Steven Novella

B: Bob Novella

C: Cara Santa Maria

J: Jay Novella

E: Evan Bernstein

Quote of the Week

A good ghost story may hold entertainment and even cultural value, but the popular portrayal of pseudoscientific practices as science may be detracting from efforts to cultivate a scientifically literate public.

Michael Knees, American engineering psychologist

Links
Download Podcast
Show Notes
Forum Discussion

Introduction, another Artemis launch scrubbed

Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.

[00:09.840 --> 00:13.440] Hello and welcome to the Skeptics' Guide to the Universe. Today is Wednesday,

[00:13.440 --> 00:17.360] September 7th, 2022, and this is your host, Steven Novella.

[00:17.360 --> 00:19.840] Joining me this week are Bob Novella. Hey, everybody.

[00:19.840 --> 00:21.760] Cara Santa Maria. Howdy.

[00:21.760 --> 00:23.280] Jay Novella. Hey, guys.

[00:23.280 --> 00:26.400] And Evan Bernstein. Good evening, everyone.

[00:26.400 --> 00:32.400] So we had this scrubbing of the second launch date for the Artemis 1.

[00:32.400 --> 00:35.120] Why do they keep doing that to us? Frustrating.

[00:35.120 --> 00:39.280] Yeah, so, I mean, the first, you know, this was supposed to fly in 2017.

[00:39.280 --> 00:44.720] This is now a five-year rolling delay in terms of getting this thing off the ground.

[00:44.720 --> 00:49.200] But yeah, so on last Monday or Tuesday, I think it was Monday,

[00:49.200 --> 00:51.120] they were going to try to do a launch.

[00:51.120 --> 00:55.040] They had a temperature problem in the engines,

[00:55.040 --> 00:58.720] and then it turned out they couldn't fix it within the launch window,

[00:58.720 --> 01:01.440] so they had to scrub. Turned out it was a faulty sensor.

[01:01.440 --> 01:04.880] Everything was fine, but whatever, one faulty sensor scrubs a launch.

[01:05.760 --> 01:08.640] So they rescheduled it for Saturday, and then on Saturday,

[01:08.640 --> 01:10.400] they had actually a more serious problem.

[01:10.400 --> 01:12.640] I'm not sure why they didn't have the same problem on Monday.

[01:12.640 --> 01:16.800] They had a hydrogen leak during the liquid hydrogen fueling.

[01:16.800 --> 01:19.120] Now, this is a serious problem because...

[01:19.120 --> 01:21.600] Yeah. You don't f*** around with hydrogen, man.

[01:21.600 --> 01:26.560] If it gets too, if the percentage of hydrogen outside the tank gets too high,

[01:26.560 --> 01:29.520] there's a chance that it could explode, you know, when the ship takes off,

[01:29.520 --> 01:30.880] which would be bad, right?

[01:30.880 --> 01:33.840] You don't want the explosion to be happening outside of the tank.

[01:33.840 --> 01:35.920] Oh, what do they call that? There's a name for that.

[01:35.920 --> 01:37.120] Catastrophic failure?

[01:37.120 --> 01:37.760] No, no, no.

[01:37.760 --> 01:39.280] No, you're right, Bob. There is a name for that.

[01:40.240 --> 01:43.600] And it's hilarious. Explosive disassembly or something.

[01:43.600 --> 01:44.320] Yeah, okay.

[01:44.320 --> 01:46.560] That's so scary.

[01:46.560 --> 01:48.960] Let's disassemble it with explosives, yay.

[01:48.960 --> 01:52.640] So this is interesting. So they had to scrub that because they couldn't fix that in time.

[01:52.640 --> 01:54.160] They tried a couple of things to, like,

[01:54.160 --> 01:57.040] change the temperature to get the seals to work, but it didn't work.

[01:57.040 --> 02:01.200] They could potentially fix this problem on the launch pad,

[02:01.200 --> 02:03.760] but by the time they could do that,

[02:03.760 --> 02:10.640] the batteries that are needed for the abort system to work would have to be recycled.

[02:10.640 --> 02:17.920] So they have to bring the ship back to the building just to swap out the abort batteries.

[02:17.920 --> 02:20.240] But of course, while it's there, they'll fix everything.

[02:20.240 --> 02:21.840] And they got to reset everything.

[02:21.840 --> 02:23.520] It's like outside the window.

[02:23.520 --> 02:26.880] So now it's like you're keeping all these plates spinning, you know?

[02:26.880 --> 02:29.360] And if you don't get it to fly within a certain amount of time,

[02:29.360 --> 02:32.960] you got to bring it back and reset everything, you know, and then try again.

[02:32.960 --> 02:36.800] So now the earliest, they haven't set a new launch date yet as of this recording,

[02:36.800 --> 02:38.560] but the earliest would be mid-October.

[02:38.560 --> 02:39.600] It would be like six weeks.

[02:39.600 --> 02:41.360] October 2023.

[02:41.360 --> 02:42.480] Yeah, 2022.

[02:42.480 --> 02:43.680] Oh, okay.

[02:43.680 --> 02:48.160] We did talk about it briefly during the live show, Jay.

[02:48.160 --> 02:51.440] You brought up the fact that you've heard some criticism.

[02:51.440 --> 02:54.160] So I did a deeper dive on it because I've heard some criticism too,

[02:54.160 --> 02:55.520] and I wanted to know where that was.

[02:55.520 --> 02:59.440] The bottom line is that it's just really expensive, you know?

[02:59.440 --> 03:04.160] They're spending, you know, $150 billion to get this thing up.

[03:04.160 --> 03:11.680] It's going to cost a billion dollars or $2 billion a launch just for the launch fees themselves.

[03:11.680 --> 03:14.000] If you amortize the development cost,

[03:14.000 --> 03:17.440] it's going to be between four and five billion dollars per launch,

[03:18.400 --> 03:22.720] and they only have the infrastructure to launch one a year.

[03:22.720 --> 03:24.080] That's all we're going to get out of it.

[03:24.080 --> 03:26.080] One launch a year, and at the end of the day,

[03:26.080 --> 03:29.840] it's probably going to be at like four to five billion dollars per launch.

[03:29.840 --> 03:32.160] So that's mainly where the criticism is coming from.

[03:32.160 --> 03:32.880] It's expensive.

[03:33.520 --> 03:37.120] It's not really going to be able to do that many launches.
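
(A rough version of that amortization, purely as an illustration with assumed round numbers, not official figures: take roughly $2 billion in marginal cost per launch and spread something like $25 billion of development cost over an assumed ten launches:

$$ \$2\text{B} + \frac{\$25\text{B}}{10} \approx \$4.5\text{B per launch} $$

which lands in the four-to-five-billion range quoted here.)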

[03:37.120 --> 03:40.480] But you got to keep in mind that you go back to 2011

[03:40.480 --> 03:42.960] when they canceled the Constellation program,

[03:42.960 --> 03:47.360] which is the predecessor to the Space Launch System, the SLS,

[03:47.360 --> 03:51.280] and also that was the end of the life of the space shuttle.

[03:51.280 --> 03:54.640] So we had no, basically, no rockets to go up.

[03:54.640 --> 04:00.160] So at that time, the Obama administration basically made a bargain with NASA.

[04:00.160 --> 04:06.480] They said, okay, we will fund the SLS program for deep space,

[04:06.480 --> 04:10.800] but you are going to contract out low Earth orbit to private industry.

[04:10.800 --> 04:11.680] So that's what they did.

[04:12.480 --> 04:15.920] And that's where SpaceX comes from and like Blue Origin, all these companies.

[04:15.920 --> 04:17.040] So that worked out really well.

[04:17.040 --> 04:21.120] The low Earth orbit, you know, and SpaceX worked out tremendously well,

[04:21.120 --> 04:25.440] but they're kind of hobbled with this really over budget, delayed,

[04:25.440 --> 04:29.680] really expensive SLS, you know, heavy launch system.

[04:30.400 --> 04:34.400] And, you know, now looking back 11 years later,

[04:34.400 --> 04:37.360] it's like, you know, there's nothing innovative about it.

[04:37.360 --> 04:45.040] It's not reusable, you know, and SpaceX has basically completely leapfrogged over it.

[04:45.040 --> 04:47.600] So I think that's where a lot of the criticism comes from.

[04:47.600 --> 04:50.560] But still, here we are, you know, it's going to get us to the moon.

[04:50.560 --> 04:53.760] You also have to keep in mind that at the other end of the spectrum,

[04:54.400 --> 04:56.800] the Artemis program, not the SLS,

[04:56.800 --> 05:00.640] but the Artemis program was originally planned for 2028.

[05:00.640 --> 05:04.720] Well, to the moon, right, to be back on the moon in 2028.

[05:04.720 --> 05:08.160] That's Artemis mission, not the SLS system, right?

[05:08.160 --> 05:09.280] So not the rocket.

[05:09.280 --> 05:15.520] But the Artemis mission was moved up from 2028 to 2024 by the Trump administration.

[05:15.520 --> 05:18.400] And then it's now pushed back to 2025.

[05:18.400 --> 05:20.880] That's still three years ahead of schedule.

[05:20.880 --> 05:22.080] Of original schedule, yes.

[05:22.080 --> 05:23.120] Original schedule.

[05:23.120 --> 05:26.320] And nobody ever thought that the 2024 thing was realistic.

[05:26.320 --> 05:28.640] NASA was like, this is just not going to be like, OK, sure, right.

[05:28.640 --> 05:32.560] But they knew politically it sounded good, but never going to happen.

[05:32.560 --> 05:35.120] So, all right, we're still on track to get back to the moon

[05:35.120 --> 05:36.720] by the middle of this decade.

[05:36.720 --> 05:39.360] And hopefully, you know, the SLS will work out.

[05:39.360 --> 05:40.720] Artemis will launch.

[05:40.720 --> 05:43.760] Obviously, I'd rather have them scrub for six weeks

[05:43.760 --> 05:45.200] than have the thing blow up on the pad.

[05:45.200 --> 05:46.800] That would be a disaster.

[05:46.800 --> 05:47.840] My gosh.

[05:47.840 --> 05:51.760] What I do think is that NASA should already be planning

[05:51.760 --> 05:53.760] the successor of the SLS, though.

[05:54.960 --> 05:55.200] Right.

[05:55.200 --> 05:55.920] I mean, they shouldn't.

[05:55.920 --> 05:59.040] Well, the SLS is expensive to fly.

[05:59.040 --> 06:01.600] And it's like, you know, it's not reusable.

[06:01.600 --> 06:03.440] It's not efficient or whatever.

[06:04.000 --> 06:06.000] They should probably just contract out, you know,

[06:06.000 --> 06:09.360] to the private space industry now to develop the next thing

[06:09.360 --> 06:12.320] that's going to be able to get to the moon and to Mars

[06:12.960 --> 06:14.720] and not try to do it themselves.

[06:14.720 --> 06:15.440] You know what I mean?

[06:16.000 --> 06:16.560] Yeah.

[06:16.560 --> 06:19.760] Yeah, I mean, that's a really hard thing to predict, Steve.

[06:19.760 --> 06:22.800] You know, first of all, we don't know how well the SLS is going to work.

[06:22.800 --> 06:26.640] It seems like private industry is going to work out better

[06:26.640 --> 06:28.880] than NASA owning their own rockets at this point.

[06:28.880 --> 06:29.920] Don't you agree?

[06:29.920 --> 06:32.160] I mean, for low Earth orbit, it's worked out really well.

[06:32.800 --> 06:34.720] You know, that was sort of the division of labor.

[06:34.720 --> 06:37.120] They would let private industry handle low Earth orbit

[06:37.120 --> 06:39.040] and then NASA will do deep space, right?

[06:39.040 --> 06:40.720] Go back to the moon and then eventually Mars.

[06:41.280 --> 06:45.360] Orion, which is NASA's capsule, that is the only spaceship

[06:45.360 --> 06:47.760] that can get to the, you know, to the moon now, right?

[06:47.760 --> 06:49.120] That can do deep space missions.

[06:49.120 --> 06:51.680] It's rated for 21 days.

[06:51.680 --> 06:54.240] It's long enough to get to the moon and back, you know what I mean?

[06:54.240 --> 06:56.480] So the Dragon module can't do it?

[06:56.480 --> 06:59.360] Well, according to NASA, it's the only one that's rated for,

[06:59.360 --> 07:00.720] like, moon missions at this point.

[07:00.720 --> 07:05.200] So they would, not that you, you know, I'm sure you could get the Dragon capsule

[07:05.200 --> 07:09.280] or a version of it to the point where it would be rated for deep space,

[07:09.280 --> 07:10.480] but it isn't right now.

[07:11.200 --> 07:15.040] But again, they gave the contract to SpaceX, remember, for the lunar lander

[07:15.040 --> 07:20.240] and Musk wants to convert the Starship into a lunar lander.

[07:20.240 --> 07:22.240] Yeah, that's still on.

[07:22.240 --> 07:23.760] Which is, like, weird in a way.

[07:24.640 --> 07:28.080] Would that ship, Steve, leave from Earth or would it stay?

[07:28.080 --> 07:29.120] Well, it'd have to, right?

[07:29.120 --> 07:31.680] We're not going to build it on Earth, send it to the moon,

[07:31.680 --> 07:34.560] and then it's going to land on, that's the ship that's going to land on the moon.

[07:34.560 --> 07:36.560] But, you know, I think we talked about it at the time,

[07:36.560 --> 07:39.360] it's like, yeah, but it's going all the way to the moon.

[07:39.360 --> 07:41.760] Why don't you just make that your moon ship, you know what I mean?

[07:41.760 --> 07:46.800] Like, why are you going to take the SLS to the moon, then hop on over into the Starship

[07:46.800 --> 07:49.120] to go down, to land down on the moon?

[07:49.120 --> 07:49.680] I don't know.

[07:49.680 --> 07:51.680] I don't know exactly how that's going to work.

[07:51.680 --> 07:56.320] So, okay, so it is that way, that ship is going to basically ferry people

[07:56.320 --> 08:00.400] from low moon orbit to the surface.

[08:00.400 --> 08:01.440] Yes, that's right.

[08:01.440 --> 08:04.480] And it stays out there and they just refuel it and keep reusing it.

[08:04.480 --> 08:05.520] I guess so.

[08:05.520 --> 08:08.560] Steve, I'm hoping that the next thing that will be developed

[08:08.560 --> 08:14.720] will be a deep space nuclear rocket, because they're developing nuclear rockets for cislunar.

[08:14.720 --> 08:17.920] Now, they won't be really rated for beyond cislunar, right?

[08:17.920 --> 08:20.960] They really won't be designed to go beyond the moon.

[08:20.960 --> 08:25.680] But, and this is why NASA is working with them on this, once they have it,

[08:25.680 --> 08:29.040] then the homework, you know, the foundational homework will be done,

[08:29.040 --> 08:32.880] and then NASA could take that and then extend it and then make it,

[08:32.880 --> 08:34.640] you know, for a much deeper space.

[08:34.640 --> 08:36.320] So that's my hope.

[08:36.320 --> 08:40.400] That's my hope. The question is, is it going to be the next gen deep space,

[08:40.400 --> 08:41.920] or is it going to be the one after that?

[08:42.560 --> 08:48.560] Well, maybe just like let private companies handle just the heavy lift rockets

[08:48.560 --> 08:49.600] that get you to the moon.

[08:50.240 --> 08:54.560] And NASA just completely focuses on developing nuclear rockets.

[08:54.560 --> 08:57.120] Yeah, shit man, I'd be, I'm all for that.

[08:57.120 --> 08:58.400] Because that's the next thing we need.

[08:58.400 --> 09:01.680] And chemical rockets are just so inefficient, you know,

[09:01.680 --> 09:04.560] like it's just not the way to get to Mars and back.

[09:04.560 --> 09:11.760] No, anything beyond the moon, and chemical rockets are just going to be marginalized.

[09:11.760 --> 09:14.080] I mean, of course, now I'm thinking much deeper into the future,

[09:14.080 --> 09:16.880] but as we, as the decades and centuries accrue,

[09:17.760 --> 09:21.360] chemical is really going to be just like maybe for Earth launch.

[09:21.360 --> 09:22.160] And that's it.

[09:22.160 --> 09:25.120] Getting out of Earth's gravity well, that's pretty much going to be it.

[09:25.120 --> 09:28.640] Right. But that's, you know, who knows how long that's going to take,

[09:29.600 --> 09:32.880] you know, when chemical no longer has any role in deep space,

[09:32.880 --> 09:35.600] because, you know, long distance rocket equation says,

[09:35.600 --> 09:37.360] screw you chemical rockets.

[09:37.360 --> 09:38.080] Yeah.

[09:38.080 --> 09:38.640] Yeah.
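
(For reference, the rocket equation being invoked here is the Tsiolkovsky relation; a quick non-relativistic sketch:

$$ \Delta v = v_e \ln\frac{m_0}{m_f} $$

where $v_e$ is exhaust velocity and $m_0/m_f$ is the ratio of fueled to empty mass. Chemical exhaust tops out around $v_e \approx 4.5$ km/s, so even a mass ratio of 20 gives only $\Delta v \approx 13.5$ km/s; with the roughly $0.11c$ fusion exhaust Bob cites below, a modest mass ratio of 6 already gives $\Delta v \approx 0.11c \cdot \ln 6 \approx 0.2c$, ignoring relativistic corrections.)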

[09:38.640 --> 09:40.720] And then, and then eventually fusion.

[09:40.720 --> 09:44.480] Once we get to fusion, then we're, that's the, that's the game.

[09:44.480 --> 09:45.440] Started man, that's good.

[09:45.440 --> 09:45.920] Yeah.

[09:45.920 --> 09:46.800] And what's interesting is-

[09:46.800 --> 09:49.200] Especially the hydrogen proton-proton fusion engine.

[09:49.200 --> 09:54.640] Once we develop fusion engines, that's going to be our engines forever.

[09:54.640 --> 10:00.080] Like there's the probability that anything will replace it is so remote.

[10:00.080 --> 10:05.200] Like we don't know if it will ever happen and if it does, it will be in the distant far future.

[10:05.200 --> 10:05.600] Right.

[10:05.600 --> 10:08.560] So that's the brass ring right there.

[10:08.560 --> 10:11.360] Well, for reaction rockets, yes.

[10:11.360 --> 10:16.080] I think that's going to be it for quite, for potentially centuries.

[10:16.080 --> 10:18.560] And you could do an amazing amount of things-

[10:18.560 --> 10:19.520] I think thousands of years.

[10:19.520 --> 10:21.920] With, with, that's silly.

[10:21.920 --> 10:22.960] Technically centuries too.

[10:22.960 --> 10:24.960] But that's, yeah.

[10:24.960 --> 10:28.320] I mean, even the best we can do with that type of reaction rocket,

[10:28.320 --> 10:31.120] say a fusion hydrogen proton-proton, which is really efficient,

[10:31.120 --> 10:34.960] like say 11%, 11% speed of light exhaust velocity.

[10:34.960 --> 10:40.400] That is, you could still do, you know, 20% the speed of light with that type of rocket.

[10:40.400 --> 10:45.040] And if you don't care about cargo at all, you can get that rocket up to 50% the speed of light.

[10:45.680 --> 10:49.520] But then cargo of course becomes literally a millionth of the payload,

[10:49.520 --> 10:52.720] but still 10%, 20% the speed of light with a super advanced-

[10:52.720 --> 10:58.560] Then, Bob, you add a little bit of light sails and then that'll get you.

[10:58.560 --> 10:58.880] Yes.

[10:58.880 --> 10:59.680] That'll get you there.

[10:59.680 --> 11:01.840] So that's going to be light sails and fusion.

[11:01.840 --> 11:02.720] That's going to be space travel.

[11:02.720 --> 11:06.880] That seems to be, I think that's pretty much where we're going for centuries.

[11:06.880 --> 11:10.960] Unless an ASI, artificial super intelligence, rises and then all bets are off.

[11:10.960 --> 11:16.880] But even then, he or she would be constrained to, to the physics, to physics as we know it.

[11:16.880 --> 11:19.280] And even, even, you know, the ASI might say,

[11:19.280 --> 11:22.640] damn man, this is the best I could do, but it's still going to be cool.

[11:22.640 --> 11:24.560] Yeah. It's almost as if we wrote a whole book about it.

[11:24.560 --> 11:24.720] Yeah.

[11:26.560 --> 11:30.560] It's almost as if I just did a deep dive research on it because I talked about it at Dragon Con.

[11:31.200 --> 11:31.680] Dragon Con.

[11:31.680 --> 11:32.480] How was Dragon Con?

[11:33.040 --> 11:33.920] It was great.

[11:33.920 --> 11:39.360] Liz and I went first time in three years and I know you guys were just so wicked jealous.

[11:39.360 --> 11:40.000] It was great.

[11:40.000 --> 11:40.560] Totally.

[11:40.560 --> 11:42.160] It was pretty much as we remember it.

[11:42.160 --> 11:45.840] Amazing costumes, amazing fun, lots of people.

[11:45.840 --> 11:50.160] And pretty much, I was double masked for like four days in a row

[11:50.160 --> 11:55.680] and I took a, took a test today and totally clean, no, totally negative.

[11:55.680 --> 11:59.360] So I think I totally, you know, got away with it totally.

[12:00.400 --> 12:01.040] I did a talk.

[12:01.040 --> 12:04.480] I called the science, I called, I called the science panel guys and I'm like,

[12:04.480 --> 12:07.840] I want to do the future of rockets.

[12:07.840 --> 12:10.880] And they'd made a panel with like five guys and I was one of them.

[12:10.880 --> 12:12.320] And I just went off.

[12:12.320 --> 12:14.480] I did a deep dive for weeks.

[12:14.480 --> 12:17.840] For weeks I did a deep dive just to refresh my memory and all the research that I had

[12:17.840 --> 12:20.880] done for the chapter of the book about future rockets.

[12:20.880 --> 12:21.920] And I got it down, man.

[12:21.920 --> 12:26.880] I made an awesome bullet list of all the top, the top things that I needed to keep straight

[12:26.880 --> 12:27.440] in my head.

[12:27.440 --> 12:29.680] And it was so much fun to research.

[12:29.680 --> 12:34.000] And there was a great panel, great panel, great fellow panelists with me.

[12:34.000 --> 12:36.960] They were all very knowledgeable and it was great.

[12:36.960 --> 12:38.880] But also I did some skeptical stuff.

[12:38.880 --> 12:40.240] I talked about the two books.

[12:40.240 --> 12:45.760] I did a, I did a one man show on stage on the skeptical track and I was like, oh boy,

[12:45.760 --> 12:46.960] this is scary.

[12:46.960 --> 12:47.840] But it was fine.

[12:47.840 --> 12:48.640] It was fine.

[12:48.640 --> 12:52.160] I just, I just went off on the books and then I started talking about rockets again.

[12:52.160 --> 12:52.960] And then that was it.

[12:52.960 --> 12:54.400] I was in my happy place.

[12:54.960 --> 12:56.560] And, uh, totally great.

[12:56.560 --> 13:00.160] Bob, totally utterly, absolutely.

[13:00.160 --> 13:04.560] Indubitably your solo talk was basically like a pared down news item for Bob.

[13:04.560 --> 13:06.080] Yeah, that's basically what it was.

[13:06.880 --> 13:07.520] It was great.

[13:07.520 --> 13:13.760] And, uh, so many, as usual, so many great costumes, the talent on display at Dragon

[13:13.760 --> 13:19.840] Con blows me away every time I go and I'm determined next year to have an awesome homemade

[13:19.840 --> 13:21.520] costume, which I didn't have this year.

[13:22.320 --> 13:22.480] Yeah.

[13:22.480 --> 13:24.240] We haven't been, I've been what, in four years now.

[13:24.240 --> 13:26.480] It'll be, we're definitely going to make a plan to go next year.

[13:27.120 --> 13:27.440] Yeah.

[13:27.440 --> 13:28.800] I mean, we were fine.

[13:28.800 --> 13:30.480] Pandemic willing, but I hopefully will.

[13:30.480 --> 13:30.720] Yeah.

[13:30.720 --> 13:31.200] It's time.

[13:31.200 --> 13:33.360] I mean, as long as things are good, we gotta go.

[13:33.360 --> 13:36.960] We were surrounded at times by thousands of people.

[13:36.960 --> 13:39.840] And at a couple of times I was like, this is uncomfortable.

[13:40.560 --> 13:44.080] But I had my double masks, you know, I held my breath a lot.

[13:44.640 --> 13:46.000] And it, and I'm fine.

[13:46.000 --> 13:48.720] Both Liz and I are both, you know, totally testing negative.

[13:48.720 --> 13:50.400] And it's been many, it's been days.

[13:50.400 --> 13:51.200] So it's doable.

[13:51.200 --> 13:53.760] Just, you know, you just, you know, you could take it easy.

[13:53.760 --> 13:57.840] You don't have to go into the big shoulder to shoulder crowds, um, you know?

[13:57.840 --> 13:58.320] And, uh, it's totally doable.

[13:58.320 --> 14:00.400] How about the, uh, the merch room?

[14:00.400 --> 14:00.800] Oh yeah.

[14:00.800 --> 14:02.640] That was, that was, you know, it was Christmas.

[14:02.640 --> 14:04.080] I'm, I'm walking towards it.

[14:04.080 --> 14:04.240] Yeah.

[14:04.240 --> 14:04.720] But how was it?

[14:04.720 --> 14:06.320] Was there a shoulder to shoulder in there?

[14:06.320 --> 14:07.200] No, no.

[14:07.200 --> 14:10.400] The first day, the first day it opened where I was like waiting for it.

[14:10.400 --> 14:13.360] And it was, it was, there's four floors, as you know.

[14:13.360 --> 14:17.680] And, uh, it was not shoulder to shoulder craziness at that, that the first few hours that I was

[14:17.680 --> 14:18.400] there.

[14:18.400 --> 14:20.080] And, uh, so that's, so that was fine too.

[14:20.080 --> 14:21.680] I was worried about that as well.

[14:21.680 --> 14:26.160] By the way, one last detail I've got to mention about the Orion capsule is that it's not just

[14:26.160 --> 14:27.920] that it's rated for 21 days.

[14:27.920 --> 14:32.800] When you come back from the moon, you reenter the atmosphere much faster than when you,

[14:32.800 --> 14:35.520] than when you're just coming down from low earth orbit.

[14:35.520 --> 14:39.360] And so the capsule has to be rated for high speed reentry.

[14:39.360 --> 14:39.680] Yeah.

[14:39.680 --> 14:43.360] And I think the, the Orion capsule is the only one that could do that.

[14:43.360 --> 14:48.400] So like the dragon capsule would really need to be redesigned or refitted to be a high

[14:48.400 --> 14:49.360] speed reentry.

[14:49.360 --> 14:51.520] That's yeah, that's, yeah, that's major.

[14:51.520 --> 14:53.200] You're not going to just slap on duct tape.

[14:53.200 --> 14:56.160] That's like a major event, major redesign.

[14:56.160 --> 14:56.960] I'm sure they could.

[14:56.960 --> 14:57.360] Yeah.

[14:57.360 --> 14:57.520] Yeah.

[14:57.520 --> 14:58.880] But I'm sure they could do it if they wanted to.

[14:58.880 --> 15:02.480] All right, Bob, um, you're going to start us off with a quickie.

[15:02.480 --> 15:05.200] You're going to tell us about Frank Drake.

[15:05.200 --> 15:06.320] Thank you, Steve.

Quickie with Bob: Frank Drake (15:00)

  • Frank Drake passes away [link_URL TITLE][1]

[15:06.320 --> 15:08.800] Hello and welcome to your Quickie with Bob.

[15:08.800 --> 15:12.800] No need to gird your loins for this one, Kara, but you may need a hanky.

[15:12.800 --> 15:17.600] We lost astrophysicist Frank Drake this past September 2nd, 2022.

[15:17.600 --> 15:18.880] He was 92.

[15:18.880 --> 15:19.680] Good run.

[15:19.680 --> 15:20.400] Nice run.

[15:20.400 --> 15:20.880] Good run.

[15:20.880 --> 15:24.960] We always say that when you're like, you know, in the eighties or nineties, but 92 is great.

[15:24.960 --> 15:30.240] I would, I would pay a good chunk of money right now if you can guarantee me, um, 92.

[15:30.240 --> 15:35.600] So he is most famous, of course, for his 1961 Drake equation.

[15:35.600 --> 15:39.920] I would love to have an equation named after me like that, which attempts to determine

[15:39.920 --> 15:44.560] the number of technological civilizations in our galaxy whose signals we could detect.

[15:45.200 --> 15:46.800] We talked about it on the show many times.

[15:46.800 --> 15:49.120] I won't go into any more detail on the equation itself.

[15:49.120 --> 15:51.120] We all know it by heart.
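
(For reference, the standard form of the 1961 equation:

$$ N = R_* \cdot f_p \cdot n_e \cdot f_l \cdot f_i \cdot f_c \cdot L $$

where $N$ is the number of detectable civilizations in the galaxy, $R_*$ the rate of star formation, $f_p$ the fraction of stars with planets, $n_e$ the number of potentially habitable planets per system with planets, $f_l$, $f_i$, $f_c$ the fractions of those that develop life, intelligence, and detectable communication, and $L$ the average lifetime of a communicating civilization.)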

[15:51.120 --> 15:52.080] He did this now.

[15:52.080 --> 15:58.160] He did this after doing the very first modern SETI experiment in 1960 called Project

[15:58.160 --> 16:04.960] Ozma, using a radio telescope to examine the stars Tau Ceti and Epsilon Eridani, two names

[16:04.960 --> 16:06.560] of stars that I absolutely love.

[16:06.560 --> 16:09.040] I think it's what the Star Trek vibe or whatever.

[16:09.040 --> 16:10.240] I just love those names.

[16:10.240 --> 16:12.320] To me, they're just so science fictiony.

[16:12.320 --> 16:18.080] Now he used a part of the spectrum called the water hole, which is awesome on many levels

[16:18.080 --> 16:20.720] because it's near the hydrogen spectral lines.

[16:20.720 --> 16:21.360] Get it?

[16:21.360 --> 16:26.400] And it's also that, that part of the spectrum, the electromagnetic spectrum that's especially

[16:26.400 --> 16:31.600] quiet, and he reasoned that other intelligences would realize that as well and that it would

[16:31.600 --> 16:35.360] be a really good, efficient frequency to communicate over.

[16:36.000 --> 16:42.480] Now that experiment took two months and $2,000 in new equipment, and he essentially created

[16:42.480 --> 16:47.280] a new field by doing that SETI, the search for extraterrestrial intelligence.

[16:47.280 --> 16:51.760] From what I could tell, he did not come up with that Drake equation necessarily to determine

[16:51.760 --> 16:56.320] the number of aliens that are out there, but as a way to stimulate discussions at the first

[16:56.320 --> 17:02.560] SETI meeting, because he was asked, hey, dude, because he became famous the year after he

[17:02.560 --> 17:02.960] did this.

[17:02.960 --> 17:09.120] He became well known the world over, and he was asked, hey, host this SETI conference,

[17:09.120 --> 17:10.560] the very first SETI conference.

[17:11.360 --> 17:17.280] He then came up with the Drake equation for that to stimulate discussions and thinking.

[17:17.280 --> 17:21.680] Drake's passion for astronomy and the possibility of life out there began when he was eight

[17:21.680 --> 17:26.000] years old, imagining alien Earths scattered across the night sky,

[17:26.000 --> 17:31.680] after his dad told him there were many worlds out there in space. And that was in 1938,

[17:31.680 --> 17:32.160] by the way.

[17:32.160 --> 17:32.960] Good on you, dad.

[17:33.760 --> 17:37.520] Seth Shostak said of him, Drake was never an impatient listener.

[17:37.520 --> 17:40.720] He was, to my mind, one of the last nice guys around.

[17:40.720 --> 17:44.800] He was never moody, never angry, and he didn't show the slightest annoyance if you walked

[17:44.800 --> 17:48.320] into his office and took his attention away from whatever he was doing.

[17:48.320 --> 17:53.920] And I read that over and over, people who had known him, that he was such a great, great

[17:53.920 --> 17:54.480] guy.

[17:54.480 --> 18:00.080] And I'll end with a quote from Nadia Drake, Frank's daughter: "A titan in life, Dad leaves

[18:00.080 --> 18:01.600] a titanic absence."

[18:02.720 --> 18:04.800] This was your sad quickie with Bob.

[18:04.800 --> 18:05.120] Thank you.

[18:06.000 --> 18:07.920] Yeah, it's always like a bittersweet, right?

[18:07.920 --> 18:14.000] It is sad to lose a giant like Frank Drake, but you're happy that he lived a long life.

[18:14.000 --> 18:16.560] He was relevant to the end.

[18:17.200 --> 18:23.440] Yeah, I mean, for centuries, as we're looking, and I think we'll never stop searching for

[18:23.440 --> 18:29.840] life out there, his name will, and he will be in the thoughts of all the other big explorers

[18:29.840 --> 18:32.880] that haven't even been born yet that will be looking to the stars.

[18:32.880 --> 18:35.120] Bob, you and I are going to have to come up with our own equation.

[18:35.120 --> 18:36.400] It'll be the Novella equation.

[18:36.400 --> 18:37.040] Yes, yes.

[18:37.040 --> 18:40.480] How about the probability that AI will wipe out human civilization?

[18:40.480 --> 18:41.920] Yes, all right.

[18:41.920 --> 18:42.640] Done.

[18:42.640 --> 18:43.440] We will do this.

[18:43.440 --> 18:44.800] That's too good not to happen.

[18:44.800 --> 18:45.040] All right.

[18:45.040 --> 18:45.840] All right.

[18:45.840 --> 18:46.640] Well, we'll work on it.

[18:46.640 --> 18:47.520] We'll come up with some other ideas.

[18:47.520 --> 18:48.880] That's horrible.

[18:48.880 --> 18:49.760] That's horrible.

News Items


Follow-up on the MOXIE instrument on Mars (18:51)

  • [link_URL TITLE][2]

[18:50.720 --> 18:51.440] All right, Jay.

[18:51.440 --> 18:55.520] You're actually keeping with kind of an astronomy space theme.

[18:56.480 --> 19:02.320] You're going to tell us, give us a follow up on the MOXIE instrument on Mars.

[19:02.320 --> 19:04.240] Yeah, do you guys remember MOXIE?

[19:04.240 --> 19:05.040] The oxygen thing?

[19:05.040 --> 19:06.640] MOXIE is creating oxygen on Mars right now.

[19:06.640 --> 19:07.040] Yeah.

[19:07.040 --> 19:08.000] Oh, okay.

[19:08.000 --> 19:08.800] Now I remember.

[19:08.800 --> 19:12.160] It's one of my favorite things about Perseverance.

[19:12.160 --> 19:17.760] So just to go through the basics so you guys understand, it's totally worth talking about

[19:17.760 --> 19:19.840] again because it's this fascinating technology.

[19:19.840 --> 19:25.920] It's an instrument about the size of a lunchbox that is connected to the Perseverance rover.

[19:25.920 --> 19:30.240] It happens to live in the front right side of Perseverance.

[19:30.240 --> 19:36.240] And its job is to take in the Martian atmosphere, which is 96% death, right?

[19:36.240 --> 19:38.640] It's 96% carbon dioxide.

[19:38.640 --> 19:41.840] And what it does is it strips the carbon atom away from the oxygen atoms.

[19:41.840 --> 19:43.200] I'll get into more detail about that.

[19:44.240 --> 19:48.640] And they're testing it and it's gone amazingly well.

[19:48.640 --> 19:50.480] So let me get into some details.

[19:50.480 --> 19:56.240] So first of all, MOXIE stands for Mars Oxygen In-Situ Resource Utilization Experiment.

[19:56.880 --> 20:00.320] And it couldn't be, the name is perfect for what this little bugger does.

[20:00.320 --> 20:02.880] So details about how it works.

[20:02.880 --> 20:08.560] So it takes in the Martian air and it filters it to remove any contaminants that happen

[20:08.560 --> 20:11.440] to be in there and dust particles, dirt and all that crap.

[20:11.440 --> 20:15.760] The air is then pressurized and then it's fed into something called the Solid Oxide

[20:15.760 --> 20:16.640] Electrolyzer.

[20:16.640 --> 20:19.280] That totally sounds like, what was that?

[20:19.280 --> 20:20.480] The encabulator, Steve?

[20:21.040 --> 20:22.720] Yeah, the turbo encabulator.

[20:22.720 --> 20:23.220] Yeah.

[20:23.840 --> 20:29.120] So the Solid Oxide Electrolyzer, this is an instrument that electrochemically splits the

[20:29.120 --> 20:33.120] carbon dioxide molecule into oxygen ions and carbon monoxide.

[20:33.680 --> 20:40.880] The oxygen ions are separated from the carbon monoxide and then they combine them to make

[20:40.880 --> 20:43.920] molecular oxygen, which is essentially just oxygen.
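
(For reference, the solid oxide electrolysis chemistry being described splits CO2 at the cathode and recombines the oxygen ions at the anode:

$$ \mathrm{CO_2 + 2\,e^- \rightarrow CO + O^{2-}}, \qquad \mathrm{2\,O^{2-} \rightarrow O_2 + 4\,e^-} $$

for a net reaction of $\mathrm{2\,CO_2 \rightarrow 2\,CO + O_2}$.)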

[20:43.920 --> 20:47.440] MOXIE then measures how much oxygen it creates, right?

[20:47.440 --> 20:54.000] So as it has created this batch, it measures how much it has just created and it also checks

[20:54.000 --> 20:56.720] its purity before it releases it back into the atmosphere.

[20:56.720 --> 21:00.320] And that's what MOXIE is doing right now, is just spitting this stuff right back out

[21:00.320 --> 21:01.200] into the atmosphere.

[21:01.920 --> 21:04.640] The single unit weighs, how much do you guys think this thing weighs?

[21:04.640 --> 21:05.140] Four pounds?

[21:06.160 --> 21:07.280] 21 kilos.

[21:07.280 --> 21:10.080] Not bad, 37 pounds or 17 kilos.

[21:10.080 --> 21:14.720] And now it does the work of a small tree, if you can believe that.

[21:14.720 --> 21:15.280] The unit was-

[21:15.280 --> 21:15.780] How small?

[21:16.640 --> 21:17.520] I'll get into detail.

[21:17.520 --> 21:21.760] A small tree, meaning I would say anything that's probably below 10 feet, like it's

[21:21.760 --> 21:23.520] a non-mature tree.

[21:24.160 --> 21:29.040] The unit was first turned on in February 2021 and every test they ran worked perfectly.

[21:29.040 --> 21:33.920] They ran seven experimental runs at different conditions, which now that I think about it,

[21:33.920 --> 21:39.360] of course they had to test it under different conditions because Mars can be so variable.

[21:39.360 --> 21:44.160] They first have to warm up MOXIE for a few hours because it's cold.

[21:44.160 --> 21:47.600] And then they run it for about an hour and then they shut it down.

[21:47.600 --> 21:49.680] And that's them just running a cycle test on it.

[21:50.800 --> 21:54.880] And what they do is they run it during the day, then they tested it at night, then they

[21:54.880 --> 22:00.640] tested it in different seasons because the temperature and the air pressure, the density

[22:00.640 --> 22:05.360] and the overall air temperature can vary a lot, like 100 degree shifts in temperature

[22:05.360 --> 22:08.080] depending on the season and time of day and all that.

[22:08.080 --> 22:13.600] So they haven't tested it during dawn and dusk because there are significant temperature

[22:13.600 --> 22:17.200] changes that happen during those times and they just want to like do preliminary testing

[22:17.200 --> 22:19.280] and then they're going to get into the more advanced testing.

[22:19.280 --> 22:25.120] But so far, every scenario that they put it through, it worked fantastically well.

[22:25.120 --> 22:28.880] It produces six grams of oxygen per hour.

[22:29.440 --> 22:32.720] So this is equal to, as I said, a small tree on Earth.

[22:32.720 --> 22:38.320] MOXIE is the first, it's the first thing that we've put on another planet that does what

[22:38.320 --> 22:38.800] guys?

[22:38.800 --> 22:41.280] Creates oxygen.

[22:41.280 --> 22:46.640] Well, more importantly, it's the first thing that ever used local resources and manufactured

[22:46.640 --> 22:48.400] them into something that's usable.

[22:48.400 --> 22:48.880] That's cool.

[22:50.000 --> 22:50.720] Very cool.

[22:50.720 --> 22:53.040] That is like, that is a milestone here.

[22:53.040 --> 22:57.680] It's incredibly useful because it could save, I'm going to start off by saying millions,

[22:57.680 --> 23:01.840] but after hearing Steve talk about how expensive these missions are, it could save billions

[23:01.840 --> 23:07.600] of dollars or more quintillions in cost to ship oxygen to Mars, right?

[23:07.600 --> 23:11.680] Think about it, because we'd have to frequently ship a lot of oxygen to Mars.

[23:11.680 --> 23:13.600] That stuff is heavy, by the way.

[23:13.600 --> 23:14.880] Sorry, the launch is late, guys.

[23:14.880 --> 23:16.080] Hold your breath for a week.

[23:16.080 --> 23:20.400] The current version of MOXIE was made small deliberately so it could fit on Perseverance

[23:20.400 --> 23:26.080] and it wasn't built to run continuously, but in its current form, it has proven to be very

[23:26.080 --> 23:29.840] efficient, which is very important because it won't use a lot of energy and it's reliable.

[23:29.840 --> 23:35.600] The next big test for MOXIE is to run it when the atmosphere is at its densest and they

[23:35.600 --> 23:37.920] plan to run it for as long as possible now.

[23:37.920 --> 23:40.960] They're just going to let that little bugger keep chugging along and just see what happens

[23:40.960 --> 23:43.520] to it because that'll teach us more about what to do.

[23:43.520 --> 23:48.160] Since MOXIE has to be turned on and then it has to be heated up and then they turn it

[23:48.160 --> 23:50.720] off, it goes through something called thermal stress, right?

[23:50.720 --> 23:54.240] Temperature goes up and the metal and parts expand and do what they're going to do and

[23:54.240 --> 23:56.320] then when it cools off, it shrinks back down.

[23:56.320 --> 24:01.600] Now since MOXIE is able to handle thermal stress, the researchers say that a new larger

[24:01.600 --> 24:06.720] system, MOXIE on steroids, would be able to last a very long time since it won't be

[24:06.720 --> 24:12.240] experiencing anywhere near the number of thermal stresses that MOXIE has already proven to

[24:12.240 --> 24:12.960] go through.

[24:12.960 --> 24:18.880] I know they've only tested it seven times, but that's a lot and they could turn on the

[24:18.880 --> 24:23.680] larger version of it and it may never turn off until its end of life cycle.

[24:23.680 --> 24:24.480] It just does what it does.

[24:24.480 --> 24:25.520] Just let it run, yeah.

[24:25.520 --> 24:30.560] Yeah, the larger version of MOXIE could be placed on Mars way before we put humans there.

[24:30.560 --> 24:32.800] It could be producing oxygen for a long time.

[24:32.800 --> 24:35.600] There could be a whole cache of oxygen ready to go.

[24:36.560 --> 24:42.720] The new unit, of course, they want it to run continuously and it could make approximately

[24:43.360 --> 24:46.400] several hundred trees worth of oxygen per day.

[24:46.400 --> 24:47.120] Per day?

[24:47.120 --> 24:47.600] Yes.

[24:47.600 --> 24:48.160] Not bad.

[24:48.160 --> 24:48.640] How many?

[24:48.640 --> 24:49.440] How much is that?

[24:49.440 --> 24:53.920] As a point of reference, Kara, I'm going to tell you a single person needs about seven

[24:53.920 --> 24:56.240] to eight trees worth of oxygen a day.

[24:56.240 --> 24:57.600] Oh, damn.

[24:57.600 --> 24:58.800] That's a lot.

[24:58.800 --> 24:59.040] Yep.

[24:59.600 --> 25:05.120] But, you know, if you do the math, you know, several hundred trees divided by eight turns

[25:05.120 --> 25:09.200] into quite a good crew of people there that the machine could keep alive.
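
(The rough arithmetic here, taking "several hundred" as, say, an assumed 300 trees' worth of oxygen per day, at the seven-to-eight-trees-per-person figure above:

$$ \frac{300\ \text{trees/day}}{8\ \text{trees/person}} \approx 37\ \text{people} $$

so a single scaled-up unit could plausibly support a crew of a few dozen.)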

[25:09.200 --> 25:12.240] And who says that they don't put two or three MOXIE machines?

[25:12.240 --> 25:13.840] Yeah, they want some redundancy.

[25:14.720 --> 25:18.160] The great thing about oxygen is what, first, it keeps us alive.

[25:18.720 --> 25:23.040] And the second great thing is that it, of course, can be used as fuel because we need

[25:23.040 --> 25:24.800] fuel to get off the surface of Mars.

[25:24.800 --> 25:29.600] And oxygen is a primary component in fuel, you know, in chemical fuel.

[25:29.600 --> 25:32.560] So thank you, MOXIE, for working.

[25:32.560 --> 25:35.680] So the carbon monoxide is useful, too.

[25:35.680 --> 25:36.000] Oh, yeah.

[25:36.000 --> 25:38.640] Don't discount the carbon monoxide; it's a high-energy molecule.

[25:38.640 --> 25:42.880] And that's feedstock for things like hydrocarbons.

[25:42.880 --> 25:44.640] So all that you need is hydrogen.

[25:44.640 --> 25:49.360] If we could get a source of hydrogen on Mars, then you can combine the hydrogen with the

[25:49.360 --> 25:51.120] carbon monoxide to make methane.

[25:51.120 --> 25:55.120] The hydrogen, obviously, itself could be burned with the oxygen as rocket fuel.
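
(The reaction Steve is describing is standard methanation chemistry, often proposed for making propellant on Mars; one common form using MOXIE's carbon monoxide byproduct:

$$ \mathrm{CO + 3\,H_2 \rightarrow CH_4 + H_2O} $$

yielding methane fuel that can then be burned with the locally produced oxygen.)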

[25:55.920 --> 25:58.000] And there are sources of hydrogen on Mars.

[25:58.000 --> 26:03.360] There's a lot of water on Mars, and not all of it is in drinkable form.

[26:03.360 --> 26:09.360] There are, like, what they call perchlorate brines, which have a lot of hydroxyl groups,

[26:09.360 --> 26:13.920] a lot of water-type, you know, molecular groups in there.

[26:13.920 --> 26:14.880] Water-ish.

[26:14.880 --> 26:18.880] Yeah, well, hydrogen and oxygen, but it's not necessarily drinkable water.

[26:18.880 --> 26:23.840] But you get that you split the hydrogen off, you have pure hydrogen, you have more oxygen,

[26:23.840 --> 26:26.800] you could make fuel, you have oxygen to burn with the fuel.

[26:26.800 --> 26:30.560] We definitely are going to need to be able to make all of our fuel for the return trip

[26:30.560 --> 26:31.600] locally on Mars.

[26:31.600 --> 26:33.280] You can't carry all that crap with you.

[26:33.280 --> 26:36.000] Yeah, rocket equation will kill you if you try to do that.

[26:36.000 --> 26:42.400] So and then if and then if we could find a source of nitrogen on Mars, then we also have

[26:42.400 --> 26:45.600] our fertilizer to grow our own food there.

[26:45.600 --> 26:50.160] And there is nitrogen on Mars already fixed in the form of nitrates.

[26:50.160 --> 26:56.160] So, yeah, the bottom line is pretty much we have everything we need on Mars, you know,

[26:56.160 --> 26:57.840] for food, oxygen and water.

[26:57.840 --> 26:59.600] Except for the hamburger molecules.

[26:59.600 --> 27:05.280] Well, yeah, but you just grow the food and then you raise the animals and slaughter them.

[27:05.280 --> 27:08.080] And then you have your hamburger.

[27:08.080 --> 27:10.080] Thank you, Dr. Strangelove.

[27:10.080 --> 27:12.880] Just make some lab-grown meat, that'd be fine.

[27:12.880 --> 27:14.400] Yeah, there you go, lab-grown meat.

Do people like you more if you talk more or talk less? (27:16)

  • [link_URL TITLE][3]

[27:14.400 --> 27:21.120] All right, Kara, tell us, do people like you more if you talk more or talk less?

[27:21.120 --> 27:22.480] Let's get into it.

[27:22.480 --> 27:23.480] Let's talk about it.

[27:23.480 --> 27:27.360] When you say you, do you mean someone specifically or people in general?

[27:27.360 --> 27:28.360] People in general.

[27:28.360 --> 27:29.360] Yeah.

[27:29.360 --> 27:33.440] So I guess that is the important question, and that is the question that some researchers

[27:33.440 --> 27:41.120] from a little place called Harvard and the University of Virginia wanted to know.

[27:41.120 --> 27:45.880] So they have a new empirical paper that was published in Personality and Social Psychology

[27:45.880 --> 27:49.600] Bulletin called Speak Up!

[27:49.600 --> 27:53.000] Mistaken beliefs about how much to talk in conversations.

[27:53.000 --> 27:57.080] And as the title implies, I probably shouldn't have said that out loud, very often people

[27:57.080 --> 28:02.160] make judgments that are not reflective of reality about how much they should speak based

[28:02.160 --> 28:04.800] on what their outcome goals are.

[28:04.800 --> 28:12.080] So they wanted to know if somebody wants to be liked, how much do they think they should

[28:12.080 --> 28:13.480] talk in a conversation?

[28:13.480 --> 28:19.200] If somebody wants to be or to seem important, how much do they think they should talk in

[28:19.200 --> 28:20.360] a conversation?

[28:20.360 --> 28:25.200] And finally, if somebody just wants to enjoy the conversation, how much do they think they

[28:25.200 --> 28:26.200] should talk?

[28:26.200 --> 28:30.000] And they used a couple of different paradigms to look at this, like most psychology experiments

[28:30.000 --> 28:33.280] they kind of ran it a few different ways to ask different questions.

[28:33.280 --> 28:37.800] So first, I guess I'm curious from all of you, what do you think across those three

[28:37.800 --> 28:38.800] different parameters?

[28:38.800 --> 28:44.080] If somebody wants to be liked, what percentage of time do you think that they will think

[28:44.080 --> 28:45.080] that they should talk?

[28:45.080 --> 28:49.760] Do they think or should they do this in a one on one conversation?

[28:49.760 --> 28:50.760] What's the environment?

[28:50.760 --> 28:51.760] One on one.

[28:51.760 --> 28:52.760] So these are dyads.

[28:52.760 --> 28:53.760] Yeah.

[28:53.760 --> 28:57.720] So with a partner, how much and again, not how much should they talk to be liked?

[28:57.720 --> 29:00.720] How much do they think they should talk to be liked?

[29:00.720 --> 29:01.720] 30%.

[29:01.720 --> 29:08.160] Well, no, are minutes, are we measuring, are we? 50% of the time. 30%. You think 50%? You think

[29:08.160 --> 29:16.600] 30%? Definitely lower than 50%, probably 30%. So Evan says 30%.

[29:16.600 --> 29:17.600] Okay.

[29:17.600 --> 29:18.600] So it's interesting.

[29:18.600 --> 29:23.480] Evan, you said before you said 30%, you said definitely lower than 50%.

[29:23.480 --> 29:24.800] That's what the researchers thought as well.

[29:24.800 --> 29:30.920] They were like, okay, people whose goal is to be liked very often, and this bears out

[29:30.920 --> 29:34.720] in the literature, very often think they should talk less than half the time.

[29:34.720 --> 29:39.800] Cause it's a, it's a gesture of politeness and courtesy in a, in a cultural sense.

[29:39.800 --> 29:42.720] People love to talk about themselves.

[29:42.720 --> 29:44.360] And they actually have a name for this.

[29:44.360 --> 29:46.160] They call it the reticence bias.

[29:46.160 --> 29:47.160] Reticence.

[29:47.160 --> 29:48.160] Very good.

[29:48.160 --> 29:49.160] Yeah.

[29:49.160 --> 29:53.920] As they say, the reticence bias we suggest is rooted in the fact that people lack confidence

[29:53.920 --> 29:56.000] in their conversational abilities.

[29:56.000 --> 30:00.520] They go on to talk about social anxiety, but also, um, that people are hard

[30:00.520 --> 30:05.160] on themselves and that they don't get a lot of valid external feedback.

[30:05.160 --> 30:10.000] Um, but generally speaking, people think that if they listen more than they talk, they're

[30:10.000 --> 30:11.880] going to be liked more.

[30:11.880 --> 30:13.060] Here's the rub.

[30:13.060 --> 30:15.980] That's not the case.

[30:15.980 --> 30:22.480] So, second paradigm: if people's goal is not to be liked, but it's to seem interesting.

[30:22.480 --> 30:23.480] What do you think?

[30:23.480 --> 30:25.560] Well, then I would think talk more, right?

[30:25.560 --> 30:26.560] You would think that, right?

[30:26.560 --> 30:28.280] Then they would want to talk more.

[30:28.280 --> 30:32.000] And that's exactly how it bore out with their, with their study.

[30:32.000 --> 30:37.760] People when told that the goal was to seem interesting, did talk more in the dyad because

[30:37.760 --> 30:40.240] they thought, or they, they selected that option.

[30:40.240 --> 30:44.120] I should talk more in the dyad because that will make me seem more interesting.

[30:44.120 --> 30:50.080] And then finally, if they wanted to just enjoy themselves in the conversation, what do you

[30:50.080 --> 30:51.080] think?

[30:51.080 --> 30:52.080] Don't talk at all.

[30:52.080 --> 30:53.080] Even.

[30:53.080 --> 30:54.080] Even.

[30:54.080 --> 30:55.080] Yeah.

[30:55.080 --> 30:56.080] 50 50.

[30:56.080 --> 30:57.080] So.

[30:57.080 --> 30:58.520] That's the most important question.

[30:58.520 --> 31:06.200] But what if your goal is to get laid, that probably is the most important.

[31:06.200 --> 31:07.560] That is their follow up study.

[31:07.560 --> 31:09.840] Well, you got to do all of them though.

[31:09.840 --> 31:10.840] You got to be liked.

[31:10.840 --> 31:15.480] You got to be interesting, all at the same time.

[31:15.480 --> 31:16.480] And enjoy yourself.

[31:16.480 --> 31:17.480] Yeah.

[31:17.480 --> 31:19.400] So actually, interestingly, something will bear out from that.

[31:19.400 --> 31:24.120] So basically they, um, they did a couple of studies where they were forecasting.

[31:24.120 --> 31:27.320] So they, you know, this is very classic psychology study.

[31:27.320 --> 31:32.120] So they used Mechanical Turk and Qualtrics and they asked some questions.

[31:32.120 --> 31:34.880] Let me go to study one.

[31:34.880 --> 31:41.520] And they asked some questions where they imagined having a conversation with different conversational

[31:41.520 --> 31:43.320] prompts.

[31:43.320 --> 31:47.000] And then they were asked, you know, how much should they, how much did they think they

[31:47.000 --> 31:48.000] should talk?

[31:48.000 --> 31:50.240] Because remember, this is not about whether or not they did.

[31:50.240 --> 31:54.360] It's about their self kind of assessment of should they talk more, talk less.

[31:54.360 --> 31:55.840] And those were the outcomes, right?

[31:55.840 --> 32:03.520] I think it was, uh, let's see if the goal was to be liked, people said on average that

[32:03.520 --> 32:12.520] they should speak 43% of the time, whereas their partner should speak 56% of the time.

[32:12.520 --> 32:16.440] If the goal was to be interesting, they said they themselves should speak about 57 and

[32:16.440 --> 32:20.320] a half percent of the time, whereas their partner should speak 42% of the time.

[32:20.320 --> 32:24.740] And if the goal was to enjoy themselves, it was right around 50, 50.

[32:24.740 --> 32:30.960] And then they repeated the study and found very similar outcomes.

[32:30.960 --> 32:35.520] And then they actually kind of forced the hand and made people talk a certain amount

[32:35.520 --> 32:36.520] of time.

[32:36.520 --> 32:37.520] Right?

[32:37.520 --> 32:38.680] Like, of course that would be the followup study.

[32:38.680 --> 32:42.400] It was a little bit arbitrary and a little bit fake because they used like a computer

[32:42.400 --> 32:45.560] would cue them and say, talk now, now you talk, now you talk.

[32:45.560 --> 32:51.120] But it actually measured it out at these certain points, like 30%, 40%, 50%, 60% or 70% of

[32:51.120 --> 32:52.120] the time.

[32:52.120 --> 32:53.800] What do you think happened to that?

[32:53.800 --> 32:54.840] It worked.

[32:54.840 --> 32:55.840] What worked?

[32:55.840 --> 33:02.360] They were, they achieved the goal of like, you know, enjoying themselves more or being

[33:02.360 --> 33:03.360] liked more.

[33:03.360 --> 33:04.360] Interesting.

[33:04.360 --> 33:05.360] It worked.

[33:05.360 --> 33:06.360] Yeah.

[33:06.360 --> 33:10.680] So, so basically that the outcomes would have followed the intentions.

[33:10.680 --> 33:14.200] Like if they thought they wanted to be liked and they spoke less than they were liked more.

[33:14.200 --> 33:16.160] No, the opposite.

[33:16.160 --> 33:17.160] They were wrong.

[33:17.160 --> 33:18.160] The opposite happened.

[33:18.160 --> 33:19.160] Yeah.

[33:19.160 --> 33:20.160] Their initial assessment was wrong.

[33:20.160 --> 33:21.160] It's kind of neither.

[33:21.160 --> 33:27.240] It sort of doesn't follow the most clear pattern, but it looks like it's a little bit bimodal.

[33:27.240 --> 33:32.760] The more important outcome that the researchers point to is something that they call, they

[33:32.760 --> 33:34.320] call it halo ignorance.

[33:34.320 --> 33:37.520] And to understand halo ignorance, you have to first understand the halo effect.

Halo Effect (33:27)

[33:37.520 --> 33:40.720] Do you guys, have you ever heard of the halo effect in social psychology?

[33:40.720 --> 33:41.720] Okay.

[33:41.720 --> 33:42.720] It's pretty interesting.

[33:42.720 --> 33:49.120] I'm going to describe it because I had experience demonstrating it when I worked on Brain Games.

[33:49.120 --> 33:54.480] So this was a really fun episode that we did where I got to go to this like fake art gallery

[33:54.480 --> 33:59.760] and I played the gallerist and opposite me was Colin Hanks.

[33:59.760 --> 34:01.200] You guys know Colin Hanks, right?

[34:01.200 --> 34:02.200] Tom Hanks' son.

[34:02.200 --> 34:03.200] Colin.

[34:03.200 --> 34:04.200] Yes.

[34:04.200 --> 34:05.200] Oh, with all the tattoos.

[34:05.200 --> 34:07.200] He was in The Offer.

[34:07.200 --> 34:08.200] No, he doesn't have a lot of tattoos.

[34:08.200 --> 34:09.200] Oh, maybe the other son.

[34:09.200 --> 34:12.200] No, he's like a very famous actor who looks a lot like Tom Hanks.

[34:12.200 --> 34:13.200] I see.

[34:13.200 --> 34:14.200] Yeah.

[34:14.200 --> 34:15.200] Yeah.

[34:15.200 --> 34:16.200] And has been in, yeah, he was in Band of Brothers.

[34:16.200 --> 34:17.200] He was in a ton of stuff.

[34:17.200 --> 34:23.760] So he played himself in one paradigm where he had all, we set the art gallery up with

[34:23.760 --> 34:27.360] a bunch of art and in one paradigm he was like, Hey, I'm Colin Hanks.

[34:27.360 --> 34:29.160] I've decided to try my hand at art.

[34:29.160 --> 34:32.320] I'm really interested in, you know, you guys' feedback, blah, blah, blah.

[34:32.320 --> 34:36.980] And then we got their, their feedback, not to his face where they talked about whether

[34:36.980 --> 34:39.060] or not they enjoyed his art.

[34:39.060 --> 34:44.100] And then in another paradigm, we dressed him up as this alter ego named Giannis Patch.

[34:44.100 --> 34:48.280] And he had like a soul patch and was dressed really weird and was like super aggressive

[34:48.280 --> 34:51.320] and like not nice to anybody.

[34:51.320 --> 34:54.600] Like he was very holier than thou and just played this awful character that nobody got

[34:54.600 --> 34:55.600] along with.

[34:55.600 --> 34:56.600] And what do you think happened?

[34:56.600 --> 35:01.360] They liked Colin Hanks' art and they hated Giannis Patch's art, even though it was the

[35:01.360 --> 35:02.360] same art.

[35:02.360 --> 35:03.360] Wow.

[35:03.360 --> 35:04.360] Right.

[35:04.360 --> 35:10.840] So the halo effect really is this idea that there are global features that come together.

[35:10.840 --> 35:14.040] So if you like the person, you like the person's art.

[35:14.040 --> 35:16.000] If you dislike the person, you dislike the person's art.

[35:16.000 --> 35:17.960] Those things aren't independent of each other.
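[For illustration, a minimal Python sketch (not from the episode or the Brain Games experiment) of how a halo effect shows up in data: the artwork is identical, but each rating carries a term for how much the rater likes the artist. Every number here is invented.]

import numpy as np

rng = np.random.default_rng(0)
n_raters = 200

# Hypothetical 1-7 "liking of the artist" scores under the two personas.
liking_likable = rng.normal(5.5, 0.8, n_raters)   # friendly celebrity persona
liking_abrasive = rng.normal(2.5, 0.8, n_raters)  # abrasive alter-ego persona

def art_rating(liking, halo_weight=0.6):
    # The art itself never changes (fixed quality term); the halo term
    # drags the rating toward how much the rater likes the artist.
    quality = 4.0
    noise = rng.normal(0.0, 0.5, liking.shape)
    return quality + halo_weight * (liking - 4.0) + noise

print("mean art rating, likable persona: %.2f" % art_rating(liking_likable).mean())
print("mean art rating, abrasive persona: %.2f" % art_rating(liking_abrasive).mean())

[Same art in both runs; only the liking term moves, which is the halo effect in miniature.]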

[35:17.960 --> 35:21.120] And so what the researchers here are saying is that there's, there's probably something

[35:21.120 --> 35:22.400] called halo ignorance.

[35:22.400 --> 35:24.760] Most people aren't aware of the halo effect.

[35:24.760 --> 35:29.920] And so they don't realize that it's not so simple as: I want

[35:29.920 --> 35:33.360] to be liked here and I want to seem important there.

[35:33.360 --> 35:35.780] They're not mutually exclusive.

[35:35.780 --> 35:40.600] Generally speaking, if somebody in a conversation has a positive effect on somebody, you're

[35:40.600 --> 35:42.640] going to like them and find them interesting.

[35:42.640 --> 35:47.400] If they have a negative effect, you're going to dislike them and find them uninteresting.

[35:47.400 --> 35:48.400] But most people-

[35:48.400 --> 35:50.400] And your breath smells too.

[35:50.400 --> 35:51.400] Exactly.

[35:51.400 --> 35:52.400] Okay.

[35:52.400 --> 35:58.140] So most people are ignorant of the halo effect, which is probably why they estimate that they

[35:58.140 --> 36:01.860] have to speak differently for different outcomes.

[36:01.860 --> 36:07.740] But the truth of the matter is, basically, the big takeaway, because the final study

[36:07.740 --> 36:09.860] outcomes are a little all over the place.

[36:09.860 --> 36:10.860] They're not super clean.

[36:10.860 --> 36:14.080] Like, the less you talk, the less you're liked; the more you talk, the more you're liked. It's

[36:14.080 --> 36:15.420] not really that clean.

[36:15.420 --> 36:23.360] It's basically that generally speaking, if you're in the 30% or 40% group, you're kind

[36:23.360 --> 36:26.080] of not liked or interesting.

[36:26.080 --> 36:31.160] Like if you don't talk that much, you don't get this good feedback.

[36:31.160 --> 36:36.880] So most people think I need to be quiet and be a good listener in a dyad and then I'm

[36:36.880 --> 36:38.600] going to be more well liked.

[36:38.600 --> 36:41.660] But the truth is, that doesn't bear out.

[36:41.660 --> 36:47.820] If you talk way too much, we start to see diminishing returns, so it's sort of somewhere

[36:47.820 --> 36:48.820] in the middle.

[36:48.820 --> 36:54.560] Kara, the halo effect reminds me of the Oscars, right?

[36:54.560 --> 37:00.820] I always had the feeling about, like, the movie that wins, say, best art direction.

[37:00.820 --> 37:04.800] Was that really the movie that had the best art direction, or was that just the favorite

[37:04.800 --> 37:07.360] movie among those nominated for best art direction?

[37:07.360 --> 37:08.360] Yeah.

[37:08.360 --> 37:09.360] What was it?

[37:09.360 --> 37:13.200] Was it one of the eight movies that got nominated for everything, because there's, like, only certain

[37:13.200 --> 37:15.200] movies that were, quote, Oscar-worthy?

[37:15.200 --> 37:19.660] Did that movie really have the best editing and costuming and all the other little technical

[37:19.660 --> 37:22.760] things just because it was a popular movie?

[37:22.760 --> 37:23.760] You don't, yeah.

[37:23.760 --> 37:24.760] It just boggles the imagination.

[37:24.760 --> 37:25.760] Completely agree.

[37:25.760 --> 37:26.880] Completely agree.

[37:26.880 --> 37:35.400] I had a friend once who used to call the Oscars rich people prom, and ne'er was a truer statement made.

[37:35.400 --> 37:42.840] The difference between a person being perceived as wanting to be liked versus that same person

[37:42.840 --> 37:46.800] being perceived as either polite or courteous.

[37:46.800 --> 37:50.240] Are we talking, are we splitting hairs, or are they two very different things?

[37:50.240 --> 37:51.240] It wasn't the paradigm.

[37:51.240 --> 37:52.240] So I can't tell you that.

[37:52.240 --> 37:53.240] You know what I mean?

[37:53.240 --> 37:54.840] They did not ask that question.

[37:54.840 --> 37:57.240] They asked three very simple questions.

[37:57.240 --> 37:58.240] Okay.

[37:58.240 --> 38:01.840] Do you want to be liked? You know, if you want to be liked, how often will you talk?

[38:01.840 --> 38:03.880] If you want to seem interesting, how often will you talk?

[38:03.880 --> 38:05.720] If you want to enjoy the conversation, how often will you talk?

[38:05.720 --> 38:12.720] But I wonder if a person can confuse the desire to be liked with the desire to be accommodating

[38:12.720 --> 38:17.800] or you know, courteous to the other person by speaking less.

[38:17.800 --> 38:23.520] Is it, you know, can that be, can those two things be confused and therefore that's why

[38:23.520 --> 38:28.080] people would think that, yeah, I should probably talk less, not so much to be liked, but just

[38:28.080 --> 38:29.680] out of consideration.

[38:29.680 --> 38:30.680] Right.

[38:30.680 --> 38:33.840] But, but you're basically talking about constructs, right?

[38:33.840 --> 38:36.560] Like these aren't, these aren't actual things that exist.

[38:36.560 --> 38:39.880] Being liked is not specifically different from being courteous.

[38:39.880 --> 38:43.920] It's all in how you construct that reality, how you interpret it.

[38:43.920 --> 38:48.240] So if your definition of being liked is that you're very courteous, that's how you're

[38:48.240 --> 38:50.240] going to view that when you do this study.

[38:50.240 --> 38:51.240] Okay.

[38:51.240 --> 38:52.240] All right.

[38:52.240 --> 38:54.600] So you can totally interpret that differently.

[38:54.600 --> 38:56.680] And that's the interesting thing about psychology.

[38:56.680 --> 39:01.480] It's why we have to always operationally define everything because nothing, there's no like

[39:01.480 --> 39:04.960] fundamental truth to the idea of being liked.

[39:04.960 --> 39:05.960] It's how we define it.
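[A toy Python sketch, mine rather than the study's, of what operationally defining a construct means: "being liked" becomes nothing more than a concrete scoring rule over specific, hypothetical survey items. Change the rule and you have changed the construct.]

def likability_score(ratings):
    # Operational definition: the mean of three hypothetical 1-7 survey items.
    items = ["enjoyed talking to them", "would meet again", "found them pleasant"]
    return sum(ratings[item] for item in items) / len(items)

print(likability_score({
    "enjoyed talking to them": 6,
    "would meet again": 5,
    "found them pleasant": 7,
}))  # prints 6.0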

[39:05.960 --> 39:06.960] Right.

[39:06.960 --> 39:07.960] Right.

[39:07.960 --> 39:08.960] Yes.

[39:08.960 --> 39:09.960] It does seem nebulous.

[39:09.960 --> 39:10.960] Yeah.

[39:10.960 --> 39:12.440] And there's probably, sorry, there's probably a ton of crossover there.

[39:12.440 --> 39:17.640] If I want to seem likable, I might have different factors that if I were to ask, if I were to

[39:17.640 --> 39:22.400] sit down in a separate study and the researchers were to say, what are the five features of

[39:22.400 --> 39:24.920] likability that are most important?

[39:24.920 --> 39:27.960] My list might be different than your list, but I wouldn't be surprised if there was a

[39:27.960 --> 39:30.680] lot of crossover among people.

[39:30.680 --> 39:35.680] And so courteousness is probably one of those things.

[39:35.680 --> 39:37.200] So I wouldn't say it's a conflation.

[39:37.200 --> 39:40.120] I would say it's part of it.

[39:40.120 --> 39:42.880] And in the United States, is this a US study?

[39:42.880 --> 39:43.880] Yeah, absolutely.

[39:43.880 --> 39:44.880] This is a US study.

[39:44.880 --> 39:48.640] So we've got to remember that there's always a massive culture bias in these kinds of studies.

[39:48.640 --> 39:53.160] And this really only applies to the group of college students that they were looking

[39:53.160 --> 39:54.160] at.

[39:54.160 --> 39:59.680] But they do, like any good paper, cite the available research literature, show areas

[39:59.680 --> 40:05.120] where this reinforces things that have already been studied and basically move the needle

[40:05.120 --> 40:08.840] that much more because there is a body of literature around this.

[40:08.840 --> 40:14.000] But so the cool thing is, basically, the two big outcomes here are: the reticence bias seems

[40:14.000 --> 40:15.640] to exist.

[40:15.640 --> 40:20.400] People generally think they are reticent to speak because they think that they will be

[40:20.400 --> 40:23.200] liked less when they speak more.

[40:23.200 --> 40:24.600] And that does not bear out.

[40:24.600 --> 40:29.240] The truth is you're actually liked more if you speak more up to a point.

[40:29.240 --> 40:30.320] And yeah, up to a point.

[40:30.320 --> 40:35.920] And then the other one, yeah, they did show that, but it's actually not a huge

[40:35.920 --> 40:40.160] effect; it does start to have diminishing returns at the 70% mark.

[40:40.160 --> 40:42.000] And they didn't look at anything over 70%.

[40:42.000 --> 40:46.240] They didn't look at the extremes, like speaking 90% of the time.

[40:46.240 --> 40:54.800] But interestingly, being interesting plateaus more. So basically, people are liked less, found less interesting,

[40:54.800 --> 40:59.120] and enjoy themselves less when they don't speak that much.

[40:59.120 --> 41:03.920] It all kind of peaks around 50%, so about when you're even, and then you see a diminishing

[41:03.920 --> 41:08.920] return that's the heaviest on enjoyment.
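[To make the shape concrete, a small Python sketch of the kind of inverted-U relationship described here: ratings peak near a 50% speaking share, with lower values at the quiet end and diminishing returns toward 70%. The curve and its coefficients are illustrative, not fitted to the study's data.]

def predicted_rating(speak_share):
    # Toy inverted-U on a 1-7 scale: best around an even split,
    # worse at the quiet (30-40%) and very talkative (70%) ends.
    return 5.0 - 6.0 * (speak_share - 0.5) ** 2

for share in (0.3, 0.4, 0.5, 0.6, 0.7):
    print(f"speaking {share:.0%} of the time -> predicted rating {predicted_rating(share):.2f}")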

[41:08.920 --> 41:15.920] So basically, you're seeing that there is a sort of bimodal distribution, but the researchers

[41:15.920 --> 41:19.960] make a kind of blanket statement, which I think is kind of a good statement, which is

[41:19.960 --> 41:22.840] that all things being equal, you should talk.

[41:22.840 --> 41:27.560] Because all things being equal, you're actually going to have a better outcome from talking

[41:27.560 --> 41:29.700] more than talking less.

[41:29.700 --> 41:34.440] And that flies in the face of most people's preconceived notions, which is often the case,

[41:34.440 --> 41:37.940] you know, people usually misjudge.

[41:37.940 --> 41:44.120] It's interesting how bad we are at intuitively like thinking about how our behavior affects

[41:44.120 --> 41:48.120] our relationships, you know, people do things that have the opposite effect of what they

[41:48.120 --> 41:49.120] want.

[41:49.120 --> 41:50.120] Absolutely.

[41:50.120 --> 41:51.120] Yeah.

[41:51.120 --> 41:52.120] People shoot themselves in the foot all the time.

[41:52.120 --> 41:57.040] It's also, I mean, that's a fundamental part of exactly what Jay mentioned in the ad this

[41:57.040 --> 42:00.020] week: cognitive behavioral therapy.

[42:00.020 --> 42:06.360] It's recognizing these biases and recognizing all of the times that we act in a way that

[42:06.360 --> 42:11.480] we think is going to achieve a goal when actually it gives us the opposite outcome or an outcome

[42:11.480 --> 42:13.160] that we weren't looking for.

[42:13.160 --> 42:14.160] Right.

[42:14.160 --> 42:17.320] That's why studies like this are important.

[42:17.320 --> 42:18.320] All right.

[42:18.320 --> 42:19.320] Thanks, Kara.

[42:19.320 --> 42:22.400] Well, everyone, we're going to take a quick break from our show to talk about our sponsor

[42:22.400 --> 42:23.880] this week, Wondrium.

[42:23.880 --> 42:26.600] Guys, we talk about therapy a lot on the show.

[42:26.600 --> 42:31.740] And I found a course on Wondrium that is called Cognitive Behavioral Therapy: Techniques for

[42:31.740 --> 42:33.240] Retraining Your Brain.

[42:33.240 --> 42:37.600] So it goes over the foundations of cognitive behavioral therapy, which, by the way, is

[42:37.600 --> 42:42.820] basically one of the best, if not the best, techniques that you can learn in therapy to help yourself

[42:42.820 --> 42:46.080] get out of your anxiety and your depression.

[42:46.080 --> 42:48.780] They talk about setting your own goals.

[42:48.780 --> 42:50.800] They talk about dealing with stress.

[42:50.800 --> 42:54.760] They talk about dealing with anxiety and fear, how to treat your depression.

[42:54.760 --> 42:57.800] They have 24 chapters in this course.

[42:57.800 --> 42:59.760] And I really do recommend this course.

[42:59.760 --> 43:03.360] And get this, Wondrium will help you learn about pretty much anything you're curious

[43:03.360 --> 43:08.240] about from history to science to language to travel, even learning how to cook.

[43:08.240 --> 43:13.440] You get unlimited access to thousands of hours of trustworthy audio and video courses, documentaries,

[43:13.440 --> 43:15.520] tutorials and so much more.

[43:15.520 --> 43:19.840] And you can learn wherever and whenever that works for you because of the flexibility of

[43:19.840 --> 43:20.840] the platform.

[43:20.840 --> 43:25.640] We highly recommend signing up for Wondrium and Wondrium is offering our listeners a free

[43:25.640 --> 43:27.820] month of unlimited access.

[43:27.820 --> 43:31.120] Sign up today through our special URL to get this offer.

[43:31.120 --> 43:34.200] Go to Wondrium.com slash skeptics.

[43:34.200 --> 43:42.520] Again, that's W-O-N-D-R-I-U-M.com slash skeptics.

[43:42.520 --> 43:44.720] All right, guys, let's get back to the show.

EMDR: Eye Movement Desensitization and Reprocessing (43:45)

  • [link_URL TITLE][4]

[43:44.720 --> 43:47.160] Guys, what do you know about EMDR?

[43:47.160 --> 43:49.520] Oh, EMDR.

[43:49.520 --> 43:50.520] That's a new boy band.

[43:50.520 --> 43:51.520] I know a lot about that.

[43:51.520 --> 43:52.520] South Korea.

[43:52.520 --> 43:55.520] Electromagnetic Dynamic Resonance?

[43:55.520 --> 43:56.520] What?

[43:56.520 --> 44:01.560] Eye movement and desensitization and reprocessing.

[44:01.560 --> 44:04.080] Eye movement, desensitization and reprocessing.

[44:04.080 --> 44:05.080] Oh, yeah, that too.

[44:05.080 --> 44:06.080] Yeah.

[44:06.080 --> 44:07.080] All right.

[44:07.080 --> 44:10.600] So, Kara, do you know when it was developed and how?

[44:10.600 --> 44:12.280] Maybe by the VA.

[44:12.280 --> 44:13.280] Probably not.

[44:13.280 --> 44:14.280] But they use it.

[44:14.280 --> 44:21.720] In 1987, a PhD psychologist, Francine Shapiro, was walking through the park when she realized

[44:21.720 --> 44:29.920] that her eye movements, looking around at the trees, reduced her anxiety and depression.

[44:29.920 --> 44:31.480] And that was it.

[44:31.480 --> 44:38.000] EMDR was born, a single subjective personal observation.

[44:38.000 --> 44:44.800] She said, maybe the eye movements themselves are making me feel less anxious and depressed.

[44:44.800 --> 44:45.800] And that was it.

[44:45.800 --> 44:46.880] That's like the entire basis of it.

[44:46.880 --> 44:53.120] It did not come out of any basic science or any neuroscience or any thinking about how

[44:53.120 --> 44:55.080] the brain works or how depression works or anything.

[44:55.080 --> 44:58.080] It was just that naked observation.

[44:58.080 --> 45:05.360] Now that is reminiscent of a lot of medical pseudosciences where a single quirky observation

[45:05.360 --> 45:11.480] is the entire basis of the whole thing, like iridology and chiropractic, et cetera.

[45:11.480 --> 45:12.800] Yeah, so that was the origin.

[45:12.800 --> 45:14.200] Doesn't mean it doesn't work.

[45:14.200 --> 45:18.000] Is it possible that she made an observation that actually was based on reality?

[45:18.000 --> 45:20.520] Well, that's how science often is started.

[45:20.520 --> 45:21.520] Yeah, it's okay.

[45:21.520 --> 45:25.200] You make an observation, then you test it.

[45:25.200 --> 45:30.760] It's fine as a method of hypothesis generation, but not hypothesis testing, right?

[45:30.760 --> 45:35.040] You can't conclude that it's real because you made an anecdotal observation.

[45:35.040 --> 45:36.400] All right.

[45:36.400 --> 45:39.600] So that was, what, 35 years ago.

[45:39.600 --> 45:45.560] So there's been 35 years of research into EMDR, mainly for PTSD, post-traumatic stress

[45:45.560 --> 45:46.560] disorder.

[45:46.560 --> 45:51.360] That's the most common thing that it is used for and studied for, but also pretty much

[45:51.360 --> 45:52.360] everything.

[45:52.360 --> 45:56.920] It's been also studied for anxiety and depression and other things as well.

[45:56.920 --> 45:59.440] So what's the idea behind it?

[45:59.440 --> 46:05.080] Again, there really isn't anything very compelling in terms of what's the putative mechanism.

[46:05.080 --> 46:10.920] There have been probably dozens, hundreds of proposed possible mechanisms, but it all

[46:10.920 --> 46:15.560] is some version of, oh, you're sort of forcing the right half of the brain to communicate

[46:15.560 --> 46:20.280] with the left half, and there's something going on there, and you're rewiring the brain.

[46:20.280 --> 46:26.360] That's the reprocessing, the connection between the memory and the emotional feeling.

[46:26.360 --> 46:34.440] But it's all this made-up, hand-waving, very, I think, neurologically naive kind of statements,

[46:34.440 --> 46:38.400] and there's really no science behind it.

[46:38.400 --> 46:44.160] But again, irrelevant to the core question, or at least not irrelevant, but yeah, does

[46:44.160 --> 46:45.160] it work?

[46:45.160 --> 46:50.840] It sort of, it does impact how we address that question, but you could have sufficient

[46:50.840 --> 46:54.400] evidence that it works even if you don't know what the mechanism is and even if it was based

[46:54.400 --> 46:56.700] on a quirky observation.

[46:56.700 --> 47:02.320] So what's the research been into EMDR over the last 35 years?

[47:02.320 --> 47:08.360] So from where I'm sitting, as, like, a psychologist in training who reads the APA literature,

[47:08.360 --> 47:13.260] blah, blah, blah, I'm going to approach this really quickly with

[47:13.260 --> 47:18.000] my skeptical hat, but also with my psychologist hat, and also I am in the process of co-editing

[47:18.000 --> 47:21.120] a volume about pseudoscience in therapy, so it's informed by that, too.

[47:21.120 --> 47:22.900] I've read some good chapters on this.

[47:22.900 --> 47:30.120] Number one, the evidence base shows that people who get EMDR have better outcomes than people

[47:30.120 --> 47:32.280] who don't get therapy.

[47:32.280 --> 47:36.040] It also shows that they have sometimes better outcomes than people who get certain types

[47:36.040 --> 47:42.680] of therapy. But from where I'm sitting, and for a lot of people, I think it was Rosen

[47:42.680 --> 47:44.860] who dubbed this a purple hat therapy.

[47:44.860 --> 47:46.400] Did you come across that statement?

[47:46.400 --> 47:47.400] Yeah.

[47:47.400 --> 47:48.400] Yeah.

[47:48.400 --> 47:49.400] I love this, right?

[47:49.400 --> 47:55.800] The idea is, is it the EMDR or is it the fact that they're learning anxiety-reduction

[47:55.800 --> 47:56.800] skills?

[47:56.800 --> 48:01.420] Like if you taught somebody how to reduce their anxiety while driving and then you gave

[48:01.420 --> 48:05.600] them a purple hat and said, wear this while driving, then they finish driving and they

[48:05.600 --> 48:07.320] go, oh my God, I wasn't anxious.

[48:07.320 --> 48:09.200] Was it because of the purple hat?

[48:09.200 --> 48:10.200] Yeah.

[48:10.200 --> 48:12.600] So that's exactly what's going on here.

[48:12.600 --> 48:18.520] So Scott Lilienfeld, who was a skeptical psychologist, yeah, passed away a couple

[48:18.520 --> 48:19.520] years ago.

[48:19.520 --> 48:20.520] Yeah.

[48:20.520 --> 48:21.520] Yeah, unfortunately.

[48:21.520 --> 48:22.520] Died young, really tragic.

[48:22.520 --> 48:23.520] But great guy.

[48:23.520 --> 48:29.400] He did a very good review of the literature on EMDR a few years ago, and I liked the way

[48:29.400 --> 48:30.400] he summarized it.

[48:30.400 --> 48:35.080] All right, so first we could ask, does EMDR work better, this is like for PTSD, does it

[48:35.080 --> 48:38.840] work better than nothing, right, than no therapy?

[48:38.840 --> 48:40.520] And the answer is like, yeah, it clearly does.

[48:40.520 --> 48:47.760] But remember, EMDR is you're moving your eyes while you're doing essentially exposure therapy,

[48:47.760 --> 48:49.600] imaginal exposure therapy.

[48:49.600 --> 48:57.000] So it's not; in fact, Shapiro, you know, tried doing the eye movements by themselves and they didn't

[48:57.000 --> 48:58.640] work at all.

[48:58.640 --> 49:03.320] So she had to combine it with, basically with cognitive behavioral therapy, and so now it

[49:03.320 --> 49:04.320] works.

[49:04.320 --> 49:08.360] When you combine it with this already established effective psychological treatment, it quote

[49:08.360 --> 49:09.360] unquote works.

[49:09.360 --> 49:10.360] All right, but he said-

[49:10.360 --> 49:13.560] How can you possibly have an effect size that's large enough to differentiate?

[49:13.560 --> 49:21.160] Yeah, so he said, in the literature, you compare EMDR to no therapy at all, and of course it

[49:21.160 --> 49:23.160] works, compared to nothing.

[49:23.160 --> 49:29.300] You can also compare it to an intervention, but the intervention itself is not effective.

[49:29.300 --> 49:35.000] So for example, it's often compared to passive listening, where you just have a therapist

[49:35.000 --> 49:40.640] going, mm-hmm, yeah, tell me more, mm-hmm, you know, not doing any therapy, just listening,

[49:40.640 --> 49:42.240] which isn't really an effective treatment.

[49:42.240 --> 49:43.720] And yeah, is it better than that?

[49:43.720 --> 49:44.720] Sure.

[49:44.720 --> 49:50.320] Is it better than doing the exact same thing, but without the eye movement?

[49:50.320 --> 49:52.560] No, it's not better than that.

[49:52.560 --> 49:58.040] And if you adequately control it, where you're doing, say, a fixed gaze therapy, as opposed

[49:58.040 --> 50:02.280] to eye movement therapy, but otherwise, cognitively, you're doing the exact same thing.

[50:02.280 --> 50:08.920] You're imagining the stressor, and also imagining yourself in a safe space at the same time,

[50:08.920 --> 50:09.920] whatever.

[50:09.920 --> 50:14.260] If you're going through the cognitive therapy, it's exactly the same.
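[A toy simulation in Python (not the actual trial data) of the purple hat logic: the exposure-therapy component carries all of the benefit and the eye movements add nothing, so EMDR beats a waitlist but not the same therapy with a fixed gaze. All effect sizes are invented.]

import numpy as np

rng = np.random.default_rng(1)
n = 100           # simulated patients per arm
baseline = 30.0   # symptom score before treatment (higher is worse)

# Improvement per arm; the eye movements contribute zero extra benefit.
waitlist = baseline - rng.normal(1, 5, n)     # essentially no change
fixed_gaze = baseline - rng.normal(10, 5, n)  # exposure therapy, eyes still
emdr = baseline - rng.normal(10, 5, n)        # exposure therapy + eye movements

for name, arm in (("waitlist", waitlist), ("fixed gaze", fixed_gaze), ("EMDR", emdr)):
    print(f"{name}: mean post-treatment score {arm.mean():.1f}")

[Compared with the waitlist, EMDR "works"; compared with the properly controlled arm, the eye movements are the purple hat.]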

[50:14.260 --> 50:21.600] So for EMDR, again, my version of the purple hat phrase that I like to use is: part of this nutritious

[50:21.600 --> 50:22.600] breakfast, right?

[50:22.600 --> 50:25.080] So you remember those commercials?

[50:25.080 --> 50:26.640] When served with a nutritious breakfast.

[50:26.640 --> 50:29.560] Yeah, so it was like selling Pop-Tarts or something.

[50:29.560 --> 50:34.480] It's like, this Danish is part of this, and they show a nutritious breakfast with orange

[50:34.480 --> 50:41.200] juice and whatever, and this tumor is part of a healthy body.

[50:41.200 --> 50:42.200] It's irrelevant.

[50:42.200 --> 50:45.000] It's an irrelevant part of this nutritious breakfast, but that's the thing.

[50:45.000 --> 50:51.500] It's a completely irrelevant superficial component of a treatment that is already established

[50:51.500 --> 50:55.400] as being effective, and it doesn't appear to add anything.

[50:55.400 --> 51:00.280] And again, another critic said, and I like this statement, I've applied it

[51:00.280 --> 51:06.600] to many other things: what is unique about EMDR doesn't work, and what works about

[51:06.600 --> 51:09.040] EMDR is not unique, right?

[51:09.040 --> 51:10.040] So it's like-

[51:10.040 --> 51:11.040] You said that a lot about chiropractic.

[51:11.040 --> 51:12.040] Chiropractic, yeah.

[51:12.040 --> 51:16.600] What chiropractors do that works isn't unique to chiropractors, and what is unique to chiropractors

[51:16.600 --> 51:17.600] doesn't work.

[51:17.600 --> 51:20.280] But in any case, so that's the case of EMDR.

[51:20.280 --> 51:24.120] It's essentially unnecessary, but here's the thing.

[51:24.120 --> 51:27.160] It's massively popular within psychotherapy.

[51:27.160 --> 51:29.080] It's so popular.

[51:29.080 --> 51:30.960] It's so frustrating.

[51:30.960 --> 51:40.640] It's so frustrating, because the APA has basically said it's an evidence-based treatment, because

[51:40.640 --> 51:44.420] there is some evidence to support it, but it's poor evidence.

[51:44.420 --> 51:47.800] So that is an indictment of EBM, right?

[51:47.800 --> 51:51.120] In my opinion, that's why it's not a science-based medicine treatment.

[51:51.120 --> 51:56.280] It may be an EBM treatment, but that's only because of those weaknesses, and even then I think it fails

[51:56.280 --> 52:03.000] the EBM standard. But essentially people exploit the weaknesses in evidence-based medicine

[52:03.000 --> 52:08.320] to say things like EMDR is quote-unquote evidence-based, because there is clinical evidence to show

[52:08.320 --> 52:14.520] that it quote-unquote works, but only when not comparing it to an adequate control, not

[52:14.520 --> 52:16.920] as an isolated variable.

[52:16.920 --> 52:18.760] It's exactly like acupuncture.

[52:18.760 --> 52:23.120] It only works when you're not isolating what acupuncture is, you know, the variable that

[52:23.120 --> 52:24.120] is acupuncture.

[52:24.120 --> 52:27.720] And do you know what the saddest part of all of this is, and it's a part we don't often

[52:27.720 --> 52:33.800] talk about on the show? It's that the variable that does the most work, of almost any therapeutic

[52:33.800 --> 52:36.400] variable, is the relationship.

[52:36.400 --> 52:41.000] Yeah, it's the therapeutic relationship, yeah.

[52:41.000 --> 52:42.480] Here's the other way to look at this.

[52:42.480 --> 52:46.960] So EMDR, the evidence is really crappy.

[52:46.960 --> 52:51.360] I was recently reading a systematic review, and they said they looked at, like, all of the

[52:51.360 --> 52:58.400] randomized controlled trials over whatever, you know, since forever, and the total number

[52:58.400 --> 53:02.960] of subjects in all of the studies that they were able to add into a meta-analysis was

[53:02.960 --> 53:03.960] like 420.

[53:03.960 --> 53:04.960] That's it?

[53:04.960 --> 53:05.960] That's it.

[53:05.960 --> 53:06.960] Oh, yeah.

[53:06.960 --> 53:07.960] That's terrible.

[53:07.960 --> 53:08.960] It's terrible.

[53:08.960 --> 53:16.840] Like, 35 years on, there should be thousands of patients in the meta-analyses of EMDR.
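[A back-of-the-envelope power check in Python, using my own assumed numbers rather than the review's: if those ~420 pooled subjects came from, say, ten small trials of about 42 patients each, no single trial has much chance of detecting a small incremental effect (d = 0.2) of the eye movements over the same therapy without them, and even the fully pooled sample is underpowered. Requires the statsmodels package.]

from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
# Two hypothetical scenarios: one small trial vs. all 420 subjects in one comparison.
for label, per_arm in (("single small trial (21/arm)", 21), ("fully pooled (210/arm)", 210)):
    power = analysis.power(effect_size=0.2, nobs1=per_arm, alpha=0.05, ratio=1.0)
    print(f"{label}: power to detect d = 0.2 is {power:.2f}")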

[53:16.840 --> 53:18.960] And then most of the studies are crap too.

[53:18.960 --> 53:22.740] Most of the studies are not well designed: they're pragmatic studies, or they're not well controlled,

[53:22.740 --> 53:26.360] or they're just comparing it to no intervention or to an inadequate intervention.

[53:26.360 --> 53:30.080] So the thing is they're doing studies that are like pragmatic studies that are not designed

[53:30.080 --> 53:33.760] to test whether or not it works, right?

[53:33.760 --> 53:37.280] They're not really doing efficacy trials and when they do, it doesn't work.

[53:37.280 --> 53:39.500] And they just sort of gloss over that.

[53:39.500 --> 53:43.480] And so a lot of people, when I talk about this kind of thing, like EMDR specifically

[53:43.480 --> 53:47.800] or similar things, they say, well, what's the problem, what's the harm?

[53:47.800 --> 53:53.420] It's gimmicky, it's superficial, but people feel that it works, it feels better, you know.

[53:53.420 --> 53:55.160] It waters down the entire system.

[53:55.160 --> 54:00.400] It makes me a less trustworthy practitioner because my field says this works.

[54:00.400 --> 54:01.400] Exactly.

[54:01.400 --> 54:06.880] So there is a lot of harm when you have a profession endorsing essentially pseudoscience

[54:06.880 --> 54:12.400] or at least these, you know, popular, what would you call it, like pop-psych

[54:12.400 --> 54:18.960] or pop-whatever sort of bizarre or superficial notions of how the brain works.

[54:18.960 --> 54:19.960] Marketing scam.

[54:19.960 --> 54:20.960] Yeah.

[54:20.960 --> 54:27.200] So this basically completely feeds into the snake oil industry, completely feeds into that.

[54:27.200 --> 54:30.560] Also it is a massive distraction.

[54:30.560 --> 54:35.080] There is a feedback loop between clinical studies and basic science research, right?

[54:35.080 --> 54:40.160] So if people are making up these weird, you know, notions about what's happening in the

[54:40.160 --> 54:46.680] brain, and then they say, and that's how EMDR works, then they falsely conclude

[54:46.680 --> 54:50.680] that EMDR works because of studies that don't show that it works, but they're misinterpreting

[54:50.680 --> 54:51.680] it.

[54:51.680 --> 54:56.520] Then they say that supports these bizarre neuroscience ideas that I have about how it's

[54:56.520 --> 54:57.680] working, right?

[54:57.680 --> 55:00.800] That's like saying, oh, we know qi exists because acupuncture works.

[55:00.800 --> 55:04.440] It's like, well, no, acupuncture doesn't work and there's no reason to think that qi

[55:04.440 --> 55:05.440] exists.

[55:05.440 --> 55:06.440] It's the same kind of thing.

[55:06.440 --> 55:08.700] It poisons the whole research paradigm.

[55:08.700 --> 55:14.960] And so, you know, mental health practice has a hard enough time grinding forward with really

[55:14.960 --> 55:17.980] rigorous science-based modalities.

[55:17.980 --> 55:20.600] This kind of thing just makes it harder.

[55:20.600 --> 55:27.300] It's like throwing dirt in the gears of trying to move the whole field forward.

[55:27.300 --> 55:31.480] They really do need to be able to do that, plus they also need to recognize that the research does not

[55:31.480 --> 55:33.840] show that this is a real phenomenon.

[55:33.840 --> 55:37.300] And if they think that it does, they don't know how to do or interpret research.

[55:37.300 --> 55:39.660] And that's the real problem.

[55:39.660 --> 55:46.560] That is the real problem here: it is exposing a real problem in the understanding

[55:46.560 --> 55:48.600] of how clinical science works.

[55:48.600 --> 55:52.880] And in the discipline that, I would argue, needs to understand it the best, because it

[55:52.880 --> 55:54.140] is the hardest.

[55:54.140 --> 55:59.480] It is really hard to do good research when your outcomes are so subjective and so complicated

[55:59.480 --> 56:00.480] and interdependent.

[56:00.480 --> 56:06.240] It's almost like good psychology research, really good psychology research is some of

[56:06.240 --> 56:07.240] the best research.

[56:07.240 --> 56:08.240] Yes, totally.

[56:08.240 --> 56:12.900] Like we have such a good handle on sophisticated statistics because we're looking for tiny

[56:12.900 --> 56:16.960] outcome measures and we're controlling for variables.

[56:16.960 --> 56:21.220] It's not that hard to take a bunch of cloned animals and drop something in the water of

[56:21.220 --> 56:22.220] half of them.

[56:22.220 --> 56:23.220] Yeah.

[56:23.220 --> 56:26.000] Or send an electron through a detector a trillion times or whatever.

[56:26.000 --> 56:27.000] Exactly.

[56:27.000 --> 56:28.000] And I'm not saying that laboratory work isn't hard.

[56:28.000 --> 56:29.000] It's really freaking hard.

[56:29.000 --> 56:30.000] I did it for years.

[56:30.000 --> 56:33.560] But what I'm saying is you have to get real creative when you're working with human subjects

[56:33.560 --> 56:38.040] and when you're dealing, like you said, with these more subjective outcomes, you have to be that much more

[56:38.040 --> 56:39.040] rigorous.

[56:39.040 --> 56:40.040] Yeah, exactly.

[56:40.040 --> 56:42.680] There's a lot of unique challenges to doing this kind of research.

[56:42.680 --> 56:44.640] It has to be all the more rigorous.

[56:44.640 --> 56:46.680] And that's the EMDR.

[56:46.680 --> 56:51.880] The fact that it's able to thrive within this community, this profession is a problem.

[56:51.880 --> 56:56.000] And it's a big flashing light that we need to heed. A lot of celebrities have taken to this

[56:56.000 --> 56:57.000] as well.

[56:57.000 --> 56:58.560] And that helps perpetuate the problem.

[56:58.560 --> 56:59.560] Yeah.

[56:59.560 --> 57:00.560] Yeah.

[57:00.560 --> 57:06.320] And sometimes these treatments do proliferate within sort of the fringe professionals.

[57:06.320 --> 57:09.960] But EMDR is really thriving within the mainstream of the profession.

[57:09.960 --> 57:10.960] I know.

[57:10.960 --> 57:11.960] It's really scary.

[57:11.960 --> 57:12.960] That's a real problem.

[57:12.960 --> 57:13.960] All right.

[57:13.960 --> 57:14.960] Let's move on.

News_Item_4 (57:14)

  • [link_URL TITLE][5]

Special Segment: Death by Pseudoscience (1:02:47)

Who's That Noisy? (1:10:42)

J: ... similar to English's "Buffalo buffalo Buffalo buffalo buffalo [+ 3 'buffalos']"

...

C: (sing-song) Homonymy![note 1]

New Noisy (1:14:49)

[musical boings and dings]

J: ... If you think you know the answer or you have a cool Noisy you heard this week, you can email me at WTN@theskepticsguide.org.

Announcements (1:15:29)

Science or Fiction (1:18:27)

Theme: Social Psychology

Item #1: A recent study finds that positive fortune-telling results in increased financial risk-taking for men but not for women.[6]
Item #2: A study of 5-year-olds finds that they perceive overweight people to be happier than thin people.[7]
Item #3: A study of college students finds that mask-wearing does not impair social interactions.[8]

Answer    Item
Fiction   Overweight happier than thin
Science   Risk-taking men vs. women
Science   Mask-wearing impairs not

Host      Result
Steve     win

Rogue     Guess
Bob       Mask-wearing impairs not
Jay       Mask-wearing impairs not
Evan      Overweight happier than thin
Cara      Overweight happier than thin

Voice-over: It's time for Science or Fiction.

Bob's Response

Jay's Response

Evan's Response

Cara's Response

Steve Explains Item #1

Steve Explains Item #2

Steve Explains Item #3

Skeptical Quote of the Week (1:29:29)

A good ghost story may hold entertainment and even cultural value, but the popular portrayal of pseudoscientific practices as science may be detracting from efforts to cultivate a scientifically literate public.
Michael Knees, engineering psychologist

Signoff

S: —and until next week, this is your Skeptics' Guide to the Universe.

S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.


Today I Learned

  • Fact/Description, possibly with an article reference[9]
  • Fact/Description
  • Fact/Description

Notes

  1. The emailer uses the wrong word, homonymy here. The preceding wikilink goes to the disambiguation entry for "Homophony"; the Wikitionary entry shows that "homophony" is the word the emailer should have used.

References

  1. [url_from_news_item_show_notes PUBLICATION: TITLE]
  2. [url_from_news_item_show_notes PUBLICATION: TITLE]
  3. [url_from_news_item_show_notes PUBLICATION: TITLE]
  4. [url_from_news_item_show_notes PUBLICATION: TITLE]
  5. [url_from_news_item_show_notes PUBLICATION: TITLE]
  6. [url_from_SoF_show_notes PUBLICATION: TITLE]
  7. [url_from_SoF_show_notes PUBLICATION: TITLE]
  8. [url_from_SoF_show_notes PUBLICATION: TITLE]
  9. [url_for_TIL publication: title]

Vocabulary

