SGU Episode 898


September 24th 2022
898 polar ice cap.png

2012 Arctic sea ice minimum. Outline shows average minimum 1979-2010.[1]

SGU 897                      SGU 899

Skeptical Rogues
S: Steven Novella

B: Bob Novella

C: Cara Santa Maria

J: Jay Novella

E: Evan Bernstein

Guest

DA: David Almeda (sp?), SGU Patron

Quote of the Week

In the field of thinking, the whole history of science – from geocentrism to the Copernican revolution, from the false absolutes of Aristotle's physics to the relativity of Galileo's principle of inertia and to Einstein's theory of relativity – shows that it has taken centuries to liberate us from the systematic errors, from the illusions caused by the immediate point of view as opposed to "decentered" systematic thinking.

Jean Piaget, Swiss psychologist

Links
Download Podcast
Show Notes
Forum Discussion

Introduction, Guest Rogue

Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.

[00:12.880 --> 00:17.560] Today is Tuesday, September 20th, 2022, and this is your host, Steven Novella.

[00:17.560 --> 00:18.960] Joining me this week are Bob Novella.

[00:18.960 --> 00:19.960] Hey, everybody.

[00:19.960 --> 00:20.960] Cara Santa Maria.

[00:20.960 --> 00:21.960] Howdy.

[00:21.960 --> 00:22.960] Jay Novella.

[00:22.960 --> 00:23.960] Hey, guys.

[00:23.960 --> 00:24.960] Evan Bernstein.

[00:24.960 --> 00:26.760] Good evening, everyone.

[00:26.760 --> 00:30.400] And we have a guest rogue this week, David Almeida.

[00:30.400 --> 00:32.120] David, welcome to the Skeptics' Guide.

[00:32.120 --> 00:33.120] Hi, guys.

[00:33.120 --> 00:34.120] Thank you for having me.

[00:34.120 --> 00:40.440] So, David, you are a patron of the SGU, and you've been a loyal supporter for a while,

[00:40.440 --> 00:45.200] so we invited you on the show to join us and have some fun.

[00:45.200 --> 00:46.200] Tell us what you do.

[00:46.200 --> 00:47.840] Give us a little bit about your background.

[00:47.840 --> 00:48.840] I'm an electrician.

[00:48.840 --> 00:52.640] That doesn't sound very exciting relative to what you guys are all doing.

[00:52.640 --> 00:55.080] No, electricians are cool, man.

[00:55.080 --> 00:56.080] Yeah, they're shocking.

[00:56.080 --> 00:58.120] Oh, it starts.

[00:58.120 --> 01:05.640] I actually heard about your show kind of because of work, because when I was starting out as

[01:05.640 --> 01:09.920] an apprentice, pretty much all the work I did was really boring and repetitive, and

[01:09.920 --> 01:13.520] so I was kind of losing my mind a little bit.

[01:13.520 --> 01:17.080] And a friend of mine was at my house helping me work on the house, and he was playing your

[01:17.080 --> 01:20.400] guys' podcast, and I had no idea what a podcast even was.

[01:20.400 --> 01:22.200] This was like 2012, I think.

[01:22.200 --> 01:24.400] I thought he was like listening to NPR or something.

[01:24.400 --> 01:26.320] You were late to the game.

[01:26.320 --> 01:27.320] Yeah, I know.

[01:27.320 --> 01:28.320] I know.

[01:28.320 --> 01:32.320] And so anyways, he was listening to your show, and I asked him what it was, and he told me,

[01:32.320 --> 01:35.400] and then I ended up going back and listening to your whole back catalog while I was doing

[01:35.400 --> 01:41.320] horrible, very repetitive work, and it got me through that for the first couple years

[01:41.320 --> 01:42.320] of my apprenticeship.

[01:42.320 --> 01:43.320] Yeah, we hear that a lot.

[01:43.320 --> 01:49.520] It's good for when you're exercising, doing mind-numbing repetitive tasks, riding a bike

[01:49.520 --> 01:50.520] or whatever.

[01:50.520 --> 01:51.520] It's good.

[01:51.520 --> 01:53.840] You're not just going to sit there staring off into space listening to the SGU.

[01:53.840 --> 01:58.520] I guess some people might do that, but it's always good for when you're doing something

[01:58.520 --> 01:59.520] else.

[01:59.520 --> 02:00.520] Well, great.

[02:00.520 --> 02:01.520] Thanks for joining us on the show.

[02:01.520 --> 02:02.520] It should be a lot of fun.

News Items


2022 Ig Nobels (2:08)

[02:02.520 --> 02:05.640] You're going to have a news item to talk about a little bit later, but first we're just going

[02:05.640 --> 02:07.960] to dive right into some news items.

[02:07.960 --> 02:12.760] Jay, you're going to start us off by talking about this year's Ig Nobel Prizes.

[02:12.760 --> 02:14.160] Yeah, this was interesting.

[02:14.160 --> 02:22.840] This year, I didn't find anything about people getting razzed so much as I found stuff that

[02:22.840 --> 02:26.520] was basically legitimate, just weird, if you know what I mean.

[02:26.520 --> 02:28.200] Yeah, but that's kind of the Ig Nobels.

[02:28.200 --> 02:29.200] They're legit, but weird.

[02:29.200 --> 02:32.880] They're not fake science or bad science.

[02:32.880 --> 02:37.880] The Ig Nobel Prize honors achievements that first make people laugh and then make

[02:37.880 --> 02:39.720] them think.

[02:39.720 --> 02:44.720] That's kind of their tagline, and it was started in 1991.

[02:44.720 --> 02:47.600] Here's a list of the 2022 winners.

[02:47.600 --> 02:48.600] Check this one out.

[02:48.600 --> 02:53.520] The first one here, the Art History Prize, they call it a multidisciplinary approach

[02:53.520 --> 02:57.600] to ritual enema scenes on ancient Maya pottery.

[02:57.600 --> 03:01.960] Whoa, I want to see those.

[03:01.960 --> 03:04.560] Talk about an insane premise.

[03:04.560 --> 03:11.680] Back in the 600-900 CE timeframe, the Mayans depicted people getting enemas on their pottery,

[03:11.680 --> 03:13.640] which that's crazy.

[03:13.640 --> 03:18.520] This is because they administered enemas back then for medicinal purposes.

[03:18.520 --> 03:20.200] So it was part of their culture.

[03:20.200 --> 03:24.200] And the researchers think it's likely that the Mayans also gave enemas that had drugs

[03:24.200 --> 03:28.000] in them to make people get high during rituals, which is-

[03:28.000 --> 03:29.000] That works, by the way.

[03:29.000 --> 03:30.000] Yeah.

[03:30.000 --> 03:33.000] So one of the lead researchers tried it.

[03:33.000 --> 03:39.960] He gave himself an alcohol enema situation, and they were giving him breathalyzer tests, and

[03:39.960 --> 03:43.340] lo and behold, he absorbed alcohol through his rectum.

[03:43.340 --> 03:46.120] He didn't need to do that to know that that happens.

[03:46.120 --> 03:47.120] This is science.

[03:47.120 --> 03:48.120] We already know that that happens.

[03:48.120 --> 03:49.120] What's the problem, Cara?

[03:49.120 --> 03:50.120] Science?

[03:50.120 --> 03:51.120] Hello?

[03:51.120 --> 03:52.120] Lit review.

[03:52.120 --> 03:53.120] Lit review.

[03:53.120 --> 04:01.160] The guy also tested DMT, but apparently the dose was probably not high enough for him

[04:01.160 --> 04:03.280] to feel anything because he didn't feel anything.

[04:03.280 --> 04:04.840] So I think that's pretty interesting.

[04:04.840 --> 04:09.760] I mean, that's very, very provocative, just to think that the Mayans, I didn't know they

[04:09.760 --> 04:11.160] did that, and then it was like a thing.

[04:11.160 --> 04:13.480] I was like, whatever.

[04:13.480 --> 04:16.080] Next one, Applied Cardiology Prize.

[04:16.080 --> 04:18.940] This one I find to be really cool.

[04:18.940 --> 04:24.700] So the researchers, they were seeking and finding evidence that when new romantic partners

[04:24.700 --> 04:29.880] meet for the first time and feel attracted to each other, their heart rates synchronize.

[04:29.880 --> 04:31.840] That was the premise of their research.

[04:31.840 --> 04:35.640] So the researchers wanted to find out if there is something physiological behind the gut

[04:35.640 --> 04:42.240] feeling that people can and do feel when they have met the quote unquote right person.

[04:42.240 --> 04:44.560] I don't know about you guys, but I've felt this.

[04:44.560 --> 04:49.440] I felt an inexplicable physiological thing.

[04:49.440 --> 04:53.440] I just didn't realize that something profound was happening.

[04:53.440 --> 04:55.960] So let me give you a quick idea of what they did.

[04:55.960 --> 04:58.080] They had 140 test subjects.

[04:58.080 --> 05:03.060] They monitored test subjects as they met other test subjects one on one.

[05:03.060 --> 05:09.280] If the pair got a gut feeling about the other person, the researchers predicted that they

[05:09.280 --> 05:15.660] would have motor movements, you know, certain types of activity with their eyes,

[05:15.660 --> 05:18.440] heart rate, skin conductance.

[05:18.440 --> 05:22.020] These types of things would synchronize or they would pair each other or mirror each

[05:22.020 --> 05:23.020] other.

[05:23.020 --> 05:27.820] So 17 percent of the test subjects had what they considered to be successful pairing with

[05:27.820 --> 05:28.820] another subject.

[05:28.820 --> 05:33.760] And they found that couples' heart rates and skin conductance correlated to a mutual attraction

[05:33.760 --> 05:35.060] with each other.

[05:35.060 --> 05:41.680] So there is some type of thing happening physiologically when two people, you know, get that feeling

[05:41.680 --> 05:44.520] when they're, you know, there's an initial attraction.

[05:44.520 --> 05:50.040] And it doesn't surprise me because, you know, as mammals, attraction is, you know, it's

[05:50.040 --> 05:51.480] a huge thing.

[05:51.480 --> 05:55.800] It's really it's not only important, but it, you know, your body is reacting to it.

[05:55.800 --> 06:00.180] There are things that happen when you feel that; your body is changing in a way.

[06:00.180 --> 06:01.180] Very cool.
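
For readers who want a concrete sense of what "heart rates synchronize" means here, below is a minimal hypothetical Python sketch (invented numbers, not the researchers' data or code) that scores synchrony as the Pearson correlation between two heart-rate series; a value near 1 would be read as the two signals rising and falling together.

```python
# Hypothetical illustration only: made-up numbers, not the study's data or methods.
# Pearson correlation between two heart-rate series as a crude "synchrony" score.

def pearson(x, y):
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Beats per minute sampled once a minute during a (fictional) first meeting.
partner_a = [72, 75, 79, 83, 81, 78, 80, 84]
partner_b = [70, 74, 78, 82, 80, 77, 79, 83]  # rises and falls along with partner_a

print(round(pearson(partner_a, partner_b), 3))  # close to 1.0, i.e. highly "synchronized"
```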

[06:01.180 --> 06:04.960] The next one, I think a lot of people will get a kick out of: it's the literature

[06:04.960 --> 06:11.400] prize, and they are analyzing what makes legal documents unnecessarily difficult to understand.

[06:11.400 --> 06:14.280] The basic idea here was there's two camps.

[06:14.280 --> 06:21.040] There's a camp that thinks that legal documents need to be as complicated as they are because

[06:21.040 --> 06:26.180] there are technical concepts and they need to be precise and they use this type

[06:26.180 --> 06:28.720] of language to help get that precision.

[06:28.720 --> 06:33.880] And there are other experts that think that laws are actually built upon, you know, mundane

[06:33.880 --> 06:38.360] concepts like cause, consent, and having best interests.

[06:38.360 --> 06:42.100] And what the researchers wanted to do was they wanted to test the two positions against

[06:42.100 --> 06:43.100] each other.

[06:43.100 --> 06:47.600] And essentially what they found, which is not going to surprise anyone, is that

[06:47.600 --> 06:53.840] legal documents are, at their core, essentially difficult to understand, which is the

[06:53.840 --> 06:55.320] premise they basically started with.

[06:55.320 --> 07:00.640] But they proved it and they found out exactly what parts of the actual legal documents

[07:00.640 --> 07:02.080] were difficult to understand.

[07:02.080 --> 07:07.380] And they identified something called center embedding, which is when lawyers use legal

[07:07.380 --> 07:11.920] jargon within what they call convoluted syntax.

[07:11.920 --> 07:16.420] So in essence, I think what they're saying here is that legal documents are difficult

[07:16.420 --> 07:21.800] essentially by design, that it's deliberate, which I find interesting and frustrating.

[07:21.800 --> 07:26.680] If you've ever had to read any type of legal documentation, it's kind of annoying how difficult

[07:26.680 --> 07:27.680] it is.

[07:27.680 --> 07:30.860] You have to reread it over and over and over again and look things up and really like sink

[07:30.860 --> 07:32.000] into it to understand.

[07:32.000 --> 07:34.320] So I understand why they did the research.

[07:34.320 --> 07:37.500] I just don't see what the benefit is to the result of their research.

[07:37.500 --> 07:38.720] Maybe they have to iterate it.

[07:38.720 --> 07:41.920] The center embedding thing is actually pretty interesting.

[07:41.920 --> 07:48.120] I saw an example of it, and it's not what I thought it was going to be.

[07:48.120 --> 07:52.760] It gives an example of a sentence like "a man loves a woman," and then "a man that a

[07:52.760 --> 07:57.880] woman that a child knows loves," "a man that a woman that a child that a bird saw," giving

[07:57.880 --> 08:03.840] these examples of how they do the legal jargon with all those

[08:03.840 --> 08:08.520] little phrases within phrases.

[08:08.520 --> 08:12.120] And it's like grammatically correct, but it's like impossible to follow once you have like

[08:12.120 --> 08:13.440] more than one of those.

[08:13.440 --> 08:16.480] The longest one they had here was a man that a woman that a child that a bird that I heard

[08:16.480 --> 08:18.680] saw knows loves.

[08:18.680 --> 08:23.720] I don't know how that actually makes grammatical sense, but apparently it does.

[08:23.720 --> 08:26.960] It makes no sense to me even though I read it like five times in my head.

[08:26.960 --> 08:29.000] I don't know what that sentence means.

[08:29.000 --> 08:33.500] Have you guys seen... I just started watching that show Maid.

[08:33.500 --> 08:35.000] Have you guys seen that show on Netflix?

[08:35.000 --> 08:36.000] It's so good.

[08:36.000 --> 08:37.840] Oh, highly recommend.

[08:37.840 --> 08:41.660] But they do this funny thing where like she has to go to a court hearing and they kind

[08:41.660 --> 08:45.600] of show what she hears instead of what is being said.

[08:45.600 --> 08:51.680] And so she's at this custody hearing and the judge is talking to one of the prosecutors

[08:51.680 --> 08:56.620] and like mid-sentence they start going, and then legal, legal, legal, legal, so that you

[08:56.620 --> 08:57.720] can legal, legal.

[08:57.720 --> 08:59.520] I'll legal, legal after you legal.

[08:59.520 --> 09:02.520] And she like looked so confused because it's all she hears.

[09:02.520 --> 09:06.680] And they do it on the forms too when she looks at the forms like the words move and they

[09:06.680 --> 09:08.960] start to just say like legal, legal, legal, legal.

[09:08.960 --> 09:13.360] It's a great representation of exactly what you're talking about.

[09:13.360 --> 09:14.360] Totally.

[09:14.360 --> 09:17.360] The Charlie Brown teacher.

[09:17.360 --> 09:18.360] Charlie Brown.

[09:18.360 --> 09:19.360] Yeah.

[09:19.360 --> 09:20.360] Totally.

[09:20.360 --> 09:22.080] Let me get through a few more really quick.

[09:22.080 --> 09:26.720] There was a biology prize where they studied whether and how constipation affects the mating

[09:26.720 --> 09:28.360] prospects of scorpions.

[09:28.360 --> 09:36.540] There was a medical prize for showing that when patients undergo some form of toxic chemotherapy,

[09:36.540 --> 09:41.720] they suffer fewer harmful side effects when ice cream replaces ice chips.

[09:41.720 --> 09:42.720] Okay.

[09:42.720 --> 09:43.720] Reasonable to me.

[09:43.720 --> 09:44.720] Reasonable.

[09:44.720 --> 09:45.720] Unactionable.

[09:45.720 --> 09:46.720] Yep, it is.

[09:46.720 --> 09:47.720] And it's legit.

[09:47.720 --> 09:48.720] It is actually legit.

[09:48.720 --> 09:53.080] The ice cream, giving chemo patients ice cream helped them deal with side effects a lot more

[09:53.080 --> 09:55.280] than ice chips.

[09:55.280 --> 09:59.300] The engineering prize, they were trying to discover the most efficient way for people

[09:59.300 --> 10:04.400] to use their fingers when turning a knob, a doorknob.

[10:04.400 --> 10:09.920] That's, you know, so they studied people turning doorknobs and figured it out.

[10:09.920 --> 10:14.040] The physics prize was for trying to understand how ducklings manage to swim in formation.

[10:14.040 --> 10:20.520] Now this is cool because we know that fish and birds have very few rules of interaction

[10:20.520 --> 10:26.640] in order to do profound feats of being able to stay in these giant groups.

[10:26.640 --> 10:27.640] They could swim near each other.

[10:27.640 --> 10:31.600] They can fly near each other and they don't really need to have a complicated algorithm

[10:31.600 --> 10:32.600] happening.

[10:32.600 --> 10:34.560] They just have to follow a few simple rules and it works.

[10:34.560 --> 10:36.240] And apparently ducks can do it too.

[10:36.240 --> 10:40.720] There was a peace prize and this one is for developing an algorithm to help gossipers

[10:40.720 --> 10:43.800] decide when to tell the truth and when to lie.

[10:43.800 --> 10:44.800] Very important.

[10:44.800 --> 10:45.800] Right?

[10:45.800 --> 10:47.200] That's just wacky as hell.

[10:47.200 --> 10:52.740] The economics prize for explaining mathematically why success most often goes not to the most

[10:52.740 --> 10:55.460] talented people but instead to the luckiest.

[10:55.460 --> 10:56.460] That one was interesting.

[10:56.460 --> 10:57.840] I recommend you read that.

[10:57.840 --> 11:04.400] And then this last one here is safety engineering prize for developing a moose crash test dummy.

[11:04.400 --> 11:05.400] That's smart actually.

[11:05.400 --> 11:06.400] Yeah.

[11:06.400 --> 11:09.480] So you know lots of people hit these animals with their cars.

[11:09.480 --> 11:12.800] I heard you hesitate because you didn't know if you're supposed to say moose or mooses.

[11:12.800 --> 11:13.800] Meeses.

[11:13.800 --> 11:14.800] I don't know.

[11:14.800 --> 11:15.800] You were like hit these animals.

[11:15.800 --> 11:16.800] Animals.

[11:16.800 --> 11:17.800] Yeah.

[11:17.800 --> 11:18.800] I'm not going to.

[11:18.800 --> 11:19.800] What the heck?

[11:19.800 --> 11:20.800] What's the plural of moose?

[11:20.800 --> 11:21.800] Isn't it moose?

[11:21.800 --> 11:22.800] Yeah.

[11:22.800 --> 11:25.720] I think the plural of moose is moose.

[11:25.720 --> 11:27.640] I just add a K in when I don't know what to do.

[11:27.640 --> 11:29.520] Have I told you the story about the mongoose?

[11:29.520 --> 11:30.520] No.

[11:30.520 --> 11:31.520] What?

[11:31.520 --> 11:32.520] I heard.

[11:32.520 --> 11:33.520] I learned this.

[11:33.520 --> 11:43.760] I heard this in my film class where a director needed two mongoose for a scene and he couldn't

[11:43.760 --> 11:46.640] figure out what the plural of mongoose was.

[11:46.640 --> 11:49.780] You know the mongooses, mongies, whatever, he couldn't figure it out.

[11:49.780 --> 11:56.120] So he wrote in his message to the person who had to do this, I need you to get me one mongoose

[11:56.120 --> 11:59.600] and while you're at it get me another one.

[11:59.600 --> 12:04.160] Yeah, technically solves that problem.

[12:04.160 --> 12:05.160] All right.

[12:05.160 --> 12:06.160] Thanks Jay.

It's OK to Ask for Help (12:06)

[12:06.160 --> 12:08.520] Cara, is it okay to ask for help when you need it?

[12:08.520 --> 12:09.920] Is it okay?

[12:09.920 --> 12:13.400] Not only is it okay, it's great.

[12:13.400 --> 12:19.480] Let's dive into a cool study that was recently published in Psychological Science.

[12:19.480 --> 12:23.680] This study has a lot of moving parts so I'm not going to get into all of them but I have

[12:23.680 --> 12:29.520] to say just kind of at the top that I'm loving the thoroughness and I'm loving the clarity

[12:29.520 --> 12:32.320] of the writing of this research article.

[12:32.320 --> 12:37.040] I feel like it's a great example of good psychological science.

[12:37.040 --> 12:39.440] It's based on a really deep literature review.

[12:39.440 --> 12:44.120] A lot of people that we know and love like Dunning are cited who have co-written with

[12:44.120 --> 12:45.480] some of the authors.

[12:45.480 --> 12:56.720] This study basically is asking the question, why do people struggle to ask for help?

[12:56.720 --> 13:01.960] When people do ask for help, what is the outcome usually?

[13:01.960 --> 13:08.800] They did six, I think it was six or was it eight, I think it was six different individual

[13:08.800 --> 13:12.000] experiments within this larger study.

[13:12.000 --> 13:18.000] The total number of people overall that were involved, that participated in the

[13:18.000 --> 13:21.780] study, was like over 2,000.

[13:21.780 --> 13:25.180] They kind of looked at it from multiple perspectives.

[13:25.180 --> 13:30.340] They said, first we're going to ask people to imagine a scenario and tell us what they

[13:30.340 --> 13:34.640] think they would do or what they think the other person would feel or think.

[13:34.640 --> 13:39.200] Then we're going to ask them about experiences that they've actually had, like think back

[13:39.200 --> 13:43.220] to a time when you asked for help or when somebody asked you for help and then answer

[13:43.220 --> 13:44.400] all these questions.

[13:44.400 --> 13:50.280] Then they actually did a more real world kind of ecological study where they said, okay,

[13:50.280 --> 13:52.920] we're going to put a scenario in place.

[13:52.920 --> 13:59.120] Basically this scenario was in a public park, they asked people to basically go up to somebody

[13:59.120 --> 14:02.700] else and be like, hey, do you mind taking a picture for me?

[14:02.700 --> 14:10.200] They did a bunch of different really clean study designs where they took a portion of

[14:10.200 --> 14:14.280] the people and had them be the askers and a portion of the people and have them be the

[14:14.280 --> 14:17.920] non-askers and then a portion of the people and have them ask with a prompt, without a

[14:17.920 --> 14:18.920] prompt.

[14:18.920 --> 14:22.920] The study designs are pretty clean but they're kind of complex.

[14:22.920 --> 14:26.760] What do you guys think, I mean obviously you can't answer based on every single study,

[14:26.760 --> 14:31.360] but the main sort of takeaway of this was?

[14:31.360 --> 14:34.040] Asking for help is good and people are willing to give the help.

[14:34.040 --> 14:35.040] People like getting help.

[14:35.040 --> 14:36.040] Right.

[14:36.040 --> 14:37.040] Yeah.

[14:37.040 --> 14:40.440] So not only do people more often than not, and not even more often than not, like almost

[14:40.440 --> 14:46.200] all the time, especially in these low hanging fruit scenarios, do the thing that's asked

[14:46.200 --> 14:50.200] of them, but they actually feel good about it after.

[14:50.200 --> 14:54.240] They feel better having given help.

[14:54.240 --> 14:58.980] And so what they wanted to look at were some of these kind of cognitive biases basically.

[14:58.980 --> 15:04.560] They asked themselves, why are people so hesitant to ask for help?

[15:04.560 --> 15:13.120] And they believe it's because people miscalibrate their expectations about other people's prosociality,

[15:13.120 --> 15:17.640] that there's sort of a Western ideal that says people are only looking out for their

[15:17.640 --> 15:21.600] own interest and they'd rather not help anybody and only help themselves.

[15:21.600 --> 15:28.200] They also talk about something called compliance motivation.

[15:28.200 --> 15:34.160] So they think that people are, when they actually do help you out, it's less because they want

[15:34.160 --> 15:36.400] to because they're prosocial.

[15:36.400 --> 15:41.100] And it's more literally because they feel like they have to, like they feel a pull to

[15:41.100 --> 15:43.560] comply with a request.

[15:43.560 --> 15:49.000] But it turns out that in these different studies where either they're looking at a real world

[15:49.000 --> 15:54.920] example, they're asking people for imagined examples, the helpers more often than not

[15:54.920 --> 15:57.360] want to help and say that they feel good about helping.

[15:57.360 --> 16:01.760] But the people who need the help more often than not judge the helpers to not want to

[16:01.760 --> 16:06.120] help them and worry that the helpers won't want to help them.

[16:06.120 --> 16:11.200] So this is another example of kind of, do you guys remember last week, I think it was,

[16:11.200 --> 16:15.400] when I talked about a study where people were trying to calibrate how much they should talk

[16:15.400 --> 16:16.400] to be likable?

[16:16.400 --> 16:17.400] Yeah.

[16:17.400 --> 16:18.400] Yeah, so yeah.

[16:18.400 --> 16:20.120] And they were, again, miscalibrating.

[16:20.120 --> 16:22.000] They were saying, I shouldn't talk that much.

[16:22.000 --> 16:23.240] They'll like me more if I talk less.

[16:23.240 --> 16:26.360] But it turns out if you talk more, people actually like you more.

[16:26.360 --> 16:31.480] And so it's another one of those examples of a cognitive bias getting in the way of

[16:31.480 --> 16:37.080] us engaging in social behavior and actually kind of shooting ourselves in the foot because

[16:37.080 --> 16:42.580] we fear an outcome that is basically the opposite of the outcome that we'll get.

[16:42.580 --> 16:46.800] If we ask for help, we'll more than likely get it, and more than likely, the person who

[16:46.800 --> 16:50.280] helped us will feel good about having helped us, and it's really a win-win.

[16:50.280 --> 16:52.040] Of course, they caveated at the end.

[16:52.040 --> 16:53.480] We're talking about low-hanging fruit.

[16:53.480 --> 16:58.120] We're not talking about massive power differentials, you know, where one person, where there's

[16:58.120 --> 17:00.000] coercion and things like that.

[17:00.000 --> 17:05.040] But given some of those caveats to the side, basically an outcome of every single design

[17:05.040 --> 17:07.600] that they did in the study was people want to help.

[17:07.600 --> 17:09.660] And they want to help because it makes them feel good.

[17:09.660 --> 17:13.240] So maybe the next time you need help, if you ask, you shall receive.

[17:13.240 --> 17:15.400] Yeah, also, it works both ways, too.

[17:15.400 --> 17:20.520] People often don't offer help because they are afraid that they don't understand the

[17:20.520 --> 17:26.560] situation and they're basically afraid of committing a social faux pas.

[17:26.560 --> 17:32.560] And so they end up not offering help, even in a situation when they probably should,

[17:32.560 --> 17:33.560] you know, because the fear...

[17:33.560 --> 17:34.560] Like a good Samaritan?

[17:34.560 --> 17:38.840] Well, because people want to help and they want to offer to help, but they're more afraid

[17:38.840 --> 17:41.400] of doing something socially awkward, and so they don't.

[17:41.400 --> 17:47.040] So if you just don't worry about that and just offer to help, have a much lower threshold

[17:47.040 --> 17:49.120] for offering, it's like it's no big deal.

[17:49.120 --> 17:53.400] If it's like, oh, I'm fine, okay, just checking, you know, but people will not do it.

[17:53.400 --> 17:58.080] I was once walking down the street and there was a guy on the sidewalk who couldn't get

[17:58.080 --> 17:59.080] up.

[17:59.080 --> 18:00.640] He clearly could not get up on his own, right?

[18:00.640 --> 18:04.600] And there are people walking by and other people sort of like checking him out, but

[18:04.600 --> 18:06.600] nobody was saying anything or offering to help.

[18:06.600 --> 18:08.560] So I just, hey, you need a hand?

[18:08.560 --> 18:09.560] And he did.

[18:09.560 --> 18:13.080] And then like three or four people right next to him were like, oh, let me help you, you

[18:13.080 --> 18:14.160] know what I mean?

[18:14.160 --> 18:18.320] But it was just, again, it's not that they weren't bad people, they just were paralyzed

[18:18.320 --> 18:21.200] by fear of social faux pas.

[18:21.200 --> 18:24.720] So it's kind of, it's the reverse of, I guess, of what you're saying, you know, where people

[18:24.720 --> 18:27.760] might not ask for help because they're afraid that it's socially not the right thing

[18:27.760 --> 18:28.760] to do.

[18:28.760 --> 18:29.760] But it is.

[18:29.760 --> 18:30.760] Give help, ask for help.

[18:30.760 --> 18:31.760] It's all good.

[18:31.760 --> 18:32.760] Everybody likes it.

[18:32.760 --> 18:34.720] Just don't let your social fears get in the way.

[18:34.720 --> 18:35.880] Don't let your social fears get in the way.

[18:35.880 --> 18:41.360] And also there are ways to buffer if you are scared that you're, like, putting somebody out;

[18:41.360 --> 18:44.160] there are different strategies that you can use.

[18:44.160 --> 18:45.840] You can give people outs.

[18:45.840 --> 18:50.200] You know, if you really do need help, but you also really are worried that you're going

[18:50.200 --> 18:52.880] to be putting somebody out by asking for help.

[18:52.880 --> 18:58.620] You can say things like, I'd really appreciate your help in this situation, but I also understand

[18:58.620 --> 19:02.120] that it may be too much for you and don't worry, I'll still get it taken care of.

[19:02.120 --> 19:08.360] Like there are ways to buffer and to negotiate the sociality of that so that you don't feel

[19:08.360 --> 19:10.200] like you're being coercive.

[19:10.200 --> 19:14.560] And so, yeah, it's like you see it all the time.

[19:14.560 --> 19:16.920] People who get stuff in life ask for it.

[19:16.920 --> 19:22.400] Yeah, or you could do what my Italian mother does and

[19:22.400 --> 19:24.280] say, don't worry about me.

[19:24.280 --> 19:25.280] I'll be fine.

[19:25.280 --> 19:28.080] I don't need anything.

[19:28.080 --> 19:29.080] The passive guilt.

[19:29.080 --> 19:30.080] They are.

[19:30.080 --> 19:31.080] They're wonderful at that.

[19:31.080 --> 19:34.800] Don't forget guys, don't forget.

[19:34.800 --> 19:41.480] If you help somebody, they owe you someday.

[19:41.480 --> 19:44.080] You might do a favor for me, you know?

[19:44.080 --> 19:45.080] That's right.

[19:45.080 --> 19:49.600] One of those, one of those things that my dad like drilled into my head that I like

[19:49.600 --> 19:53.000] always hear in my head all the time was he would always say, you don't ask, you don't

[19:53.000 --> 19:54.000] get.

[19:54.000 --> 19:57.280] So like I literally hear that in my head all the time when I'm in scenarios where I want

[19:57.280 --> 20:03.240] to ask for something and I totally do get that way sometimes where I'm like, yeah.

Bitcoin and Fedimints (20:03)

[20:03.240 --> 20:05.240] Let me ask you something, David.

[20:05.240 --> 20:12.800] I have a feeling that Fedimints are not as tasty as they sound.

[20:12.800 --> 20:15.840] Tell us about Bitcoin and Fedimints.

[20:15.840 --> 20:16.840] Fedimints.

[20:16.840 --> 20:21.520] So a Fedimint, it's a portmanteau, I just realized I don't think I've ever said that

[20:21.520 --> 20:22.520] word out loud.

[20:22.520 --> 20:23.520] You said it right though.

[20:23.520 --> 20:24.520] Welcome to the show.

[20:24.520 --> 20:28.680] That's one of those words I've probably read like a million times.

[20:28.680 --> 20:31.820] I don't think I've ever said it in conversation or ever.

[20:31.820 --> 20:40.480] So a Fedimint, a portmanteau of federated Chaumian mint, which basically it's a way

[20:40.480 --> 20:43.960] to scale Bitcoin.

[20:43.960 --> 20:54.780] There's two big problems with Bitcoin scaling to achieve its goal of being a worldwide payment

[20:54.780 --> 20:59.120] settlement system that's decentralized.

[20:59.120 --> 21:03.240] So one of those is that it doesn't provide very good privacy.

[21:03.240 --> 21:11.280] And the other one is that custody is kind of a sticky issue.

[21:11.280 --> 21:17.800] You've got people on two extremes when it comes to custody of Bitcoin.

[21:17.800 --> 21:24.760] You've got people who are kind of the old school Bitcoiners who will say, not your keys,

[21:24.760 --> 21:30.640] not your Bitcoin, which basically they're 100% self custody.

[21:30.640 --> 21:34.480] If you're not self custodying your Bitcoin, then you're not doing it right.

[21:34.480 --> 21:39.000] But there are problems with self custody, which is basically if you theoretically had

[21:39.000 --> 21:44.480] your life savings in Bitcoin and you lost the key to your Bitcoin, then you lose all

[21:44.480 --> 21:45.840] your life savings.

[21:45.840 --> 21:47.640] It's completely irretrievable.

[21:47.640 --> 21:49.920] There's no way to get it back.

[21:49.920 --> 21:54.480] So the other option is third party custody, which the most common form is like people

[21:54.480 --> 21:59.860] will have it on Coinbase or Binance and they hold the keys for you.

[21:59.860 --> 22:02.360] All you need to have is your username and password.

[22:02.360 --> 22:07.400] And even if you lose that, you could prove your identity to them and they could restore

[22:07.400 --> 22:08.400] that for you.

[22:08.400 --> 22:10.540] They could restore your account for you.

[22:10.540 --> 22:15.840] The main problem with that is that you're hackable.

[22:15.840 --> 22:18.600] It's hackable.

[22:18.600 --> 22:25.320] It could be part of, you got a lot of rug pulls, especially in the past, you had a lot

[22:25.320 --> 22:30.680] of situations where somebody created some third party custody things simply to get people

[22:30.680 --> 22:37.480] to put their Bitcoin on there and then they just like walked away with it.

[22:37.480 --> 22:44.660] Fedimint is an idea that they call second party custody.

[22:44.660 --> 22:51.640] The basic idea of a fedimint is where a small group of people could come together, they

[22:51.640 --> 23:00.320] could create this fedimint and it would have some of the advantages of self-custody where

[23:00.320 --> 23:04.240] there would be no one point of failure where one person could get up and walk away with

[23:04.240 --> 23:11.000] everything and so that also gets rid of the risk of hacking as well because you would

[23:11.000 --> 23:14.920] need, well, it doesn't get rid of the risk, but it minimizes the risk of a hack because

[23:14.920 --> 23:21.200] you would need consensus of a group of people that say you had five people who were, they

[23:21.200 --> 23:26.280] call them the guardians of the fedimint, you would need three out of those five people

[23:26.280 --> 23:28.680] to do anything for anything to happen to the Bitcoin.

[23:28.680 --> 23:33.840] So three out of those five people would have to all get hacked by the same person in order

[23:33.840 --> 23:38.120] for somebody to be able to walk away with the Bitcoin.

[23:38.120 --> 23:43.600] It's resistant to the third party custody problems of being hacked or somebody just

[23:43.600 --> 23:52.280] pulling the rug out from under you, and the other problem it can help alleviate is the privacy problem

[23:52.280 --> 24:01.240] which is not really achievable on the main blockchain of Bitcoin, on the base level.

[24:01.240 --> 24:05.440] If you're like a really technically savvy person you can do a lot of extra steps to

[24:05.440 --> 24:13.640] make your transactions somewhat private, but it's not anything that even a moderately technically

[24:13.640 --> 24:18.280] savvy person could do and there's a lot of steps that you could mess up where it would

[24:18.280 --> 24:19.280] fail.

[24:19.280 --> 24:25.720] So anyway, so Fedimint by design is much more private; it works using kind of an old technology

[24:25.720 --> 24:31.640] called eCash and using blind signatures, which I don't know if we want to get to that, that's

[24:31.640 --> 24:38.600] kind of old stuff, that blind signature makes it so that once you have your Bitcoin in the

[24:38.600 --> 24:45.920] Fedimint, any transactions you do with that are essentially anonymous, and it would only

[24:45.920 --> 24:50.680] be dealing with anything on the blockchain if you were to withdraw from the Fedimint.
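
As a rough illustration of the m-of-n guardian quorum David is describing, here is a minimal hypothetical Python sketch. It is not Fedimint's actual protocol code (which also uses the Chaumian blind signatures he mentions for privacy); it only shows the threshold-approval idea that no single guardian, and no pair of compromised guardians, can move the funds on their own.

```python
# Hypothetical sketch of the m-of-n "guardian" idea described above; not Fedimint's
# real protocol, just the threshold logic. Names and numbers are illustrative.

def withdrawal_approved(signers, guardians, threshold=3):
    """Approve a withdrawal only if enough distinct, known guardians signed off."""
    valid_signatures = set(signers) & set(guardians)   # ignore non-guardian signatures
    return len(valid_signatures) >= threshold

guardians = ["guardian1", "guardian2", "guardian3", "guardian4", "guardian5"]

# Compromising only two of the five guardians is not enough to move funds:
print(withdrawal_approved(["guardian1", "guardian4"], guardians))               # False
# Three of five, the quorum David describes, is enough:
print(withdrawal_approved(["guardian1", "guardian4", "guardian5"], guardians))  # True
```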

[24:50.680 --> 24:53.800] Let me see if I have my head wrapped around this the correct way.

[24:53.800 --> 24:59.920] So it's a blockchain where three different people have to be hacked in order for a person

[24:59.920 --> 25:03.840] to actually get their hands on the Bitcoin or on the crypto.

[25:03.840 --> 25:09.960] And isn't the other hard part about that that would they know who the three people are inherently

[25:09.960 --> 25:13.280] by the blockchain or would that be part of the problem?

[25:13.280 --> 25:20.440] So the people who are the guardians, there could be, I think they said up to 15 and the

[25:20.440 --> 25:25.600] guardians could be anonymous or they could be public people.

[25:25.600 --> 25:31.720] The idea of the whole Fedimint is it's supposed to be, whereas the base layer of Bitcoin is

[25:31.720 --> 25:33.720] meant to be completely trustless.

[25:33.720 --> 25:41.320] There's no like third party involved in really any of the process of Bitcoin, it's all completely

[25:41.320 --> 25:45.920] automated and that's how it's designed to be.

[25:45.920 --> 25:48.120] And so this is supposed to bring a little bit of trust into it.

[25:48.120 --> 25:53.960] So realistically, most of the guardians probably you would want to be a publicly known person

[25:53.960 --> 26:01.880] so that people would trust you and your small group to custody their Bitcoin for them.

[26:01.880 --> 26:07.960] But that person who was the guardian wouldn't know any information about the people who

[26:07.960 --> 26:14.400] were participating in the Fedimint and they wouldn't have any information about what Bitcoin

[26:14.400 --> 26:17.160] is moving where or anything like that.

[26:17.160 --> 26:23.200] The other thing that is hitting me is that those people would need to, like,

[26:23.200 --> 26:28.720] be there, you know, they would have to have a presence that isn't going to just go away

[26:28.720 --> 26:29.720] suddenly.

[26:29.720 --> 26:30.720] Right.

[26:30.720 --> 26:35.440] You know, or would those three people be picked at random when a trade is happening?

[26:35.440 --> 26:37.340] Is it like permanent or random?

[26:37.340 --> 26:38.340] It's permanent.

[26:38.340 --> 26:43.720] So somebody would basically create or a small group of people would basically create a Fedimint.

[26:43.720 --> 26:51.180] The idea is that it would be implemented on different scales, but the ideal scale,

[26:51.180 --> 26:55.640] what they're trying to do, is to create this for the community level.

[26:55.640 --> 26:58.080] It's almost comparable to like a community bank.

[26:58.080 --> 26:59.080] Oh, OK.

[26:59.080 --> 27:00.080] That makes sense.

[27:00.080 --> 27:01.080] Yeah.

[27:01.080 --> 27:08.240] So and the idea of having Bitcoin banks goes all the way back to 2010, basically this idea

[27:08.240 --> 27:15.340] that you would have a bank that was very similar to a regular bank, but it would be based

[27:15.340 --> 27:16.340] on Bitcoin.

[27:16.340 --> 27:20.880] This is one of the first attempts to actually implement that, though.

[27:20.880 --> 27:25.800] As like a protocol, as opposed to being like, you know, most other solutions are just like

[27:25.800 --> 27:32.100] a company, an app, whereas this is more of a protocol layer solution where anybody could

[27:32.100 --> 27:33.100] use it.

[27:33.100 --> 27:37.120] It's not, you know, like a proprietary app or something.

[27:37.120 --> 27:38.120] That sounds interesting.

[27:38.120 --> 27:44.360] I mean, it does sound like it solves the problem of basically having your Bitcoin stolen by

[27:44.360 --> 27:48.160] someone who creates a totally like temporary exchange.

[27:48.160 --> 27:53.400] I mean, it does seem to have more, you know, more anonymous security built

[27:53.400 --> 27:56.880] into it in a sense, you know, like it does seem like it could do it.

[27:56.880 --> 28:01.320] But I mean, is it something that is happening right now, or is this

[28:01.320 --> 28:02.360] in the works?

[28:02.360 --> 28:07.600] They're hoping to be able to launch probably a prototype or,

[28:07.600 --> 28:11.700] you know, maybe a beta version or something around then.

[28:11.700 --> 28:15.560] So it's not available right now, but should be available soon, hopefully.

[28:15.560 --> 28:17.100] Well, thanks, David.

Multivitamins for Memory (28:17)

  • Effects of cocoa extract and a multivitamin on cognitive function: A randomized clinical trial [5]

[28:17.100 --> 28:23.640] On the live stream this last Friday, somebody asked us about the recent study on multivitamins

[28:23.640 --> 28:27.880] and memory, which I hadn't done a deep dive on yet, and I said I would do it for the show

[28:27.880 --> 28:28.880] this week.

[28:28.880 --> 28:29.880] So here I am.

[28:29.880 --> 28:35.120] So the study is effects of cocoa extract and a multivitamin on cognitive function, a randomized

[28:35.120 --> 28:36.780] clinical trial.

[28:36.780 --> 28:44.520] This essentially was studying both a multivitamin and cocoa extract individually and together

[28:44.520 --> 28:45.680] versus placebo.

[28:45.680 --> 28:51.600] So there were four, you know, four groups in this trial; it was randomized and placebo controlled

[28:51.600 --> 28:52.780] and blinded.

[28:52.780 --> 28:58.080] And then they followed older subjects over three years.

[28:58.080 --> 29:02.540] It was a telephone evaluation, but you could do that because it was essentially just a

[29:02.540 --> 29:04.560] verbal mental status exam.

[29:04.560 --> 29:09.640] You know, they asked them to do verbal tasks, naming tasks, trail making, whatever.

[29:09.640 --> 29:13.960] These were standardized cognitive evaluations that can be

[29:13.960 --> 29:14.960] scored.

[29:14.960 --> 29:17.920] And then you can attach a number to it.

[29:17.920 --> 29:23.960] And the bottom line is that what they found is over those three years that taking a multivitamin

[29:23.960 --> 29:31.280] every day was associated with a greater improvement in performance on these cognitive tests than

[29:31.280 --> 29:34.780] was placebo or the cocoa extract.

[29:34.780 --> 29:39.800] The cocoa extract had zero effect, so there was no apparent benefit to that.

[29:39.800 --> 29:40.800] Now I say a greater.

[29:40.800 --> 29:44.520] Yeah, I say a greater increase because everyone improved, right?

[29:44.520 --> 29:47.040] I mean, every group improved.

[29:47.040 --> 29:49.600] And that's a well-known phenomenon.

[29:49.600 --> 29:53.480] Whenever you do a study like this, it's a practice effect, right?

[29:53.480 --> 29:56.640] The second time you do the set of standardized tests, you're going

[29:56.640 --> 29:57.640] to do better.

[29:57.640 --> 29:59.600] The third time, you're going to do better.

[29:59.600 --> 30:01.040] And then it kind of plateaus.

[30:01.040 --> 30:04.600] So you can do this one of two ways.

[30:04.600 --> 30:11.840] You can give the test to your subjects until they plateau and then that's their baseline,

[30:11.840 --> 30:13.000] right?

[30:13.000 --> 30:18.520] Or you just have to compare it against the placebo and then you see who improves more,

[30:18.520 --> 30:19.520] right?

[30:19.520 --> 30:21.000] Is there a difference or not?

[30:21.000 --> 30:22.000] So that's what they chose to do.

[30:22.000 --> 30:29.920] They did not establish, they didn't do a practice series to get them to their plateau first.
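
To make the design choice concrete, here is a toy numerical sketch in Python with invented numbers (not the trial's data): everyone gains from the practice effect alone, so the analysis compares the improvement in the multivitamin group against the improvement in the placebo group, rather than against each group's own baseline score.

```python
# Invented numbers, not data from the trial discussed above. Illustrates why the
# analysis compares gains between groups: everyone improves from practice alone.

baseline_placebo = [24, 26, 25, 27, 23]
followup_placebo = [26, 28, 27, 29, 25]            # +2 each: practice effect only

baseline_vitamin = [25, 24, 26, 27, 23]
followup_vitamin = [27.5, 26.5, 28.5, 29.5, 25.5]  # +2 practice plus a small +0.5 "treatment" bump

def mean_gain(before, after):
    """Average improvement from baseline to follow-up."""
    return sum(a - b for a, b in zip(after, before)) / len(before)

placebo_gain = mean_gain(baseline_placebo, followup_placebo)   # 2.0
vitamin_gain = mean_gain(baseline_vitamin, followup_vitamin)   # 2.5

# The quantity of interest is the difference in gains, not the raw improvement,
# and here it is small relative to the practice effect itself:
print(vitamin_gain - placebo_gain)   # 0.5
```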

[30:29.920 --> 30:33.480] So a few things to put this into perspective.

[30:33.480 --> 30:36.720] First of all, this is not the first study to look at the effects, the correlation between

[30:36.720 --> 30:42.560] taking a multivitamin and cognitive function in older patients or in general

[30:42.560 --> 30:43.560] subjects.

[30:43.560 --> 30:48.320] There's been, you know, decades of research into this with pretty mixed results.

[30:48.320 --> 30:52.200] Like there's no consistent effect here, no huge effect here.

[30:52.200 --> 30:55.920] And the general interpretation of all the research has been, yeah, there's just no,

[30:55.920 --> 31:00.120] nothing that you can point to that's clearly demonstrated.

[31:00.120 --> 31:05.760] The reason the results are mixed is probably because there isn't a huge effect here.

[31:05.760 --> 31:10.440] But you know, the researchers took all the previous research into consideration.

[31:10.440 --> 31:14.920] They wanted to do a larger study, one with a longer follow-up.

[31:14.920 --> 31:18.920] Most studies were only like six months or a year, so they did three years and et cetera.

[31:18.920 --> 31:23.360] Basically just do a bigger, better, longer study to see if they could squeeze out a statistically

[31:23.360 --> 31:25.560] significant effect that way.

[31:25.560 --> 31:26.560] And they did.

[31:26.560 --> 31:27.560] So what does that mean?

[31:27.560 --> 31:31.760] So we need to, again, statistical significance, as I've said many, many times on this show,

[31:31.760 --> 31:37.440] is not the only thing to look at when evaluating the clinical significance of a study of a

[31:37.440 --> 31:38.980] medical trial.

[31:38.980 --> 31:44.180] So you have to also look at the clinical significance of the difference, right?

[31:44.180 --> 31:47.560] So what, how much of an effect size is there here?

[31:47.560 --> 31:52.940] And the bottom line is that the overall effect size was fairly small, right?

[31:52.940 --> 31:59.120] It was less than the amount that everybody improved just from the study effect, right?

[31:59.120 --> 32:01.200] Just from the practice effect.

[32:01.200 --> 32:04.700] So it was pretty small, but it was statistically significant.

[32:04.700 --> 32:07.520] And then the other question is, well, what could be going on here?

[32:07.520 --> 32:12.200] So first of all, this could be a spurious effect and as the researchers acknowledge,

[32:12.200 --> 32:18.480] we need to do this study with more individuals and with a more diverse population and we

[32:18.480 --> 32:21.600] need to gather more data to see like, is this real?

[32:21.600 --> 32:25.500] And if so, what are the, what's the probable mechanism?

[32:25.500 --> 32:33.520] For me, the big glaring omission in this test was that they did not test vitamin levels

[32:33.520 --> 32:41.660] before doing the study, because let's say older patients, older people do tend to have

[32:41.660 --> 32:45.380] lower vitamin B12 levels, for example, cause that's a hard vitamin to absorb.

[32:45.380 --> 32:48.040] You need a special molecule to bind to it.

[32:48.040 --> 32:53.400] It's called intrinsic factor and then sort of usher it over, you know, the gastric membrane.

[32:53.400 --> 32:58.000] So it doesn't just get passively absorbed and you know, that, that can decrease as we

[32:58.000 --> 33:02.480] age and people could become B12 deficient as we get older.

[33:02.480 --> 33:05.600] Pretty much I check it in every single patient that I have, cause it's just that one of,

[33:05.600 --> 33:10.120] it's a basic neurology lab that we do because it affects neurological functioning.

[33:10.120 --> 33:16.960] And you know, there's a pretty high incidence of B12, either insufficiency or deficiency

[33:16.960 --> 33:23.620] in the population being studied here and you know, multivitamins typically contain B12.

[33:23.620 --> 33:32.200] So how do they know they're not just treating undiagnosed B12 deficiency in this population,

[33:32.200 --> 33:35.920] which has a known benefit to cognition, right?

[33:35.920 --> 33:40.760] As a known benefit in terms of dementia, you know, B12 deficiency contributes to dementia

[33:40.760 --> 33:44.320] and can cause it by itself if it's bad enough for long enough.

[33:44.320 --> 33:47.920] But so anyway, that seemed like a pretty big omission to me.

[33:47.920 --> 33:51.760] And I think that definitely a follow-up study would do that because that might, you know,

[33:51.760 --> 33:57.240] treating otherwise undiagnosed deficiency could be the entire explanation here, could

[33:57.240 --> 33:58.240] be the entire effect.

[33:58.240 --> 34:02.760] For the vitamin industry, isn't that also kind of their argument?

[34:02.760 --> 34:07.280] There's an important difference though, between targeted supplementation and routine

[34:07.280 --> 34:10.040] multivitamin supplementation.

[34:10.040 --> 34:16.180] And there is known harm, at least, you know, correlations between routine multivitamin

[34:16.180 --> 34:20.600] use and negative health outcomes like heart disease.

[34:20.600 --> 34:21.600] Right.

[34:21.600 --> 34:22.880] Cause you're taking too much of something.

[34:22.880 --> 34:23.880] Yeah.

[34:23.880 --> 34:27.800] And I'll tell you just from again, having tested vitamin levels on hundreds and hundreds

[34:27.800 --> 34:33.360] of patients of all ages, but certainly many of them older, um, you know, we diagnose levels

[34:33.360 --> 34:38.100] that are, that are too high as frequently as we diagnose levels that are too low.

[34:38.100 --> 34:42.200] And so often I'm telling my patients, stop this, stop that, and add this, you know what

[34:42.200 --> 34:43.200] I mean?

[34:43.200 --> 34:44.840] Like I'm, I have to direct their supplementation.

[34:44.840 --> 34:48.560] They're taking too much of certain things and not enough of other things.

[34:48.560 --> 34:49.560] That's very, very common.

[34:49.560 --> 34:56.840] So just, you know, a blanket multivitamin without any pre-testing of vitamin levels,

[34:56.840 --> 34:58.160] I don't think is the right approach.

[34:58.160 --> 35:02.280] I don't think the evidence supports that, but, you know, all the

[35:02.280 --> 35:05.780] other evidence, if you look at it in its totality, does support that there are a lot of instances

[35:05.780 --> 35:11.560] where targeted supplementation is proven to be effective and is good, you know, medical

[35:11.560 --> 35:16.520] management, but routine multivitamin supplementation really isn't.

[35:16.520 --> 35:21.000] And this study does not really answer that question because they didn't test vitamin

[35:21.000 --> 35:22.000] levels.

[35:22.000 --> 35:26.720] So, um, they didn't make the key distinction, in my opinion.

[35:26.720 --> 35:31.480] And of course a good scientific study should, but I guess for the devil's advocate thing

[35:31.480 --> 35:36.640] that I'm asking, just trying to kind of channel the people who take multivitamins regularly

[35:36.640 --> 35:43.280] is: is there a benefit to a certain percent of the population who you know are

[35:43.280 --> 35:48.120] not going to be going in for this routine testing, who you know are very likely not

[35:48.120 --> 35:54.880] on top of their levels, a just-in-case multivitamin: does the good outweigh the harm?

[35:54.880 --> 35:58.520] So again, you can't answer that question from looking at this study,

[35:58.520 --> 36:02.280] because it's only looking at a certain number of things, but if you look at the totality

[36:02.280 --> 36:06.800] of the research into multivitamins, there seems to be a net negative, if anything, you

[36:06.800 --> 36:10.680] know, correlation with just taking a routine multivitamin.

[36:10.680 --> 36:13.580] Here's one reason why that could be a bad thing.

[36:13.580 --> 36:18.400] If you're in your sixties, you know, or let's say your seventies, the target population

[36:18.400 --> 36:22.220] in this study, you should be seeing your primary care doctor at least once a year.

[36:22.220 --> 36:26.000] So if you take a multivitamin and go, I don't have to see my primary care doctor, I'm taking

[36:26.000 --> 36:29.720] a multivitamin, that's, I'm covered, you know, that actually can have an unintended

[36:29.720 --> 36:31.520] negative consequence.

[36:31.520 --> 36:35.160] But it's hard to know how that all shakes out, you know, because there's, there's multiple

[36:35.160 --> 36:38.880] possible unintended consequences here.

[36:38.880 --> 36:42.600] But the thing is you should be seeing your primary care doctor at least annually.

[36:42.600 --> 36:47.540] And if you do, I guarantee you that they're checking your B12 level and other

[36:47.540 --> 36:48.680] vitamin levels as well.

[36:48.680 --> 36:51.600] And if you certainly, if you have any neurological symptoms, you're going to get pretty much

[36:51.600 --> 36:56.080] a full metabolic screen, you know, nutritional screen, you should anyway.

[36:56.080 --> 37:01.560] And then you'll be able to take personalized, you know, targeted supplementation.

[37:01.560 --> 37:08.000] And so, you know, there, there may be negative unintended consequences here that are not

[37:08.000 --> 37:11.680] being picked up by this study, which is mainly concerned with cognitive function.

[37:11.680 --> 37:12.680] Yeah.

[37:12.680 --> 37:15.720] And by the way, there's another consideration here and I'll use a personal anecdote.

[37:15.720 --> 37:21.400] I, you know, did have B12 testing when I was struggling, have been struggling with some,

[37:21.400 --> 37:28.960] some specific symptoms and my physician recommended, and same thing actually with iron.

[37:28.960 --> 37:34.440] So I was iron deficient and B12 deficient and sure, I could have like done targeted

[37:34.440 --> 37:38.480] supplementation by going down the vitamin aisle, but these are like not FDA approved

[37:38.480 --> 37:40.120] and I don't really know what's in them.

[37:40.120 --> 37:43.260] And you know, this is, they're not regulated very well.

[37:43.260 --> 37:46.820] And ultimately I had some issues with absorption, like with like gut problems from trying to

[37:46.820 --> 37:48.240] take oral iron.

[37:48.240 --> 37:54.640] So whatever, I was able to do B12 injections and I ended up having to get an iron infusion.

[37:54.640 --> 37:59.400] So now I'm getting prescription medication that I know is FDA approved.

[37:59.400 --> 38:02.880] I know it's made in a lab, it's clean, it's been tested.

[38:02.880 --> 38:05.640] And the same is true of many oral vitamins as well.

[38:05.640 --> 38:12.920] And I write prescriptions for vitamins to my patients so they know exactly what they're

[38:12.920 --> 38:15.120] getting and what dose and everything.

[38:15.120 --> 38:16.280] And that's the other thing.

[38:16.280 --> 38:21.840] If I'm prescribing B12 supplements to a patient who has B12 deficiency, who's whatever in

[38:21.840 --> 38:26.400] their sixties or seventies, I then have to check followup levels because they may not

[38:26.400 --> 38:27.760] be absorbing the B12.

[38:27.760 --> 38:28.760] It may not work.

[38:28.760 --> 38:29.760] You know, oral B12 may not.

[38:29.760 --> 38:30.760] That's what happened to me.

[38:30.760 --> 38:31.760] Yeah.

[38:31.760 --> 38:32.760] And then you have to get the injections.

[38:32.760 --> 38:35.200] You have to bypass the gut and do the injections.

[38:35.200 --> 38:42.160] So again, just taking a multivitamin may not be addressing the actual, the actual problem.

[38:42.160 --> 38:46.720] And if people are doing that, instead of getting their levels checked, that could have a net

[38:46.720 --> 38:47.720] negative effect.

[38:47.720 --> 38:51.520] So you have to compare it to, you know, like best practices also.

[38:51.520 --> 38:52.520] Yeah.

[38:52.520 --> 38:56.920] It still could be true that, you know, there's an intention to treat analysis here, although

[38:56.920 --> 39:00.360] I will say that there was about a 10% dropout, which they didn't count.

[39:00.360 --> 39:03.680] And you know, that was probably enough people to affect the outcome of the study because

[39:03.680 --> 39:06.520] the effect sizes were not that huge.

[39:06.520 --> 39:10.040] And you have to wonder why, you know, there was that dropout and

[39:10.040 --> 39:11.440] what was going on with those individuals.

[39:11.440 --> 39:12.440] Right.

[39:12.440 --> 39:13.440] Did that bias the outcome?

[39:13.440 --> 39:14.440] Yeah.

[39:14.440 --> 39:15.440] Right.

[39:15.440 --> 39:16.440] Exactly.

[39:16.440 --> 39:17.520] You know, that's, that's why you have to disclose the dropout rate.
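(For anyone curious why that matters: here is a minimal sketch in Python, with made-up numbers rather than anything from the actual study, of how a non-random 10% dropout can inflate an apparent effect compared with an intention-to-treat analysis when the true effect is small.)

 import random
 from statistics import fmean

 # Minimal sketch with hypothetical numbers (not the study's data): when the true effect
 # is small, a ~10% dropout that is not random can distort a completers-only analysis.
 random.seed(0)
 n = 2000
 treated = [random.gauss(0.07, 1.0) for _ in range(n)]   # small true benefit, 0.07 SD
 control = [random.gauss(0.00, 1.0) for _ in range(n)]

 # Informative dropout: suppose the 10% of treated participants doing worst drop out.
 completers = sorted(treated)[int(0.10 * n):]

 print("intention-to-treat effect:", round(fmean(treated) - fmean(control), 3))
 print("completers-only effect:   ", round(fmean(completers) - fmean(control), 3))

(With these placeholder numbers the completers-only difference comes out several times larger than the intention-to-treat difference, which is the kind of bias an undisclosed dropout can hide.)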

[39:17.520 --> 39:19.800] But in any case, yeah, I mean, the thing is, yeah, sure.

[39:19.800 --> 39:24.040] Some people in the study were probably helped by taking, by taking a multivitamin.

[39:24.040 --> 39:25.040] That's true.

[39:25.040 --> 39:30.280] But you know, we definitely, my concern is that because there's already a huge cultural

[39:30.280 --> 39:35.760] impetus to just take a multivitamin just in case, that if people get the bottom line message

[39:35.760 --> 39:39.560] from this, that taking the multivitamin is good and that's all they have to worry about,

[39:39.560 --> 39:41.800] that could have a net negative effect.

[39:41.800 --> 39:45.640] And that we really do want to get the message out that, you know, vitamins are a medical

[39:45.640 --> 39:46.640] intervention.

[39:46.640 --> 39:50.200] You know, you do need to have your levels checked, especially when you get older, especially

[39:50.200 --> 39:55.800] if you have certain medical concerns or symptoms and things need to be done in an evidence

[39:55.800 --> 39:58.120] based way, not just shooting from the hip.

[39:58.120 --> 39:59.120] Take a multivitamin.

[39:59.120 --> 40:00.120] Don't worry about it.

[40:00.120 --> 40:01.120] Right.

[40:01.120 --> 40:04.080] Like you wouldn't take an, well, maybe aspirin is different, but I wouldn't take like a

[40:04.080 --> 40:08.280] full dose of ibuprofen every day just in case I'm going to get a headache.

[40:08.280 --> 40:12.040] That's dangerous.

[40:12.040 --> 40:16.480] The dose ranges are pretty broad for vitamins, but, you know, we do see that vitamin toxicity

[40:16.480 --> 40:17.480] does happen.

[40:17.480 --> 40:18.480] Yeah.

[40:18.480 --> 40:20.320] Because you don't know what else they're having in their

[40:20.320 --> 40:23.000] regular diet, what else they're consuming.

[40:23.000 --> 40:26.080] And sometimes when people are really doing it, like for these health reasons, they're taking

[40:26.080 --> 40:30.640] like thousands of percent of the recommended daily intake.

[40:30.640 --> 40:32.840] Well, the other thing is, yeah, they might be taking too much.

[40:32.840 --> 40:33.840] They don't know what dose to take.

[40:33.840 --> 40:35.240] The other thing, or they might be taking too little.

[40:35.240 --> 40:36.760] That's the other thing.

[40:36.760 --> 40:42.480] There are, like I've had patients who've had like, you know, vitamin B6 levels that are

[40:42.480 --> 40:47.800] 10 times the upper limit of normal, like really super high levels.

[40:47.800 --> 40:51.720] And they tell me that, as far as they know, they're not supplementing.

[40:51.720 --> 40:57.120] The thing is there's so much embedded supplementation in, you know, food.

[40:57.120 --> 40:58.600] Oh, like fortified foods.

[40:58.600 --> 41:03.480] Fortified, like are you drinking vitamin water or whatever, the cereal, whatever, and they

[41:03.480 --> 41:08.360] might be getting too much of certain vitamins without specifically taking a multivitamin

[41:08.360 --> 41:10.560] or specifically supplementing.

[41:10.560 --> 41:14.040] So again, it's why it's good to sort of have a conversation with them about like what they're

[41:14.040 --> 41:17.840] eating and what they're not eating and then, and what their levels are and, and again,

[41:17.840 --> 41:23.080] try to give them some specific personalized advice about their diet as well as, you know,

[41:23.080 --> 41:26.000] what may or may not need to be supplemented.

[41:26.000 --> 41:28.320] That's the direction I definitely would like to see things go.

[41:28.320 --> 41:31.640] And that's, you know, I think neurology is there for the things that we are concerned

[41:31.640 --> 41:36.440] about because, you know, actual deficiencies in the, in these vitamins can cause neurological

[41:36.440 --> 41:40.680] symptoms and dementia being a big one, you know, so, you know, we're kind of already

[41:40.680 --> 41:41.840] all over that.

[41:41.840 --> 41:49.400] But anyway, again, this is one study embedded in, you know, decades of research

[41:49.400 --> 41:51.960] showing results kind of all over the place.

[41:51.960 --> 41:56.840] So we can't, you know, look at this as if it's conclusive. We definitely,

[41:56.840 --> 42:01.480] I would definitely like to see, you know, better controlled, more thorough, larger studies

[42:01.480 --> 42:05.280] with more diverse populations, and see if there's a consistent effect

[42:05.280 --> 42:06.280] here.

Refreezing the Poles (42:07)

[42:06.280 --> 42:07.280] All right, Bob.

[42:07.280 --> 42:08.280] Yes.

[42:08.280 --> 42:10.300] I mean, is this serious?

[42:10.300 --> 42:15.000] They're talking about refreezing the poles if they melt.

[42:15.000 --> 42:16.000] Yeah.

[42:16.000 --> 42:17.080] Check this, check this out.

[42:17.080 --> 42:20.360] So this is geoengineering in the news.

[42:20.360 --> 42:24.840] This time it's a new study that's determined that deploying aerosols once a year, just

[42:24.840 --> 42:29.960] at the poles is not only feasible with current technology levels, but could also potentially

[42:29.960 --> 42:34.440] reverse the alarming ice melt that we are seeing there.

[42:34.440 --> 42:39.120] This was published recently in Environmental Research Communications, lead author, Wake

[42:39.120 --> 42:44.240] Smith, Steve's buddy at Yale, he's a lecturer at Yale, and senior fellow at the Harvard

[42:44.240 --> 42:45.560] Kennedy School.

[42:45.560 --> 42:50.160] So all right, so we're all aware of how climate change is basically like in our face now,

[42:50.160 --> 42:51.160] right?

[42:51.160 --> 42:55.440] It's like, no one's really asking anymore, not much anyway, is that because of climate

[42:55.440 --> 42:56.440] change?

[42:56.440 --> 43:00.960] The answer is pretty much yes, for a lot of this stuff that we're seeing.

[43:00.960 --> 43:03.480] And it's only going to get worse.

[43:03.480 --> 43:09.520] And so it's worth stressing that the poles of the Earth have been especially and worryingly

[43:09.520 --> 43:13.480] impacted above and beyond the global average.

[43:13.480 --> 43:16.600] So I mean, look at the heat waves this past year at both poles.

[43:16.600 --> 43:18.720] They're breaking all previous records.

[43:18.720 --> 43:24.560] But the Arctic has been hit even harder, the harder hit of the two, warming

[43:24.560 --> 43:27.200] at twice the global average.

[43:27.200 --> 43:29.840] And this is due to what's called the Arctic amplification.

[43:29.840 --> 43:32.560] And this has multiple causes.

[43:32.560 --> 43:37.920] One is the reduction in snow and sea ice albedo.

[43:37.920 --> 43:44.480] So it's a reduction in the snow and sea ice's coefficient of reflectivity.

[43:44.480 --> 43:48.760] Increased heating due to increasing Arctic cloud cover and water vapor content, that's

[43:48.760 --> 43:50.800] also adding to the amplification.

[43:50.800 --> 43:55.680] And there's also more energy flowing from the lower latitudes to the Arctic for various

[43:55.680 --> 43:56.680] reasons.

[43:56.680 --> 44:00.440] And finally, there's more soot and black carbon aerosols in the atmosphere.

[44:00.440 --> 44:03.100] And that's causing even more heat to be absorbed.

[44:03.100 --> 44:08.360] So all of this is kind of like getting together into like a trifecta of amplification in the

[44:08.360 --> 44:11.320] Arctic that's making it as bad as we're seeing it there.

[44:11.320 --> 44:12.320] So here's an example.

[44:12.320 --> 44:18.200] Did you guys know that the Arctic annual mean surface temperature from 1971 to 2019 has already

[44:18.200 --> 44:20.920] increased over three degrees Celsius?

[44:20.920 --> 44:25.400] Already from 1971 to 2019, wow.

[44:25.400 --> 44:26.400] So now what?

[44:26.400 --> 44:30.000] Now the Antarctic is not as dramatically bad.

[44:30.000 --> 44:36.620] But I mean, there's also the melting Antarctic ice sheet, which I think is very scary.

[44:36.620 --> 44:38.840] That could very well be a climate change tipping point.

[44:38.840 --> 44:40.360] A lot of people are talking about it.

[44:40.360 --> 44:44.200] A lot of the news coming from there is a little scary.

[44:44.200 --> 44:47.680] So with all that as the foundation, put that in your head.

[44:47.680 --> 44:52.640] So let's look at the type of geoengineering that was evaluated in this recent study that

[44:52.640 --> 44:53.640] I'm going to talk about.

[44:53.640 --> 45:00.200] So it's called SAI, for Stratospheric Aerosol Injection, which is designed to increase Earth's

[45:00.200 --> 45:07.240] albedo, or how much light it reflects, which would reduce the amount of global warming,

[45:07.240 --> 45:08.240] right?

[45:08.240 --> 45:09.920] Very, very simple at that level.
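(For reference, the idea can be written as the standard zero-dimensional energy balance from textbook radiative physics; this is general background, not a formula taken from the paper. With solar constant S, planetary albedo alpha, and the Stefan-Boltzmann constant sigma:

 \sigma T_{\mathrm{eq}}^{4} = \frac{(1-\alpha)\,S}{4}
 \qquad\Longrightarrow\qquad
 T_{\mathrm{eq}} = \left[\frac{(1-\alpha)\,S}{4\sigma}\right]^{1/4}

Plugging in S of roughly 1361 W/m^2, any increase in alpha lowers the equilibrium temperature, which is all an injected aerosol layer is meant to do.)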

[45:09.920 --> 45:14.400] Now the researchers stress appropriately that this type of geoengineering is a bandaid.

[45:14.400 --> 45:19.080] It's part of an overall strategy that should include other climate strategies like mitigation,

[45:19.080 --> 45:21.480] adaptation, and carbon dioxide removal.

[45:21.480 --> 45:23.320] Those are the big boys.

[45:23.320 --> 45:24.320] That's what you want.

[45:24.320 --> 45:25.320] So this is just a bandaid.

[45:25.320 --> 45:30.280] This is something that could help delay things basically until we get our shit together.

[45:30.280 --> 45:35.840] Now if you've been following stratospheric aerosol injection, as I know we all have,

[45:35.840 --> 45:37.680] you know how controversial it is, right?

[45:37.680 --> 45:39.680] It's very controversial.

[45:39.680 --> 45:40.680] Think about it.

[45:40.680 --> 45:44.960] Widespread deployment of chemicals into the upper atmosphere all over the globe.

[45:44.960 --> 45:49.840] Oh gosh, the chemtrail nutters are coming out of the woodwork for this one.

[45:49.840 --> 45:52.640] But it would seriously help cool down the Earth.

[45:52.640 --> 45:54.360] I mean, there's pretty much no doubt about that.

[45:54.360 --> 45:55.360] I mean, it's simple.

[45:55.360 --> 45:59.000] Wait, wait, Bob, you're jumping right to this is going to work?

[45:59.000 --> 46:00.000] Like, no, no, no.

[46:00.000 --> 46:01.000] I'm saying-

[46:01.000 --> 46:02.000] No, but like what are the unintended consequences?

[46:02.000 --> 46:03.000] Oh my God.

[46:03.000 --> 46:04.000] Yes.

[46:04.000 --> 46:07.680] I'm talking that in general, stratospheric aerosol injection will cool down the Earth.

[46:07.680 --> 46:09.120] That's not in question.

[46:09.120 --> 46:13.120] Isn't that the premise of Snowpiercer, that movie and TV show?

[46:13.120 --> 46:17.320] They inject the aerosols and they cool the Earth like minus 160 degrees.

[46:17.320 --> 46:18.320] Oh boy.

[46:18.320 --> 46:19.320] That was deliberate?

[46:19.320 --> 46:20.320] Oh my gosh.

[46:20.320 --> 46:21.320] Yeah.

[46:21.320 --> 46:27.040] Is that aerosol just like vaporized water or is it some other-

[46:27.040 --> 46:31.760] There's various chemicals that could be used, like sulfur dioxide.

[46:31.760 --> 46:32.760] So-

[46:32.760 --> 46:33.760] Yeah, it's not just water.

[46:33.760 --> 46:34.760] Yeah.

[46:34.760 --> 46:40.200] So, I was saying there's very little controversy that it would help cool the Earth, but the

[46:40.200 --> 46:43.080] unwanted side effects and the expense are the huge things.

[46:43.080 --> 46:48.240] I mean, we can get side effects that could be worse than the climate change itself.

[46:48.240 --> 46:50.040] And I mean, it could be horrific.

[46:50.040 --> 46:54.240] So, I mean, this is really not what this study was about.

[46:54.240 --> 46:58.960] I mean, we're not even close to understanding the impact to the people, to the environment.

[46:58.960 --> 47:00.480] So yeah.

[47:00.480 --> 47:05.520] And then don't forget, there's also the money it would cost for such a global geoengineering

[47:05.520 --> 47:07.320] effort, which is off the hook.

[47:07.320 --> 47:10.800] And even the technology to do it, we're not even there.

[47:10.800 --> 47:14.440] So as you can probably tell, there's a big but right here.

[47:14.440 --> 47:18.480] Now Jay, say it with me on four, one, two, three, but-

[47:18.480 --> 47:19.480] Sexual innuendo.

[47:19.480 --> 47:20.480] What?

[47:20.480 --> 47:21.480] All right.

[47:21.480 --> 47:22.480] It's close.

[47:22.480 --> 47:23.480] It's close.

[47:23.480 --> 47:31.520] But most of the research and studies and simulations and speculation of stratospheric aerosol injection

[47:31.520 --> 47:37.200] has all been about global solar geoengineering, deploying aerosols globally in order to lower

[47:37.200 --> 47:38.540] temperatures worldwide.

[47:38.540 --> 47:42.160] That's what, if people are talking about it and you're reading about it, chances are that's

[47:42.160 --> 47:45.840] what they're talking about, global insertion.

[47:45.840 --> 47:50.120] And that's not what this paper is about.

[47:50.120 --> 47:54.760] There's only actually been a few studies dealing with a more limited version of this called

[47:54.760 --> 47:56.480] subpolar geoengineering.

[47:56.480 --> 47:59.000] And this is what the study is about.

[47:59.000 --> 48:06.080] Now this flavor of SAI has geographically limited deployments, in this case, 60 degrees

[48:06.080 --> 48:09.680] north and south, the subpolar regions around the poles.

[48:09.680 --> 48:14.520] Now, right off the bat, if you know about this, this strategy has some interesting potential

[48:14.520 --> 48:15.520] benefits.

[48:15.520 --> 48:19.080] First off, it's technically easier to pull off because at the poles, the troposphere

[48:19.080 --> 48:25.040] is lower, which means that you could insert the aerosols at a lower altitude than anywhere

[48:25.040 --> 48:26.680] else on the planet.

[48:26.680 --> 48:27.680] So that's huge.

[48:27.680 --> 48:29.040] That is absolutely huge.

[48:29.040 --> 48:31.080] You don't have to go nearly as high.

[48:31.080 --> 48:37.280] Getting a plane filled with these aerosols at such incredible altitude is really hard.

[48:37.280 --> 48:41.680] And so this is a big bonus for these subpolar insertions.

[48:41.680 --> 48:46.240] And the other benefit is that there have been some of these polar studies and they have

[48:46.240 --> 48:52.000] shown that an Arctic deployment would be better at preserving sea ice than the global

[48:52.000 --> 48:53.380] equatorial injection.

[48:53.380 --> 48:55.240] So that's big as well.

[48:55.240 --> 48:56.240] Okay.

[48:56.240 --> 49:01.400] So the lead author, Wake Smith, says, there's widespread and sensible trepidation

[49:01.400 --> 49:04.240] about deploying aerosols to cool the planet.

[49:04.240 --> 49:09.440] But if the risk benefit equation were to pay off anywhere, it would be at the poles.

[49:09.440 --> 49:12.800] So it was like, if this technique is going to work at all, this is where it's going to

[49:12.800 --> 49:13.800] work.

[49:13.800 --> 49:15.140] Now I recommend reading the study.

[49:15.140 --> 49:17.920] It's actually very, very accessible.

[49:17.920 --> 49:22.640] It's called "A Subpolar-Focused Stratospheric Aerosol Injection Deployment Scenario."

[49:22.640 --> 49:28.760] So now it takes a deep dive, as you might imagine, into this whole idea of subpolar

[49:28.760 --> 49:33.800] geoengineering and what it would actually be like, what would it involve.

[49:33.800 --> 49:35.180] So I'll cut to the chase.

[49:35.180 --> 49:40.320] The paper's conclusion says substantially cooling the world's polar and subpolar regions

[49:40.320 --> 49:42.400] would be logistically feasible.

[49:42.400 --> 49:47.360] So in this study, they've determined that it is absolutely logistically feasible to pull

[49:47.360 --> 49:48.360] this off.

[49:48.360 --> 49:55.060] I'll continue quoting, this could arrest and likely reverse the melting of sea ice, land

[49:55.060 --> 50:00.040] ice and permafrost in the most vulnerable regions of the earth's cryosphere.

[50:00.040 --> 50:02.380] The cryosphere is an awesome word.

[50:02.380 --> 50:06.800] It's just the parts of the earth that have frozen, that have ice, frozen water.

[50:06.800 --> 50:08.060] And then it ends here.

[50:08.060 --> 50:12.400] This in turn would substantially slow sea level rise globally.

[50:12.400 --> 50:16.120] So that to me, that's fantastic.

[50:16.120 --> 50:17.120] That's a wonderful result.

[50:17.120 --> 50:20.720] I couldn't really hope for too much more in the conclusion.

[50:20.720 --> 50:25.640] Their plan could reduce the average surface temperatures north of 60 degrees by a year

[50:25.640 --> 50:29.180] round average of two degrees Celsius.

[50:29.180 --> 50:35.060] That basically could get it back to close to pre-industrial average temperatures.

[50:35.060 --> 50:36.100] Amazing reduction.

[50:36.100 --> 50:41.120] So some of the details are interesting, like to do that, they said that we would need 125

[50:41.120 --> 50:46.480] high altitude tankers that would have to be built because we can't really use hand me

[50:46.480 --> 50:49.920] down planes or planes that exist right now.

[50:49.920 --> 50:55.080] They're not good enough to get enough of the aerosol at high enough altitude in order to

[50:55.080 --> 50:56.120] pull this off.

[50:56.120 --> 51:01.200] So we would need to design and build these high altitude tankers, but it's totally doable.

[51:01.200 --> 51:04.920] This is not FTL, faster than light drive we're creating here.

[51:04.920 --> 51:09.840] There's just high altitude tankers, not that difficult in the grand scheme of things.

[51:09.840 --> 51:15.120] So they would release trillions of grams of aerosols like sulfur dioxide once per year

[51:15.120 --> 51:16.940] during each pole spring.

[51:16.940 --> 51:21.560] So the idea is that they would go to say the North Pole and then in the spring release

[51:21.560 --> 51:24.800] these trillions of grams of the aerosols.

[51:24.800 --> 51:28.700] And then they, and then when they were done at the North Pole, they would travel to the

[51:28.700 --> 51:33.040] South Pole and get there by the spring of the South Pole and do it there and just kind

[51:33.040 --> 51:34.680] of go back and forth.

[51:34.680 --> 51:39.520] They would inject it at 43,000 feet, which is above airline cruising altitudes.

[51:39.520 --> 51:44.080] And as it slowly drifted down, it would basically just shade the ground.

[51:44.080 --> 51:45.080] That's it.

[51:45.080 --> 51:46.080] Just shading the ground.

[51:46.080 --> 51:51.600] It's preventing some of the light from getting to the planet, reflecting it away, and

[51:51.600 --> 51:52.940] then it would cool the planet.

[51:52.940 --> 51:57.040] So they estimate that this would cost 11 billion USD.

[51:57.040 --> 52:01.240] So in the grand scheme of things, not a lot, not a lot at all, right?
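(A quick back-of-envelope unit check on those figures; the yearly tonnage below is a hypothetical placeholder standing in for "trillions of grams", not a number taken from the paper.)

 # Back-of-envelope unit check on the figures quoted in this segment.
 grams_per_year = 12e12                     # hypothetical: ~12 trillion grams per year
 tonnes_per_year = grams_per_year / 1e6     # 1 tonne = 1,000,000 grams -> 12 million tonnes
 cost_usd_per_year = 11e9                   # roughly $11 billion per year, as quoted

 print(f"{tonnes_per_year / 1e6:.0f} megatonnes of aerosol per year")
 print(f"about ${cost_usd_per_year / tonnes_per_year:,.0f} per tonne delivered")

(With those placeholder numbers, "trillions of grams" is on the order of ten megatonnes a year, and $11 billion a year works out to something like a thousand dollars per tonne delivered.)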

[52:01.240 --> 52:04.880] So here's a quote from Smith and I like this.

[52:04.880 --> 52:09.500] He says, game changing though this could be in a rapidly warming world, stratospheric

[52:09.500 --> 52:14.560] aerosol injections merely treat a symptom of climate change, but not the underlying

[52:14.560 --> 52:15.560] disease.

[52:15.560 --> 52:17.440] It's aspirin, not penicillin.

[52:17.440 --> 52:19.640] It's not a substitute for decarbonization.

[52:19.640 --> 52:21.440] And that's critical.

[52:21.440 --> 52:25.600] This is just as he says, it's a band-aid, it's aspirin, it's not penicillin.

[52:25.600 --> 52:26.600] I love that.

[52:26.600 --> 52:29.760] And of course there's some potential downsides, right?

[52:29.760 --> 52:35.120] There's lots of unknowns, but the obvious artifacts of this type of insertion would

[52:35.120 --> 52:39.280] include things like increased sulfur deposition, duh, right?

[52:39.280 --> 52:41.960] Of course, delay, right?

[52:41.960 --> 52:43.840] Delayed ozone layer recovery.

[52:43.840 --> 52:44.840] Okay.

[52:44.840 --> 52:45.840] There's that.

[52:45.840 --> 52:46.840] That doesn't.

[52:46.840 --> 52:49.600] Well, that doesn't sound great either.

[52:49.600 --> 52:51.440] Increased stratospheric heating.

[52:51.440 --> 52:56.280] And there's also the fact that the deployment effort itself would marginally work against this cooling

[52:56.280 --> 52:57.280] effort, right?

[52:57.280 --> 53:01.840] All this extra fuel that's being combusted by the planes and the supply chain emissions

[53:01.840 --> 53:05.280] and even increased contrails, all of that would work against you a little bit.

[53:05.280 --> 53:06.280] It's marginal.

[53:06.280 --> 53:12.920] It's not a deal breaker by their estimation in any way, but it's also a downside.

[53:12.920 --> 53:18.320] And all of these negative impacts though would be obviously far less than a global deployment.

[53:18.320 --> 53:25.320] A global deployment year round would have far more of these types of downsides.

[53:25.320 --> 53:29.000] And let me stress again, these are the downsides that are the low-hanging fruit, the obvious ones.

[53:29.000 --> 53:30.720] Yeah, of course they are.

[53:30.720 --> 53:34.440] And so we don't know for sure what some of the other ones could be.

[53:34.440 --> 53:40.000] And then another potentially big obstacle is the fact that a dozen nations are in these

[53:40.000 --> 53:42.640] sub polar areas and they would all have to agree.

[53:42.640 --> 53:46.380] They'd all have to agree and get along, you know, to make this happen.

[53:46.380 --> 53:48.920] So that could be a deal breaker right there.

[53:48.920 --> 53:52.900] So I'll finish with another quote from the paper's conclusion by Smith.

[53:52.900 --> 53:59.120] He said, an SAI program with global benefits that would entail deployment directly overhead

[53:59.120 --> 54:04.520] of far less than 1% of the world's population and nearly none of its agriculture may prove

[54:04.520 --> 54:09.680] an easier sell to a skeptical world than a full on global deployment.

[54:09.680 --> 54:14.840] Given its apparent feasibility and low cost, the scenario deserves further attention.

[54:14.840 --> 54:15.840] And that's all he's saying.

[54:15.840 --> 54:17.360] He's not saying do this.

[54:17.360 --> 54:18.880] He's like, look at these results.

[54:18.880 --> 54:20.880] Look at these preliminary results.

[54:20.880 --> 54:25.800] Let's look into this more because this could be something that becomes very, very important.

[54:25.800 --> 54:27.520] And I mean, it's going to take time.

[54:27.520 --> 54:31.160] We can't do this, you know, if we decide to do it, it's going to take 10 years to build

[54:31.160 --> 54:34.120] the planes or, you know, to get the infrastructure in place.

[54:34.120 --> 54:38.220] It's going to take like a decade before we can really get going.

[54:38.220 --> 54:44.000] So I think we should spend more money to look into this more deeply and see how feasible

[54:44.000 --> 54:45.000] this really is.

[54:45.000 --> 54:48.800] I mean, you almost can make an argument for like starting to build the planes now while

[54:48.800 --> 54:53.560] we study it so that if we decide in five years that, yeah, this is a good idea, you know,

[54:53.560 --> 54:58.160] we're halfway there, you can probably repurpose them for something else if it turns out we

[54:58.160 --> 54:59.160] don't need them.

[54:59.160 --> 55:00.160] Yeah.

[55:00.160 --> 55:01.160] I mean, that's called hedging your bets.

[55:01.160 --> 55:02.800] And we need to do that with global warming.

[55:02.800 --> 55:03.800] Yeah.

[55:03.800 --> 55:04.800] Like we don't have the luxury to be like.

[55:04.800 --> 55:05.800] Yeah.

[55:05.800 --> 55:06.800] We need to be investing in every potential option.

[55:06.800 --> 55:07.800] Right.

[55:07.800 --> 55:11.800] And this is not a, and this doesn't cripple the economy or, you know, send us into spiraling

[55:11.800 --> 55:12.800] debt or anything.

[55:12.800 --> 55:13.800] Oh, my God, Al.

[55:13.800 --> 55:14.800] Eleven billion.

[55:14.800 --> 55:15.800] Not even close.

[55:15.800 --> 55:20.080] I mean, it's a billion before we could do further research to really know if we should

[55:20.080 --> 55:21.080] go forward.

[55:21.080 --> 55:23.880] So, again, normally I wouldn't be in favor of that kind of thing.

[55:23.880 --> 55:29.320] But because we're up against the clock, it's probably the lesser of two evils to just to

[55:29.320 --> 55:30.320] do that.

[55:30.320 --> 55:31.320] All right.

[55:31.320 --> 55:32.320] I mean, imagine if we could delay sea level rise.

[55:32.320 --> 55:40.720] I mean, Bob, your news item is over. Meters and meters and meters and meters.

[55:40.720 --> 55:41.720] Get a clue.

[55:41.720 --> 55:47.880] Who says, all right, that means shut the fuck up, time to move on.

[55:47.880 --> 55:48.880] But he's only half done.

Neuro Emotional Technique (55:50)

[55:48.880 --> 55:53.920] All right, Evan, tell us, what is the neuro emotional technique?

[55:53.920 --> 55:56.400] I don't have a good feeling about this.

[55:56.400 --> 55:59.920] Wait, but Steve, it has the word neuro in it.

[55:59.920 --> 56:04.960] I mean, we're talking about legitimate science here, right?

[56:04.960 --> 56:10.120] So this article popped up this week, courtesy of McGill University, about something called

[56:10.120 --> 56:11.960] neuro emotional technique.

[56:11.960 --> 56:12.960] What is it?

[56:12.960 --> 56:13.960] It's NET.

[56:13.960 --> 56:14.960] I'm going to use NET for short.

[56:14.960 --> 56:16.800] OK, so what is NET?

[56:16.800 --> 56:18.840] Well, that depends on who you ask.

[56:18.840 --> 56:24.440] For example, there is an organization called One Research Foundation or ORF.

[56:24.440 --> 56:26.880] I'm going to call them ORF.

[56:26.880 --> 56:29.600] Here is their description.

[56:29.600 --> 56:34.920] Neuro emotional technique, NET, is based on the concept that unresolved emotional trauma

[56:34.920 --> 56:37.680] is stored in the body.

[56:37.680 --> 56:44.300] Emotions such as excessive unresolved anger, grief and fear may long affect us even after

[56:44.300 --> 56:47.160] we have forgotten the event that caused them.

[56:47.160 --> 56:53.160] Since our emotional reality dramatically affects our health, using NET to identify these negatively

[56:53.160 --> 56:59.880] charged emotions, also known as neuro emotional complexes, and releasing them can normalize

[56:59.880 --> 57:02.960] abnormal physical and behavioral patterns.

[57:02.960 --> 57:05.040] NET is safe and effective.

[57:05.040 --> 57:09.520] It's a natural way to resolve longstanding health problems by resolving the emotional

[57:09.520 --> 57:12.320] components that accompany the physical symptoms.

[57:12.320 --> 57:17.160] NET practitioners can find unresolved negatively charged emotional responses that are stored

[57:17.160 --> 57:19.760] in your body and help you release them.

[57:19.760 --> 57:24.080] Now NET has been used to successfully treat headaches, body aches, chronic pain, digestive

[57:24.080 --> 57:30.140] issues, phobias, general anxiety, organ dysfunction, and self-sabotaging behaviors.

[57:30.140 --> 57:35.480] That sounds like Scientology but without the aliens.

[57:35.480 --> 57:40.160] The thing that's so frustrating is they're taking something that has, again, a kernel

[57:40.160 --> 57:49.160] of truth and they're blowing it up into some sort of specialized voodoo treatment.

[57:49.160 --> 57:53.960] If you go to any pain clinic in any legitimate hospital for people who are struggling with

[57:53.960 --> 57:57.460] chronic pain, they know there's a psychological component.

[57:57.460 --> 57:59.960] You're going to be working on the psychological stuff too.

[57:59.960 --> 58:04.460] There are psychiatrists and psychologists working in any legitimate pain clinic to help

[58:04.460 --> 58:10.740] understand why this chronic pain is persistent even after the source of the pain is gone.

[58:10.740 --> 58:16.040] We know that this is a real phenomenon, but it's not like acupuncture.

[58:16.040 --> 58:19.760] It's not like, oh, your liver doesn't work because you were abused as a child.

[58:19.760 --> 58:21.360] It's not that simple.

[58:21.360 --> 58:27.360] That you bring up voodoo and acupuncture both, Kara, is very, very astute of you because

[58:27.360 --> 58:31.800] the author of this article, his name is Jonathan Jarry from McGill University, he puts it this

[58:31.800 --> 58:32.800] way.

[58:32.800 --> 58:33.800] This is how he defines it.

[58:33.800 --> 58:40.880] NET is a pseudoscientific Voltron made up of chiropractic manipulations, applied kinesiology,

[58:40.880 --> 58:46.640] and traditional Chinese medicine and its goal is to expand chiropractors' scope of practice.

[58:46.640 --> 58:47.640] Of course.

[58:47.640 --> 58:52.000] I'm sorry, but chiropractors are not psychiatrists and they're not psychologists.

[58:52.000 --> 58:54.160] They don't have any training in mental health.

[58:54.160 --> 58:59.360] It's really scary to think about chiropractors trying to treat somebody's mental health problems.

[58:59.360 --> 59:04.360] Yeah, they're trying to turn it into a chiropractic problem by saying it's physically in your

[59:04.360 --> 59:09.120] body and we could make it go away magically with these magic methods that we have.

[59:09.120 --> 59:10.120] It's pure nonsense.

[59:10.120 --> 59:11.120] It's totally correct.

[59:11.120 --> 59:12.120] Exactly.

[59:12.120 --> 59:18.920] Because while it may be true, while it may be true that psychological experiences and

[59:18.920 --> 59:26.880] somatic experiences are deeply intertwined, it's not the case that there's a switch

[59:26.880 --> 59:28.440] that's flipped.

[59:28.440 --> 59:33.320] You don't have a psychological experience and then an organ has an effect from it.

[59:33.320 --> 59:34.320] It's not that simple.

[59:34.320 --> 59:38.800] Well, according to the proponents of NET, it is that simple, Kara.

[59:38.800 --> 59:43.320] And then they just want to manipulate the organ and then somehow your trauma is gone.

[59:43.320 --> 59:44.320] Oh yeah.

[59:44.320 --> 59:49.300] Would you like to hear one of the protocols? Jarry describes it best like

[59:49.300 --> 59:50.300] this.

[59:50.300 --> 59:52.960] He said, here's a perfect example of NET.

[59:52.960 --> 59:58.000] This protocol describes the steps for using NET for people with trigger points or spots

[59:58.000 --> 01:00:00.780] on the body that tend to be painful to the touch.

[01:00:00.780 --> 01:00:06.680] The first step: you go for your consultation, and the NET consultation specialist will

[01:00:06.680 --> 01:00:12.400] compare the strength of a healthy muscle with and without pressing down on the trigger point.

[01:00:12.400 --> 01:00:15.960] Now the muscle should weaken when the trigger point is squeezed.

[01:00:15.960 --> 01:00:21.280] Next the process is repeated with the client's palm on their forehead, which according to

[01:00:21.280 --> 01:00:27.800] NET's founder, Scott Walker, a chiropractor who came up with this in the 1980s, he says,

[01:00:27.800 --> 01:00:29.320] this is the emotional point.

[01:00:29.320 --> 01:00:33.400] So putting your palm of your hand to your forehead, there's your emotional point.

[01:00:33.400 --> 01:00:37.480] If the muscle is now felt to be strong, it somehow means that there's an underlying

[01:00:37.480 --> 01:00:43.040] emotion from the past that has rippled over the years and is causing the current illness

[01:00:43.040 --> 01:00:45.780] and therefore the NET is warranted.

[01:00:45.780 --> 01:00:48.560] So I guess that's how you figure out whether you really need this or not.

[01:00:48.560 --> 01:00:52.540] And surprisingly everyone needs it.

[01:00:52.540 --> 01:00:56.280] But then the next step is the applied kinesiology.

[01:00:56.280 --> 01:01:03.280] So this is an attempt at diagnosing illnesses by asking the muscles to reveal what the issue

[01:01:03.280 --> 01:01:04.800] is.

[01:01:04.800 --> 01:01:08.000] You ask the person, so while they're doing this, you ask them a question and then you

[01:01:08.000 --> 01:01:11.920] press down on their extended arm to see how much resistance the muscle gives you.

[01:01:11.920 --> 01:01:14.480] Where have we seen this a billion times before?

[01:01:14.480 --> 01:01:17.720] On stage at an SGU show.

[01:01:17.720 --> 01:01:18.760] Every year at TAM?

[01:01:18.760 --> 01:01:24.600] The muscle is said to answer yes or no if the power is felt as strong or weak.

[01:01:24.600 --> 01:01:27.440] Now in the context of this, the practitioner is talking about things.

[01:01:27.440 --> 01:01:29.920] Let's say in this example, money.

[01:01:29.920 --> 01:01:33.540] Talking about money, the practitioner presses down on the client's arm; the weakness is felt when that word

[01:01:33.540 --> 01:01:37.560] money is spoken, and that represents where the problem lies.

[01:01:37.560 --> 01:01:39.680] The practitioner starts to dig deeper.

[01:01:39.680 --> 01:01:43.320] They question the muscle with more precise hypotheticals until the core issue has been

[01:01:43.320 --> 01:01:44.320] identified.

[01:01:44.320 --> 01:01:49.400] Money leads to asking about, say, losing money or perhaps gambling, for example.

[01:01:49.400 --> 01:01:53.160] The client is told to actively think about their core issue, which is presumably the

[01:01:53.160 --> 01:01:54.900] reason why they're ill.

[01:01:54.900 --> 01:02:00.300] And then the practitioner presses on different acupuncture points in the body that are said

[01:02:00.300 --> 01:02:06.920] to be tied to the internal organs and to be associated with a distinct emotion.

[01:02:06.920 --> 01:02:10.240] At each point, the muscle is tested again for strength and weakness.

[01:02:10.240 --> 01:02:15.120] This time if the weakness goes away as a particular acupuncture point is pressed, NET claims that

[01:02:15.120 --> 01:02:20.760] the emotion that corresponds to that point is involved in that client's trauma.

[01:02:20.760 --> 01:02:25.280] So essentially, right, if the muscle regains strength where, say, the gallbladder point

[01:02:25.280 --> 01:02:31.120] is pressed, then that means anger or resentment is at play.

[01:02:31.120 --> 01:02:33.400] So they've identified the problem now.

[01:02:33.400 --> 01:02:36.340] Next comes the treatment.

[01:02:36.340 --> 01:02:37.840] And here's how the author puts it.

[01:02:37.840 --> 01:02:41.800] In what could be compared to a game of Twister, the client touches both their forehead and

[01:02:41.800 --> 01:02:47.860] the acupuncture point and contemplates their traumatic emotion while the practitioner performs

[01:02:47.860 --> 01:02:50.520] chiropractic adjustments on their spine.

[01:02:50.520 --> 01:02:54.560] The muscle questioning resumes until the muscle is found to be strong even when the client

[01:02:54.560 --> 01:02:58.880] is thinking about their trauma.

[01:02:58.880 --> 01:02:59.880] That's the treatment.

[01:02:59.880 --> 01:03:05.160] You might as well have them write what makes them sad on a whiteboard and then hit them

[01:03:05.160 --> 01:03:06.160] upside the head.

[01:03:06.160 --> 01:03:09.080] It's like, now you don't feel so bad.

[01:03:09.080 --> 01:03:12.160] What makes you feel worse?

[01:03:12.160 --> 01:03:18.880] The author then goes into the One Research Foundation, ORF, again; they are one

[01:03:18.880 --> 01:03:24.440] of the largest funders of research and education on NET.

[01:03:24.440 --> 01:03:28.920] They list their published research on their website.

[01:03:28.920 --> 01:03:33.080] And Jarry goes through it all and he basically says it's all garbage.

[01:03:33.080 --> 01:03:39.240] All the studies are pure garbage, all of them flawed in various ways.

[01:03:39.240 --> 01:03:45.320] Of course you have celebrities that talk about it, and it runs

[01:03:45.320 --> 01:03:50.000] in the same circles as chiropractic, of course, because it is basically an extension of chiropractic

[01:03:50.000 --> 01:03:55.360] with several other forms of pseudoscience and snake oil thrown in.

[01:03:55.360 --> 01:03:59.000] If you start out believing in magic, you're disconnected from reality.

[01:03:59.000 --> 01:04:00.640] This is what it looks like, right?

[01:04:00.640 --> 01:04:06.040] This is just pure witchcraft, trifecta pseudoscience.

[01:04:06.040 --> 01:04:11.020] With that beautiful pseudoscience, they throw in a couple of words that you've heard from

[01:04:11.020 --> 01:04:13.760] legitimate practitioners before.

[01:04:13.760 --> 01:04:19.660] Sprinkle in a little science and it makes it shine, makes it sparkle.

[01:04:19.660 --> 01:04:26.500] When you zoom out on things like this, it turns into people sitting there literally

[01:04:26.500 --> 01:04:28.560] making this shit up out of whole cloth.

[01:04:28.560 --> 01:04:29.560] Yeah.

[01:04:29.560 --> 01:04:34.000] I mean, it kind of is, put your hand on your forehead, touch your point here, think about

[01:04:34.000 --> 01:04:35.000] something.

[01:04:35.000 --> 01:04:37.240] You know, it's like, it's a sophisticated hokey pokey.

[01:04:37.240 --> 01:04:38.920] I think it's a lot of narcissism.

[01:04:38.920 --> 01:04:45.560] The people see themselves as this big thinker and they come up with this theory that works

[01:04:45.560 --> 01:04:50.040] in a vacuum, but it's not connected to reality in any way.

[01:04:50.040 --> 01:04:52.760] But they don't really care about that because they have their genius idea.

[01:04:52.760 --> 01:04:53.760] You're so right.

[01:04:53.760 --> 01:04:54.760] It's so much of a hubris.

[01:04:54.760 --> 01:04:55.760] Oh yeah.

[01:04:55.760 --> 01:04:57.840] Oh, it's cultish, no doubt about it.

[01:04:57.840 --> 01:05:01.840] I don't disagree with what you just said, but it is also giving them the benefit of

[01:05:01.840 --> 01:05:02.840] the doubt.

[01:05:02.840 --> 01:05:08.640] Meaning that, you know, it's like, yeah, they're so full of their own BS that they're making

[01:05:08.640 --> 01:05:11.520] it up and they don't even realize that they're making it up.

[01:05:11.520 --> 01:05:16.480] But I also think there are people out there that deliberately scam other people.

[01:05:16.480 --> 01:05:17.480] Sure.

[01:05:17.480 --> 01:05:18.480] It's all happening.

[01:05:18.480 --> 01:05:21.040] And you can't know individually, you know, where along the spectrum they are and they

[01:05:21.040 --> 01:05:22.040] could be doing well.

[01:05:22.040 --> 01:05:23.040] It's probably somewhere in the middle.

[01:05:23.040 --> 01:05:24.040] Yeah.

[01:05:24.040 --> 01:05:29.400] Because I think so often people, they go after it for, you know, whatever the reason is,

[01:05:29.400 --> 01:05:32.100] and then they convince themselves that it's true.

[01:05:32.100 --> 01:05:35.060] That definitely happens.

[01:05:35.060 --> 01:05:41.780] Even an unadulterated, like just a total charlatan who's like textbook, I know better and I'm

[01:05:41.780 --> 01:05:46.280] doing this anyway, will eventually convince themselves again because of their narcissism

[01:05:46.280 --> 01:05:47.640] that they're helping people.

[01:05:47.640 --> 01:05:49.200] Something real is happening, right?

[01:05:49.200 --> 01:05:50.200] Yeah.

[01:05:50.200 --> 01:05:51.200] Yeah.

[01:05:51.200 --> 01:05:52.200] Yeah.

[01:05:52.200 --> 01:05:53.200] They fall into it.

[01:05:53.200 --> 01:05:54.880] They're giving people hope and comfort and whatever.

[01:05:54.880 --> 01:05:59.520] It's scary how much this kind of stuff has like pervaded the whole healthcare industry

[01:05:59.520 --> 01:06:04.200] because I recently was trying to make an appointment with a therapist. I'm part

[01:06:04.200 --> 01:06:09.960] of a big, like, HMO, I think it's called Kaiser over here.

[01:06:09.960 --> 01:06:14.440] And so it was this big long process and they don't actually have a therapist like in-house.

[01:06:14.440 --> 01:06:16.640] They give me like a referral to some local person.

[01:06:16.640 --> 01:06:17.640] There's all these options.

[01:06:17.640 --> 01:06:24.280] So I just pick a random person and I ended up having an appointment over the phone and

[01:06:24.280 --> 01:06:30.000] I'm just like spilling my guts for like 50 minutes.

[01:06:30.000 --> 01:06:37.960] And then at the end she goes, okay, so just so you know, I do EMDR and then I was like,

[01:06:37.960 --> 01:06:43.000] oh my God, I just wasted a whole hour talking to this person.

[01:06:43.000 --> 01:06:46.920] I feel like we should do like the skeptic's guide to your first therapy session and we

[01:06:46.920 --> 01:06:51.400] should offer like a checklist of all the things you should ask the new therapist that you've

[01:06:51.400 --> 01:06:57.520] been matched with so you can determine if it's worth staying with them or if you're

[01:06:57.520 --> 01:07:01.160] like I need to find somebody new because you're not a good match for me.

[01:07:01.160 --> 01:07:04.520] I wasn't familiar at all with EMDR until that happened so I immediately went and looked

[01:07:04.520 --> 01:07:05.520] it up after the appointment.

[01:07:05.520 --> 01:07:09.760] And I don't know if you know this, Steve, but you're quoted on the Wikipedia article

[01:07:09.760 --> 01:07:10.760] for EMDR.

[01:07:10.760 --> 01:07:11.760] Yeah.

[01:07:11.760 --> 01:07:12.760] That's great.

[01:07:12.760 --> 01:07:13.760] Yeah.

Who's That Noisy? (1:07:17)


[01:07:13.760 --> 01:07:19.520] All right, Jay, it's who's that noisy time?

[01:07:19.520 --> 01:07:20.520] All right, guys.

[01:07:20.520 --> 01:07:34.280] Last week I played this noisy.

[01:07:34.280 --> 01:07:35.280] So what do you guys think it is?

[01:07:35.280 --> 01:07:42.000] Well, you gave a clue last week and the clue was about C and I wonder if you meant S-E-A-C.

[01:07:42.000 --> 01:07:48.240] So something about like a organ in the ocean kind of thing, an instrument of some sort.

[01:07:48.240 --> 01:07:49.640] There's one in San Francisco over here.

[01:07:49.640 --> 01:07:50.640] There's a sea organ.

[01:07:50.640 --> 01:07:51.640] I don't know if that's the one.

[01:07:51.640 --> 01:07:52.640] That's a thing?

[01:07:52.640 --> 01:07:53.640] A sea organ?

[01:07:53.640 --> 01:07:54.640] Yeah.

[01:07:54.640 --> 01:07:55.640] Didn't we have it on the show already though, Jay?

[01:07:55.640 --> 01:07:56.640] Evan was making that up.

[01:07:56.640 --> 01:07:57.640] I think, Jay, you said...

[01:07:57.640 --> 01:07:58.640] No.

[01:07:58.640 --> 01:07:59.640] I did.

[01:07:59.640 --> 01:08:00.640] I've had...

[01:08:00.640 --> 01:08:02.120] This noisy has appeared on the show before.

[01:08:02.120 --> 01:08:03.120] It's one of my favorites.

[01:08:03.120 --> 01:08:04.120] Oh, an encore.

[01:08:04.120 --> 01:08:06.320] Oh, I don't like it.

[01:08:06.320 --> 01:08:07.800] You like it?

[01:08:07.800 --> 01:08:10.200] It's very uncomfortable for me, that noise.

[01:08:10.200 --> 01:08:11.200] All right.

[01:08:11.200 --> 01:08:12.200] So that was a good guess.

[01:08:12.200 --> 01:08:13.200] Let's go through.

[01:08:13.200 --> 01:08:14.200] Let's see what some listeners thought.

[01:08:14.200 --> 01:08:18.320] So a listener named Michael Blaney wrote in and said, hi, Jay, this sounds like a giant

[01:08:18.320 --> 01:08:22.560] set of pan pipes being played by some kind of mythical giant creature.

[01:08:22.560 --> 01:08:23.560] Cool.

[01:08:23.560 --> 01:08:24.560] Giant Zamfir.

[01:08:24.560 --> 01:08:25.560] Sure.

[01:08:25.560 --> 01:08:27.320] It could be from a movie or whatever.

[01:08:27.320 --> 01:08:32.720] I don't disagree with you that it has that vibe to it, but that is not correct, but I

[01:08:32.720 --> 01:08:35.360] really like the visual that you put into my head.

[01:08:35.360 --> 01:08:40.660] I have another listener that guessed, named Dustin Borland, and Dustin wrote in, it's those damn

[01:08:40.660 --> 01:08:45.000] rocks that make music by the waves.

[01:08:45.000 --> 01:08:50.560] I will just say that a lot of people, again, emailed me and got this right.

[01:08:50.560 --> 01:08:52.040] Most people knew exactly what it was.

[01:08:52.040 --> 01:08:55.560] Some people guessed very close, but I have a couple of notable people here.

[01:08:55.560 --> 01:09:00.020] Well, first, the very first person to guess correctly is Adam Hepburn.

[01:09:00.020 --> 01:09:06.160] And Adam said, my man, Jason, I got your answer for who's that noisy.

[01:09:06.160 --> 01:09:08.320] And he said, I also got your reference on your hint.

[01:09:08.320 --> 01:09:10.560] Let's see what you come up with.

[01:09:10.560 --> 01:09:15.360] So this is the sea organ of Zadar, Croatia.

[01:09:15.360 --> 01:09:16.360] Very cool.

[01:09:16.360 --> 01:09:21.680] So he guessed correctly, but then I got an email a couple of days after this episode

[01:09:21.680 --> 01:09:27.160] posted, by a listener named Viktor Morović, and Viktor gave a little bit more interesting

[01:09:27.160 --> 01:09:28.160] information.

[01:09:28.160 --> 01:09:33.240] So he writes, hi guys, I'm Viktor from Croatia and currently I'm studying medicinal chemistry.

[01:09:33.240 --> 01:09:36.480] I've been listening to your podcast for some six, seven years now.

[01:09:36.480 --> 01:09:39.400] Big fan of your show, not a patron yet, but I'm still a student.

[01:09:39.400 --> 01:09:41.560] So I think I get a pass for now.

[01:09:41.560 --> 01:09:42.560] First time guessing.

[01:09:42.560 --> 01:09:48.560] Usually I skip this bit because sometimes, I know it, uh, it triggers his, uh, tinnitus.

[01:09:48.560 --> 01:09:50.880] I still cannot pronounce the word tinnitus.

[01:09:50.880 --> 01:09:51.880] Thank you.

[01:09:51.880 --> 01:09:54.040] Here's the thing.

[01:09:54.040 --> 01:09:59.400] It's not tinnit-itis, because it's not an inflammation of anything, right?

[01:09:59.400 --> 01:10:02.000] It's not like appendicitis or laryngitis.

[01:10:02.000 --> 01:10:03.000] It's tinnitus.

[01:10:03.000 --> 01:10:04.000] Tinnitus.

[01:10:04.000 --> 01:10:05.200] Everyone gets it wrong.

[01:10:05.200 --> 01:10:06.200] That's okay.

[01:10:06.200 --> 01:10:07.200] But it's tinnitus.

[01:10:07.200 --> 01:10:08.200] Yeah, it's okay.

[01:10:08.200 --> 01:10:09.200] Not tinnit-itis.

[01:10:09.200 --> 01:10:19.440] He said, so he, um, typically skips Who's That Noisy

[01:10:19.440 --> 01:10:22.880] because it can trigger his tinnitus.

[01:10:22.880 --> 01:10:26.520] So just by chance he heard this week's noisy and he couldn't help himself, but write the

[01:10:26.520 --> 01:10:29.560] answer, which was great because you got it right.

[01:10:29.560 --> 01:10:34.280] And of course this is, he says, this is an experimental musical instrument called a sea organ,

[01:10:34.280 --> 01:10:45.940] also known as Morske orgulje, O-R-G-U-L-J-E; it was like a word that was created

[01:10:45.940 --> 01:10:50.800] to make it so I can't pronounce it.

[01:10:50.800 --> 01:10:53.880] Morske orgulje, the word, uh, please try to say it out loud.

[01:10:53.880 --> 01:10:56.520] It will be really funny to hear it takes a lot.

[01:10:56.520 --> 01:11:01.800] He actually says, say it out loud, because he knows I won't be able to say it, or whatever,

[01:11:01.800 --> 01:11:02.800] in Zadar.

[01:11:02.800 --> 01:11:03.800] Okay.

[01:11:03.800 --> 01:11:08.240] The sound is generated by tides and waves passing through pipes of various diameter.

[01:11:08.240 --> 01:11:11.640] As the waves push the air in the pipes, they generate the sound.
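(For context, pipes driven by moving air sound pitches set mainly by their length; assuming the pipes behave like a tube closed at one end, the standard acoustics relation is

 f_n = \frac{(2n-1)\,v}{4L}, \qquad n = 1, 2, 3, \ldots

where v is the speed of sound and L the pipe length, which is why pipes of different lengths and diameters give the organ its range of notes. This is general physics background, not something from the listener's email.)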

[01:11:11.640 --> 01:11:16.960] They're situated near a meteor impact that destroyed our civilization in

[01:11:16.960 --> 01:11:17.960] the movie

[01:11:17.960 --> 01:11:18.960] Don't Look Up.

[01:11:18.960 --> 01:11:22.960] Fun fact, according to Alfred Hitchcock, Zadar has the best sunset on earth.

[01:11:22.960 --> 01:11:27.240] So anyway, uh, thank you both guys for getting the answer correct and everybody else who

[01:11:27.240 --> 01:11:30.920] emailed me and told me the correct answer again, you know, this was a repeat, so if

[01:11:30.920 --> 01:11:34.880] anyone has been listening to the show, they might remember me playing it in the past.

[01:11:34.880 --> 01:11:36.240] I just really think it's cool.

[01:11:36.240 --> 01:11:42.600] But Kara, I, um, there is something uncomfortable about the sound because it's, it's not necessarily

[01:11:42.600 --> 01:11:45.720] playing like what we would consider it to be a melody.

[01:11:45.720 --> 01:11:47.800] It could play almost any note at any time.

[01:11:47.800 --> 01:11:51.360] Oh, it's super creepy haunted house.

[01:11:51.360 --> 01:11:56.420] There's a, you know, demon playing an organ in a place you don't want to visit.

[01:11:56.420 --> 01:11:58.200] Like it has that vibe for sure.

[01:11:58.200 --> 01:11:59.200] Unless you're Bob.

[01:11:59.200 --> 01:12:00.200] It's very Halloween.

[01:12:00.200 --> 01:12:01.200] Yeah, Bob.

[01:12:01.200 --> 01:12:02.200] You should love it.

[01:12:02.200 --> 01:12:03.200] Bob loves it.

[01:12:03.200 --> 01:12:04.200] Sounds awesome.

[01:12:04.200 --> 01:12:05.200] Anyway, thanks everyone for emailing me.

New Noisy (1:12:05)

[01:12:05.200 --> 01:12:06.920] I have a new Noisy this week.

[01:12:06.920 --> 01:12:12.000] This Noisy was sent in by a listener named C Ross and I hope you like it.

[gibberish song with trumpet and percussion beat]

[01:12:12.000 --> 01:12:32.920] All right.

[01:12:32.920 --> 01:12:33.920] That's this week's Noisy.

[01:12:33.920 --> 01:12:36.120] I know exactly what that is.

[01:12:36.120 --> 01:12:38.520] Oh my God.

[01:12:38.520 --> 01:12:41.560] I came across that and I was, I got addicted to it.

[01:12:41.560 --> 01:12:45.680] I watched it over and over like what the hell is happening?

[01:12:45.680 --> 01:12:46.680] All right.

[01:12:46.680 --> 01:12:50.120] So if you think you know what this week's Noisy is, be specific, please.

[01:12:50.120 --> 01:12:53.060] You can't just, you know, don't give me something where I'm like, well, you're

[01:12:53.060 --> 01:12:54.060] kind of right.

[01:12:54.060 --> 01:12:57.480] You know, if you want me to say your name on the show, if you want to actually win,

[01:12:57.480 --> 01:13:00.680] who's that Noisy, you gotta, you gotta give me the specifics.

[01:13:00.680 --> 01:13:05.480] Or if you heard something cool this week, email me at WTN@theskepticsguide.org.

Announcements (1:13:06)

[01:13:05.480 --> 01:13:08.100] Steve, got a few announcements for you.

[01:13:08.100 --> 01:13:09.520] So do it.

[01:13:09.520 --> 01:13:17.120] Our book will be released the same day this episode comes out.

[01:13:17.120 --> 01:13:23.920] We will be doing the six-hour live stream, which is two live SGU podcasts back to back

[01:13:23.920 --> 01:13:30.300] with a couple of interviews and discussion also about the book that Steve, Bob, and I

[01:13:30.300 --> 01:13:33.400] co-wrote together, called The Skeptics' Guide to the Future.

[01:13:33.400 --> 01:13:39.480] I wanted to, uh, to remind the audience that if you become a patron at one of our two

[01:13:39.480 --> 01:13:45.460] highest patron levels, then you will receive a free signed copy of the book.

[01:13:45.460 --> 01:13:51.240] So please go to patreon.com forward slash SkepticsGuide to check out the details on

[01:13:51.240 --> 01:13:52.240] that.

[01:13:52.240 --> 01:13:56.480] We have shows coming up, guys, and they're starting to take shape.

[01:13:56.480 --> 01:14:04.640] I have ordered and received all of the updated graphics that we need for the SGU holiday

[01:14:04.640 --> 01:14:10.160] shows. You know, it's basically four shows: two Skeptical Extravaganzas and two private

[01:14:10.160 --> 01:14:11.160] SGU recordings.

[01:14:11.160 --> 01:14:16.440] We'll be doing these in Arizona and they are going to be holiday themed, especially the

[01:14:16.440 --> 01:14:17.520] extravaganza.

[01:14:17.520 --> 01:14:19.120] So we'd like you to join us.

[01:14:19.120 --> 01:14:24.880] You can go to theskepticsguide.org forward slash events for all the details on how

[01:14:24.880 --> 01:14:26.680] to buy tickets for those shows.

[01:14:26.680 --> 01:14:30.960] And while you're at it, if you enjoy the show, if you enjoy the work that we do, then please

[01:14:30.960 --> 01:14:33.120] consider becoming a patron of ours.

[01:14:33.120 --> 01:14:34.220] It really helps.

[01:14:34.220 --> 01:14:40.320] It helps us accomplish more things, do things like, you know, NECSS, uh, continuously recording

[01:14:40.320 --> 01:14:44.200] this show every week and all the other things like the free live streams and all the things

[01:14:44.200 --> 01:14:45.200] that we do.

[01:14:45.200 --> 01:14:46.200] It all helps.

[01:14:46.200 --> 01:14:50.280] So we'd really appreciate it if you would like to support us and the work that we do

[01:14:50.280 --> 01:14:54.600] go to patreon.com forward slash skeptics guide.

[01:14:54.600 --> 01:14:56.040] Uh, Jay, we had a question recently.

[01:14:56.040 --> 01:15:00.240] Someone asked us if the two private shows, the one in Tucson and the one in Phoenix,

[01:15:00.240 --> 01:15:04.320] if they're going to be the same show, and they're not, they're going to be

[01:15:04.320 --> 01:15:05.320] completely different.

[01:15:05.320 --> 01:15:07.880] So they're, they're two different SGU episodes.

[01:15:07.880 --> 01:15:10.240] There'll be no overlapping content.

[01:15:10.240 --> 01:15:13.840] Um, the extravaganzas are partial overlap, right?

[01:15:13.840 --> 01:15:18.160] There are some bits that are the same, but a lot of the stuff that

[01:15:18.160 --> 01:15:20.600] we do, like quizzes and games and things like that,

[01:15:20.600 --> 01:15:21.600] that's all fresh content.

[01:15:21.600 --> 01:15:22.600] Yeah.

[01:15:22.600 --> 01:15:24.440] It's all different every time.

[01:15:24.440 --> 01:15:26.800] But yeah, you know, it's basically the same show.

[01:15:26.800 --> 01:15:31.160] I mean, what I said was you really got to love us if you want to see both extravaganzas,

[01:15:31.160 --> 01:15:36.600] which is not to tell you not to do it, but don't expect zero overlapping content between

[01:15:36.600 --> 01:15:37.600] those two shows.

[01:15:37.600 --> 01:15:40.560] But the, the private shows will be completely different.

[01:15:40.560 --> 01:15:41.560] Yeah.

[01:15:41.560 --> 01:15:45.320] Steve, don't forget those private shows aren't just a recording of our episode.

[01:15:45.320 --> 01:15:49.560] It's an extended thing where we're going to be spending at least an hour after the show

[01:15:49.560 --> 01:15:52.280] is over doing a lot of other things.

[01:15:52.280 --> 01:15:55.840] And you know, the, the key point here is that we'll have the time to hang out and talk and

[01:15:55.840 --> 01:15:56.840] meet everybody.

[01:15:56.840 --> 01:15:58.840] And we'll be doing special bits.

[01:15:58.840 --> 01:15:59.840] Yeah.

[01:15:59.840 --> 01:16:00.840] Yes.

[01:16:00.840 --> 01:16:01.840] Just for the, just for the live audience.

[01:16:01.840 --> 01:16:02.840] All right.

[01:16:02.840 --> 01:16:04.240] Thanks Jay.

[01:16:04.240 --> 01:16:05.360] Also one quick announcement.

[01:16:05.360 --> 01:16:11.640] The Canberra Skeptics is hosting the Australian Skeptics National Convention 2022, also known

[01:16:11.640 --> 01:16:19.080] as Skepticon from December 3rd through 4th at the National Library of Australia in Canberra.

[01:16:19.080 --> 01:16:24.640] And this is a hybrid event that'll be live and there'll be some people attending virtually.

[01:16:24.640 --> 01:16:29.640] I will be giving a virtual talk at the conference.

[01:16:29.640 --> 01:16:38.360] So you can get your tickets at Skepticon, that's S K E P T I C O N dot org dot A U.

Questions/Emails/Corrections/Follow-ups (1:16:39)

Followup #1: Chess Cheating

[01:16:38.360 --> 01:16:39.360] All right.

[01:16:39.360 --> 01:16:46.040] So we had a few emails about our discussion last week about the chess cheating.

[01:16:46.040 --> 01:16:49.000] So part of it was, I'm not going to read any specific email.

[01:16:49.000 --> 01:16:52.040] We got a bunch of emails just basically saying that there were some details that we got wrong

[01:16:52.040 --> 01:16:54.120] and there's been some updates since we talked about it.

[01:16:54.120 --> 01:16:57.480] You know, we were kind of, you know, just having it as more of a casual discussion

[01:16:57.480 --> 01:16:59.640] as the news was breaking.

[01:16:59.640 --> 01:17:00.640] Couple of just factual things.

[01:17:00.640 --> 01:17:05.920] You know, we said that it was a, you know, that the number one ranked player beat the

[01:17:05.920 --> 01:17:08.400] lowest ranked player in the first round.

[01:17:08.400 --> 01:17:11.980] So actually it was the third round of play and it was a round robin.

[01:17:11.980 --> 01:17:16.360] So it was not one of those tournaments where the first round is where the first and

[01:17:16.360 --> 01:17:19.080] last person would play each other.

[01:17:19.080 --> 01:17:20.080] Right.

[01:17:20.080 --> 01:17:21.560] The Swiss system is what that's called.

[01:17:21.560 --> 01:17:22.560] Yeah.

[01:17:22.560 --> 01:17:23.560] It was round robin instead.

[01:17:23.560 --> 01:17:24.560] Yeah.

[01:17:24.560 --> 01:17:25.560] Didn't know that.

[01:17:25.560 --> 01:17:28.800] Doesn't matter to the actual discussion, but just to get those details correct.

[01:17:28.800 --> 01:17:33.240] You know, again, I think that we were pretty clear about this, but just to really

[01:17:33.240 --> 01:17:40.260] emphasize this, the accusation about there being like anal beads or some kind of anal

[01:17:40.260 --> 01:17:47.800] vibrating situation was completely made-up speculation, and maybe speculation is even

[01:17:47.800 --> 01:17:48.800] too strong.

[01:17:48.800 --> 01:17:53.000] It was really just a made-up joke that sort of went viral, and there was never

[01:17:53.000 --> 01:17:54.440] anything to that.

[01:17:54.440 --> 01:17:59.320] So on YouTube, there's a person who has a chess channel, his name's Eric Hansen, and

[01:17:59.320 --> 01:18:06.560] we believe it started with him in which he was asked about how someone could possibly

[01:18:06.560 --> 01:18:13.480] cheat what's called over the board, OTB, which means live as opposed to, as opposed to online.

[01:18:13.480 --> 01:18:17.840] And in joking, you know, he said, well, you know, we could have something,

[01:18:17.840 --> 01:18:21.060] you know, inserted, but it was meant to be a joke.

[01:18:21.060 --> 01:18:22.680] But the video was watched a lot.

[01:18:22.680 --> 01:18:28.640] It got picked up actually, Elon Musk commented on it, you know, sort of perpetuating the

[01:18:28.640 --> 01:18:29.640] joke a little bit.

[01:18:29.640 --> 01:18:33.840] And then that's when the news media ran with it and it got splashed on the headlines.

[01:18:33.840 --> 01:18:39.560] So it became this kind of out-of-control runaway joke that, you know, I

[01:18:39.560 --> 01:18:42.000] guess enhanced the story in a way.

[01:18:42.000 --> 01:18:44.520] But, you know, whatever gets you clicks, I guess.

[01:18:44.520 --> 01:18:49.840] So Eric Hansen, he seems to be the source of that original joke.

[01:18:49.840 --> 01:18:51.360] Yeah, yeah, yeah.

[01:18:51.360 --> 01:18:54.720] And then, you know, the question of whether or not he was cheating, again, we were making

[01:18:54.720 --> 01:18:59.120] a very superficial point, because none of us are chess experts, that people are trying to

[01:18:59.120 --> 01:19:06.480] infer whether or not it was more or less likely that he was cheating based upon circumstantial

[01:19:06.480 --> 01:19:11.400] evidence, things like analyzing his moves and seeing if they were consistent with his

[01:19:11.400 --> 01:19:12.680] level of play.

[01:19:12.680 --> 01:19:16.560] Was he making really high end decisions too quickly?

[01:19:16.560 --> 01:19:21.180] And another thing that somebody brought up that we hadn't mentioned was, you know,

[01:19:21.180 --> 01:19:26.760] in discussing, like analyzing his own moves after the game, like when you're talking,

[01:19:26.760 --> 01:19:33.240] you know, to reporters, whatever, and talking about your own moves, could he give a cogent

[01:19:33.240 --> 01:19:36.200] explanation of his strategy or not?

[01:19:36.200 --> 01:19:39.920] You know, like normally a player would be able to say exactly what they

[01:19:39.920 --> 01:19:42.400] were thinking and why they made the moves they did.

[01:19:42.400 --> 01:19:46.000] And if a computer told them to make the move, they wouldn't really know what to say, you

[01:19:46.000 --> 01:19:47.000] know?

[01:19:47.000 --> 01:19:48.040] And neither would the computer.

[01:19:48.040 --> 01:19:51.680] So there were some, you know, implications, this is all subjective, which is why it's

[01:19:51.680 --> 01:19:55.680] hard to know unless you're an expert yourself, but there was some suggestion that maybe

[01:19:55.680 --> 01:20:00.960] he was giving pretty light explanations for his own moves.

[01:20:00.960 --> 01:20:04.960] But then somebody else brought up the fact that he was a last-minute substitution, like

[01:20:04.960 --> 01:20:08.680] one week before the tournament started.

[01:20:08.680 --> 01:20:12.640] And that's like not enough time to set something like this up, you know, like an elaborate

[01:20:12.640 --> 01:20:16.800] cheating scheme that worked because he wasn't caught red handed, you know, that worked in

[01:20:16.800 --> 01:20:18.080] that sense.

[01:20:18.080 --> 01:20:23.260] So whatever, you know, we may not ever know a hundred percent at this point.

[01:20:23.260 --> 01:20:25.480] It's not even really an official accusation.

[01:20:25.480 --> 01:20:29.680] It's just sort of this implication that was thrown out there and there's no proof that

[01:20:29.680 --> 01:20:30.680] he cheated.

[01:20:30.680 --> 01:20:36.000] I haven't read any definitive analysis saying he must have cheated or he couldn't possibly

[01:20:36.000 --> 01:20:37.160] have cheated.

[01:20:37.160 --> 01:20:40.800] It's just like, which type of evidence do you want to listen to?

[01:20:40.800 --> 01:20:43.280] And there's a lot of subjectivity to it.

[01:20:43.280 --> 01:20:46.540] And we don't have any strong opinions because none of us know enough to have any strong

[01:20:46.540 --> 01:20:47.540] opinions.

[01:20:47.540 --> 01:20:50.840] We're just trying to reflect what's being said out there in the chess community.

[01:20:50.840 --> 01:20:55.180] I'm sure there's a lot of chess aficionados in our audience who do have strong opinions

[01:20:55.180 --> 01:20:56.380] about it.

[01:20:56.380 --> 01:20:57.380] And that's fine.

[01:20:57.380 --> 01:20:59.920] We're just trying to say what is being reported.

[01:20:59.920 --> 01:21:04.960] And just again, the phenomenon we were interested in was that, in the absence of definitive physical,

[01:21:04.960 --> 01:21:12.080] direct evidence, people are happy to latch on to subjective inferential evidence and

[01:21:12.080 --> 01:21:16.180] circumstantial evidence and try to build a case that way.

[01:21:16.180 --> 01:21:21.520] And at the end of the day, unless you have like a statistically solid analysis, I think

[01:21:21.520 --> 01:21:25.960] we just need to be humble and just say, well, we don't know.

[01:21:25.960 --> 01:21:30.440] And maybe we should, you know, you can make an argument, I guess, for innocent until

[01:21:30.440 --> 01:21:32.840] proven guilty, or the presumption of innocence.

[01:21:32.840 --> 01:21:38.120] Of course, he has a history of cheating a couple of times when he was younger.

[01:21:38.120 --> 01:21:39.120] And how does that play in?

[01:21:39.120 --> 01:21:40.120] Not that much younger.

[01:21:40.120 --> 01:21:42.600] Yeah, but that's a sort of, again, what does that mean?

[01:21:42.600 --> 01:21:44.500] Is it poisoning the well?

[01:21:44.500 --> 01:21:45.560] Is it unfair?

[01:21:45.560 --> 01:21:49.720] Should he still be given the benefit of the doubt or is he a cheater and therefore has

[01:21:49.720 --> 01:21:51.640] forfeited the benefit of the doubt?

[01:21:51.640 --> 01:21:52.640] That's I don't know.

[01:21:52.640 --> 01:21:53.640] It's a little ambiguous.

[01:21:53.640 --> 01:21:55.320] I think we're in the gray area there.

[01:21:55.320 --> 01:21:58.440] I will bring up someone I was directed to.

[01:21:58.440 --> 01:21:59.840] I did not know about this gentleman.

[01:21:59.840 --> 01:22:02.440] His name is Dr. Kenneth Regan, R-E-G-A-N.

[01:22:02.440 --> 01:22:05.320] He's a computer scientist at the University of Buffalo.

[01:22:05.320 --> 01:22:10.240] He's considered the foremost authority on chess cheating.

[01:22:10.240 --> 01:22:17.760] He himself is a master or a grandmaster of chess, you know, but also is massively

[01:22:17.760 --> 01:22:22.040] brilliant at statistical analysis, and this is part of what he does.

[01:22:22.040 --> 01:22:27.440] In fact, he's the Kojak, as it were, of chess cheating, right?

[01:22:27.440 --> 01:22:33.760] When there's an accusation brought in, they bring in this fellow, Dr. Regan, to offer

[01:22:33.760 --> 01:22:34.940] his opinion.

[01:22:34.940 --> 01:22:42.720] He has analyzed, so the chess player in question, Hans Niemann is his name, and Dr. Regan analyzed

[01:22:42.720 --> 01:22:50.720] 106 of his matches, both live over the table and computer, which is all the data available

[01:22:50.720 --> 01:22:57.680] on his matches that Hans Niemann has participated in, and bottom line is, Ken Regan could find

[01:22:57.680 --> 01:23:02.120] no statistical evidence of cheating in any of his games.

[01:23:02.120 --> 01:23:05.640] Everything falls within the margins expected.

[01:23:05.640 --> 01:23:07.440] So he's a really good cheater.

[01:23:07.440 --> 01:23:11.360] He's either the best cheater ever.

[01:23:11.360 --> 01:23:13.240] Anyone who moves is VC.

[01:23:13.240 --> 01:23:16.720] Anyone who doesn't move is a well-disciplined VC.

[01:23:16.720 --> 01:23:21.720] So Dr. Regan is, exactly, Dr. Regan is considered the authority.

[01:23:21.720 --> 01:23:23.520] Those venture capitalists.

[01:23:23.520 --> 01:23:28.440] Yeah, he gets hired, you know, by various chess organizations and other professional

[01:23:28.440 --> 01:23:30.940] organizations dealing with chess, online ones as well.

[01:23:30.940 --> 01:23:35.600] So he's very well established and you know, I suppose if you're going to believe anyone,

[01:23:35.600 --> 01:23:39.160] you can count on his analysis, I would assume.

[01:23:39.160 --> 01:23:40.160] Sounds reasonable.
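
For anyone curious what a "statistically solid analysis" might even look like in principle, here is a deliberately simplified, hypothetical sketch of the engine-correlation idea: count how often a player's moves match an engine's top choice, and ask whether that rate is statistically surprising given what you would expect from a player of their strength. All of the numbers and the naive binomial model below are illustrative assumptions only; Dr. Regan's actual methods are far more sophisticated than this.

 import math

 def engine_match_z_score(matches, total_moves, expected_match_rate):
     # Naive binomial z-score: how surprising is this engine-match rate
     # for a player whose expected match rate is expected_match_rate?
     # (Toy model only; real cheat detection is much more involved.)
     expected = total_moves * expected_match_rate
     std_dev = math.sqrt(total_moves * expected_match_rate * (1 - expected_match_rate))
     return (matches - expected) / std_dev

 # Hypothetical figures: suppose a player of this strength is expected to
 # match the engine's top choice about 55% of the time, and we observe
 # 280 matches over 400 analyzed moves (70%).
 z = engine_match_z_score(matches=280, total_moves=400, expected_match_rate=0.55)
 print(f"z = {z:.1f}")  # roughly 6.0, i.e., far outside the expected margins

The point stands either way: without something quantitative like this, arguments from interview answers or playing speed remain subjective and circumstantial.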

Science or Fiction (1:23:43)

Theme: Global Warming

Item #1: A survey of 48 coastal cities finds that they are sinking at an average rate of 16.2 mm per year, with the fastest at 43 mm per year. (For reference, average global sea level rise is 3.7 mm per year.)[8]
Item #2: A recent study estimates the total social cost of releasing carbon into the atmosphere at $185 per tonne, which is triple the current US government estimate. (For reference, the world emits >34 billion tonnes of CO2 each year.)[9]
Item #3: The latest climate models indicate that even with rapid decarbonization it is too late to prevent eventual warming >1.5 C.[10]

Answer Item
Fiction: Too late to prevent >1.5 °C
Science: Carbon release estimate
Science: Cities are sinking
Host Result
Steve: win
Rogue Guess
David: Cities are sinking
Jay: Too late to prevent >1.5 °C
Bob: Too late to prevent >1.5 °C
Evan: Too late to prevent >1.5 °C
Cara: Too late to prevent >1.5 °C

Voice-over: It's time for Science or Fiction.

David's Response

Jay's Response

Bob's Response

Evan's Response

Cara's Response

Steve Explains Item #2

Steve Explains Item #1

Steve Explains Item #3

[01:23:40.160 --> 01:23:45.960] All right, guys, let's move on with Science or Fiction.

[01:23:45.960 --> 01:23:55.480] It's time for Science or Fiction.

[01:23:55.480 --> 01:24:00.080] Each week I come up with three science news items or facts, two real and one fictitious,

[01:24:00.080 --> 01:24:04.400] and then I challenge my panel of skeptics to tell me which one is the fake.

[01:24:04.400 --> 01:24:07.160] There's a theme this week.

[01:24:07.160 --> 01:24:15.080] The theme is global warming, so three items that in some way have to do with global warming.

[01:24:15.080 --> 01:24:16.540] Here we go.

[01:24:16.540 --> 01:24:22.920] Item number one, a survey of 48 coastal cities finds that they are sinking at an average

[01:24:22.920 --> 01:24:30.160] rate of 16.2 millimeters per year, with the fastest at 43 millimeters per year.

[01:24:30.160 --> 01:24:35.240] For reference, average global sea level rise is 3.7 millimeters per year.

[01:24:35.240 --> 01:24:41.640] All right, number two, a recent study estimates the total social cost of releasing carbon

[01:24:41.640 --> 01:24:49.880] into the atmosphere at $185 per ton, which is triple the current U.S. government estimate.

[01:24:49.880 --> 01:24:56.240] And here for reference, the world emits 34 billion tons of CO2 every year, greater than

[01:24:56.240 --> 01:24:58.440] 34 billion tons.

[01:24:58.440 --> 01:25:05.200] And item number three, the latest climate models indicate that even with rapid decarbonization,

[01:25:05.200 --> 01:25:12.560] it is too late to prevent eventual warming greater than 1.5 C, 1.5 degrees Celsius above

[01:25:12.560 --> 01:25:14.560] pre-industrial levels.

[01:25:14.560 --> 01:25:21.720] All right, well, David, as our guest, you have the privilege of going first.

[01:25:21.720 --> 01:25:24.880] Oh, lucky me.

[01:25:24.880 --> 01:25:28.000] Okay, so let's see.

[01:25:28.000 --> 01:25:33.480] A survey of 48 coastal cities finds they're sinking at an average rate of 16.2 millimeters

[01:25:33.480 --> 01:25:36.960] per year, fastest at 43.

[01:25:36.960 --> 01:25:42.320] Okay, that seems like a lot per year.

[01:25:42.320 --> 01:25:46.960] A recent study estimates the total social cost of releasing carbon into the atmosphere

[01:25:46.960 --> 01:25:52.200] at $185 per ton, it's triple the current estimate.

[01:25:52.200 --> 01:25:53.200] Okay.

[01:25:53.200 --> 01:25:56.120] Yeah, that's the cost to society, right?

[01:25:56.120 --> 01:25:57.120] Like the external costs.

[01:25:57.120 --> 01:25:58.120] Everything.

[01:25:58.120 --> 01:26:01.920] It's all the externalized costs to society, yeah.

[01:26:01.920 --> 01:26:02.920] That's definitely a lot of money.

[01:26:02.920 --> 01:26:04.480] Okay.

[01:26:04.480 --> 01:26:08.120] The latest climate models indicate that even with rapid decarbonization, it's too late

[01:26:08.120 --> 01:26:09.120] to prevent...

[01:26:09.120 --> 01:26:10.960] That one seems like obvious to me.

[01:26:10.960 --> 01:26:13.840] I'm definitely a doom and gloomer when it comes to this stuff.

[01:26:13.840 --> 01:26:20.880] So that seems like an easy science to me.

[01:26:20.880 --> 01:26:22.840] It's definitely between one and two.

[01:26:22.840 --> 01:26:25.360] Okay, one just seems really fast.

[01:26:25.360 --> 01:26:32.520] Even an average rate of 16.2 millimeters per year that our coastal cities are sinking,

[01:26:32.520 --> 01:26:33.520] that seems like a lot.

[01:26:33.520 --> 01:26:34.520] I'm going to say number one is fiction.

[01:26:34.520 --> 01:26:35.520] All right.

[01:26:35.520 --> 01:26:36.520] Jay, you're next.

[01:26:36.520 --> 01:26:39.040] Which one did you say was fiction?

[01:26:39.040 --> 01:26:40.760] The coastal cities sinking.

[01:26:40.760 --> 01:26:41.760] Sinking.

[01:26:41.760 --> 01:26:42.760] What are you sinking?

[01:26:42.760 --> 01:26:43.760] What are you sinking?

[01:26:43.760 --> 01:26:44.760] All right.

[01:26:44.760 --> 01:26:45.760] To cut to it...

[01:26:45.760 --> 01:26:46.760] I wonder how long it's going to take for that.

[01:26:46.760 --> 01:26:47.760] Yeah.

[01:26:47.760 --> 01:26:48.760] There's a lot of information here.

[01:26:48.760 --> 01:26:55.800] There is quite a bit of information here, but I'm just going to go right down to the

[01:26:55.800 --> 01:27:00.560] third one, the one about the climate models indicate that even with rapid decarbonization,

[01:27:00.560 --> 01:27:03.400] it's too late to prevent eventual warming to 1.5.

[01:27:03.400 --> 01:27:04.920] I don't think that's correct.

[01:27:04.920 --> 01:27:07.160] I think we can still prevent 1.5.

[01:27:07.160 --> 01:27:09.120] Okay, Bob.

[01:27:09.120 --> 01:27:15.040] I mean, the city sinking, it doesn't sound right at all to me.

[01:27:15.040 --> 01:27:20.040] But that's well above the 3.7 millimeter of the global sea level rise.

[01:27:20.040 --> 01:27:21.040] So it's sinking?

[01:27:21.040 --> 01:27:28.200] I mean, this is like the plates, the crustal plates or the tectonic plates are sinking

[01:27:28.200 --> 01:27:29.800] or something else.

[01:27:29.800 --> 01:27:35.400] And also what about, at least in the United States, there's big chunks of the country

[01:27:35.400 --> 01:27:41.120] that are actually rebounding from the heavy weight of the glaciers that were relatively recently

[01:27:41.120 --> 01:27:44.280] on the country.

[01:27:44.280 --> 01:27:46.300] I wonder what that rate of rise is.

[01:27:46.300 --> 01:27:48.680] So that one, I'm really leaning towards that one.

[01:27:48.680 --> 01:27:56.600] The $185 per ton, 34 billion tons, that's $6.29 trillion a year.

[01:27:56.600 --> 01:27:58.280] That sounds about right.

[01:27:58.280 --> 01:27:59.560] So I'm totally buying that one.

[01:27:59.560 --> 01:28:04.280] I mean, I think it's a little higher than I would have thought per year, but it's within

[01:28:04.280 --> 01:28:05.280] the range.

[01:28:05.280 --> 01:28:11.640] I mean, I could absolutely agree that fully externalized cost being 6.2, 6.3 trillion

[01:28:11.640 --> 01:28:12.640] a year.
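
A quick back-of-the-envelope check of that arithmetic, using only the figures as stated in the item (the snippet below is purely illustrative):

 # Figures as stated in the item, not independently verified here
 social_cost_per_tonne = 185          # dollars per tonne of CO2
 annual_emissions_tonnes = 34e9       # >34 billion tonnes of CO2 per year

 annual_social_cost = social_cost_per_tonne * annual_emissions_tonnes
 print(f"${annual_social_cost / 1e12:.2f} trillion per year")  # $6.29 trillion

So Bob's $6.29 trillion per year follows directly from the numbers given in the item.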

[01:28:12.640 --> 01:28:19.640] But Jay, I mean, yeah, this third one though, I thought for sure that we had already pretty

[01:28:19.640 --> 01:28:29.320] much agreed that there's no way we can miss 1.5 degrees and we got to shoot for two now.

[01:28:29.320 --> 01:28:30.320] Just try to avoid the two.

[01:28:30.320 --> 01:28:33.360] I thought 1.5 was done.

[01:28:33.360 --> 01:28:40.760] But I could absolutely see that if like, no, if we decarbonized at the theoretical maximum

[01:28:40.760 --> 01:28:50.900] rate that's feasible, maybe then we could prevent 1.5, anything greater than 1.5.

[01:28:50.900 --> 01:28:51.900] So yeah.

[01:28:51.900 --> 01:28:52.900] All right.

[01:28:52.900 --> 01:28:58.200] So I think Steve was hoping that that's what we would have thought.

[01:28:58.200 --> 01:29:01.120] I can't think of a really good reason why the cities would be sinking.

[01:29:01.120 --> 01:29:07.280] But yeah, I'm going to go with Jay and say decarbonization, 1.5, fiction, meh.

[01:29:07.280 --> 01:29:08.280] All right.

[01:29:08.280 --> 01:29:09.280] Evan?

[01:29:09.280 --> 01:29:11.120] I think I'm on the same boat.

[01:29:11.120 --> 01:29:12.120] Yeah.

[01:29:12.120 --> 01:29:15.960] Not too late to prevent 1.5C.

[01:29:15.960 --> 01:29:19.040] It will take a massive effort.

[01:29:19.040 --> 01:29:24.200] But I think on the technical sense, it's not there yet.

[01:29:24.200 --> 01:29:28.000] We have not crossed that threshold.

[01:29:28.000 --> 01:29:35.960] Whereas the other two, yeah, the $185 per ton, that's alarming and probably factual.

[01:29:35.960 --> 01:29:45.220] And then the other one, the one about the sinking versus the rising sea level, intuitively

[01:29:45.220 --> 01:29:49.140] that doesn't seem right in a way, but I just have a feeling that one is going to wind up

[01:29:49.140 --> 01:29:50.140] being correct.

[01:29:50.140 --> 01:29:51.840] So I'm with the guys there.

[01:29:51.840 --> 01:29:52.840] All right.

[01:29:52.840 --> 01:29:53.840] And Cara?

[01:29:53.840 --> 01:30:02.280] Yeah, I mean, the rising sea levels being, what does it say, 3.7 millimeters per year,

[01:30:02.280 --> 01:30:09.020] but a survey of 48 coastal cities being significantly high, like sinking significantly more than

[01:30:09.020 --> 01:30:12.140] sea level is rising, makes sense, right?

[01:30:12.140 --> 01:30:14.840] Like they're low-lying cities.

[01:30:14.840 --> 01:30:18.280] Like it's not going to track directly with sea level rise.

[01:30:18.280 --> 01:30:20.000] So why are they sinking?

[01:30:20.000 --> 01:30:24.160] Because they're actually sinking and the sea level is rising.

[01:30:24.160 --> 01:30:30.080] But we're not talking about like across a flat landmass.

[01:30:30.080 --> 01:30:33.840] They took 48 random coastal cities.

[01:30:33.840 --> 01:30:35.280] And so I'm not surprised by this.

[01:30:35.280 --> 01:30:40.440] If they said on average all cities, I would be very surprised by this.

[01:30:40.440 --> 01:30:43.960] But the fact that they took 48 coastal cities, some of them are going to be sinking in like

[01:30:43.960 --> 01:30:45.220] incredibly fast.

[01:30:45.220 --> 01:30:49.260] Like there are whole islands that are just disappearing and they're definitely not disappearing

[01:30:49.260 --> 01:30:50.800] at the rate of sea level rise.

[01:30:50.800 --> 01:30:51.800] All right.

[01:30:51.800 --> 01:30:53.240] So what's happening?

[01:30:53.240 --> 01:30:54.460] It's faster.

[01:30:54.460 --> 01:30:57.840] Other geological things that I'm not a geologist, I don't know.

[01:30:57.840 --> 01:30:59.600] But there's definitely a lot more going on than that.

[01:30:59.600 --> 01:31:02.280] So that one doesn't surprise me at all.

[01:31:02.280 --> 01:31:05.920] It's not just sea level rise, but it's also erosion and things like that.

[01:31:05.920 --> 01:31:08.280] Like they're literally falling into the water.

[01:31:08.280 --> 01:31:09.560] Not subduction, but okay.

[01:31:09.560 --> 01:31:12.860] Yeah, I think it's just like literal erosion.

[01:31:12.860 --> 01:31:15.040] And so yeah, I'm not surprised by that.

[01:31:15.040 --> 01:31:19.920] The carbon one really, well, I don't know, I'm bad with really, really big numbers.

[01:31:19.920 --> 01:31:25.480] It's like it was 34 billion versus, I don't know, 185, what is it, $185 a ton?

[01:31:25.480 --> 01:31:26.480] I don't know.

[01:31:26.480 --> 01:31:27.480] Yeah.

[01:31:27.480 --> 01:31:32.080] This one, it's just like that seems like a lot, but I'm not surprised that it's a lot.

[01:31:32.080 --> 01:31:36.320] So I'm going with the guys on this because I'm pretty sure if we were too late for 1.5

[01:31:36.320 --> 01:31:40.280] C, we would be like screaming even louder than we are right now.

[01:31:40.280 --> 01:31:44.360] I think we've passed a degree.

[01:31:44.360 --> 01:31:50.560] I think I remember that benchmark of passing one degree, but I think the bell has been

[01:31:50.560 --> 01:31:53.240] we cannot let ourselves get to 1.5.

[01:31:53.240 --> 01:31:54.380] We cannot.

[01:31:54.380 --> 01:31:55.520] If we do, it's going to be bad.

[01:31:55.520 --> 01:31:58.440] And if we get to two, like forget about it, game over.

[01:31:58.440 --> 01:32:02.260] I think that's sort of those are the benchmarks in my mind.

[01:32:02.260 --> 01:32:03.800] So I'm going to go with the guys.

[01:32:03.800 --> 01:32:07.280] So sorry, our new friend, but we'll see.

[01:32:07.280 --> 01:32:08.280] Maybe you're the lone wolf.

[01:32:08.280 --> 01:32:10.280] Yeah, you could sweep us there.

[01:32:10.280 --> 01:32:15.640] Yeah, David is a lone wolf with number one about the sinking, and the rest of you

[01:32:15.640 --> 01:32:22.440] rogues think that there's still time for 1.5 C. So you all agree that a recent

[01:32:22.440 --> 01:32:26.460] study estimates the total social cost of releasing carbon into the atmosphere at one hundred

[01:32:26.460 --> 01:32:31.200] and eighty five dollars per ton, which is triple the current US government estimate.

[01:32:31.200 --> 01:32:35.720] And for reference, the world emits greater than 34 billion tons of CO2 each year.

[01:32:35.720 --> 01:32:40.840] You all think this one is science and this one is science.

[01:32:40.840 --> 01:32:42.920] Yeah, that's about what you would expect.

[01:32:42.920 --> 01:32:45.000] That's the numbers right in there.

[01:32:45.000 --> 01:32:48.160] Well, it's not what the government expected, but OK, it's more.

[01:32:48.160 --> 01:32:50.160] It's more than what was being previously estimated.

[01:32:50.160 --> 01:32:51.160] Yes, triple.

[01:32:51.160 --> 01:32:58.320] It's three times the previous sort of estimated total cost because it's getting worse faster.

[01:32:58.320 --> 01:33:03.760] And because the consequences are hitting earlier and they're going to be worse than we thought.

[01:33:03.760 --> 01:33:04.760] Right.

[01:33:04.760 --> 01:33:10.600] I mean, everything is going to be more expensive and it's going to be more drought and fires

[01:33:10.600 --> 01:33:14.980] and displaced people and climate refugees, et cetera.

[01:33:14.980 --> 01:33:19.000] So yeah, the cost is massive, massive cost.

[01:33:19.000 --> 01:33:20.000] All right.

[01:33:20.000 --> 01:33:21.000] Let's go back to number one.

[01:33:21.000 --> 01:33:25.880] A survey of 48 coastal cities finds that they are sinking at an average rate of 16.2 millimeters

[01:33:25.880 --> 01:33:29.340] per year, with the fastest at 43 millimeters per year.

[01:33:29.340 --> 01:33:33.240] And for reference, average global sea level rise is three point seven millimeters per

[01:33:33.240 --> 01:33:34.240] year.

[01:33:34.240 --> 01:33:36.320] David, you think this one is a fiction.

[01:33:36.320 --> 01:33:38.440] Everyone else thinks this one is science.

[01:33:38.440 --> 01:33:41.720] And this one is science.

[01:33:41.720 --> 01:33:42.720] This is science.

[01:33:42.720 --> 01:33:43.720] This is happening.

[01:33:43.720 --> 01:33:44.720] So, yeah.

[01:33:44.720 --> 01:33:47.520] So why is that happening?

[01:33:47.520 --> 01:33:50.040] There are a few reasons.

[01:33:50.040 --> 01:33:53.200] One is, none of them are the ones that Cara listed.

[01:33:53.200 --> 01:33:56.160] None of them have anything to do with anything Cara said.

[01:33:56.160 --> 01:34:00.760] Subduction, erosion, constipation, none of this.

[01:34:00.760 --> 01:34:04.960] So these are, I don't think they were totally picked at random.

[01:34:04.960 --> 01:34:06.680] These are big coastal cities, right?

[01:34:06.680 --> 01:34:10.440] They had to be within like 30 miles of the coast and have a population greater

[01:34:10.440 --> 01:34:14.200] than two million, I think; those were the criteria that they used.

[01:34:14.200 --> 01:34:17.560] So one of the reasons is that just cities are heavy.

[01:34:17.560 --> 01:34:22.400] I mean, literally, like when you're building so much, the buildings and the cement and

[01:34:22.400 --> 01:34:27.400] everything just weighs down the ground underneath it, and it's just pushing the ground

[01:34:27.400 --> 01:34:29.920] down, so they're sinking.

[01:34:29.920 --> 01:34:34.680] Have you guys heard of the leaning tower of SoMa in San Francisco?

[01:34:34.680 --> 01:34:35.680] Oh, absolutely.

[01:34:35.680 --> 01:34:36.680] Yes.

[01:34:36.680 --> 01:34:40.680] Oh, that's a big luxury condominium building built like six years ago, right?

[01:34:40.680 --> 01:34:41.680] Seven years ago.

[01:34:41.680 --> 01:34:42.680] Yeah.

[01:34:42.680 --> 01:34:48.400] They were all like multi-million dollar condos and I guess they did not put the foundation

[01:34:48.400 --> 01:34:49.400] deep enough.

[01:34:49.400 --> 01:34:51.760] And it has been sinking every year since it was built.

[01:34:51.760 --> 01:34:54.600] I think it's something like 14 inches.

[01:34:54.600 --> 01:34:55.600] Yeah.

[01:34:55.600 --> 01:34:56.600] Yes.

[01:34:56.600 --> 01:34:57.600] So there are other reasons.

[01:34:57.600 --> 01:35:02.800] One of them is that we're pulling a lot of stuff out of the ground, you know, gas, oil,

[01:35:02.800 --> 01:35:03.800] water.

[01:35:03.800 --> 01:35:04.800] Yeah.

[01:35:04.800 --> 01:35:05.800] And it occurred to me.

[01:35:05.800 --> 01:35:06.800] Yeah.

[01:35:06.800 --> 01:35:09.640] So when you pull that stuff out of the ground, it sort of settles down to fill in those gaps,

[01:35:09.640 --> 01:35:10.640] right?

[01:35:10.640 --> 01:35:12.200] Especially if there's a lot of weight on top.

[01:35:12.200 --> 01:35:16.540] But yeah, and the point of this study was to see, like, how much these coastal cities are

[01:35:16.540 --> 01:35:20.040] at risk from rising oceans.

[01:35:20.040 --> 01:35:25.520] And of course, if, you know, on average, the cities are sinking at four to

[01:35:25.520 --> 01:35:31.880] five times the rate of sea level rise, obviously that makes them much more vulnerable to the

[01:35:31.880 --> 01:35:32.880] sea level rise.

[01:35:32.880 --> 01:35:33.880] Yeah.
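
For reference, that ratio checks out against the numbers as stated in the item (a trivial sanity check, nothing more):

 average_sinking_mm_per_year = 16.2
 fastest_sinking_mm_per_year = 43
 sea_level_rise_mm_per_year = 3.7

 print(average_sinking_mm_per_year / sea_level_rise_mm_per_year)   # ~4.4 times
 print(fastest_sinking_mm_per_year / sea_level_rise_mm_per_year)   # ~11.6 times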

[01:35:33.880 --> 01:35:39.000] So that just compounds the whole problem, which means that the latest climate models

[01:35:39.000 --> 01:35:43.320] indicate that even with rapid decarbonization, it is too late to prevent eventual warming

[01:35:43.320 --> 01:35:47.960] greater than 1.5 degrees Celsius above pre-industrial levels.

[01:35:47.960 --> 01:35:49.280] That one is the fiction.

[01:35:49.280 --> 01:35:53.000] I was hoping to get your doom and gloomism, Bob.

[01:35:53.000 --> 01:35:56.480] I was really hoping to get you.

[01:35:56.480 --> 01:36:02.240] Like the movie Shazam, I could swear I read something saying 1.5 was gone.

[01:36:02.240 --> 01:36:03.240] I mean, I really could.

[01:36:03.240 --> 01:36:04.640] I definitely have read stuff like that.

[01:36:04.640 --> 01:36:08.880] But I think the caveat is usually it's technically feasible, but nobody thinks we're going to

[01:36:08.880 --> 01:36:09.880] avoid it practically.

[01:36:09.880 --> 01:36:12.880] Shazam was a documentary.

[01:36:12.880 --> 01:36:19.060] It's technically feasible, but it's just unlikely.

[01:36:19.060 --> 01:36:32.220] But if we manage to get to net zero by 2040, we could keep below 1.5 C. So that's 18 years

[01:36:32.220 --> 01:36:33.220] from now.

[01:36:33.220 --> 01:36:36.840] It's technically feasible, but we're not headed there.

[01:36:36.840 --> 01:36:38.000] That's the thing you probably heard.

[01:36:38.000 --> 01:36:39.440] We're not headed there.

[01:36:39.440 --> 01:36:45.240] We're not trending to keep it below 1.5 C. That would require dramatically increasing

[01:36:45.240 --> 01:36:50.620] the rate at which we're converting our industries over and the investment that we're making

[01:36:50.620 --> 01:36:54.600] and the regulations that we're tweaking, et cetera.

[01:36:54.600 --> 01:37:01.660] But it's not out of the realm of possibility quite yet.

[01:37:01.660 --> 01:37:07.960] But I do think being realistic, we're probably going to blow past 1.5 C.

[01:37:07.960 --> 01:37:10.240] Is that what you're calling it now, realistic?

[01:37:10.240 --> 01:37:15.840] But honestly, here's the thing.

[01:37:15.840 --> 01:37:22.840] If you keep to the Paris Accords, we will keep below 1.5 C, but we're not.

[01:37:22.840 --> 01:37:34.080] But if they were... And the next big milestone is 2.0 C. But again, that's kind of an arbitrary

[01:37:34.080 --> 01:37:35.080] whole number.

[01:37:35.080 --> 01:37:39.720] The bottom line is you want to keep it as low as possible, 1.6, 1.7, whatever.

[01:37:39.720 --> 01:37:41.240] We got to keep it as low as possible.

[01:37:41.240 --> 01:37:45.880] 1.5 would be nice because that would be more of a guarantee that we would avoid some of

[01:37:45.880 --> 01:37:46.960] the worst tipping points.

[01:37:46.960 --> 01:37:52.200] But again, if I were a betting man, I would not bet on 1.5 C. But I think keeping it below

[01:37:52.200 --> 01:37:54.600] 2.0 C is probably a good bet.

[01:37:54.600 --> 01:37:57.400] I think that's probably going to land somewhere in there.

[01:37:57.400 --> 01:38:00.640] But that's catastrophic, right?

[01:38:00.640 --> 01:38:01.640] It's still bad.

[01:38:01.640 --> 01:38:04.760] 2 C is really bad.

[01:38:04.760 --> 01:38:07.520] Like there's a big difference between 1.5 and 2.

[01:38:07.520 --> 01:38:08.520] Yes.

[01:38:08.520 --> 01:38:12.000] It's kind of the steep part of the curve, although again, there's a lot of uncertainty

[01:38:12.000 --> 01:38:13.000] there.

[01:38:13.000 --> 01:38:15.640] But there's uncertainty, but all around bad, right?

[01:38:15.640 --> 01:38:16.640] Right.

[01:38:16.640 --> 01:38:17.640] How bad?

[01:38:17.640 --> 01:38:18.640] That's all.

[01:38:18.640 --> 01:38:19.640] Right.

[01:38:19.640 --> 01:38:23.080] Some of the regulations in regard to this stuff are so screwy.

[01:38:23.080 --> 01:38:28.520] I just read this article the other day about how, when car companies come out with more

[01:38:28.520 --> 01:38:37.240] EVs, they can actually make their other gas guzzlers more gas-guzzly, because the regulations

[01:38:37.240 --> 01:38:42.720] don't apply per vehicle; like, if you make a Suburban, the regulation doesn't say

[01:38:42.720 --> 01:38:44.780] it has to have this miles per gallon.

[01:38:44.780 --> 01:38:49.880] It's their whole fleet, all the cars that they're making, that has to hit an average.

[01:38:49.880 --> 01:38:53.880] So actually coming out with EVs means they can be coming out with bigger gas guzzlers

[01:38:53.880 --> 01:38:57.600] and still meet the standards, which is so screwy.

[01:38:57.600 --> 01:38:58.600] Right.

[01:38:58.600 --> 01:38:59.600] Awful.

[01:38:59.600 --> 01:39:00.600] Yeah.
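
A simplified illustration of the fleet-averaging effect being described here. The vehicle numbers, the plain arithmetic mean, and the EV "equivalent mpg" figure are all invented for this sketch; the actual fuel-economy rules are more complicated (sales-weighted harmonic averaging, credit schemes, and so on):

 # Toy example: adding one generously rated EV lets the same fleet average
 # absorb a thirstier gas vehicle. All numbers are made up for illustration.
 def fleet_average(mpgs):
     return sum(mpgs) / len(mpgs)   # simple mean; the real rules do not work this way

 without_ev = [45, 30, 18]          # sedan, crossover, big SUV (mpg)
 with_ev = [120, 45, 30, 14]        # add an EV-equivalent rating, make the SUV thirstier

 print(fleet_average(without_ev))   # 31.0
 print(fleet_average(with_ev))      # 52.25, so the fleet number "improves" anyway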

[01:39:00.600 --> 01:39:03.360] But we were talking about this last week.

[01:39:03.360 --> 01:39:05.400] There are some positive things.

[01:39:05.400 --> 01:39:06.720] There was actually just a really good article.

[01:39:06.720 --> 01:39:13.880] I think just today in the Washington Post, maybe it was yesterday by an expert who was

[01:39:13.880 --> 01:39:14.880] reviewing the situation.

[01:39:14.880 --> 01:39:20.160] They were like, if you look at the industry, the industry has fundamentally changed in

[01:39:20.160 --> 01:39:27.080] the last year, especially with the Inflation Reduction Act and the other recent regulation.

[01:39:27.080 --> 01:39:33.400] All of the industry basically, not just automobile, but other industries

[01:39:33.400 --> 01:39:39.400] that produce carbon and, you know, the energy sector.

[01:39:39.400 --> 01:39:41.680] It's all basically...

[01:39:41.680 --> 01:39:48.520] The flip has been from thinking of climate change mitigation as an expense and a money

[01:39:48.520 --> 01:39:54.480] loser to thinking about it as a job creator and an opportunity.

[01:39:54.480 --> 01:40:00.600] That's been really a fundamental shift that's happened with these regulations, and companies

[01:40:00.600 --> 01:40:08.640] are now throwing billions of dollars, you know, towards transitioning to lower carbon,

[01:40:08.640 --> 01:40:13.920] you know, industry processes, whether it's, you know, cars or energy production or whatever,

[01:40:13.920 --> 01:40:16.160] because they see that now as an opportunity.

[01:40:16.160 --> 01:40:19.160] So that was the flip we needed to happen.

[01:40:19.160 --> 01:40:20.160] Right.

[01:40:20.160 --> 01:40:26.080] Where the economic incentives are in favor of climate mitigation, not thought of

[01:40:26.080 --> 01:40:29.120] as a sacrifice we have to make.

[01:40:29.120 --> 01:40:32.040] It's like, no, this is a good idea that's going to actually make us money.

[01:40:32.040 --> 01:40:33.040] Right.

[01:40:33.040 --> 01:40:34.040] It's such a win win win.

[01:40:34.040 --> 01:40:35.560] It's an investment in the future.

[01:40:35.560 --> 01:40:40.180] It's going to position us really well to outcompete our competitors, et cetera.

[01:40:40.180 --> 01:40:42.920] So that gives me some hope.

[01:40:42.920 --> 01:40:47.040] Again, I still, you know, I'd be pleasantly surprised if things really accelerate to the point where

[01:40:47.040 --> 01:40:50.960] we can keep below 1.5, and I'm not going to give up on it, you know, just as

[01:40:50.960 --> 01:40:52.200] a political thing.

[01:40:52.200 --> 01:40:58.160] But yeah, it's just we've made it really, really hard to keep to that level.

[01:40:58.160 --> 01:40:59.600] All right.

[01:40:59.600 --> 01:41:00.600] Good job, guys.

[01:41:00.600 --> 01:41:04.440] David, it was your first time, you know, and you did a good job of reasoning your way through

[01:41:04.440 --> 01:41:05.440] it.

[01:41:05.440 --> 01:41:06.440] This was a tricky one.

[01:41:06.440 --> 01:41:07.440] Mm hmm.

[01:41:07.440 --> 01:41:08.440] You got me on that pessimism.

Skeptical Quote of the Week (1:41:09)

In the field of thinking, the whole history of science – from geocentrism to the Copernican revolution, from the false absolutes of Aristotle's physics to the relativity of Galileo's principle of inertia and to Einstein's theory of relativity – shows that it has taken centuries to liberate us from the systematic errors, from the illusions caused by the immediate point of view as opposed to "decentered" systematic thinking.

Jean Piaget (1896-1980), Swiss psychologist

[01:41:08.440 --> 01:41:09.440] All right.

[01:41:09.440 --> 01:41:15.360] Evan, give us a quote. "In the field of thinking, the whole history of science, from geocentrism

[01:41:15.360 --> 01:41:20.680] to the Copernican revolution, from the false absolutes of Aristotle's physics to the

[01:41:20.680 --> 01:41:27.480] relativity of Galileo's principle of inertia and to Einstein's theory of relativity, shows

[01:41:27.480 --> 01:41:33.180] that it has taken centuries to liberate us from the systematic errors, from the illusions

[01:41:33.180 --> 01:41:38.960] caused by the immediate point of view, as opposed to 'decentered' systematic

[01:41:38.960 --> 01:41:39.960] thinking."

[01:41:39.960 --> 01:41:40.960] Jean Piaget.

[01:41:40.960 --> 01:41:48.320] I know that was a little bit long. You know who Jean Piaget is, right?

[01:41:48.320 --> 01:41:49.320] Of course.

[01:41:49.320 --> 01:41:50.320] I'm in psychology.

[01:41:50.320 --> 01:41:51.320] Yeah.

[01:41:51.320 --> 01:41:52.320] I didn't.

[01:41:52.320 --> 01:41:53.320] That would be bad.

[01:41:53.320 --> 01:41:55.500] You'd be like, well, Cara, what have you been doing in your degree?

[01:41:55.500 --> 01:41:59.880] The takeaway is that it's taken centuries to liberate us from the systematic errors

[01:41:59.880 --> 01:42:02.940] and the illusions caused by immediate points of view.

[01:42:02.940 --> 01:42:07.760] So these basically, yeah, these things take time.

[01:42:07.760 --> 01:42:13.580] It takes time for us to make all the the corrections that need to be made so that we do have a

[01:42:13.580 --> 01:42:15.920] better understanding of the universe around us.

[01:42:15.920 --> 01:42:20.200] Yeah, it's very meta, which I like in that he's stepping back and saying, you know,

[01:42:20.200 --> 01:42:26.420] one of the trends, big picture, broad brushstroke trends in intellectual thinking over the course

[01:42:26.420 --> 01:42:31.440] of human history is that we've progressively gone from looking at what's right in front

[01:42:31.440 --> 01:42:36.880] of us, both literally and metaphorically, like logically, like just the immediate most

[01:42:36.880 --> 01:42:44.280] apparent thing to thinking in much more abstract, systematic, institutional ways, you know,

[01:42:44.280 --> 01:42:47.680] and even the counterintuitive, less obvious things.

[01:42:47.680 --> 01:42:50.480] That's been a general trend and it's a good thing.

[01:42:50.480 --> 01:42:56.960] It's only because that narrow, immediate thinking is an intellectual prison, right?

[01:42:56.960 --> 01:43:01.680] It sort of traps you in thinking very small, which is not a good way to understand the

[01:43:01.680 --> 01:43:03.760] universe.

[01:43:03.760 --> 01:43:10.400] It's fascinating that this was his perspective, because I think at first blush, a lot of people

[01:43:10.400 --> 01:43:18.160] just simply associate Piaget with like, models and theories of child development.

[01:43:18.160 --> 01:43:25.300] That's like a child psychologist. But his interest in children and in children's

[01:43:25.300 --> 01:43:30.760] ability to learn and in children's ability to make generational change was, like, fundamental

[01:43:30.760 --> 01:43:34.940] to how he wrote. Like, he was really interested in epistemology, and how children are the

[01:43:34.940 --> 01:43:38.960] carriers of the change of how we know what we know.

[01:43:38.960 --> 01:43:44.000] So that really tracks with that quote in a kind of beautiful way.

[01:43:44.000 --> 01:43:45.000] Yeah.

[01:43:45.000 --> 01:43:46.000] Cool.

Signoff

[01:43:46.000 --> 01:43:47.000] All right.

[01:43:47.000 --> 01:43:48.000] Well, David, thanks for joining us on the show.

[01:43:48.000 --> 01:43:49.000] Thanks, man.

[01:43:49.000 --> 01:43:50.000] We really enjoyed having you.

[01:43:50.000 --> 01:43:51.000] Hey, David.

[01:43:51.000 --> 01:43:52.000] Thanks, David.

[01:43:52.000 --> 01:43:53.000] Thank you guys for having me.

[01:43:53.000 --> 01:43:54.000] Thank you for all your support.

[01:43:54.000 --> 01:43:55.000] We appreciate it.

[01:43:55.000 --> 01:43:56.000] Yeah, absolutely.

[01:43:56.000 --> 01:43:57.000] Absolutely.

[01:43:57.000 --> 01:43:58.000] And thank you guys for joining me again this week.

[01:43:58.000 --> 01:43:59.000] Thank you, Doctor.

[01:43:59.000 --> 01:44:00.000] Sure, man.

[01:44:00.000 --> 01:44:01.000] Of course. S: —and until next week, this is your Skeptics' Guide to the Universe.

S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.


Today I Learned

  • Fact/Description, possibly with an article reference[11]
  • Fact/Description
  • Fact/Description

Notes

References

Vocabulary

