SGU Episode 900



October 8th 2022

The silk moth, whose complete
pan-genome has been published[1]

SGU 899                      SGU 901

Skeptical Rogues
S: Steven Novella

B: Bob Novella

C: Cara Santa Maria

J: Jay Novella

E: Evan Bernstein

Quote of the Week

No one undertakes research in physics with the intention of winning a prize. It is the joy of discovering something no one knew before.

Stephen Hawking, English theoretical physicist

Links
Download Podcast
Show Notes
Forum Discussion

Introduction, hurricanes and earthquakes

Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.

[00:08.920 --> 00:12.760] Hello and welcome to the Skeptics' Guide to the Universe.

[00:12.760 --> 00:17.440] Today is Thursday, October 6th, 2022, and this is your host, Steve Novella.

[00:17.440 --> 00:19.200] Joining me this week are Bob Novella,

[00:19.200 --> 00:20.200] Hey everybody.

[00:20.200 --> 00:21.200] Cara Santa Maria,

[00:21.200 --> 00:22.200] Howdy.

[00:22.200 --> 00:23.200] Jay Novella,

[00:23.200 --> 00:24.200] Hey guys.

[00:24.200 --> 00:25.200] And Evan Bernstein.

[00:25.200 --> 00:27.080] Good evening folks.

[00:27.080 --> 00:33.280] So 900, 900, 900, 900, wait, what?

[00:33.280 --> 00:35.600] This is SGU episode 900, nine zero zero.

[00:35.600 --> 00:36.600] How did that happen?

[00:36.600 --> 00:39.120] I know, it kind of snuck up on you.

[00:39.120 --> 00:40.120] After 899, you know.

[00:40.120 --> 00:41.120] Who would have thought of that?

[00:41.120 --> 00:46.020] Yeah, we're not going to make a big deal out of it.

[00:46.020 --> 00:50.000] But what this means is that we're just a little bit over two years away from 1000.

[00:50.000 --> 00:53.160] I think we're going to have to do something big.

[00:53.160 --> 00:54.160] Something big.

[00:54.160 --> 00:55.720] It's going to have to be something big.

[00:55.720 --> 00:57.400] Like record naked or something.

[00:57.400 --> 01:01.360] Yeah, like can it be, can it be something big, but also something that's not exhausting?

[01:01.360 --> 01:03.560] I don't know, Cara, that's a big ask.

[01:03.560 --> 01:04.560] Can we find that, pal?

[01:04.560 --> 01:05.560] Like a 48-hour episode?

[01:05.560 --> 01:06.560] No!

[01:06.560 --> 01:12.200] We will come up with something to do for 1000, but.

[01:12.200 --> 01:13.200] What about a destination?

[01:13.200 --> 01:14.200] Can we make it?

[01:14.200 --> 01:15.200] Ooh.

[01:15.200 --> 01:16.200] I like that.

[01:16.200 --> 01:17.200] Going somewhere?

[01:17.200 --> 01:18.200] Me too.

[01:18.200 --> 01:19.200] Disney World?

[01:19.200 --> 01:20.200] No, think bigger, Bob.

[01:20.200 --> 01:21.200] Yeah, right?

[01:21.200 --> 01:22.200] Yeah.

[01:22.200 --> 01:23.200] Okay.

[01:23.200 --> 01:24.200] Disney World during Halloween.

[01:24.200 --> 01:29.280] Two years, two years to plan.

[01:29.280 --> 01:30.520] We're open to suggestions.

[01:30.520 --> 01:31.520] Two years.

[01:31.520 --> 01:32.520] Yeah, we can...

[01:32.520 --> 01:37.840] It should be around the, ooh, Labor Day of 2024, right?

[01:37.840 --> 01:38.840] It'll be 2024.

[01:38.840 --> 01:39.840] Late 2024.

[01:39.840 --> 01:40.840] Yeah.

[01:40.840 --> 01:43.880] Cara, how did you survive your hurricane?

[01:43.880 --> 01:44.880] I survived.

[01:44.880 --> 01:50.400] It was kind of weird because before it actually made landfall, I'm trying to remember.

[01:50.400 --> 01:52.340] So Hurricane Ian, it was Tropical Storm Ian.

[01:52.340 --> 01:57.260] It was apparently, did you guys see this, two miles per hour shy of being a category

[01:57.260 --> 01:58.260] five.

[01:58.260 --> 01:59.260] Yes.

[01:59.260 --> 02:03.280] Like it didn't quite, it was literally like within a couple of miles per hour of making the cutoff.

[02:03.280 --> 02:05.540] So it was still technically a category four.

[02:05.540 --> 02:10.760] But before it even made landfall, I would say about two days before it made landfall,

[02:10.760 --> 02:15.280] we had all of the storms that were spinning off of it hit where I am in South Florida

[02:15.280 --> 02:16.600] in Fort Lauderdale.

[02:16.600 --> 02:20.680] So two tornadoes touched down within a couple miles of me.

[02:20.680 --> 02:26.320] All night there were these like tornado warnings going off on my phone and just like a lot

[02:26.320 --> 02:27.740] of like severe weather warnings.

[02:27.740 --> 02:28.740] And they're so funny.

[02:28.740 --> 02:32.280] They're like, seek shelter in basement immediately, which is like, nobody in Florida has a basement.

[02:32.280 --> 02:34.320] Like why would you put that warning there?

[02:34.320 --> 02:35.320] Right?

[02:35.320 --> 02:36.320] Well, they got to say it.

[02:36.320 --> 02:37.320] Yeah.

[02:37.320 --> 02:38.320] Exactly.

[02:38.320 --> 02:42.440] And then our clinic was closed the next day.

[02:42.440 --> 02:45.280] I mean, they waited way too long to close it, but they did ultimately close it the

[02:45.280 --> 02:46.680] next day.

[02:46.680 --> 02:48.600] But the weather was just windy.

[02:48.600 --> 02:49.880] Nothing really happened here.

[02:49.880 --> 02:51.040] It didn't really even rain.

[02:51.040 --> 02:53.680] And then the day that it made landfall, I think it was either later that day or early

[02:53.680 --> 02:59.480] the next morning that it made landfall, it just devastated Fort Myers on the West Coast

[02:59.480 --> 03:00.480] and just pummeled.

[03:00.480 --> 03:01.760] Like people, a lot of people died.

[03:01.760 --> 03:03.200] Like it was terrible.

[03:03.200 --> 03:06.960] So much property damage, but we didn't see any of it where I am.

[03:06.960 --> 03:08.520] And that's the thing about hurricanes.

[03:08.520 --> 03:10.600] They're just, they're predictable.

[03:10.600 --> 03:15.160] They're more predictable than tornadoes, but they can just turn on a dime and they can

[03:15.160 --> 03:16.160] pick up speed.

[03:16.160 --> 03:17.500] They can drop speed.

[03:17.500 --> 03:21.880] The forecasts are amazing, but they also have a mind of their own.

[03:21.880 --> 03:26.040] And then I had this whole travel saga where I was going back to LA that weekend and it

[03:26.040 --> 03:28.560] was not even related to the hurricane, which was extra obnoxious.

[03:28.560 --> 03:31.480] It was related to emergency lights on the floor of the plane.

[03:31.480 --> 03:32.480] Kill me.

[03:32.480 --> 03:33.480] I was delayed 14 hours.

[03:33.480 --> 03:38.880] But I was following, yeah, it was bad, but I was following the news and this was amazing.

[03:38.880 --> 03:45.200] So as Ian made landfall, it like did all this destruction and damage because it was a super

[03:45.200 --> 03:46.200] powerful storm.

[03:46.200 --> 03:50.160] And then of course it starts to lose energy and it was downgraded to a tropical storm

[03:50.160 --> 03:54.180] as it moved inland and like dumped a bunch of water on a bunch of inland cities.

[03:54.180 --> 03:59.300] After it crossed over Florida and hit water again, it picked back up and became a category

[03:59.300 --> 04:03.960] one or two, maybe just a one, but it was recategorized as a hurricane again.

[04:03.960 --> 04:05.440] It was that powerful.

[04:05.440 --> 04:06.440] That's like nuts.

[04:06.440 --> 04:07.440] It refueled itself.

[04:07.440 --> 04:08.440] Yeah.

[04:08.440 --> 04:09.440] It was kind of warm water.

[04:09.440 --> 04:10.440] Yeah.

[04:10.440 --> 04:13.040] Like that's, that's, that's a pretty intense storm.

[04:13.040 --> 04:17.160] I mean, this is like, was the worst one that hit Florida in decades, right?

[04:17.160 --> 04:21.840] I think, well, what I read was that it was the worst since 2018, which to me doesn't

[04:21.840 --> 04:23.680] seem like it was that long ago.

[04:23.680 --> 04:28.000] But I think they were saying it could have been, or it was almost like the fifth worst

[04:28.000 --> 04:30.300] storm in America.

[04:30.300 --> 04:32.880] Like there were, it was back when they were doing all the forecasts, they were like, this

[04:32.880 --> 04:34.040] is gearing up to be X.

[04:34.040 --> 04:39.560] And so I haven't seen the final takeaway of like, you know, the ranking system of like

[04:39.560 --> 04:41.880] worst storms to hit Florida, worst storms to hit America.

[04:41.880 --> 04:47.560] Of course, most of the terrible storms to hit America hit Florida, but also Puerto Rico,

[04:47.560 --> 04:51.640] also Texas, and some nor'easters in the Northeast.

[04:51.640 --> 04:54.540] But Florida gets pummeled by hurricanes all the time.

[04:54.540 --> 04:56.440] And you should have seen the people I work with down here.

[04:56.440 --> 04:58.480] Like I was like, uh, are we going to be okay?

[04:58.480 --> 04:59.480] And they're like, this is nothing.

[04:59.480 --> 05:00.480] You'll be fine.

[05:00.480 --> 05:01.480] They're like so.

[05:01.480 --> 05:02.480] They're like, whatever.

[05:02.480 --> 05:05.040] They probably won't even cancel school.

[05:05.040 --> 05:06.040] And they almost didn't.

[05:06.040 --> 05:08.580] I was like, what is happening?

[05:08.580 --> 05:11.600] Is that like earthquake culture in California similar?

[05:11.600 --> 05:13.080] Well, just get used to it.

[05:13.080 --> 05:17.260] And they're so used to everybody freaking out and then it like not hitting them.

[05:17.260 --> 05:19.720] But the sad thing is it probably does hit somewhere.

[05:19.720 --> 05:23.200] And so people who have actually lived through the devastation, like the people in Fort Myers

[05:23.200 --> 05:26.560] who lost everything, I don't think they have that casual view.

[05:26.560 --> 05:30.800] It's the people who have been so lucky that they've like just dodged a lot of bullets.

[05:30.800 --> 05:31.800] But I don't know.

[05:31.800 --> 05:32.800] I have the same thing with tornadoes.

[05:32.800 --> 05:36.040] Like my mom was texting me like, get in the bathtub, grab Kitler, get in the bathtub.

[05:36.040 --> 05:38.440] I was like, mom, how many tornado warnings have we lived through?

[05:38.440 --> 05:40.920] I never got in the bathtub when we were little.

[05:40.920 --> 05:41.920] What are you doing?

[05:41.920 --> 05:42.920] Get in the bathtub.

[05:42.920 --> 05:43.920] I know.

[05:43.920 --> 05:44.920] It just sounds so ridiculous.

[05:44.920 --> 05:45.920] I know.

[05:45.920 --> 05:46.920] She's like, are you safe?

[05:46.920 --> 05:47.920] And I'm like, I don't know.

[05:47.920 --> 05:48.920] I'm in my apartment.

[05:48.920 --> 05:49.920] She's like, can you get to a lower level?

[05:49.920 --> 05:50.920] I'm like, what?

[05:50.920 --> 05:51.920] Go to somebody else's apartment?

[05:51.920 --> 05:52.920] Like, no.

[05:52.920 --> 05:57.080] So you have to get into the bathtub, but you also have to bring your mattress with you

[05:57.080 --> 05:59.040] and cover yourself with your mattress.

[05:59.040 --> 06:02.360] Which is like the funniest thing because I don't know about you, but I can't lift my

[06:02.360 --> 06:03.360] mattress.

[06:03.360 --> 06:04.360] Right.

[06:04.360 --> 06:05.360] It's so...

[06:05.360 --> 06:06.360] It's not very practical for everyone.

[06:06.360 --> 06:07.360] It's putting your backpack.

[06:07.360 --> 06:08.360] Right.

[06:08.360 --> 06:14.440] I mean, you have all these people like suffocating under their mattresses.

[06:14.440 --> 06:16.000] That was not the intention.

[06:16.000 --> 06:17.480] Oh, my God.

[06:17.480 --> 06:18.480] Jeez.

What's the Word? (6:20)

[06:18.480 --> 06:22.200] All right, Cara, you're going to start us off with a what's the word?

[06:22.200 --> 06:23.200] What is the word?

[06:23.200 --> 06:26.680] Well, this week I figured we would do a bit of a refresher and I actually don't remember

[06:26.680 --> 06:30.200] because I keep a log of all the what's the words we've ever done.

[06:30.200 --> 06:33.760] And I was positive that we had done this one before, but maybe a long time ago, but it's

[06:33.760 --> 06:34.760] not in my list.

[06:34.760 --> 06:35.760] So I don't know.

[06:35.760 --> 06:36.760] You guys tell me if you remember doing this.

[06:36.760 --> 06:39.520] If not, either way, a refresher is important here.

[06:39.520 --> 06:45.800] I wanted to talk about the word hominin as opposed to hominid as opposed to hominoid.

[06:45.800 --> 06:48.040] And I know we've all heard all of these words.

[06:48.040 --> 06:52.440] I want to talk a little bit about the origin of these words.

[06:52.440 --> 06:57.720] Back in the day, you probably heard the word hominid all the time.

[06:57.720 --> 07:04.080] Like hominid was the main word that we would use in scientific parlance.

[07:04.080 --> 07:09.960] And within, I'd say about the past decade, we started increasingly seeing the word hominin

[07:09.960 --> 07:17.000] show up first within sort of circles of people who do this stuff for a living and then later

[07:17.000 --> 07:19.960] within like science communication and more sort of like outward things.

[07:19.960 --> 07:23.680] So I just want to kind of give you guys a little bit of background about what these

[07:23.680 --> 07:27.000] words are, what they mean, why we use them the way that we use them.

[07:27.000 --> 07:33.420] So one way to sort of think about this is that hominid, which is the word that we used

[07:33.420 --> 07:36.440] to use all the time, and we still do use, but it means something different.

[07:36.440 --> 07:39.800] They're all current and extinct great apes.

[07:39.800 --> 07:43.560] So pretty much anything you can think of that's a great ape, a human, a gorilla, an orangutan,

[07:43.560 --> 07:49.880] a chimpanzee, a bonobo, Homo erectus, Homo erectus, yeah, like anything that is current

[07:49.880 --> 07:50.880] or extinct.

[07:50.880 --> 07:51.880] Exactly.

[07:51.880 --> 07:52.980] Like you said, that's a great ape.

[07:52.980 --> 07:53.980] That's a hominid.

[07:53.980 --> 07:55.480] So it's quite, quite inclusive.

[07:55.480 --> 08:01.220] A hominoid is even more inclusive above that because it also includes gibbons, which are

[08:01.220 --> 08:02.560] lesser apes.

[08:02.560 --> 08:07.400] So it's basically all apes, not just great apes, but there is a distinction between gibbons

[08:07.400 --> 08:08.400] and the great apes.

[08:08.400 --> 08:10.960] So hominoids include even gibbons.

[08:10.960 --> 08:11.960] What about hominish?

[08:11.960 --> 08:12.960] Hominish?

[08:12.960 --> 08:13.960] Yeah.

[08:13.960 --> 08:20.440] That's for the ish, you know, actually we're going to be talking about things that are

[08:20.440 --> 08:24.440] hominish later in the show, I think.

[08:24.440 --> 08:30.200] But so hominoids, I'm going to put that one off to the side because it's not used as

[08:30.200 --> 08:34.240] often, but really the distinction here we're talking about is between hominid and hominin.

[08:34.240 --> 08:40.680] So hominid is all current and extinct great apes, but hominin specifically, it's more

[08:40.680 --> 08:49.240] narrow, specifically speaks to anything that follows along the human lineage.

[08:49.240 --> 08:54.300] And this really came about with learning about our genes because historically we were categorizing

[08:54.300 --> 09:00.320] things taxonomically by comparing anatomy, by looking at like shared features, morphology,

[09:00.320 --> 09:01.320] things like that.

[09:01.320 --> 09:04.540] But the more that we started to learn about our genes and learn about kind of where these

[09:04.540 --> 09:08.920] different genetic splits occurred, the more that we started to understand that there is

[09:08.920 --> 09:12.220] actually a difference between a hominid and a hominin.

[09:12.220 --> 09:18.120] So hominin now refers to all the species of modern humans and early humans after the split

[09:18.120 --> 09:21.920] from the common ancestor with chimpanzees about seven million years ago.

[09:21.920 --> 09:25.600] So chimps went kind of one lineage, humans went another lineage.

[09:25.600 --> 09:31.200] So all the things, all the intermediate species that happened along the way, that is a hominin.

[09:31.200 --> 09:36.440] A hominid of course also includes chimps and gorillas and orangutans.

[09:36.440 --> 09:39.480] So that split was much, much, much earlier.

[09:39.480 --> 09:40.480] Does that make sense?

[09:40.480 --> 09:42.320] Like as a bigger branching tree.

[09:42.320 --> 09:47.880] So if you look at the reasons why these names came to be, it all has to do with taxonomic

[09:47.880 --> 09:49.280] categories.

[09:49.280 --> 09:53.440] So hominoids are in a superfamily called Hominoidea.

[09:53.440 --> 09:56.280] So that's again the great and the lesser apes.

[09:56.280 --> 09:59.160] Hominids are the family Hominidae.

[09:59.160 --> 10:02.520] So that's all the great apes, including human beings.

[10:02.520 --> 10:08.980] Hominines, sometimes you'll see H-O-M-I-N-I-N-E-S, hominines are the subfamily Homininae.

[10:08.980 --> 10:13.440] And so that's human beings and all of their relatives.

[10:13.440 --> 10:17.440] Hominin comes, oh, actually there's a distinction here too, I should make that hominines are

[10:17.440 --> 10:19.600] actually distinct from hominins.

[10:19.600 --> 10:21.640] Hominines are the subfamily Homininae.

[10:21.640 --> 10:23.700] I think chimps are actually included in that.

[10:23.700 --> 10:26.400] So it's everything from the chimp split on.

[10:26.400 --> 10:31.000] And then hominins, which is the word that, you know, I opened this talking about, it's

[10:31.000 --> 10:32.540] actually a tribe name.

[10:32.540 --> 10:37.220] So it includes all bipedal apes within that human lineage.

[10:37.220 --> 10:42.840] And so you'll see that, for example, like the chimpanzee clade is Panini, or I think

[10:42.840 --> 10:46.000] it's pronounced pan-IN-ee, not panini, but it's spelled like panini.

[10:46.000 --> 10:47.720] That's Pan, right?

[10:47.720 --> 10:51.360] We've seen like Pan troglodyte, like you've seen those like names.

[10:51.360 --> 10:54.280] And so, and then Homo would be a human being.

[10:54.280 --> 10:59.200] So that's Hominini or hominini, Panini, I can't even, I just said that, I love it.

[10:59.200 --> 11:01.360] I want to just start calling them Panini.

[11:01.360 --> 11:04.320] Actually I found this really funny Reddit post where this guy's talking about it and

[11:04.320 --> 11:07.780] he said, Panini, it's spelled the same as the sandwich, but it's pronounced differently

[11:07.780 --> 11:10.440] and tastes different too.

[11:10.440 --> 11:11.440] Gross.

[11:11.440 --> 11:12.440] Clever.

[11:12.440 --> 11:17.840] And so yeah, when we look at the etymology, of course, it comes from the same root word

[11:17.840 --> 11:25.800] homo, which comes back from hominini, which is related to the word, I mean, there's so

[11:25.800 --> 11:28.160] many words that have homo in them, meaning man, right?

[11:28.160 --> 11:29.640] Like homunculus, like the little man.

[11:29.640 --> 11:34.660] And then -id is, is just that word-forming piece there at the end.

[11:34.660 --> 11:37.520] So Hominini is how we get the word hominin.

[11:37.520 --> 11:44.080] And there was sort of a decision made within the paleogenetic world that to be more specific

[11:44.080 --> 11:48.760] when we're talking about the human lineage, we should, we should say hominin because hominid

[11:48.760 --> 11:50.540] is too inclusive.

[11:50.540 --> 11:52.760] It includes all these other great apes as well.

[11:52.760 --> 11:54.240] Did I just confuse you further?

[11:54.240 --> 11:56.280] Well, it's getting more confusing.

[11:56.280 --> 11:58.800] They're just adding names to just parse things out.

[11:58.800 --> 12:03.280] I think it's just like shifting to the cladistic approach where like we're trying to parse

[12:03.280 --> 12:08.720] out all these very specific different clades and as we get more information about the evolution

[12:08.720 --> 12:09.720] of humans.

[12:09.720 --> 12:14.920] It's like, yeah, sometimes our old stuff, it's not quite wrong, but it's like less right

[12:14.920 --> 12:15.920] as we learn more things.

[12:15.920 --> 12:17.640] And so, I mean, you're so right, Steve.

[12:17.640 --> 12:23.840] I even have a quote here from a paleoanthropologist at University College London who said,

[12:23.840 --> 12:27.240] quote, when we start fiddling with names, everybody gets confused.

[12:27.240 --> 12:31.400] The transition over the last decade from hominids to hominins when we talk about human ancestors

[12:31.400 --> 12:32.700] has been a pain.

[12:32.700 --> 12:36.320] We've had to explain and re-explain and people still get it wrong half the time, which is

[12:36.320 --> 12:37.320] true.

[12:37.320 --> 12:39.000] I mean, it's also just hard to like learn.

[12:39.000 --> 12:43.880] If you've been in a field for a really long time and you've always used a certain lexicon

[12:43.880 --> 12:47.320] and then all of a sudden it's kind of turned on its ear, that can be difficult too.

[12:47.320 --> 12:51.680] So you've got experts within the field even using the terminology wrong because it's like,

[12:51.680 --> 12:52.680] eh, I don't know.

[12:52.680 --> 12:54.720] I wrote 17 books using that other word.

[12:54.720 --> 12:55.720] I know.

[12:55.720 --> 12:56.720] I was about to say that.

[12:56.720 --> 12:57.960] The other thing is it's not just a pain.

[12:57.960 --> 13:02.400] You suddenly render all the previously published technical studies obsolete.

[13:02.400 --> 13:05.400] And now you need like a footnote or an asterisk.

[13:05.400 --> 13:07.600] So it makes it difficult.

[13:07.600 --> 13:15.360] So I totally understand the need to make technical terms increasingly precise and to evolve with

[13:15.360 --> 13:17.400] our understanding of the science.

[13:17.400 --> 13:22.180] But they really do need to make an effort and prioritize stability as well, whenever

[13:22.180 --> 13:28.400] they can, just so it doesn't cause confusion in the literature and also just in the

[13:28.400 --> 13:30.520] public and in general discussion.

[13:30.520 --> 13:32.200] It's a no-win scenario.

[13:32.200 --> 13:33.200] You just got to pick your poison.

[13:33.200 --> 13:35.200] You just got to make a proper balance.

[13:35.200 --> 13:37.760] But this has been a particularly confusing one.

[13:37.760 --> 13:41.760] And the truth of the matter is if any of these terms are going to refer to people, right?

[13:41.760 --> 13:44.340] Because we are the most specific.

[13:44.340 --> 13:48.920] And so if you're talking about people, you can use a larger umbrella term and it'll be

[13:48.920 --> 13:49.920] fine.

[13:49.920 --> 13:51.880] It's kind of like primate, for example.

[13:51.880 --> 13:54.480] Like the word primate can refer to an ape or a monkey.

[13:54.480 --> 14:00.500] But if you try and call a human being or a gorilla or a chimpanzee a monkey, then that's

[14:00.500 --> 14:02.920] actually incorrect because they're apes.

[14:02.920 --> 14:03.920] Right.

[14:03.920 --> 14:04.920] Right.

[14:04.920 --> 14:05.920] All right.

[14:05.920 --> 14:06.920] Thank you, Kara.

News Items


Nobel Prize in Chemistry (14:06)

[14:06.920 --> 14:07.920] All right, guys, you know what time of year it is again.

[14:07.920 --> 14:09.240] It's Nobel Prize time.

[14:09.240 --> 14:14.320] We got three Nobel Prizes in the sciences, chemistry, medicine and physics.

[14:14.320 --> 14:18.960] Evan, we're going to start with you with the Nobel Prize in chemistry.

[14:18.960 --> 14:19.960] Yep.

[14:19.960 --> 14:21.900] The Nobel Prize in chemistry 2022.

[14:21.900 --> 14:24.900] It is all about click chemistry.

[14:24.900 --> 14:28.820] And that's when chemists gather together into small groups and talk about themselves and

[14:28.820 --> 14:30.560] outsiders need not apply.

[14:30.560 --> 14:31.560] Hello.

[14:31.560 --> 14:32.560] Yeah.

[14:32.560 --> 14:33.560] I'm here.

[14:33.560 --> 14:34.560] My icon.

[14:34.560 --> 14:35.560] Yes.

[14:35.560 --> 14:36.560] C-L-I-Q-U-E, click.

[14:36.560 --> 14:37.560] Right.

[14:37.560 --> 14:38.560] OK.

[14:38.560 --> 14:39.560] Take two.

[14:39.560 --> 14:40.560] Click chemistry.

[14:40.560 --> 14:41.560] All right.

[14:41.560 --> 14:43.720] Three chemists are sharing the prize this year.

[14:43.720 --> 14:49.200] Barry Sharpless, Morten Meldal and Carolyn Bertozzi.

[14:49.200 --> 14:51.080] And again, it all has to do with click chemistry.

[14:51.080 --> 14:54.280] So I'm going to start with Sharpless and Meldal.

[14:54.280 --> 15:01.560] In the early 2000s, Barry Sharpless and Morten Meldal, they laid the foundation for a functional

[15:01.560 --> 15:06.480] form of chemistry, which is now called click chemistry, in which molecular building blocks

[15:06.480 --> 15:09.260] snap together quickly and efficiently.

[15:09.260 --> 15:15.680] They liken it to assembling Lego blocks onto one another, which I guess is a very simplistic

[15:15.680 --> 15:17.200] sort of way of looking at it.

[15:17.200 --> 15:19.360] But you know, it draws the picture.

[15:19.360 --> 15:20.360] Barry Sharpless.

[15:20.360 --> 15:24.120] Now, this is his second Nobel Prize in chemistry.

[15:24.120 --> 15:25.120] That is rare.

[15:25.120 --> 15:29.520] I believe I read it was only he's the fifth person to have received two Nobel Prizes.

[15:29.520 --> 15:32.160] So that right off the bat, boom.

[15:32.160 --> 15:33.160] Very rare.

[15:33.160 --> 15:34.160] He started.

[15:34.160 --> 15:35.160] He got this all going.

[15:35.160 --> 15:38.660] And around the year 2000, he coined the concept of click chemistry.

[15:38.660 --> 15:45.120] So he basically came up with this idea, which is, it's simple, it's reliable, and it's

[15:45.120 --> 15:50.020] where reactions occur quickly and without the unwanted byproducts.

[15:50.020 --> 15:52.560] You can avoid all of those using click chemistry.

[15:52.560 --> 15:54.060] And then he went to work on it.

[15:54.060 --> 16:00.480] At the same time, Morten Meldal was also working on click chemistry.

[16:00.480 --> 16:03.200] So they were independent of each other.

[16:03.200 --> 16:11.440] But they both arrived at a moment or a discovery, which is called the copper-catalyzed azide-

[16:11.440 --> 16:18.960] alkyne cycloaddition, which is basically a way of saying, hey, we can use copper in

[16:18.960 --> 16:26.560] order to get these molecules to click together in a predictable way that will not

[16:26.560 --> 16:32.180] create all the randomness and/or the byproducts that come with just trying to slam a bunch

[16:32.180 --> 16:36.200] of azides and alkynes together without it.

[16:36.200 --> 16:39.280] And it's now in widespread use.

[16:39.280 --> 16:45.320] It was revolutionary at the time and it has taken off among its many uses.

[16:45.320 --> 16:50.240] It's utilized in the development of pharmaceuticals, for mapping DNA, and creating materials that

[16:50.240 --> 16:52.300] are more fit for purpose.

[16:52.300 --> 16:57.600] They discuss the material sciences, including polymers and gels.

[16:57.600 --> 17:01.500] And it's it's fascinating that they're able to do this.

[17:01.500 --> 17:06.680] So basically what happens is you take an azide, which is any class of chemical compounds containing

[17:06.680 --> 17:11.640] three nitrogen atoms as a group, and then you have your alkynes.

[17:11.640 --> 17:17.760] And they belong to the class of unsaturated aliphatic hydrocarbons, which have a carbon-

[17:17.760 --> 17:19.580] carbon triple bond.

[17:19.580 --> 17:28.520] So you've got these and when you combine them using the copper, you get a triazole.

[17:28.520 --> 17:31.720] And that is C2H3N3.

[17:31.720 --> 17:35.040] It's a five membered ring of two carbon atoms and three nitrogen atoms.

[17:35.040 --> 17:36.040] And there you go.

[17:36.040 --> 17:44.360] They describe it almost like a buckle, a linking mechanism between these two molecules.

[17:44.360 --> 17:52.200] And yes, so that has been used a lot in chemistry in the last 20 years or

[17:52.200 --> 17:53.200] so.

[17:53.200 --> 17:54.200] Where was I reading?

[17:54.200 --> 17:58.840] They said that at the time there have now been where is it?

[17:58.840 --> 18:00.880] A thousand they've come up with.

[18:00.880 --> 18:07.320] There are over a thousand papers published now that reference this technique and hundreds

[18:07.320 --> 18:13.920] of structures that have been added to the database of new of new molecular structures

[18:13.920 --> 18:18.440] that have arisen because they have been able to do this.

[18:18.440 --> 18:19.600] That's awesome.

[18:19.600 --> 18:27.940] Now the third recipient, Carolyn Bertozzi, what she did was, in a way, even cooler.

[18:27.940 --> 18:33.720] So because you've got copper in these new molecules that you're

[18:33.720 --> 18:39.440] linking together, it's great for some things, but not for everything.

[18:39.440 --> 18:44.520] In other words, biology, you know, biology, if you've got too much copper and you're trying

[18:44.520 --> 18:48.880] to introduce it into a living biological organism, that's not good.

[18:48.880 --> 18:50.960] You can get copper toxicity as a result.

[18:50.960 --> 18:53.740] So it's really not great for doing that.

[18:53.740 --> 19:03.000] So Carolyn Bertozzi wanted to find a way to get this idea to work in a biological organism

[19:03.000 --> 19:05.280] and she was able to successfully do it.

[19:05.280 --> 19:11.620] So it says here that she mapped these important but elusive biomolecules on the surface of

[19:11.620 --> 19:13.220] cells.

[19:13.220 --> 19:18.120] She developed these click reactions that work inside the organisms themselves and these

[19:18.120 --> 19:23.680] bio orthogonal reactions, they take place without disrupting the normal chemistry of

[19:23.680 --> 19:30.800] the cell or these toxins that would otherwise interfere with all of the other normal chemistry

[19:30.800 --> 19:33.480] that's going on in that environment.

[19:33.480 --> 19:38.920] And I suppose it was her that coined the term bioorthogonal reactions.

[19:38.920 --> 19:42.120] She was the one basically who discovered that this can work.

[19:42.120 --> 19:48.360] And what she did in her research is she attached fluorescent molecules to the glycans to make

[19:48.360 --> 19:50.400] them easier to map.

[19:50.400 --> 19:57.080] So you take your azide, it's a sugar bound azide, and you add that to a living cell membrane

[19:57.080 --> 20:01.720] and it attaches to the sugars that live on that membrane.

[20:01.720 --> 20:07.640] And then you use a fluorescent tag, which is on the other part, the alkyne.

[20:07.640 --> 20:14.640] You tag the alkyne with the fluorescent tag and through that you can then get it to bind

[20:14.640 --> 20:17.980] to the azide that's already on top of the cell.

[20:17.980 --> 20:23.100] No copper needed, no contamination, no issue with toxicity.

[20:23.100 --> 20:24.100] And there you go.

[20:24.100 --> 20:30.060] You can start tagging the specific cells that you want with this fluorescent marker so that

[20:30.060 --> 20:34.960] you can see which ones are lighting up when it attaches to the specific cells that you're

[20:34.960 --> 20:35.960] targeting.

[20:35.960 --> 20:36.960] Yeah, it's cool.

[20:36.960 --> 20:43.000] Now, I tried to find though a good like operational definition of what click chemistry is.

[20:43.000 --> 20:46.320] I couldn't really find something that I found satisfying.

[20:46.320 --> 20:47.320] You know what I mean?

[20:47.320 --> 20:52.920] Like it definitely, it sounds like it's, Oh, we're going to have, you know, identify these

[20:52.920 --> 20:57.040] chemical reactions that have these features, you know, that they have high energy so they'll

[20:57.040 --> 20:58.320] happen quickly and completely.

[20:58.320 --> 21:04.820] They won't have any bad side effects, et cetera, but no, no underlying chemical reason

[21:04.820 --> 21:08.800] why these particular reactions have it or anything more specific about it.

[21:08.800 --> 21:10.240] It's, you know what I mean?

[21:10.240 --> 21:14.880] It's just an approach to chemistry, I guess, like we're going to come up with a list of

[21:14.880 --> 21:20.960] these useful reactions and then use them to make molecules essentially.

[21:20.960 --> 21:27.320] Well, in what I read, certainly, Steve, I'm not very familiar with these fields,

[21:27.320 --> 21:31.960] obviously, but I did do some reading today and watched some videos, including from some

[21:31.960 --> 21:34.640] of these scientists.

[21:34.640 --> 21:43.000] And they sort of emphasized that the predictability was something that was the common feature

[21:43.000 --> 21:50.000] or the idea that gave it its tag, in a sense, to call it part of click chemistry

[21:50.000 --> 21:52.720] or at least click chemistry that works.

[21:52.720 --> 21:58.200] And right, using the copper as the buckle for this, in other words, as was first

[21:58.200 --> 22:00.360] discovered, is not the only way to do it.

[22:00.360 --> 22:05.520] They have found some other ones since then, but what they seem to, again, stress is that

[22:05.520 --> 22:09.300] you can have predictable outcomes when you do it this way.

[22:09.300 --> 22:14.400] As opposed to doing it in other ways, where it's more random or

[22:14.400 --> 22:17.480] you come up with less desirable results.

[22:17.480 --> 22:21.360] So I imagine it's just the success of the outcome that perhaps defines it.

[22:21.360 --> 22:22.360] Yeah.

[22:22.360 --> 22:24.240] It's just defined kind of in a weird way for me.

[22:24.240 --> 22:29.220] It's not like, you know, organic chemistry uses reactions that include carbon.

[22:29.220 --> 22:34.560] This is just useful chemistry, as in chemistry with features that we like, and they figured

[22:34.560 --> 22:39.240] out a way to make it work. But the approach, yeah, it does seem like it works because

[22:39.240 --> 22:44.440] they were able to dramatically increase the number of new pharmaceutical and other

[22:44.440 --> 22:50.160] chemicals that they were able to manufacture by just following this process of using these

[22:50.160 --> 22:51.700] types of reactions.

Nobel Prize in Physics (22:55)

[22:51.700 --> 22:58.320] Like as you say, like Lego building blocks, Bob, tell us about the Nobel prize in physics.

[22:58.320 --> 23:03.800] Well guys, congratulations to the winners of the Nobel prize for physics in 2022.

[23:03.800 --> 23:09.800] John Clauser of J.F. Clauser and Associates, Alain Aspect of the Institut d'Optique

[23:09.800 --> 23:16.440] in Paris, France, and Anton Zeilinger of the University of Vienna in Austria.

[23:16.440 --> 23:22.160] They won for independent work that they've done over many decades, all related to quantum

[23:22.160 --> 23:29.160] mechanics, all achieving a similar and rare goal of proving Einstein wrong.

[23:29.160 --> 23:34.040] So Einstein famously had issues with the more bizarre nature of quantum mechanics, famously

[23:34.040 --> 23:38.580] referring to those parts of it as spooky action at a distance, which is very appropriate for

[23:38.580 --> 23:43.920] this month of October, of course, his perhaps more famous quote, God doesn't play dice

[23:43.920 --> 23:49.440] with the universe was also a jab at the belief that quantum mechanics showed that nature

[23:49.440 --> 23:53.720] was fundamentally uncertain, random and probabilistic.

[23:53.720 --> 23:59.560] So yeah, a lot of big towering scientists believed it, Einstein and a few others didn't.

[23:59.560 --> 24:04.880] Einstein believed that at a fundamental level, nature was knowable and concrete with none

[24:04.880 --> 24:09.920] of the impenetrable fuzziness that has been, you know, ascribed to it.

[24:09.920 --> 24:15.720] So now Einstein codified this position, you know, this fundamental nature of reality,

[24:15.720 --> 24:21.800] if you will, in his very famous EPR paper with scientists, Boris Podolsky and Nathan

[24:21.800 --> 24:24.960] Rosen, that's EPR, Einstein, Podolsky, Rosen.

[24:24.960 --> 24:29.360] In that they said quantum mechanics must be incomplete since it predicts that measuring

[24:29.360 --> 24:35.680] one particle could affect another correlated particle millions of miles away instantly.

[24:35.680 --> 24:39.720] This of course is quantum entanglement, which we've talked about many times, which kind

[24:39.720 --> 24:44.480] of became the poster boy for proving whether or not the universe was spooky.

[24:44.480 --> 24:48.840] Now the next chapter of the saga occurred with John Stewart Bell, a theoretical physicist

[24:48.840 --> 24:49.840] at CERN.

[24:49.840 --> 24:54.120] He famously described a thought experiment that could decide who was correct, Einstein

[24:54.120 --> 25:00.160] or quantum mechanics essentially, and this is where these newest Nobel Prize winners

[25:00.160 --> 25:04.040] come in, at least in my narrative description here.

[25:04.040 --> 25:08.820] So Dr. Clauser was the first to test John Bell's thought experiment.

[25:08.820 --> 25:14.100] Going in, Clauser thought Einstein was right and that there was a deeper layer to quantum

[25:14.100 --> 25:18.640] mechanics that could predict and do away with this apparent randomness.

[25:18.640 --> 25:24.240] So in his experiments, thousands of entangled photons were sent in opposite directions,

[25:24.240 --> 25:31.240] each detector showing randomly polarized light, either up, down, down, down, down, up, up,

[25:31.240 --> 25:32.240] up, down, up.

[25:32.240 --> 25:34.400] I mean, just like random up and down.

[25:34.400 --> 25:42.040] But when you compared the paired detectors on each end though, each one was paired, each

[25:42.040 --> 25:45.040] up was paired with a down and vice versa.

[25:45.040 --> 25:47.880] So something was definitely going on there.

[25:47.880 --> 25:51.960] Classical physics could not explain it, nor could Einstein's laws.

[25:51.960 --> 25:57.760] Weird and counterintuitive quantum entanglement had passed its first major test.

[25:57.760 --> 26:02.980] So now I'm sure Clauser was happy, but as a good scientist, he then considered loopholes

[26:02.980 --> 26:08.460] to his experiment that somebody unreasonably attached to a pet theory could use to explain

[26:08.460 --> 26:10.880] that behavior, but I'll say he was a good scientist.

[26:10.880 --> 26:16.920] So he was thinking about the loopholes and how to close them in his experiment and basically

[26:16.920 --> 26:20.440] raise the sigma to a more acceptable level.

[26:20.440 --> 26:25.200] So one of those was the locality loophole in which the laboratory instruments themselves

[26:25.200 --> 26:28.400] might have been leaking information to each other.

[26:28.400 --> 26:32.800] Nobel prize winner number two enters this story, Alain Aspect.

[26:32.800 --> 26:38.640] He also wanted to turn Bell's thought experiment into a real boy, a real experiment.

[26:38.640 --> 26:44.300] And he Aspect targeted Clauser's locality loophole by changing the experiment to remove

[26:44.300 --> 26:45.940] it in many ways.

[26:45.940 --> 26:51.440] He changed the experiment by changing the direction that they measured the photons every

[26:51.440 --> 26:57.080] 10 nanoseconds, trying to prevent any sharing of information since the photons were already

[26:57.080 --> 26:59.600] in the air, they couldn't communicate that fast.

[26:59.600 --> 27:02.720] So that was a really big experiment that he did.

[27:02.720 --> 27:08.560] And he again showed Einstein to be wrong and also put quantum entanglement on the map in

[27:08.560 --> 27:09.840] a real sense.

[27:09.840 --> 27:17.320] He really helped make it even more acceptable and got other scientists

[27:17.320 --> 27:24.280] to consider this and try to use this for other endeavors and try to apply it, really put

[27:24.280 --> 27:25.880] it on the map, so to speak.

[27:25.880 --> 27:29.900] And then came the third Nobel winner, Anton Zeilinger.

[27:29.900 --> 27:33.400] He took Aspect's experiment and basically made it even better.

[27:33.400 --> 27:38.540] He made it even more random and unpredictable, really throwing randomness in there.

[27:38.540 --> 27:44.820] That was just to do away with any lingering aspects of this locality

[27:44.820 --> 27:47.200] loophole that might have been in there.

[27:47.200 --> 27:54.480] And essentially, it seems that Zeilinger closed the locality loophole.

[27:54.480 --> 28:00.560] And since then, he's been even going beyond that to try to remove any of the other minute

[28:00.560 --> 28:04.420] possibilities for bias in any of these experiments.

[28:04.420 --> 28:13.720] Now after so many years, I'm glad that they are recognizing these guys for what they have accomplished.

[28:13.720 --> 28:20.120] Quantum entanglement and that aspect of quantum mechanics that riled up Einstein so much.

[28:20.120 --> 28:25.200] It really is a critical and fundamental aspect of quantum mechanics.

[28:25.200 --> 28:32.260] Nature is probabilistic and there is uncertainty that's unremovable.

[28:32.260 --> 28:35.300] There is that fuzziness that we will never get past.

[28:35.300 --> 28:39.480] This is like the base operating system of the universe, it really seems that way.

[28:39.480 --> 28:43.860] There are no hidden variables underneath there that would allow any type of predictability

[28:43.860 --> 28:45.880] that Einstein would have loved.

[28:45.880 --> 28:48.480] He was absolutely wrong on this.

[28:48.480 --> 28:53.600] And it's a little sad after accomplishing so much, he really just couldn't get past

[28:53.600 --> 28:58.000] this problem he had with quantum mechanics, a field that he was one of the founders of.

[28:58.000 --> 29:04.040] Even now, if you look at the research that has been applied, a billion dollars a year

[29:04.040 --> 29:10.160] has been going to develop fields related to this aspect of the universe, quantum entanglement,

[29:10.160 --> 29:12.640] in fields like cryptology, quantum computing.

[29:12.640 --> 29:15.860] It is absolutely a fundamental part of quantum computing.

[29:15.860 --> 29:21.280] As we learned even more definitively, researching the book, The Skeptics' Guide to the Future,

[29:21.280 --> 29:22.320] it's critical.

[29:22.320 --> 29:27.880] It even can be applied to the future quantum internet whenever that shows up in the future.

[29:27.880 --> 29:33.840] So these guys definitely deserve this prize and all three of them contributed mightily

[29:33.840 --> 29:40.080] to the acceptance and kind of proof that Einstein was wrong in this aspect of nature.

[29:40.080 --> 29:45.400] As counterintuitive and bizarre and weird as it is, it's part of nature and it's not

[29:45.400 --> 29:46.400] going away.

[29:46.400 --> 29:47.400] So way to go guys.

[29:47.400 --> 29:52.640] Yeah, I mean, it's almost like about time that it got a Nobel Prize.

[29:52.640 --> 29:57.560] So yeah, it really seemed like it was time, especially considering we've got quantum

[29:57.560 --> 30:03.640] computer research is just really accelerating and cryptology and talk of the quantum internet

[30:03.640 --> 30:11.880] really, it was one way to look at it, it was a good time to show them how important their

[30:11.880 --> 30:14.640] work was and I'm glad they did it, it's great.

[30:14.640 --> 30:19.160] Yeah, it's a good narrative about scientists being skeptical of their own work and trying

[30:19.160 --> 30:20.160] to prove it wrong.

[30:20.160 --> 30:25.220] That's what they're supposed to do is prove the theory wrong and that's what these experiments

[30:25.220 --> 30:27.000] were designed to do.

[30:27.000 --> 30:31.160] It's the only way we really can know that something may be right if it consistently

[30:31.160 --> 30:37.120] survives attempts that are capable of proving it wrong, something pseudoscientists don't

[30:37.120 --> 30:38.120] get and don't do.

[30:38.120 --> 30:39.120] Right.

[30:39.120 --> 30:40.120] They're always trying to prove their theories correct.

[30:40.120 --> 30:42.640] It's like, no, you're supposed to try to prove it wrong.

[30:42.640 --> 30:46.120] Yes, falsification, very important.

Nobel Prize in Physiology or Medicine (30:46)

[30:46.120 --> 30:52.000] All right, Kara, tell us about the Nobel Prize in Physiology or Medicine somehow tying back

[30:52.000 --> 30:54.160] to your, what's the word this week?

[30:54.160 --> 31:02.320] Yeah, so the 2022 Nobel Prize in Physiology or Medicine was officially awarded to Svante

[31:02.320 --> 31:09.660] Pääbo, who is a Swedish national, for his discoveries concerning the genomes of extinct

[31:09.660 --> 31:13.160] hominins and human evolution.

[31:13.160 --> 31:19.800] So that's kind of a vague title, like usually the titles are a little bit vague and then

[31:19.800 --> 31:24.880] we do a deep dive obviously into what they specifically did, but I do think it's important

[31:24.880 --> 31:31.520] to remember that the Nobel Prize, yes, is sometimes kind of given for a very specific

[31:31.520 --> 31:36.320] discovery or it is given for a very specific discovery, but for many of these individuals,

[31:36.320 --> 31:40.360] that specific discovery is the amalgamation of a life's work.

[31:40.360 --> 31:44.980] And so Svante Pääbo has done a lot of really cool stuff.

[31:44.980 --> 31:45.980] So let's dive in.

[31:45.980 --> 31:51.360] Well, first and foremost, he sequenced the genome of the Neanderthal, like he was the

[31:51.360 --> 31:52.540] guy who did that.

[31:52.540 --> 31:57.840] And that's a really big deal considering that Neanderthal DNA is like garbage, like it's

[31:57.840 --> 32:01.000] so degraded in every specimen that they find.

[32:01.000 --> 32:08.700] So first and foremost, he realized that starting with mitochondrial DNA was probably going

[32:08.700 --> 32:13.720] to be a better bet because of course, just in terms of pure quantity within any given

[32:13.720 --> 32:19.000] specimen, there's so much more mitochondrial DNA because remember, there's one nucleus

[32:19.000 --> 32:23.040] in every cell and that contains all the nuclear DNA which comes from mom and dad.

[32:23.040 --> 32:27.680] And there's like a ton of mitochondria in each cell and certain cells have more mitochondria.

[32:27.680 --> 32:31.400] So there's just like a one to a ton ratio there.

[32:31.400 --> 32:37.520] You're going to find a lot more actual physical material to work with, but it's lacking.

[32:37.520 --> 32:43.840] Mitochondrial DNA only contains about, in the Neanderthal, 16,500 base pairs, whereas

[32:43.840 --> 32:52.520] nuclear DNA contains like 3 million base pairs, no, 3 billion base pairs.

[32:52.520 --> 32:54.680] So it's a huge difference.

[32:54.680 --> 32:59.320] But what he did earlier in his career is he said, I really think we can use these modern

[32:59.320 --> 33:04.260] methods that we use with modern humans and we can look at some of the hominins that we

[33:04.260 --> 33:10.060] actually have access to specimens for, and he started to play around with these methods.

[33:10.060 --> 33:15.980] And even though a lot of this Neanderthal DNA was quite degraded, he worked to develop

[33:15.980 --> 33:21.060] methods to be able to first sequence using mitochondrial DNA.

[33:21.060 --> 33:24.900] So he was able to get a partial genome that way.

[33:24.900 --> 33:28.920] And then he continued to refine, continued to refine, continued to refine, and ultimately

[33:28.920 --> 33:33.820] was able to sequence the whole Neanderthal genome.

[33:33.820 --> 33:35.240] And I mean, learned a lot of stuff.

[33:35.240 --> 33:39.600] So this, you know, his first work was with a 40,000 year old specimen.

[33:39.600 --> 33:41.640] And then he, you know, kept digging and kept digging.

[33:41.640 --> 33:46.560] He's just like really interested in how we came to be who we are and what was going on

[33:46.560 --> 33:51.080] around a similar time when it came to these other hominin species.

[33:51.080 --> 33:57.640] And what he found was, you know, a lot of similarities, a whole lot of differences,

[33:57.640 --> 33:58.640] but that's weird.

[33:58.640 --> 34:01.240] That seems like we got some of that in our own DNA.

[34:01.240 --> 34:02.340] Why would that be?

[34:02.340 --> 34:06.520] And this is when we started to realize that there was like mixing between Homo sapiens

[34:06.520 --> 34:07.680] and Neanderthal.

[34:07.680 --> 34:11.100] They lived contemporaneously and actually, or contemporarily.

[34:11.100 --> 34:16.280] And actually their split, it was determined by comparing all these different specimens

[34:16.280 --> 34:23.320] due to the work of Dr. Pääbo that the Homo sapiens lineage and the Neanderthal lineage

[34:23.320 --> 34:27.760] split from a common ancestor around 800,000 years ago.

[34:27.760 --> 34:33.180] So he's looking through all of this, you know, really rich DNA and he's seeing that there's

[34:33.180 --> 34:40.040] like a whole lot of little changes and little examples within the human genome.

[34:40.040 --> 34:46.080] And now it's understood that, especially for individuals of like modern European descent

[34:46.080 --> 34:50.880] or even Asian descent, but not if you go too, too far east, but sort of like Western to

[34:50.880 --> 34:56.300] Central European and farther Western Asian descent, that we're looking at between one

[34:56.300 --> 35:00.440] and four percent of our total genome was contributed by Neanderthal.

[35:00.440 --> 35:04.680] You don't see it as much in individuals with African descent.

[35:04.680 --> 35:12.520] So we think that this admixture was taking place after Homo sapiens left Africa and moved

[35:12.520 --> 35:16.480] up into Europe and then because that's where Neanderthal was populating.

[35:16.480 --> 35:23.320] And that's when there was sexy times.

[35:23.320 --> 35:25.780] And then this is really cool.

[35:25.780 --> 35:33.680] So cut to 2008 in which there's like a finger bone that was taken out of the Denisova cave

[35:33.680 --> 35:37.520] in Siberia and it was like, whoa, this finger bone has some good DNA.

[35:37.520 --> 35:38.560] Let's learn from it.

[35:38.560 --> 35:42.840] So Pääbo's team started to sequence that finger bone and they were like, holy, wow, this is

[35:42.840 --> 35:46.360] not Neanderthal and this is not human.

[35:46.360 --> 35:47.360] What is this?

[35:47.360 --> 35:52.720] And that's when we started to understand that there is this whole other group of individuals,

[35:52.720 --> 35:59.400] the Denisovans and we see admixture with them as well, especially in individuals from Southeast

[35:59.400 --> 36:03.300] Asia and very specifically concentrated within Melanesia.

[36:03.300 --> 36:07.040] We see like up to six percent Denisovan DNA.

[36:07.040 --> 36:13.440] So basically when you look at a map of the world, sort of the northwestern part of Europe

[36:13.440 --> 36:19.000] and into like the western part of Asia, we see a lot of Neanderthal DNA within individuals

[36:19.000 --> 36:22.360] who more recently came out of those areas.

[36:22.360 --> 36:26.080] And then you see a lot of Denisovan DNA for individuals who more recently came out of

[36:26.080 --> 36:28.380] eastern Asian areas.

[36:28.380 --> 36:34.080] But if you're looking at Africa or if you get too far south with on the Eurasian continent,

[36:34.080 --> 36:40.940] you don't see much admixture and that's for individuals who sort of either left and came

[36:40.940 --> 36:47.800] back too long ago or never left those sort of origin geographic locations.

[36:47.800 --> 36:51.280] And then the really cool thing is, I mean, this isn't just interesting because it's interesting,

[36:51.280 --> 36:52.280] right?

[36:52.280 --> 36:56.320] We know a lot more about our origins, we know a lot more about the other organisms that

[36:56.320 --> 37:03.000] we lived among and like actually engaged with, but we also know what makes us uniquely human.

[37:03.000 --> 37:07.540] We're learning more and more what it was about human beings, about Homo sapiens that allowed

[37:07.540 --> 37:09.200] them to persist.

[37:09.200 --> 37:13.120] The tool making was somewhat more sophisticated and evolved over time.

[37:13.120 --> 37:16.820] The ability to cross water, for example, these are things that we don't think, like Neanderthals

[37:16.820 --> 37:21.360] had tools, but they were relatively, they didn't really change much over time, whereas

[37:21.360 --> 37:23.560] human tool making exploded.

[37:23.560 --> 37:29.280] But we're also seeing that there are unique advantages that have been conferred by some

[37:29.280 --> 37:33.640] of the conserved DNA from Neanderthal and Denisovan.

[37:33.640 --> 37:39.320] Like for example, there's a gene or a version of the gene, I don't know if it's called EPAS1

[37:39.320 --> 37:44.480] or E-P-A-S-1, I don't know how people in the know pronounce it, but this gene, which a

[37:44.480 --> 37:50.760] lot of East Asian individuals have, actually, if you're a present day Tibetan, it helps

[37:50.760 --> 37:53.840] you survive at high altitudes.

[37:53.840 --> 37:59.160] That's a specific advantage that Denisovan DNA gave these individuals.

[37:59.160 --> 38:03.640] And we also see that in those that are more on the west side of the Eurasian continent,

[38:03.640 --> 38:09.520] that there are numbers of Neanderthal genes that influence our immune response to different

[38:09.520 --> 38:11.080] types of infections.

[38:11.080 --> 38:18.600] So not only was this DNA integrated over time, but because of the way that evolution works,

[38:18.600 --> 38:23.020] the stuff that helped us stuck around, and we can still see the way that it's helping

[38:23.020 --> 38:24.020] us today.

[38:24.020 --> 38:29.720] Yeah, I've been following this story closely the whole time, it's fascinating.

[38:29.720 --> 38:36.440] I do think that if you found your own discipline of science, paleogenomics in this case, that

[38:36.440 --> 38:37.440] you would get a Nobel Prize.

[38:37.440 --> 38:40.840] Yeah, he actually is seen as the founder of paleogenomics, that's true, that's pretty

[38:40.840 --> 38:41.840] cool.

[38:41.840 --> 38:44.320] Yeah, that buys you a Nobel Prize, I think.

[38:44.320 --> 38:48.600] It's funny because when I was reading about him, I was like, oh yeah, paleogenomics founded

[38:48.600 --> 38:52.480] the field, did all this cool stuff, and then I'm like, oh, yo, dude, like, sequenced the

[38:52.480 --> 38:56.040] Neanderthal DNA, the whole genetic sequence.

[38:56.040 --> 38:58.640] And then he was like, not good enough, how about Denisovan, too?

[38:58.640 --> 39:03.040] Wait, first let me discover this, and then I'm going to sequence it, too, and then compare

[39:03.040 --> 39:04.040] it to us.

[39:04.040 --> 39:05.040] That is damn impressive.

[39:05.040 --> 39:06.040] All right, thanks, Kara.

[39:06.040 --> 39:09.120] Well, everyone, we're going to take a quick break from our show to talk about one of our

[39:09.120 --> 39:12.140] sponsors this week, ExpressVPN.

[39:12.140 --> 39:17.760] We all have those searches that we don't want anybody to see when we leave our laptop, right?

[39:17.760 --> 39:19.360] You guys know what I'm talking about.

[39:19.360 --> 39:20.360] Sure, of course.

[39:20.360 --> 39:22.600] I don't think I have to give you any detail about that.

[39:22.600 --> 39:23.600] I have no idea.

[39:23.600 --> 39:28.120] So you're probably thinking, well, if I use incognito mode, it's fine, but let me tell

[39:28.120 --> 39:29.120] you something.

[39:29.120 --> 39:31.960] Incognito mode in your browser doesn't hide your activity.

[39:31.960 --> 39:36.560] It doesn't matter what mode you use, how many times you clear your history, your cache,

[39:36.560 --> 39:41.000] your cookies, your ISP can still see every website you've ever visited.

[39:41.000 --> 39:44.120] And that's why even when I'm at home, not just when I'm traveling, but when I'm at home,

[39:44.120 --> 39:46.400] I always go online using ExpressVPN.

[39:46.400 --> 39:52.520] Yeah, Kara, ExpressVPN is an app that reroutes your internet connection through their secure

[39:52.520 --> 39:56.400] servers so your ISP, they can't see the sites you visit.

[39:56.400 --> 40:01.520] ExpressVPN also keeps all of your information secure by encrypting 100% of your data with

[40:01.520 --> 40:04.220] the most powerful encryption available.

[40:04.220 --> 40:09.680] And ExpressVPN is available on all your devices, phones, computers, even your smart TV.

[40:09.680 --> 40:11.560] So there's no excuse for you not to use it.

[40:11.560 --> 40:16.960] Protect your online activity today with the VPN rated number one by Business Insider.

[40:16.960 --> 40:24.240] Visit our exclusive link, ExpressVPN.com slash SGU, and you can get an extra three months

[40:24.240 --> 40:26.700] free on a one year package.

[40:26.700 --> 40:33.560] That's E-X-P-R-E-S-S-V-P-N.com slash SGU.

[40:33.560 --> 40:36.280] All right, guys, let's get back to the show.

Homeopathy Lawsuit (40:37)

  • [link_URL TITLE][7]

[40:36.280 --> 40:40.160] All right, Jay, tell us about this homeopathy lawsuit.

[40:40.160 --> 40:45.200] Well, Steve, there are two very large pharmacy chains in the United States.

[40:45.200 --> 40:49.620] One is called CVS and the other is Walmart, which most people should have heard of.

[40:49.620 --> 40:53.680] And what's happening is they will be going to trial soon because they put homeopathic

[40:53.680 --> 40:57.080] products next to legitimate products.

[40:57.080 --> 41:01.040] When I say legitimate, I mean that they've been proven to work scientifically.

[41:01.040 --> 41:07.000] The issue is that most consumers don't know, with the over the counter meds, they don't

[41:07.000 --> 41:09.600] know the difference between ones that work and ones that don't work.

[41:09.600 --> 41:12.480] They all look like legitimate products.

[41:12.480 --> 41:19.000] The problem is that CVS and Walmart put these homeopathic remedies next to products that

[41:19.000 --> 41:22.120] actually have been proven to work.

[41:22.120 --> 41:28.160] And this is considered to be them defrauding those consumers into thinking that the products

[41:28.160 --> 41:32.360] that they're looking at work or they're equal to, you know, the ones that have been approved

[41:32.360 --> 41:34.800] by the Food and Drug Administration.

[41:34.800 --> 41:36.100] And that's no bueno.

[41:36.100 --> 41:37.100] That's not good.

[41:37.100 --> 41:40.360] So I'm sure many of you know what homeopathy is.

[41:40.360 --> 41:44.560] But for those who don't, let me tell you about homeopathy real quick.

[41:44.560 --> 41:47.960] So homeopathy is based on a couple of ideas.

[41:47.960 --> 41:54.280] First, this idea that like cures like. The patient is given an incredibly small dose,

[41:54.280 --> 41:55.280] though.

[41:55.280 --> 41:59.520] And the idea is that the smaller the dose you get, the stronger the remedy will be.

[41:59.520 --> 42:03.080] I know that this doesn't sound like it makes any sense, and that's because it absolutely

[42:03.080 --> 42:04.080] doesn't.

[42:04.080 --> 42:11.840] Yeah, if they do like a 30 C dilution, which is a one to 100 dilution 30 times, there's,

[42:11.840 --> 42:17.960] you know, you would need an amount of water greater than the volume of the solar system

[42:17.960 --> 42:22.680] to have an equal chance of having even a single molecule, but basically nothing left.

[42:22.680 --> 42:23.680] Right.

[42:23.680 --> 42:24.680] There's nothing left.

[42:24.680 --> 42:27.760] So for small doses, they generally use nonexistent doses.
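Steve's arithmetic here can be sanity-checked in a few lines. A minimal Python sketch, assuming a 1 mL dose of water (the dose size is an illustrative assumption, not a figure from the episode; the constants are standard):

```python
# Back-of-the-envelope check of the 30C homeopathic dilution described above.
# Assumptions: each "C" step is a 1:100 dilution; a dose is ~1 mL (~1 g) of water.

AVOGADRO = 6.022e23        # molecules per mole
WATER_MOLAR_MASS = 18.0    # grams per mole of water
DOSE_GRAMS = 1.0           # assumed dose: ~1 mL of water

dilution_factor = 100 ** 30  # 30C = 1:100 dilution repeated 30 times = 1e60

molecules_per_dose = DOSE_GRAMS / WATER_MOLAR_MASS * AVOGADRO  # ~3.3e22
expected_original = molecules_per_dose / dilution_factor

print(f"dilution factor:             {dilution_factor:.1e}")    # 1.0e+60
print(f"water molecules per dose:    {molecules_per_dose:.1e}")
print(f"expected original molecules: {expected_original:.1e}")
# ~3e-38: you would need on the order of 1e37 doses to expect even a single
# molecule of the starting material, i.e. "basically nothing left".
```

The expected number of original molecules per dose comes out around 3 x 10^-38, which is the "nonexistent dose" being described.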

[42:27.760 --> 42:29.760] And then you might ask, well, how does that work?

[42:29.760 --> 42:30.760] Right.

[42:30.760 --> 42:31.980] Well, let me tell you what they say.

[42:31.980 --> 42:37.960] They claim that the water contains a memory or, you know, that it has the vibrations.

[42:37.960 --> 42:41.800] I've heard them use this word before, that it has the vibrations of what's in the remedy.

[42:41.800 --> 42:43.240] It's complete nonsense.

[42:43.240 --> 42:48.040] Now what you end up doing, if you take a homeopathic remedy, it's very typical that you would take

[42:48.040 --> 42:51.440] like a sugar pill or liquid drops.

[42:51.440 --> 42:53.820] But there are creams and whatnot, too.

[42:53.820 --> 42:55.560] They've spilled out into other product types.

[42:55.560 --> 42:56.560] All right.

[42:56.560 --> 42:59.160] So bottom line is you're not getting any active ingredient.

[42:59.160 --> 43:03.600] You're not getting anything that's going to interact with, with your biology.

[43:03.600 --> 43:06.180] So therefore it does absolutely nothing.

[43:06.180 --> 43:10.880] There is no water memory, you know, and you could, you might ask the simple question of,

[43:10.880 --> 43:14.280] well, what about all the other stuff that used to be in that water at one point?

[43:14.280 --> 43:15.280] Right.

[43:15.280 --> 43:16.280] Because water is recycled.

[43:16.280 --> 43:17.280] Right.

[43:17.280 --> 43:21.040] So we use water, it goes back into the ground, it gets filtered and we keep reusing it.

[43:21.040 --> 43:26.680] So that water has been in contact with possibly millions of different chemicals and molecules.

[43:26.680 --> 43:27.680] Right.

[43:27.680 --> 43:30.600] It's all 100 percent complete nonsense.

[43:30.600 --> 43:35.720] And homeopathy goes against everything that humans have discovered about pharmacology

[43:35.720 --> 43:36.840] and physics.

[43:36.840 --> 43:41.440] So it's going against sciences that have been in the works for a very long time and sciences

[43:41.440 --> 43:42.480] that we can rely on.

[43:42.480 --> 43:47.580] So to be perfectly clear, homeopathy does not work under any circumstance.

[43:47.580 --> 43:52.380] It's been tested for decades and there's virtually zero evidence that it works.

[43:52.380 --> 43:58.460] Now that we've set that baseline, in the best case scenario, you will only be taking sugar

[43:58.460 --> 43:59.660] or water.

[43:59.660 --> 44:01.220] That's the best case scenario.

[44:01.220 --> 44:05.940] In the worst case scenario, you actually get poisoned by the product, because the manufacturing

[44:05.940 --> 44:11.840] process that a lot of the people who make homeopathic products use fails, which means that they're

[44:11.840 --> 44:14.440] not putting in the safeties that are required.

[44:14.440 --> 44:17.240] Like impurities get in there and other things?

[44:17.240 --> 44:18.240] Absolutely.

[44:18.240 --> 44:19.240] Listen to this, man.

[44:19.240 --> 44:20.240] This is no joke.

[44:20.240 --> 44:25.240] Back in 2017, 10 infants died because they were given a homeopathic teething product.

[44:25.240 --> 44:27.520] This is something that they were chewing on.

[44:27.520 --> 44:31.280] That teething product contained belladonna.

[44:31.280 --> 44:34.440] Belladonna is an incredibly serious poison.

[44:34.440 --> 44:39.560] Four hundred other people suffered from illnesses associated with this bogus product.

[44:39.560 --> 44:44.320] So yeah, a bogus company using a bogus, you know, what would you call it?

[44:44.320 --> 44:45.520] It's not a science.

[44:45.520 --> 44:47.560] It's just an idea like- A pseudoscience?

[44:47.560 --> 44:48.560] Belief system.

[44:48.560 --> 44:49.560] There you go, Kara.

[44:49.560 --> 44:50.560] It's really a belief system.

[44:50.560 --> 44:51.560] Oh my god.

[44:51.560 --> 44:52.560] We should write that down.

[44:52.560 --> 44:53.560] It's some type of pseudoscience.

[44:53.560 --> 44:56.720] This is so interesting.

[44:56.720 --> 45:01.700] And they can't even manufacture their product within safe parameters because they're not

[45:01.700 --> 45:06.640] running under normal operating procedures that pharmaceutical companies would typically

[45:06.640 --> 45:11.200] use in order to provide medication and things that are actually safe to take.

[45:11.200 --> 45:12.200] Right.

[45:12.200 --> 45:14.740] It's like the old James Randi thing where he would like eat a whole box of homeopathic

[45:14.740 --> 45:19.960] sleeping pills, which sure, if they're actually homeopathic, that means they're just nothing.

[45:19.960 --> 45:25.440] But if they're bad homeopathy, which is like kind of funny because all homeopathy is bad,

[45:25.440 --> 45:30.320] but it's like doubly bad, then you could die because nobody's regulating this stuff.

[45:30.320 --> 45:34.320] Yeah, if made poorly, which we have found them to be, you know, this happens.

[45:34.320 --> 45:36.400] So anyway, so why are we talking about all of this?

[45:36.400 --> 45:41.400] Because the Center for Inquiry, also known as CFI, they filed lawsuits against both Walmart

[45:41.400 --> 45:46.900] and CVS a few years ago, trying to get the homeopathic products removed from their pharmacies.

[45:46.900 --> 45:52.840] Now in both cases, two lower courts dismissed the Center for Inquiry lawsuits, and they

[45:52.840 --> 45:56.480] did that because they didn't think that people were actually being defrauded.

[45:56.480 --> 46:01.160] But last week, three judges from a higher court in the District of Columbia in the United

[46:01.160 --> 46:04.880] States unanimously ruled that the trials can move forward.

[46:04.880 --> 46:07.280] So in short, the judges' response was this.

[46:07.280 --> 46:11.520] I'm quoting them: at this juncture, we cannot say that it is implausible that

[46:11.520 --> 46:16.220] a reasonable consumer might understand CVS and Walmart's placement of homeopathic

[46:16.220 --> 46:21.520] products alongside science-based medicines as a representation that the homeopathic products

[46:21.520 --> 46:26.520] are efficacious or equivalent alternatives to the FDA-approved over-the-counter drugs

[46:26.520 --> 46:29.120] alongside which they are displayed.

[46:29.120 --> 46:34.880] CFI said about this latest update that all evidence demonstrates that it doesn't work

[46:34.880 --> 46:40.540] at any level above that of a placebo and it can't work unless every understanding of science

[46:40.540 --> 46:41.800] we have is incorrect.

[46:41.800 --> 46:42.800] Yeah, it's magic.

[46:42.800 --> 46:43.800] It's witchcraft.

[46:43.800 --> 46:44.800] It's witchcraft.

[46:44.800 --> 46:49.640] CFI also points out that homeopathic products exist for pretty much any illness, right?

[46:49.640 --> 46:50.640] So you go out there.

[46:50.640 --> 46:51.640] What do you got?

[46:51.640 --> 46:52.640] Toothache?

[46:52.640 --> 46:53.640] There's a homeopathic remedy.

[46:53.640 --> 46:54.640] You got cancer?

[46:54.640 --> 46:55.640] There's a homeopathic remedy.

[46:55.640 --> 46:59.880] I mean, they'll throw homeopathy at anything that you got because they just want your money.

[46:59.880 --> 47:05.440] The higher DC court said CFI's factual allegations plausibly support an inference that through

[47:05.440 --> 47:09.720] their product placement practices, Walmart and CVS mislead consumers into believing that

[47:09.720 --> 47:14.640] homeopathic products are equivalent alternatives to FDA approved over the counter drugs.

[47:14.640 --> 47:19.040] After the new ruling was issued, CFI's legal director Nick Little said that the court of

[47:19.040 --> 47:24.280] appeals rightly recognized that giant retailers can't just deny responsibility for how they

[47:24.280 --> 47:27.440] present what are fundamentally worthless products.

[47:27.440 --> 47:31.560] It's a huge victory for consumers and their right not to be misled.

[47:31.560 --> 47:36.120] He also noted that they're giving us a chance to prove that these retailers are defrauding

[47:36.120 --> 47:37.500] consumers.

[47:37.500 --> 47:38.800] So this is great progress.

[47:38.800 --> 47:43.840] I mean, well, let's see how this whole thing ends because I still, you know, I'm not completely

[47:43.840 --> 47:46.980] sure about where this is going to lead.

[47:46.980 --> 47:51.440] But in the short term, looks like CFI did a fantastic job following up with these lawsuits

[47:51.440 --> 47:54.920] and maintaining their position on it.

[47:54.920 --> 47:59.920] And this is exactly what large skeptical organizations should be doing.

[47:59.920 --> 48:02.800] CFI has completely got their thumb on this.

[48:02.800 --> 48:07.140] I'm really excited about this and I really hope that it leads to other states picking

[48:07.140 --> 48:08.580] up the same legislation.

[48:08.580 --> 48:12.720] We can hope so, but it's hard for me to get hopeful about stuff like this just after so

[48:12.720 --> 48:13.720] much disappointment.

[48:13.720 --> 48:19.040] But, you know, we talked a couple of years ago, a year or two ago, about

[48:19.040 --> 48:27.280] both the FDA and the FTC in the U.S. reevaluating their policy about homeopathy.

[48:27.280 --> 48:34.120] And, you know, they talked about the public comment period where we sent in our own recommendations.

[48:34.120 --> 48:38.020] And, you know, they tightened them up a little bit, but they didn't go as far as

[48:38.020 --> 48:39.160] we thought they should.

[48:39.160 --> 48:44.000] But the question is, you know, I'd have to take a close look at this.

[48:44.000 --> 48:46.200] Did that help CFI's case here?

[48:46.200 --> 48:53.520] Like is there anything specifically that maybe these chains did that violated the new recommendations

[48:53.520 --> 48:55.840] you know, the FDA and the FTC have?

[48:55.840 --> 48:57.920] Basically, they want to have their cake and eat it too.

[48:57.920 --> 49:03.880] They want to basically allow the industry to sell their snake oil with these fake claims,

[49:03.880 --> 49:09.260] but mitigate the level of deception that they're engaging in to convince themselves that they're

[49:09.260 --> 49:10.680] doing their job.

[49:10.680 --> 49:17.840] So did they fall afoul of those regulations, those sort of updated regulations?

[49:17.840 --> 49:18.840] We'll see.

[49:18.840 --> 49:23.040] You know, if it's just that, you know, the judges are saying, yeah, they're deceptive

[49:23.040 --> 49:26.960] even without there being a very specific, you know, regulation that they broke,

[49:26.960 --> 49:27.960] that's interesting.

[49:27.960 --> 49:32.040] I doubt that's going to hold up, you know, on appeal would be my guess.

[49:32.040 --> 49:35.300] But you know, we can always hope, you know, again, part of it is we're going to keep doing this

[49:35.300 --> 49:37.880] and just hope for the best at some point.

[49:37.880 --> 49:43.320] But man, the laws are so rigged against science when it comes to stuff like this.

[49:43.320 --> 49:44.320] Brutal.

[49:44.320 --> 49:45.320] I know.

[49:45.320 --> 49:46.880] That's why I said, to me, we just got to wait and see what happens.

[49:46.880 --> 49:47.880] All right.

[49:47.880 --> 49:48.880] Thanks, Jay.

Silkworm Pangenome (49:49)

[49:48.880 --> 49:49.880] All right.

[49:49.880 --> 49:50.880] So, guys, let me ask you a question.

[49:50.880 --> 49:53.440] Do you know what a pan genome is?

[49:53.440 --> 49:54.440] Do you know?

[49:54.440 --> 49:55.440] All?

[49:55.440 --> 49:56.440] Yeah.

[49:56.440 --> 49:57.440] All of the genes.

[49:57.440 --> 49:59.440] Oh, is it the genome of all?

[49:59.440 --> 50:00.440] Oh, yeah.

[50:00.440 --> 50:01.440] You're right.

[50:01.440 --> 50:04.200] Of an animal or something.

[50:04.200 --> 50:05.200] Yeah.

[50:05.200 --> 50:07.720] Sort of like or like of many, many species, like a type of animal.

[50:07.720 --> 50:09.040] It's like all the genomes.

[50:09.040 --> 50:10.040] Yeah.

[50:10.040 --> 50:15.580] Basically, it's all the genes of an entire clade.

[50:15.580 --> 50:18.280] But that clade could be a single species.

[50:18.280 --> 50:19.280] Right.

[50:19.280 --> 50:23.520] So, for example, if we did a pan genome of humans, of Homo sapiens,

[50:23.520 --> 50:30.480] it would include not just like the one genome of one person, but every possible gene variant

[50:30.480 --> 50:37.420] that exists among all humans, the complete genome with every variant.

[50:37.420 --> 50:39.640] How could you know if you got them all?

[50:39.640 --> 50:41.440] You just have to do a thorough enough survey.

[50:41.440 --> 50:42.440] Insects.

[50:42.440 --> 50:43.440] Yeah.

[50:43.440 --> 50:44.440] Insects.

[50:44.440 --> 50:45.440] And in fact, that's what we're talking about here.

[50:45.440 --> 50:52.960] So they recently published the pan genome of the domestic silkworm, which is obviously

[50:52.960 --> 50:56.320] a very economically important species.

[50:56.320 --> 50:58.420] There are a lot of facts about this that are interesting.

[50:58.420 --> 51:05.240] Did you know that the domestic silkworm is the only fully domesticated insect in the

[51:05.240 --> 51:06.240] world?

[51:06.240 --> 51:07.240] Wow.

[51:07.240 --> 51:08.240] Yeah.

[51:08.240 --> 51:09.240] Yeah.

[51:09.240 --> 51:11.760] The honeybee is considered partially domesticated.

[51:11.760 --> 51:17.240] The domestic silkworm is 100 percent dependent on humans for survival.

[51:17.240 --> 51:20.360] Basically, it can't live without people.

[51:20.360 --> 51:22.680] That's why I think it's considered completely domesticated.

[51:22.680 --> 51:26.360] This specific species is Bombyx mori.

[51:26.360 --> 51:27.360] Bombyx mori.

[51:27.360 --> 51:28.800] That's the domestic silkworm.

[51:28.800 --> 51:36.600] So in the process of doing the pan genome, they

[51:36.600 --> 51:44.600] used 205 local strains, 194 improved varieties and 632 genetic stocks.

[51:44.600 --> 51:48.320] So these were, you know, specimens that they already had,

[51:48.320 --> 51:55.040] even from the past. They also had 47 wild silkworms, which is Bombyx mandarina.

[51:55.040 --> 52:00.600] That's the wild version that, you know, the domesticated version was domesticated from.

[52:00.600 --> 52:03.600] When do you think the silkworm was domesticated?

[52:03.600 --> 52:08.200] Like, when did all this start using silk as a textile?

[52:08.200 --> 52:10.800] God, it has to be, what, 2000 years ago.

[52:10.800 --> 52:14.640] I was going to say like 500. It's probably way longer than that, though.

[52:14.640 --> 52:23.200] Five thousand years ago, five thousand years ago, probably started in China, spread fairly

[52:23.200 --> 52:26.080] quickly to Japan and South Korea.

[52:26.080 --> 52:29.920] But this is part of what they're now able to tell, right, by looking at how the

[52:29.920 --> 52:33.840] genes are spread around the population around the world.

[52:33.840 --> 52:37.040] You know, they could say, oh, yeah, kind of all roads lead to China.

[52:37.040 --> 52:43.600] And there does appear to be a separate population originating in Japan, meaning it split off

[52:43.600 --> 52:50.680] from China, but then became its own sub population that's genetically identifiable.

[52:50.680 --> 52:58.200] And it also looks like some of the features were independently bred into both of those

[52:58.200 --> 52:59.200] lines.

[52:59.200 --> 53:05.760] Like, it's not that all of the gene variants that are unique to the domestic version are in the original

[53:05.760 --> 53:07.520] Chinese stock.

[53:07.520 --> 53:13.320] Some of them were reproduced independently: there are some features unique to the Japanese

[53:13.320 --> 53:15.960] branch and some unique to the Chinese branch.

[53:15.960 --> 53:23.560] That's important because that means if you interbreed those two stocks, you might get

[53:23.560 --> 53:28.600] different genes that are beneficial, right, because each strain has different beneficial

[53:28.600 --> 53:29.600] genes.

[53:29.600 --> 53:31.280] There were two types of genes that they were mostly interested in.

[53:31.280 --> 53:38.160] One is domestication associated genes and the other is what they call improvement associated

[53:38.160 --> 53:39.160] genes.

[53:39.160 --> 53:45.400] So they identified 468 domestication

[53:45.400 --> 53:46.960] associated genes,

[53:46.960 --> 53:52.440] 264 of which are newly identified, and 198

[53:52.440 --> 53:56.120] improvement associated genes, 185 of which are new.

[53:56.120 --> 54:00.360] So we basically knew almost nothing about the genes that led to the improvement.

[54:00.360 --> 54:03.000] This is basically the improvement of the silk, right?

[54:03.000 --> 54:05.160] So what are the domestication associated genes?

[54:05.160 --> 54:10.880] These are genes that make the domestic silkworm easier to handle.

[54:10.880 --> 54:16.760] They make them able to live together in large groups, which is obviously necessary to growing

[54:16.760 --> 54:18.640] vast amounts of them.

[54:18.640 --> 54:23.200] And also they can't fly, which I guess is a big advantage.

[54:23.200 --> 54:24.420] They don't fly away.

[54:24.420 --> 54:30.080] The silkworms themselves don't move as much, you know, they sort of, again, stay put.

[54:30.080 --> 54:35.620] They're also white, whereas the wild version is more brown.

[54:35.620 --> 54:40.120] So those are, you know, they don't have anything to do with the silk, just the domestication

[54:40.120 --> 54:43.920] of the moth, you know, and the worm itself.

[54:43.920 --> 54:47.540] And then, you know, again, one hundred and eighty five genes associated with improvements

[54:47.540 --> 54:50.480] in the quality of the silk.

[54:50.480 --> 54:56.360] And yes, it's really, you know, a silk industry really could not exist without the domesticated

[54:56.360 --> 54:57.360] silkworm.

[54:57.360 --> 55:00.960] This is a huge, a huge thing to do.

[55:00.960 --> 55:07.840] Did you know, also an interesting fact, that the domestic silkworm feeds exclusively on a diet

[55:07.840 --> 55:10.400] of the leaves from the mulberry bush, right?

[55:10.400 --> 55:16.120] So their silk is called mulberry silk for that reason.

[55:16.120 --> 55:20.720] You know, I have read that there are variants that feed on other things, like eucalyptus

[55:20.720 --> 55:21.720] leaves and whatever.

[55:21.720 --> 55:22.960] I mean, there are variants that do that.

[55:22.960 --> 55:27.320] But the primary one, like if you buy silk, it's basically probably mulberry silk because

[55:27.320 --> 55:29.220] again, they eat the leaves of the mulberry tree.

[55:29.220 --> 55:35.520] So how big do you think the silk industry is in billions of dollars, billions of dollars

[55:35.520 --> 55:36.520] per year?

[55:36.520 --> 55:37.520] What would you say?

[55:37.520 --> 55:38.520] Twelve billion.

[55:38.520 --> 55:39.520] Twelve.

[55:39.520 --> 55:42.040] About $17 billion in 2021.

[55:42.040 --> 55:46.800] It's projected to be over 30 billion by 2029, still on the rise.

[55:46.800 --> 55:52.160] So silk, mulberry silk, you know, obviously, you know, it has its luster, it's

[55:52.160 --> 55:54.260] a smooth and soft textile.

[55:54.260 --> 55:59.020] It has good strength, good biocompatibility, slow biodegradation.

[55:59.020 --> 56:00.720] The synthesis is carbon neutral.

[56:00.720 --> 56:05.900] So it has a lot of features that make it very, very desirable.

[56:05.900 --> 56:14.000] There is a type of silk, however, that is not silkworm silk but another critter's silk,

[56:14.000 --> 56:15.000] that is stronger.

[56:15.000 --> 56:16.000] Spider.

[56:16.000 --> 56:17.000] Yeah, spider.

[56:17.000 --> 56:20.960] Now, why can't we domesticate the spider and get their silk?

[56:20.960 --> 56:22.680] Because they just won't listen, Steve.

[56:22.680 --> 56:25.240] That's right, Evan, you're correct.

[56:25.240 --> 56:26.240] They can't be domesticated.

[56:26.240 --> 56:29.680] They're like too territorial and they eat each other and stuff.

[56:29.680 --> 56:34.040] So, you know, you can have a whole bunch of silkworms in the moths at one place.

[56:34.040 --> 56:35.940] You can't do that with spiders.

[56:35.940 --> 56:41.320] So it's just, unfortunately, not commercially viable to domesticate them.

[56:41.320 --> 56:47.040] So there have been attempts at splicing spider silk genes into the silkworm, to basically

[56:47.040 --> 56:50.440] get silkworms to make spider silk.

[56:50.440 --> 56:52.600] But none of these have succeeded.

[56:52.600 --> 56:56.240] They just make too little, like only five percent of the silk they produce has the spider

[56:56.240 --> 57:00.960] silk proteins in it and the results are not really acceptable.

[57:00.960 --> 57:03.240] I want a shirt made of spider silk.

[57:03.240 --> 57:08.760] However, however, Bob, completely unrelated.

[57:08.760 --> 57:17.240] But I read a news item today while I was doing my research, about this study where they respin

[57:17.240 --> 57:20.800] silkworm silk, they respin it.

[57:20.800 --> 57:26.320] They remove an outer sticky coating in a way that doesn't destroy the silk itself.

[57:26.320 --> 57:30.720] And then they basically respin it together.

[57:30.720 --> 57:36.840] And they say that the resulting silk is 70 percent stronger than spider silk.

[57:36.840 --> 57:37.840] 70?

[57:37.840 --> 57:40.320] Like dragline strong spider silk?

[57:40.320 --> 57:41.320] Beats the spiders.

[57:41.320 --> 57:42.320] Whoa.

[57:42.320 --> 57:44.440] 70 percent stronger than dragline silk.

[57:44.440 --> 57:45.440] That's right.

[57:45.440 --> 57:47.680] Of course, Bob, you know what the strongest spider silk is.

[57:47.680 --> 57:48.680] Yeah.

[57:48.680 --> 57:53.800] So there may be other ways to get the, you know, the very strong silk that we would want

[57:53.800 --> 57:59.000] for certain kinds of applications, you know, like surgical, medical, bulletproof vests, things

[57:59.000 --> 58:00.000] like that.

[58:00.000 --> 58:01.000] Thank you.

[58:01.000 --> 58:02.000] Rope for swinging between buildings.

[58:02.000 --> 58:03.000] You know.

[58:03.000 --> 58:04.000] Yeah.

[58:04.000 --> 58:06.080] And then you finally perfect that Spider-Man costume.

[58:06.080 --> 58:07.080] You know.

[58:07.080 --> 58:08.080] All right.

[58:08.080 --> 58:09.840] Now there is one other wrinkle here.

[58:09.840 --> 58:10.840] The other thing.

[58:10.840 --> 58:12.440] Oh, it wrinkles too easy?

[58:12.440 --> 58:13.440] No, no.

[58:13.440 --> 58:14.440] So.

[58:14.440 --> 58:15.440] All right.

[58:15.440 --> 58:16.440] Sorry.

[58:16.440 --> 58:18.440] What else might this knowledge be useful for?

[58:18.440 --> 58:23.200] Well, you know, knowing all of the genes and gene variants that exist out there in the

[58:23.200 --> 58:28.600] world, in the world of silkworms, might this help us, either through breeding or through

[58:28.600 --> 58:34.960] genetic engineering, create better silk or, you know, more productive, more environmentally

[58:34.960 --> 58:37.000] friendly silk production?

[58:37.000 --> 58:38.960] So there's lots of possibilities here.

[58:38.960 --> 58:44.960] One, however, that I read about is producing colored silk.

[58:44.960 --> 58:47.640] Now silk is white, right?

[58:47.640 --> 58:54.360] The raw silk is white and then it has to be dyed after it's produced, after it's spun.

[58:54.360 --> 59:01.760] And that dyeing process is not very environmentally friendly, uses a lot of chemicals, uses harsh

[59:01.760 --> 59:07.520] chemicals, can degrade the silk and produces a lot of wastewater and we don't like wastewater.

[59:07.520 --> 59:13.960] Now there have been attempts at altering the color of silk by feeding silkworms, basically

[59:13.960 --> 59:16.220] food with dye in it.

[59:16.220 --> 59:19.940] So it gets into their metabolism and it gets into the silk.

[59:19.940 --> 59:24.240] But this produces very poor watered down colors, right?

[59:24.240 --> 59:25.480] Not a good result.

[59:25.480 --> 59:28.520] So the feeding method doesn't really work that well.

[59:28.520 --> 59:33.960] Scientists, however, have been able to create genetically altered silkworms that

[59:33.960 --> 59:43.240] produce colored silk, so-called pre-dyed silk, specifically though, they've done it using

[59:43.240 --> 59:49.080] genes for fluorescent colors because we already have those, we use them as markers, right?

[59:49.080 --> 59:56.120] If you want to know where a protein is going in an animal, you attach a fluorescent protein

[59:56.120 --> 59:59.040] to it as a tag, as a marker.

[59:59.040 --> 01:00:00.040] And then-

[01:00:00.040 --> 01:00:01.860] We talked about that in the Nobel Prize for Chemistry.

[01:00:01.860 --> 01:00:02.860] So we have them, right?

[01:00:02.860 --> 01:00:07.820] So they said, hey, let's try attaching fluorescent proteins to the silk and it worked.

[01:00:07.820 --> 01:00:10.100] So we could make fluorescent silk, right?

[01:00:10.100 --> 01:00:15.280] So whatever that's worth, but perhaps in the future we might be able to make non-fluorescent

[01:00:15.280 --> 01:00:19.880] but colored silk and then that would save a lot of money and environmental degradation,

[01:00:19.880 --> 01:00:25.400] et cetera, and also improve the quality of the silk by not having to treat it with harsh

[01:00:25.400 --> 01:00:26.400] chemicals.

[01:00:26.400 --> 01:00:34.080] So that would be one potential application of having sequenced the pan genome of the

[01:00:34.080 --> 01:00:35.080] silkworm.

[01:00:35.080 --> 01:00:36.080] Very cool.

[01:00:36.080 --> 01:00:37.080] I wonder if the color would be consistent.

[01:00:37.080 --> 01:00:42.280] Apparently, the results are very good when you do the genetically engineered color.

[01:00:42.280 --> 01:00:44.360] Yeah, it works extremely well.

[01:00:44.360 --> 01:00:49.520] Yes, if you want fluorescent green silk, you can theoretically get it now.

[01:00:49.520 --> 01:00:50.520] Very interesting.

[01:00:50.520 --> 01:00:52.520] There's a lot I didn't know about the silkworm.

[01:00:52.520 --> 01:00:57.360] It's obviously a very important industry in the world, the whole silk road and all that.

[01:00:57.360 --> 01:01:00.040] And a lot of nuances I wasn't aware of.

[01:01:00.040 --> 01:01:01.040] Very cool.

[01:01:01.040 --> 01:01:03.120] And I don't think we've ever talked about the pan genome before.

[01:01:03.120 --> 01:01:08.400] So I think we're going to be hearing more about this because it obviously takes a huge

[01:01:08.400 --> 01:01:13.200] effort because you got to, again, they had to sequence many, many, many individual silkworms

[01:01:13.200 --> 01:01:18.600] in order to be relatively sure that they captured every gene variant that exists.

[01:01:18.600 --> 01:01:20.600] It would be really difficult to do that for humans.

[01:01:20.600 --> 01:01:26.560] There are almost eight billion of us, but I could see that project happening at some point.

Quickie(s) with Steve (1:01:27)

New ALS Drug

  • [link_URL TITLE][9]

[01:01:26.560 --> 01:01:32.560] Most weeks, it's not difficult for me to find the one news item that I'm the most interested

[01:01:32.560 --> 01:01:34.720] in to talk about on the SGU.

[01:01:34.720 --> 01:01:40.200] Occasionally, I have weeks where there's like two items like, oh, I don't know which one

[01:01:40.200 --> 01:01:41.200] I want to talk about.

[01:01:41.200 --> 01:01:44.440] I really should talk about this, but I really want to talk about that one.

[01:01:44.440 --> 01:01:46.040] That happens occasionally.

[01:01:46.040 --> 01:01:50.540] This week, this is the first time I can remember there were three news items I really wanted

[01:01:50.540 --> 01:01:51.960] to talk about.

[01:01:51.960 --> 01:01:55.160] So what I'm going to do is just do a quickie for the other two.

[01:01:55.160 --> 01:01:56.160] So very quickly.

[01:01:56.160 --> 01:02:00.880] So one, I sort of have to talk about this very quickly and maybe I'll do a deeper dive

[01:02:00.880 --> 01:02:03.600] next week, but just to give you the idea here.

[01:02:03.600 --> 01:02:10.320] So the FDA approved only the third drug for the treatment of ALS, amyotrophic lateral

[01:02:10.320 --> 01:02:11.800] sclerosis.

[01:02:11.800 --> 01:02:16.360] This is the third drug approved that alters the progression of the disease, right?

[01:02:16.360 --> 01:02:20.200] There are symptomatic drugs as well, but this is one that actually reduces the progression

[01:02:20.200 --> 01:02:21.200] of the disease.

[01:02:21.200 --> 01:02:23.680] Isn't there enough controversy around this?

[01:02:23.680 --> 01:02:25.440] This is very controversial.

[01:02:25.440 --> 01:02:26.440] Thank you, Kara.

[01:02:26.440 --> 01:02:29.320] I wrote about it on Science-based Medicine, so if you want all the nitty-gritty details,

[01:02:29.320 --> 01:02:30.680] you can look at my article there.

[01:02:30.680 --> 01:02:36.200] The name of the drug is Relyvrio, kind of a stupid name.

[01:02:36.200 --> 01:02:40.240] Let's make a name that's hard to pronounce.

[01:02:40.240 --> 01:02:45.600] And also that sounds like reanimate, like that's a weird name.

[01:02:45.600 --> 01:02:51.920] So I think we talked about like a year ago, the FDA approved an Alzheimer's drug based

[01:02:51.920 --> 01:02:56.600] upon a marker, not even with clinical evidence of clinical improvement.

[01:02:56.600 --> 01:03:00.360] Well they broke new ground here, I don't know if they've ever done this before, but they

[01:03:00.360 --> 01:03:07.040] approved it based upon a phase two preliminary trial without even waiting for the phase three

[01:03:07.040 --> 01:03:08.040] trial.

[01:03:08.040 --> 01:03:13.080] I guess it makes sense with ALS because like it is a progressive, I don't know, there's

[01:03:13.080 --> 01:03:15.560] something about like compassionate use here.

[01:03:15.560 --> 01:03:17.120] Totally, yes.

[01:03:17.120 --> 01:03:22.360] But you can give a drug for compassionate use without getting FDA approval for the

[01:03:22.360 --> 01:03:23.360] indication.

[01:03:23.360 --> 01:03:28.760] So the question is why did they do this extra step and actually approve the drug.

[01:03:28.760 --> 01:03:31.360] Because with compassionate use, they're just saying you have to be in a trial, and

[01:03:31.360 --> 01:03:37.800] it's more limited. You know, this way, as an FDA approved drug, anybody can prescribe it, right?

[01:03:37.800 --> 01:03:40.640] But this is such an unusual step for the FDA, why did they do it?

[01:03:40.640 --> 01:03:44.320] Now of course the justification is that ALS is a terminal illness, there's not really

[01:03:44.320 --> 01:03:46.240] many effective treatments.

[01:03:46.240 --> 01:03:51.140] And so, you know, and also the, you know, researchers and experts, many of whom I know

[01:03:51.140 --> 01:03:55.220] by the way, they're like in the Northeast and I've worked with them.

[01:03:55.220 --> 01:04:01.000] And they're very, you know, they're all serious, very accomplished scientists and clinicians.

[01:04:01.000 --> 01:04:02.820] You know, I respect all of them.

[01:04:02.820 --> 01:04:07.920] They said just approve it and give us the option to prescribe this if we want to and,

[01:04:07.920 --> 01:04:10.140] you know, and just trust that we know what we're doing.

[01:04:10.140 --> 01:04:14.200] And then, but there's also a lot of pressure from patient groups to approve it as well

[01:04:14.200 --> 01:04:17.480] because they, again, they don't, they can't, they won't be alive when the phase three trial

[01:04:17.480 --> 01:04:18.980] is done, right?

[01:04:18.980 --> 01:04:23.280] So I get that, I get all of that, but here's the thing, it's not just that it was based

[01:04:23.280 --> 01:04:28.500] on preliminary data, it's really sketchy data.

[01:04:28.500 --> 01:04:35.600] There was some significant p-hacking going on and like serious p-hacking.

[01:04:35.600 --> 01:04:42.480] To me, the biggest thing is that there was a pretty large asymmetry in terms

[01:04:42.480 --> 01:04:47.680] of the patients in the treatment versus the placebo group in terms of who was taking

[01:04:47.680 --> 01:04:51.400] one of the two other drugs approved for ALS.

[01:04:51.400 --> 01:04:57.040] So the treatment group had a higher percentage of patients who were also taking another effective

[01:04:57.040 --> 01:05:00.440] drug for ALS, so that obviously could have skewed the results.

[01:05:00.440 --> 01:05:03.680] There are also choices that they had to make about how to analyze the data, whether or

[01:05:03.680 --> 01:05:07.900] not, you know, the progression scale they use is linear or not.

[01:05:07.900 --> 01:05:14.800] And basically the only time the data comes out statistically significant is if they do

[01:05:14.800 --> 01:05:19.960] it in the one very particular way that they did and any other way you look at the data,

[01:05:19.960 --> 01:05:21.200] it's not significant.

[01:05:21.200 --> 01:05:23.480] So it's like, you know, the very definition of p-hacking.
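The effect being described here can be illustrated with a toy simulation (a generic sketch of multiple-analysis inflation, not the actual trial data): if you get to try many analysis choices on data with no real effect, the chance that at least one comes out "significant" at p < 0.05 is far higher than 5%.

```python
# Toy demo of p-hacking: run many "analyses" (modeled here as independent
# looks, for simplicity) on pure-noise data and count how often at least one
# crosses p < 0.05. Entirely hypothetical data, not the ALS trial.
import math
import random
import statistics

random.seed(0)

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def noise_trial_p_value(n=60):
    """Compare two groups drawn from the SAME distribution (crude z-test)."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    se = (statistics.variance(a) / n + statistics.variance(b) / n) ** 0.5
    z = (statistics.mean(a) - statistics.mean(b)) / se
    return 2 * (1 - normal_cdf(abs(z)))

analyses_per_study = 10   # e.g. different scales, subgroups, linear vs. not
n_studies = 2000

hits = sum(
    any(noise_trial_p_value() < 0.05 for _ in range(analyses_per_study))
    for _ in range(n_studies)
)
print(hits / n_studies)  # ~0.4, roughly 1 - 0.95**10, versus 0.05 for one analysis
```

In a real reanalysis the different choices are applied to the same data, so they are correlated rather than independent, but the qualitative point stands: finding the one analysis path that crosses the significance line doesn't mean much on its own.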

[01:05:23.480 --> 01:05:25.460] Now that doesn't mean it doesn't work.

[01:05:25.460 --> 01:05:26.460] It could work.

[01:05:26.460 --> 01:05:28.200] You get this, you know, this is a preliminary study.

[01:05:28.200 --> 01:05:34.240] There weren't that many patients in it, I think 137 patients total, so not very big

[01:05:34.240 --> 01:05:35.240] study.

[01:05:35.240 --> 01:05:36.240] And it's a preliminary study.

[01:05:36.240 --> 01:05:39.440] It's mainly a safety study, you know, with preliminary efficacy data.

[01:05:39.440 --> 01:05:41.920] So, but yes, compassionate use, fine.

[01:05:41.920 --> 01:05:42.920] I get that.

[01:05:42.920 --> 01:05:46.760] The one concern I have, though, is how are you going to enroll people into a

[01:05:46.760 --> 01:05:49.160] phase three trial if they can get prescribed the drug?

[01:05:49.160 --> 01:05:51.520] Who's going to take a chance on getting the placebo?

[01:05:51.520 --> 01:05:52.520] Nobody.

[01:05:52.520 --> 01:05:55.400] However, here's the only, here's the one thing.

[01:05:55.400 --> 01:05:58.400] Remember, guess how much this drug will cost per year of treatment?

[01:05:58.400 --> 01:05:59.400] Oh no.

[01:05:59.400 --> 01:06:00.400] Oh.

[01:06:00.400 --> 01:06:01.400] A lot.

[01:06:01.400 --> 01:06:02.400] $158,000.

[01:06:02.400 --> 01:06:03.400] What's the point?

[01:06:03.400 --> 01:06:04.400] Okay.

[01:06:04.400 --> 01:06:05.400] So not inaccessible.

[01:06:05.400 --> 01:06:07.440] But like what's the effect size here?

[01:06:07.440 --> 01:06:09.040] What is the actual takeaway?

[01:06:09.040 --> 01:06:10.420] How much more time?

[01:06:10.420 --> 01:06:11.420] How much?

[01:06:11.420 --> 01:06:12.420] Six months.

[01:06:12.420 --> 01:06:13.420] And really?

[01:06:13.420 --> 01:06:17.120] Like, but we're talking six months of function or six months of life?

[01:06:17.120 --> 01:06:19.800] Six months of life, but you know, but that's significant.

[01:06:19.800 --> 01:06:23.600] You know, if you have a two and a half year or three year life expectancy, adding on another

[01:06:23.600 --> 01:06:25.960] six months is significant and you can't downplay that.

[01:06:25.960 --> 01:06:29.840] Yeah, but the truth is you can extend your life with ALS by getting a trach.

[01:06:29.840 --> 01:06:32.600] Yeah, but you're going to do all the, well, that's different, to get it.

[01:06:32.600 --> 01:06:36.800] That's really, because the end point they use in these trials is death or trach.

[01:06:36.800 --> 01:06:39.360] Those are treated as equivalent, because once you get a trach, it counts the same as death in the trial.

[01:06:39.360 --> 01:06:44.200] Oh, that's what I meant when I said, um, six,

[01:06:44.200 --> 01:06:46.720] six more months of life or six more months of function.

[01:06:46.720 --> 01:06:48.120] It's trach free survival.

[01:06:48.120 --> 01:06:49.120] That is the outcome that we need.

[01:06:49.120 --> 01:06:50.160] Yeah, so I call that function.

[01:06:50.160 --> 01:06:51.800] That's six more months of function.

[01:06:51.800 --> 01:06:54.640] It's six more months before you get to the point where you need a trach in order to,

[01:06:54.640 --> 01:06:55.640] in order to breathe.

[01:06:55.640 --> 01:06:59.680] Basically being put on a ventilator. But they're still declining, but you can breathe.

[01:06:59.680 --> 01:07:01.000] Declining, but you can see, yeah, right.

[01:07:01.000 --> 01:07:02.000] Yeah.

[01:07:02.000 --> 01:07:03.000] So that's, that's life.

[01:07:03.000 --> 01:07:04.000] That really is life.

[01:07:04.000 --> 01:07:05.000] So, yeah, you can't put a price on that.

[01:07:05.000 --> 01:07:06.000] That's that.

[01:07:06.000 --> 01:07:07.440] I mean, six months is a long time.

[01:07:07.440 --> 01:07:08.440] You're right.

[01:07:08.440 --> 01:07:11.040] But then also, I don't know if it's actually six months.

[01:07:11.040 --> 01:07:16.560] That's the list price, but you know, meaning that's like their rip-off starting

[01:07:16.560 --> 01:07:20.400] price, and insurance companies are going to negotiate for lower prices and patients aren't

[01:07:20.400 --> 01:07:24.200] going to be paying that much. But if you don't have good insurance, you basically have to

[01:07:24.200 --> 01:07:25.200] pay the list price.

[01:07:25.200 --> 01:07:26.200] Otherwise it's fraud.

[01:07:26.200 --> 01:07:28.400] Like you can't just, you can't discount it.

[01:07:28.400 --> 01:07:31.400] You can't use it as a starting point for your negotiations with insurance companies and

[01:07:31.400 --> 01:07:33.800] then discount it to people who pay full price.

[01:07:33.800 --> 01:07:35.340] They have to pay full price.

[01:07:35.340 --> 01:07:38.880] So the only people who might go into clinical trials are people who can't afford the

[01:07:38.880 --> 01:07:39.880] drug.

[01:07:39.880 --> 01:07:42.320] So you still may be able to recruit, but it's going to make it harder.

[01:07:42.320 --> 01:07:43.320] It is going to make it harder.

[01:07:43.320 --> 01:07:48.240] It's also just like an example of how freaking unethical, like I know it's not intentional,

[01:07:48.240 --> 01:07:53.280] but like we experiment on poor people, like that's how it works in this country.

[01:07:53.280 --> 01:07:54.600] That's how it would work with this.

[01:07:54.600 --> 01:07:56.180] So this is messed up.

[01:07:56.180 --> 01:07:57.680] This is a genuine dilemma.

[01:07:57.680 --> 01:07:59.240] I get all sides.

[01:07:59.240 --> 01:08:02.720] It's just, you know, you got to pick your poison in a way, like what, what you think

[01:08:02.720 --> 01:08:03.720] is more important.

[01:08:03.720 --> 01:08:06.000] They're, they're doing a phase three trial.

[01:08:06.000 --> 01:08:10.080] I hope they, they get enough recruitment to really do a good phase three trial.

[01:08:10.080 --> 01:08:13.160] I would not be surprised if it doesn't work in the phase three trial.

[01:08:13.160 --> 01:08:14.160] Of course, I hope it does.

[01:08:14.160 --> 01:08:18.520] I really hope it works even better, but you know, it's perfectly possible that it won't

[01:08:18.520 --> 01:08:23.840] work once you, you know, do a more, more rigorous trial.

[01:08:23.840 --> 01:08:26.280] And it's just, you know, the FDA is definitely moving in this direction.

[01:08:26.280 --> 01:08:29.520] I think there actually was a specific directive for neurodegenerative diseases.

[01:08:29.520 --> 01:08:30.760] We're going to approve things more quickly.

[01:08:30.760 --> 01:08:31.760] So they're doing it.

[01:08:31.760 --> 01:08:32.760] They did it with ALS.

[01:08:32.760 --> 01:08:33.760] They did it with Alzheimer's.

[01:08:33.760 --> 01:08:36.480] This is what the FDA said they were going to do and they're doing it.

[01:08:36.480 --> 01:08:40.480] But it's, it is very scary to think about spending that much money and getting that

[01:08:40.480 --> 01:08:41.480] much hope.

[01:08:41.480 --> 01:08:45.080] It's like, yeah, six months, sure, but that's like, is that really the efficacy?

[01:08:45.080 --> 01:08:47.120] Because we don't know based on these trial results.

[01:08:47.120 --> 01:08:48.560] Yeah, we really don't know.

3D Printing Computer Chips

  • [link_URL TITLE][10]

[01:08:48.560 --> 01:08:52.200] The other one, very quickly, I have to point this out just because it's so cool, a study

[01:08:52.200 --> 01:08:56.120] that I wrote about on NeuroLogica today, if you want to read all the details, 3D printing

[01:08:56.120 --> 01:08:58.160] an implantable computer chip.

[01:08:58.160 --> 01:09:00.360] So we've spoken about this so many times.

[01:09:00.360 --> 01:09:05.200] This is using the microelectrode array, MEA, basically a computer chip that you stick on

[01:09:05.200 --> 01:09:09.080] the brain; it has electrodes and you can either read from or stimulate the brain.

[01:09:09.080 --> 01:09:11.600] It's a way of communicating between a computer and the brain.

[01:09:11.600 --> 01:09:17.600] So this is using an advanced 3D printing technology that is able to produce a three-dimensional

[01:09:17.600 --> 01:09:19.840] micro electrode array.

[01:09:19.840 --> 01:09:23.680] So not just two-dimensional; basically, the electrodes are of varying heights.

[01:09:23.680 --> 01:09:26.840] So they go to different depths in the brain when you attach it to the brain.

[01:09:26.840 --> 01:09:30.000] It's also 10 times as dense as the previous chips.

[01:09:30.000 --> 01:09:31.000] And here's the thing.

[01:09:31.000 --> 01:09:32.000] Nice.

[01:09:32.000 --> 01:09:36.220] Because it's 3D printed, you can customize them and crank them out in days.

[01:09:36.220 --> 01:09:42.260] So if you need a very specific chip to do, you know, to implant into rats for a specific

[01:09:42.260 --> 01:09:46.360] neuroscience research study you want to do, you could just, you know, theoretically, once

[01:09:46.360 --> 01:09:50.560] this gets commercially available, you can order it up and say, I want this design of

[01:09:50.560 --> 01:09:53.760] the chip, you know, this is how I'm going to use it.

[01:09:53.760 --> 01:09:55.220] Now this is still rigid.

[01:09:55.220 --> 01:09:57.720] And so the lifespan is still only about one year.

[01:09:57.720 --> 01:10:02.600] So this is not like a permanent thing, but you could see them advancing this to printing

[01:10:02.600 --> 01:10:04.520] on flexible electronics, right?

[01:10:04.520 --> 01:10:10.400] And if you get a flexible microelectrode array that may have a much longer lifespan, then

[01:10:10.400 --> 01:10:15.440] you could be talking about interfacing with your prosthesis, you know, then becoming a

[01:10:15.440 --> 01:10:16.440] true cyborg.

[01:10:16.440 --> 01:10:20.240] Anyway, incremental advance in this technology that we've been following.

[01:10:20.240 --> 01:10:26.120] But I love the combination of advanced 3D printing technology and brain machine interface

[01:10:26.120 --> 01:10:27.120] technology.

[01:10:27.120 --> 01:10:29.720] Again, I know we're plugging our book, just deal with it.

[01:10:29.720 --> 01:10:33.120] But this is exactly the kind of thing we talk about in our book where you have to be looking

[01:10:33.120 --> 01:10:39.440] at the synergy sometimes between multiple different technologies in order to, you know,

[01:10:39.440 --> 01:10:41.960] think about how they will progress in the future.

[01:10:41.960 --> 01:10:42.960] Very cool.

[01:10:42.960 --> 01:10:45.480] Well, everyone, we're going to take a quick break from our show to talk about one of our

[01:10:45.480 --> 01:10:47.680] sponsors this week, Wondrium.

[01:10:47.680 --> 01:10:48.680] Guys, guess what?

[01:10:48.680 --> 01:10:53.960] I watched a Wondrium course about something near and dear to my heart, bread making.

[01:10:53.960 --> 01:10:54.960] Oh, boy.

[01:10:54.960 --> 01:10:55.960] There's a chemistry behind it.

[01:10:55.960 --> 01:10:58.960] You have to understand like the gluten, you have to understand fermentation, you have

[01:10:58.960 --> 01:11:03.620] to understand like what the word proofing means, all sorts of different things.

[01:11:03.620 --> 01:11:07.120] And they have a couple of courses that you could take that will walk you through the

[01:11:07.120 --> 01:11:09.600] whole process of understanding bread making.

[01:11:09.600 --> 01:11:13.560] And then, of course, they share recipes and get into even more details.

[01:11:13.560 --> 01:11:17.220] Yeah, Wondrium lets you learn about pretty much anything, not just bread making.

[01:11:17.220 --> 01:11:21.320] We're talking history, science, language, travel, and so much more.

[01:11:21.320 --> 01:11:26.400] You can get unlimited access to thousands of hours of trustworthy audio, video courses,

[01:11:26.400 --> 01:11:31.520] documentaries, and tutorials, all without the pressure of homework or exams, something

[01:11:31.520 --> 01:11:33.440] I know a lot about.

[01:11:33.440 --> 01:11:36.640] Those are just the best parts of learning and none of the stress.

[01:11:36.640 --> 01:11:40.960] And Wondrium is offering our listeners a free month of unlimited access.

[01:11:40.960 --> 01:11:44.960] Sign up today through our special URL to get this offer.

[01:11:44.960 --> 01:11:48.240] Go to Wondrium.com slash skeptics.

[01:11:48.240 --> 01:11:54.960] Again, that's W-O-N-D-R-I-U-M.com slash skeptics.

[01:11:54.960 --> 01:11:58.240] All right, guys, let's get back to the show.

Who's That Noisy? (1:12:00)


New Noisy (1:15:33)

[animal or metal guttural growls/scratching]

J: ... So, guys, very interesting sound. If you are listening to this podcast and you think you know what that is, or you heard something really cool this week, then email me in at WTN@theskepticsguide.org.

[01:11:58.240 --> 01:12:02.480] All right, Jay, it's Who's That Noisy Time?

[01:12:02.480 --> 01:12:05.280] Okay, guys, last week I played this noisy.

[01:12:05.280 --> 01:12:30.280] All right, any guesses?

[01:12:30.280 --> 01:12:31.280] It's that song.

[01:12:31.280 --> 01:12:33.600] I know what it is.

[01:12:33.600 --> 01:12:35.400] Isn't it like all like fake English?

[01:12:35.400 --> 01:12:40.840] All right, well, so many people knew this one that I only got one person to actually

[01:12:40.840 --> 01:12:43.640] email me in an incorrect guess.

[01:12:43.640 --> 01:12:48.400] And I didn't realize how popular this was, like, you know, I remember stumbling on this

[01:12:48.400 --> 01:12:49.400] a long time ago.

[01:12:49.400 --> 01:12:53.040] I just assumed that there'd be a lot of people who don't know what it is, but I am wrong.

[01:12:53.040 --> 01:12:58.340] So a listener named Kyle Polich wrote in and said, I believe the sound from today's episode

[01:12:58.340 --> 01:13:04.080] was a machine-generated work of original music probably produced by OpenAI's Jukebox.

[01:13:04.080 --> 01:13:05.680] So that's not correct.

[01:13:05.680 --> 01:13:07.000] And let me tell you what this is.

[01:13:07.000 --> 01:13:11.000] And I had so many people guess, you know, a couple of people that guessed it correctly.

[01:13:11.000 --> 01:13:14.640] Adam Hill was the first person to guess correctly.

[01:13:14.640 --> 01:13:17.880] And another listener named Ashley sent in the correct guess.

[01:13:17.880 --> 01:13:20.120] So let me just tell you what this is.

[01:13:20.120 --> 01:13:26.120] First off, this song has an obnoxiously long name that I think means absolutely nothing.

[01:13:26.120 --> 01:13:33.000] But it's Prisencolinensinainciusol, something along those lines.

[01:13:33.000 --> 01:13:34.000] Okay.

[01:13:34.000 --> 01:13:38.520] It's a song that was composed by an Italian singer named Adriano Celentano and performed

[01:13:38.520 --> 01:13:43.840] by Celentano and his wife, Claudia Mori, who was a singer actress turned record producer

[01:13:43.840 --> 01:13:44.840] at the time.

[01:13:44.840 --> 01:13:48.320] This song was released in 1972.

[01:13:48.320 --> 01:13:49.320] No way.

[01:13:49.320 --> 01:13:54.480] Well, the name of the song and its lyrics are, of course, completely made up gibberish,

[01:13:54.480 --> 01:13:58.400] but are intended to sound like English in an American accent.

[01:13:58.400 --> 01:14:04.480] And the reason why this musician slash comedian did this was because he was essentially

[01:14:04.480 --> 01:14:10.800] punking everybody because English music was so popular in Italy at the time that he wanted

[01:14:10.800 --> 01:14:15.800] to make a song that was complete gibberish that sounded like someone speaking English

[01:14:15.800 --> 01:14:20.380] and just to show how everyone will like it just because of that fact.

[01:14:20.380 --> 01:14:22.440] And that is indeed what happened.

[01:14:22.440 --> 01:14:26.620] It was like very, very, very popular in Italy in the 70s.

[01:14:26.620 --> 01:14:29.760] So good for him and his wife for pulling that off.

[01:14:29.760 --> 01:14:30.760] Very odd, though.

[01:14:30.760 --> 01:14:31.760] I mean, think about it.

[01:14:31.760 --> 01:14:32.760] This is all 100% gibberish.

[01:14:32.760 --> 01:14:33.760] Take a listen.

[01:14:33.760 --> 01:14:52.400] Jay, I found this on Reddit months ago, and I just couldn't stop watching it.

[01:14:52.400 --> 01:14:57.000] It was just like, I don't know, just really compelling in a weird way.

[01:14:57.000 --> 01:15:02.760] The video I saw was like it was like this classroom filled with people and he comes

[01:15:02.760 --> 01:15:07.840] in like the teacher and he's singing, and the way he moves is kind of mesmerizing in a weird

[01:15:07.840 --> 01:15:12.600] way, but just weirdly addictive, you know, just like, what the hell?

[01:15:12.600 --> 01:15:15.240] Without a doubt, the guy's very charismatic.

[01:15:15.240 --> 01:15:16.240] Yeah.

[01:15:16.240 --> 01:15:17.240] Yeah.

[01:15:17.240 --> 01:15:18.240] In a goofy way.

[01:15:18.240 --> 01:15:19.520] It's a video from the early 70s.

[01:15:19.520 --> 01:15:24.960] It has a lot of 60s vibe in it as well, which I thought was really interesting.

[01:15:24.960 --> 01:15:26.560] So you could look that up.

[01:15:26.560 --> 01:15:30.000] I know I pronounced it wrong, but I don't even know if there's a correct way to pronounce

[01:15:30.000 --> 01:15:32.640] it because it is all gibberish.

[01:15:32.640 --> 01:15:38.040] So anyway, I have a new, a new Noisy this week and guess who sent in this week's Noisy?

[01:15:38.040 --> 01:15:39.040] Meatleg.

[01:15:39.040 --> 01:15:40.040] Nope.

[01:15:40.040 --> 01:15:41.800] Meatleg did not send this in.

[01:15:41.800 --> 01:15:46.840] This is Visto Tutti. He sent in something very, very interesting.

[01:15:46.840 --> 01:16:05.480] Let me see if any of you guys know what the hell this is.

[01:16:05.480 --> 01:16:07.240] Whatever it is, it belongs in Bob's Halloween.

[01:16:07.240 --> 01:16:08.240] Yeah, right?

[01:16:08.240 --> 01:16:13.000] Bob, if I'm going to make you any sound effects this year, I am going to use this as a starting

[01:16:13.000 --> 01:16:14.880] point.

[01:16:14.880 --> 01:16:17.320] So guys, very interesting sound.

[01:16:17.320 --> 01:16:21.000] If you are listening to this podcast and you think you know what that is, or you heard

[01:16:21.000 --> 01:16:26.680] something really cool this week, then email me in at WTN at the skeptics guide.org.

Announcements (1:16:29)

[01:16:26.680 --> 01:16:29.640] Steve, a couple of things for you to consider.

[01:16:29.640 --> 01:16:30.840] Yes, Jay.

[01:16:30.840 --> 01:16:34.200] We are continuing to sell tickets to our Arizona shows.

[01:16:34.200 --> 01:16:36.240] We have four shows happening in Arizona.

[01:16:36.240 --> 01:16:39.020] We have two in Phoenix and two in Tucson.

[01:16:39.020 --> 01:16:41.640] We're going to be doing one of each of the following shows.

[01:16:41.640 --> 01:16:47.000] We will be doing an SGU private show, where you can watch us record an episode

[01:16:47.000 --> 01:16:51.720] of the SGU in person, and then we'll have an extra hour or so to hang out and talk and

[01:16:51.720 --> 01:16:54.200] socialize with everyone afterwards.

[01:16:54.200 --> 01:16:58.200] And then we also have the Skeptical Extravaganza of Special Significance.

[01:16:58.200 --> 01:17:02.920] This will be the holiday version, probably the one and only time we will ever be doing

[01:17:02.920 --> 01:17:06.640] this variation of the show, over the holiday.

[01:17:06.640 --> 01:17:11.600] So if you're interested in learning more, and you're in Arizona, in Phoenix or in Tucson,

[01:17:11.600 --> 01:17:12.600] please do join us.

[01:17:12.600 --> 01:17:17.840] We really would love to see you guys. Go to theskepticsguide.org/events

[01:17:17.840 --> 01:17:19.480] for all the details.

[01:17:19.480 --> 01:17:24.120] Some of you may know, I'm sure a lot of you know, that the SGU does a lot of live

[01:17:24.120 --> 01:17:25.120] events.

[01:17:25.120 --> 01:17:27.200] We have a variety of live events that we're capable of doing.

[01:17:27.200 --> 01:17:29.640] Let me quickly just give you a rundown.

[01:17:29.640 --> 01:17:32.920] First off, Steve can do any number of lectures.

[01:17:32.920 --> 01:17:37.620] The SGU itself can do live recordings, panel discussions.

[01:17:37.620 --> 01:17:43.400] We also can do things that are based around any particular science topic, especially

[01:17:43.400 --> 01:17:48.440] futurism and current or soon-to-be technology, of course, because of

[01:17:48.440 --> 01:17:49.440] the new book.

[01:17:49.440 --> 01:17:54.280] We could be discussing any topics that are within the field of skepticism and critical

[01:17:54.280 --> 01:17:55.280] thinking.

[01:17:55.280 --> 01:17:58.480] We could even do our live game show, Boomer versus Zoomer.

[01:17:58.480 --> 01:18:03.240] And we can also put on a skeptical extravaganza if you were interested.

[01:18:03.240 --> 01:18:09.120] All of these different types of shows are available to any college, university, or

[01:18:09.120 --> 01:18:10.400] corporation.

[01:18:10.400 --> 01:18:15.200] If you are associated with any of these types of organizations and are interested in

[01:18:15.200 --> 01:18:19.360] hiring us to do any kind of show that I've listed, or just want to have a discussion with us

[01:18:19.360 --> 01:18:24.680] to see what we're capable of doing, please contact us right now at info@theskepticsguide.org

[01:18:24.680 --> 01:18:29.640] and put something in the subject line that will catch our attention, like "live events"

[01:18:29.640 --> 01:18:30.900] or something along those lines,

[01:18:30.900 --> 01:18:35.480] so I can sort my email, please, because I get hundreds of emails every day.

[01:18:35.480 --> 01:18:36.480] All right.

[01:18:36.480 --> 01:18:37.480] Thanks.

[01:18:37.480 --> 01:18:38.480] Thank you.

Science or Fiction (1:18:41)

Theme: Adapting to Climate Change

Item #1: Scientists at the University of the Philippines have proposed burying plastic waste beneath sinking islands to keep them above water.[11]
Item #2: China is at the forefront of building “sponge cities” – cities that incorporate features that absorb large amounts of water to help reduce storm water damage.[12]
Item #3: A glaciologist at Princeton University has proposed building massive miles-long seawalls at the bases of Antarctic and Greenland glaciers in order to delay their collapse, perhaps by hundreds of years.[13]

Answer Item
Fiction: Plastic under islands
Science: Sponge cities
Science: Miles-long seawalls

Host Result
Steve: swept

Rogue Guess
Bob: Plastic under islands
Evan: Plastic under islands
Cara: Plastic under islands
Jay: Plastic under islands

Voice-over: It's time for Science or Fiction.


[01:18:38.480 --> 01:18:42.900] OK, guys, let's go on with science or fiction.

[01:18:42.900 --> 01:18:52.360] It's time for science or fiction.

[01:18:52.360 --> 01:18:57.560] Each week, I come up with three science news items or facts, two real and one fake, and

[01:18:57.560 --> 01:19:02.120] then I challenge my panel of skeptics to tell me which one is the fake.

[01:19:02.120 --> 01:19:04.020] We have a theme this week.

[01:19:04.020 --> 01:19:08.120] The theme is adapting to climate change.

[01:19:08.120 --> 01:19:09.120] I knew it.

[01:19:09.120 --> 01:19:10.120] When did you know it?

[01:19:10.120 --> 01:19:11.120] Just as soon as you said it.

[01:19:11.120 --> 01:19:12.120] I thought it was going to be.

[01:19:12.120 --> 01:19:13.120] I knew it just now.

[01:19:13.120 --> 01:19:16.640] I thought it was going to be adapting to Florida.

[01:19:16.640 --> 01:19:22.280] OK, so these are three things that people are working on, or have proposed, or whatever,

[01:19:22.280 --> 01:19:26.240] to adapt to the changes of the climate.

[01:19:26.240 --> 01:19:28.080] OK, here we go.

[01:19:28.080 --> 01:19:32.840] Item number one, scientists at the University of the Philippines have proposed burying plastic

[01:19:32.840 --> 01:19:37.320] waste beneath sinking islands to keep them above water.

[01:19:37.320 --> 01:19:43.120] Item number two, China is at the forefront of building sponge cities, cities that incorporate

[01:19:43.120 --> 01:19:49.680] features that absorb large amounts of water to help reduce storm water damage.

[01:19:49.680 --> 01:19:56.080] And item number three, a glaciologist at Princeton University has proposed building massive miles

[01:19:56.080 --> 01:20:02.320] long seawalls at the bases of Antarctic and Greenland glaciers in order to delay their

[01:20:02.320 --> 01:20:04.800] collapse, perhaps by hundreds of years.

[01:20:04.800 --> 01:20:06.360] Bob, you seem eager to go.

[01:20:06.360 --> 01:20:07.360] So why don't you go first?

Bob's Response

[01:20:07.360 --> 01:20:08.360] Oh, God, man.

[01:20:08.360 --> 01:20:09.360] This is so.

[01:20:09.360 --> 01:20:11.320] All right.

[01:20:11.320 --> 01:20:19.520] Plastic waste beneath sinking islands, how does that make any sense? Fiction. Fiction. Done.

[01:20:19.520 --> 01:20:22.800] You're not even going to consider the other two.

[01:20:22.800 --> 01:20:23.800] All right.

[01:20:23.800 --> 01:20:24.800] This is ridiculous.

[01:20:24.800 --> 01:20:25.800] Wow.

[01:20:25.800 --> 01:20:26.800] This is Evan.

Evan's Response

[01:20:26.800 --> 01:20:36.680] It's hard to argue with Bob's sound reasoning of "I'm just right, because nope, nope."

[01:20:36.680 --> 01:20:39.520] Burying plastic waste beneath sinking islands.

[01:20:39.520 --> 01:20:46.800] But the islands, are they really sinking, or are the water levels just rising?

[01:20:46.800 --> 01:20:49.280] Because the island is just a peak of like a mountain.

[01:20:49.280 --> 01:20:53.980] Yeah, it's from the water level rising. They're trying to raise the island

[01:20:53.980 --> 01:20:56.640] because, you know, it's sinking because the water is rising.

[01:20:56.640 --> 01:20:59.140] It doesn't mean that it's literally sinking.

[01:20:59.140 --> 01:21:06.080] So that means drilling holes into the bedrock, or into the... wherever. That wouldn't work.

[01:21:06.080 --> 01:21:07.080] Would that work?

[01:21:07.080 --> 01:21:10.240] I don't even know how that's plausible, to tell you the truth.

[01:21:10.240 --> 01:21:13.120] I'm not saying that that wouldn't be the case.

[01:21:13.120 --> 01:21:15.800] But you know, it doesn't make it not science.

[01:21:15.800 --> 01:21:20.300] But geez, that's like a big zero in my book as well.

[01:21:20.300 --> 01:21:24.940] But the other one, about the sponge cities in China, incorporating features that absorb

[01:21:24.940 --> 01:21:32.360] large amounts of water to help reduce storm water damage: I think this has been attempted

[01:21:32.360 --> 01:21:33.360] before.

[01:21:33.360 --> 01:21:37.840] I seem to recall reading something about this years ago.

[01:21:37.840 --> 01:21:42.800] I don't know if it was China or somewhere else, but there was talk of using sponges

[01:21:42.800 --> 01:21:45.780] to do just this, to absorb water.

[01:21:45.780 --> 01:21:50.680] So with that one, at least, I've got some reference.

[01:21:50.680 --> 01:21:53.200] The last one, about the glaciologist.

[01:21:53.200 --> 01:21:58.360] That's neat, that that's a thing: a proposal to build massive miles-long seawalls at the

[01:21:58.360 --> 01:22:04.200] bases of Antarctic and Greenland glaciers in order to delay their collapse.

[01:22:04.200 --> 01:22:07.520] Well, yeah, I mean, but again, there's a basis for this.

[01:22:07.520 --> 01:22:16.560] They do build seawalls all over the world, in various climates,

[01:22:16.560 --> 01:22:18.600] you know, with the idea of protection.

[01:22:18.600 --> 01:22:23.360] So, yeah, in a much more long-winded way,

[01:22:23.360 --> 01:22:24.360] I agree with Bob.

[01:22:24.360 --> 01:22:25.360] OK.

[01:22:25.360 --> 01:22:26.360] It means fiction.

Cara's Response

[01:22:26.360 --> 01:22:28.440] I kind of feel the same way.

[01:22:28.440 --> 01:22:29.440] I mean, I don't know.

[01:22:29.440 --> 01:22:30.440] The first one.

[01:22:30.440 --> 01:22:31.960] I mean, it's not that believable.

[01:22:31.960 --> 01:22:33.240] It's a terrible idea.

[01:22:33.240 --> 01:22:38.280] But the way that these are all phrased, it's like, scientists have proposed... maybe, you

[01:22:38.280 --> 01:22:40.480] know, like, they proposed it.

[01:22:40.480 --> 01:22:41.480] They also proposed, right,

[01:22:41.480 --> 01:22:45.560] like throwing a paper airplane at the moon to see if it reaches the moon.

[01:22:45.560 --> 01:22:46.560] Exactly.

[01:22:46.560 --> 01:22:50.480] So it's like, there could be some really out-there scientists in the Philippines who

[01:22:50.480 --> 01:22:54.840] are like, "I know!" Because there is a plastic waste problem in the Philippines.

[01:22:54.840 --> 01:22:55.840] Right.

[01:22:55.840 --> 01:22:56.960] And that's the idea here.

[01:22:56.960 --> 01:23:00.160] There's all this plastic waste that's like washing up on the shores from all over the

[01:23:00.160 --> 01:23:02.680] world and they're like drowning in it.

[01:23:02.680 --> 01:23:04.180] So what are they going to do with it?

[01:23:04.180 --> 01:23:08.940] You see that in impoverished parts of like island nations, they burn plastic for fuel.

[01:23:08.940 --> 01:23:10.720] It's like horrifically toxic.

[01:23:10.720 --> 01:23:12.960] It's really sad.

[01:23:12.960 --> 01:23:17.520] And it's like, I don't know, why don't we take this trash and turn it into gold,

[01:23:17.520 --> 01:23:18.520] keep us afloat?

[01:23:18.520 --> 01:23:24.800] It doesn't seem like a crazy thing for, like, a middle schooler

[01:23:24.800 --> 01:23:29.760] to propose, but it just doesn't seem like it would work, and/or it would have devastating

[01:23:29.760 --> 01:23:34.080] consequences, because the plastic would just end up back in the ocean.

[01:23:34.080 --> 01:23:35.320] Sponge cities for sure.

[01:23:35.320 --> 01:23:40.760] I mean, this is what we saw happening in Houston, and just around the world, when there are devastating

[01:23:40.760 --> 01:23:42.260] hurricanes and stuff like that.

[01:23:42.260 --> 01:23:43.720] And the cities are all concrete.

[01:23:43.720 --> 01:23:45.280] They just drown.

[01:23:45.280 --> 01:23:48.280] If they can't absorb all of that water, they're screwed.

[01:23:48.280 --> 01:23:51.760] So I think geoengineers or civil engineers know that.

[01:23:51.760 --> 01:23:54.820] I don't know if China's at the forefront, but China makes cities fast.

[01:23:54.820 --> 01:23:58.800] When I visited China, I was in a city that had 30 million people in it.

[01:23:58.800 --> 01:24:01.840] And they told me that 30 years ago, it was a farming community.

[01:24:01.840 --> 01:24:03.760] And I was like, are you kidding me?

[01:24:03.760 --> 01:24:06.000] So I'm not surprised by that.

[01:24:06.000 --> 01:24:08.480] And then the whole seawall thing.

[01:24:08.480 --> 01:24:11.520] This feels like some flat-earther stuff.

[01:24:11.520 --> 01:24:14.320] Or is it actually glacier walls that they believe in?

[01:24:14.320 --> 01:24:19.480] But yeah, bases of Antarctic and Greenland glaciers in order to delay collapse.

[01:24:19.480 --> 01:24:23.080] I mean, yeah, if we can build something to protect them, why wouldn't we?

[01:24:23.080 --> 01:24:24.080] It's probably super expensive.

[01:24:24.080 --> 01:24:27.800] But if we could build a wall here in the... oh, my God, if we can build Trump's wall.

[01:24:27.800 --> 01:24:34.480] Yeah, yeah, I think I'm with the guys on this one.

[01:24:34.480 --> 01:24:36.280] I don't think the trash makes any sense.

[01:24:36.280 --> 01:24:37.280] So that's the fiction.

[01:24:37.280 --> 01:24:38.280] All right.

[01:24:38.280 --> 01:24:39.280] And Jay.

Jay's Response

[01:24:39.280 --> 01:24:44.120] Yeah, I've got to just flat out say, Steve, I agree with Bob. I wouldn't even have needed

[01:24:44.120 --> 01:24:47.120] Bob to say that in order for me to want to pick this one.

[01:24:47.120 --> 01:24:49.480] I think that one straight up is the fiction.

[01:24:49.480 --> 01:24:50.480] All right.

Steve Explains Item #2

[01:24:50.480 --> 01:24:52.480] Well, let's start with number two.

[01:24:52.480 --> 01:24:54.320] Since you all agree on that one.

[01:24:54.320 --> 01:24:59.200] China is at the forefront of building sponge cities, cities that incorporate features that

[01:24:59.200 --> 01:25:03.040] absorb large amounts of water to help reduce storm water damage.

[01:25:03.040 --> 01:25:05.400] You all think that one is science.

[01:25:05.400 --> 01:25:07.880] And that one is science.

[01:25:07.880 --> 01:25:08.880] Yep.

[01:25:08.880 --> 01:25:10.840] They're trying to do that.

[01:25:10.840 --> 01:25:12.700] So what's a sponge city?

[01:25:12.700 --> 01:25:14.960] What would that involve?

[01:25:14.960 --> 01:25:24.280] It involves using so-called green roofs, rain gardens, wetlands, and other measures.

[01:25:24.280 --> 01:25:30.920] Basically, you want to create an environment that can absorb a lot of water rather than

[01:25:30.920 --> 01:25:37.360] just all blacktop and cement that repels the water so that it can't go anywhere and it

[01:25:37.360 --> 01:25:38.360] builds up.

[01:25:38.360 --> 01:25:39.360] Right.

[01:25:39.360 --> 01:25:44.320] Now, given the degree of flooding, for example, that is being experienced in

[01:25:44.320 --> 01:25:45.600] Florida,

[01:25:45.600 --> 01:25:47.980] these kinds of measures wouldn't be enough.

[01:25:47.980 --> 01:25:51.500] They would not be enough for big, massive flooding like that.

[01:25:51.500 --> 01:25:58.840] But it would take the edge off and it may be enough for more minor flooding events.

[01:25:58.840 --> 01:26:04.520] And also, if you have communities that will be getting more tidal water sort of flowing

[01:26:04.520 --> 01:26:07.760] and not massive floods, but they're going to be getting tidal water, you need that water

[01:26:07.760 --> 01:26:10.320] to go somewhere.

[01:26:10.320 --> 01:26:17.760] Some of these mitigation features can also clean the water, basically filtering it or processing

[01:26:17.760 --> 01:26:23.880] it through the plants and everything.

[01:26:23.880 --> 01:26:30.200] So, yeah, these are real proposals, and Cara's right: because China builds cities so quickly,

[01:26:30.200 --> 01:26:37.160] they're trying to basically design new cities around this principle,

[01:26:37.160 --> 01:26:41.280] more adaptable to the increased flooding that we're seeing with climate change.

Steve Explains Item #1

[01:26:41.280 --> 01:26:46.180] All right, let's go back to number one, scientists at the University of the Philippines have

[01:26:46.180 --> 01:26:51.400] proposed burying plastic waste beneath sinking islands to keep them above water.

[01:26:51.400 --> 01:26:53.560] You guys all think this one is the fiction.

[01:26:53.560 --> 01:26:56.960] Do you guys know how much plastic waste is out there?

[01:26:56.960 --> 01:26:57.960] Yes.

[01:26:57.960 --> 01:26:58.960] Yes.

[01:26:58.960 --> 01:27:00.920] Well, no, I don't actually know.

[01:27:00.920 --> 01:27:03.360] All I know is it's killing the planet.

[01:27:03.360 --> 01:27:07.520] Yeah, it's like billions of tons of the stuff.

[01:27:07.520 --> 01:27:14.800] And yeah, so one of the ways in which they're trying to raise islands is just by dumping

[01:27:14.800 --> 01:27:21.180] stuff on them, right: dirt, rocks, gravel, sand, mostly sand, to just literally, physically

[01:27:21.180 --> 01:27:22.180] raise them up.

[01:27:22.180 --> 01:27:23.180] A form of fill?

[01:27:23.180 --> 01:27:24.180] Yeah.

[01:27:24.180 --> 01:27:25.180] So they're just filling it.

[01:27:25.180 --> 01:27:29.680] First of all, they're making lots of artificial islands by doing this, which is not good because

[01:27:29.680 --> 01:27:31.920] they're usually building them on coral reefs.

[01:27:31.920 --> 01:27:32.920] Right.

[01:27:32.920 --> 01:27:33.920] God, yeah.

[01:27:33.920 --> 01:27:35.440] It's about as bad as you can get.

[01:27:35.440 --> 01:27:41.960] But they're doing it in order to sort of maintain their footprint because they're losing land

[01:27:41.960 --> 01:27:44.640] to ocean rising, to climate change.

[01:27:44.640 --> 01:27:48.400] So the idea here is, well, we just need a lot of something to put under, you know, under

[01:27:48.400 --> 01:27:49.400] the ground.

[01:27:49.400 --> 01:27:53.120] You know, basically, instead of putting sand on top,

[01:27:53.120 --> 01:27:56.560] you just dig under: you move the sand out of the way, put something underneath,

[01:27:56.560 --> 01:27:57.560] move the sand back.

[01:27:57.560 --> 01:28:02.280] And then you still have the surface, but you have something else, some filler, underneath

[01:28:02.280 --> 01:28:03.280] it.

[01:28:03.280 --> 01:28:07.840] So where do we have a lot of waste filler that we could just put in the ground to raise

[01:28:07.840 --> 01:28:11.320] the level of the ground a little bit more above the ocean?

[01:28:11.320 --> 01:28:12.320] Floating in the ocean?

[01:28:12.320 --> 01:28:13.320] So that's the idea.

[01:28:13.320 --> 01:28:15.040] That's my idea, nobody else ever proposed it.

[01:28:15.040 --> 01:28:16.040] So this is a fiction.

[01:28:16.040 --> 01:28:17.040] Yeah.

[01:28:17.040 --> 01:28:18.040] That's a novel idea.

[01:28:18.040 --> 01:28:19.040] Yeah.

[01:28:19.040 --> 01:28:23.800] I'll tell you in a minute how I got to that.

Steve Explains Item #3

[01:28:23.800 --> 01:28:27.660] So, number three: a glaciologist at Princeton University has proposed building massive miles-long

[01:28:27.660 --> 01:28:32.280] seawalls at the bases of Antarctic and Greenland glaciers in order to delay their

[01:28:32.280 --> 01:28:34.980] collapse, perhaps by hundreds of years.

[01:28:34.980 --> 01:28:37.040] That is science.

[01:28:37.040 --> 01:28:41.920] This would be a mega engineering project, absolutely.

[01:28:41.920 --> 01:28:46.320] You know, there's no proposal to actually do this at this time.

[01:28:46.320 --> 01:28:54.200] This was just a paper that was presented at a conference, doing a basic

[01:28:54.200 --> 01:28:59.600] sort of proof-of-concept analysis, just showing that, yeah, it could work if you did it.

[01:28:59.600 --> 01:29:03.320] Now, the seawall would not have to be above sea level, you know; the point of this

[01:29:03.320 --> 01:29:10.160] would really be to keep warm water from brushing up against the lower end

[01:29:10.160 --> 01:29:14.840] of the glacier, eroding it out, and causing it to collapse.

[01:29:14.840 --> 01:29:20.040] It would still need to be massive and miles long to keep this from happening.

[01:29:20.040 --> 01:29:26.760] But they said that this could slow the erosion and the collapse of these glaciers, these

[01:29:26.760 --> 01:29:30.520] ice sheets, and buy us time, basically.

[01:29:30.520 --> 01:29:34.040] And so if it would work as advertised, this could be amazing.

[01:29:34.040 --> 01:29:40.960] I mean, this could delay the significant sea rise that would otherwise result in, like, 100 to

[01:29:40.960 --> 01:29:41.960] 150 years.

[01:29:41.960 --> 01:29:45.560] It could delay it for a thousand years, you know.

[01:29:45.560 --> 01:29:50.520] And that would certainly give us a lot more time to, you know, otherwise develop mitigation

[01:29:50.520 --> 01:29:54.560] strategies or maybe even reverse some of the global warming.

[01:29:54.560 --> 01:29:58.020] Because once those glaciers collapse, we're effed, right?

[01:29:58.020 --> 01:30:02.160] And that's irreversible on a human time scale, at least with anything, any extrapolation

[01:30:02.160 --> 01:30:03.160] of current technology.

[01:30:03.160 --> 01:30:05.840] Those took millions of years to form, right?

[01:30:05.840 --> 01:30:09.600] So if we could prevent them from collapse, that would be huge.

[01:30:09.600 --> 01:30:13.300] The rumor is true, though, that the walls would just really be there to prevent penguins

[01:30:13.300 --> 01:30:14.300] from migrating.

[01:30:14.300 --> 01:30:15.840] Yeah, right.

[01:30:15.840 --> 01:30:22.080] So this is Michael Wolovick, who's a glaciology postdoc at Princeton University.

[01:30:22.080 --> 01:30:25.600] He actually published this in 2018.

[01:30:25.600 --> 01:30:27.280] Don't know how the idea is faring now.

[01:30:27.280 --> 01:30:29.360] He's got to convince somebody to spend a lot of money.

[01:30:29.360 --> 01:30:34.080] You know, the idea is he would, like, fill it in with gravel and then put large boulders

[01:30:34.080 --> 01:30:36.780] on the outside of it so that it doesn't erode away.

[01:30:36.780 --> 01:30:41.920] And then it basically just keeps back the warm water so it doesn't erode away the glacier.

[01:30:41.920 --> 01:30:49.400] So this was, like, one of the rare times when I had a totally failed science or fiction

[01:30:49.400 --> 01:30:50.400] concept.

[01:30:50.400 --> 01:30:56.200] You know, this idea sounded so great when I came up with it, right?

[01:30:56.200 --> 01:31:03.080] It was: the craziest climate adaptation or mitigation things that people are coming up with.

[01:31:03.080 --> 01:31:05.840] And I couldn't find any.

[01:31:05.840 --> 01:31:08.120] Everyone is so damn reasonable.

[01:31:08.120 --> 01:31:14.360] It's just, like, the two real ones here, the sponge city and the glaciologist thing,

[01:31:14.360 --> 01:31:16.480] it took me so long to find them.

[01:31:16.480 --> 01:31:20.240] I had to search in different ways, over, you know, so many different ways.

[01:31:20.240 --> 01:31:24.360] Normally, I just put, like, crazy shit people are doing, and I get all kinds of choices.

[01:31:24.360 --> 01:31:25.360] You know what I mean?

[01:31:25.360 --> 01:31:27.240] That's what I'm looking for.

[01:31:27.240 --> 01:31:30.080] I don't know why I was so surprised.

[01:31:30.080 --> 01:31:32.240] I couldn't find anything.

[01:31:32.240 --> 01:31:35.040] And you know, I was running out of time, so I basically had to come up with something.

[01:31:35.040 --> 01:31:41.680] But I'm trying to figure it out: no matter how I searched for it, no matter which of my usual

[01:31:41.680 --> 01:31:47.920] terms I used to find stuff, I kept getting, like, serious scientific papers and not any

[01:31:47.920 --> 01:31:49.280] crazy stuff.

[01:31:49.280 --> 01:31:51.800] So I wonder if it just doesn't exist.

[01:31:51.800 --> 01:32:00.160] And my hypothesis is that the crazy people don't believe in climate change.

[01:32:00.160 --> 01:32:03.360] They're not the ones who are coming up with ways to mitigate it.

[01:32:03.360 --> 01:32:07.520] I was thinking it would be, like, we're going to build a 30-foot wall around Florida, like,

[01:32:07.520 --> 01:32:08.520] crazy stuff like that.

[01:32:08.520 --> 01:32:09.520] I couldn't find anything.

[01:32:09.520 --> 01:32:12.120] Again, there's probably stuff out there.

[01:32:12.120 --> 01:32:15.120] Just with the time that I had, I could not come up with it.

[01:32:15.120 --> 01:32:17.320] This is the best I could do, you know, with the time that I had.

[01:32:17.320 --> 01:32:18.560] But I was really surprised.

[01:32:18.560 --> 01:32:22.040] This took me a lot longer than I thought it was going to take.

[01:32:22.040 --> 01:32:26.520] And it was really challenging, like, Google-fu, you know, to find the kind of stuff I was

[01:32:26.520 --> 01:32:27.700] looking for.

[01:32:27.700 --> 01:32:33.880] Maybe if anyone listening has some suggestions, maybe we can do a follow-up.

[01:32:33.880 --> 01:32:38.640] Just send them to me, or send them via, you know, the Science or Fiction option; I think

[01:32:38.640 --> 01:32:41.160] there is one on the answer form.

[01:32:41.160 --> 01:32:45.120] Sometimes people send us science or fiction suggestions and they send it to the general

[01:32:45.120 --> 01:32:46.120] email.

[01:32:46.120 --> 01:32:49.800] And you've basically eliminated that from consideration as Science or Fiction.

[01:32:49.800 --> 01:32:52.000] It contaminated the evidence.

[01:32:52.000 --> 01:32:56.040] It has to go only to me, otherwise it can't be used that way.

[01:32:56.040 --> 01:32:59.040] And you can see that on the email form.

[01:32:59.040 --> 01:33:03.440] But anyway, I mean, there just may be nothing out there, you know, right?

[01:33:03.440 --> 01:33:06.000] I mean, do you know where I'm coming from, though?

[01:33:06.000 --> 01:33:09.480] Like, wouldn't it seem like there's got to be some crazy shit out there that people were

[01:33:09.480 --> 01:33:10.480] thinking about?

[01:33:10.480 --> 01:33:11.480] Right.

[01:33:11.480 --> 01:33:17.080] I mean, if I had one juicy thing that people seriously proposed but was batshit crazy,

[01:33:17.080 --> 01:33:18.080] that would have been gold.

[01:33:18.080 --> 01:33:19.080] Yeah, gold.

[01:33:19.080 --> 01:33:22.680] I thought for sure I was going to find something, I just couldn't find anything.

[01:33:22.680 --> 01:33:25.600] Homeopaths suggest diluting my soul.

[01:33:25.600 --> 01:33:29.600] Yeah, but then it's like, sure, homeopaths would suggest anything.

[01:33:29.600 --> 01:33:34.800] The point was to find, like, somebody with reasonable sounding credentials proposing

[01:33:34.800 --> 01:33:40.160] something that at first blush sounds really nuts, like, completely implausible.

[01:33:40.160 --> 01:33:43.320] But I just could not find anything to fit the bill.

[01:33:43.320 --> 01:33:46.680] Anyway, so it turned into an easy one for this week.

[01:33:46.680 --> 01:33:48.040] But it's still fun to talk about.

[01:33:48.040 --> 01:33:54.560] But yeah, there's a lot of serious and very, you know, sober proposals about how to mitigate

[01:33:54.560 --> 01:33:56.880] or adapt to climate change.

[01:33:56.880 --> 01:34:03.240] They're all really boring things, you know, nothing that would have worked for this piece.

[01:34:03.240 --> 01:34:08.120] It's all like the basic, we're going to invest money in indigenous populations to help them,

[01:34:08.120 --> 01:34:12.800] you know, whatever, or we're going to help people relocate, you know, it's all really

[01:34:12.800 --> 01:34:13.800] boring stuff.

[01:34:13.800 --> 01:34:20.320] There was nothing science-fiction-y out there, you know, for this.

[01:34:20.320 --> 01:34:21.320] And I think, what does that mean?

[01:34:21.320 --> 01:34:25.560] There may not be good ways to adapt to climate change when you think about it, like the biggest

[01:34:25.560 --> 01:34:30.480] proposals were we got to help people move, you know, like that was like the big thing.

[01:34:30.480 --> 01:34:34.960] So like, there's really nothing, you know, if the sea levels are going to rise, what

[01:34:34.960 --> 01:34:35.960] are you going to do?

[01:34:35.960 --> 01:34:37.680] Yeah, we'll make sponge cities.

[01:34:37.680 --> 01:34:41.200] We need those carbon capture trees, fake trees, you know.

[01:34:41.200 --> 01:34:44.560] That's more climate mitigation, like preventing climate change.

[01:34:44.560 --> 01:34:48.060] This is climate adaptation, which is like, it's going to happen.

[01:34:48.060 --> 01:34:50.320] How do we just make the impact less?

[01:34:50.320 --> 01:34:54.280] And again, my preliminary conclusion is they got nothing.

[01:34:54.280 --> 01:34:59.160] It's like really, there's no big geo-engineering ideas out there.

[01:34:59.160 --> 01:35:03.920] It's all just about, you know, helping poor people survive, you know, by moving them around

[01:35:03.920 --> 01:35:05.580] or like, all right, here's one.

[01:35:05.580 --> 01:35:10.400] This is like, not even that nutty, like, you know, one thing is we're going to build houses

[01:35:10.400 --> 01:35:11.400] on stilts.

[01:35:11.400 --> 01:35:13.120] Well, of course you are.

[01:35:13.120 --> 01:35:16.760] You know, if you're going to build your houses back in Florida where they're flooding, you're

[01:35:16.760 --> 01:35:22.720] going to, you know, raise the ground up 10 feet or, you know, build the houses on a raised

[01:35:22.720 --> 01:35:24.320] platform or whatever.

[01:35:24.320 --> 01:35:25.320] Or on top of plastic.

[01:35:25.320 --> 01:35:26.320] Yeah.

[01:35:26.320 --> 01:35:27.320] So.

[01:35:27.320 --> 01:35:28.320] Or on pontoons.

[01:35:28.320 --> 01:35:32.820] I couldn't find anything that you guys would even blink at, you know, except for these.

[01:35:32.820 --> 01:35:33.820] That's it.

[01:35:33.820 --> 01:35:37.720] But that's a dumb idea, because how long is that going to last?

[01:35:37.720 --> 01:35:40.680] It's only a short term thing and it's not really going to be that safe.

[01:35:40.680 --> 01:35:43.140] You really just need to move inland.

[01:35:43.140 --> 01:35:45.440] But there's a problem with doing that.

[01:35:45.440 --> 01:35:47.560] Who do you think lives inland now?

[01:35:47.560 --> 01:35:48.560] Alligators.

[01:35:48.560 --> 01:35:51.480] Are you talking about in Florida specifically or just in general?

[01:35:51.480 --> 01:35:52.480] So think about this.

[01:35:52.480 --> 01:35:55.480] Beachfront property is really desirable.

[01:35:55.480 --> 01:35:56.480] Very expensive.

[01:35:56.480 --> 01:35:57.480] Right.

[01:35:57.480 --> 01:35:58.480] It's expensive.

[01:35:58.480 --> 01:35:59.480] Super expensive.

[01:35:59.480 --> 01:36:00.480] So who lives inland?

[01:36:00.480 --> 01:36:01.480] Poor people.

[01:36:01.480 --> 01:36:02.480] Poor people.

[01:36:02.480 --> 01:36:03.480] Poor people and indigenous people.

[01:36:03.480 --> 01:36:08.040] And so now that property is becoming more desirable.

[01:36:08.040 --> 01:36:12.200] And because it's not going to be wiped away by climate change.

[01:36:12.200 --> 01:36:15.880] And so they're calling this climate induced gentrification.

[01:36:15.880 --> 01:36:16.880] Yeah.

[01:36:16.880 --> 01:36:17.880] Yeah.

[01:36:17.880 --> 01:36:18.880] A new term.

[01:36:18.880 --> 01:36:19.880] No, that's exactly what it is.

[01:36:19.880 --> 01:36:20.880] Yeah, that's exactly what it is.

[01:36:20.880 --> 01:36:23.320] They're coming into areas that were previously affordable.

[01:36:23.320 --> 01:36:24.320] Oh, I see.

[01:36:24.320 --> 01:36:25.320] Oh, I get it.

[01:36:25.320 --> 01:36:26.320] Yeah, that's what gentrification is.

[01:36:26.320 --> 01:36:30.280] They're coming into areas that were previously affordable, building them up for their uses,

[01:36:30.280 --> 01:36:33.600] and then pushing the people that have lived there forever out because

[01:36:33.600 --> 01:36:34.880] they can't afford the prices.

[01:36:34.880 --> 01:36:35.880] Yeah.

[01:36:35.880 --> 01:36:39.420] They're making it unaffordable for the people who are already living there, because they're

[01:36:39.420 --> 01:36:41.760] moving away from the coast, which is going to be flooded.

Skeptical Quote of the Week (1:36:43)

No one undertakes research in physics with the intention of winning a prize. It is the joy of discovering something no one knew before.
Stephen Hawking (1942-2018), English theoretical physicist

Signoff

S: —and until next week, this is your Skeptics' Guide to the Universe.

S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.

