SGU Episode 900

From SGUTranscripts
  This transcript is not finished. Please help us finish it!
Add a Transcribing template to the top of this transcript before you start so that we don't duplicate your efforts.

You can use this outline to help structure the transcription. Click "Edit" above to begin.


SGU Episode 900
October 8th 2022

The silk moth, whose complete
pan-genome has been published[1]

SGU 899                      SGU 901

Skeptical Rogues
S: Steven Novella

B: Bob Novella

C: Cara Santa Maria

J: Jay Novella

E: Evan Bernstein

Quote of the Week

No one undertakes research in physics with the intention of winning a prize. It is the joy of discovering something no one knew before.

Stephen Hawking, English theoretical physicist

Links
Download Podcast
Show Notes
Forum Discussion

Introduction, hurricanes and earthquakes

Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.

[00:08.920 --> 00:12.760] Hello and welcome to the Skeptics' Guide to the Universe.

[00:12.760 --> 00:17.440] Today is Thursday, October 6th, 2022, and this is your host, Steve Novella.

[00:17.440 --> 00:19.200] Joining me this week are Bob Novella,

[00:19.200 --> 00:20.200] Hey everybody.

[00:20.200 --> 00:21.200] Cara Santa Maria,

[00:21.200 --> 00:22.200] Howdy.

[00:22.200 --> 00:23.200] Jay Novella,

[00:23.200 --> 00:24.200] Hey guys.

[00:24.200 --> 00:25.200] And Evan Bernstein.

[00:25.200 --> 00:27.080] Good evening folks.

[00:27.080 --> 00:33.280] So 900, 900, 900, 900, wait, what?

[00:33.280 --> 00:35.600] This is SGU episode 900, nine zero zero.

[00:35.600 --> 00:36.600] How did that happen?

[00:36.600 --> 00:39.120] I know, it kind of snuck up on you.

[00:39.120 --> 00:40.120] After 899, you know.

[00:40.120 --> 00:41.120] Who would have thought of that?

[00:41.120 --> 00:46.020] Yeah, we're not going to make a big deal out of it.

[00:46.020 --> 00:50.000] But what this means is that we're just a little bit over two years away from 1000.

[00:50.000 --> 00:53.160] I think we're going to have to do something big.

[00:53.160 --> 00:54.160] Something big.

[00:54.160 --> 00:55.720] It's going to have to be something big.

[00:55.720 --> 00:57.400] Like record naked or something.

[00:57.400 --> 01:01.360] Yeah, like can it be, can it be something big, but also something that's not exhausting?

[01:01.360 --> 01:03.560] I don't know, Cara, that's a big ask.

[01:03.560 --> 01:04.560] Can we find that, pal?

[01:04.560 --> 01:05.560] Like a 48-hour episode?

[01:05.560 --> 01:06.560] No!

[01:06.560 --> 01:12.200] We will come up with something to do for 1000, but.

[01:12.200 --> 01:13.200] What about a destination?

[01:13.200 --> 01:14.200] Can we make it?

[01:14.200 --> 01:15.200] Ooh.

[01:15.200 --> 01:16.200] I like that.

[01:16.200 --> 01:17.200] Going somewhere?

[01:17.200 --> 01:18.200] Me too.

[01:18.200 --> 01:19.200] Disney World?

[01:19.200 --> 01:20.200] No, think bigger, Bob.

[01:20.200 --> 01:21.200] Yeah, right?

[01:21.200 --> 01:22.200] Yeah.

[01:22.200 --> 01:23.200] Okay.

[01:23.200 --> 01:24.200] Disney World during Halloween.

[01:24.200 --> 01:29.280] Two years, two years to plan.

[01:29.280 --> 01:30.520] We're open to suggestions.

[01:30.520 --> 01:31.520] Two years.

[01:31.520 --> 01:32.520] Yeah, we can [inaudible].

[01:32.520 --> 01:37.840] It should be around the, ooh, Labor Day of 2024, right?

[01:37.840 --> 01:38.840] It'll be 2024.

[01:38.840 --> 01:39.840] Late 2024.

[01:39.840 --> 01:40.840] Yeah.

[01:40.840 --> 01:43.880] Cara, how did you survive your hurricane?

[01:43.880 --> 01:44.880] I survived.

[01:44.880 --> 01:50.400] It was kind of weird because before it actually made landfall, I'm trying to remember.

[01:50.400 --> 01:52.340] So Hurricane Ian, it was Tropical Storm Ian.

[01:52.340 --> 01:57.260] It was apparently, did you guys see this, two miles per hour shy of being a category

[01:57.260 --> 01:58.260] five.

[01:58.260 --> 01:59.260] Yes.

[01:59.260 --> 02:03.280] Like it didn't quite, it was literally like within a couple of miles of making the cutoff.

[02:03.280 --> 02:05.540] So it was still technically a category four.

[02:05.540 --> 02:10.760] But before it even made landfall, I would say about two days before it made landfall,

[02:10.760 --> 02:15.280] we had all of the storms that were spinning off of it hit where I am in South Florida

[02:15.280 --> 02:16.600] in Fort Lauderdale.

[02:16.600 --> 02:20.680] So two tornadoes touched down within a couple miles of me.

[02:20.680 --> 02:26.320] All night there were these like tornado warnings going off on my phone and just like a lot

[02:26.320 --> 02:27.740] of like severe weather warnings.

[02:27.740 --> 02:28.740] And they're so funny.

[02:28.740 --> 02:32.280] They're like, seek shelter in basement immediately, which is like, nobody in Florida has a basement.

[02:32.280 --> 02:34.320] Like why would you put that warning there?

[02:34.320 --> 02:35.320] Right?

[02:35.320 --> 02:36.320] Well, they got to say it.

[02:36.320 --> 02:37.320] Yeah.

[02:37.320 --> 02:38.320] Exactly.

[02:38.320 --> 02:42.440] And then our clinic was closed the next day.

[02:42.440 --> 02:45.280] I mean, they waited way too long to close it, but they did ultimately close it the

[02:45.280 --> 02:46.680] next day.

[02:46.680 --> 02:48.600] But the weather was just windy.

[02:48.600 --> 02:49.880] Nothing really happened here.

[02:49.880 --> 02:51.040] It didn't really even rain.

[02:51.040 --> 02:53.680] And then the day that it made landfall, I think it was either later that day or early

[02:53.680 --> 02:59.480] the next morning that it made landfall, it just devastated Fort Myers on the West Coast

[02:59.480 --> 03:00.480] and just pummeled.

[03:00.480 --> 03:01.760] Like people, a lot of people died.

[03:01.760 --> 03:03.200] Like it was terrible.

[03:03.200 --> 03:06.960] So much property damage, but we didn't see any of it where I am.

[03:06.960 --> 03:08.520] And that's the thing about hurricanes.

[03:08.520 --> 03:10.600] They're just, they're predictable.

[03:10.600 --> 03:15.160] They're more predictable than tornadoes, but they can just turn on a dime and they can

[03:15.160 --> 03:16.160] pick up speed.

[03:16.160 --> 03:17.500] They can drop speed.

[03:17.500 --> 03:21.880] The forecasts are amazing, but they also have a mind of their own.

[03:21.880 --> 03:26.040] And then I had this whole travel saga where I was going back to LA that weekend and it

[03:26.040 --> 03:28.560] was not even related to the hurricane, which was extra obnoxious.

[03:28.560 --> 03:31.480] It was related to emergency lights on the floor of the plane.

[03:31.480 --> 03:32.480] Kill me.

[03:32.480 --> 03:33.480] I was delayed 14 hours.

[03:33.480 --> 03:38.880] But I was following, yeah, it was bad, but I was following the news and this was amazing.

[03:38.880 --> 03:45.200] So as Ian made landfall, it like did all this destruction and damage because it was a super

[03:45.200 --> 03:46.200] powerful storm.

[03:46.200 --> 03:50.160] And then of course it starts to lose energy and it was downgraded to a tropical storm

[03:50.160 --> 03:54.180] as it moved inland and like dumped a bunch of water on a bunch of inland cities.

[03:54.180 --> 03:59.300] After it crossed over Florida and hit water again, it picked back up and became a category

[03:59.300 --> 04:03.960] one or two, maybe just a one, but it was recategorized as a hurricane again.

[04:03.960 --> 04:05.440] It was that powerful.

[04:05.440 --> 04:06.440] That's like nuts.

[04:06.440 --> 04:07.440] It refueled itself.

[04:07.440 --> 04:08.440] Yeah.

[04:08.440 --> 04:09.440] It was kind of warm water.

[04:09.440 --> 04:10.440] Yeah.

[04:10.440 --> 04:13.040] Like that's, that's, that's a pretty intense storm.

[04:13.040 --> 04:17.160] I mean, this is like, was the worst one that hit Florida in decades, right?

[04:17.160 --> 04:21.840] I think, well, what I read was that it was the worst since 2018, which to me doesn't

[04:21.840 --> 04:23.680] seem like it was that long ago.

[04:23.680 --> 04:28.000] But I think they were saying it could have been, or it was almost like the fifth worst

[04:28.000 --> 04:30.300] storm in America.

[04:30.300 --> 04:32.880] Like there were, it was back when they were doing all the forecasts, they were like, this

[04:32.880 --> 04:34.040] is gearing up to be X.

[04:34.040 --> 04:39.560] And so I haven't seen the final takeaway of like, you know, the ranking system of like

[04:39.560 --> 04:41.880] worst storms to hit Florida, worst storms to hit America.

[04:41.880 --> 04:47.560] Of course, most of the terrible storms to hit America hit Florida, but also Puerto Rico,

[04:47.560 --> 04:51.640] also Texas, and some nor'easters in the Northeast.

[04:51.640 --> 04:54.540] But Florida gets pummeled by hurricanes all the time.

[04:54.540 --> 04:56.440] And you should have seen the people I work with down here.

[04:56.440 --> 04:58.480] Like I was like, uh, are we going to be okay?

[04:58.480 --> 04:59.480] And they're like, this is nothing.

[04:59.480 --> 05:00.480] You'll be fine.

[05:00.480 --> 05:01.480] They're like so.

[05:01.480 --> 05:02.480] They're like, whatever.

[05:02.480 --> 05:05.040] They probably won't even cancel school.

[05:05.040 --> 05:06.040] And they almost didn't.

[05:06.040 --> 05:08.580] I was like, what is happening?

[05:08.580 --> 05:11.600] Is that like earthquake culture in California similar?

[05:11.600 --> 05:13.080] Well, just get used to it.

[05:13.080 --> 05:17.260] And they're so used to everybody freaking out and then it like not hitting them.

[05:17.260 --> 05:19.720] But the sad thing is it probably does hit somewhere.

[05:19.720 --> 05:23.200] And so people who have actually lived through the devastation, like the people in Fort Myers

[05:23.200 --> 05:26.560] who lost everything, I don't think they have that casual view.

[05:26.560 --> 05:30.800] It's the people who have been so lucky that they've like just dodged a lot of bullets.

[05:30.800 --> 05:31.800] But I don't know.

[05:31.800 --> 05:32.800] I have the same thing with tornadoes.

[05:32.800 --> 05:36.040] Like my mom was texting me like, get in the bathtub, grab Killer, get in the bathtub.

[05:36.040 --> 05:38.440] I was like, mom, how many tornado warnings have we lived through?

[05:38.440 --> 05:40.920] I never got in the bathtub when we were little.

[05:40.920 --> 05:41.920] What are you doing?

[05:41.920 --> 05:42.920] Get in the bathtub.

[05:42.920 --> 05:43.920] I know.

[05:43.920 --> 05:44.920] It just sounds so ridiculous.

[05:44.920 --> 05:45.920] I know.

[05:45.920 --> 05:46.920] She's like, are you safe?

[05:46.920 --> 05:47.920] And I'm like, I don't know.

[05:47.920 --> 05:48.920] I'm in my apartment.

[05:48.920 --> 05:49.920] She's like, can you get to a lower level?

[05:49.920 --> 05:50.920] I'm like, what?

[05:50.920 --> 05:51.920] Go to somebody else's apartment?

[05:51.920 --> 05:52.920] Like, no.

[05:52.920 --> 05:57.080] So you have to get into the bathtub, but you also have to bring your mattress with you

[05:57.080 --> 05:59.040] and cover yourself with your mattress.

[05:59.040 --> 06:02.360] Which is like the funniest thing because I don't know about you, but I can't lift my

[06:02.360 --> 06:03.360] mattress.

[06:03.360 --> 06:04.360] Right.

[06:04.360 --> 06:05.360] It's so...

[06:05.360 --> 06:06.360] It's not very practical for everyone.

[06:06.360 --> 06:07.360] It's putting your backpack.

[06:07.360 --> 06:08.360] Right.

[06:08.360 --> 06:14.440] I mean, you have all these people like suffocating under their mattresses.

[06:14.440 --> 06:16.000] That was not the intention.

[06:16.000 --> 06:17.480] Oh, my God.

[06:17.480 --> 06:18.480] Jeez.

What's the Word? (6:20)

[06:18.480 --> 06:22.200] All right, Cara, you're going to start us off with a What's the Word?

[06:22.200 --> 06:23.200] What is the word?

[06:23.200 --> 06:26.680] Well, this week I figured we would do a bit of a refresher and I actually don't remember

[06:26.680 --> 06:30.200] because I keep a log of all the what's the words we've ever done.

[06:30.200 --> 06:33.760] And I was positive that we had done this one before, but maybe a long time ago, but it's

[06:33.760 --> 06:34.760] not in my list.

[06:34.760 --> 06:35.760] So I don't know.

[06:35.760 --> 06:36.760] You guys tell me if you remember doing this.

[06:36.760 --> 06:39.520] If not, either way, a refresher is important here.

[06:39.520 --> 06:45.800] I wanted to talk about the word hominin as opposed to hominid as opposed to hominoid.

[06:45.800 --> 06:48.040] And I know we've all heard all of these words.

[06:48.040 --> 06:52.440] I want to talk a little bit about the origin of these words.

[06:52.440 --> 06:57.720] Back in the day, you probably heard the word hominid all the time.

[06:57.720 --> 07:04.080] Like hominid was the main word that we would use in scientific parlance.

[07:04.080 --> 07:09.960] And within, I'd say about the past decade, we started increasingly seeing the word hominin

[07:09.960 --> 07:17.000] show up first within sort of circles of people who do this stuff for a living and then later

[07:17.000 --> 07:19.960] within like science communication and more sort of like outward things.

[07:19.960 --> 07:23.680] So I just want to kind of give you guys a little bit of background about what these

[07:23.680 --> 07:27.000] words are, what they mean, why we use them the way that we use them.

[07:27.000 --> 07:33.420] So one way to sort of think about this is that hominid, which is the word that we used

[07:33.420 --> 07:36.440] to use all the time, and we still do use, but it means something different.

[07:36.440 --> 07:39.800] They're all current and extinct great apes.

[07:39.800 --> 07:43.560] So pretty much anything you can think of that's a great ape, a human, a gorilla, an orangutan,

[07:43.560 --> 07:49.880] a chimpanzee, a bonobo, Homo erectus, yeah, like anything that is current

[07:49.880 --> 07:50.880] or extinct.

[07:50.880 --> 07:51.880] Exactly.

[07:51.880 --> 07:52.980] Like you said, that's a great ape.

[07:52.980 --> 07:53.980] That's a hominid.

[07:53.980 --> 07:55.480] So it's quite, quite inclusive.

[07:55.480 --> 08:01.220] A hominoid is even more inclusive above that because it also includes gibbons, which are

[08:01.220 --> 08:02.560] lesser apes.

[08:02.560 --> 08:07.400] So it's basically all apes, not just great apes, but there is a distinction between gibbons

[08:07.400 --> 08:08.400] and the great apes.

[08:08.400 --> 08:10.960] So hominoids include even gibbons.

[08:10.960 --> 08:11.960] What about hominish?

[08:11.960 --> 08:12.960] Hominish?

[08:12.960 --> 08:13.960] Yeah.

[08:13.960 --> 08:20.440] That's for the ish, you know, actually we're going to be talking about things that are

[08:20.440 --> 08:24.440] hominish later in the show, I think.

[08:24.440 --> 08:30.200] But so hominoids, I'm going to put that one off to the side because it's not used as

[08:30.200 --> 08:34.240] often, but really the distinction here we're talking about is between hominid and hominin.

[08:34.240 --> 08:40.680] So hominid is all current and extinct great apes, but hominin specifically, it's more

[08:40.680 --> 08:49.240] narrow, specifically speaks to anything that follows along the human lineage.

[08:49.240 --> 08:54.300] And this really came about with learning about our genes because historically we were categorizing

[08:54.300 --> 09:00.320] things taxonomically by comparing anatomy, by looking at like shared features, morphology,

[09:00.320 --> 09:01.320] things like that.

[09:01.320 --> 09:04.540] But the more that we started to learn about our genes and learn about kind of where these

[09:04.540 --> 09:08.920] different genetic splits occurred, the more that we started to understand that there is

[09:08.920 --> 09:12.220] actually a difference between a hominid and a hominin.

[09:12.220 --> 09:18.120] So hominin now refers to all the species of modern humans and early humans after the split

[09:18.120 --> 09:21.920] from the common ancestor with chimpanzees about seven million years ago.

[09:21.920 --> 09:25.600] So chimps went kind of one lineage, humans went another lineage.

[09:25.600 --> 09:31.200] So all the things, all the intermediate species that happened along the way, that is a hominin.

[09:31.200 --> 09:36.440] A hominid of course also includes chimps and gorillas and orangutans.

[09:36.440 --> 09:39.480] So that split was much, much, much earlier.

[09:39.480 --> 09:40.480] Does that make sense?

[09:40.480 --> 09:42.320] Like as a bigger branching tree.

[09:42.320 --> 09:47.880] So if you look at the reasons why these names came to be, it all has to do with taxonomic

[09:47.880 --> 09:49.280] categories.

[09:49.280 --> 09:53.440] So hominoids are in a superfamily called hominoidea.

[09:53.440 --> 09:56.280] So that's again the great and the lesser apes.

[09:56.280 --> 09:59.160] Hominids are the family hominidae.

[09:59.160 --> 10:02.520] So that's all the great apes, including human beings.

[10:02.520 --> 10:08.980] Hominines, sometimes you'll see H-O-M-I-N-I-N-E-S, hominines are the subfamily Homininae.

[10:08.980 --> 10:13.440] And so that's human beings and all of their relatives.

[10:13.440 --> 10:17.440] Hominins comes, oh, actually there's a distinction here too, I should make that hominines are

[10:17.440 --> 10:19.600] actually distinct from hominins.

[10:19.600 --> 10:21.640] Hominines are the subfamily Homininae.

[10:21.640 --> 10:23.700] I think chimps are actually included in that.

[10:23.700 --> 10:26.400] So it's everything from the chimp split on.

[10:26.400 --> 10:31.000] And then hominins, which is the word that, you know, I opened this talking about, it's

[10:31.000 --> 10:32.540] actually a tribe name.

[10:32.540 --> 10:37.220] So it includes all bipedal apes within that human lineage.

[10:37.220 --> 10:42.840] And so you'll see that, for example, like the chimpanzee clade is Panini, or I think

[10:42.840 --> 10:46.000] it's Panini, not Panini, but it's spelled like Panini.

[10:46.000 --> 10:47.720] That's Pan, right?

[10:47.720 --> 10:51.360] We've seen like Pan troglodyte, like you've seen those like names.

[10:51.360 --> 10:54.280] And so, and then homo would be a human being.

[10:54.280 --> 10:59.200] So that's hominini or hominini, Panini, I can't even, I just said that, I love it.

[10:59.200 --> 11:01.360] I want to just start calling them Panini.

[11:01.360 --> 11:04.320] Actually I found this really funny Reddit post where this guy's talking about it and

[11:04.320 --> 11:07.780] he said, Panini, it's spelled the same as the sandwich, but it's pronounced differently

[11:07.780 --> 11:10.440] and tastes different too.

[11:10.440 --> 11:11.440] Gross.

[11:11.440 --> 11:12.440] Clever.

[11:12.440 --> 11:17.840] And so yeah, when we look at the etymology, of course, it comes from the same root word

[11:17.840 --> 11:25.800] homo, which comes back from hominini, which is related to the word, I mean, there's so

[11:25.800 --> 11:28.160] many words that have homo in them, meaning man, right?

[11:28.160 --> 11:29.640] Like homunculus, like the little man.

[11:29.640 --> 11:34.660] And then id is a, is just that word forming piece there at the end.

[11:34.660 --> 11:37.520] So hominini is how we get the word hominin.

[11:37.520 --> 11:44.080] And there was sort of a decision made within the paleogenetic world that to be more specific

[11:44.080 --> 11:48.760] when we're talking about the human lineage, we should, we should say hominin because hominid

[11:48.760 --> 11:50.540] is too inclusive.

[11:50.540 --> 11:52.760] It includes all these other great apes as well.

[11:52.760 --> 11:54.240] Did I just confuse you further?

[11:54.240 --> 11:56.280] Well, it's getting more confusing.

[11:56.280 --> 11:58.800] They're just adding names to just parse things out.

[11:58.800 --> 12:03.280] I think it's just like shifting to the cladistic approach where like we're trying to parse

[12:03.280 --> 12:08.720] out all these very specific different clades and as we get more information about the evolution

[12:08.720 --> 12:09.720] of humans.

[12:09.720 --> 12:14.920] It's like, yeah, sometimes our old stuff, it's not quite wrong, but it's like less right

[12:14.920 --> 12:15.920] as we learn more things.

[12:15.920 --> 12:17.640] And so, I mean, you're so right, Steve.

[12:17.640 --> 12:23.840] I even have a quote here from a paleoanthropologist at University College London who said,

[12:23.840 --> 12:27.240] quote, when we start fiddling with names, everybody gets confused.

[12:27.240 --> 12:31.400] The transition over the last decade from hominids to hominins when we talk about human ancestors

[12:31.400 --> 12:32.700] has been a pain.

[12:32.700 --> 12:36.320] We've had to explain and re-explain and people still get it wrong half the time, which is

[12:36.320 --> 12:37.320] true.

[12:37.320 --> 12:39.000] I mean, it's also just hard to like learn.

[12:39.000 --> 12:43.880] If you've been in a field for a really long time and you've always used a certain lexicon

[12:43.880 --> 12:47.320] and then all of a sudden it's kind of turned on its ear, that can be difficult too.

[12:47.320 --> 12:51.680] So you've got experts within the field even using the terminology wrong because it's like,

[12:51.680 --> 12:52.680] eh, I don't know.

[12:52.680 --> 12:54.720] I wrote 17 books using that other word.

[12:54.720 --> 12:55.720] I know.

[12:55.720 --> 12:56.720] I was about to say that.

[12:56.720 --> 12:57.960] The other thing is it's not just a pain.

[12:57.960 --> 13:02.400] You suddenly render all the previously published technical studies obsolete.

[13:02.400 --> 13:05.400] And now you need like a footnote or an asterisk.

[13:05.400 --> 13:07.600] So it makes it difficult.

[13:07.600 --> 13:15.360] So I totally understand the need to make technical terms increasingly precise and to evolve with

[13:15.360 --> 13:17.400] our understanding of the science.

[13:17.400 --> 13:22.180] But they really do need to make an effort and prioritize stability as well, whenever

[13:22.180 --> 13:28.400] they can, just so it doesn't cause confusion in the literature and also just in the

[13:28.400 --> 13:30.520] public and in general discussion.

[13:30.520 --> 13:32.200] It's a no-win scenario.

[13:32.200 --> 13:33.200] You just got to pick your poison.

[13:33.200 --> 13:35.200] You just got to make a proper balance.

[13:35.200 --> 13:37.760] But this has been a particularly confusing one.

[13:37.760 --> 13:41.760] And the truth of the matter is if any of these terms are going to refer to people, right?

[13:41.760 --> 13:44.340] Because we are the most specific.

[13:44.340 --> 13:48.920] And so if you're talking about people, you can use a larger umbrella term and it'll be

[13:48.920 --> 13:49.920] fine.

[13:49.920 --> 13:51.880] It's kind of like primate, for example.

[13:51.880 --> 13:54.480] Like the word primate can refer to an ape or a monkey.

[13:54.480 --> 14:00.500] But if you try and call a human being or a gorilla or a chimpanzee a monkey, then that's

[14:00.500 --> 14:02.920] actually incorrect because they're apes.

[14:02.920 --> 14:03.920] Right.

[14:03.920 --> 14:04.920] Right.

[14:04.920 --> 14:05.920] All right.

[14:05.920 --> 14:06.920] Thank you, Cara.
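The nesting Cara walks through (hominoid ⊃ hominid ⊃ hominine ⊃ hominin) can be sketched as Python sets. This is an illustrative sketch added for clarity, not part of the episode; the species listed are examples under one common taxonomic scheme, not an exhaustive roster.

```python
# Each taxonomic level modeled as a set, so "more inclusive" is literally a superset.
# Species lists are illustrative examples only.

hominins = {"Homo sapiens", "Homo erectus", "Australopithecus afarensis"}
# Tribe Hominini: the human lineage after the split from chimps ~7 million years ago.

hominines = hominins | {"Pan troglodytes", "Gorilla gorilla"}
# Subfamily Homininae: adds the other African apes (chimps, gorillas).

hominids = hominines | {"Pongo pygmaeus"}
# Family Hominidae: adds orangutans — all current and extinct great apes.

hominoids = hominids | {"Hylobates lar"}
# Superfamily Hominoidea: adds gibbons (the lesser apes).

# Each level is strictly more inclusive than the one below it:
assert hominins < hominines < hominids < hominoids

# A chimp is a hominid but not a hominin — which is why "hominin" is the
# right word for human ancestors specifically:
assert "Pan troglodytes" in hominids
assert "Pan troglodytes" not in hominins
```

The set-comparison operators make the point of the segment concrete: you can always describe a human with a broader term, but calling a chimp a hominin is simply false.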

News Items


Nobel Prize in Chemistry (14:06)

[14:06.920 --> 14:07.920] All right, guys, you know what time of year it is again.

[14:07.920 --> 14:09.240] It's Nobel Prize time.

[14:09.240 --> 14:14.320] We got three Nobel Prizes in the sciences, chemistry, medicine and physics.

[14:14.320 --> 14:18.960] Evan, we're going to start with you with the Nobel Prize in chemistry.

[14:18.960 --> 14:19.960] Yep.

[14:19.960 --> 14:21.900] The Nobel Prize in chemistry 2022.

[14:21.900 --> 14:24.900] It is all about click chemistry.

[14:24.900 --> 14:28.820] And that's when chemists gather together into small groups and talk about themselves and

[14:28.820 --> 14:30.560] outsiders need not apply.

[14:30.560 --> 14:31.560] Hello.

[14:31.560 --> 14:32.560] Yeah.

[14:32.560 --> 14:33.560] I'm here.

[14:33.560 --> 14:34.560] My icon.

[14:34.560 --> 14:35.560] Yes.

[14:35.560 --> 14:36.560] C-L-I-Q-U-E. Clique.

[14:36.560 --> 14:37.560] Right.

[14:37.560 --> 14:38.560] OK.

[14:38.560 --> 14:39.560] Take two.

[14:39.560 --> 14:40.560] Click chemistry.

[14:40.560 --> 14:41.560] All right.

[14:41.560 --> 14:43.720] Three chemists are sharing the prize this year.

[14:43.720 --> 14:49.200] Barry Sharpless, Morten Meldal, and Carolyn Bertozzi.

[14:49.200 --> 14:51.080] And again, it all has to do with click chemistry.

[14:51.080 --> 14:54.280] So I'm going to start with Sharpless and Meldal.

[14:54.280 --> 15:01.560] In the early 2000s, Barry Sharpless and Morten Meldal, they laid the foundation for a functional

[15:01.560 --> 15:06.480] form of chemistry, which is now called click chemistry, in which molecular building blocks

[15:06.480 --> 15:09.260] snap together quickly and efficiently.

[15:09.260 --> 15:15.680] They liken it to assembling Lego blocks onto one another, which I guess is a very simplistic

[15:15.680 --> 15:17.200] sort of way of looking at it.

[15:17.200 --> 15:19.360] But you know, it draws the picture.

[15:19.360 --> 15:20.360] Barry Sharpless.

[15:20.360 --> 15:24.120] Now, this is his second Nobel Prize in chemistry.

[15:24.120 --> 15:25.120] That is rare.

[15:25.120 --> 15:29.520] I believe I read he's only the fifth person to have received two Nobel Prizes.

[15:29.520 --> 15:32.160] So that right off the bat, boom.

[15:32.160 --> 15:33.160] Very rare.

[15:33.160 --> 15:34.160] He started.

[15:34.160 --> 15:35.160] He got this all going.

[15:35.160 --> 15:38.660] And around the year 2000, he coined the concept of click chemistry.

[15:38.660 --> 15:45.120] So he basically came up with this idea, which is it's simple, it's reliable and where it's

[15:45.120 --> 15:50.020] where reactions occur quickly and without the unwanted byproducts.

[15:50.020 --> 15:52.560] You can avoid all of those using click chemistry.

[15:52.560 --> 15:54.060] And then he went to work on it.

[15:54.060 --> 16:00.480] At the same time, Morten Meldal was also working on click chemistry.

[16:00.480 --> 16:03.200] So these were they were independent of each other.

[16:03.200 --> 16:11.440] But they both arrived at a moment or a discovery, which is called the copper-catalyzed

[16:11.440 --> 16:18.960] azide-alkyne cycloaddition, which is basically a way of saying, hey, we can use copper in

[16:18.960 --> 16:26.560] order to get these molecules to click together in a predictable way that will not

[16:26.560 --> 16:32.180] create all the randomness and/or the byproducts that come with just trying to slam a bunch

[16:32.180 --> 16:36.200] of azides and alkynes together without it.

[16:36.200 --> 16:39.280] So it's and it's now in widespread use.

[16:39.280 --> 16:45.320] It was revolutionary at the time and it has taken off among its many uses.

[16:45.320 --> 16:50.240] It's utilized in the development of pharmaceuticals for mapping DNA and creating materials that

[16:50.240 --> 16:52.300] are more fit for purpose.

[16:52.300 --> 16:57.600] They discuss the material sciences, including polymers and gels.

[16:57.600 --> 17:01.500] And it's it's fascinating that they're able to do this.

[17:01.500 --> 17:06.680] So basically what happens is you take an azide, which is any class of chemical compounds containing

[17:06.680 --> 17:11.640] three nitrogen atoms as a group, and then you have your alkynes.

[17:11.640 --> 17:17.760] And they belong to the class of unsaturated aliphatic hydrocarbons, which have a carbon-

[17:17.760 --> 17:19.580] carbon triple bond.

[17:19.580 --> 17:28.520] So you've got these and when you combine them using the copper, you get a triazole.

[17:28.520 --> 17:31.720] And that is C2H3N3.

[17:31.720 --> 17:35.040] It's a five membered ring of two carbon atoms and three nitrogen atoms.

[17:35.040 --> 17:36.040] And there you go.

[17:36.040 --> 17:44.360] They describe it almost like a buckle, a linking mechanism between these two molecules.
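The reaction Evan describes can be summarized in one scheme. This is a simplified sketch added for clarity, not from the episode; R and R′ stand for whatever organic groups carry the azide and the alkyne.

```latex
\[
\underbrace{\mathrm{R{-}N_3}}_{\text{azide}}
\;+\;
\underbrace{\mathrm{R'{-}C{\equiv}CH}}_{\text{terminal alkyne}}
\;\xrightarrow{\ \mathrm{Cu(I)}\ }\;
\underbrace{\text{1,4-disubstituted 1,2,3-triazole}}_{\text{five-membered ring ``buckle''}}
\]
```

The triazole ring, with its two carbons and three nitrogens, is the buckle: it holds R and R′ together permanently, and the copper(I) catalyst is what makes the snap fast, selective, and free of side products.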

[17:44.360 --> 17:52.200] And yes, so that has been used a lot in chemistry in the last 20 years or

[17:52.200 --> 17:53.200] so.

[17:53.200 --> 17:54.200] Where was I reading?

[17:54.200 --> 17:58.840] They said that at the time there have now been where is it?

[17:58.840 --> 18:00.880] A thousand they've come up with.

[18:00.880 --> 18:07.320] There are over a thousand papers published now that reference this technique and hundreds

[18:07.320 --> 18:13.920] of structures that have been added to the database of new molecular structures

[18:13.920 --> 18:18.440] that have arisen because they have been able to do this.

[18:18.440 --> 18:19.600] That's awesome.

[18:19.600 --> 18:27.940] Now the third recipient, Carolyn Bertozzi, what she did was, in a way, even cooler.

[18:27.940 --> 18:33.720] So because you've got copper in these new molecules that you're

[18:33.720 --> 18:39.440] linking together, it's great for some things, but not for everything.

[18:39.440 --> 18:44.520] In other words, biology, you know, if you got too much copper and you're trying

[18:44.520 --> 18:48.880] to introduce it into a living biological organism, that's not good.

[18:48.880 --> 18:50.960] You can get copper toxicity as a result.

[18:50.960 --> 18:53.740] So it's really not great for doing that.

[18:53.740 --> 19:03.000] So Carolyn Bertozzi wanted to find a way to get this idea to work in a biological organism

[19:03.000 --> 19:05.280] and she was able to successfully do it.

[19:05.280 --> 19:11.620] So it says here that she mapped these important but elusive biomolecules on the surface of

[19:11.620 --> 19:13.220] cells.

[19:13.220 --> 19:18.120] She developed these click reactions that work inside the organisms themselves and these

[19:18.120 --> 19:23.680] bioorthogonal reactions, they take place without disrupting the normal chemistry of

[19:23.680 --> 19:30.800] the cell or these toxins that would otherwise interfere with all of the other normal chemistry

[19:30.800 --> 19:33.480] that's going on in that environment.

[19:33.480 --> 19:38.920] And I suppose it was her that coined the term bioorthogonal reactions.

[19:38.920 --> 19:42.120] She was the one basically who discovered that this can work.

[19:42.120 --> 19:48.360] And what she did in her research is she attached fluorescent molecules to the glycans to make

[19:48.360 --> 19:50.400] them easier to map.

[19:50.400 --> 19:57.080] So you take your azide, it's a sugar bound azide, and you add that to a living cell membrane

[19:57.080 --> 20:01.720] and it attaches to the sugars that live on that membrane.

[20:01.720 --> 20:07.640] And then you use a fluorescent tag, which is on the other part, the alkyne.

[20:07.640 --> 20:14.640] You tag the alkyne with the fluorescent tag and through that you can then get it to bind

[20:14.640 --> 20:17.980] to the azide that's already on top of the cell.

[20:17.980 --> 20:23.100] No copper needed, no contamination, no issue with toxicity.

[20:23.100 --> 20:24.100] And there you go.

[20:24.100 --> 20:30.060] You can start tagging the specific cells that you want with this fluorescent marker so that

[20:30.060 --> 20:34.960] you can see which ones are lighting up when it attaches to the specific cells that you're

[20:34.960 --> 20:35.960] targeting.

[20:35.960 --> 20:36.960] Yeah, it's cool.

[20:36.960 --> 20:43.000] Now, I tried to find, though, a good operational definition of what click chemistry is.

[20:43.000 --> 20:46.320] I couldn't really find something that I found satisfying.

[20:46.320 --> 20:47.320] You know what I mean?

[20:47.320 --> 20:52.920] Like it definitely sounds like it's, oh, we're going to identify these

[20:52.920 --> 20:57.040] chemical reactions that have these features, you know, that they have high energy so they'll

[20:57.040 --> 20:58.320] happen quickly and completely.

[20:58.320 --> 21:04.820] They won't have any bad side effects, et cetera, but no underlying chemical reason

[21:04.820 --> 21:08.800] why these particular reactions have it or anything more specific about it.

[21:08.800 --> 21:10.240] You know what I mean?

[21:10.240 --> 21:14.880] It's just an approach to chemistry, I guess, like we're going to come up with a list of

[21:14.880 --> 21:20.960] these useful reactions and then use them to make molecules essentially.

[21:20.960 --> 21:27.320] Well, in what I read certainly, Steven, I'm not very familiar with these fields,

[21:27.320 --> 21:31.960] obviously, but I did do some reading today and watched some videos, including from some

[21:31.960 --> 21:34.640] of these scientists.

[21:34.640 --> 21:43.000] And they sort of emphasized that the predictability was the common feature

[21:43.000 --> 21:50.000] or the idea that gave it its tag, in a sense, to call it part of click chemistry

[21:50.000 --> 21:52.720] or at least click chemistry that works.

[21:52.720 --> 21:58.200] And using the copper as the buckle for this, in other words, as was first

[21:58.200 --> 22:00.360] discovered, is not the only way to do it.

[22:00.360 --> 22:05.520] They have found some other ones since then, but what they seem to stress, again, is that

[22:05.520 --> 22:09.300] you can have predictable outcomes when you do it this way.

[22:09.300 --> 22:14.400] As opposed to doing it in other ways, where it's more random or

[22:14.400 --> 22:17.480] you come up with less desirable results.

[22:17.480 --> 22:21.360] So I imagine it's just the success of the outcome that perhaps defines it.

[22:21.360 --> 22:22.360] Yeah.

[22:22.360 --> 22:24.240] It's just defined in kind of a weird way for me.

[22:24.240 --> 22:29.220] It's not like, you know, organic chemistry uses reactions that include carbon.

[22:29.220 --> 22:34.560] This is just useful chemistry, chemistry with features that we like, and they figured

[22:34.560 --> 22:39.240] out a way to make it work. But yeah, the approach does seem like it works because

[22:39.240 --> 22:44.440] they were able to dramatically increase the number of new pharmaceutical and other

[22:44.440 --> 22:50.160] chemicals that they were able to manufacture by just following this process of using these

[22:50.160 --> 22:51.700] types of reactions.

Nobel Prize in Physics (22:55)

Nobel Prize in Physiology or Medicine (30:46)

Homeopathy Lawsuit (40:37)

  • [link_URL TITLE][7]

Silkworm Pangenome (49:49)

Quickie(s) with Steve (1:01:27)

New ALS Drug

  • [link_URL TITLE][9]

3D Printing Computer Chips

  • [link_URL TITLE][10]

Who's That Noisy? (1:12:00)


New Noisy (1:15:33)

[animal or metal guttural growls/scratching]

J: ... So, guys, very interesting sound. If you are listening to this podcast and you think you know what that is, or you heard something really cool this week, then email me at WTN@theskeptics.org.

Announcements (1:16:29)

Science or Fiction (1:18:41)

Theme: Adapting to Climate Change

Item #1: Scientists at the University of the Philippines have proposed burying plastic waste beneath sinking islands to keep them above water.[11]
Item #2: China is at the forefront of building “sponge cities” – cities that incorporate features that absorb large amounts of water to help reduce storm water damage.[12]
Item #3: A glaciologist at Princeton University has proposed building massive miles-long seawalls at the bases of Antarctic and Greenland glaciers in order to delay their collapse, perhaps by hundreds of years.[13]

Answer Item
Fiction: Plastic under islands
Science: Sponge cities
Science: Miles-long seawalls

Host Result
Steve: swept

Rogue Guess
Bob: Plastic under islands
Evan: Plastic under islands
Cara: Plastic under islands
Jay: Plastic under islands

Voice-over: It's time for Science or Fiction.

Bob's Response

Evan's Response

Cara's Response

Jay's Response

Steve Explains Item #2

Steve Explains Item #1

Steve Explains Item #3

Skeptical Quote of the Week (1:36:43)

No one undertakes research in physics with the intention of winning a prize. It is the joy of discovering something no one knew before.
Stephen Hawking (1942-2018), English theoretical physicist

Signoff

S: —and until next week, this is your Skeptics' Guide to the Universe.

S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.


Today I Learned

  • Fact/Description, possibly with an article reference[14]
  • Fact/Description
  • Fact/Description

Notes

References

Vocabulary

