SGU Episode 857

From SGUTranscripts
This episode was transcribed by the Google Web Speech API Demonstration (or another automatic method) and therefore will require careful proof-reading.


SGU Episode 857
December 11th 2021
857-DNA-Data.jpg
(brief caption for the episode icon)

SGU 856                      SGU 858

Skeptical Rogues
S: Steven Novella
B: Bob Novella
C: Cara Santa Maria
J: Jay Novella
E: Evan Bernstein
Quote of the Week

Quote

Author 

Links
Download Podcast
Show Notes
Forum Topic

Introduction

Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.

S: Hello and welcome to the Skeptics' Guide to the Universe. Today is Wednesday, December 8th, 2021, and this is your host, Steven Novella. Joining me this week are Bob Novella...

B: Hey everybody.

S: Cara Santa Maria...

C: Howdy.

S: Jay Novella...

J: Hey guys.

S: ...and Evan Bernstein.

E: Good evening, everyone.

S: So is everybody caught up on their holiday shopping?

E: Which holiday?

E: No.

B: What?

E: Not even a little bit.

E: I know not of it.

E: Hanukkah's done. Done and done.

S: Don't you have like eight nights of gifts or something?

E: Oh yeah, eight nights. Eight nights. And my family got the exact same thing all eight nights.

E: What'd they get?

C: Not a damn thing.

B: Freshly laundered clothes?

E: No, I tell you what. In all honesty, for example, my daughter Rachel, she likes to go to concerts, as you know. So instead of kind of waiting for Hanukkah or some kind of calendar-determined celebration to come along, we just kind of, you know, gift her over the course of the year as those events come up. So it's much less formal in our house. We've never really been so into the whole must-give-a-present-on-a-specific-day-or-holiday kind of routine.

E: Yeah, that's very healthy.

S: Never took hold.

S: I'm a big fan of giving gifts that people want rather than a surprise gift.

B: Screw that.

S: You could do a mix if you come across something or if you really know, but I would much rather... I totally err on the side of, I know that you want this, here you go. Or, tell me what you want me to get you. There's actually a literature on this. It's very inefficient to buy gifts that you're not sure somebody wants and needs, because you tend to value it more than they do. It's actually an inefficient way to expend resources, which I know is like the most un-Christmas-like idea. Your gift-giving is inefficient, but it's a very easy fix. It's just, you know, prioritize.

S: Well, these gift receipts are wonderful, but prioritize, you know, knowing what people want, even if you have to ask them. Yeah, I remember just talking to my wife about this. She said the best gift she got as a child from her parents was when they opened up the Sears catalog and said, pick what you want.

C: Oh, see, I'm the opposite. I hate that. That's how my friend Kelly's family does their gift giving, and it drives me crazy. When we were kids, they used to go shopping together. She would pick it out. She would watch them buy it, and then she'd be like, can I have it? And they'd be like, no, and wrap it and put it under the tree. And I'm like, there's no surprise.

S: But here's the compromise that I do now. It's like, give me a list way longer than what you'll possibly get, and then at least you won't know which of the things on the list I'm going to pick from.

S: Right. That's all right.

S: Yeah, the only downside to that is that it puts a lot of the burden on the person. But hey, if you want to get the gifts that you want, that's a little bit of a price you've got to pay.

B: But that, of course, reminds me... I distinctly remember, Steve, we'll see if you remember this like I do. My mom and dad gave Jay that choice. Jay must have been, I don't know, eight, ten, somewhere around there. Jay, here's a catalog. Circle what you want. I swear to God, almost everything in that catalog was circled.

S: I remember that, yeah.

B: Almost everything. It was endless.

C: I think the best thing to do... but it requires, you know, usually it's for, like, your partner or your best friend, somebody you're very close to... is I just kind of listen and notice throughout the year. And if I come across something that I remember they pointed out, or that reminds me of them, I pick it up so that you're not stressed in December.

S: Totally. I always prefer to do that. But if that event didn't happen, like I didn't come across the thing, then I get behind the eight ball and it sucks. Of course, also, after being married for 30 years, the range of things has narrowed significantly, because a lot of the one-off gifts have been used. So now, like this year, my wife and I got each other a new refrigerator for Christmas.

S: Right. Yes. Be practical.

S: Yeah, absolutely.

S: That's smart.

E: Why not? Yeah, Jennifer and I got each other this year, for our holiday, a new roof and a solar panel system. That was our gift to ourselves. That cost a lot of money.

S: Yeah, it did.

S: We're very happy. You afford the things you need by not wasting money on crap.

S: Right.

S: That you don't need.

E: Things that, you know, who knows what you'll do with, but we know what we're doing with a roof and a solar system.

S: But I have been doing some online shopping, and oh my goodness, you guys. You know, if you search for curated lists of gift ideas online, it's like half pseudoscience. It's so frustrating. Now, to be fair, I did accidentally type into Google "best holiday grifts".

S: But even when I corrected it... like, for example, you have the acupressure mat and pillow.

S: There's a lot of that alternative medicine pseudoscience stuff.

J: Of course.

S: All of the Reiki stuff and the reflexology is everywhere, and aromatherapy and detox... you know, just rife in all of these curated gift lists.

B: Well, Steve, then I'll jump off that and tell you about what I found, which is similar.

B: So this caught my attention, as you might imagine. It said: futuristic patch that uses nanotech to relieve your pain.

B: Like, what is this? So this company, Kailo, K-A-I-L-O, they raised almost one and a half million dollars in under 30 days in their crowdfunding campaign. So this is what it says.

B: Kailo works with the body's nervous system.

B: Each Kailo contains nano capacitors that work as a bio antenna, which in turn assists the body in its reaction to pain.

B: So basically you put it on and pain goes away.

B: I can't wait to get this.

B: I'll buy two of these nano capacitors and just reuse them.

B: Keep reusing it.

B: Yeah.

B: Wow.

E: Here's a recommendation from Forbes.com, of all places, on their hot holiday gift list: the Vital Red Light Elite.

E: Yep. For one thousand three hundred and forty-nine dollars, you too can step into the world of light therapy and experience health benefits like never before.

E: It is the best-kept secret used by top Hollywood celebrities, professional athletes, beauty experts, and doctors to help with skin health, anti-aging, mental sharpness, pain, and muscle recovery.

E: Thirteen hundred bucks.

E: Whoa.

C: I'm seeing a list where it's all just this weird, like, pseudoscientific pain management stuff.

C: First of all, none of this works. Like, we know that. I've seen the same thing.

C: The Kailo.

C: I'm also seeing this thing called Aculief, which is literally a plastic...

C: It looks like a plastic bag clip, and you stick it in that webbing between your forefinger and your thumb.

C: Acupressure.

C: It's supposed to give you migraine relief by hitting this pressure point.

C: But also I'm like, why are these gift guides all just, like, things for people who are in pain?

C: Like, I wouldn't want to open up a weird plastic acupressure chip clip and knee sleeves and pain patches.

C: What a sad Christmas present.

E: It's like, everyone's in pain.

E: Everyone's hurting.

S: Hey, Evan, do you want to improve your gas mileage by 35 percent?

E: Who wouldn't, Steve?

S: Apparently, cars are programmed to be massively fuel inefficient.

S: And all you have to do is change the programming of the fuel doohickey with the connecticazoid, and... 35 percent improvement in fuel efficiency.

S: Now, automakers apparently either don't know about this, or are unwilling to spend five dollars on a new computer chip to make their cars massively more competitive in the marketplace.

E: What, what do they do?

E: Like, tweak the fuel injection system?

E: So it only puts 35 percent less fuel into the engine?

S: No.

E: What?

S: It does nothing.

B: OK, holy crap.

B: I just saw the price for this Kailo nano patch pain reliever.

B: Ninety-nine dollars for one.

B: A hundred dollars for one reusable pain-minimizing bullshit.

E: Yeah, well, that's a lot less than this red light.

E: The thirteen-hundred-dollar gizmo that you have to stand next to all day, apparently.

E: All right.

News Items


Treating the Unvaccinated (8:33)

S: Let's move on to some news items.

S: We have actually some interesting news items this week.

S: I'm going to start by asking you guys a question.

S: Yeah, a provocative question.

S: Should we prioritize medical treatment to people who are vaccinated over people who are unvaccinated?

S: So if ICU beds and ventilators are in limited availability and we have to triage, who's going to get that ICU bed?

S: Should we penalize people who are sick with COVID because they were unvaccinated?

E: I defer to medical moral codes for the answer on that one.

C: Well, triage is tricky, because typically, when there are enough resources, triage is about who is the sickest, right?

C: Who needs the most help?

C: That's the person who goes to the front of the line.

C: But in mass casualty situations, in wartime, triage also involves who is the most likely to recover.

C: And so framing it like "should we penalize people who didn't get vaccinated", I think, gives you the answer.

C: But there is sort of a question, when resources are incredibly limited, as to, is this person so sick that venting them may not help?

C: And would we utilize that vent when there's another person who needs the vent but has a better chance of recovery?

C: I think vaccination calculates into that, but it's obviously not the whole equation.

S: Yeah, so what you're talking about, Cara, is the principle of utility, right?

S: Which is the primary way in which these medical triage decisions are made.

S: We also have to say that there's the normal context of medical ethical decision making.

S: And then there is the crisis context, when you have, as you say, a war situation or a pandemic situation where there is a critical lack of resources and you are deciding who gets the limited resources. Or the way organ transplants work, because they are always limited.

S: You are always deciding who's going to die on the wait list, who's going to get bumped up to the top of the list.

S: And that's a great example.

C: If somebody smokes, they're not going to get that lung transplant.

S: Well, that's not true.

S: That's not true, actually.

C: Really?

S: Yeah, it really isn't.

S: So there are very specific published rules about how to decide who gets the priority.

C: I was almost positive that if you have cirrhosis of the liver due to alcohol, if you so much as drink a drop of alcohol...

S: But that's about utility.

S: Right.

C: But I mean, I thought that those were the actual transplant rules here in the US.

S: But if you look at the published UNOS rules, though, it doesn't say that.

S: All it says is who's most likely to benefit.

S: The tricky part about the transplant rules is that they're really, really utility-based.

S: And as you say, there's a positive and there's a negative to utility.

S: It's who's the most likely to benefit, and who's the most likely to be harmed by not getting the resource.

S: So like with the transplant, it's, who's going to die the quickest without the transplant?

S: But also, who's most likely to benefit the most from getting the transplant, in terms of doing well, being healthy, surviving for a long time, not rejecting it, having a healthy organ?

B: A 20-year-old versus a 70-year-old.

B: Yeah, but there's a long list of things there.

S: That's where the smoking and the drinking comes in, Cara, because it's a long list.

S: Because it's like, if you have not proven that you're not going to go back to drinking alcohol, we're not going to give you a liver that you're then going to destroy with alcohol.

S: It's not because you damaged the liver that you're losing with alcohol.

S: It's that you'll damage the one we're going to give you with alcohol.

C: But that's still a utility question.

S: That's still a utility question.

S: And with transplants, we also take into consideration distance from the donor and the center where the transplant is going to be done, because that affects how well you're going to do, the success of the operation.

S: But that then bleeds into number two.

S: So number one is utility.

S: Number two is the principle of justice, which is mostly common sense.

S: It basically means everyone's got a fair shot, right?

S: And you can't have rules which systematically discriminate against somebody based upon their location, obviously their race or gender, or their socioeconomic status, right?

S: Which is obviously the trickiest one, unless you have a universal health care system or something that is completely divorced from socioeconomic status.

S: But with organ transplants, there's an interesting aside.

S: The principle of utility and the principle of justice were in conflict, because if you go based entirely on utility, it disadvantages people who are in regions that have a relative paucity of organs compared to the people who need them.

S: And people who live in places where there's lots of organs and less demand are privileged.

S: And so you have to balance these two things.

S: You don't want to sacrifice too much utility, but you've got to make some sacrifices to make sure that the principle of justice is reasonably adhered to.

S: And then the third basic one is respect for persons as individuals.

S: And that means you're treating people as an end unto themselves, not as a means to some other end.

S: And that you respect their autonomy, their ability to make decisions for themselves, their ability to refuse care, et cetera.

S: Those are the core principles in terms of triaging limited medical resources.

S: But there are a couple of other ethical principles underneath the justice header that are worth specifically pointing out, because these are the ones that come into play when it comes to vaccination.

S: And one is the principle of reciprocity.

S: And that means that the health care system, like many things in our society, is a contract between society and the health care profession.

S: And part of that contract, reciprocity, is that it's sort of the do-unto-others thing.

S: Whatever you expect or would want to expect from society, you also have to give to society.

S: And so some people have argued that people who are voluntarily unvaccinated have broken their contract with society.

S: And that, based upon the principle of reciprocity, it's fair to use that as a criterion for lowering their priority in terms of getting health care, especially if it's related to an outcome related to the fact that they're unvaccinated.

S: So from a practical point of view, we could talk about two common scenarios that are happening right now.

S: One is a hospital system in an area that is experiencing a surge in the pandemic.

S: This is happening.

S: All the ICU beds are getting filled up with COVID patients.

S: So they have to defer semi-voluntary surgeries.

S: So let's say a woman has breast cancer.

S: I'm not talking about cosmetic surgery or quality-of-life or lifestyle surgery.

S: I'm talking about a woman who has breast cancer and needs a mastectomy.

S: Can we delay that for a couple of weeks so that we can treat the surge of COVID patients?

S: Or somebody has coronary artery disease and needs a bypass operation.

S: Can we delay that bypass operation for a couple of weeks until the surge passes?

S: Or do we do those mastectomies and bypasses, even though it means taking ICU beds away from people who have COVID pneumonia and are going to die without those ICU beds?

S: There is, you know, an increased risk of death if you delay the surgery.

S: If you delay your mastectomy by four weeks, that's an 8% increase in your risk of death.

S: So it's not like it's just an inconvenience.

S: Yeah, it's not benign to do that.

S: It's not benign to do it.

S: People will die because they were made to wait for weeks for their bypass surgery or their mastectomy.

S: So, all right, so that's one scenario in which you're choosing among patients to prioritize.

S: You're not just saying, we're not going to give you care.

S: We're saying, we're giving care to these other people.

S: The other type is, of course, if there are more COVID patients in a region than that region has ICU beds and ventilators.

S: Do you give a ventilator to patient A, who was a breakthrough infection but they were vaccinated, versus patient B, who was voluntarily unvaccinated?

S: So, you know, there's, I think, an emotional reaction to that scenario.

S: I think most people's minds immediately go to, well, the voluntarily unvaccinated person made a choice that put other people at risk.

S: That's the reciprocity argument.

S: However, there's a recent editorial, which I wrote about on Science-Based Medicine, by Dr. William Parker, who said, you also have to consider the principle of proportionality, which is another ethical principle under the justice header.

S: Proportionality means that the consequences fit the act, right?

S: So, was the choice not to get vaccinated enough of a violation of the principle of reciprocity that it deserves death?

S: Essentially, that it's worth one full life.

S: Right.

S: You could argue statistically the answer to that question is no.

S: If you make it a pure mathematical question... again, Dr. Parker estimates that that decision is worth like 0.01 lives if you just look at it mathematically, not one life.

S: What is that based on, though?

S: I mean, just some calculation of the probability of, you know, somebody dying because of your choice or something.

S: Right.

C: But if you were to look at the probability of somebody dying in, let's say, a health care vacuum where vaccination were the only line of defense, that person would likely die, or has a chance of dying.

S: Yeah, but, you know, we have to apply these rules to the real world and not hypothetical situations.

S: But you just brought up some other good points under the proportionality argument that I think do help put this into perspective.

S: One thing is, you know, the sum total of a person's value is not determined by that one choice that they made, whether or not to get vaccinated.

S: For example, take a situation where somebody is a career criminal who abuses their spouse and lives a horrible lifestyle where they ruin their health, but they got vaccinated, and they get a breakthrough infection and it's really severe because they don't take good care of themselves.

S: They have, you know, obesity, hypertension, they smoke, they drink, whatever... versus somebody else who's a good citizen in good health but decided not to get vaccinated.

S: Do we focus on just the one decision, whether or not to get vaccinated, or...

S: How are these differences weighted?

E: Yeah, but no.

C: That's the thing, too.

C: We can't focus on any of those things.

C: Like, you have to give the best care possible to everybody, regardless of their background.

S: Exactly.

S: So the thing is, it is a slippery slope argument, but it's legitimate.

S: It's like, once we say we're going to decide based on something other than pure utility, then that's a can of worms.

S: It's like, then how are we going to decide, you know, how to proportionally weight these calculations about reciprocity, and who's taking care of themselves, and who's putting other people at risk?

S: What if you were a drunk driver that killed somebody in an accident because you were speeding drunk?

S: Do you still get treated for your injuries?

S: I mean, yes, the answer is yes, you still get treated for your injuries.

S: So I do think it's important for the health care profession... first of all, it's an ethical principle that we are not judgmental of our patients, right?

S: Right.

S: It's not our position to be judgmental, to question your life choices, to whatever.

S: We're here to treat you and to be your advocate and to give you advice, certainly, but not to judge. We are nonjudgmental.

S: And I think the medical profession has to be that way.

S: As soon as we cross that line to saying, you don't deserve to be treated because I don't like the choice that you made...

S: The other point that I think is really, really illuminating is that you have to look at the full spectrum of why somebody might have chosen not to get vaccinated.

C: That's what I was going to bring up.

C: This idea that you can just assume that you understand somebody's motivations, and that you can make decisions based on that, is incredibly dangerous.

J: Yeah, but let's not overcomplicate it.

J: I mean, if somebody has a legitimate medical reason to not get vaccinated...

J: We're not talking about that.

J: Not talking about that.

J: Fear of reasons.

S: We'll put legitimate medical exemptions aside.

S: Right.

C: So what if they have a terrible abuse history?

C: What if they have psychological problems?

E: What if they're part of a group of people who have been misled and abused in the past?

C: It reminds me, Steve, sorry to interject, but just this morning I had the opportunity to interview a woman on Talk Nerdy.

C: It hasn't even aired yet.

C: It won't have aired yet by the time this comes out. She wrote a book about health care decision making.

C: She's a social psychologist who studied under Daniel Kahneman.

C: So, sort of the behavioral economics side of things.

C: And she wrote a whole book about, how do we make decisions?

C: How can we be empowered?

C: When do we want to know everything?

C: When do we not want to know things?

C: And how, as patients, can we navigate this?

C: And one of the things we talked about... she was saying she was consulting for a company whose job was to try and help people with medication adherence.

C: It was like a website, and the first thing they wanted to do was send reminders.

C: And then they were like, OK, we're done.

C: And she's like, if you assume that the reason people don't take their meds is simply because they don't remember...

C: Then you're going to hit like 10 percent of people.

C: Yeah.

C: Like, there are so many reasons that people are not med adherent.

C: And really, you start thinking about the rich diversity of patients, and why do so many people get a prescription and then never fill it, or why do they fill it and then never finish it?

C: I mean, there's a laundry list of reasons.

C: And if we just assume that everybody falls in the same bucket, we're failing.

S: That's right.

S: So, yeah, so there could be a member of a racial minority who believes that they cannot trust the system.

S: Right, for example. Complicated issue.

S: But, you know, they may have legitimate reasons to not be trustful of the system. Or you're in an abusive relationship.

S: Like, there are cases where the husband won't let his wife get vaccinated, or parents that won't let their children... that happens all the time with kids, right?

S: Yeah, for example. Here's the other thing.

S: Even if you are a wealthy, privileged individual under no duress or anything, and you decide, because you buy the misinformation, not to get vaccinated, you're still a victim.

S: Remember, we've talked about this quite a bit, in quite a lot of contexts.

S: Don't blame the person who got conned.

S: Right.

C: No, don't blame the members of the cult.

C: Blame the cult leader.

C: Exactly.

C: Right.

S: And yes, it's frustrating.

S: Yes, it's a burden.

S: But the people who buy into all this, they're at least partly victims.

S: And we can't blame the victim, because, you know, think about it.

S: There's a demographics to this.

S: It's like blaming somebody for being the religion that they were born into.

S: Really?

S: We know that that's the way they are because they were born into that religion.

S: You know, this could slide into the free will argument.

S: You know, do people really have free will?

S: Are they just following along with the determinative factors of their culture and their environment and their genetics, et cetera, et cetera?

S: But even if we don't go all the way that far... put that aside.

S: Still, it's like, yeah, people are making this decision for tribal reasons, because they got sucked down a rabbit hole of misinformation, because of the social milieu that they're in, et cetera, et cetera.

S: I mean, again, if we're going to start blaming people for that, to the point that we're going to let them die... it's just really not professional and ethical.

S: I don't think we can go there.

S: Now, Dr. Parker left one sliver of exception.

S: He said, you know, you can make an argument... I think this is reasonable.

S: You can make an argument that if everything else is really completely even, vaccination status can be a tiebreaker.

S: Like, if that's really the only difference between two people, only one of whom can get on a ventilator, and you use that as the tiebreaker, you could defend that decision.

S: But that's about it.

S: That's about as far as it would go.

1500.26 1503.22 E: And that will translate into hard numbers of what?

1503.22 1504.50 E: Very, very small.

1504.50 1505.54 S: Hopefully nobody.

1505.54 1509.46 S: Like that's a pretty specific situation.

1509.46 1510.58 C: But that's the thing.

1510.58 1512.26 C: That's not a real world scenario.

1512.26 1516.66 C: Real world scenario is, let's say there's a specific state that's hit really hard.

1516.66 1518.02 C: There's only so many vents.

1518.02 1521.06 C: There is a situation in which triage has to take place.

1521.06 1523.86 C: And yes, we can grapple with these ethical things all we want.

1523.86 1528.74 C: But what happens if there are four people waiting to be seen and there's only one vent?

1528.74 1537.78 S: I think from a practical point of view, if you just make the best utility decision you can and that will be your justification for choosing one person over the other.

1537.78 1542.02 S: The idea that people are going to be exactly the same is like so contrived.

1542.02 1547.22 S: You could get out of it by just finding some reason to say this person is more likely to benefit than that other person.

1547.22 1552.50 C: Then I guess the question is, is vaccine status predictive of positive outcomes?

1552.50 1553.78 C: It is.

1553.78 1558.82 S: So if you use it as a predictive utility factor, that's fine.

1559.62 1570.26 S: And that's where all other things being equal, you can kind of make a utility argument for vaccines because people who are vaccinated are more likely to benefit from their treatment than people who are not vaccinated.

1570.26 1574.42 S: But not a reciprocity argument.

1574.42 1576.90 S: He basically was just trying to remove the reciprocity.

1576.90 1584.26 S: These people don't deserve limited resources because they made their decision to not get vaccinated.

1584.26 1588.58 S: And that argument falls apart, I think, when you really dig down.

1588.58 1591.46 S: And I think as physicians, we can't really make that.

1591.46 1594.42 S: And I think as skeptics, we have to be careful not to blame the victim.

1594.42 1596.90 S: I think this is a case where that applies.

1597.70 1601.86 C: It's one of those places where I think we just have to be pretty hard-line.

1601.86 1603.46 C: And like you said, you see this in health care.

1603.46 1604.66 C: We see it in psychology.

1605.46 1607.38 C: Every life is worth the same amount.

1607.38 1612.18 C: You just have to stand there and say it doesn't matter what so-and-so deserves.

1612.18 1615.14 C: That's not even a question coming into this calculation.

1615.86 1617.86 C: Everyone deserves heroic efforts.

1617.86 1622.50 S: That's that third principle, like where every person deserves respect and they're not a means to an end.

1623.22 1631.06 S: So some people argue along those lines that but if we do do this, that's a great incentive to get people to be vaccinated.

1631.06 1634.66 S: It's like, yeah, but you can't use patients as a means to an end.

1634.66 1637.06 S: They have to be the end unto themselves.

1637.06 1638.18 S: But here's the thing.

1638.18 1645.14 S: If that's your goal is to get the most people vaccinated, then that's not the physician's job to do that.

1645.14 1645.30 S: Right.

1645.30 1649.70 S: That's not the medical profession's job to be the stick to make that happen.

1650.66 1654.50 C: If anything, that's just going to cause more distrust with the medical profession.

1654.50 1655.14 C: That's the other thing.

1655.14 1656.42 S: It probably would backfire.

1657.22 1660.50 S: But if you want more people to get vaccinated, do it with education.

1660.50 1663.30 S: And if that doesn't work, you do it with carrots and sticks.

1663.30 1664.90 S: You do it with mandates.

1664.90 1666.10 S: You do it in other ways.

1666.10 1667.38 S: Yeah, with incentivization.

1667.38 1669.94 S: That doesn't involve their medical care.

1669.94 1670.18 C: Right.

1670.18 1673.14 C: Not like if you go to the ER, they're not going to treat you.

1673.14 1673.30 C: Right.

1673.30 1674.10 S: Jesus.

1674.10 1675.94 S: Charge them more for their insurance.

1675.94 1677.06 S: Don't let them fly.

1677.06 1677.46 S: Whatever.

1678.82 1680.90 S: Mandate it any way you think is reasonable.

1681.62 1684.90 S: That's where sort of reciprocity and proportionality come in.

1684.90 1685.70 S: Not health care.

1685.70 1686.82 S: I think that that doesn't work.

1687.38 1688.34 S: But it's interesting.

1688.34 1694.90 S: You know, there's a very lively discussion going on on Science-Based Medicine, where I wrote about this today, because it provokes a lot of emotion.

1694.90 1698.18 S: And it takes a lot to sort of wrap your head around all these arguments.

1698.18 1713.14 C: I do think, and not to open up a whole new can of worms because I know you and I have slightly different views about this, but I do think that, like you kind of said it off the cuff, like charge more for their insurance is actually a health care decision, ultimately, that does negatively impact people's health care.

1713.14 1725.78 C: And so, I mean, this is why a managed care system is really tricky with those kinds of mandates and things because health care is directly related to health care reimbursement in this country.

1725.78 1726.34 C: Yeah, that's fair.

1726.34 1727.94 S: Context there is very important.

1727.94 1728.34 S: Yeah.

1728.34 1732.34 S: It's obviously not to the point where people are not going to be able to get health care.

1732.34 1733.30 S: Exactly.

1733.30 1737.86 C: We're sadly, we do see these institutional barriers for people who can't afford their health care.

1737.86 1747.14 S: But workplaces that are self-insured, like if you get your health insurance through your company and your company is self-paying for their own health insurance, they absolutely do this.

1747.14 1749.22 S: They will charge you more if you're a smoker.

1749.22 1750.02 S: Absolutely.

1750.02 1754.58 C: Sadly, they charge you more if you're of childbearing age, which is frustrating.

1754.58 1756.02 C: Yeah, that's true.

1756.02 1758.74 C: But yeah, it's based on actuarial tables.

1758.74 1760.26 C: It's literally just based on data.

1760.26 1762.34 C: You're likely to cost the system more.

1762.34 1762.82 C: Right.

1762.82 1763.30 C: Yeah.

1764.74 1765.94 S: Okay, interesting.

DNA Storage (29:26)[edit]


1767.22 1768.58 J: Yes.

1768.58 1769.78 S: How close are we?

1769.78 1773.46 S: This is one of those technologies that we're going to see a long way down the road.

1773.46 1776.82 S: How close are we to using DNA for information storage?

1776.82 1780.34 J: Well, according to Microsoft, Steve, it's not that far down the road.

1780.34 1782.26 J: Let's go to, let's start with this.

1782.26 1784.66 J: Let me ask you guys a question here.

1784.66 1786.42 J: See if you agree with this sentiment.

1786.42 1798.50 J: Have you ever wondered how companies like Google or Amazon or Apple, these big companies that have these big cloud systems, how do they keep up with their memory storage needs?

1798.50 1799.46 J: You've got to ask that.

1799.46 1800.18 J: How do you keep up?

1800.18 1800.82 J: Of course.

1800.82 1803.46 C: I figure they just buy more hardware.

1803.46 1804.02 C: Of course.

1804.02 1804.50 C: Yeah.

1804.50 1813.14 J: And it gets to the point, they are successfully keeping up with those needs, but it's increasing and it's becoming much more of a problem.

1813.14 1819.54 J: And also data centers, you may or may not know, they're incredibly power hungry and they produce a lot of heat.

1819.54 1820.02 J: Oh, yes.

1820.02 1820.50 J: Oh, yes.

1820.50 1820.98 J: Yeah.

1820.98 1825.78 J: So as Steve said, a team of scientists have developed a DNA data storage system.

1825.78 1833.78 J: This isn't the first time that companies have been working on this type of thing, but this particular effort that has been made actually got somewhere.

1833.78 1836.58 J: I think it's a milestone and it's very significant.

1836.58 1841.94 J: So first, why would we even need to store computer data using DNA?

1841.94 1843.86 J: That's, I think, a good question.

1843.86 1846.34 J: And I'll answer that in a very roundabout way.

1847.06 1850.66 J: Everybody essentially has a high quality camera on them at all times.

1850.66 1853.14 J: This is one of many examples I can give.

1853.14 1856.90 J: Now we're including all of the web content that's created every day.

1856.90 1857.22 J: Right.

1857.22 1863.70 J: Think about all of the people out there that have unique content that they're creating and how much gets published every day, every week.

1863.70 1870.02 J: And also in general, the need to archive huge stores of information.

1870.02 1883.38 J: As an example, how about storing all of the information that we, the scientific information that a company has gathered in the 50 years that it's been around or the Library of Congress or information from museums.

1883.38 1887.46 J: There is an amazing amount of information that needs to be archived.

1887.46 1892.02 J: Doesn't need to be accessed every day or whatever, but something that we want to always exist.

1892.02 1892.50 J: Right.

1892.50 1893.70 J: There's a ton of that.

1893.70 1896.34 J: More than I think the average person would realize.

1896.34 1897.70 J: You really have to think about it.

1897.70 1928.90 J: So how will tech companies keep up with the demand for data storage, especially moving forward? I would consider this a very simple question, but I would guess the average person now uses somewhere between half a terabyte and a terabyte.

1929.62 1930.74 J: That adds up, man.

1930.74 1933.78 J: The world used to only use mechanical spinning drives, right?

1933.78 1934.66 J: You guys know what these are.

1934.66 1938.50 J: These are the hard drives that have a stack of magnetic coated disks in them.

1938.50 1943.86 J: They're spinning, you know, 7200 RPMs in order to read or write data.

1943.86 1950.34 J: Those drives have a physical arm that has to hover over the disk to read and manipulate the stored binary information.

1950.34 1954.18 J: And every once in a while you would hear crunching noises coming from these drives.

1954.18 1954.66 J: Remember that?

1954.66 1956.42 J: Cara, do you even remember that?

1956.42 1956.98 J: Yeah, yeah, yeah.

1956.98 1957.62 J: The scratch.

1958.18 1958.90 J: Yeah.

1958.90 1961.54 J: The ones that exist today are even better.

1961.54 1962.74 J: They're even faster.

1962.74 1965.54 J: But still, I wouldn't use one of those old drives.

1965.54 1969.70 J: That's because not long ago, solid state drives started to become affordable.

1969.70 1971.46 J: Now, these drives have no moving parts.

1971.46 1975.06 J: They store data by using something called floating gate transistors.

1975.06 1976.10 J: It's hardware, though.

1976.10 1980.74 J: It's just a stationary gear, like a CPU in a sense, right?

1980.74 1981.38 J: It's not moving.

1981.38 1982.34 J: There's no moving parts.

1982.34 1984.18 J: It just has a bunch of transistors in there.

1984.18 1988.02 J: If you open up one of these drives, you'll be shocked at how small they actually are.

1988.02 1988.98 J: They're tiny.

1988.98 1994.66 J: Like, when you open up the casing of a solid state drive, you think the whole thing is filled up with...

1994.66 1995.62 J: No, it's not.

1995.62 1998.42 J: It's like this nub that's sticking off the front.

1998.42 2002.66 J: It's not the size of that case at all, which means this technology is very small.

2004.02 2012.74 J: So to compare the price of both of these drives, the old style mechanical drives cost about $100 for a four terabyte drive, right?

2012.74 2016.26 J: So I'm talking about the ones that have the disks in them with the arm.

2016.26 2018.34 J: That's $100 for a four terabyte drive.

2018.34 2021.06 J: The same size solid state drive is about $500.

2021.06 2024.10 J: So it's roughly five times the cost.

2024.10 2027.78 J: And again, though, I think you should only be using a solid state drive.

2027.78 2029.78 J: They're faster, a lot faster.

2029.78 2031.94 J: They're more durable and they're less prone to losing data.

2031.94 2034.58 J: They're just a better storage system altogether.

2034.58 2039.06 J: Now, when we talk about storing data size, we use the following measurements.

2039.06 2040.74 J: Now, we've been through this a million times.

2040.74 2044.74 J: I can clearly remember Bob teaching me this, right, Bob?

2044.74 2047.78 J: Bites, kilobytes, megabytes, gigabytes, terabytes.

2047.78 2051.22 J: And now we're moving into the petabyte zone.

2051.22 2062.34 J: So to give you some reference here, because I really want you guys to understand just how much data we're talking about when we use these measurements.

2062.34 2066.50 J: One DVD holds about 4.7 gigabytes of data, right?

2066.50 2069.54 J: That's a movie and a whole bunch of extras and whatnot.

2069.54 2074.42 J: But it's roughly the size of like, you know, one to two movies, say.

2074.42 2080.82 J: One terabyte of storage could hold roughly 218 DVD-quality movies.

2080.82 2084.50 J: One gigabyte is about one and a half to two movies.

2084.50 2087.38 J: One terabyte is 218 movies.

2087.38 2091.70 J: One petabyte, 223,100 movies.

2091.70 2092.90 J: Think about that.

2092.90 2100.66 J: How much bigger a petabyte is than a gigabyte, which, you know, there's a lot of people who only have gigabyte storage in their home right now.

2100.66 2101.78 J: Six orders of magnitude.
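Jay's comparisons can be sanity-checked with a few lines of Python. This is a rough sketch assuming a 4.7 GB single-layer DVD and decimal units, which lands near, but not exactly on, the figures quoted in the episode (the 218-per-terabyte number presumably assumes a slightly smaller file per movie):

```python
# Rough check of the storage comparisons above, treating one movie as
# a full 4.7 GB DVD and using decimal (SI) units throughout.
GB = 10**9
TB = 10**12
PB = 10**15

dvd = 4.7 * GB

dvds_per_tb = TB / dvd              # roughly 213 movies per terabyte
dvds_per_pb = PB / dvd              # roughly 212,766 per petabyte
dvds_per_cm3 = 60 * dvds_per_pb     # at 60 petabytes per cubic centimeter

print(f"{dvds_per_tb:,.0f} DVDs per TB")
print(f"{dvds_per_pb:,.0f} DVDs per PB")
print(f"{dvds_per_cm3:,.0f} DVDs per cubic centimeter")
```

The petabyte-to-gigabyte jump is the "six orders of magnitude" mentioned: 10^15 versus 10^9.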

2101.78 2102.26 J: Yeah.

2102.26 2107.86 J: It's a phenomenal amount of data that is being stored in one place.

2107.86 2112.10 J: So, of course, there's no single petabyte drive that exists for consumers.

2112.10 2115.70 J: You know, when these big data centers go out and buy drives, they're buying mega drives.

2115.70 2117.78 J: They're not buying like, you know, one terabyte drive.

2117.78 2123.46 J: They're buying a drive that has tens of terabytes, if not more.

2123.46 2125.94 J: But there is no petabyte drive for a consumer.

2125.94 2131.30 J: Now, the future of data storage might come from something, like I said, like this DNA storage.

2131.30 2149.70 J: This isn't the first team, like I said, but the team that I am discussing today, led by Bichlien H. Nguyen, a team of scientists from Microsoft and the University of Washington in Seattle, has developed a nanoscale DNA storage writer that can function faster and store more data in a smaller space.

2149.70 2158.98 J: So when we talk about storing data at a very and on the very small scale, we want to know how much data can be stored, say, in one cubic centimeter of space.

2158.98 2164.02 J: This is a way that they can discuss, like, well, how much data, you know, what's your data density, right, Bob?

2164.02 2164.50 J: Yeah.

2164.50 2170.42 J: So if we think about this as like a cubic inch or a cubic centimeter of space, how much data can you store in that space?

2170.42 2175.62 J: This technology can store over 60 petabytes per cubic centimeter.

2175.62 2176.90 B: 60 petabytes.

2176.90 2178.18 B: Wow, cubic centimeter.

2178.18 2186.74 J: So going back to how many DVDs this could store, quickly guess in your head, and now I'll tell you the number.

2186.74 2187.94 J: Don't make me do that.

2187.94 2191.38 J: It can hold 13,386,000 DVDs.

2191.38 2194.34 E: Wow, that's the entire ER series, right?

2194.34 2195.30 J: Exactly, man.

2195.30 2197.14 J: I mean, Evan, I got to have that.

2197.14 2199.54 J: I got to have that, you know, whenever I need it.

2199.54 2207.30 J: No, but if you think about that amount of data, that is an incredible amount of data to be stored in such a small space.

2207.30 2211.62 J: This is exactly what science fiction movies have predicted for decades, right?

2211.62 2215.06 J: You know, just incredible amounts of data in a very small amount of space.

2215.06 2219.06 J: Well, it turns out that DNA storage is a viable solution for this.

2219.06 2224.98 J: So on top of incredible density, the materials that they use are durable under extreme conditions.

2224.98 2227.86 J: And on top of that, it would use a lot less energy.

2228.42 2233.54 J: And now, like they're switching to enzymes instead of using fossil based materials, right?

2233.54 2241.94 J: Because the DNA that they're creating is created by using enzymes and not, you know, oils, you know, things that are going to damage the environment.

2241.94 2248.82 J: Enzyme reactions also happen to be much faster than current chemical processes, which is another boon for this system.

2248.82 2253.38 J: The data is encoded in sequence using the four natural DNA bases, right?

2253.38 2254.58 J: A, C, T and G.

2254.58 2256.26 J: And if you want to know what those are, you look them up.

2256.26 2257.78 C: And the thymine, thymine, thymine, thymine.

2257.78 2258.58 J: There you go.

2258.58 2259.70 J: Or you just ask Bob right now.

2259.70 2262.42 J: I couldn't even I couldn't even finish it without him getting to a carrot.

2263.86 2271.06 J: The researchers say that they could add in more base pairs, which is really cool, which makes that whole system much more complicated.

2271.06 2273.38 J: But, you know, there's no reason for them to stick to those four.

2273.38 2276.26 J: It just happens to be the ones that we evolved to use.

2276.26 2277.54 J: But they could just add in more.

2278.10 2286.10 J: They can create DNA sequences that store data, and then they can read these sequences and turn them into digital signals that a computer can understand.
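The basic idea Jay describes, turning digital data into base sequences and back, can be sketched in a few lines. This is a minimal illustration, not the team's actual scheme: with four bases, each base can carry two bits, and real encodings layer on error correction and avoid problem sequences like long single-base runs:

```python
# Minimal sketch: map every 2 bits of a byte stream onto one of the
# four DNA bases, and reverse the mapping to read the data back.
BASE_FOR_BITS = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BITS_FOR_BASE = {v: k for k, v in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """One byte becomes four bases, high bit-pair first."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            bases.append(BASE_FOR_BITS[(byte >> shift) & 0b11])
    return "".join(bases)

def decode(strand: str) -> bytes:
    """Reassemble bytes from groups of four bases."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BITS_FOR_BASE[base]
        out.append(byte)
    return bytes(out)

strand = encode(b"SGU")
print(strand)                      # 12 bases for 3 bytes
assert decode(strand) == b"SGU"    # round-trips cleanly
```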

2286.10 2292.10 J: So the team's goal was to increase the throughput so their system could be used for commercial applications.

2292.10 2299.94 J: So when I say throughput, I'm talking about how much data can they write and how fast and how much data can they read and how fast.

2299.94 2302.42 J: That's the throughput in this specific example.

2302.42 2308.42 J: So, of course, they had to translate, like I was saying, this digital information into strands of DNA.

2308.42 2311.62 J: So that right there, what I just said, is quite a feat.

2311.62 2317.70 J: The paper that I read on this, like it's incredible what they're doing and what they're going through in order to pull this thing off.

2317.70 2319.06 J: And it's very complicated.

2319.06 2325.38 J: And if you're interested, you should just go read, read their paper because it's fascinating if you even understand it.

2325.38 2329.06 J: It was one of those things where I was constantly looking up words and that type of thing.

2329.06 2330.74 J: It was fascinating, though.

2330.74 2333.62 J: The complexity is extraordinary to pull this off.

2334.42 2341.06 J: Then they have to go back and read and decode the DNA information and transform that back into digital information.

2341.06 2344.50 J: This new system is a huge technological step forward.

2344.50 2348.50 J: They're able to increase the read and write speeds by using parallel processes.

2348.50 2362.26 J: So this means that, as an example, instead of writing to one DNA strand, you would be writing to 10 or 100 or 1000 or maybe in the future, it could be tens of thousands of strands at the same time.

2362.26 2364.98 J: That's the way that they're going to be able to scale up the speed of this thing.

2365.70 2367.38 J: And again, it's a complicated process.

2367.38 2387.62 J: And the more strands that you're adding to, like in this parallel process, meaning processes that are happening at the same time, if they're all storing, say, the same piece of data, like let's say you're taking a huge book and you want to put it into a DNA encoding and you're going to take a strand of DNA for each page that's in the book and you do it all at the same time.

2387.62 2392.58 J: Well, of course, it's going to finish much faster than doing it all from front to back.

2392.58 2399.30 J: But the coordination involved in order to distribute that data storage and writing it and everything, it's very complicated.
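The book-by-pages idea Jay describes can be sketched as a toy program. This is an illustration of the coordination problem only, with an uppercase conversion standing in for actual DNA synthesis: each chunk carries its index so the strands can come back in any order and still be reassembled:

```python
# Sketch of parallel strand writing: split a payload into fixed-size
# chunks, tag each with its position, "write" them all concurrently,
# then restore the original order from the tags.
from concurrent.futures import ThreadPoolExecutor

def write_strand(indexed_chunk):
    index, chunk = indexed_chunk
    return (index, chunk.upper())   # placeholder for real synthesis

def store(payload: str, chunk_size: int = 8):
    chunks = [(i, payload[i:i + chunk_size])
              for i in range(0, len(payload), chunk_size)]
    with ThreadPoolExecutor() as pool:
        strands = list(pool.map(write_strand, chunks))
    # In a real system strands complete in arbitrary order; the index
    # tag is what lets the reader reassemble the original sequence.
    return [chunk for _, chunk in sorted(strands)]

print(store("the quick brown fox"))
```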

2399.30 2404.82 J: Of course, so this team is filled with biologists all the way to software engineers.

2404.82 2412.42 J: They have such an incredible spectrum of proficiencies that are needed to come up with these processes and they have to be perfectly coordinated.

2412.42 2416.58 J: And another factor that makes this type of data storage important is lifespan.

2416.58 2419.30 J: And this is one that I was a little surprised to hear.

2419.30 2425.14 J: So today, the longest that we could store data without actually having to worry about it, what would you guys guess?

2425.14 2426.50 B: We're not having to worry about it.

2426.50 2428.10 B: It could be just a decade or two.

2428.10 2432.10 E: I mean, before it starts to decay and becomes unreadable.

2432.10 2432.82 E: Exactly.

2432.82 2434.66 E: 25, 30 years.

2434.66 2435.78 J: Evan hit it on the nose.

2435.78 2436.90 J: 25 to 30 years.

2436.90 2437.62 J: Very good.

2437.62 2439.06 J: And that's with the tape backup.

2439.06 2443.22 J: Tape backups, if they're in a cool and dry environment, can last a pretty long time.

2443.22 2446.66 J: There's other new backup systems that companies have come up with.

2446.66 2447.70 J: There's a lot of them out there.

2447.70 2449.78 J: You could read about many of those as well.

2449.78 2454.90 J: But this type of storage could last hundreds of years or longer.

2454.90 2458.74 J: I mean, think about how long DNA has lasted in the environment.

2458.74 2459.30 J: Right, guys?

2459.30 2459.62 J: Yeah.

2459.62 2461.38 B: And I could talk a little bit about that.

2461.38 2465.14 B: I mean, yeah, you could have it potentially last for hundreds of years.

2465.14 2472.18 B: There's some evidence that mammoth DNA has remained at least partially intact for a million years.

2472.18 2476.02 B: But that's at the low end of partially intact.

2476.02 2482.42 B: The latest thing that I could find, researchers calculated that DNA has a half-life of 521 years.

2482.42 2483.62 B: So I think it's on the order-

2483.62 2485.22 C: It's still better than 25 years. Yes, it is.

2485.22 2486.02 C: It is better.

2486.02 2490.74 B: So yeah, so it's on the order of a century or two.

2491.30 2493.86 B: So yeah, it's definitely better than anything we have now.
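Bob's 521-year half-life figure translates into a simple exponential decay estimate. As a rough sketch (assuming simple first-order decay, which ignores storage conditions), the fraction of DNA still intact after a given time is:

```python
# Rough read on the 521-year half-life figure quoted above: fraction
# of DNA bonds still intact after t years of exponential decay.
def fraction_remaining(years: float, half_life: float = 521.0) -> float:
    return 0.5 ** (years / half_life)

for t in (30, 100, 521, 2000):
    print(f"after {t:>4} years: {fraction_remaining(t):.1%} intact")
```

Even at the 25-to-30-year horizon where tape backups give out, well over 90 percent of the DNA would remain, which is why Bob puts the practical lifespan at a century or two.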

2493.86 2494.90 B: Right, good enough, right?

2494.90 2499.62 J: I mean, because let's say you have to rewrite all of this data every 500 years, even go down to 400.

2499.62 2500.42 J: Who cares?

2500.42 2503.22 J: You know, think about what technology will be like in 500 years.

2503.22 2505.94 J: I'm sure we'll be blowing past this anyway.

2505.94 2509.86 J: It's just a good, it's a cool way to store data.

2509.86 2513.38 J: It's also a very sci-fi way of storing data.

2513.38 2518.74 J: And I remember there was a Star Trek episode, I don't remember what show it was from.

2518.74 2523.94 J: I think it was Worf who had DNA encoded data in his bloodstream.

2524.50 2525.30 J: Yeah.

2525.30 2527.94 J: And they took his blood and then they were able to read the data.

2527.94 2533.70 J: And I remember thinking like, oh my God, could you imagine like someday if we could do that, you know, and it hasn't been that many years.

2533.70 2535.06 B: I think that was "The Drumhead" from Next Generation.

2535.06 2535.62 J: Right, Bob?

2536.26 2536.74 B: Yeah.

2536.74 2545.86 B: So the thing is though, another angle, Jay, is that, you know, in the future, we're not going to have, you know, VHS readers or CD readers.

2545.86 2546.74 B: No tapes, that's for sure.

2546.74 2554.90 B: DNA is different, though. They call it eternally relevant because it's going to be around for a really long time.

2554.90 2566.42 B: And so if there's technology around, then it will likely just keep getting better and better at reading DNA for decades or even centuries into the future.

2566.42 2567.86 B: Although you could argue against that.

2567.86 2576.18 B: I mean, I think DNA is great, but I think, you know, if we continue on this track that we're on, I think even DNA could be artificially replaced.

2576.18 2583.30 B: But yeah, it's definitely, it has a much better life and it will, you know, be readable for long into the future, I would say.

2583.30 2592.90 S: Jay, the thing is, when I was reading some background material on this, the big limiting factor is that it's really expensive to manufacture DNA.

2592.90 2594.18 S: We can do it.

2594.18 2602.34 S: We can do it with the sequence that we want, but the cost would be about $1 trillion per petabyte, which is huge.

2602.34 2611.14 S: We have to bring down the cost by six orders of magnitude before it's really going to be practical, which probably is going to take decades.

2611.14 2619.06 S: So even if we fix these other problems, that one is a deal breaker until we make that six order of magnitude improvement in cost.
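Steve's six-orders-of-magnitude point can be restated as a halving count: a millionfold cost drop takes about 20 halvings. The two-year halving time below is a made-up illustrative rate, not a projection from the paper, but it shows why "decades" is the right ballpark:

```python
# How long does a millionfold cost reduction take?  A halving time of
# two years is an assumption purely for illustration.
import math

halvings_needed = math.log2(1_000_000)     # ~19.9 halvings
years_per_halving = 2                      # assumed rate
print(round(halvings_needed * years_per_halving), "years at that rate")
```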

2619.06 2620.02 J: Yeah, no doubt, Steve.

2620.02 2628.26 J: I mean, of course, like we've seen with lab grown meat as an example, the first one that they made, it was $300,000 and now it's 11 bucks.

2628.26 2631.70 J: Even like, I think it's $7 now for a lab grown hamburger.

2631.70 2647.22 J: You know, once they refine the process, once they optimize everything that needs to happen and scale it up, once they start manufacturing this and big companies start investing heavily into it, the costs are going to dramatically drop and we'll see efficiencies.

2647.22 2649.70 J: But you're right, Steve, it could be decades.

2649.70 2652.82 J: It just depends on how much time and energy they put into it to develop it.

2652.82 2658.90 J: I mean, that's why it takes a company like Microsoft to be sitting at the top of this thing to make it feasible.

2658.90 2661.78 J: But I found out there's like 40 companies that do this.

2662.34 2662.98 J: 40?

2662.98 2664.42 J: Yeah, it's out there.

2664.42 2665.22 J: It's happening.

2665.22 2669.62 J: And they're all working together, I think, at this point, at least in some type of fashion.

2669.62 2672.82 J: They're all throwing some info into the ring to make something happen.

2672.82 2674.18 J: So I just think it's really cool.

2674.18 2675.30 J: It's one of those things.

2675.30 2681.94 J: It's definitely something that I think if you like this type of stuff, this is one of the ones that you should look at because it's really interesting.

2681.94 2690.02 J: It's very possibly going to be a big problem solver for the future data storage needs.

2690.02 2695.54 J: So even if it does take 10, 20 or 30 years to get there, just like the fusion reactor.

2695.54 2696.34 S: Yeah, I was gonna say this.

2696.34 2698.98 S: I feel like we're talking about fusion in the 1980s.

2698.98 2699.30 S: Yeah.

2699.30 2700.34 S: You know what I mean?

2700.34 2700.74 S: Yeah.

2700.74 2703.30 S: Yeah, but it's a cool technology to watch.

2703.86 2704.50 J: Think about it.

2704.50 2709.30 J: By then though, Steve, Bill Gates will be controlling all of us with the microchips that are already floating in our bloodstream.

2709.30 2710.50 J: With the vaccines.

2710.50 2711.62 J: But those won't last, though.

2711.62 2713.94 J: Actually, I just figured out his end game.

2714.58 2718.18 J: He's going to use us as the batteries, as the data storage unit.

2718.18 2719.38 S: We're the data storage.

2719.38 2720.10 S: Yeah, there you go.

2720.66 2721.86 E: Data is people.

AD (45:25)[edit]

2725.86 2730.90 S: Well, everyone, we're going to take a quick break from our show to talk about one of our sponsors this week, Bombas Socks.

2730.90 2732.98 J: Bala, la, la, la, bomba.

2732.98 2733.94 J: Do you know that song, Steve?

2734.82 2736.50 J: I'm very familiar with it.

2736.50 2738.10 J: They have a mission and it's simple.

2738.10 2743.54 J: Make the most comfortable clothes ever and match every item sold with an equal item donated.

2743.54 2744.74 J: That's pretty awesome.

2744.74 2750.18 J: This holiday, when you gift Bombas to someone on your list, you're also giving them to someone in need.

2750.18 2751.70 J: We call that a give give.

2751.70 2752.18 C: It's true.

2752.18 2753.94 C: Bombas doesn't just make socks.

2753.94 2758.10 C: They design their socks, their shirts, their underwear to be the clothes that you can't wait to put on every day.

2758.66 2761.78 C: They're soft, they're seamless, super cozy.

2761.78 2766.98 C: And of course, if we are going for socks, they come in all sorts of different styles for every sport.

2766.98 2772.42 C: There are holiday styles, different cuts, different colors, so many choices.

2772.42 2778.26 B: Go to bombas.com slash skeptics and get 20% off any purchase during their big holiday sale.

2778.26 2782.50 B: That's B-O-M-B-A-S dot com slash skeptics for 20% off.

2782.50 2784.26 B: Bombas dot com slash skeptics.

2784.26 2785.86 B: All right, guys, let's get back to the show.

MLM Exploiting Women (46:26)[edit]

2786.90 2792.02 S: OK, Evan, tell us how multi-level marketing exploits women.

2794.90 2798.18 E: Right, collective groan, right from the onset.

2798.18 2802.66 E: Well, I had not heard of this company called LuLaRoe before.

2802.66 2804.10 E: You got to watch the documentary.

2804.10 2804.66 S: It's a great documentary.

2804.66 2807.94 E: Yes, which I did start actually today watching.

2807.94 2811.54 E: So today, really, as I stumbled on this news item is when I learned about this.

2811.54 2821.14 E: And I suppose I should have at least heard about them this past summer when it was announced in Variety and Variety's an online magazine about Hollywood and movies and TV and such.

2821.14 2828.90 E: And Variety announced then that there was going to be this investigative docuseries about LuLaRoe, and it was going to come out in September of this year, which it did.

2828.90 2835.30 E: So Jenner Furst and Julia Willoughby Nason are the producers of the show LuLaRich.

2835.30 2842.18 E: And according to Variety, LuLaRich examines the pyramid scheme that was, and shockingly still is, LuLaRoe.

2842.98 2854.98 E: The explosive growth of the clothing company, which began as a multi-level marketing scam in which people, mostly women, sold leggings to one another while also signing up new retailers to be beneath them in the pyramid.

2854.98 2861.30 E: And it has played out, as so many evil things do on Facebook, among other social media platforms.

2861.30 2870.82 E: The docuseries features former LuLaRoe retailers and staffers as talking heads who have tried to dig themselves out from their ruined lives.

2870.82 2879.38 E: So there are those dreaded terms that skeptics have been familiar with for a very long time, pyramid scheme and multi-level marketing.

2879.38 2881.22 E: And we call those MLMs for short.

2881.22 2884.66 E: So a very quick review of those in case you are unfamiliar with the terms.

2885.22 2888.26 E: A pyramid scheme is a recruitment scam.

2888.26 2890.82 E: It starts usually with one person at the top.

2890.82 2895.62 E: That person convinces, for example, six other people to buy into the business.

2895.62 2896.66 E: That's your first level.

2896.66 2901.94 E: Then it becomes the job of those six people at level one, each of them to recruit six more people to buy in.

2901.94 2906.02 E: So that's six people from the first level trying to get a total of 36 people to buy in.

2906.02 2908.90 E: That's level two and so on and so forth.

2908.90 2912.98 E: And the six people at level one each get a cut of the 36 people from level two.

2912.98 2916.26 E: But also the original person at the top gets a cut.

2916.26 2918.02 E: Usually the lion's share.

2918.02 2920.98 E: Money continually flows back to the top of the pyramid.

2920.98 2922.58 E: It is unsustainable.
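The arithmetic Evan walks through — one founder, six recruits per person, 36 at the next level, and so on — can be sketched in a few lines of Python (this is an illustration of the math described above, not anything from the episode; the world-population figure is an assumed round number) to show why the scheme is unsustainable:

```python
# Pyramid growth: each level recruits six people per member, so level n
# holds 6**n people. Counting cumulative participants shows the scheme
# exhausts the entire world population after only about a dozen levels.

WORLD_POPULATION = 8_000_000_000  # assumed round figure

def pyramid_levels(branching: int = 6) -> int:
    """Return the first level at which cumulative participants
    exceed the world population."""
    level = 0
    total = 1      # the founder at the top
    recruits = 1
    while total <= WORLD_POPULATION:
        level += 1
        recruits *= branching   # 6, 36, 216, ...
        total += recruits
    return level

print(pyramid_levels())  # → 13: the scheme runs out of people by level 13
```

With six recruits per person, every new level needs six times as many buyers as the last, which is why money can only ever flow to the few at the top.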

2922.58 2929.78 E: Now, in a pyramid scheme, there's often no product or service being sold, or if it is, it's total crap and worthless.

2930.50 2934.42 E: But the recruitment of the others and getting their money is really the source of the revenue.

2934.42 2942.10 E: Now, MLMs are technically different because they do involve a tangible product such as leggings, for example.

2942.10 2950.90 E: But the MLMs rely on their recruits to make sales to friends and family members primarily, which are considered easy sympathetic targets.

2950.90 2958.26 E: It starts out as just a sale of the products to those friends and families, but the actual goal is to bring those friends and families on as the new sales force.

2958.26 2961.06 E: Am I hearing Tupperware parties?

2961.06 2963.30 C: Am I hearing Avon?

2963.30 2964.50 C: Oh yeah, the Avon.

2964.50 2965.70 E: Remember the Avon lady?

2965.70 2967.70 E: Amway.

2967.70 2971.46 E: Amway is probably one of the best known ones.

2971.46 2974.10 E: But here's the main point about MLMs.

2974.10 2989.06 E: According to a report that studied the business models of over 350 MLM companies in the United States, this was published on the Federal Trade Commission's website, at least 99% of people who join MLM companies lose money.

2989.06 2991.62 E: And it's really more like 99.6%.

2991.62 2992.18 E: Yep.

2992.18 2998.58 E: So, I mean, you're talking about everybody except for the people at the very point of the pyramid.

2998.58 3001.70 E: So in plain English, MLMs are a scam.

3001.70 3008.98 E: They're filled with promises that can never possibly be realized by anyone other than the one or the very few people at the top of the scheme.

3008.98 3024.26 E: So with that basic understanding, and I stumbled across a link to this article in Stylist.co.uk, and I admit that is not on my list of go-to sites to find news stories about science and skeptic topics.

3024.26 3025.14 E: But here's the headline.

3025.14 3031.70 E: MLMs in the UK, how multi-level marketing schemes are using feminist jargon to recruit women since the pandemic.

3032.26 3035.94 E: And they talk a lot about LuLaRoe and the LuLaRich series.

3035.94 3042.90 E: And that series reveals that their success was primarily built upon what they're calling the cheap language of feminism.

3042.90 3067.54 E: It inspired legions of women, often stay-at-home mothers, to elevate themselves into, and I'm putting air quotes in here, girl bosses and boss babes, using, you know, jargon like that to hype them up, you know, and all these sort of feminist ideas and sayings and other things designed specifically to get these stay-at-home moms to be part of this company.

3068.34 3075.54 E: One ex-distributor tells the watching audience that we were empowered, but then the husband was supposed to take over.

3076.82 3080.02 E: So, you know, what does that tell you?

3080.02 3083.06 E: They also said that I had achieved the dream.

3083.06 3085.14 E: I was selling magic leggings.

3085.14 3091.38 E: And in order for some women to do this, some women had to go so deep into poverty basically over this.

3091.38 3096.74 E: They were selling their breast milk so that they could afford the startup costs that were involved.

3096.74 3097.78 E: That's a horrible thought.

3098.34 3104.10 E: Basically, women were targeted and recruited using this feminist empowerment communication.

3104.10 3110.34 E: And if you visit the Lula Ro website, I mean, that's what you see and feel essentially on page one.

3110.34 3112.02 E: Very slick, very appealing website.

3112.02 3118.10 E: Bold and vibrant pictures of women of all ages and sizes and ethnic groups sporting the fashions that they are selling.

3118.10 3124.58 E: But it has this sort of goop-ish feel and tone to it, if you take my meaning there.

3125.62 3130.34 E: This company went from zero to 100 miles an hour very quickly.

3130.34 3137.30 E: In 2013, the married couple who started the scam launched this.

3137.30 3144.34 E: 70 million bucks by the fall of 2015, $2 billion by December of 2017.

3144.34 3148.58 E: But the Stylist article goes beyond just this Lula Ro story.

3148.58 3160.02 E: It explains how, particularly among women in the United Kingdom and especially since the pandemic, female involvement in MLMs is seeing a boom in participation.

3160.90 3167.30 E: These are some statistics from the Direct Selling Association whose members include well-known beauty and well-being brands.

3167.30 3176.18 E: They reveal that growth across many of their members as a result of the lockdown saw a 32% growth in the first quarter of 2021.

3176.18 3179.54 E: I suppose that is compared to the first quarter of 2020.

3180.74 3189.46 E: I also grabbed this data from the Advertising Standards Authority, the ASA, and that's the United Kingdom's independent regulator of advertising across media.

3190.18 3194.42 E: They say the most popular MLM products being peddled in the UK right now.

3194.42 3196.58 E: And you tell me if this isn't skewed towards women.

3197.14 3207.86 E: Weight control products, beauty and cosmetics, aromatherapy, skincare products, and number five on their list is CBD, cannabidiol products.

3208.50 3214.18 E: So the first four, the four out of the top five right there, all directly targeted at women.

3214.18 3216.18 C: I'd say the first top three.

3216.18 3216.90 C: I think weight loss.

3216.90 3217.38 C: Skincare products.

3217.38 3219.22 C: Weight loss is also targeted towards men.

3219.22 3219.78 E: You think so?

3219.78 3221.06 E: You think it's even?

3221.06 3221.78 C: Yeah, yeah, yeah.

3221.78 3222.10 C: Oh, yeah.

3222.10 3222.82 C: And exercise.

3222.82 3234.18 C: Yeah, because you've got to include in that like exercise equipment and all of the fads around weight loss have to do with both exercise and restriction of food.

3234.18 3238.58 C: So it's the diet pills, but also the stupid machines that you can buy at home.

3238.58 3239.38 C: Fair enough.

3239.38 3240.10 C: Yeah, fair enough.

3240.10 3240.58 C: But you're right.

3240.58 3241.30 C: You're right.

3241.30 3243.14 C: Three out of the five are very.

3243.14 3253.46 E: Now for full disclosure, guys, I looked at these numbers and I went back and looked at the history of MLM growth or actually, frankly, decline in the last 20 years.

3253.46 3256.18 E: It's been a pretty steady decline since the early 2000s.

3256.18 3258.98 E: It really peaked in the early 2000s and it's been declining.

3258.98 3271.86 E: For example, in the US in 2019, MLM direct sales shrank as a proportion of overall product sales in the US, and in the UK, by comparison to the US, that percentage was even lower.

3272.42 3273.62 E: So that's a good sign.

3273.62 3282.90 E: But the fact that MLMs seem to be on the rise again, you have to keep that in the context of the drop they had been experiencing for basically the prior 20 years.

3282.90 3287.46 E: So when they say 32 percent increase, you know, it's because it did tail down.

3287.46 3289.14 E: It's by no means at the peak again.

3289.14 3291.22 E: It's not back to those early 2000 levels.

3292.26 3293.94 E: But again, it is on the rise.

3293.94 3296.66 E: It is to be watched, certainly.

3296.66 3297.86 E: And it's not all doom and gloom.

3297.86 3299.54 E: And this is what I liked about this article as well.

3299.54 3304.66 E: There's a new generation of skeptics that have come in and of all places, you know, through the world of TikTok.

3304.66 3305.54 E: Oh, my gosh.

3305.54 3311.78 E: We think of TikTok as this place where pseudoscience and astrology and all sorts of things are wreaking havoc.

3311.78 3315.70 E: But there are actually people out there doing skeptical work.

3315.70 3318.26 E: Hattie Roe, she's a UK TikToker.

3318.26 3321.22 E: Her account is devoted to researching MLMs.

3321.22 3323.94 E: She has about 70,000 followers.

3323.94 3330.34 E: And she says in this article that the lockdown and increased job cuts had been used as a recruitment drive by those companies.

3330.34 3336.58 E: She said, I'd go on Facebook and be bombarded by posts and messages about how I can be a boss bitch.

3336.58 3339.06 E: And again, that's in quotes and earn money from home.

3339.06 3343.86 E: But during the pandemic, the amount of recruitment messages that she received increased massively.

3343.86 3345.22 E: So what did she do?

3345.22 3346.50 E: She decided to look into it.

3346.50 3354.18 E: She did research into the company, into these companies, and realized, oh, my gosh, this MLM, it's an absolute scam.

3354.90 3367.30 E: Now, she only started doing this a year ago, but she's basically gotten a reputation as kind of the go-to person who shares advice about her research and warns other people of the risks associated with MLMs.

3367.30 3373.62 E: So good on her and good on other people using the current social media platforms to do that kind of work.

3373.62 3376.02 E: And it's a ray of light, frankly, in the clouds.

3376.66 3384.18 S: It is always like it's a double insult, you know, when you're, you know, exploiting people on two levels simultaneously.

3384.18 3395.54 S: Three, really, you're selling them pseudoscience, you're selling them on the MLM scam model itself, and you're taking advantage of the fact that they're a vulnerable population.

3395.54 3396.66 S: It's a triple scam.

3396.66 3398.42 S: It's really, really disgusting.

3398.42 3404.74 E: And the path of destruction of all these people, the ninety-nine point six percent of people.

3404.74 3406.98 C: So often these people are low income as well.

3406.98 3407.78 C: I mean, that's right.

3407.78 3408.90 C: Yeah.

3408.90 3409.54 C: Oh, my gosh.

3409.54 3419.86 E: You have these people who are living paycheck to paycheck, and then they go out and actually get business loans or open all these credit card lines just so that they can buy into it, and they're never able to recoup their money.

Binge Watching (56:59)[edit]

3419.86 3423.22 S: All right, Cara, can I get addicted to binge watching TV?

3423.22 3423.62 C: Are you?

3425.86 3427.14 C: Should we compare notes?

3428.26 3431.30 C: I am in season 13 now of ER.

3434.26 3438.90 E: So I realized, with ER and all those shows, those were twenty-six-episode seasons.

3438.90 3442.26 C: Yeah, they were full season and hour long show.

3442.26 3442.58 E: Right.

3442.58 3445.14 E: Not well, like you're used to kind of today.

3445.14 3447.46 E: Twenty-six hour-long episodes every year.

3447.46 3450.98 C: Wow, I was like, this will be a nice pandemic project.

3451.54 3451.86 C: Wow.

3452.74 3460.98 C: Well, and so that is an important question, Steve, because I think there's long been a pretty vehement argument about what is the definition of addiction?

3460.98 3462.10 C: How do we define it?

3462.66 3484.74 C: Is addiction only something that is a chemical that binds to certain receptors in your brain that has a high affinity that's hard to unbind and causes all of these downstream effects because of that, or are there behaviors that can induce similar neural states to some of these different molecules that bind in your brain?

3484.74 3489.06 C: So, you know, I think the easiest thing when we talk about addiction is drug addiction.

3489.06 3491.46 C: It's like the cleanest way to talk about addiction.

3491.46 3493.14 C: It doesn't mean it's easy to fight.

3493.14 3498.26 C: It doesn't mean it's easy to recover from, but it's sort of one of the cleanest models for addiction.

3498.26 3504.66 C: Most of our addiction literature and also our addiction language is built around what we understand about drugs.

3504.66 3508.74 C: And again, that has to do with binding affinity and it has to do with downstream changes.

3509.78 3516.42 S: Yeah, we've spoken before on the show, Cara, about the distinction between an addiction and just a compulsive behavior.

3516.42 3528.58 C: Yeah, and there is sort of a whole range and a whole language that we use around compulsions, addictions.

3528.58 3535.86 C: And the truth is, I think, from a psychological perspective, so often we utilize something called the biopsychosocial model.

3535.86 3538.58 C: And the biopsychosocial model is exactly how it sounds.

3538.58 3554.66 C: It's threefold, that behavior, emotions, feelings, thoughts, cognitions, that these things are all sort of affected or influenced by our biology, by our psychology, and by our kind of social milieu.

3555.22 3564.66 C: And when it comes to addiction in terms of behavior, there's a researcher who has been studying this his entire life.

3564.66 3566.10 C: He's published quite a lot.

3564.66 3593.38 C: He's, let's see, a professor of behavioral addiction at Nottingham Trent University, and also, interestingly, the director of the International Gaming Research Unit, because of course, he focuses quite a lot on gambling behavior, which, if you've ever known anybody whose life was devastated by gambling, it's hard to argue that that person wasn't fighting against an addiction.

3593.38 3604.10 C: And so, you know, he throughout his career developed what he calls the component model of addiction.

3604.10 3628.98 C: So instead of looking at addiction from a purely biological perspective or neurobiological perspective and saying, you know, this chemical is binding to this receptor, and this is what happens downstream in the brain, he decided to take a step back and say, what are many of the components of addictive behavior, of something that, you know, somebody would say, I am struggling here, this is a really intense issue.

3628.98 3631.06 C: And he developed these six components.

3631.06 3634.42 C: And it's his view, and did I even say his name?

3634.42 3635.06 C: I don't think I did.

3635.06 3639.94 C: Dr. Mark Griffiths, probably important to give him his due.

3639.94 3643.86 C: So in his view, you kind of need all six of these in order to qualify.

3643.86 3652.18 C: And they include salience, mood modification, conflict, tolerance, withdrawal, and relapse.

3652.18 3653.30 C: And so let's go through them.

3654.02 3657.86 C: Salience, it becomes the thing that's out in front of everything else in your life.

3657.86 3667.14 C: So whether we're talking about gambling, whether we're talking about binge watching television, or whether we're talking about getting your next fix, it becomes the most important thing in your life.

3667.14 3668.42 C: It's out in front of everything else.

3668.42 3673.62 C: Mood modification, you use it to reliably modify or change your mood.

3673.62 3677.14 C: So this is where oftentimes we talk about people self medicating, right?

3677.14 3683.78 C: You're not feeling well, you're feeling depressed, you're feeling anxious, you're feeling, you know, fill in the blank.

3683.78 3689.38 C: When I engage in this behavior, at least in the short term, my mood is modified.

3689.38 3695.70 C: And I kind of return to what I think is a more comfortable baseline or a more manageable state of mind or feeling.

3695.70 3697.70 C: Number three, conflict.

3697.70 3704.18 C: It's starting to compromise important components of your life, like your work, your education, your relationships.

3704.18 3720.02 C: This is a factor that you see in the DSM, the Diagnostic and Statistical Manual, which is often kind of jokingly called the psychiatrist's or the psychologist's Bible, where almost every diagnosis requires this as a diagnostic criterion.

3720.02 3729.62 C: It's sort of starting to get in the way of healthy relationships, of your ability to do your work, of your ability to, you know, be involved in your education.

3731.38 3732.42 C: Tolerance.

3733.30 3736.98 C: Okay, just like with drug tolerance, you need more to get the same effect.

3738.02 3741.78 C: Here, you're increasing your binge watching behavior.

3741.78 3750.26 C: So you started slow, and now you need more and more each time you sit down or kind of on the whole, I guess you could say on average.

3750.90 3752.34 C: Number five, withdrawal.

3753.14 3756.42 C: So if you don't do it, you actually experience symptoms.

3756.42 3761.54 C: And of course, there is a distinction between physical withdrawal and psychological withdrawal.

3761.54 3763.62 C: We can make a firm distinction there.

3763.62 3768.98 C: There are substances which when you take away that substance after enough time, you can die.

3768.98 3770.66 C: Alcohol withdrawal can kill you.

3770.66 3775.94 C: Benzodiazepine withdrawal can cause pretty intense reactions that could lead to death.

3775.94 3791.54 C: And then there are the psychological symptoms of withdrawal. When we talk about opiate withdrawal, it's sort of a combination of the two: you feel quite ill, there are physiological things at play, but it also really messes with your head.

3791.54 3793.22 C: You desperately need the meds.

3793.22 3801.22 C: And it's not just because your receptors are crying out for the opiates, although they are; you're also psychologically hurting.

3801.22 3805.22 C: And you think and you know, you're going to feel better if you can get that hit.

3805.22 3812.74 C: And, you know, we can kind of add binge watching to that list, or at least we can conceptualize it within that framework.

3812.74 3814.66 C: And then the final one is relapse.

3814.66 3820.26 C: You might be able to quit, but when you engage again, you kind of fall right back into old patterns.

3821.22 3837.86 C: So this is like the difference between somebody who might drink mild to moderate amounts of alcohol socially and somebody who struggles with alcohol addiction. You know, I used to smoke when I was a teenager and on through my 20s.

3837.86 3843.22 C: And one of my big sort of pressures that I put on myself was that I can't even just have one.

3843.22 3846.58 C: Because if I open a pack of cigarettes, I will likely smoke the whole pack.

3847.38 3848.74 C: Like, I know this about myself.

3849.30 3853.06 C: And, you know, not everybody sort of acts in that way.

3853.06 3862.82 C: But that is what Dr. Griffiths here is saying is one of the six core components of this biopsychosocial model of addiction.
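The "all six must be present" rule Cara attributes to Griffiths' component model can be sketched as a simple checklist (my framing for illustration, not code from Griffiths or the episode):

```python
# Griffiths' component model, per the discussion above: a behavior
# qualifies as an addiction only if all six components are observed.

COMPONENTS = {"salience", "mood_modification", "conflict",
              "tolerance", "withdrawal", "relapse"}

def qualifies(observed: set) -> bool:
    """True only when every one of the six components is present."""
    return COMPONENTS <= observed  # subset test: all components observed

print(qualifies(COMPONENTS))                   # → True
print(qualifies(COMPONENTS - {"withdrawal"}))  # → False: five of six is not enough
```

The subset test captures the conjunctive nature of the model: missing even one component, the behavior may be a bad habit, but under this definition it is not an addiction.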

3862.82 3865.86 C: And so he wrote an article for The Conversation.

3865.86 3868.42 C: So he's like, let's grapple with this question.

3868.42 3869.86 C: Like, is it an addiction?

3869.86 3870.42 C: Is it not?

3870.42 3871.62 C: How do we define it?

3871.62 3872.42 C: What does it mean?

3872.42 3874.50 C: What does it mean to have an addictive personality?

3874.50 3875.86 C: What are these lines?

3875.86 3885.30 C: And ultimately, you know, obviously, from a med management perspective, I think that the difference between a physiological addiction and a psychological addiction are important.

3885.30 3888.50 C: Do we need to use drugs that reverse binding, right?

3888.50 3896.66 C: Do we need to use Narcan or Naloxone or Methadone or any of these different drug treatments?

3896.66 3903.46 C: But from a psychological behavioral treatment perspective, ultimately, does it qualify?

3903.46 3907.78 C: Does it kind of fulfill these six components or some variation on those themes?

3907.78 3908.90 C: If so, it is.

3908.90 3911.54 C: It is affecting somebody's life negatively.

3911.54 3917.22 C: And I think that it's kind of hard to ignore the fact that that is an addictive behavior.

3917.22 3918.90 C: And that could really run the gamut.

3918.90 3926.42 C: You know, historically, I think the DSM really grappled and tried to obsess about and still to this day, these categorical themes.

3926.42 3927.94 C: Oh, is it drug addiction?

3927.94 3929.38 C: Is it sex addiction?

3929.38 3930.90 C: Is it gambling addiction?

3930.90 3933.78 C: Is it, you know, and let's have a laundry list of things.

3933.78 3940.10 C: Well, basically, many things can be behaviorally addictive.

3940.10 3941.38 C: Depends on the person.

3941.38 3943.54 C: It depends on a lot of features about their life.

3943.54 3950.74 C: It depends on a lot of features about their brain, about their background, about their bio, psycho, social experience and standing.

3950.74 3954.90 C: But ultimately, is it out in front of everything?

3954.90 3965.54 C: Is it giving you these negative... you know, are you doing it even though you're experiencing these really negative downstream effects?

3965.54 3973.78 C: If so, you know, this researcher is saying, and I would say from a psychological perspective and from a treatment perspective, yeah, that's an addiction.

3973.78 3975.86 C: And it's something that you're going to want to work on.

3975.86 3978.82 C: There's a lot of studies, and this article is great.

3978.82 3982.26 C: I recommend people look into it if they're curious about this topic.

3982.26 3992.34 C: He does a nice review of the recent literature; you know, during COVID, binge watching is up.

3992.34 4006.58 C: What are some of the historical studies that have shown correlates of addictive behavior again, not causality, but correlates in terms of things like the big five personality traits, low conscientiousness, for example.

4006.58 4016.74 C: What are some of the other traits that are predictive of problematic or addictive binge watching depression, social anxiety, loneliness?

4016.74 4023.06 C: So there's a lot of research in this area and it's becoming a more important area of research.

4023.06 4023.62 C: And you know what?

4023.62 4029.14 C: Something that wasn't mentioned in the article that I was thinking about, Steve and everybody else, I'm curious about this.

4029.14 4048.50 C: So often these types of addictive behaviors come to pass specifically, or especially, I should say, in situations in which there's a marketing, a sort of social psychology, a selling to us of those behaviors.

4048.50 4053.62 C: You think about smoking, you think about drinking, you think about gambling, you even think about binge watching.

4053.62 4062.26 C: Everything is designed to keep us hitting play, to keep us lighting the next one, to keep pouring the drink, to keep, you know, pulling on the slot machine lever.

4062.26 4071.06 C: So it's interesting how often addiction is a response to these cues that are actively trying to get us to engage.

4071.06 4081.38 C: And that's not really something that was talked about much in the article, except when he gave some pretty good tips about how to try and get out in front if you find yourself having problematic binge watching behavior.

4081.38 4103.46 C: And so even if you don't qualify for an addiction according to his categorization, but you're saying this is getting to be a kind of bad habit, he suggests setting realistic daily limits. He has his own: he says for me, it's two and a half hours if I have work the next day, up to five hours if I don't, probably because we're all just at home.

4103.46 4107.06 C: You know, not a lot of people are leaving their houses or doing much.

4107.06 4117.50 C: He also says, you know, make it a reward like only after you finished your work, only after you've engaged in your social obligations, after you finished the email, sent the texts, whatever.

4117.50 4125.02 C: And then one recommendation, which is a hard one, is to leave in the middle of the episode: pause the episode, let's say 30 minutes in.

4125.02 4132.90 C: And that's when you stop watching instead of waiting until the end when there's going to be a cliffhanger when you're going to be desperate to see what comes next.

4132.90 4135.70 C: That's one of my big problems with binge watching.

4135.70 4137.58 C: I find myself going, no, no, just one more.

4137.58 4139.54 C: Just one more because I'm like, I got to know what happens.

4139.54 4140.54 C: That's how they get you.

4140.54 4141.54 C: Yeah.

4141.54 4143.20 C: And that's the point I was making, right?

4143.20 4144.20 C: They get you.

4144.20 4146.14 C: You know, this is by design.

4146.14 4149.20 C: Yeah, it's super formulaic.

4149.20 4156.78 C: So I'm curious, as a physician and a neuroscientist to boot, how do you grapple with this?

4156.78 4157.78 S: Are you talking to me?

4157.78 4158.78 C: Yeah, I'm talking to you.

4158.78 4161.42 C: Do you grapple with these labels?

4161.42 4162.42 C: Like are you still like, man?

4162.42 4163.42 C: I wouldn't call that addiction.

4163.42 4167.94 S: I do think it's important to be careful. This is a thing where colloquially, it's fine.

4167.94 4170.74 S: Technically, we need to be very precise about what we're talking about.

4170.74 4176.82 S: And I do think, you know, we interviewed the sex addiction specialist who's like, it's a behavior.

4176.82 4177.96 S: It's a choice that you're making.

4177.96 4181.14 S: It's not addictive in the same way that medications are addictive.

4181.14 4184.58 S: But still, yes, it has these features as a compulsive behavior.

4184.58 4190.38 S: And we need to talk about the things that are leading you to this behavior, leading you to these choices.

4190.38 4203.54 S: But saying it's an addiction does kind of make it into a passive thing rather than an active thing, meaning that it's something that's happening to you rather than something you're doing.

4203.54 4205.02 S: And that may be counterproductive.

4205.02 4220.58 C: Yeah, I think, you know, so often there are these ideas that become codified in sort of the literature or even in our common parlance, like this idea of addiction as a disease.

4220.58 4221.58 C: Right.

4221.58 4222.82 C: Oh, well, addiction is a disease.

4222.82 4225.58 C: And I think that's a big part of this, you know, and so it's something I have.

4225.58 4231.38 C: It's something I caught, and not necessarily something where I have a lot of will.

4231.38 4236.38 C: And I think that works for some people and it helps some people kick it and it doesn't work for others.

4236.38 4238.50 C: And so often there's not a one size fits all.

4238.50 4245.64 C: So although I agree with you that from a sort of physician's perspective, it might be important to make those distinctions.

4245.64 4256.68 C: I think from a psych perspective, which is where most people are getting treatment for addiction, although psychiatric treatment and neurological treatment are part of the game.

4256.68 4258.34 C: To me, the distinction doesn't matter.

4258.34 4259.62 C: Is it impacting your life?

4259.62 4261.22 C: How do we work on it if it is?

4261.22 4262.22 S: Yeah.

4262.22 4263.22 S: Oh, yeah.

4263.22 4264.22 S: I mean, from a practical point of view.

4264.22 4266.62 S: And I've noticed that, you know, during my career, the language has changed.

4266.62 4270.82 S: Now it's a substance use disorder.

4270.82 4272.82 S: You know, so it's like you're not an addict.

4272.82 4274.82 S: You suffer from a substance use disorder.

4274.82 4278.74 S: So again, that goes back to the nonjudgmental thing.

4278.74 4279.74 C: Yeah.

4279.74 4280.74 C: Reducing the stigma around these things.

4280.74 4281.74 S: Yeah.

4281.74 4282.86 S: This is this doesn't define you.

4282.86 4284.34 S: This isn't you.

4284.34 4287.86 S: This is just a disorder that we will treat together.

4287.86 4288.86 S: Yeah.

Asteroid Monitoring (1:11:28)[edit]

4288.86 4296.46 S: All right, Bob, finish us up with a discussion of detecting asteroids that are going to smash into the earth and kill everybody.

4296.46 4297.46 B: Yes.

4297.46 4311.12 B: Actually, this just cheered me up a little bit, because we seem to be making some progress, at least in doing our due diligence to find asteroids that could potentially be dangerous to us or to Earth.

4311.12 4321.92 B: This is a new impact monitoring system that's now online that could calculate deadly asteroid orbits better than the one that's been calculating for us for the past 20 years.

4321.92 4328.34 B: There's a new study describing this system, called Sentry 2, and it was published in the Astronomical Journal this past December 1st.

4328.34 4334.70 B: So let's first talk about what Sentry 1 had been doing ever since 2002.

4334.70 4339.62 B: So the Jet Propulsion Laboratory, JPL, developed the software almost 20 years ago.

4339.62 4348.06 B: What it can do, in less than 60 minutes basically, is accurately tell you the orbit of a newly discovered asteroid for the next 100 years.

4348.06 4350.90 B: And of course, whether it would hit the earth or not.

4350.90 4358.54 B: Now this wasn't a part time job for the software either, because we've detected something like 28,000 near-Earth asteroids, NEAs.

4358.54 4363.66 B: 28,000 have been found, and we find an average of eight more every day.

4363.66 4367.98 B: 3,000 a year, every year, every day, eight more, eight more, eight more.

4370.74 4381.30 E: And how near is near, Bob, in this context?

4370.74 4381.30 B: There's different classes, but near Earth is within, relatively near, within millions of miles, many millions of miles, which is actually kind of close if you think about it.

4381.30 4391.02 B: And so with this clearly scary situation, as I describe it, JPL manages the Center for Near-Earth Object Studies, CNEOS.

4391.02 4398.30 B: So now it's their job to run the numbers on every orbit for every near-Earth asteroid that's discovered.

4398.30 4407.86 B: And it does this to support an office, probably the coolest office name ever, NASA's Planetary Defense Coordination Office.

4407.86 4409.30 B: And I think we've talked about that, haven't we?

4409.30 4410.86 B: I think we talked about that once.

4410.86 4414.70 B: It's like my favorite office of all time, right?

4414.70 4415.98 B: Planetary Defense.

4415.98 4417.78 B: It's like that, what's that company?

4417.78 4418.78 B: General Atomics.

4418.78 4422.18 B: It's just such a great 1950s ring to it.

4422.18 4426.98 B: So what they do is they provide early detection for potentially hazardous objects.

4426.98 4427.98 B: That's PHOs.

4427.98 4431.70 B: And okay, so PHOs, they're within five million miles.

4431.70 4436.24 B: If you're within five million miles, that's a PHO, which is a type of NEA.

4436.24 4442.78 B: But you're clearly much more hazardous because you're only five million miles away within that.

4442.78 4448.26 B: So that means, Evan, that near-Earth asteroids, I think, are farther than five million, obviously.

4448.26 4460.62 B: This Planetary Defense Office also categorizes PHOs as 30 to 50 meters or bigger, because when you get to that size, at those speeds, the kinetic energy can do significant damage.

4460.62 4467.02 B: This office also tracks and characterizes these PHOs and they issue warnings and stuff like that.

4467.02 4468.30 B: So very, very cool.

4468.30 4477.08 B: So the Center for Near-Earth Object Studies, they help the Planetary Defense Coordination Office and they use this sentry software to assess the asteroid orbits, okay?

4477.08 4478.58 B: And they've been doing it for years.

4478.58 4490.46 B: But the rate of discovery is going to be going up very, very soon as new and much more powerful survey telescopes, or as my niece used to call them, skeletopes.

4490.46 4493.06 B: That was such an awesome mispronunciation.

4493.06 4512.34 B: So these new powerful telescopes are going to start looking for near-Earth asteroids and they anticipate an influx of newly discovered asteroids and we need to be able to keep up with that and we need to also be even more accurate because it's going to be so many that the chances, you know, we could find something that could potentially be nasty.

4512.34 4515.54 B: So they came out with Sentry 2, which is essentially an upgrade.

4515.54 4517.78 B: It's an upgrade to the Sentry software.

4517.78 4526.10 B: So Sentry 2, obviously like Sentry 1, can accurately calculate orbits based on gravitational interactions, right?

4526.10 4527.10 B: That's key.

4527.10 4529.94 B: You might think that that's what orbits are all about, right?

4529.94 4537.08 B: Gravitational interactions and you'd be mostly right because the asteroid, any given asteroid will interact gravitationally mainly with the sun, right?

4537.08 4544.32 B: It's interacting with the sun gravitationally, but the orbit of the asteroid could also be impacted by other planets, right?

4544.32 4546.40 B: And including the Earth, of course.

4546.40 4554.78 B: So that plays in too, but there are non-gravitational interactions that can also impact the orbit.

4554.78 4560.08 B: They're not based on gravity at all and Sentry 1 could not handle them.

4560.08 4562.10 B: And what do you think that might be guys?

4562.10 4565.38 B: What would impact the orbit that's not gravitational?

4565.38 4566.38 C: Solar wind?

4566.38 4567.38 B: Oh, have you?

4567.38 4569.56 B: Kind of, kind of.

4569.56 4571.04 B: It's called the Yarkovsky effect.

4571.04 4575.18 B: So now imagine an asteroid spinning, which, as you know, they do.

4575.18 4583.82 B: So as it's spinning, the side that was heated by the sun eventually spins away from the sun, faces the opposite direction, and cools down.

4583.82 4597.52 B: So that infrared energy is released as it cools, and that minute thermal energy acts as a force, kind of like a little mini engine, a little bit of thrust that over time can actually change the orbit of an asteroid.

4597.52 4600.34 B: Now day to day, of course, that is negligible.

4600.34 4603.02 B: You could just completely discount it fully.

4603.02 4611.54 B: But after decades or centuries, this Yarkovsky effect can actually make dramatic changes to an orbit, and they're very, very difficult to calculate.
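
As a rough sanity check on the "tiny thrust, big cumulative effect" point, here is a toy order-of-magnitude sketch. Everything in it is an illustrative assumption, in particular the 100-meter rocky body and the guessed 1% net asymmetry in the re-radiated photon momentum, which stands in for the real dependence on spin, obliquity, and thermal inertia; this is not a real Yarkovsky model.

```python
import math

SOLAR_FLUX = 1361.0   # W/m^2 at 1 AU
C = 3.0e8             # speed of light, m/s

def yarkovsky_delta_v(radius_m, density, asymmetry, seconds):
    """Net velocity change from asymmetric thermal re-emission.

    `asymmetry` is the assumed fraction of re-radiated photon momentum
    that ends up as net thrust (an illustrative stand-in parameter).
    """
    mass = (4.0 / 3.0) * math.pi * radius_m**3 * density
    absorbed_power = SOLAR_FLUX * math.pi * radius_m**2
    force = asymmetry * absorbed_power / C   # photon momentum flux
    accel = force / mass
    return accel * seconds

century = 100 * 365.25 * 86400
dv = yarkovsky_delta_v(radius_m=50, density=2000, asymmetry=0.01,
                       seconds=century)
# Negligible on any given day, but around a millimeter per second after
# a century, which is enough to shift a close approach over time.
print(f"~{dv * 1000:.2f} mm/s over a century")
```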

4611.54 4619.58 B: So difficult that Sentry 1, which had some very slick mathematical algorithms, couldn't calculate it.

4619.58 4622.22 B: But this is what Sentry 2 will be able to do.

4622.22 4624.36 B: Now, do you guys remember Apophis, right?

4624.36 4630.06 B: That's probably one of the most famous, yeah, one of the most famous asteroids out there.

4630.06 4645.02 B: And mainly because for a little while there, we actually weren't sure whether it was going to hit the Earth at some point, more around 2068, when it came back again after the 2029 pass, was it?

4645.02 4649.02 B: So we weren't really sure because we couldn't rule out that impact in 2068.

4649.02 4651.78 B: And that's why so many people actually know about Apophis.

4651.78 4658.02 B: Now the reason why we couldn't rule it out at that time was mainly because of this Yarkovsky effect.

4658.02 4665.62 B: They hadn't fully nailed down the impact of the thermal energy that's hitting that asteroid.

4665.62 4676.04 B: Of course, we do know now; we have fully fleshed out the Yarkovsky effect on Apophis, and we know that Apophis is not going to hit us for at least well over 100 years.

4676.04 4679.24 B: So don't really, don't worry about Apophis.

4679.24 4681.28 B: But that's the Yarkovsky effect right there.

4681.28 4685.14 B: That's why people were so scared of Apophis because we hadn't really nailed it yet.

4685.14 4691.34 B: So David Farnocchia, a navigation engineer at JPL, also helped develop Sentry 2.

4691.34 4695.58 B: He said, the fact that Sentry couldn't automatically handle the Yarkovsky effect was a limitation.

4695.58 4705.98 B: Every time we came across a special case like asteroids Apophis, Bennu and 1950 DA, we had to do complex and time consuming manual analyses.

4705.98 4708.42 B: With Sentry 2, we don't have to do that anymore.

4708.42 4712.90 B: So that's a key advantage of Sentry 2 over Sentry 1.

4712.90 4717.86 B: Okay, now Sentry 1 couldn't calculate the orbital changes due to the Yarkovsky effect.

4717.86 4723.60 B: It also could not deal with asteroid orbits that got really close to the Earth.

4723.60 4736.16 B: When you have an asteroid that happens to get really, really close to the Earth and is gravitationally affected much more than normal, it creates all sorts of uncertainties in the future orbit of the asteroid.

4736.16 4744.34 B: So it makes it much, much harder to predict what's going to happen after the gravitational interaction with the Earth at such close range.

4744.34 4746.22 B: You guys have probably heard of keyholes.

4746.22 4765.82 B: So those are specific areas in space such that, if an asteroid went through the keyhole in that specific position, then its next orbit or a subsequent orbit would hit the Earth.

4765.82 4768.98 B: So you'll often hear about, you know, an asteroid going through a keyhole.

4768.98 4782.04 B: If it goes through that keyhole, we're kind of screwed in 20 years or 30 years or a hundred years, whatever it is, when it comes back, because that keyhole changed its orbit just enough so that next time it comes around, it gives us a whack.

4782.04 4783.94 B: So that's kind of related to that.
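
The keyhole idea described above boils down to an orbital resonance: the close approach tweaks the asteroid's period, and if the new period is a near-rational multiple of Earth's year, both bodies arrive back at the encounter point together. Here is a minimal sketch; the tolerance and the sample period values are illustrative assumptions, and the 7/6 case simply mirrors the kind of resonant return discussed for Apophis's 2029 flyby.

```python
# Toy resonant-return check: after a flyby leaves the asteroid with
# orbital period `period_years`, find the smallest whole number of
# years after which it is back at the encounter point along with Earth.

def resonant_return(period_years, max_orbits=20, tol=0.01):
    """Return the number of years to a near-resonant return, or None
    if no near-resonance shows up within `max_orbits` asteroid orbits."""
    for orbits in range(1, max_orbits + 1):
        elapsed = orbits * period_years
        if abs(elapsed - round(elapsed)) < tol:
            return round(elapsed)
    return None

print(resonant_return(7 / 6))  # 6 orbits in 7 years: a resonant return
print(resonant_return(1.13))   # no clean resonance within 20 orbits: None
```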

4783.94 4787.26 B: Now, Sentry 1 could not calculate that.

4787.26 4796.34 B: If an asteroid was going to come really close to the Earth, Sentry 1 could not calculate with high confidence whether it could hit the Earth in the future.

4796.34 4801.30 B: Sentry 2 can do that, so that's another huge plus for Sentry 2.

4801.30 4810.66 B: And Sentry 2 is so exquisitely sensitive that it could tell you the odds of an impact, even if it's as low as three chances in 10 million.

4810.66 4812.38 B: So it's very, very sensitive.
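
As JPL describes it, Sentry-II's headline change is sampling points across the whole orbital uncertainty region, so odds like "three chances in 10 million" come from counting how many virtual clones of the asteroid end up hitting. The sketch below compresses that idea drastically: a one-dimensional Gaussian miss distance is an assumption for illustration, whereas the real calculation works over a multidimensional distribution of orbital elements.

```python
import random

EARTH_RADIUS_KM = 6371.0

def impact_probability(mean_miss_km, sigma_km, n_clones, seed=0):
    """Monte Carlo sketch: sample virtual clones of the asteroid from a
    1-D miss-distance uncertainty and return the fraction that hit."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_clones):
        # One clone: a plausible miss distance given the uncertainty.
        miss = rng.gauss(mean_miss_km, sigma_km)
        if abs(miss) < EARTH_RADIUS_KM:
            hits += 1
    return hits / n_clones

# A well-constrained orbit passing comfortably wide: essentially zero hits.
print(impact_probability(400_000, 50_000, 100_000))
# A sloppier orbit whose error region brushes the Earth: small but nonzero.
print(impact_probability(100_000, 40_000, 100_000))
```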

4812.38 4819.74 B: And so I do feel a little safer now, but I won't be happy until we have that deflective beam obelisk from that classic Trek episode.

4819.74 4820.74 B: Yes.

4820.74 4821.74 B: What was it called?

4821.74 4823.62 B: The Paradise Syndrome?

4823.62 4830.34 B: That would be nice, but I don't think we're going to create a deflective beam obelisk in the ever, actually ever.

4830.34 4831.34 C: In the ever.

4831.34 4832.34 S: Right.

4832.34 4836.14 S: And so this is obviously a two-pronged approach here.

4836.14 4840.78 S: We need to detect the asteroids and we need to work on the technology to deflect them.

4840.78 4841.78 B: Right.

4841.78 4847.42 B: And the key is not to just detect it, but to detect it so far in the future that we have time to deal with it.

AD (1:20:47)[edit]

4847.42 4852.86 S: Well, everyone, we're going to take a quick break from our show to talk about one of our sponsors this week, KiwiCo.

4852.86 4861.10 C: KiwiCo creates super cool hands-on projects designed to expose kids of all ages to concepts in science, technology, engineering, art, and math.

4861.10 4869.48 C: And this year, why not make your holidays a little bit less prepackaged, a little bit more hands-on, and hey, how about some more science?

4869.48 4878.06 C: With KiwiCo, kids can discover the engineering and mechanics behind everyday objects, the science and chemistry of cooking, brand new art and design techniques, and so much more.

4878.06 4888.18 C: And of course, each crate is designed by experts, tested by kids, and each line caters to different age groups, all the way from teeny tiny babies up to big kids like me.

4888.18 4890.74 J: Yeah, Cara, they have holiday projects.

4890.74 4893.26 J: Now, these are a lot of fun.

4893.26 4897.10 J: This is something that you could use to get into the season with your children.

4897.10 4901.90 J: One of the ones that I really love is the ice skating rink, which is really neat.

4901.90 4906.14 J: The ice skating rink actually spins and the figures on it move around a little bit.

4906.14 4913.74 J: And it's just a really cool thing that you can build with your child or your friend or your neighbor or your friend's child or maybe your niece or nephew.

4913.74 4918.48 E: So this holiday, don't just teach kids how to buy, teach them how to build.

4918.48 4926.06 E: Give them a gift of hands-on holiday with a KiwiCo subscription and celebrate a love for hands-on learning all year long.

4926.06 4932.22 E: Get 50% off your first month plus free shipping on any crate line with code SKEPTICS.

4932.22 4938.54 E: That's 50% off of your first month at KiwiCo.com, promo code SKEPTICS.

4938.54 4941.06 S: All right, guys, let's get back to the show.

Who's That Noisy? ()[edit]


4941.06 4942.86 S: Jay, it's Who's That Noisy Time.

4942.86 4947.46 J: All right, guys, last week I played this noisy.

4947.46 4955.22 J: All right, any guesses, guys?

4955.22 4957.26 B: Yes, Steve singing in the shower.

4957.26 4958.26 B: Oh my God.

4958.26 4960.42 S: I don't sing that well.

4960.42 4962.50 C: Steve has become a bird.

4962.50 4964.98 C: He has watched them so long.

4964.98 4969.70 E: A bird imitating Steve singing in the shower.

4969.70 4977.46 J: I typically get some Who's That Noisy response every week where they say, like, it's Steve doing blah, blah, blah.

4977.46 4979.18 J: It's a Steve related noise.

4979.18 4986.54 J: First person that sent in a guess, Michael Blaney, and Michael said, hi, Jay, the trilling sound is very cool.

4986.54 4999.06 J: It's either a fluffy, endlessly replicating device Kirk could use to identify a Klingon, or, more likely, it reminds me of the song of the Tui, pronounced like the number two followed by the letter E. This is a bird in New Zealand.

4999.06 5000.06 J: Ciao, Michael.

5000.06 5001.70 J: I think we saw one of those in New Zealand.

5001.70 5002.94 J: I thought we did.

5002.94 5003.98 J: That's not correct.

5003.98 5006.50 J: But you know, this definitely sounds like a bird noise.

5006.50 5009.20 J: So that's not a bad guess.

5009.20 5015.48 J: Another listener named Tassie wrote in and said, I think this week's who's that noisy is a budgie.

5015.48 5021.46 J: It's called a budgerigar from Australia, home of myself and Visto Tutti.

5021.46 5023.32 J: That is not correct.

5023.32 5024.32 J: Another bird.

5024.32 5029.66 J: But you know, like I said, going with a bird with that type of noise, I couldn't put you down for that at all.

5029.66 5037.06 J: Kay Dingwell wrote in and said, is this week's Who's That Noisy a parrot, specifically a cockatiel singing along to a carol?

5037.06 5048.74 J: My nearly 25 year old cockatiel seems to think it is because he was trying to call back to the clip, which I think is very funny that there was a bird somewhere that was responding to my who's that noisy.

5048.74 5049.90 J: It's not a bird.

5049.90 5050.90 J: So check this out again.

5050.90 5062.90 J: This is a llama doing kind of like an alert call to the other llamas.

5062.90 5064.94 J: I mean, it is so a bird noise.

5064.94 5071.04 S: If I had a llama, I would have to call it Dolly.

5071.04 5072.78 S: One of you.

5072.78 5073.78 S: Sorry.

5073.78 5074.78 C: Delayed reaction.

5074.78 5078.86 J: Yeah, I've heard that one before.

5078.86 5080.14 J: There was no winner.

5080.14 5082.54 J: Again, Jay, you're failing.

5082.54 5085.46 J: And now I'm picking ones that I think are really easy.

5085.46 5089.50 J: It must be like the holidays are coming up and I'm losing my mojo over here.

5089.50 5092.38 J: But no, you know that look, you can't win every time, right?

5092.38 5095.74 J: Everybody does not get a reward or an award.

5095.74 5098.94 J: What's the difference between a reward and a award?

5098.94 5105.42 S: A reward is for doing something, for behavior, and an award is for winning something.

5105.42 5109.02 E: Earned versus given.

5109.02 5112.14 C: Isn't an award actually like a physical thing to you?

5112.14 5116.62 B: Yeah, it's more like an object, like a physical thing, like...

5116.62 5118.94 C: a certificate or a trophy or something.

5118.94 5125.94 J: Anyway, so I will throw in a couple of what I consider to be very, very easy ones for you guys.

5125.94 5126.94 J: But thank you.

5126.94 5129.90 J: Yeah, this was an interesting noise and I hope you guys learned something.

5129.90 5130.90 J: I certainly did.

5130.90 5135.50 J: I would never have thought a large land animal would make a noise like that.

New Noisy ()[edit]

[_short_vague_description_of_Noisy]

5135.50 5140.54 J: So this week's new noisy was sent in by a listener named Robert House.

5140.54 5145.62 J: And I'm just going to give it to you right now.

5145.62 5154.42 J: If you think you know the noisy, or if you think that you have a good noisy, email me at WTN@theskepticsguide.org.

short_text_from_transcript

Announcements (1:25:54)[edit]

5154.42 5155.42 J: Hey, Steve.

5155.42 5156.42 J: Yeah, Jay.

5156.42 5157.42 J: I've got one announcement.

5157.42 5165.46 J: OK, you could go to the Skeptics Guide shop and you could buy some SGU exclusive gear.

5165.46 5166.46 J: I'm adding new stuff.

5166.46 5168.70 J: So if you've been there before and you're hearing this, go again.

5168.70 5171.14 J: So I'll be adding new things before this show airs.

5171.14 5175.70 J: I'll have some cool stuff in there, some things that I think people might want for the holidays.

5175.70 5182.30 J: So please do go to our website and just click the shop link and you will be brought right there and you can help us by helping you.

5182.30 5183.30 J: All right.

5183.30 5184.30 S: Thanks, brother.

Name That Quote (1:26:24)[edit]

5184.30 5187.90 S: So let's do the name that quote bit again that you did previously.

5187.90 5191.22 E: Yeah, I guess I'm going to have to come up with a name for this segment.

5191.22 5198.54 E: I wrote down a couple of possibilities. Let me run those by you real quick and see what you think.

5198.54 5200.30 E: Oh, a couple of them are, of course.

5200.30 5201.46 E: Can you give me the pun?

5201.46 5202.46 E: I want the funniest one.

5202.46 5203.46 E: All right.

5203.46 5204.46 E: You want the funniest one?

5204.46 5208.10 E: It's probably Quotum Physics.

5208.10 5213.74 E: I wasn't going to lead with that one.

5213.74 5217.98 E: I also have Rock the Quote, you know, a take on Rock the Vote.

5217.98 5220.82 E: But the other ones are, you know, a little more straightforward.

5220.82 5229.98 E: There's Potent Quotables, Quotation Marks, And I Quote, and the knockoff borrowed title, Who's That Quote?

5229.98 5230.98 E: So those are some thoughts.

5230.98 5238.84 E: If you guys have some thoughts about what the segment should be if we choose to continue to do this every once in a while, I'd be happy to take any and all suggestions.

5238.84 5239.84 E: But let's get on with it.

5239.84 5240.84 E: I've arranged five quotes.

5240.84 5241.84 E: How about quoting the quote face?

5241.84 5246.30 E: Quoting the quote face.

5246.30 5249.14 E: I prepared five quotes for all of you.

5249.14 5257.20 E: We're going to play a game in which my four fellow rogues are going to each give a guess as to who they think said each of these quotes.

5257.20 5258.68 E: And it is multiple choice.

5258.68 5262.42 E: So three choices per quote.

5262.42 5263.42 E: Ready?

5263.42 5265.14 E: Number one, here's the quote.

5265.14 5268.38 E: The thing I'll remember most about the flight is that it was fun.

5268.38 5273.30 E: In fact, I'm sure it was the most fun that I'll ever have in my life.

5273.30 5274.30 E: So who said that?

5274.30 5280.74 E: Was it Jeff Bezos, Sally Ride, or Chris Hadfield?

5280.74 5283.38 E: Bob, we're going to start with you.

5283.38 5284.66 E: Who do you think the guess is?

5284.66 5285.66 E: Who are you going to guess?

5285.66 5286.66 E: I'd say Hadfield.

5286.66 5287.66 E: Okay.

5287.66 5288.66 S: Steve.

5288.66 5290.66 S: Yeah, I was going to say Hadfield too.

5290.66 5291.66 E: Okay.

5291.66 5292.66 C: Cara.

5292.66 5295.70 C: I guess I'll depart and say it was Sally Ride.

5295.70 5296.70 E: Okay.

5296.70 5297.70 E: And Jay.

5297.70 5298.70 J: I'll start with Cara.

5298.70 5300.30 J: So you're also saying Sally Ride?

5300.30 5301.30 E: Yes.

5301.30 5304.46 E: And the correct answer is Sally Ride.

5304.46 5305.46 E: Yay!

5305.46 5306.46 E: There it is.

5306.46 5307.46 E: Good for Cara.

5307.46 5308.46 S: Good for Jay.

5308.46 5309.46 S: That was my second choice.

5309.46 5310.46 S: Not Jeff Bezos.

5310.46 5311.46 S: He was too easy.

5311.46 5312.46 S: Well, you know.

5312.46 5313.46 S: All right.

5313.46 5314.46 S: Quote number two.

5314.46 5315.46 E: To eat is human.

5315.46 5320.46 E: To digest, divine.

5320.46 5322.58 E: Who said or wrote that?

5322.58 5324.14 E: I should put it that way.

5324.14 5331.42 E: The three choices are Julia Child, Epicurus, or Mark Twain.

5331.42 5333.46 E: This time Jay will start with you.

5333.46 5334.46 E: Epicurus.

5334.46 5335.46 E: And Cara.

5335.46 5337.46 C: I'm going to say it was Mark Twain.

5337.46 5339.10 E: Cara says Mark Twain.

5339.10 5340.10 E: Let's go with Steve.

5340.10 5342.50 S: Yeah, I mean, it sounds like Mark Twain.

5342.50 5345.78 S: The reason why I wouldn't say him is because everything sounds like Mark Twain.

5345.78 5346.78 S: Right.

5346.78 5347.78 S: That's right.

5347.78 5352.98 S: So, and it's a little bit, I don't know, could that be Julia Child?

5352.98 5355.46 S: It sounds a little too clever for her.

5355.46 5359.14 S: Maybe I'll go for the radical one and say Julia Child.

5359.14 5360.14 S: Okay.

5360.14 5361.14 B: Would she say the opposite?

5361.14 5362.14 B: What?

5362.14 5366.90 B: To digest is human and to eat is divine?

5366.90 5367.90 E: And Bob, who are you guessing?

5367.90 5368.90 E: Twain.

5368.90 5369.90 E: Twain.

5369.90 5370.90 E: Okay.

5370.90 5372.22 E: The answer is Mark Twain.

5372.22 5373.70 E: So Cara and Bob.

5373.70 5375.26 C: Yeah, because it's so witty.

5375.26 5376.26 E: Got that correct.

5376.26 5380.94 E: And yes, I did confirm it, because you have to look up every Mark Twain quote you come across.

5380.94 5382.62 S: So much falsely attributed to him.

5382.62 5388.22 E: This is directly from the Mark Twain webpage, the curator herself.

5388.22 5389.62 E: So it is solid.

5389.62 5390.62 E: Okay.

5390.62 5392.66 E: Next, here's the quote.

5392.66 5395.74 E: Man is still the most extraordinary computer of all.

5395.74 5397.62 E: Who said that?

5397.62 5402.62 E: Was it Alan Turing, Steve Jobs, or John F. Kennedy?

5402.62 5403.94 E: Let's start with Steve.

5403.94 5404.94 E: Jobs.

5404.94 5405.94 E: Next is Cara.

5405.94 5408.66 C: Yeah, I was thinking Jobs also.

5408.66 5409.90 E: Next is Jay.

5409.90 5410.90 B: Steve Jobs.

5410.90 5411.90 B: And Bob.

5411.90 5412.90 B: These don't make sense.

5412.90 5415.90 B: It's got to be Jobs, but that's probably where I'm going to be wrong.

5415.90 5416.90 E: Okay.

5416.90 5417.90 E: And the answer is John F. Kennedy.

5417.90 5418.90 E: Oh, no way.

5418.90 5419.90 C: Why the fuck would he say that?

5419.90 5420.90 C: Interesting.

5420.90 5421.90 E: From a speech he made in 1963.

5421.90 5422.90 E: He didn't say that.

5422.90 5427.62 C: It does make sense that he would be calling people computers in 1963, though.

5427.62 5431.50 S: But the thing is, we didn't call computers computers in 1963.

5431.50 5432.50 S: That's my problem with that.

5432.50 5437.72 C: Well, it might be that computer was just starting to be called that, and that's why he said that.

5437.72 5438.72 C: Wait, wait, wait.

5438.72 5439.72 C: When was the transition from people as computers to computers?

5439.72 5441.38 E: But the computer was a person, right?

5441.38 5442.38 E: I mean, like from Hidden Figures?

5442.38 5443.38 B: Yeah.

5443.38 5445.86 B: But that was like as well before the 60s, though, wasn't it?

5445.86 5450.14 E: Because even in Dr. Strangelove, which was produced in 1962, they refer to computers.

5450.14 5451.34 E: Oh, okay.

5451.34 5453.90 C: So it's more like the 40s that they were calling computers.

5453.90 5455.90 S: Yeah, I guess that's earlier than I thought.

5455.90 5458.10 B: Yeah, but that's why it didn't make sense.

5458.10 5459.90 E: So I swept you on that one.

5459.90 5460.90 E: Two more to go.

5460.90 5461.90 E: Here we go.

5461.90 5462.90 E: Next quote.

5462.90 5466.72 E: When you look at the ingredients, if you can't spell or pronounce it, you probably shouldn't eat it.

5466.72 5469.14 E: Was that Dr. Oz?

5469.14 5470.66 E: Vani Hari?

5470.66 5472.82 E: Or Gordon Ramsay?

5472.82 5474.90 E: Let's start with Jay.

5474.90 5476.70 J: Who is the middle person?

5476.70 5478.18 E: Vani Hari.

5478.18 5479.18 E: The Food Babe.

5479.18 5480.18 J: Food babe.

5480.18 5481.18 J: Oh, yeah.

5481.18 5482.26 J: It was her for sure.

5482.26 5483.26 C: And Cara?

5483.26 5486.34 C: I assumed it was the food babe, but I wouldn't put it past Dr. Oz.

5486.34 5491.42 C: So I'm going to go out on a limb and say that idiot running for the Senate is the one who said that.

5491.42 5492.42 C: Steve?

5492.42 5498.02 S: I mean, I know the food babe said, if not that, something almost exactly identical to that.

5498.02 5503.02 B: And I'll say definitely Vani.

5503.02 5505.10 B: I mean, I remember she said that.

5505.10 5508.06 S: I feel like I remember it too, but maybe it was slightly different.

5508.06 5509.38 E: Oh, maybe, but it's not.

5509.38 5510.38 E: It is Vani.

5510.38 5511.38 S: Oh, okay.

5511.38 5512.38 E: Yeah, they got one of that.

5512.38 5513.38 C: Pulled that from one of your blogs, Steve.

5513.38 5515.86 C: I feel like that's some dumb thing that Dr. Oz would have like riffed on the show.

5515.86 5519.50 E: In fact, I think she told me that herself.

5519.50 5520.50 E: The food babe.

5520.50 5522.42 E: Gosh, is she still around?

5522.42 5523.42 E: And the last quote.

5523.42 5524.42 E: Here we go.

5524.42 5525.42 E: The fifth.

5525.42 5533.12 E: There's no coming back from biological death because that's the ultimate death and there's no coming back from that.

5533.12 5541.26 E: Was it Bob Novella, Jay Novella, or Steve Novella?

5541.26 5545.02 E: Let's start with Cara.

5545.02 5546.86 C: It wasn't Jay.

5546.86 5548.82 C: I don't think it would be Bob either.

5548.82 5549.82 C: I think that's Steve.

5549.82 5550.82 E: I think it's Steve.

5550.82 5551.82 E: Okay.

5551.82 5552.82 E: I think that's Steve.

5552.82 5554.18 E: Let's jump to Bob.

5554.18 5555.18 E: Really?

5555.18 5556.18 E: Yeah.

5556.18 5557.18 E: You're going to me?

5557.18 5558.18 E: I'm going to you, Bob.

5558.18 5559.18 E: Bob said it.

5559.18 5560.18 E: Okay, Bob, he said it.

5560.18 5561.18 E: Really?

5561.18 5562.18 E: Jay?

5562.18 5563.18 C: Now you want to reverse biological death.

5563.18 5564.18 C: It was Bob.

5564.18 5565.18 C: Damn.

5565.18 5566.18 C: And Steve?

5566.18 5570.26 S: I knew it was Bob before you finished the quote.

5570.26 5578.50 C: So, were you making the argument, Bob, that we got to get to a point where we don't have biological death?

5578.50 5579.50 C: We're going to do everything up until then?

5579.50 5580.50 C: It's just basically distinguishing.

5580.50 5584.54 B: Yeah, it was just distinguishing a definition of death.

5584.54 5590.90 B: A true final definition of death because the definition of death has changed over the centuries, right?

5590.90 5594.30 B: A hundred years ago, if you stopped breathing, oh, he's dead.

5594.30 5595.30 B: Not today.

5595.30 5596.82 B: So, what is that point?

5596.82 5598.90 B: And that point is biological death.

5598.90 5600.78 B: What do you mean by biological death?

5600.78 5601.78 C: Your cells die.

5601.78 5613.38 B: Yeah, that is a point where even in science fiction, they could not resurrect you because the information that makes you you could not be inferred by anything that's left over.

5613.38 5615.14 B: So that's biological death.

5615.14 5617.06 C: Have you said this multiple times?

5617.06 5618.06 E: Nope.

5618.06 5622.74 E: He just said it on episode 144 dating back to 2007, basically.

5622.74 5626.90 C: Yeah, I get to claim ignorance because I wasn't on the show yet.

5626.90 5627.90 E: What?

5627.90 5630.10 E: You haven't memorized our entire back catalog, Cara.

5630.10 5631.10 S: Oh my gosh.

5631.10 5636.02 S: Evan, not only do I actually remember Bob saying that, it is such a Bobism.

5636.02 5638.78 S: I would know Bob said that even if I didn't know he said it.

5638.78 5639.78 S: Right.

5639.78 5641.82 C: Just because of how he said it or because of the content itself?

5641.82 5643.50 S: No, it's the way he said it.

5643.50 5644.50 C: Yeah, it's the wording.

5644.50 5645.50 C: Yeah, it's the wording.

5645.50 5646.50 C: How he repeated it after.

5646.50 5647.50 C: Yes.

5647.50 5648.50 J: That's right.

5648.50 5653.62 J: Evan, that is actually a lot of fun to try to guess who said a quote on this show.

5653.62 5654.62 J: That could be tricky.

5654.62 5656.22 B: I'll make sure I incorporate that in the future.

5656.22 5657.70 B: Or ridiculously easy.

5657.70 5659.50 E: Well, but fun.

5659.50 5660.50 E: But fun.

5660.50 5661.50 E: Good to look back.

5661.50 5662.50 E: Yes.

5662.50 5665.06 E: I'll remember to incorporate that in future contests like this.

5665.06 5667.46 E: Okay, so Cara and Steve each got two of them correct.

5667.46 5669.34 E: Bob and Jay got three out of the five correct.

5669.34 5673.98 E: So Bob and Jay tie for the win this game.

5673.98 5674.98 E: Well done, gentlemen.

5674.98 5675.98 E: Well done, everyone.

5675.98 5676.98 S: Yeah, well, good job, Evan.

5676.98 5677.98 S: That was fun.

5677.98 5678.98 S: Good.

Questions/Emails/Corrections/Follow-ups ()[edit]

5678.98 5679.98 S: All right, we're going to do one email.

5679.98 5680.98 S: This will be a little bit quick.

5680.98 5683.74 S: This comes from Nick.

5683.74 5685.30 S: I'm going to skip to the meat of this.

5685.30 5688.06 S: He says, my eight year old has been struggling this year.

5688.06 5693.66 S: I have been stopping myself from writing to you for parenting advice specific to me, but I wanted to share the thought holes I've been going down.

5693.66 5695.42 S: Two main points.

5695.42 5700.08 S: I'm not sure I'm able to give an objective assessment of my own child for human emotional reasons.

5700.08 5708.62 S: It's also been tough to find available specialists to see him for some kind of CBT, that's cognitive behavioral therapy, which tells me that he is not the only kid struggling.

5708.62 5719.38 S: Second, I can't seem to come to a conclusion as to what kind of intervention will be most helpful, or what the state of understanding is in childhood psychology.

5719.38 5724.98 S: Did we screw this kid up already is the urgent question and I'm probably just seeking reassurance.

5724.98 5729.70 C: Wait, so he's specifically asking, like, did I as the parent screw up my child by eight years old?

5729.70 5736.06 S: So that's why I wanted to answer this email, because I wrote Nick back to reassure him.

5736.06 5741.66 S: No, I said unless you did something environmentally extreme.

5741.66 5747.04 C: Well, unless you like did something that's like on the adverse childhood experiences kind of scale.

5747.04 5752.34 S: Yeah, you lock him in a closet for eight years or something that would be considered abusive or neglectful.

5752.34 5753.34 S: Yeah, abuse or neglect.

5753.34 5754.34 C: I mean, right.

5754.34 5759.10 S: Yeah, if you just worry, like, is it your parenting style or whatever?

5759.10 5760.10 S: Did you screw up your kid?

5760.10 5762.30 S: You know, the evidence shows this.

5762.30 5774.02 S: The thing is, if you go back 50 years, that was kind of the default assumption of psychiatry and psychology, that any kid with mental issues or challenges was somehow screwed up by his mother, you know.

5774.02 5776.62 C: Yeah, it was always the refrigerator mother. Remember that?

5776.62 5779.42 S: Yeah, like, she was too cold.

5779.42 5780.42 S: Exactly.

5780.42 5784.30 S: But the research has sort of shown, like, no, it isn't.

5784.30 5785.30 S: Bad parenting.

5785.30 5797.18 S: You know, usually a lot of these things are neurological issues, or they're psychological but a consequence of mostly genetics, or a lot of things.

5797.18 5800.12 S: But it's not bad parenting.

5800.12 5801.38 S: That's not the go to explanation.

5801.38 5806.90 S: I know it's hard for parents, and you feel like, we must have done something wrong.

5806.90 5816.02 S: Why is my kid struggling with this or that, or with ADHD? And just like with the previous conversation, it's like, no, it's not your fault.

5816.02 5818.22 S: And it's not about blame.

5818.22 5822.66 S: Sometimes it's completely biological.

5822.66 5825.46 S: Sometimes it's a combination.

5825.46 5833.70 S: And as a family, you may benefit from intervention of one kind or another.

5833.70 5835.42 S: But it's not about blame.

5835.42 5837.94 S: And it's almost never about bad parenting.

5837.94 5844.54 C: Yeah, I think that the fear is that, you know, we were talking about the biopsychosocial model, right, and social is a part of that.

5844.54 5858.18 C: And you know, when kids are really young, their only social experiences are in the home, which is why, when kids are very, very young, adverse childhood experiences have an inordinate effect on their development.

5858.18 5865.66 C: But I like to think of it more as I don't want to say bad parenting, but like neutral parenting, you know, little mistakes here and there.

5865.66 5869.54 C: They don't take away from a sort of neutral place.

5869.54 5875.62 C: Kids are pretty dang resilient; obviously neglect and abuse do harm, but good parenting is additive.

5875.62 5876.62 C: It's protective.

5876.62 5883.58 C: Yeah, you know, the more you nurture and love your child, the more that's going to be beneficial for your child, especially when they're very young.

5883.58 5888.78 C: But yeah, it we have such a culture, especially around mental illness of blame.

5888.78 5893.98 C: Yeah, it's because we want an explanation.

5893.98 5905.10 C: You know, we even see it around biological illness, like people blame themselves or blame others for cancer, for, you know, diseases that they have no control over whatsoever.

5905.10 5913.78 C: And I think it spills into mental illness, where there's even more of a conversation about, like, willpower.

5913.78 5915.30 C: And there's so much there's so much guilt.

5915.30 5916.30 C: There's so much shame.

5916.30 5920.26 C: Yeah, unless you were like abusing your kid or neglecting your kid.

5920.26 5922.58 S: We're not giving you a pass on parental neglect.

5922.58 5932.46 S: No, and it's also like, maybe there's not an optimal psychological dynamic in the family.

5932.46 5937.46 S: And the other thing that comes up is the child is often the identified patient.

5937.46 5942.70 S: I'm sure you know about this, Cara, where it's like, no, it's usually a family dynamic issue.

5942.70 5944.98 S: And, you know, it's not the child's problem.

5944.98 5947.98 S: And the family might benefit from family counseling or whatever.

5947.98 5951.30 S: But that we're talking about, you know, mild things.

5951.30 5957.26 S: If someone has something hardcore, you don't get like ADHD from bad parenting.

5957.26 5960.22 S: You don't get obsessive compulsive disorder from bad parenting.

5960.22 5961.22 C: Or bipolar disorder.

5961.22 5962.66 S: Yeah, you don't get bipolar disorder.

5962.66 5964.46 S: You don't get major depression.

5964.46 5966.86 S: These are illnesses.

5966.86 5970.54 S: Yeah, but then that also gets back to the stigma of mental illness.

5970.54 5975.50 S: There is still a segment of society that doesn't like to think of mental illness as an illness.

5975.50 5979.98 C: No, they think they can just like, you know, smile their way through it.

5979.98 5980.98 C: Yeah, right.

5980.98 5983.22 C: They just like will themselves out of it.

5983.22 5985.38 S: Or that it's all a choice in some bizarre way.

5985.38 5988.06 S: It's just, you know, it's silly.

5988.06 5993.70 C: It also sounds like the concern in the email was about how to seek treatment.

5993.70 5994.70 C: Yeah.

5994.70 5995.70 C: What to, you know, what treatment to seek.

5995.70 5999.92 C: Obviously, there are qualified professionals who specialize in child psychology.

5999.92 6002.54 C: You definitely if it's a child, you want to take them to a child therapist.

6002.54 6003.54 C: Yeah, yeah.

6003.54 6006.84 S: I think my advice is basically keep looking, you know, yeah, it can be challenging.

6006.84 6014.38 S: I've encountered this personally. It's very hard to find competent professionals in this area.

6014.38 6015.66 S: And there's a lot of pseudoscience.

6015.66 6020.50 S: You know, I had to encounter a lot of practitioners throwing woo at me.

6020.50 6021.50 S: Like, really?

6021.50 6024.00 S: Do you know who the hell you're talking to, kind of thing.

6024.00 6025.78 S: But you just got to keep looking.

6025.78 6027.98 S: You know, unfortunately, just keep looking.



Science or Fiction ()[edit]

Answer Item
Fiction Dimetrodon related to lizards
Science Quetzalcoatlus jumping takeoff
Science Cricosaurus aquatic crocodile
Host Result
Steve win
Rogue Guess
Bob Quetzalcoatlus takeoff
Evan Quetzalcoatlus takeoff
Jay Quetzalcoatlus takeoff
Cara Dimetrodon

Voice-over: It's time for Science or Fiction.

Item #1: Quetzalcoatlus northropi, the largest flying animal ever with a wingspan of 10 meters, was able to take off from the ground by jumping directly into the air.[6]
Item #2: While contemporary with dinosaurs and often mistaken for one, Dimetrodon is not a dinosaur and is more closely related to modern lizards.[7]
Item #3: Cricosaurus suevicus was a crocodile relative fully adapted to aquatic life and looked like a cross between a dolphin and a crocodile.[8]



6027.98 6034.40 S: All right, guys, let's move on to science or fiction.

6034.40 6037.86 C: It's time for science or fiction.

6037.86 6052.10 S: Each week I come up with three science news items or facts, two real and one fake, and then I challenge my panel of skeptics to tell me which one is the fake.

6052.10 6053.98 S: There's a theme this week.

6053.98 6058.86 S: The theme of this week is not a dinosaur.

6058.86 6067.14 S: So these are about creatures that were around during the time of dinosaurs but are not dinosaurs.

6067.14 6068.14 S: OK?

6068.14 6069.14 S: OK.

6069.14 6070.14 S: OK.

6070.14 6071.14 S: All right.

6071.14 6072.14 S: Here's the first one.

6072.14 6082.58 S: Item number one, Quetzalcoatlus northropi, the largest flying animal ever with a wingspan of 10 meters, was able to take off from the ground by jumping directly into the air.

6082.58 6093.06 S: Item number two, while contemporary with dinosaurs and often mistaken for one, Dimetrodon is not a dinosaur and is more closely related to modern lizards.

6093.06 6102.82 S: And item number three, Cricosaurus suevicus was a crocodile relative fully adapted to aquatic life and looked like a cross between a dolphin and a crocodile.

Bob's Response[edit]

6102.82 6106.46 S: All right, Bob, you weren't here last week, so you get to go first this week.

6106.46 6111.22 B: So the premise, though, is that they're not dinosaurs, right?

6111.22 6112.22 B: Yeah.

6112.22 6113.22 C: Just evaluating on.

6113.22 6114.22 C: OK.

6114.22 6115.22 C: Yeah, we're not deciding which one's not a dinosaur.

6115.22 6116.22 C: Yeah, that has nothing to do with which is science or fiction.

6116.22 6117.22 S: That's just a category.

6117.22 6123.22 B: Quetzalcoatlus, basically the biggest thing that ever flew on the Earth.

6123.22 6124.22 B: Really?

6124.22 6125.70 B: Take off from the ground?

6125.70 6127.78 B: Oof, I don't know about that.

6127.78 6129.78 B: That thing was almost as big as an F-15.

6129.78 6132.70 B: OK, let's get the next one.

6132.70 6135.22 B: Dimetrodon not a dinosaur and is more closely related.

6135.22 6136.22 B: Oh, so this.

6136.22 6137.22 B: All right.

6137.22 6141.58 B: So so then we have to believe when you say that Dimetrodon was not a dinosaur.

6141.58 6143.54 B: By definition, that has to be true.

6143.54 6144.54 S: I'm not saying that.

6144.54 6145.90 S: I could be lying about anything.

6145.90 6146.90 B: OK.

6146.90 6149.10 B: So then it's not the theme.

6149.10 6151.02 B: Oh, you're really making it confusing.

6151.02 6152.70 B: So it's not necessarily the theme.

6152.70 6153.70 B: All right.

6153.70 6154.70 B: More close to modern.

6154.70 6155.70 S: Well, that's confusing.

6155.70 6156.90 S: That's the theme.

6156.90 6158.30 S: But the fiction is the fiction.

6158.30 6161.18 S: You know, so but go ahead.

6161.18 6165.30 B: So yes, I mean, if I remember Dimetrodon is not a dinosaur.

6165.30 6171.06 B: I don't think Quetzalcoatlus was able to take off by jumping straight into the air from the ground.

6171.06 6172.06 B: Too heavy. Fiction.

Evan's Response[edit]

6172.06 6173.06 S: All right, Evan.

6173.06 6179.06 E: Oh, Bob, I wish you hadn't chosen that one as fiction because I was going to choose that one as science.

6179.06 6182.78 E: But now that you talked about it, I think it might be fiction.

6182.78 6186.78 E: I only know the name Quetzalcoatlus from mythology, right?

6186.78 6187.78 E: Mm hmm.

6187.78 6188.78 E: Right.

6188.78 6189.78 E: It's a god.

6189.78 6198.02 E: So, I mean, wouldn't a god be able to do things that would otherwise seem impossible to, say, a person?

6198.02 6202.06 E: So I'm kind of going at it from that angle and like, no way.

6202.06 6203.06 E: Jumping directly into the air.

6203.06 6204.06 E: Absolutely not.

6204.06 6208.38 E: But if you were this god like creature, then yeah, then you would be able to do that.

6208.38 6209.38 B: Right.

6209.38 6211.38 B: Why does God need a starship?

6211.38 6212.38 E: Exactly.

6212.38 6214.86 E: So I'm thinking you might be right, though, Bob.

6214.86 6220.14 E: Whereas, you know, I don't really know so much about the other two.

6220.14 6223.02 E: Dimetrodon not a dinosaur.

6223.02 6224.58 E: That sounds OK, I suppose.

6224.58 6225.58 E: And then the Cricosaurus.

6225.58 6226.58 E: Cricosaurus, Cricosaurus, Cricosaurus.

6226.58 6227.58 E: I think Cricosaurus.

6227.58 6228.58 E: Cricosaurus.

6228.58 6229.58 S: It's not a dinosaur.

6229.58 6230.58 S: It's a dinosaur.

6230.58 6232.30 E: Cricosaurus.

6232.30 6234.10 E: It sounds almost made up.

6234.10 6235.10 E: Right, Jay?

6235.10 6236.10 C: That sounds made up to me.

6236.10 6237.62 C: Well, they're all made up.

6237.62 6238.62 C: Their names are at least.

6238.62 6243.78 E: Well, no, I understand someone had to name it, but Steve made it up on the spot.

6243.78 6245.78 E: Jokingly made up.

6245.78 6247.62 E: Cross between a dolphin and a crocodile.

6247.62 6248.62 E: Gee whiz.

6248.62 6251.46 E: All right, Bob, I'm going to follow you into the darkness here.

6251.46 6253.30 E: I'll say the Quetzalcoatlus.

6253.30 6254.90 E: I'll say that one is the fiction.

Jay's Response[edit]

6254.90 6255.90 S: OK, Jay.

6255.90 6258.38 J: I make this very easy on myself.

6258.38 6262.94 J: Which one of these three sounds the most like a real dinosaur?

6262.94 6266.18 J: And I would say that that is the second one, so therefore that's the fiction.

6266.18 6267.18 J: The Dimetrodon.

6267.18 6268.18 C: But none of them are dinosaurs.

6268.18 6269.18 C: Right.

6269.18 6270.18 C: Supposedly.

6270.18 6271.18 C: That's the point.

6271.18 6272.18 B: Is that your answer, number two?

6272.18 6273.18 S: Wait, wait, wait.

6273.18 6274.18 S: I'm sorry, Steve.

6274.18 6275.18 S: The premise here is one of these is a dinosaur?

6275.18 6276.18 J: No, none of them are dinosaurs.

6276.18 6277.18 J: None of them.

6277.18 6278.18 J: None of them are dinosaurs.

6278.18 6279.18 J: Maybe.

6279.18 6288.90 S: All bets are off with the fiction, because it's the fiction, but the category is not a dinosaur.

6288.90 6289.90 C: Right.

6289.90 6292.14 C: These are all things that people often mistake for dinosaurs.

6292.14 6293.14 J: All right.

6293.14 6294.14 J: OK.

6294.14 6295.14 None Gotcha.

6295.14 6296.14 J: I still.

6296.14 6303.70 J: So wait, we're picking the one that we think is fake?

6303.70 6306.66 J: Just say one of these things is not true.

6306.66 6307.66 J: One of these things is not true.

6307.66 6308.66 J: That's it.

6308.66 6309.66 J: Very simple.

6309.66 6310.66 J: Oh, God.

6310.66 6311.66 J: You know what?

6311.66 6316.10 J: When you said that, Steve, I thought that one of them was not a dinosaur.

6316.10 6317.10 J: No, no, no.

6317.10 6318.10 C: None of them are dinosaurs.

6318.10 6319.10 C: None of them are dinosaurs.

6319.10 6320.10 J: I'm sorry.

6320.10 6321.10 J: I'm sorry.

6321.10 6322.10 J: I got it.

6322.10 6323.10 S: I got it.

6323.10 6324.10 S: I got it.

6324.10 6325.10 S: But just forget about the category.

6325.10 6326.10 S: Just pick the one, the statement that's not true.

6326.10 6327.10 S: Forget about that.

6327.10 6328.10 S: Just go with the words.

6328.10 6331.86 E: Pay no, don't think about the word elephant.

6331.86 6333.86 J: I'll go with the Quetzalcoatl.

6333.86 6334.86 J: OK.

Cara's Response[edit]

6334.86 6335.86 S: And Cara.

6335.86 6340.94 C: I am so glad you picked me last and I think you picked me last because you know that I'm a nerd in this area.

6340.94 6341.94 C: Yes.

6341.94 6342.94 C: I'm obsessed with this stuff.

6342.94 6344.94 E: In dinosaurs, but not these things.

6344.94 6351.66 C: No, I love non-dinosaur organisms.

6351.66 6354.34 C: So here is what I think.

6354.34 6360.86 C: I think Quetzalcoatlus probably could jump in the air, because it's crazy looking; it doesn't look like a bird.

6360.86 6362.06 C: It doesn't have little bird legs.

6362.06 6363.74 C: It has like dinosaur looking legs.

6363.74 6366.10 C: It's got big, beefy legs.

6366.10 6367.54 C: So I bet you it could jump.

6367.54 6374.62 C: I also don't remember Cricosaurus, but there are quite a few swimming reptiles, marine reptiles that are crocodilian.

6374.62 6376.30 C: So why not?

6376.30 6385.66 C: But I'm pretty sure that Dimetrodon is a synapsid, and synapsids are mammal-like reptiles.

6385.66 6387.74 C: I think it is probably related to mammals.

6387.74 6391.58 C: And I'm also pretty sure it was extinct long before dinosaurs came to pass.

6391.58 6393.54 C: So I'm going to go with Dimetrodon's a fiction.

6393.54 6396.22 B: OK, so you all agree with the third one.

6396.22 6397.22 B: I'll surrender right now.

6397.22 6400.78 E: You don't even have to explain anything.

Steve Explains Item #3[edit]

6400.78 6403.54 S: You all agree with the third one, so we'll start there.

6403.54 6411.90 S: Cricosaurus suevicus was a crocodile relative fully adapted to aquatic life and looked like a cross between a dolphin and a crocodile.

6411.90 6417.90 S: You guys all think that one is science and that one is science.

6417.90 6418.90 C: Dolphodile.

6418.90 6421.06 S: A dolphin crocodile?

6421.06 6429.98 S: There were a ton of really cool reptiles at this time, an incredible adaptive radiation of stuff.

6429.98 6438.86 S: So this is a group called the Thalattosuchia, which are crocodilians.

6438.86 6442.70 S: And this one group did evolve to be fully adapted to the water.

6442.70 6445.70 S: So they look like dolphins, but they have the heads of crocodiles.

6445.70 6446.70 C: That's cool.

6446.70 6449.58 C: So they're not they're not like related to mosasaurs at all.

6449.58 6450.58 S: Well, yeah, well, sure.

6450.58 6451.58 S: Yeah, they are.

6451.58 6452.58 C: I mean, everything's related.

6452.58 6454.58 S: But yeah, not super closely.

6454.58 6458.42 C: No, but yeah, I don't know if they're even considered crocodilians.

6458.42 6460.22 C: No, they're different.

6460.22 6464.22 S: So it's like the plesiosaurs, the mosasaurs, then there's the Thalattosuchians.

6464.22 6467.06 S: It's like just a different group.

6467.06 6469.10 S: But yeah, they were around similar time.

6469.10 6471.22 S: And yeah, there's a ton of stuff adapted to the water.

6471.22 6472.22 S: You're right.

6472.22 6473.22 S: A lot of reptiles adapted to the water.

6473.22 6474.22 S: Cool looking thing.

6474.22 6476.26 S: It looks like a dolphin with a crocodile head.

6476.26 6477.26 C: It's really cool.

6477.26 6480.06 C: Yeah, because I think of mosasaurs as being a little dolphin alike, but they're not really.

6480.06 6481.06 C: They don't really look like dolphins.

6481.06 6485.36 S: No, it had a fluke and the fins, but the crocodile head.

6485.36 6486.36 S: Very cool.

6486.36 6487.36 S: Cricosaurus.

6487.36 6489.02 S: Yeah, so again, the crazy sounding name.

6489.02 6490.02 S: You got to go with that.

6490.02 6491.02 C: All right.

6491.02 6492.02 C: Let's go back.

6492.02 6493.02 C: So weird.

6493.02 6494.02 C: It had weird legs.

6494.02 6495.02 S: It's like a dolphin with legs.

6495.02 6496.02 S: Yeah, yeah.

6496.02 6497.34 S: But they're not walking legs, though.

6497.34 6498.34 S: They are swimming fin.

6498.34 6499.34 C: They're swimming legs.

6499.34 6500.34 C: Yeah, they're swimming legs.

6500.34 6501.34 C: So creepy.

6501.34 6502.34 S: All right.

Steve Explains Item #1[edit]

6502.34 6503.34 S: Let's go to number one.

6503.34 6510.62 S: Quetzalcoatlus northropi, the largest flying animal ever with a wingspan of 10 meters, was able to take off from the ground by jumping directly into the air.

6510.62 6513.18 S: A lot of things there that could be the fiction.

6513.18 6514.18 S: Oh, crap.

6514.18 6517.98 S: But this one is this is science.

6517.98 6518.98 S: Ah, so cool.

6518.98 6521.30 C: Quetzalcoatlus is so cool.

6521.30 6522.30 S: It is cool.

6522.30 6533.34 S: So this was the inspiration for this theme: there recently was a series of six papers doing a full evaluation of Quetzalcoatlus as a genus.

6533.34 6534.34 S: Right.

6534.34 6537.10 S: So northropi is one of the species, but Quetzalcoatlus is a genus.

6537.10 6539.02 S: There's multiple species.

6539.02 6542.46 S: Northropi was the first discovered and it's the biggest.

6542.46 6545.10 S: 10 meters, man, that's a massive wingspan.

6545.10 6547.66 S: This thing was a huge, huge animal.

6547.66 6549.26 C: These things are the size of giraffes.

6549.26 6550.26 C: Yeah.

6550.26 6552.94 S: They're freaking huge.

6552.94 6555.62 S: How do we know that it took off from the ground?

6555.62 6556.62 S: Right.

6556.62 6557.78 S: So a couple of ways.

6557.78 6569.10 S: One is that we know it couldn't take a running jump because it couldn't walk with its wings because of the bone structure of its shoulders.

6569.10 6573.14 S: It could not propel itself with its forelimbs.

6573.14 6574.14 B: And what?

6574.14 6575.14 B: And it couldn't climb either?

6575.14 6576.14 B: Is that what you're going to say?

6576.14 6577.14 C: Well, because its wings are arm-like.

6577.14 6580.44 C: They're like folded like little front arms.

6580.44 6582.02 S: The way they walk was really weird.

6582.02 6591.98 S: So they would move the left wing out to the side in order to make room for the back leg to step forward.

6591.98 6597.54 S: It may be leaned on its right wing when it did that, but then it would reverse it.

6597.54 6602.10 S: It would bring the left wing down, bring the right wing up and take a step with its right leg.

6602.10 6603.96 S: So then that's how they would walk.

6603.96 6610.18 S: So they just had to move the wings out of the way, but they weren't able to propel themselves with their forelimbs.

6610.18 6611.92 S: So they're basically bipedal.

6611.92 6617.94 S: They might have used them for a little bit of quadrupedal aid, but they were basically bipedal.

6617.94 6619.46 S: But they couldn't run and take off.

6619.46 6621.10 S: They were too big to climb, really.

6621.10 6623.06 S: They were just too massive.

6623.06 6636.74 S: Also, there's a lot of comparisons made to modern large water birds, you know, like storks, that take off in the exact same way.

6636.74 6638.62 S: They would jump into the air.

6638.62 6640.28 S: Why would they need to jump into the air?

6640.28 6645.54 S: Because they couldn't flap their wings from standing still because they couldn't get a full downstroke.

6645.54 6646.54 S: Their wings would hit the ground.

6646.54 6647.54 S: They're too big.

6647.54 6648.54 S: They're too big.

6648.54 6671.58 S: They would have to jump up in order to get a full downward stroke with their wings. And Cara's correct, their hind legs were really powerful, more powerful than they would have to be just for walking, but powerful enough that they could jump six feet into the air, or whatever they had to do in order to get a full downstroke with their wings.

6671.58 6675.34 S: So you put all that together and that was the conclusion of the analysis.

6675.34 6680.02 S: That's how they must have taken off, just by jumping into the air and getting a full downstroke with their wings.

Steve Explains Item #2[edit]

6680.02 6686.34 S: Okay, so all this means that, while contemporary with dinosaurs and often mistaken for one, Dimetrodon... is it Dimetrodon, or are you saying Dimetrodon?

6686.34 6689.98 C: I say Dimetrodon, but I took paleontology.

6689.98 6693.78 C: I took paleontology in college because I'm a nerd and I love this topic.

6693.78 6697.02 C: I took it in Texas and I don't know, we say things weird down there.

6697.02 6699.10 C: But also you guys say things weird in Connecticut.

6699.10 6709.86 C: I will say that the minute you said the theme is non-dinosaurs, or things that people confuse for dinosaurs, Dimetrodon is the first thing I thought of.

6709.86 6712.46 S: That's why I said anything.

6712.46 6718.46 S: Dimetrodon is not a dinosaur and is more closely related to modern lizards, but there's two things that are wrong there, and Cara, you hit on both of them.

6718.46 6723.90 S: So first of all, Dimetrodon went extinct 40 million years before dinosaurs evolved.

6723.90 6728.66 S: So they were not contemporary with dinosaurs, and they're not more closely related to lizards.

6728.66 6741.42 S: So lizards and dinosaurs are all in one group, and then Dimetrodon is a synapsid, you're correct, which is in a separate group that is related to early mammals.

6741.42 6747.22 S: So Dimetrodon is not an ancestor to mammals, but it is in the group of mammal-like reptiles.

6747.22 6748.74 S: They're so weird.

6748.74 6755.34 S: It's essentially the same evolutionary distance from lizards and dinosaurs, right?

6755.34 6757.14 S: Not more closely related to lizards.

6757.14 6762.58 C: People think of them as like, you know, Spinosaurus, the really tall one that looks like a T. rex but had a sail on its back.

6762.58 6764.98 C: People mistake them because of the sail on the back.

6764.98 6771.10 C: You're like, oh, like on kids' bedsheets and stuff, they draw pictures of Dimetrodon and they're like, look at the dinosaurs.

6771.10 6772.10 C: Yeah, totally.

6772.10 6781.02 S: But the other thing is, Cara, so the old picture of a Dimetrodon was with the legs out to the side, the belly on the ground.

6781.02 6794.46 S: But the more modern reconstructions think that the legs were underneath and it was walking more upright, partly based upon the hips and the tracks, the footprints; we have tracks of Dimetrodon.

6794.46 6795.46 C: That makes sense.

6795.46 6801.34 S: And they're narrow, like they're close together and there's no belly drag, you know.

6801.34 6812.38 S: But the paleontologists said, well, maybe they swung back and forth really extremely when they walked, which would bring their feet closer together, you know, in the path.

6812.38 6816.02 S: But the tracks don't really support that.

6816.02 6822.46 S: So it may be that they did walk more like, you know, with their legs underneath them rather than out to the side.

6822.46 6831.10 B: Yeah, but wouldn't Steve, wouldn't the anatomy, wouldn't the skeleton itself absolutely show the orientation of the legs under, you know, under the body?

6831.10 6833.62 C: No, it's all about how you articulate the...

6833.62 6836.82 S: Yeah, no, the short answer is no.

6836.82 6838.58 C: And it wouldn't absolutely, wouldn't necessarily...

6838.58 6844.46 S: Because of the quality and completeness of the skeletons, but also you can make choices about how things fit together.

6844.46 6849.90 C: But if you look at old dinosaurs in museums, it's amazing what the mounts look like compared to now.

6849.90 6862.66 S: And here's the thing, Bob, when I was reading like the updates on the Dimetrodon, they were saying, you know, this question, like this is how it was reconstructed a hundred years ago, and no one's really questioned it since.

6862.66 6864.14 S: And now that's the problem.

6864.14 6868.54 S: Yeah, take a fresh look at it, like, yeah, it makes more sense to have it this way.

6868.54 6869.54 S: Actually Dimetrodon was one of...

6869.54 6870.54 B: That's what I mean.

6870.54 6875.66 B: Yeah, they could have determined that, you know, from first principles instead of, like, going with what it was.

6875.66 6877.78 S: Let me ask you guys a question.

6877.78 6884.32 S: Of these three, Quetzalcoatlus, Dimetrodon, and Cricosaurus, which one do you think was discovered first?

6884.32 6885.86 S: Like how long ago?

6885.86 6886.86 B: Crykosaurus.

6886.86 6887.86 B: Dimetrodon.

6887.86 6891.22 E: What would be the giveaway for that?

6891.22 6893.62 C: Yeah, it could be Dimetrodon, because I think it's...

6893.62 6894.62 C: Cricosaurus.

6894.62 6895.62 C: Really?

6895.62 6901.62 S: Among the first really ancient reptiles discovered were from that group.

6901.62 6902.62 S: Huh.

6902.62 6908.78 S: If not specifically that species, at least the group that it belongs to, you know, the swimming crocodiles.

6908.78 6910.14 C: Do you know where they were?

6910.14 6911.58 C: Oh, Germany.

6911.58 6913.26 C: Three skulls in Germany.

6913.26 6914.26 C: Interesting.

6914.26 6916.06 C: Dimetrodon I know is an American.

6916.06 6919.30 S: Yeah, although that has also been recently discovered in Europe.

6919.30 6920.30 S: Oh, okay.

6920.30 6925.66 S: It is mostly American, the fossils, but there were some recent finds in Europe, yeah.

6925.66 6928.44 S: And Quetzalcoatlus is also, like, near Mexico.

6928.44 6929.44 S: That's why it's based on...

6929.44 6930.44 S: Yeah, it's based on Mexico.

6930.44 6940.54 S: ...the Mexican Quetzalcoatl, which is a flying god, although it had feathers, you know, which doesn't really fit with the pterodactyls, but that's where the name came from, yeah.

6940.54 6941.54 C: Man, they're cool.

6941.54 6942.54 C: If you've never seen a skeleton in person...

6942.54 6943.54 C: So many cool things, yeah.

6943.54 6944.54 C: Go see a Quetzalcoatlus.

6944.54 6947.66 C: It blows your mind how big they are.

6947.66 6948.66 S: Yeah, yeah.

6948.66 6960.66 S: The thing is, you know, and part of why I like this theme is that we do get, like, over and over again exposed to the same few species, like, you know, like everybody knows, you know, T-Rex and...

6960.66 6961.66 S: Brontosaurus.

6961.66 6962.66 S: ...Pterosaurus, yeah.

6962.66 6963.66 S: Triceratops.

6963.66 6964.66 S: And Triceratops.

6964.66 6965.66 S: Stegosaurus.

6965.66 6973.22 S: And there are so many amazingly bizarro things out there that we know about, and these aren't even the most bizarre.

6973.22 6974.22 S: There are just some...

6974.22 6977.74 S: You see some pictures of things, like, what the hell is that?

6977.74 6990.30 S: So it really is worth exploring, you know, just the number of different groups, entire groups of animals alive at the time of the dinosaurs, or at any time in the past, that you're not even aware existed.

6990.30 6993.40 S: Some of them are really bizarro, you know, to modern eyes.

6993.40 6999.34 S: So definitely, you know, just tool around the internet and look for weird crap like that.

6999.34 7000.34 C: You'll be amazed at what's out there.

7000.34 7002.50 C: I'm going to Google weird extinct species.

7002.50 7003.50 C: Yeah, right.

7003.50 7004.50 C: Right, right.

7004.50 7005.50 E: Careful.

7005.50 7007.50 E: You might wind up binging.

Skeptical Quote of the Week ()[edit]

We are all delusional to some extent. Human brains were not selected to perceive reality. They were selected to reproduce.
– Shankar Vedantam, host and creator of the Hidden Brain podcast

7007.50 7008.98 S: All right, Evan, give us a quote.

7008.98 7012.34 E: This quote was suggested by a listener, someone named Visto Tutti.

7012.34 7014.34 E: Never heard of him before.

7014.34 7016.26 E: Thanks for listening.

7016.26 7018.80 E: We are all delusional to some extent.

7018.80 7021.62 E: Human brains were not selected to perceive reality.

7021.62 7024.48 E: They were selected to reproduce.

7024.48 7032.14 E: And that was said by Shankar Vedantam, who is a host and creator of Hidden Brain.

7032.14 7033.70 E: Very popular podcast.

7033.70 7038.18 S: So I agree with that quote, except it's a little reductionist.

7038.18 7041.10 C: Yeah, it's sort of making a false dichotomy there.

7041.10 7046.06 E: Maybe the reproduce part, but I think the perception of reality, I think.

7046.06 7047.06 E: Yeah, that's true.

7047.06 7048.06 C: We in no way.

7048.06 7049.06 C: Yeah, yeah.

7049.06 7051.58 S: It kind of misses the point, though.

7051.58 7058.40 S: As a neuroscientist, what I would say is, it's not that our brains are not selected to perceive reality.

7058.40 7067.30 S: It's that our perception is optimized for things other than being completely accurate.

7067.30 7069.20 S: But it's not it's not all about reproduction.

7069.20 7070.58 S: It's also about survival.

7070.58 7071.58 B: You know?

7071.58 7072.58 B: Well, yeah.

7072.58 7079.34 B: So it's kind of related, because it was optimized to show us just enough reality so that the brain can reproduce.

7079.34 7080.34 S: Yeah.

7080.34 7083.06 S: But it's not about how much reality it shows us.

7083.06 7087.50 S: It's the way it reconstructs reality.

7087.50 7088.50 S: You know what I mean?

7088.50 7097.94 S: It was adapted to the things that favor our survival, which is not necessarily being the most accurate or the highest fidelity.

7097.94 7100.58 S: In fact, sometimes it's wrong.

7100.58 7101.68 S: It's deliberately wrong.

7101.68 7104.38 S: But in a way that favors our reproductive success.

7104.38 7106.62 C: But in a pithy way, isn't that kind of what he said?

7106.62 7107.62 S: Yeah, I agree.

7107.62 7111.18 S: I'm saying I agree with it in general, it is pithy.

7111.18 7112.70 S: It's a fun quote.

7112.70 7116.34 S: But as stated, it's a little reductionist.

7116.34 7119.78 S: And it takes a lot of explanation to really say what it means.

7119.78 7122.86 C: Yeah, I bet you that came at the end of a longer explanation.

7122.86 7126.98 B: Almost Michio Kaku-esque.

7126.98 7128.30 B: Almost, right?

7128.30 7129.30 B: Almost.

7129.30 7130.30 B: Yeah, almost as bad as him.

7130.30 7131.30 B: Hey, Bob.

7131.30 7133.18 C: Except not at all.

7133.18 7134.18 C: But except right.

7134.18 7135.18 C: Except not.

7135.18 7136.18 S: Yeah.

7136.18 7138.18 S: I wouldn't go that far, but.

Signoff/Announcements ()[edit]

7138.18 7139.18 S: All right.

7139.18 7140.98 S: Well, thank you all for joining me this week.

7140.98 7143.64 S: You got your last.

7143.64 7146.30 S: Next week is our last regular show of the year.

7146.30 7147.30 S: Oh, my gosh.

7147.30 7152.46 S: The week after that will be a show we prerecorded recently when we were in Colorado.

7152.46 7153.46 S: Right.

7153.46 7154.46 C: So don't send us email.

7154.46 7155.46 S: Yeah.

7155.46 7156.90 S: Why are you publishing your whole show?

7156.90 7160.54 S: And then the next one after that will be our year-end review show.

7160.54 7161.54 S: Oh, boy.

7161.54 7162.54 S: End of the year show.

7162.54 7163.54 S: Wow.

7163.54 7164.54 S: All right.

7164.54 7165.54 S: So see you guys all next week.

S: —and until next week, this is your Skeptics' Guide to the Universe.

S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.



Today I Learned[edit]

  • Quetzalcoatlus northropi, the largest flying animal ever with a 10-meter wingspan, likely took off by jumping directly into the air to get a full downstroke of its wings.
  • Dimetrodon is not a dinosaur: it is a synapsid, in the mammal-like reptile group, and went extinct about 40 million years before dinosaurs evolved.
  • Cricosaurus, a thalattosuchian, was a crocodile relative fully adapted to aquatic life, looking like a cross between a dolphin and a crocodile.

Notes[edit]

References[edit]

Vocabulary[edit]

