SGU Episode 855

From SGUTranscripts

SGU Episode 855
November 27th, 2021
855 spinlaunch.jpg

SGU 854                      SGU 856

Skeptical Rogues
S: Steven Novella

B: Bob Novella

C: Cara Santa Maria

J: Jay Novella

E: Evan Bernstein

Guest

GH: George Hrab, American musician & podcaster

Quote of the Week

The more connections you can make across an ever wider and more disparate range of knowledge, the more deeply you will understand something. Search engines and videogames do not provide that facility; nothing does, other than your own brain.

Susan A. Greenfield, English scientist and member of the House of Lords

Links
Download Podcast
Show Notes
Forum Discussion

Introduction

Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.

9.00 20.00 S: Hello and welcome to the Skeptics' Guide to the Universe.

20.00 28.00 S: Today is Friday, November 19, 2021, and this is your host, Steven Novella.

28.00 34.00 S: Joining me this week are Bob Novella.

B: Hey everybody.

34.00 39.00 S: Cara Santa Maria.

C: Howdy.

39.00 43.00 S: Jay Novella.

J: Hey guys.

43.00 48.00 S: Evan Bernstein.

E: Hello Denver.

48.00 56.00 S: And we have a special guest this week, George Hrab.

GH: Hello Denver.

56.00 60.00 S: It's really a beautiful, very beautiful city. I mean the big sky country, the landscape is just amazing.

60.00 66.00 S: Very different from what we're used to in Connecticut. Connecticut's all hills and trees, so you're always under a canopy.

66.00 69.00 J: Big sky country. Well that's Montana.

69.00 74.00 E: Steve, we talked about this. Big sky-ish.

74.00 79.00 C: There's mountains. There's a lot of mountains here. They're far away.

79.00 83.00 C: They don't look far away. They sort of block the...no? Okay.

83.00 89.00 B: Farther away than I thought though, right?

B: This is like a huge flat mountain top or something that Denver is on.

89.00 94.00 B: Yeah, a mile, a mile high, right? High Plains. High Plains.

94.00 101.00 S: Yes, and it's literally a mile high. I learned that the 13th step of the Capitol Building is actually exactly one mile above sea level.

101.00 107.00 E: Doesn't it fluctuate after a year or two, like a centimeter or something? Do they keep shifting the step?

107.00 116.00 B: They're going to have to lift the building up. It would fluctuate with the tides, because the tides actually, there's a tide on the land as well.

116.00 120.00 B: It's very minor, but it's measurable. Changes the steps a lot.

120.00 125.00 S: So when you calculate, what is sea level then with respect to the tides? They take an average or is it low tide?

125.00 131.00 S: It's mean sea level. Mean sea level. That makes sense. Why don't they take nice sea level?

131.00 135.00 B: I knew he was going to mention that.

135.00 140.00 S: So I've never been to Denver before. What? What? Yes.

140.00 143.00 S: Never been to Denver before. It's my first time here.

143.00 146.00 B: Steve, you've never been to Denver before?

146.00 149.00 J: Oh yeah, I haven't either.

149.00 152.00 E: Wait, you have a dealer? I used to live here. I'm very excited to come.

152.00 157.00 E: Oh, you used to live here? I lived in Englewood in 1982. Anyone from Englewood?

157.00 163.00 J: I went to, is it still there, Cherry Creek Middle School? Does that ring a bell?

163.00 167.00 E: Cherry Creek High School. What was the middle school? It was attached to the high school.

167.00 171.00 E: Campus. Campus. Thank you. That was it. Thank you. Wow. So it's good to be back.

171.00 176.00 J: So you weren't lying?

176.00 179.00 E: Not about that.

179.00 182.00 B: With Evan, I mean half of it's just all bullshit.

182.00 184.00 B: With Jay, he really did his homework.

184.00 188.00 S: So I was curious if I was going to feel the altitude, right? Because this is the big...

188.00 192.00 S: Yes. And the answer is definitely yes.

192.00 198.00 S: Not when we're doing... Well, so we're definitely all pounding back the water because it's like we're getting dehydrated.

198.00 204.00 S: Which I knew about. I did go visit Mount Wilson. So that was 10,000 feet, right?

204.00 208.00 S: So that's even higher than here. I got dehydrated so fast there.

208.00 213.00 S: I didn't realize it until I had a pounding headache. So you really have to keep well, well hydrated.

213.00 220.00 S: But here it's like the lack of oxygen concentration is... If you go up one flight of steps, you notice it.

220.00 225.00 S: Because we have zero acclimation to this. You guys all have more red blood cells than we do. So you don't feel it.

225.00 229.00 S: Well, we went to Red Rocks today to show these guys.

229.00 231.00 B: And the amphitheater is gorgeous.

231.00 236.00 S: It's so amazing. But I went. So we walked. Our friend Brian and I, we walked down to almost the bottom.

236.00 239.00 S: There's a concert they were setting up for us. Almost to the bottom.

239.00 244.00 S: And then we're walking back up and both Brian and I were like... Holy shit.

244.00 247.00 S: We had to literally stop and talk for a little while.

247.00 250.00 B: I got to the top. Like, where's Jay? What happened to Jay?

250.00 252.00 E: He's way down there.

Special Segment: Families, Vaccines, and Thanksgiving (4:19)

252.00 258.00 S: We're going to start with an open-ended discussion here that George actually inspired.

258.00 265.00 S: So the topic is families, vaccines, and Thanksgiving.

265.00 267.00 S: Ooh.

267.00 269.00 S: That would make a good t-shirt.

269.00 270.00 S: A lot of drama.

270.00 271.00 S: Exactly.

271.00 272.00 S: Exactly.

272.00 274.00 S: Do you want to start, George?

274.00 279.00 GH: Well, I heard about a person, someone sent me this email.

279.00 282.00 GH: It was from the person's family.

282.00 284.00 GH: And I just want to read this email real quick.

284.00 289.00 GH: It said, Dear family, apparently this person is going to be hosting Thanksgiving.

289.00 295.00 GH: And said everybody's got to be vaccinated because there's going to be 25 of us in a room.

295.00 297.00 GH: Everybody's got to be vaccinated.

297.00 300.00 GH: This was made apparent to all the people that were being invited.

300.00 301.00 GH: And part of the family wrote this.

301.00 308.00 GH: Dear family, with sincere regrets and sadness, we will not be able to attend this year's family Thanksgiving celebration because of this vaccine mandate for attendance.

308.00 319.00 GH: Our reasons for not being vaccinated are personal and based on the steady accumulation of evidence of adverse effects and waning efficacy of the experimental vaccines, in quotes.

319.00 327.00 GH: Our absence should certainly relieve you of any concerns of fear of our becoming infected or of you becoming infected by us.

327.00 333.00 GH: Sadly, we are now part of the large number of families who will be separated this holiday season because of vaccine mandates.

333.00 343.00 GH: We are hugely disappointed that it has come to this, a family that is divided based on a vaccine that was originally derided and has now become mandatory in order to be accepted socially and at work.

343.00 349.00 GH: As some food for thought, Thanksgiving pun intended, we enclose the following two items for you to view.

349.00 353.00 GH: And there are two links to two videos that are just a collection of memes.

353.00 357.00 GH: It's like a Fauci meme and some other meme.

357.00 372.00 C: Right, because they aren't, like my father who I might have mentioned on the show, but haven't, definitely on Talk Nerdy, but maybe on this show, is like a full-on Fox News, like just repeating Tucker Carlson, like very, like Trumper, like voted both times for Trump, but he's vaccinated.

372.00 375.00 C: Like he's, you know, and there are those too.

375.00 383.00 GH: That's who I just, I have such ire for Tucker and Hannity and who are all being vaccinated.

383.00 385.00 C: Of course they are. Yeah, of course they are.

385.00 387.00 C: And have been for a long time.

387.00 391.00 GH: And have been for a long time because Fox has vaccine mandates to work there, you know.

391.00 393.00 GH: We all know this. We all know this kind of stuff.

393.00 407.00 C: But how frustrating is it that the burden has been placed, like the framing there was because of your mandate, we can't be together, instead of because of your unwillingness to get vaccinated, we can't be together.

407.00 410.00 GH: But it's like, you know, look, if you're going to be in my house, you have to wear pants.

410.00 411.00 GH: Exactly. It's a rule I have.

411.00 413.00 GH: It's like, because you don't.

413.00 414.00 GH: It's a very, you know.

414.00 426.00 S: But here's the thing, is that this leads to a much deeper, darker, and scarier conversation because what you're experiencing is a microcosm of what America is experiencing.

426.00 427.00 S: No question.

427.00 433.00 S: And Jay's response is, you know, of course it applies in certain contexts.

433.00 444.00 S: But when you think about it, and this is, you know, I've been reading a lot about this very debate, and there are those on the end of the spectrum saying, listen, we're not going to get anywhere as a country.

444.00 445.00 S: We're never going to resolve our issues.

445.00 450.00 S: We're never going to be able to move forward if we treat each other like enemies.

450.00 451.00 S: Sure.

451.00 455.00 S: And like, yeah, that's true in normal times.

455.00 457.00 S: But we are clearly not in normal times.

457.00 473.00 C: The stage before we get there, because yes, if you read the literature of sort of anybody who studies like authoritarian regimes and how they roll and like genocide and like what are the initial seeds of genocide, like we see all the same kind of threads.

473.00 484.00 C: But one of the things that I think the place where we're at right now, and this is what I fear that what you're talking about doing if we do it on a mass scale is very dangerous, is because it normalizes this behavior.

484.00 486.00 S: Yeah, let me give you an analogy.

486.00 489.00 S: There could be different schools of thought within science, right?

489.00 490.00 S: Sure.

490.00 494.00 S: Group of scientists A think that climate change killed the megafauna in North America.

494.00 499.00 S: Group of scientists B think that overhunting by humans killed the megafauna, whatever.

499.00 501.00 S: Pick any scientific disagreement.

501.00 504.00 S: But they all agree on the rules of science, right?

504.00 510.00 S: They disagree on this, and they could fight vehemently and viciously about this answer, but they all agree that science is science.

510.00 518.00 S: If one side starts cheating and one side starts doing pseudoscience, and I think this is what's happening with alternative medicine.

518.00 527.00 S: This is why you can't treat alternative medicine like just another opinion about how to treat people, because they're changing the rules of science in a bad way.

527.00 529.00 S: They're doing pseudoscience.

529.00 538.00 S: When that happens, the debate about what killed off the megafauna, it becomes subservient to the debate about what's science, right?

538.00 540.00 S: So it's the same thing.

540.00 545.00 S: We're apolitical on this panel, meaning that not that we don't have politics.

545.00 547.00 S: That's not what we talk about.

547.00 550.00 S: We talk about critical thinking and logic and whatever.

550.00 551.00 S: Well, we talk about it on the livestream.

551.00 552.00 S: It may have a political context.

552.00 554.00 S: Watch the livestream if you want to hear us talk.

554.00 559.00 S: But it may have a political context, but we don't take political sides really.

559.00 566.00 S: But in this situation, it's not about politics the same way when you're doing scientific fraud.

566.00 568.00 S: It's no longer about the normal scientific debate.

568.00 571.00 S: It's not about left or right, conservative, liberal, whatever.

571.00 574.00 S: It's about you're trying to break democracy, and we're trying to save it.

574.00 577.00 S: And when you're at that level, the rules change.

577.00 579.00 S: And so again, what does that really mean?

579.00 580.00 S: It just feels so dirty.

580.00 581.00 C: I'm like, I'm hearing you.

581.00 582.00 C: It makes sense.

582.00 583.00 C: But it is dirty.

583.00 587.00 C: I mean, that's why I mentioned sort of the authoritarian playbook.

587.00 597.00 C: I mean, yes, authoritarianism is a form of politics, but it's beyond what you said, like this agreement about the left, the right, all within a democratic agreement.

597.00 612.00 C: Because if you look at the structure of the systematic dismantling of trust in institutions, this is a choice that was made, and it's agenda driven by specific people in power so that people will say, I no longer trust the Justice Department.

612.00 614.00 C: I no longer trust the scientists.

614.00 616.00 C: I no longer trust the media.

616.00 617.00 C: Biden is not the president.

617.00 618.00 C: Yeah.

618.00 619.00 C: And what happens when you start to do all of that?

619.00 622.00 C: You go, only that strongman has the answers.

622.00 625.00 C: And this happens just nation by nation.

625.00 627.00 C: We've seen this happen so many times.

627.00 637.00 S: If you don't think you can get satisfaction or resolve your issues through the system because the system is broken, you have no choice but to go outside the system.

637.00 639.00 S: And what does going outside the system mean?

639.00 640.00 S: Civil war?

640.00 641.00 S: Insurrection?

641.00 642.00 GH: Possibly.

642.00 643.00 GH: Insurrection.

643.00 644.00 GH: Obviously, some people thought it meant insurrection.

644.00 646.00 GH: I think we've been so trained.

646.00 656.00 GH: For 14, 15, 20 years, we sort of in the skeptic community have been repeatedly giving or trying to give the message of, oh, my cousin believes in Bigfoot, what do I do?

656.00 658.00 GH: And we say, okay, they believe in Bigfoot.

658.00 660.00 GH: You can go have dinner with them.

660.00 661.00 S: Find common ground.

661.00 662.00 S: Find common ground.

662.00 665.00 GH: This is not Bigfoot.

665.00 667.00 GH: There is no common ground.

667.00 670.00 GH: Yeah, but I think we're so trained to be like reach out and do the thing.

670.00 675.00 GH: And maybe there is this point where we've got to say like we have to draw a line.

675.00 676.00 C: You did.

676.00 677.00 S: You said you cannot come to Thanksgiving dinner.

677.00 678.00 S: Right.

678.00 679.00 S: But let's continue.

679.00 680.00 S: And I'm not judging at all.

680.00 682.00 S: Look, we're talking about this.

682.00 684.00 S: What does it physically look like?

684.00 686.00 C: Yeah, I think it depends on the situation.

686.00 689.00 C: I've maintained a relationship with my father and we do have common ground.

689.00 695.00 C: But there are people who I think you're not going to be able to find that common ground with.

695.00 698.00 C: And it's going to be situation to situation.

698.00 699.00 C: But what does that look like, though?

699.00 701.00 S: I mean, if it's your family, what do you do?

701.00 702.00 S: It's hard.

702.00 706.00 C: There are plenty of people right here in this room who don't talk to family members.

706.00 710.00 S: But the problem is that the media hasn't got the memo yet.

710.00 716.00 S: The media is still – just like George was saying, we are stuck in – we find common ground, work it out.

716.00 718.00 S: The media is stuck in false balance.

718.00 719.00 S: It's all one side, the other side.

719.00 721.00 S: They are normalizing.

721.00 723.00 C: But it's our choice what media we consume.

723.00 733.00 C: And if we choose to consume media that actually cares about informing the public over media that cares about ratings and getting advertisers, then we will be getting what we want.

733.00 735.00 C: And there are media outlets available to us.

News Items


Russia Shoots Down Satellite (12:17)

735.00 737.00 S: Let's move on to some news items.

737.00 740.00 S: We're going to start with non-cultural news items.

740.00 741.00 S: Right.

741.00 745.00 S: So who here saw the movie Gravity?

745.00 747.00 S: Got to clap.

747.00 750.00 S: There's no raised hands on podcasts.

750.00 751.00 S: One of my favorite movies.

751.00 752.00 S: Love that movie.

752.00 754.00 S: The science is a little bit crappy.

754.00 756.00 S: But I just love – whatever.

756.00 757.00 S: Love the director.

757.00 758.00 S: It was great.

758.00 765.00 S: But the disaster that the movie surrounds involved Russia shooting down one of their own satellites.

765.00 766.00 S: That would never happen.

766.00 767.00 S: It's just the movies.

767.00 768.00 S: Right.

768.00 769.00 S: How implausible is that?

769.00 774.00 S: And then the debris field like destroying the shuttle and the ISS and causing mayhem.

774.00 779.00 S: So apparently no one in Russia saw that movie.

779.00 780.00 S: Or maybe they did.

780.00 781.00 S: That's when they got the idea.

781.00 782.00 GH: Maybe.

782.00 784.00 GH: There's good training manual for you.

784.00 786.00 GH: That's how you do it.

786.00 788.00 GH: Comrade, watch this movie.

788.00 789.00 S: I like Clooney.

789.00 791.00 S: He's good.

791.00 802.00 S: So last week Russia, in a test of their anti-satellite system, ASAT as they call it, blew up one of their own defunct satellites spewing debris into low Earth orbit.

802.00 807.00 S: NASA says as a result there are 1,500 trackable pieces of debris.

807.00 808.00 S: Trackable.

808.00 809.00 S: Trackable pieces.

809.00 812.00 S: And there's tens of thousands of smaller bits of debris.

812.00 815.00 S: Any one of which could cause a disaster.

815.00 820.00 S: In fact, the debris field was moving past the ISS.

820.00 827.00 S: The astronauts, including two Russian cosmonauts, had to take shelter in their- Bunker.

827.00 828.00 S: No, no.

828.00 830.00 S: Their bunker is their descent vehicle.

830.00 831.00 S: Yes.

831.00 832.00 S: Right, because they're the most shielded.

832.00 833.00 B: Pull out.

833.00 834.00 S: Pull the grab.

834.00 835.00 S: They could take it more.

835.00 836.00 S: They could take it more.

836.00 840.00 S: So yeah, they had to shelter in their descent vehicles until the debris field passed.

840.00 841.00 S: Could you imagine being those astronauts?

841.00 844.00 C: I can't imagine anything they deal with ever.

844.00 845.00 C: So bananas.

845.00 848.00 S: Even when everything is great, it's still like, oh, shit.

848.00 849.00 S: It's so scary.

849.00 854.00 S: I have to pee in a bag.

854.00 864.00 S: This was obviously a disastrous move on the part of Russia, but there is some context here in that this is like Russia's now the fourth country to test an anti-satellite system.

864.00 866.00 S: The first country to test it was- China.

866.00 869.00 S: No, the United States.

869.00 871.00 S: The second country to test it was China.

871.00 872.00 S: Oh, second.

872.00 874.00 S: Yeah, but did we blow up a satellite?

874.00 875.00 S: Yes.

875.00 876.00 S: We did.

876.00 877.00 S: But it was different-

877.00 882.00 GH: Yeah, but that satellite was reaching for a gun.

882.00 891.00 S: That's all I'm saying.

891.00 894.00 GH: Welcome to America.

894.00 902.00 S: I tried to find out as much information as I could about that. Apparently, the satellite was in a decaying orbit and most of the debris fell into the atmosphere.

902.00 907.00 S: So it wasn't as bad, but still they shouldn't have done it, but it was not as bad.

907.00 914.00 S: And then China blew one up in 2007, and then India tested their ASAT system, and now Russia's done it.

914.00 916.00 S: There's four countries who are developing ASAT systems.

916.00 919.00 C: Do you know the context around China and India?

919.00 923.00 C: Was it a similar thing where they had experts weigh in and said this is the safe way to do this?

923.00 924.00 S: No.

924.00 925.00 S: So this is all about saber rattling.

925.00 930.00 S: This is all about militarizing space, which is a horrible idea in and of itself.

930.00 932.00 S: Of course, Russia's like, nothing to see here.

932.00 933.00 S: The debris was fine.

933.00 935.00 S: It's not going to hurt anybody.

935.00 936.00 S: America are hypocrites.

936.00 937.00 S: They did the same thing.

937.00 938.00 S: Whatever.

938.00 955.00 S: But the bottom line is, imagine if they're preparing for an anti-satellite war, like where they blow up all of our spy satellites and we have to blow up all of their spy satellites, and low Earth orbit would become rapidly- Inedible.

955.00 956.00 B: We can't do anything with it anymore.

956.00 957.00 B: Unusable if that happens.

957.00 960.00 B: So this is far worse than those words might imply.

960.00 961.00 B: I mean, there's a name for this.

961.00 964.00 B: This is the Kessler syndrome or Kessler effect.

964.00 970.00 B: This is so horrific that it's a collisional cascading where eventually everything in low Earth orbit hits everything else.

970.00 975.00 B: It's all debris traveling at like 17,000 miles an hour, the speed of a rifle bullet.

975.00 978.00 B: We could never, we won't be able to leave orbit.

978.00 982.00 B: You can't leave orbit for years or generations.

982.00 984.00 B: All of our communication satellites- Or have satellites.

984.00 985.00 B: No GPS.

985.00 986.00 B: Satellites are down.

986.00 992.00 B: We will go back decades in time and this would be horrible beyond imagining.

992.00 994.00 B: And it's like, yeah, let's do it.

994.00 995.00 B: Let's make it happen.

995.00 998.00 B: But it's not, it's the thing is- It's beyond irresponsible.

998.00 1000.00 S: Kessler syndrome, it's not like it's an event.

1000.00 1014.00 S: It's when you get past this tipping point where debris is accumulating faster than it's decaying just from the natural drag of the atmosphere, et cetera, then that means that it's going to continue to do that.

1014.00 1020.00 S: It may take a long time, but it becomes this unstoppable juggernaut of increasing debris.
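(The tipping point Steve describes, debris created by collisions outpacing debris removed by atmospheric drag, can be sketched as a toy model. The rates below are invented purely for illustration, not real orbital data.)

```python
# Toy Kessler-syndrome model: collisions add debris at a rate proportional
# to the square of the debris count (any piece can hit any other piece),
# while atmospheric drag removes debris at a rate proportional to the count.
# Illustrative rates only; not calibrated to real orbital data.

def step(n, collision_rate=1e-6, decay_rate=0.02):
    """Advance the debris count by one year."""
    return n + collision_rate * n * n - decay_rate * n

def simulate(n0, years=50):
    """Run the toy model forward from an initial debris count."""
    n = float(n0)
    for _ in range(years):
        n = step(n)
    return n

# The tipping point is where the two rates balance:
# n = decay_rate / collision_rate = 20,000 pieces in this toy model.
below_tipping = simulate(10_000)  # drag wins: the count shrinks
above_tipping = simulate(30_000)  # collisions win: the cascade runs away
```

Below the made-up threshold the debris population decays on its own; above it, growth feeds on itself, which is the "unstoppable juggernaut" in the discussion.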

1020.00 1026.00 S: And eventually we'll get to the point where, yeah, that low Earth orbit becomes useless or way too hazardous.

1026.00 1031.00 S: And this, imagine tens of thousands of extra pieces of junk flying around low Earth orbit.

1031.00 1035.00 S: One bolt hitting a satellite could take it out.

1035.00 1043.00 S: Of course, if it hits the ISS or China has a space station up there with a couple of astronauts aboard, it could be catastrophic.

1043.00 1050.00 S: So clearly we need some kind of international, I mean, the UN has a committee that says, yeah, this is the guidelines.

1050.00 1052.00 S: There are treaties, yeah.

1052.00 1054.00 S: But there's no force of law behind this.

1054.00 1061.00 S: This is just the guidelines, number four of which is, let me read it, avoid intentional destruction and other harmful activities.

1061.00 1065.00 S: And I checked, I checked the document does exist in Russian.

1065.00 1067.00 S: Or else what is the point?

1067.00 1069.00 S: Well, it's just this is the guidelines, right?

1069.00 1071.00 S: Sort of a guideline.

1071.00 1077.00 E: Yes, but there's nothing attached to it saying we're going to fine you $10 billion, I get that.

1077.00 1078.00 E: They don't have the authority.

1078.00 1079.00 S: Right, there's no teeth.

1079.00 1080.00 S: There's no teeth, yeah, this is the guidelines.

1080.00 1081.00 C: There's no teeth.

1081.00 1083.00 C: But that's the whole point of a treaty, right?

1083.00 1084.00 C: Is that everybody agrees.

1084.00 1085.00 C: We need to raise it to the level of a treaty.

1085.00 1086.00 S: We need to raise it to the level.

1086.00 1090.00 C: But the problem is if people don't decide to join.

1090.00 1091.00 C: Yes.

1091.00 1092.00 C: You know, it's like we all have to be in agreement.

1092.00 1094.00 C: Yeah, but then, but the point is like- Like Dr. Strangelove.

1094.00 1096.00 C: We have to all kind of be equally free.

1096.00 1102.00 S: The other countries use like economic strangleholds to make them feel it and hopefully they'll comply.

1102.00 1108.00 C: Yeah, but I mean it's like even the Geneva Convention doesn't make sense in places where they're unaware of it and they're not willing to abide by it.

1108.00 1111.00 S: But the thing is this is poisoning your own water, right?

1111.00 1112.00 S: Correct.

1112.00 1115.00 S: I mean Russia and China are invested in orbit too.

1115.00 1116.00 S: Of course.

1116.00 1120.00 S: I mean and it's just madness to blow up a satellite.

1120.00 1125.00 S: I mean we're scrambling to try to find ways to reduce space junk.

1125.00 1130.00 S: So, one goal is to just prevent the accumulation of more space junk.

1130.00 1138.00 S: So, you know, now there are standards: if you put a satellite in orbit, you have to have a plan for how to de-orbit it at the end of its life.

1138.00 1142.00 S: So, there aren't just more and more satellites up there at the end of their life.

1142.00 1151.00 S: And we're researching ways of getting stuff out of orbit that's already there like shooting a laser at it, slowing it down until it goes into the atmosphere.

1151.00 1155.00 S: We're setting up these nets that will capture- James Bond had the James Bond movie.

1155.00 1159.00 S: They have the ship and it opens up this mouth and then it just goes like that.

1159.00 1160.00 S: What's up with that?

1160.00 1161.00 S: That would not work.

1161.00 1164.00 B: That would not work because the fuel you would need would be so prohibitive.

1164.00 1167.00 B: Because you think about it, you have to go from orbit to orbit to collect all the junk.

1167.00 1168.00 B: There's not enough fuel.

1168.00 1170.00 B: Even for the big stuff though?

1170.00 1171.00 B: Yes.

1171.00 1173.00 B: So, James Bond was wrong.

1173.00 1174.00 B: It was totally wrong.

1174.00 1175.00 B: Lied to you.

1175.00 1176.00 B: Lies.

1176.00 1177.00 E: More lies.

1177.00 1184.00 S: It wouldn't work for the same reason that Gravity, the movie, was flawed, in that they totally blew the orbital mechanics.

1184.00 1188.00 S: You don't just point at your destination and use a jetpack to go there.

1188.00 1190.00 S: It's not like the atmosphere at all.

1190.00 1196.00 S: You have to match the velocity and vector of the thing that you want to get to and you couldn't do that with a jetpack.
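(A rough numerical aside on why you can't just point at a target and fly over: orbital speeds are enormous, so even a modest change of direction costs kilometers per second of delta-v. The constants are standard textbook values; the 400 km / 10 degree scenario is just an example.)

```python
import math

MU_EARTH = 3.986e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6    # Earth's mean radius, m

def circular_speed(alt_m):
    """Speed of a circular orbit at the given altitude above Earth (m/s)."""
    return math.sqrt(MU_EARTH / (R_EARTH + alt_m))

def plane_change_dv(alt_m, delta_deg):
    """Delta-v (m/s) needed to tilt a circular orbit's plane by delta_deg."""
    v = circular_speed(alt_m)
    return 2.0 * v * math.sin(math.radians(delta_deg) / 2.0)

v_leo = circular_speed(400e3)          # roughly 7.7 km/s at ISS altitude
dv_tilt = plane_change_dv(400e3, 10)   # over 1.3 km/s just to tilt 10 degrees
```

Debris pieces sit in many differently tilted orbits, so a collector hopping between them pays a plane-change bill like this again and again, which is why the fuel cost is prohibitive.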

1196.00 1197.00 S: How about magnets?

1197.00 1198.00 S: No.

1198.00 1209.00 S: So, magnets may be able to be helpful in capturing or holding onto bits of magnetic debris, but it's not like you're going to get a magnet up there that's going to just attract all the junk.

1209.00 1211.00 S: Plus, we have tons of satellites up there that we can't eff with.

1211.00 1213.00 S: We can't screw with while we're collecting the junk.

1213.00 1215.00 B: I think lasers are probably one of our best options.

1215.00 1219.00 S: Laser might be a good ground-based thing, but again, this is all theoretical at this point.

1219.00 1223.00 S: Did you ever see that video game where there's a sticky ball and it rolls and it gets bigger?

1223.00 1224.00 S: You guys remember that game?

1224.00 1225.00 S: It's called Katamari.

1225.00 1226.00 S: Yeah.

1226.00 1227.00 S: We need that for space.

1227.00 1229.00 S: Yeah, but again, you're going to get the good stuff too.

1229.00 1231.00 S: We may have to just hit the reset button.

1231.00 1234.00 S: Just clear orbit and start from scratch.

1234.00 1235.00 S: That would be terrible.

1235.00 1236.00 S: It's not going to happen.

1236.00 1237.00 C: It would go dark for a while.

1237.00 1238.00 S: Yeah.

1238.00 1239.00 S: It's not going to happen.

1239.00 1243.00 GH: Let's just tell the Russians that if you lose the satellites, they can't watch Netflix anymore.

1243.00 1249.00 J: Oh my God.

1249.00 1254.00 S: I just thought everybody needed one more thing to worry about.

1254.00 1257.00 S: I'm going to follow this one just to see what the fallout is.

1257.00 1258.00 S: I don't think anything's going to happen.

1258.00 1261.00 S: It's like nothing happened when China did it in 2007.

1261.00 1267.00 S: But clearly, just saying, wagging your finger at Russia for doing this is not going to accomplish our goals.

1267.00 1276.00 S: This needs to elevate to a high profile, very much needed international treaty where you just can't blow up satellites anymore.

1276.00 1278.00 S: Because we can't do this too many more times.

1278.00 1280.00 S: It's like Russian roulette.

1280.00 1284.00 B: This alone could initiate the cascade.

1284.00 1285.00 B: Could have pushed us over the edge.

1285.00 1286.00 B: It's possible.

1286.00 1287.00 B: Not likely, but it's possible.

1287.00 1298.00 S: Kessler has already simulated, the same guy, the Kessler effect guy, ran simulations like, yeah, we're pretty much already past the point of no return if we don't do anything to whack back the space junk.

1298.00 1301.00 S: It's just going to take, the question is how quickly is this going to happen?

1301.00 1303.00 E: Oh, I see. It's inevitability, but time.

1303.00 1307.00 GH: That's a great album title, Whack Back the Space Junk.

1307.00 1309.00 GH: I'm writing that down.

1309.00 1314.00 S: George, you can have that for free.

Asteroid May Be Fragment of the Moon (21:54)

1314.00 1318.00 S: All right, Bob, tell us about this wacky asteroid.

1318.00 1320.00 B: So a tiny piece of the moon appears to be orbiting the earth.

1320.00 1322.00 B: Does anybody see this news item?

1322.00 1323.00 B: So what is that all about?

1323.00 1325.00 B: How could that even happen?

1325.00 1331.00 B: So this is published in Communications Earth &amp; Environment, and astronomers from the University of Arizona are leading the research.

1331.00 1336.00 B: So Kamo'oalewa is the name of this asteroid.

1336.00 1338.00 B: I'm going to call it Kamo for the rest of this talk.

1338.00 1340.00 B: It's a near-Earth asteroid.

1340.00 1344.00 B: We've talked about near-Earth asteroids on the show before, near-Earth object, near-Earth asteroid.

1344.00 1345.00 B: There's thousands of these.

1345.00 1347.00 B: These are things that are tracked as much as we can.

1347.00 1351.00 B: We've detected a lot of them, but there's a lot more that we haven't detected.

1351.00 1356.00 B: And clearly they're important because they're big, massive objects, most of them, that are moving very fast near the earth.

1356.00 1358.00 B: So clearly they're very, very important.

1358.00 1363.00 B: Some of them cross the earth's orbit, and some of them are tiny, you know, from tens of feet.

1363.00 1364.00 B: Some of them are big, like Ganymed.

1364.00 1365.00 B: I never heard of this one.

1365.00 1367.00 B: Ganymed is 32 kilometers across.

1367.00 1371.00 B: But we're not in any danger from that because that would be quite devastating.

1371.00 1372.00 B: Wait, but that's Ganymede?

1372.00 1373.00 B: No, Ganymed.

1373.00 1374.00 B: Not Ganymede.

1374.00 1375.00 S: Ganymed.

1375.00 1376.00 S: Yeah, Ganymed, not to be confused with Ganymede.

1376.00 1378.00 B: Right, that's why I said Ganymed.

1378.00 1379.00 B: How is it spelled?

1379.00 1381.00 B: Ganymed, G-A-N-Y-M-E-D.

1381.00 1382.00 B: No E at the end.

1382.00 1383.00 No.

1383.00 1386.00 B: That's why I said Ganymed.

1386.00 1393.00 B: Okay, so this is as big as a Ferris wheel, 50 meters, 150 to 190 feet.

1393.00 1396.00 B: So it's not gargantuan, but I mean that would pack a punch.

1396.00 1399.00 B: And it's potentially a piece of the moon.

1399.00 1406.00 B: It's 9 million miles away, so it's not horribly close, but 9 million miles is pretty close, actually, when you think about it.

1406.00 1411.00 B: Also, the most fascinating thing about it, up until recently: this is a quasi-satellite.

1411.00 1414.00 B: And I wasn't too familiar with quasi-moons or quasi-satellites.

1414.00 1415.00 B: They're not moons.

1415.00 1420.00 B: They're bound to the sun, so they're not really orbiting the Earth.

1420.00 1423.00 B: Now here's another term I learned just today, a Hill sphere.

1423.00 1425.00 B: These quasi-satellites are outside the Hill sphere of the Earth.

1425.00 1431.00 B: That's the area around the Earth where our gravity dominates, so that's where we can grab onto any of our satellites.

1431.00 1434.00 B: If you're beyond that, then the Earth can't really have much of a gravitational impact.

1434.00 1443.00 B: That said, though, there is a gravitational resonance that kind of locks in the quasi-satellite, so it stays relatively near the Earth and moves at about the same speed.

1443.00 1447.00 B: So it's basically kind of a companion that kind of travels with us.

1447.00 1449.00 B: Its year is our year.

1449.00 1452.00 B: It takes just as long to orbit the sun.
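
(Transcriber's note: the Hill sphere radius Bob describes can be estimated with the standard formula r ≈ a·(m/3M)^(1/3), where a is the Earth–Sun distance and m/M is the Earth-to-Sun mass ratio. A minimal sketch, using approximate textbook values:)

```python
# Rough estimate of Earth's Hill sphere radius: the region where
# Earth's gravity dominates over the Sun's for orbiting bodies.
# r ~ a * (m / (3 * M))**(1/3), with a = Earth-Sun distance.
a_km = 1.496e8          # Earth-Sun distance in km (1 AU)
mass_ratio = 3.003e-6   # Earth mass / Sun mass
r_hill_km = a_km * (mass_ratio / 3) ** (1 / 3)
print(round(r_hill_km))  # roughly 1.5 million km
```

At about 9 million miles (~14.5 million km), Kamo'oalewa sits well outside this radius, which is why it is only a quasi-satellite and not a true moon.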

1452.00 1456.00 B: Now we have five quasi-satellites that orbit the sun that are near the Earth.

1456.00 1458.00 B: I didn't know we had five of those.

1458.00 1459.00 B: But that's not even the new stuff.

1459.00 1469.00 B: The new stuff this year, or relatively recently, was that they looked at the light from this quasi-satellite, from Kamo, and they said, this spectrum is like the moon. What's going on?

1469.00 1471.00 B: Could this actually be a part of the moon?

1471.00 1481.00 B: This is the part that I really enjoyed because they showed that they were really good scientists, because this happened like a few years ago, and they could have gone to the newspapers and said, look, we found a chunk of the moon that's orbiting.

1481.00 1483.00 B: But they didn't do that because that would be a little bit premature.

1483.00 1485.00 B: So they actually studied the crap out of this.

1485.00 1488.00 B: They looked at spectra from other asteroids.

1488.00 1490.00 B: They studied it for three years.

1490.00 1495.00 B: They were looking for a plausible explanation to say, no, this isn't the moon because that's kind of very odd.

1495.00 1498.00 B: Here's a quote that I love from one of the co-authors.

1498.00 1499.00 B: This is good science.

1499.00 1503.00 B: We doubted ourselves to death, which is what all good scientists do, right?

1503.00 1507.00 B: So then finally this year, it kind of all came together because the spectrum was really compelling.

1507.00 1513.00 B: They looked at the orbit a lot more closely, and they said, this is an orbit of something that could come from the moon.

1513.00 1519.00 B: If you look at other asteroids, it would be very, very difficult, almost impossible, for it to go into this type of orbit.

1519.00 1521.00 B: So that's just another clue that this is a piece of the moon.

1521.00 1527.00 B: And then the latest round of observations came in, because last year, of course, with COVID, they couldn't get telescope time since they shut down the telescope.

1527.00 1533.00 B: So they looked at it again and they said, the only explanation that makes sense is that this is actually a piece of the moon.

1533.00 1535.00 B: So that's their conclusion.

1535.00 1541.00 B: So we're going to have this companion for the next 500 years.

1541.00 1543.00 B: They think it's been here for 500 years.

1543.00 1547.00 B: Well, after a few hundred years, it will probably go into another type of orbit.

1547.00 1548.00 B: It won't be a quasi-satellite orbit.

1548.00 1552.00 B: It could be like a horseshoe orbit or something different, but it's not going to stay there.

1552.00 1558.00 B: In the inner solar system, they don't last very long, but in the outer solar system, they could actually last for billions of years.

1558.00 1566.00 B: So Jupiter, no, not Jupiter, actually, but Uranus and Neptune could have a quasi-satellite for billions of years because the orbits can be much more stable over there.

1566.00 1573.00 B: The other thing to look forward to with this is that in the future, we may find that the other four quasi-satellites may be there from the moon as well.

1573.00 1575.00 B: So we'll find out.

1575.00 1583.00 S: And Bob, does that mean that this piece was kicked off when the moon was kicked out of the Earth?

1583.00 1587.00 B: No, we think it's like from an ejecta, from a collision.

1587.00 1588.00 B: A big collision.

1588.00 1592.00 B: 150 feet, yeah, I guess, but it must have been a big collision to knock out.

1592.00 1601.00 B: That's probably what happened, but they really don't have a lot of evidence, and there's other theories that potentially could be, but I think that's the most viable reason how it happened.

1601.00 1603.00 S: Any idea of how long ago that might have happened?

1603.00 1615.00 B: No, the paper and the research didn't point to any ideas of how long ago it might have happened. And even finding the crater on the moon, that's probably impossible at this point.

1615.00 1617.00 S: Okay, thank you. That is so cool.

Loss Aversion (26:56)[edit]

1617.00 1626.00 S: All right, Cara, you're going to talk about loss aversion, which sounds boring, but it's really interesting because it has to do with...

1626.00 1627.00 S: Great intro.

1627.00 1629.00 J: That's the one for me to slag on.

1629.00 1632.00 J: That really sets the stage.

1632.00 1636.00 GH: It's only his 800th show. Come on, give him a chance. He's learning.

1636.00 1640.00 S: Because it has to do with the psychology of economics.

1640.00 1641.00 S: Yes, yes.

1641.00 1642.00 S: How people make decisions.

1642.00 1673.00 C: Yeah, human decision making, so social psychology, behavioral economics, neuroeconomics, behavioral neuroeconomics, lots of different words for different ways to slice and dice the same thing. And I think ultimately one of the central theses that I take out of this news item is that what we're going to be talking about for the next few minutes is a construct, and there's a lot of different ways to slice and dice this construct, and apparently within academic circles people are starting to disagree.

1673.00 1677.00 C: So loss aversion is sort of exactly how it sounds.

1677.00 1683.00 C: It's this idea that was first developed by Daniel Kahneman and his collaborator Tversky.

1683.00 1685.00 C: I think that's how you pronounce it, Amos Tversky.

1685.00 1700.00 C: In 1979 when they published a paper and they described sort of intuitively and through observation, but not necessarily through empirical testing, this idea that losses loom larger than gains.

1700.00 1704.00 C: That's how you'll often see it quoted, losses loom larger than gains.

1704.00 1714.00 C: So the idea here is that people are more concerned, they're more averse to losing something they already have than they are to the prospect of gaining something they don't have.

1714.00 1719.00 C: And this has been a cornerstone of behavioral economics for decades.

1719.00 1727.00 C: Like basically that was the beginning of the development of the field, and it sort of has been one of the keystones of the field for a very long time.

1727.00 1737.00 C: When I talk about slicing and dicing things, I mean this came out of their early publication, which was actually a larger idea.

1737.00 1746.00 C: And then there's a smaller idea called the endowment effect, and that actually does seem to hold a lot of weight in evidence-based investigations.

1746.00 1762.00 C: And the endowment effect, it started, I guess the idea is that it started at I think Columbia, where an economics class took a bunch of mugs from the gift shop, like Columbia mugs, and handed them out to half the class, and the other half the class didn't get mugs.

1762.00 1768.00 C: And then they said, you know, secret ballot, how much would you sell your mugs that you have for?

1768.00 1771.00 C: And then they said, how much would you buy their mugs for?

1771.00 1776.00 C: And this theme happens over and over, where the people who have the mugs are like, my mug is worth $6.

1776.00 1779.00 C: And the people who buy the mugs are like, I wouldn't pay more than $3 for that.

1779.00 1782.00 C: And it happens over and over and over.

1782.00 1787.00 C: Interestingly, researchers, there's sort of like a new wave of researchers who are saying like, I don't buy it.

1787.00 1796.00 C: I don't buy, first of all, that people are always more concerned about losses, more sort of averse to losses than they are excited about gains.

1796.00 1802.00 C: And I'm also not sure that this mug thing works when there's actual stakes in the world.

1802.00 1805.00 C: And so they were, not like steaks, but like stakes.

1805.00 1818.00 C: And so they started to, yeah, to investigate this across different iterations, like raising the stakes, lowering the stakes, raising the risk factors, looking at different cultures, looking at different personality types.

1818.00 1822.00 C: And they sort of started to realize that this doesn't always hold.

1822.00 1837.00 C: Although it is true in sort of casino games and in some simulations that if you offer somebody the option to win $100, they're less likely to take that option than they are in the same situation having to lose $100.

1837.00 1842.00 C: They're much rather hold on to the $100 they already have than to take an option to win $100.

1842.00 1847.00 C: And sometimes it can be, there have been calculations all the way down to 50%.

1847.00 1856.00 C: They might be more willing to want to hold on to their $100, or sorry, hold on to $50 than they are willing to gamble for $100.

1856.00 1860.00 C: And once you kind of pass that threshold, it sort of falls apart a little bit more.
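
(Transcriber's note: the "losses loom larger than gains" asymmetry discussed here is usually modeled with Kahneman and Tversky's prospect-theory value function. The parameter values below are the commonly cited 1992 estimates, used purely as an illustration of the idea, not as part of the study discussed in this segment.)

```python
# Prospect-theory value function: gains are valued as x**alpha,
# losses as -lam * (-x)**beta. With lam > 1, losses "loom larger".
def subjective_value(x, alpha=0.88, beta=0.88, lam=2.25):
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

gain = subjective_value(100)    # felt value of winning $100
loss = subjective_value(-100)   # felt value of losing $100
print(abs(loss) / gain)         # ~2.25: the loss feels over twice as big
```

Because alpha and beta are equal here, the ratio comes out to exactly lam, which matches the rough "people need about a 2-to-1 upside before they'll gamble" pattern described above.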

1860.00 1867.00 C: But with the endowment effect, they're finding that once you raise the stakes, things kind of do start to fall apart.

1867.00 1880.00 C: And some of these more modern researchers are saying things like mostly what we realize is that although, yes, these people said $6 and yes, these people said $3, when we really started to dig, we noticed that nobody really gave a shit.

1880.00 1884.00 C: In the end, they were like, I don't really care. I don't really care about any of this.

1884.00 1894.00 C: And that seems to be one of the important concepts within this idea of loss aversion, that once the stakes start to shift, it sometimes does fall apart.

1894.00 1900.00 C: But of course, this is Nobel Prize winning economist Daniel Kahneman, who with his collaborator came up with this.

1900.00 1904.00 C: The interesting thing is he's been a little bit careful to weigh in on it.

1904.00 1908.00 C: He's like, I don't know, yes, I'm questioning, just like this is good science, right?

1908.00 1911.00 C: Sort of questioning what we laid down. I still think it's true, but I'm not.

1911.00 1915.00 C: I never said it was true in every circumstance as applies to every person.

1915.00 1922.00 C: And he's like, I'm waiting to see. He doesn't want to put his nickel down quite yet because he's like, I'm waiting to see what these young researchers are going to say about this.

1922.00 1925.00 C: But I was just talking to Steve before we started recording.

1925.00 1941.00 C: And I was like, you know, one thing that comes up for me, and I'm interested in you guys' take on this, is that, and maybe I'm making a false connection here, but that another very central tenet of behavioral economics, specifically behavioral psychology, is that punishment doesn't really work.

1941.00 1949.00 C: But we know that classical conditioning in many aspects does work, and that positive reinforcement and even negative reinforcement.

1949.00 1953.00 C: So long story short, negative reinforcement is different than punishment, right?

1953.00 1957.00 C: So positive reinforcement is like you do something and you get the cookie.

1957.00 1964.00 C: And then negative reinforcement is like I beat you in the head until you do the thing I want you to do, and then I stop beating you. So I take away the aversive stimulus.

1964.00 1965.00 C: That would work on me.

1965.00 1971.00 C: Yeah. And then punishment is you do the thing I don't want you to do, so I punch you in the face.

1971.00 1976.00 C: So that's sort of the opposite. And we know when it comes to training animals and things like that, that punishment doesn't usually work.

1976.00 1977.00 C: Do they literally do that in the studies?

1977.00 1979.00 C: They punch them in the head. I don't know. They probably have.

1979.00 1982.00 C: This is all before Nuremberg. We don't know.

1982.00 1986.00 C: I'm interested in that sort of relationship, right?

1986.00 1990.00 C: Yeah. And the fact that loss aversion has so long been such a central tenet.

1990.00 1993.00 C: People are so afraid to get rid of what they already have.

1993.00 2000.00 C: They'd much rather hold on to what they already have than to try and gamble for something that they want.

2000.00 2007.00 C: But at the same time, this idea that like – because this is not a theory that people are just talking about in a vacuum.

2007.00 2013.00 C: This shapes public policy. This shapes the way that we interact, the way that we build economic systems.

2013.00 2019.00 C: It shapes the way that prisons are built and schools are built and investments are developed.

2019.00 2025.00 C: And so there's a lot of like real downstream outcome of what might actually be kind of shaky science.

2025.00 2026.00 C: That's a little bit interesting.

2026.00 2032.00 S: Isn't positive reinforcement just much more healthy mentally? Is that just a –

2032.00 2037.00 C: Well, it also works better. That's the other thing. It actually is more effective. Like it has a better outcome too.

2037.00 2046.00 S: I would think that in our society, like as an example, China has that whole like societal rating system thing, right? Yeah. Social credit score.

2046.00 2047.00 S: You're talking about WeChat.

2047.00 2055.00 S: That to me is horrifying because it isn't positive reinforcement. It really is like you do what we want you to do or you get tanked.

2055.00 2057.00 S: So what would you consider that to be?

2057.00 2062.00 C: Well, I think that – but I think you're saying it as if in China – and I'm not defending it because it scares me and it's very Black Mirror.

2062.00 2068.00 C: But I think it's not as – it's not like the government is handing down cookies and taking your cookies and giving you cookies.

2068.00 2075.00 C: It's more like everybody has a Yelp score. Like that sort – it's more about how they interact with people and then people rate each other.

2075.00 2077.00 C: And businesses rate consumers and consumers rate businesses.

2077.00 2083.00 S: Yeah. I mean but it's not just like a Google rating where you're looking up – you're looking for a contractor and you –

2083.00 2088.00 C: No. It has real world consequences because it's you. Yeah. But – Yeah. There is a Black Mirror episode about this, right?

2088.00 2089.00 C: Yes.

2089.00 2090.00 C: I thought so. Okay.

2090.00 2094.00 S: But it's not just business. It's like you. It's a personal rating.

2094.00 2096.00 S: Yeah. That's true because there's Yelp scores. There's – yeah.

2096.00 2102.00 S: Imagine if you went on Facebook and you had like a four out of five. Cara's only a four out of five person.

2102.00 2108.00 C: I get it. But I mean – but to be fair, again, like we have credit rating in this country. We do have those kinds of people.

2108.00 2109.00 C: Yeah. But nobody can see that.

2109.00 2112.00 C: Anybody who we want to get a loan from, who we want to –

2112.00 2114.00 S: Yeah. But street people, people that I know –

2114.00 2115.00 J: Street people.

2115.00 2118.00 C: Nobody can see it. Hey, buddy. What's your face for?

2118.00 2123.00 C: I'm just saying remember it's always shades of gray. We want to think in black and white, but really this is shades of gray.

2123.00 2124.00 C: And we do –

2124.00 2126.00 B: And Cara, you're really a 4.5 out of five.

2126.00 2132.00 S: That's positive reinforcement. Cara, is a 4.5 great enough for you?

2132.00 2134.00 B: No. I'll take it. I love that.

2134.00 2145.00 S: So circling back though, I like the give them a cookie thing. Like I like the positive reinforcement thing. And then you said like this bleeds into schools and governmental systems.

2145.00 2151.00 S: I don't even think we have a give them a cookie thing in the real world. Like I don't see positive reinforcement anywhere.

2151.00 2164.00 C: It happens. I mean, but you're right. It would be hard to – it's interesting because I'm seeing a connection where one might not necessarily – because I don't think behavioral economists are weighing in from cognitive behavioral perspective.

2164.00 2172.00 C: So like these experiments that I was talking about, like the Pavlov dog experiments and things like that, it's a very simplistic way to look at human behavior.

2172.00 2187.00 C: And a more sophisticated model is sort of these like Kahneman models, this idea of we tend to see that people are more afraid of this kind of risk or – and some – sorry to kind of go off on a tangent. Some researchers are saying I don't think that's – that loss aversion is what's happening at all.

2187.00 2194.00 C: I think some people just – it's easier to be passive than active. And we've talked about this a lot on the show, this idea of the passive risk.

2194.00 2207.00 C: Some people actually when they're anti-vaxx, one of the reasons that they go to that is that the passive risk of doing nothing, of not getting vaccinated seems less risky to them than the active risk of doing something, which is getting the jab.

2207.00 2212.00 C: Because they're not calculating in the active risk of like COVID flying in their face all day.

2212.00 2213.00 C: Right. It's not a good –

2213.00 2224.00 GH: There's been a bunch of schools I know that have started programs where they actually pay students for good grades. Sometimes they're supplemented by individual contributions, but they've been very successful programs.

2224.00 2238.00 GH: Like if you read a book, let's say, you get 50 bucks or if you get an A, you get 100 bucks. And on the outside, you look at that and you think, oh my gosh, like we're paying kids to get good grades. Like is that a good way to do it? And it turns out it's really good.

2238.00 2244.00 S: I mean in the end, if you think about the investment, right, like I don't know who's paying. Is the university paying?

2244.00 2249.00 S: And why do they care about the grades? Whatever. Like we're not exploring the economics of it.

2249.00 2255.00 S: If you just think about the payback to society, you know – It could be cost-effective even though it seems weird.

2255.00 2256.00 S: Exactly.

2256.00 2266.00 S: But the thing is, getting back to what Cara was saying, that of course, human behavior is super complicated because there's hundreds of these things happening all at once.

2266.00 2279.00 S: And the psychological studies are trying to pull out one thread of this by creating this artificial construct, which is only simulating human behavior in a very narrow way that also has a lot of assumptions involved.

2279.00 2286.00 S: I always think of the marshmallow test where they thought that they were testing the ability – Resolve.

2286.00 2295.00 S: Yeah, like the executive function, the ability to defer gratification, but they were really testing, at least in some people, their trust that the adult's going to come back with the marshmallow.

2295.00 2299.00 C: Yeah, there's a certain percentage of food insecure people who are never going to do well on the marshmallow test.

2299.00 2301.00 S: Right. And it's nothing to do with their executive function.

2301.00 2305.00 S: It's that they have good reasons not to trust that the adults are going to keep their word in their life.

2305.00 2309.00 S: Let's take a vote. Who here in the audience votes for the cookie?

2309.00 2313.00 S: Do you want to be – do you think the positive reinforcement – That's not the point of the story.

2313.00 2316.00 J: This is what I'm – you asked for my opinion.

2316.00 2317.00 S: But I want the cookie.

2317.00 2322.00 S: I'm just curious what people think. Do you guys – who here agrees with me that the – and George, well, you agree, right?

2322.00 2326.00 C: But we all agree the positive reinforcement is more effective than punishment.

2326.00 2328.00 C: I thought you wanted to know what I thought about this.

2328.00 2331.00 C: No, the question is – I was going to say, if you give a damn cookie –

2331.00 2333.00 J: I just want to know – Let's just give them a damn cookie.

2333.00 2336.00 J: Who wants the goddamn cookie in this room, right?

2336.00 2359.00 C: So, Jay, I think the point that Steve was making, which is an important one, and I think it speaks to like a skill that we as skeptics can like continue to sort of sharpen, is that many things within science that are often taught as if they just are, are things that scientists observed and came up with terms to try to describe.

2359.00 2360.00 C: Made their best sort of guess at what's going on.

2360.00 2363.00 C: Right. So we see a phenomenon and we say, I want to organize this.

2363.00 2366.00 C: I want to categorize this. I want to put this into a box.

2366.00 2370.00 C: But more often than not, things don't perfectly fit in boxes.

2370.00 2381.00 C: Very often when we think we're measuring a construct in psychology, we might be measuring a larger construct or a smaller construct and calling it this thing, but there's like these other things that are happening.

2381.00 2397.00 C: And that's why we design experiments, especially experiments that have to do with human behavior, to try and tease out what's called variance, like how much of what we're measuring is based on the variable we're manipulating and how much is like other crap that we're catching in that fishing net.

2397.00 2401.00 C: Like how much of this is bycatch and how much of it is what we're looking for.

2401.00 2404.00 S: Even if you just limit it to behavioral economics, right?

2404.00 2409.00 S: Just like people's decision making when they're buying stuff or even making this- Or voting.

2409.00 2415.00 S: Or voting or deciding whether or not they should get vaccinated or whatever, how much of that is loss aversion?

2415.00 2418.00 S: How much of it is risk aversion, which is actually a different thing?

2418.00 2422.00 S: And it turns out people are just really weird when it comes to risk aversion.

2422.00 2424.00 S: Because we're not good at measuring it.

2424.00 2425.00 S: We're not good at it.

2425.00 2429.00 S: So people are definitely afraid of missing out on a potential positive.

2429.00 2430.00 S: FOMO.

2430.00 2431.00 S: Yeah, there's the FOMO thing.

2431.00 2433.00 S: So like why do people play the lottery?

2433.00 2438.00 S: They're almost guaranteed to lose their money because they don't want to miss out on the chance of winning the big payoff.

2438.00 2452.00 S: Why do people take alternative medicine that is so weird it can't possibly work because they're more afraid of missing out on the benefits that they're being sold than the money that they're being asked to spend on it?

2452.00 2458.00 S: You would think that the loss aversion would keep them from doing that, but they're afraid of missing out on the benefit.

2458.00 2462.00 S: And it's also what they consider to be their baseline.

2462.00 2467.00 S: People are more clutchy with money that they already have.

2467.00 2471.00 S: Yeah, that's the core of the loss aversion.

2471.00 2473.00 C: It's like I already got this.

2473.00 2474.00 C: Please don't take it from me.

2474.00 2478.00 S: But there's also a sunk cost bias where – Huge.

2478.00 2479.00 C: You see it all the time in poker.

2479.00 2480.00 C: I know.

2480.00 2481.00 C: You see it all the time when you're playing poker.

2481.00 2483.00 C: I'm already in the pot.

2483.00 2484.00 C: I better keep playing.

2484.00 2485.00 C: Because otherwise I'll lose the money.

2485.00 2487.00 E: You don't know when to pull out.

2487.00 2488.00 S: You don't know when to back out.

2488.00 2489.00 S: To cut your losses.

2489.00 2493.00 S: So the thing is we're – and the bigger picture is – and this was a debate.

2493.00 2497.00 S: Are we rational economic creatures or not?

2497.00 2498.00 S: Absolutely not.

2498.00 2502.00 S: Well, I think the answer now is that we're semi-rational.

2502.00 2509.00 S: There is some rationality happening, but there's all of these algorithms running in our heads.

2509.00 2522.00 S: We're chimpanzees when it comes to our behavior in that we could be programmed pretty much statistically speaking, not on the individual level, in terms of how we're going to respond down to sometimes the dollar value.

2522.00 2524.00 S: People will accept it up to 50%.

2524.00 2525.00 S: Yeah.

2525.00 2532.00 C: And here's – it's almost like, okay, if you're like a psych nerd like me, then you can almost think about this in terms of validity and reliability.

2532.00 2536.00 C: Like human behavior is not always valid, but it's often reliable.

2536.00 2538.00 C: Like we'll tend to do similar things over and over.

2538.00 2539.00 C: It's predictable.

2539.00 2540.00 C: Yeah, it's predictable.

2540.00 2542.00 C: Even if it doesn't always seem rational.

2542.00 2553.00 C: Because there are neurological – neuropsychological, I should say, patterns to our behavior that are – they're in – I don't want to use the word ingrained, but they become developed.

2553.00 2556.00 E: And that's what a lot of scam artists take advantage of.

2556.00 2562.00 C: And we evolved to use these heuristics and to use these mental shortcuts because they helped us survive for millions of years.

2562.00 2570.00 S: I mean, this whole thing, Cara, is like – it's such a confirmation that critical thinking is essential to operate a human mind, right?

2570.00 2573.00 S: Because you have to get out of this system, otherwise you're a slave to it.

2573.00 2576.00 C: And you have to remember that the architects of the system are in on this.

2576.00 2579.00 C: They're the ones critically thinking and making these decisions.

2579.00 2583.00 C: These behavioral economists know all of this.

2583.00 2584.00 C: They can be manipulated.

2584.00 2587.00 C: And they're actually – yes, that's what social psychology is.

2587.00 2592.00 C: It's how do we study how to convince people to do things, to choose X over Y.

2592.00 2594.00 C: I'm still like – It's persuasion science.

2594.00 2595.00 C: I'm still about the cookie.

2595.00 2597.00 C: I think that's the – I know. We'll get you a cookie.

2597.00 2598.00 C: Someone get this guy.

2598.00 2599.00 E: All right.

2599.00 2601.00 E: Cookie, please. End this.

Bank Robber Identified After 52 years (43:21)[edit]

2601.00 2602.00 E: George.

2602.00 2603.00 E: Yeah.

2603.00 2608.00 S: How do you catch a bank robber who has evaded the police for 52 years?

2608.00 2609.00 GH: Seriously, right?

2609.00 2612.00 GH: Okay, let me ask this just before we get into that part of the story.

2612.00 2617.00 GH: How many unsolved murders, rogues, do you think there are in the United States?

2617.00 2620.00 GH: How many unsolved murders do you think – Going back to what, George?

2620.00 2622.00 GH: Like that just – that are basically – that exist.

2622.00 2623.00 C: On record?

2623.00 2624.00 C: Cold cases.

2624.00 2626.00 C: Not as a percentage of other cases.

2626.00 2627.00 C: No.

2627.00 2628.00 C: Unsolved murders.

2628.00 2629.00 E: Raw number.

2629.00 2630.00 E: It's hundreds of thousands.

2630.00 2631.00 E: Tens of thousands?

2631.00 2632.00 E: Hundreds of thousands.

2632.00 2633.00 E: Hundreds of thousands.

2633.00 2637.00 GH: Two hundred and fifty thousand plus unsolved murders in the United States.

2637.00 2640.00 GH: And it grows by about 6,000 murders every year.

2640.00 2647.00 GH: So whereas bank robberies – only about 14% of bank robbery cases are unsolved.

2647.00 2654.00 GH: Bank robberies are one of the least efficient ways to be successful as a person that's going to break the law.

2654.00 2656.00 GH: Check that one off.

2656.00 2659.00 GH: The FBI calls it a loser crime.

2659.00 2665.00 GH: Usually the money – usually the money that's stolen in a bank robbery is gone.

2665.00 2667.00 GH: But it's often a small amount.

2667.00 2675.00 GH: Especially in the last, let's say, 15, 20 years, they've developed this kind of note robbery where someone – there's no weapon.

2675.00 2676.00 GH: There's no threat.

2676.00 2680.00 GH: Someone just hands a note to the teller that says, give me money.

2680.00 2683.00 GH: And the tellers are trained that like it's not worth it.

2683.00 2685.00 GH: You give them the money.

2685.00 2687.00 GH: There's video cameras everywhere.

2687.00 2688.00 GH: You're going to be filmed.

2688.00 2691.00 GH: There's a camera that's going to film your car driving away.

2691.00 2692.00 GH: There's a camera down the block.

2692.00 2694.00 E: And an 86% chance of being caught.

2694.00 2696.00 GH: And there's an 86% chance of being caught.

2696.00 2697.00 GH: So it's this thing.

2697.00 2703.00 GH: And yet because almost sort of this loss aversion, I wonder if it enters into it, you know there's money in there.

2703.00 2705.00 GH: You know there's cash in this building.

2705.00 2708.00 GH: So it's like, all right, let's do it.

2708.00 2716.00 GH: To compare, 75% of robberies are unsolved and 88% of burglaries are unsolved.

2716.00 2722.00 GH: Those have much lower sort of financial costs to the victims.

2722.00 2725.00 C: Well, and a smaller machine trying to bring that person to justice.

2725.00 2726.00 C: That's true.

2726.00 2728.00 C: A bank wants to recover their losses.

2728.00 2729.00 C: Yeah.

2729.00 2734.00 GH: And then like one count of bank robbery can get you up to a 25-year sentence in prison.

2734.00 2737.00 GH: So it really is like the dumbest thing.

2737.00 2738.00 GH: Even if it's unarmed?

2738.00 2739.00 GH: Apparently, yeah.

2739.00 2740.00 GH: Yeah.

2740.00 2742.00 GH: Just if you rob a bank, you can get up to 25 years as a sentence.

2742.00 2743.00 S: That's amazing.

2743.00 2744.00 S: Like don't mess with money.

2744.00 2745.00 S: That's basically what you're saying.

2745.00 2746.00 GH: That is a very – that's the thing.

2746.00 2747.00 GH: Don't mess with corporate money.

2747.00 2749.00 GH: Don't mess with the building money.

2749.00 2752.00 GH: Don't mess with people that have large amounts of money.

2752.00 2753.00 C: Yes, yes, yes.

2753.00 2756.00 C: If you want to mug someone, yeah.

2756.00 2757.00 GH: Sadly.

2757.00 2760.00 GH: Again, not that we're condoning any kind of mugging by any stretch.

2760.00 2761.00 S: But if you are going to do it.

2761.00 2763.00 S: But if you are going to do it.

2763.00 2764.00 S: You're having things to say.

2764.00 2767.00 GH: After the event, on the street, stranger, that's the way to go.

2767.00 2774.00 GH: That being said, in 1969, there was a gentleman named Theodore John Conrad.

2774.00 2777.00 GH: And he worked for a – what do you call it?

2777.00 2778.00 GH: Bank.

2778.00 2779.00 GH: Bank.

2779.00 2782.00 GH: And he stole $215,000 out of the bank.

2782.00 2783.00 GH: From his own bank.

2783.00 2784.00 GH: From his own bank.

2784.00 2785.00 GH: Inside job.

2785.00 2786.00 GH: Inside job.

2786.00 2798.24 GH: He figured out – In one clip or – In one clip, this guy went to work and he had figured out he was in charge of kind of putting together bankrolls that then would be sent to other banks, that would be sent to businesses, that would be picked up by the Brinks trucks and all that kind of stuff.

2798.24 2805.22 GH: And he realized that he was sort of holding massive chunks of cash and he wasn't really sort of being paid attention to.

2805.22 2812.44 GH: So one day he put a bag together of $215,000, which in today's money is about $1.7 million.
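For reference, the inflation figure quoted here can be sanity-checked with a simple consumer price index ratio. This is a rough sketch; the CPI values below are approximate annual averages assumed for illustration, not figures from the show.

```python
# Rough check of "$215,000 in 1969 is about $1.7 million today" using the
# ratio of CPI values (approximate annual averages, assumed for illustration:
# CPI-U ~36.7 in 1969 and ~271 in 2021).
def adjust_for_inflation(amount, cpi_then, cpi_now):
    """Scale a dollar amount by the ratio of the two CPI values."""
    return amount * (cpi_now / cpi_then)

value_2021 = adjust_for_inflation(215_000, cpi_then=36.7, cpi_now=271.0)
# roughly $1.6 million, in the same ballpark as the ~$1.7 million quoted
```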

2812.44 2813.78 GH: He put it in a bag.

2813.78 2815.30 GH: The end of the day came.

2815.30 2816.86 GH: He walked out of the bank.

2816.86 2817.86 GH: He went to the airport.

2817.86 2819.96 GH: And he left.

2819.96 2822.40 GH: And that was 52 years ago.

2822.40 2825.32 GH: And the FBI had been looking for him for 52 years.

2825.32 2826.48 GH: Where did he leave to?

2826.48 2829.48 GH: He went – where did he go from – He left the country?

2829.48 2831.12 GH: Yeah, it was the Society National Bank – no, he didn't leave the country.

2831.12 2832.12 GH: He was –

2832.12 2834.12 C: He just like flew to Omaha? He flew to another state.

2834.12 2835.12 GH: Yeah.

2835.12 2841.88 GH: He flew a couple states over and he sort of just – he got a day job and he got a family and he raised kids and –

2841.88 2843.88 C: He grew a goatee and nobody could find him. He changed his name.

2843.88 2844.88 GH: He's changed his name a little bit.

2844.88 2845.88 GH: He ended up in Massachusetts, right?

2845.88 2846.88 GH: He ended up in Massachusetts.

2846.88 2848.92 GH: Yeah, Lynnfield, Massachusetts.

2848.92 2852.16 GH: He changed his name to Thomas Randele, all right?

2852.16 2853.16 GH: And – what's that?

2853.16 2857.76 GH: I think I just said that.

2857.76 2860.08 GH: And he did it and they couldn't find the guy.

2860.08 2861.08 GH: They couldn't find the guy.

2861.08 2862.08 GH: They couldn't find the guy.

2862.08 2887.28 GH: And finally, he – in March of – sorry, May of 21, just this last May, he was deathly ill from cancer and he basically told his family, hey – Oh, by the way – In 1969, I stole the equivalent of $1.7 million.

2887.28 2891.04 GH: So they sort of like, oh, okay.

2891.04 2897.52 GH: They put in the obituary that they wrote for him, they used his parents' real first names.

2897.52 2900.56 GH: So the folks – it was Edward and Ruthabeth.

2900.56 2902.12 GH: That was his mom's name.

2902.12 2903.20 GH: Ruthabeth.

2903.20 2906.48 GH: So they didn't say Edward – you know, son of Edward and Ruthabeth Conrad.

2906.48 2910.48 GH: They said son of Edward and Ruthabeth Randele, which was his new taken name.

2910.48 2914.40 GH: And the FBI finally saw that.

2914.40 2915.40 GH: That came up.

2915.40 2921.80 GH: I guess Ruthabeth – Ruthabeth was interesting enough that like, oh, son of Ruthabeth and this guy's the right age.

2921.80 2928.56 GH: So they contacted the family and they said, we think your father stole $215,000 back in 1969.

2928.56 2930.56 GH: And they were like, oh, yeah, he totally did.

2930.56 2936.04 S: Was there any money left over or was that long gone?

2936.04 2937.04 GH: It was all gone.

2937.04 2938.04 GH: Yeah, it was all gone.

2938.04 2939.76 GH: And the FBI was kind of like, all right.

2939.76 2940.76 GH: Case closed.

2940.76 2942.96 GH: We're not going to charge.

2942.96 2949.60 GH: But the FBI was like, we get to put it in the win column because we caught him.

2949.60 2950.60 GH: First-rate murder.

2950.60 2951.60 GH: Good job.

2951.60 2953.56 GH: No charges and no one's responsible for the money.

2953.56 2954.56 GH: It gets in the thing.

2954.56 2960.56 GH: So again, if you're going to rob a bank, make sure you work there or don't rob a bank.

2960.56 2961.56 S: All right.

Alex Jones Liable in Sandy Hook Hoax Case (49:20)[edit]

2961.56 2965.96 S: Speaking of getting caught, Evan, you're going to get us up to date on Alex Jones.

2965.96 2966.96 S: Alex Jones.

2966.96 2967.96 E: Sort of.

2967.96 2968.96 E: Go for it.

2968.96 2969.96 E: Let's go for it.

2969.96 2970.96 E: Come on.

2970.96 2974.96 E: Do I need to explain who Alex Jones is to this – Give us the bullet.

2974.96 2975.96 E: He's a fucking asshole.

2975.96 2985.40 E: And he's been on our radar for a very, very long time.

2985.40 2986.40 E: All right.

2986.40 2988.24 E: And there's kind of a personal touch to this as well.

2988.24 2991.08 E: So bear with me, please, as I go through this.

2991.08 2992.08 E: Radio show host.

2992.08 2993.08 E: We know about that.

2993.08 2994.08 E: He's also in multimedia.

2994.08 3000.28 E: He has all these InfoWars channels, Prison Planet, and all these other shady media companies.

3000.28 3002.28 E: But he's popular, which is very, very unfortunate.

3002.28 3009.48 E: He's made a career out of being basically a conspiracy theorist, but in the worst possible, possible way.

3009.48 3013.40 E: I first learned of him not long after the 9-11 attacks.

3013.40 3014.40 E: Really?

3014.40 3015.40 E: Yeah.

3015.40 3025.20 E: About a couple years later, this is a little bit before Loose Change, but he was talking about how it was an inside job and he started perpetuating that total hoax.

3025.20 3027.80 E: And it's a sensitive point for me.

3027.80 3031.16 E: I had a friend who was on Flight 11 who I went to high school with.

3031.16 3036.20 E: His name was Peter Hanson, and he, his wife, and his two-year-old daughter, who was the youngest victim of the attacks that day –

3036.20 3038.04 E: They all perished in the attack.

3038.04 3041.52 E: And I know you guys know the Blackwell family.

3041.52 3046.52 E: Chris Blackwell was a firefighter, first responder to the towers, and he perished also.

3046.52 3048.16 E: So we all have arm's-length connections.

3048.16 3050.48 S: If you live in the Northeast, you probably know somebody who knows somebody.

3050.48 3051.48 E: Yeah.

3051.48 3052.48 E: I mean, 3,000 people.

3052.48 3053.48 E: You're right.

3053.48 3056.36 E: It was hard to sort of not know somebody, but this was someone I did know.

3056.36 3057.36 E: I did know.

3057.36 3060.48 E: And World Trade Center Tower 7 was an inside job.

3060.48 3061.96 E: So the whole gamut.

3061.96 3063.60 S: So is that when he came out?

3063.60 3065.44 S: Was that his first kind of like public appearance?

3065.44 3066.90 E: No, not his first.

3066.90 3070.08 E: He's talked also before about the moon landing hoax.

3070.08 3071.08 E: Of course.

3071.08 3073.72 E: And he does all this to sell snake oil.

3073.72 3074.72 E: Yes.

3074.72 3075.72 E: That's, yeah.

3075.72 3076.72 E: Yep.

3076.72 3077.72 E: Yep.

3077.72 3078.72 E: And we will definitely get to that.

3078.72 3079.84 E: And the Oklahoma City bombing was also apparently an inside job.

3079.84 3082.20 E: So he's been doing this really since the 1990s.

3082.20 3087.92 E: Now the 2012 Sandy Hook, Connecticut massacre that took place, that's Adam Lanza.

3087.92 3093.32 E: He murdered his mother and then took his weapons to Sandy Hook, gunned down 20 children and six teachers.

3093.32 3096.84 E: And one of my lifelong friends, his name is Rob Sibley.

3096.84 3100.48 E: He works for the town, Sandy Hook, which is really Newtown, Connecticut.

3100.48 3101.48 E: Bob.

3101.48 3103.60 E: Yeah, it's like a mile from where I live.

3103.60 3104.60 E: Right.

3104.60 3105.84 E: And his children were there.

3105.84 3106.84 E: They were not victims.

3106.84 3111.74 E: They were in a different wing of the school when it happened.

3111.74 3112.74 E: His wife was there.

3112.74 3114.96 E: She happened to be dropping something off at the time.

3114.96 3118.68 E: And she was one of the first people who kind of discovered that something was wrong and she got on the phone.

3118.68 3122.32 E: And so the Sibley family was very much in the center of all of this.

3122.32 3124.44 E: And they are my dear lifelong friends.

3124.44 3125.80 S: And no, they're not actors.

3125.80 3127.72 E: No, not at all actors.

3127.72 3129.20 E: And here comes Alex Jones again.

3129.20 3135.20 E: And he's promoting that this massacre was a ruse, that these were actors, as Steve alluded to.

3135.20 3140.60 E: And a scam perpetrated for the greater cause of trying to take gun rights away from everyone.

3140.60 3141.60 E: Which never happened.

3141.60 3143.16 E: No, which certainly did not happen.

3143.16 3146.72 S: No, every time something like this happens, there's a spurt of gun purchases.

3146.72 3147.72 S: That's right.

3147.72 3149.28 E: It's the best thing that happens for the gun industry.

3149.28 3150.28 E: It's madness.

3150.28 3151.28 E: Every time.

3151.28 3160.80 E: But the news item this week pertains to Jones because he lost yet again in court, in a Connecticut court.

3160.80 3175.68 E: Families are suing him for defamation, for perpetuating falsehoods and contributing to the harassment: family members, mostly parents of these murdered children, were getting death threats, harassing letters, and worse.

3175.68 3178.52 E: And this went on for many, many years.

3178.52 3185.04 E: And they held Alex Jones and his broadcasting entities liable for spreading these false claims.

3185.04 3192.84 S: And so Evan, did you know one family had to move?

3192.84 3193.84 S: Nine times.

3193.84 3194.84 S: Yeah, nine times.

3194.84 3201.84 E: Nine times in order to evade the harassment that he suffered on top of having lost his son.

3201.84 3203.76 E: And he did it over years.

3203.76 3204.76 E: He had to.

3204.76 3205.76 S: And he continues to move.

3205.76 3207.28 S: They were like posting their new address.

3207.28 3212.64 S: They would move into a house and then the people that were harassing them would find out like that and post it online.

3212.64 3215.40 S: And then they had to move again because they were getting death threats every place.

3215.40 3217.36 C: So this was a civil suit?

3217.36 3218.84 E: Yes, that's right.

3218.84 3219.84 E: Lost the civil suit.

3219.84 3220.84 E: He's never been found.

3220.84 3221.84 E: Not criminally liable.

3221.84 3227.56 E: The judge decided this case by default because of willful noncompliance.

3227.56 3234.52 E: In other words, Alex Jones and his team refused to hand over records that the court demanded that he turn over.

3234.52 3239.76 E: It was a delay tactic and he delayed and he delayed for two years and they said enough is enough.

3239.76 3246.04 E: And this is actually, I researched it, the fourth court case that has gone against Alex Jones.

3246.04 3250.90 E: In Texas there were three counts of this as well earlier, actually about a month ago.

3250.90 3254.48 E: And a judge for the exact same reason gave a verdict against Alex Jones.

3254.48 3263.00 E: So he's going to have to pay, and hopefully, let's hope, it will forever bankrupt him into oblivion.

3263.00 3264.12 E: We will know next year.

3264.12 3265.36 E: We will know next year.

3265.36 3267.80 S: But he still doesn't have to give the documents.

3267.80 3271.16 S: At some point, this is what blows my mind about it.

3271.16 3272.16 C: What documents?

3272.16 3273.16 C: He said all this?

3273.16 3275.12 E: His business bank records.

3275.12 3280.96 E: And as Steve was alluding to earlier, he said that he makes his money off of selling snake oil.

3280.96 3283.40 E: That's really where the main parts of his profits come in.

3283.40 3285.84 E: I mean, some real wacky crap here.

3285.84 3287.60 E: I mean, well, let's see what-

3287.60 3289.60 C: He's got vitamins. Boner pills.

3289.60 3291.08 C: That's like his whole business model is boner pills.

3291.08 3292.08 GH: Manpower.

3292.08 3293.08 GH: It's pills for manpower.

3293.08 3299.84 GH: Because manpower is the best kind of power.

3299.84 3307.88 E: Dietary supplements, including a product named Survival Shield, which contained iodine and basically nothing else.

3307.88 3315.32 E: A product named Oxy Powder, which comprised a compound of magnesium oxide and citric acid, common ingredients in dietary supplements.

3315.32 3318.74 E: Also, something for brain power, Steve.

3318.74 3320.32 E: His brain power pills.

3320.32 3327.56 E: And his listeners just eat this up, and they fuel him and effectively give him the money by purchasing these absolute crap products.

3327.56 3331.20 S: But it seems like his strategy was, I'll delay as long as I can.

3331.20 3332.96 S: There'll be a summary judgment against me.

3332.96 3337.48 S: I'm not going to basically waste my time and effort fighting it because I can't.

3337.48 3343.36 S: And then just try to minimize the damage as part of the cost of doing business, and he'll move on.

3343.36 3349.24 S: I can't think of any time in a situation like this where the judgment was so high that it ended the scam.

3349.24 3350.76 S: It doesn't send him into bankruptcy.

3350.76 3352.52 S: It's just like, okay, they move on.

3352.52 3355.40 S: Maybe they have to do a different business model or whatever.

3355.40 3357.76 S: Or again, this is the cost of doing business.

3357.76 3362.80 S: I make my hundreds of millions and I have to spend tens of millions paying off the people that I harm.

3362.80 3375.72 E: Yeah, but there isn't the blood and 20 gunned down children as part of it, which I hope the jury, because it's going to be juries that decide, from what I read, what the damages are going to be.

3375.72 3377.04 E: They are the ones who are going to-

3377.04 3379.04 B: That could be dramatically high. That's what it is.

3379.04 3384.44 C: You also have to know if there's a cap because a lot of jurisdictions have caps.

3384.44 3390.88 C: So even though the jury decides 20 million, 200 million in punitive damages, the legal cap is two.

3390.88 3391.88 C: Yeah, whatever.

3391.88 3395.64 GH: But he gets to frame this as a free speech issue because he says, we never went to trial.

3395.64 3396.64 GH: We never went to trial.

3399.12 3400.60 GH: So I didn't get to – my voice wasn't heard.

3399.12 3400.60 GH: So he kind of wins regardless.

3400.60 3401.60 GH: Right.

3401.60 3402.60 E: Okay.

3402.60 3406.60 E: And in his deposition, he tried to walk it back essentially and make excuses.

3406.60 3410.68 E: And now he claims he really no longer actually believes that, but too freaking late.

3410.68 3416.28 E: The damage is so far done and you cannot put the toothpaste back in the tube.

3416.28 3417.28 E: I am sorry.

3417.28 3418.28 S: Right.

3418.28 3419.28 S: How can you walk that, walk it back to where?

3419.28 3421.28 C: I mean- He's always done that.

3421.28 3422.28 C: He plays both sides.

3422.28 3423.28 C: Didn't he do this with his wife's lawsuit too?

3423.28 3424.28 C: Yes.

3424.28 3425.28 C: Where he's like, I'm a character.

3425.28 3430.20 E: Yeah, he plays the Alex Jones character for his audience was his defense in that.

3430.20 3432.24 C: And somehow more people continue to listen to him.

3432.24 3433.24 E: He is a psychopath.

3433.24 3434.24 E: Absolutely.

3434.24 3435.24 S: Yeah, I think so.

3435.24 3436.24 S: This is psychopathic behavior.

3436.24 3438.32 S: In my opinion, he's a psychopath.

3438.32 3440.48 S: The justice system is so broken.

3440.48 3446.40 S: Cases like this, the fact that they can't totally dismantle that son of a bitch means that it's broken.

3446.40 3455.48 S: But the bigger question is, and this is going to be the transition to the next news item, is that this may not be solvable with liability, with libel cases.

3455.48 3460.96 S: There may need to be other regulations or like again, like might be criminal charges involved before you could really solve this kind of thing.

3460.96 3464.68 S: Just suing them just becomes a slap on the wrist or part of doing business.

3464.68 3467.96 C: But often that's the only way that these victims do see justice.

3467.96 3470.80 C: The point is that we don't have the laws in place.

3470.80 3471.80 S: That's all that's available to them.

3471.80 3472.80 S: Yeah, there's nothing else that's available.

SpinLaunch (57:53)[edit]

3472.80 3475.08 S: All right, Jay, last news item.

3475.08 3477.72 S: You're going to tell us about the SpinLaunch system.

3477.72 3478.72 S: SpinLaunch, what is that?

3478.72 3483.04 S: All right, well this is more of a cool like, you know, feel good, right?

3483.04 3487.80 S: Because this has been such an uplifting show.

3487.80 3488.80 S: I get it.

3488.80 3493.88 S: So today when we launch things into outer space, we use liquid propellant rockets.

3493.88 3494.88 S: I mean I love it.

3494.88 3496.72 S: You know, most people think it's really cool.

3496.72 3498.28 S: But it is very expensive.

3498.28 3499.28 S: Or solid fuel.

3499.28 3500.28 S: Yeah, well you use both.

3500.28 3501.76 S: Sometimes you use both.

3501.76 3508.00 S: But still, it's all basically, it's very expensive and depending on what it is, it could be very dangerous.

3508.00 3509.72 S: You know, we've had lots of bad things happen.

3509.72 3514.16 S: So this company called SpinLaunch, you know, we're only going back a couple of years.

3514.16 3521.72 S: They came up with a really awesome idea that I think is really going to be something that's going to help get things into space cheaply.

3521.72 3527.40 S: They're developing a system to launch small objects into space using a giant spinning centrifuge.

3527.40 3528.40 S: So I'll tell you all about that.

3528.40 3531.72 S: I'll just give you like the history of shooting things into space.

3531.72 3534.28 S: There's basically one other attempt that was made back in the 60s.

3534.28 3539.00 S: The US and Canada created Project HARP, which is High Altitude Research Project.

3539.00 3543.04 S: And it basically came up with like, it kind of looked like a big tank gun.

3543.04 3546.76 S: And they would shoot, tried to shoot a projectile into outer space with that.

3546.76 3550.40 S: They did shoot a projectile 180 kilometers into space, right?

3550.40 3551.40 S: That's technically space.

3551.40 3552.40 S: But not into orbit.

3552.40 3553.40 S: Not into orbit.

3553.40 3554.40 S: Suborbital trajectory.

3554.40 3556.16 S: They got it up there, but they couldn't pull it off.

3556.16 3557.88 S: It was too expensive.

3557.88 3560.64 S: And they just shut it down because they didn't want to spend the money anymore.
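For reference, the reason 180 km of altitude is "technically space" but not orbit is speed, not height: a circular orbit at that altitude needs roughly 7.8 km/s of horizontal velocity. A minimal sketch of that calculation, using standard approximate values for Earth's gravitational parameter and radius:

```python
import math

# Circular orbital speed v = sqrt(GM / r). Reaching 180 km up is the easy
# part; staying there requires roughly 7.8 km/s (~17,500 mph) sideways.
GM_EARTH = 3.986e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6     # mean Earth radius, m

def circular_orbit_speed(altitude_m):
    """Speed in m/s needed for a circular orbit at the given altitude."""
    return math.sqrt(GM_EARTH / (R_EARTH + altitude_m))

v = circular_orbit_speed(180e3)  # roughly 7,800 m/s
```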

3560.64 3566.48 S: Now, Jay, I saw a science fiction movie in the 1950s where a planet was going to come by the Earth.

3566.48 3567.48 GH: You weren't alive in the 50s.

3567.48 3568.48 E: Right.

3568.48 3569.48 E: I mean, that's true.

3569.48 3571.92 E: From the 1950s, not in the 1950s.

3571.92 3574.58 S: That was made in the 1950s.

3574.58 3580.84 S: And they had to, basically, they had to launch a ship to get the survivors off Earth to this other planet before the Earth got flung into space.

3580.84 3585.08 S: So they had this ship go down these ramps on the side of a mountain.

3585.08 3586.08 S: Oh, ski.

3586.08 3589.08 S: And then up at the end, and then launch into space.

3589.08 3590.08 S: Like, really?

3590.08 3591.84 S: How much is that going to make a difference?

3591.84 3592.84 S: Nothing.

3592.84 3597.88 S: Because it's going like 10 miles an hour, but it left that ramp at the end.

3597.88 3598.88 S: It was cool for kids to watch, I guess.

3598.88 3601.88 C: Yeah, I was going to say they didn't know that then, but they knew that then.

3601.88 3603.88 C: Of course they knew that then.

3603.88 3609.68 S: So SpinLaunch, they're using a system that's called the Kinetic Space Launch System.

3609.68 3610.68 S: First off, it's big.

3610.68 3613.68 S: The prototype that they have, it has a circular chamber.

3613.68 3616.20 S: Think of it kind of like a thick coin.

3616.20 3620.08 S: And then they have a centrifuge arm in there that spins up and spins up and spins up.

3620.08 3626.60 S: And then when it gets the right speed, there's a tube that comes off that's basically connected to the inside of that chamber.

3626.60 3631.64 S: And they release it, and the object goes up the tube, and it will basically go wherever the tube is pointing.

3631.64 3633.04 GH: It's like the Millennium Falcon, sort of.

3633.04 3634.04 GH: It looks like the Millennium Falcon.

3634.04 3635.04 C: Kind of, yeah.

3635.04 3636.04 C: Yeah, kind of.

3636.04 3640.68 C: But the principle is just like when you're spinning something on a pen really fast, and then it flies off and hits somebody in the head.

3640.68 3646.52 S: The prototype is a third the size of what they plan on the full-scale version being, which they do plan on building.

3646.52 3648.04 S: But even the prototype isn't that small.

3648.04 3649.04 S: It's 160 feet tall.

3649.04 3653.48 S: So it's basically the height of, it's a little bit taller than the Statue of Liberty without the base, right?

3653.48 3655.48 S: Just the metal part of the statue.

3655.48 3658.04 S: And that's 165 feet, 50 meters.

3658.04 3661.52 S: So their launch system program, they're trying to be aware of the environment.

3661.52 3665.76 S: So they decided that's going to be run by, it's all electric powered, which is cool.

3665.76 3669.32 S: The circular chamber that holds the centrifuge arm is a vacuum.

3669.32 3674.40 S: And obviously, because if it's in a vacuum, it removes all the aerodynamic heating and drag.

3674.40 3678.32 S: The thing could just spin and the air is not going to get in the way and complicate trajectory and all that crap.

3678.32 3680.56 S: The air is really a big problem.

3680.56 3692.80 S: So the arm, once it spins up between 800 and 5,000 miles per hour, or about 1,287 to 8,047 kilometers per hour, it releases the payload into the launch tube.

3692.80 3696.60 S: And that launch tube is what, like I said before, that's what directs it into orbit.

3696.60 3701.60 S: So I believe that the test one, because looking at pictures, all the information isn't there.

3701.60 3704.68 S: Company is kind of still keeping details under wraps.

3704.68 3707.60 S: But the test one that they have, I don't think that they can move it.

3707.60 3709.92 S: I think it's pretty much where it is.

3709.92 3716.16 S: But the big one that they're planning on making, instead of the tube being vertical, it's kind of like in this position.

3716.16 3717.16 S: You mean it's like at a 45 degree angle.

3717.16 3718.16 S: It's on a 45 degree angle.

3718.16 3719.16 S: It's not pointing straight up.

3719.16 3720.16 S: Thank you.

3720.16 3721.16 S: Yeah, exactly.

3721.16 3724.72 B: Why would it do that though?

3724.72 3727.18 B: Because if you're at an angle, you're going through more atmosphere, which you don't want to do.

3727.18 3729.18 S: But you've got to be in an orbital trajectory.

3729.18 3730.18 S: Yeah, it's got to be in an orbital trajectory.

3730.18 3731.18 S: It's going straight up.

3731.18 3735.80 S: So what I saw on the website though is the whole giant disk can rotate like this.

3735.80 3738.36 S: It's like a carnival ride or something.

3738.36 3740.20 S: I don't know if it can go like this yet.

3740.20 3741.20 S: Whatever.

3741.20 3744.00 S: It's like the Roundup, but just a lot faster.

3744.00 3745.00 S: Exactly.

3745.00 3749.64 S: So the big one that they're proposing on building is 300 feet or 91 meters in diameter.

3749.64 3750.64 S: That's big.

3750.64 3751.64 S: That's huge.

3751.64 3754.00 S: So let me tell you a little bit about what's actually happening today.

3754.00 3759.04 S: So they did a test launch last month, actually not that, four weeks ago or less.

3759.04 3764.96 S: They tested their system at 20% power and they successfully launched a 10 foot long projectile thousands of feet.

3764.96 3766.68 S: And that's really all they were hoping to see.

3766.68 3769.06 S: They just wanted to see it work, the basic concept work.

3772.24 3776.70 S: And then with that test, they were able to do all the math and just figure out, yep, the whole system can work.

3776.70 3777.84 B: And that's a critical thing because this isn't the first company to think about this.

3776.70 3777.84 B: Other companies have thought about this.

3777.84 3778.84 B: They've planned it.

3778.84 3780.40 B: They've talked about it and all that stuff.

3780.40 3784.72 B: They are the first company that actually has a prototype proof of concept and it worked.

3784.72 3785.72 B: Third scale.

3785.72 3786.72 B: So that's a critical thing here.

3786.72 3788.12 B: They show that this can work.

3788.12 3791.80 B: So that's why this is so big, I think.

3791.80 3794.24 S: It could be a game changer in a lot of ways.

3794.24 3802.80 S: So the launch, their system, the big one that they're going to build next is going to be able to launch a 440 pound or 180 kilogram satellite.

3802.80 3804.64 S: And that's not something to laugh at.

3804.64 3806.64 S: That's actually a good- Small satellite.

3806.64 3808.08 S: It's a small, but a lot of satellites are small today.

3808.08 3809.84 S: Like I think that's really where things are heading.

3809.84 3810.84 GH: People will never use it.

3810.84 3813.84 GH: And they're going to be smaller very soon.

3813.84 3819.20 J: So they're going to open for business.

3819.20 3825.76 S: They're saying that they're going to open for business in 2024, which means that they're going to be having a- Did they have the funding for this?

3825.76 3828.64 S: The CEO opened it up to, they have big investors.

3828.64 3829.84 S: That's basically what the website says.

3829.84 3832.38 S: They have very deep pocketed investors.

3832.38 3837.38 S: And before anyone makes the joke, you can't launch a person into space with this system.

3837.38 3841.08 S: And the reason is amazing when I tell you the numbers.

3841.08 3843.16 S: Humans can withstand 8 to 9 Gs.

3843.16 3849.56 S: At 9 Gs, you could stand 9 Gs for one second and then they got to slow you down or else it'll start to do significant damage.

3849.56 3854.20 S: Spin Launch's system has over 10,000 Gs.

3854.20 3856.56 S: It would turn you into something less than jelly.

3856.56 3858.72 S: It would just, I don't know, you would just squirt you out.

3858.72 3861.00 S: I don't know what would happen.

3861.00 3862.80 S: You would be a smudge on the outer ring.
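For reference, the "over 10,000 Gs" figure is consistent with simple centripetal acceleration, a = v²/r. Pairing the ~5,000 mph tip speed with the proposed 91-meter-diameter machine (arm radius ~45.5 m) below is an illustrative assumption, not a SpinLaunch specification.

```python
# Centripetal acceleration at the centrifuge arm tip, in multiples of g.
G0 = 9.81  # standard gravity, m/s^2

def centrifuge_g_load(tip_speed_ms, radius_m):
    """a = v^2 / r, expressed in g's."""
    return tip_speed_ms ** 2 / radius_m / G0

tip_speed = 5000 * 0.44704                    # 5,000 mph converted to m/s
g_load = centrifuge_g_load(tip_speed, 45.5)   # ~11,000 g: no humans allowed
```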

3862.80 3867.72 C: It's so funny too because when you said you can't launch a person, I immediately went to a body.

3867.72 3869.72 C: Like when you said that, like launch a body into space.

3869.72 3872.48 C: A lot of people want to do that.

3872.48 3873.48 C: You could do that with this.

3873.48 3874.48 C: You couldn't even do that.

3874.48 3875.48 C: Just put it in a capsule.

3875.48 3879.48 C: Yeah, I mean that might be a way to launch like a- It's going to happen to you in space anyway.

3879.48 3885.44 S: But the most important thing is, and this was a cool thing that I read, they tested regular electronics in the thing.

3885.44 3886.44 S: They just put it in the center.

3886.44 3891.40 S: They didn't really need to shoot it because really the damage is just spinning it up to 5,000 RPMs.

3891.40 3895.00 S: And they put like a cell phone in there and today's technology, the cell phone worked.

3895.00 3896.00 S: Did they call it?

3896.00 3897.00 S: Yeah.

3897.00 3898.28 S: No, but it actually was able to do it.

3898.28 3906.08 S: So they put in like modern telescope lenses and pieces of things that they know will go into a satellite and it all survived, which is really interesting.

3906.08 3910.40 B: And they're very confident that when you scale it up that those electronics will survive.

3910.40 3911.40 B: Yeah.

3911.40 3913.48 B: I mean they will need some modification I would think.

3913.48 3918.32 S: Yeah, you might need to engineer the satellite to be spin launch ready.

3918.32 3919.32 S: Absolutely.

3919.32 3920.32 S: That's a little hard.

3920.32 3921.44 S: So one more very important thing though.

3921.44 3928.28 S: The centrifuge will only get it up to really low, low, low, low earth orbit and the satellite will come out, right?

3928.28 3933.72 S: So the shell will come off because it's really like to a point, a needle point because of aerodynamics.

3933.72 3944.28 S: But once the satellite opens up and that shell falls away, there's a real rocket engine in there that sends it the rest of the way into low earth orbit, which is really cool because it uses about a 20th of the fuel.

3944.28 3945.56 S: No, no, 25%.

3945.56 3946.56 S: That's the other was the cost.

3946.56 3947.56 S: That's the fuel.

3947.56 3948.56 S: That's the cost.

3948.56 3950.20 S: 17,500 miles per hour.

3950.20 3951.80 S: Yeah, so the cost is a 20th.

3951.80 3953.80 S: I think the fuel is- A cookie.

3953.80 3954.80 S: A cookie.

3954.80 3955.80 S: A cookie.

3955.80 3957.88 S: You can put a cookie into outer space.

3957.88 3958.88 S: So that's good though.

3958.88 3962.64 S: That's saving a lot of fuel and getting the cost down- I feel good.

3962.64 3966.92 S: Because this is all about what's the cost of getting a pound into space, right?

3966.92 3968.76 S: This has been coming down significantly.

3968.76 3972.40 B: But it used to be the classic number was like $10,000 a pound, right?

3972.40 3973.40 B: Which is nuts.

3973.40 3974.68 B: That's incredibly expensive.

3974.68 3977.68 B: But now with SpaceX and reusability- We're at 1,000 now.

3977.68 3979.12 B: Yeah, we're under 1,000 and that's great.

3979.12 3980.12 B: But this could be even cheaper than that.

3980.12 3982.24 S: This goes 20 times less than that.

3982.24 3984.44 S: This could go down to 50 bucks a pound or something.

3984.44 3985.44 S: That would be a game changer.

3985.44 3986.44 B: It's a game changer.

3986.44 3987.44 B: It is.

3987.44 3988.44 B: A game changer for small loads.
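For reference, the cost-per-pound figures in this exchange scale straightforwardly to the 440-pound satellite mentioned earlier; the three rates below are the approximate numbers tossed around in the segment, not official prices.

```python
# Back-of-envelope launch cost at a given price per pound.
PAYLOAD_LB = 440  # the small-satellite payload discussed in the segment

def launch_cost(dollars_per_lb, payload_lb=PAYLOAD_LB):
    """Total cost of launching the payload at a given $/lb rate."""
    return dollars_per_lb * payload_lb

classic = launch_cost(10_000)  # $4.4M at the old ~$10,000/lb figure
current = launch_cost(1_000)   # $440K at roughly today's rate
spin = launch_cost(50)         # $22K at the speculated $50/lb
```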

3988.44 3989.44 C: Yeah.

3989.44 3993.68 C: But that also means that way more people can put way more crap into space now.

3993.68 3994.68 S: Yeah.

3994.68 3995.68 S: How many meatballs is that?

3995.68 3997.68 S: I would be happy to work on that answer.

3997.68 3998.68 B: Did you do that calculation today?

3998.68 3999.68 B: I didn't, but I can work on that.

3999.68 4001.68 B: What's the diameter of the meatball?

4001.68 4004.36 B: You've got to answer that question first.

4004.36 4007.52 S: How much was the breadcrumb to meat ratio in the meatballs?

4007.52 4008.52 S: I really need to know.

4008.52 4011.00 C: But don't you think that like, I mean, maybe this isn't right.

4011.00 4014.24 C: Like, you know, it's like oil should be expensive.

4014.24 4016.60 C: Like shouldn't it be not cheap?

4016.60 4019.24 S: No, that's not the answer.

4019.24 4022.36 S: We just need to regulate space well.

4022.36 4025.60 S: But keeping it expensive doesn't do anyone any-

4025.60 4028.52 C: But isn't that a way that things are often regulated?

4028.52 4030.76 S: Well, but not in this case. Think about it this way.

4030.76 4033.56 S: It's not stopping Starlink from sending 20,000 satellites up into space.

4033.56 4037.56 S: So just being as expensive as it is now isn't cutting it.

4037.56 4039.76 S: So making it even cheaper, I don't think it's going to be a problem.

4039.76 4040.84 C: I mean, it's like, I agree.

4040.84 4043.76 C: I like the concept of democratizing these things.

4043.76 4050.16 C: But we have to also remember the very first news item we talked about today.

4050.16 4051.16 C: And there has to be a balance.

4051.16 4052.16 S: I know.

4052.16 4053.16 S: You're right.

4053.16 4055.22 S: The problem isn't sending technology into outer space.

4055.22 4060.40 S: It's all about morals and regulations and people doing the right thing.

4060.40 4062.68 C: Part of the problem is there's too much shit up there.

4062.68 4064.08 C: That is like an actual problem.

4064.08 4065.08 C: There's too much shit up there.

4065.08 4066.08 S: But we need some of that shit, though.

4066.08 4067.08 C: We do need some of that shit.

4067.08 4068.08 S: We do.

4068.08 4069.08 S: There's good shit up there.

4069.08 4070.08 C: There's good shit up there.

4070.08 4075.44 C: But then it's like at a certain point, yeah, how much shit is too much shit?

4075.44 4077.68 B: And eventually it's going to go to shit.

Science or Fiction (1:08:00)[edit]

Answer Item
Fiction Obscure laws
Science Ancient civilization
Science
Ghost towns
Science
Stegosaurus fossil
Host Result
Steve win
Rogue Guess
Bob
Ancient civilization
Evan
Obscure laws
Jay
Obscure laws
Cara
Ancient civilization
George
Ancient civilization

Voice-over: It's time for Science or Fiction.

Theme: Colorado

Item #1: Evidence for the earliest occupation of the Colorado area by people goes back 11,000 years, although the most famous ancient civilization of Mesa Verde is only about 700 years old.[7]
Item #2: Colorado is home to over 1,500 ghost towns, but only about 640 have visible physical remains.[8]
Item #3: Colorado is famous for having strange outdated laws on the books, such as it being illegal to drive a black car on Sunday or to loan your neighbor a vacuum cleaner.[9][10]
Item #4: Bonus item Steve didn't use: Although the Stegosaurus and the Apatosaurus were discovered near Morrison, Colorado, the first Stegosaurus skeleton now resides in New Haven, CT, in the Peabody Museum.[11]


4077.68 4083.96 S: It's time for science or fiction.

4083.96 4090.52 S: It's time for science or fiction.

4090.52 4095.48 S: We have a theme this week.

4095.48 4102.08 S: The theme is Denver, Colorado.

4102.08 4109.20 S: That means I will not be surveying the audience before the rogues answer because I'm assuming you guys know all of this.

4109.20 4110.20 S: But we'll see.

4110.20 4111.60 S: We'll get their answer, then we'll get you.

4111.60 4112.88 S: But then afterwards, you get there.

4112.88 4115.28 S: No kibitzing from the peanut gallery.

4115.28 4116.28 S: All right.

4116.28 4125.36 S: I actually made up four items in case one of them got spoiled during our activities over the last couple of days.

4125.36 4126.36 GH: Was it?

4126.36 4127.36 GH: No.

4127.36 4128.36 S: Oh, good.

4128.36 4129.36 S: So I have four items.

4129.36 4130.36 S: All right.

4130.36 4131.36 S: But I could probably still just pick the three best.

4131.36 4144.96 S: Item number one, evidence for the earliest occupation of the Colorado area by people goes back 11,000 years, although the most famous ancient civilization of Mesa Verde is only about 700 years old.

4144.96 4154.00 S: Item number two, Colorado is home to over 1,500 ghost towns, but only about 640 have visible physical remains.

4154.00 4166.16 S: And item number three, Colorado is famous for having strange outdated laws on the books, such as it being illegal to drive a black car on Sunday or to loan your neighbor a vacuum cleaner.

4166.16 4167.16 S: All right.

Bob's Response[edit]

4167.16 4168.16 S: Bob, we'll start with you at the end.

4168.16 4169.16 B: All right.

4169.16 4172.38 B: I'm going to believe the ghost towns because, you know, ghosts.

4172.38 4176.56 B: And the third one about the outdated laws, that strikes me as plausible.

4176.56 4179.08 B: The first one I'll say is the one about the ancient.

4179.08 4180.08 B: The first one.

4180.08 4181.08 S: Yeah.

Evan's Response[edit]

4181.08 4182.08 S: Evan, you're next.

4182.08 4183.08 E: Okay.

4183.08 4184.08 E: The first one is about the black cars, huh?

4184.08 4185.08 E: I don't know.

4185.08 4186.08 E: That seems to be reasonable.

4186.08 4189.64 E: I don't understand why Bob thinks that one would be the fiction of these three.

4189.64 4190.92 E: So I'm going to go a different direction.

4190.92 4197.64 E: I don't think those obscure laws in the third one about the car and the vacuum cleaner.

4197.64 4203.64 E: Why I can't see a reason in any circumstances why those would be illegal.

4203.64 4208.08 E: Yes, wacky laws exist in all the states, but those?

4208.08 4209.08 E: Nah.

4209.08 4212.04 E: Has nothing to do with vices or money or anything like that.

4212.04 4213.04 S: That's fiction.

4213.04 4214.04 S: All right.

Jay's Response[edit]

4214.04 4215.04 S: Jay.

4215.04 4216.04 J: Yeah.

4216.04 4223.44 J: I mean, I think the wacky laws one, you got to remember like these laws, if they're real, they probably go back, I mean, God, 150 years, whatever, like weird circumstances.

4223.44 4225.08 S: The car and the vacuum cleaner.

4225.08 4226.08 S: I know.

4226.08 4227.08 S: You know what I mean though?

4227.08 4229.64 S: Like, yeah, I guess you're right.

4229.64 4235.84 J: But I'm also factoring in Steve's brain.

4235.84 4238.18 J: The 1500 ghost towns to me is a no brainer.

4238.18 4240.80 J: Like that doesn't even sound like enough, if anything.

4240.80 4243.80 J: But what classifies a town as a ghost town?

4243.80 4247.12 J: Like does that, it's defunct or is there, did it have to be haunted?

4247.12 4248.12 C: Nobody lives there.

4248.12 4249.12 C: It's not haunted.

4249.12 4250.12 C: That's not what I'm saying.

4250.12 4251.12 C: I'm just, I'm fishing for information.

4251.12 4252.12 J: I'm a ghost town.

4252.12 4265.68 J: But you know, when you say that people go back 11,000 years, you can't tell me who those people would have been.

4265.68 4268.68 S: Yeah, they were Homo sapiens.

4268.68 4272.64 S: And that's the one that you picked, Bob?

4272.64 4273.64 E: That's the fiction.

4273.64 4274.64 E: I don't know.

4274.64 4275.64 J: What did I pick?

4275.64 4276.64 J: Yes, that was the one he picked.

4276.64 4277.64 E: I mean, he won.

4277.64 4278.64 E: Macy Evans.

4278.64 4279.64 E: Bob passionately picked that one.

4279.64 4280.64 J: I guess I'll go with Evan.

4280.64 4281.64 S: With Evan on the ghost town?

4281.64 4282.64 S: The vacuum cleaner.

4282.64 4283.64 S: The ghost town.

4283.64 4284.64 S: No, the vacuum cleaner.

4284.64 4285.64 S: The car.

Cara's Response[edit]

4285.64 4286.64 S: All right, Cara.

4286.64 4287.64 C: Okay.

4287.64 4291.28 C: 11,000 years versus 700.

4291.28 4292.32 C: Maybe that's operative.

4292.32 4294.60 C: Most famous ancient civilization.

4294.60 4296.96 C: That seems subjective to me.

4296.96 4300.68 C: And then 1500 ghost towns, but only about 640 have visible physical remains.

4300.68 4305.58 C: So just to be clear, because I thought the whole point of a ghost town was that there's a town there.

4305.58 4311.32 C: But maybe you're saying by definition that everybody left and left the town behind and then the town might have been.

4311.32 4312.32 C: And then it got haunted.

4312.32 4313.32 C: There might be archeological evidence of a town there.

4313.32 4314.32 C: Right, it might have been buried or whatever.

4314.32 4315.32 C: But there aren't buildings there.

4315.32 4319.12 C: Right, but at some point there was a town there.

4319.12 4320.12 C: Yeah, okay.

4320.12 4321.88 C: And then strange outdated laws on the books.

4321.88 4324.36 C: Again, like we know that this is kind of a thing.

4324.36 4327.08 C: Illegal to drive a black car on Sunday.

4327.08 4329.76 C: So that would be since there were cars.

4329.76 4332.12 C: And illegal to loan your neighbor a vacuum cleaner.

4332.12 4334.36 C: That would be since there were vacuum cleaners.

4334.36 4338.44 C: Although I sometimes think that laws are written by like lobbyists.

4338.44 4341.60 C: And I can imagine that there might be a weird financial incentive with those.

4341.60 4343.14 C: So I don't know.

4343.14 4344.56 C: Those two are bizarre.

4344.56 4351.36 C: Like the ghost towns and the vacuum cleaners, which I feel like if I know Steve, he dug for really bizarre but true things, then the other one's kind of like meh.

4351.36 4353.96 C: So I think I'm going to go with Bob.

4353.96 4357.84 C: Because this is a bad proposition to be like trying to get inside of Steve's head.

4357.84 4358.84 S: Always.

4358.84 4359.84 None But going with Bob is never.

4359.84 4362.84 C: But see, he's saying always because he doesn't want me to do this.

4362.84 4365.84 C: Yeah, I'm going to go with Bob on this one.

4365.84 4366.84 C: GWB.

George's Response[edit]

4366.84 4367.84 GH: Yeah, GWB.

4367.84 4368.84 GH: All right.

4368.84 4369.84 GH: Yeah, you know, it's funny.

4369.84 4371.80 GH: I have this recollection of there being this law.

4371.80 4372.92 GH: You can't loan a vacuum.

4372.92 4373.92 GH: And I don't know.

4373.92 4375.80 GH: And it's going to be like not for this state.

4375.80 4377.48 GH: It's going to be for some other state.

4377.48 4379.72 GH: I'm going to get really upset.

4379.72 4382.20 GH: But I think like, okay, you can't drive a black car on Sunday.

4382.20 4386.56 GH: They didn't want anybody driving except like maybe ambulances and fire department people.

4386.56 4387.56 GH: So that was the rule.

4387.56 4390.92 GH: Like how do we say, oh, they're allowed to drive without saying only ambulances or whatever.

4390.92 4392.92 GH: They're not black cars because all cars are black.

4392.92 4393.92 GH: So I'm thinking that's good.

4393.92 4394.92 GH: I'm thinking that's good.

4394.92 4396.52 GH: I'm thinking the ghost towns are good, too.

4396.52 4397.52 GH: I mean, the ghost towns are good.

4397.52 4399.96 GH: And I think that I think I'm going to go with Bob as well.

4399.96 4403.68 GH: There's something about those numbers that are a little bit weird, maybe.

4403.68 4406.52 GH: Like the Clovis thing or the 700 thing is weird.

4406.52 4407.68 GH: So I'm going to go with Bob as well.

4407.68 4408.68 GH: Say number one is a fake.

4408.68 4409.68 B: All right.

4409.68 4413.92 B: So the vacuum was from the space station because it's a bad idea.

4413.92 4414.92 B: Never mind.

Audience's Response[edit]

4414.92 4415.92 S: All right.

4415.92 4420.48 S: So we got Bob, Cara and George saying that the oldest people is the fiction.

4420.48 4424.66 S: We have Evan and Jay saying that the laws are the fiction.

4424.66 4425.90 S: Everyone believes the ghost town.

4425.90 4428.64 S: So now we're going to poll the audience.

4428.64 4430.80 S: We're going to do the guys know the one clap thing, right?

4430.80 4431.80 S: George, you do it.

4431.80 4432.80 GH: Single clap.

4432.80 4433.80 GH: Let's just try it together.

4433.80 4434.80 GH: Everybody claps when my hand comes back down here.

4434.80 4435.80 S: Ready?

4435.80 4436.80 S: Here we go.

4436.80 4437.80 S: Great.

4437.80 4438.80 S: All right.

Steve Explains Item #2[edit]

Steve Explains Item #1[edit]

Steve Explains Item #3[edit]

4438.80 4443.96 S: So if you think the oldest occupation in Colorado is the fiction, clap.

4443.96 4448.92 S: If you think that the 1,500 ghost towns is the fiction, clap.

4448.92 4450.34 S: You diehards.

4450.34 4451.96 S: Way to go out on a limb.

4451.96 4452.96 S: I appreciate that.

4452.96 4458.80 S: And then if you think that the outdated, wacky laws is the fiction, clap.

4458.80 4460.52 S: So I think one and three are pretty much tied.

4460.52 4461.52 S: Yeah.

4461.52 4462.52 S: You guys think that?

4462.52 4464.12 S: OK, so let's start with the ghost town.

4464.12 4466.32 S: Wouldn't it be awesome if this was fiction?

4466.32 4472.04 S: Yes, it's over 1,500 ghost towns, but only about 640 have visible physical remains.

4472.04 4473.04 S: That one is science.

4473.04 4474.48 S: Yeah, that was an easy one.

4474.48 4477.12 S: Why do you think there are so many ghost towns in Colorado?

4477.12 4478.12 S: Because they mined them and got out.

4478.12 4479.12 S: Because they had a gold rush.

4479.12 4480.12 S: And then it was over.

4480.12 4481.12 S: Get the goods and get out.

4481.12 4482.12 S: Sure.

4482.12 4484.84 S: And not only gold, but but lots of other things.

4484.84 4485.84 S: Silver was big.

4485.84 4486.84 S: Silver was big.

4486.84 4487.84 S: These things are still big in Colorado.

4487.84 4489.36 S: Colorado is a big mining state.

4489.36 4490.56 S: There's a lot of stuff that comes out of Colorado.

4490.56 4495.26 S: I think silver is the number one mineral export for Colorado at this point in time.

4495.26 4506.88 S: But yeah, mining gives you that kind of location-specific boom and bust. A lot of these towns were built with the idea that they would be occupied for 10 to 15 years.

4506.88 4510.28 S: Like they knew that they were not going to be occupied longer than 15 years.

4510.28 4512.28 C: Which is why only some of them survived.

4512.28 4513.64 S: Right, right, right, right, right.

4513.64 4514.64 S: All right.

4514.64 4516.76 S: Let's go back to number one.

4516.76 4528.08 S: Evidence for the earliest occupation of the Colorado area by people goes back 11,000 years, although the most famous ancient civilization of Mesa Verde is only about 700 years old.

4528.08 4530.76 S: And Bob, Cara, and George, you think this one is the fiction.

4530.76 4533.16 S: About half the audience think this one is the fiction.

4533.16 4537.44 S: And this one is science.

4537.44 4543.44 S: That's bullshit.

4543.44 4548.52 S: The oldest evidence was from what were probably the Clovis people.

4548.52 4551.48 S: Then you have paleo Indians.

4551.48 4555.00 S: But the Mesa Verde, which everyone thinks of, that's the ancient people of Colorado.

4555.00 4557.68 S: You know, 700 years, 600 to 700 years.

4557.68 4559.12 S: Not as old as you might think.

4559.12 4561.12 S: But an amazing archaeological site by the way.

4561.12 4563.54 S: I'm sure you guys are very, very familiar with it.

4563.54 4568.12 S: Before I give the last one, my alternate was this.

4568.12 4571.60 S: I won't tell you if it's science or fiction, because it could have been either one, because I had to adjust them.

4571.60 4585.84 S: So it was that although the Stegosaurus and the Apatosaurus were discovered near Morrison, Colorado, the first Stegosaurus skeleton now resides in New Haven, Connecticut, in the Peabody Museum.

4585.84 4586.84 S: Oh, you know this.

4586.84 4587.84 S: Is that science or fiction?

4587.84 4588.84 S: Science.

4588.84 4589.84 S: Science.

4589.84 4590.84 S: That's science.

4590.84 4591.84 S: Right, right.

4591.84 4592.84 S: Of course.

4592.84 4595.36 S: But of course, you know, we went through the museum.

4595.36 4598.72 S: I'm like just waiting for like, oh, look, the first stegosaurus was, you know.

4598.72 4601.48 J: Steve, does this mean that Evan and I win a cookie?

4601.48 4602.48 S: It does.

4602.48 4608.48 E: Jay and I win a cookie.

S: Because Colorado is famous for having strange outdated laws on the books, which is true.

4608.48 4610.28 S: But these are not two of them.

4610.28 4615.96 S: But such as being illegal to drive a black car on Sunday or to loan your neighbor a vacuum cleaner.

4615.96 4618.72 S: For the people who thought that this one was science, have you heard these before?

4618.72 4622.04 B: Well, all I know is I'm looking at Movoto.com.

4622.04 4624.84 B: In Denver, it's unlawful to lend your vacuum cleaner to your neighbor.

4624.84 4625.84 S: Those are myths.

4625.84 4626.84 S: Those are common myths.

4626.84 4627.84 S: This is a random website.

4627.84 4632.84 S: If only there was some podcast that could deal with these myths.

4632.84 4639.84 GH: I didn't make these up.

4639.84 4641.84 S: These are common myths.

4641.84 4645.48 B: 50 facts about Denver that you never learned in school.

4645.48 4648.68 J: Let's see, Alex Jones' website.

4648.68 4652.92 B: So there goes my pre-research for the website.

4652.92 4660.88 S: Well, when I research these kind of themed shows, I come across all of those sites, and they're mostly bullshit.

4660.88 4667.20 S: Lists like this, the 10 weirdest things about whatever it is I'm researching, they're thinly sourced.

4667.20 4671.40 S: And a lot of them are myths, or they're just repeated or exaggerated.

4671.40 4675.56 S: I always have to go back and then just independently source them.

4675.56 4679.84 S: And with this one, I read that, and I'm like, all right, now let me dig deeper.

4679.84 4682.20 S: And it's like, oh, they're myths.

4682.20 4683.20 S: Perfect.

4683.20 4684.88 S: Because those are always the perfect fiction.

4684.88 4686.88 C: Is there any sort of backstory around them?

4686.88 4688.90 C: Why did these myths persist?

4688.90 4690.48 C: Why these specific myths?

4690.48 4691.48 C: Do we know?

4691.48 4692.48 S: Not really.

4692.48 4693.72 S: These are just urban legends.

4693.72 4694.72 S: They're urban legends.

4694.72 4697.72 S: So I did come across the frozen dead guy.

4697.72 4698.72 S: Guys hear about that?

4698.72 4699.72 S: Oh, yeah, yeah, yeah.

4699.72 4704.72 S: Yeah, so there's a town in Colorado where some guy wanted to get cryogenically...

4704.72 4705.72 S: Yeah.

4705.72 4706.72 S: Easy.

4706.72 4712.80 S: Well, yeah, so there's an annual frozen dead guy celebration for this guy who they managed to...

4712.80 4713.80 S: I know.

4713.80 4714.80 S: They had him on ice for like 12 years.

4714.80 4715.80 S: It's called cryonics when you...

4715.80 4716.80 S: I know.

4716.80 4717.80 S: Cryonics.

4717.80 4718.80 S: You said cryogenic.

4718.80 4719.80 S: I'm sorry.

4719.80 4720.80 S: Cryogenic means frozen?

4720.80 4721.80 S: Cryonically.

4721.80 4722.80 S: You're cryonically frozen?

4722.80 4723.80 S: Yes.

4723.80 4724.80 S: Okay.

4724.80 4725.80 S: What's the difference?

4725.80 4726.80 S: Humans are cryonically frozen.

4726.80 4727.80 S: You could cry...

4727.80 4728.80 S: Cryogenic as animals?

4728.80 4729.80 S: It's non-humans.

4729.80 4730.80 S: Non-humans.

4730.80 4731.80 C: Okay, gotcha.

4731.80 4732.80 C: All right.

4732.80 4737.80 C: No, I think the difference is that cryogenics is an actual scientific procedure for tissues and cryonics is like pseudoscience.

4737.80 4740.88 C: You can't just freeze somebody and bring them back.

4740.88 4741.88 C: But this guy...

4741.88 4742.88 B: It's not pseudoscience.

4742.88 4743.88 S: Okay, Bob.

4743.88 4754.76 S: He originally was frozen and then the person didn't have the money to continue it, so they just sort of kept him on ice for 12 years, which I don't know if that's going to cut it.

4754.76 4761.16 S: But then they got funding and he's like properly frozen now, but I don't know.

4761.16 4762.16 S: Those 12 years are dodgy.

4762.16 4767.72 S: Have you ever thawed meat out of your freezer and cooked it and were like, oh, that was bad before I froze it?

4767.72 4768.72 C: A lot of freezing.

4768.72 4769.72 C: I don't really freeze meat.

4769.72 4771.72 C: I don't thaw it and freeze it again.

4771.72 4774.32 S: Time is still passing at 32 degrees.

4774.32 4775.32 S: Right, right.

4775.32 4777.32 S: It has to be like liquid nitrogen.

4777.32 4779.20 S: It has to be super low for it to slow down.

4779.20 4780.88 S: So he was like, yeah, no.

4780.88 4785.52 S: So but anyway, there's a frozen dead guy festival in the town, so that's cool.

4785.52 4786.52 S: Oh, that's science.

4786.52 4787.52 S: That's science.

4787.52 4788.52 S: Yeah, the frozen dead guy is science.

4788.52 4792.00 S: The stegosaurus is science.

4792.00 4794.00 S: What's not science is what we chose.

4794.00 4796.36 S: That's the one really obvious.

4796.36 4798.00 S: All right.

4798.00 4799.00 S: So good job, half the audience.

4799.00 4800.00 S: Good job, Evan and Jay.

4800.00 4801.00 S: Thank you very much.

4801.00 4802.00 S: Thank you.

4802.00 4803.00 S: Evan.

4803.00 4804.00 None And bad job, Movoto.com.

4804.00 4809.00 B: Damn, that's very bad.

4809.00 4811.52 S: Because I love the myths that are not true.

4811.52 4813.52 S: Because they make perfect fictions.

Skeptical Quote of the Week (1:20:14)[edit]

The more connections you can make across an ever wider and more disparate range of knowledge, the more deeply you will understand something. Search engines and videogames do not provide that facility; nothing does, other than your own brain
Susan A. Greenfield, English scientist, writer, broadcaster, and member of the House of Lords

4813.52 4815.16 S: Evan, give us a quote.

4815.16 4822.80 E: The more connections you can make across an ever wider and more disparate range of knowledge, the more deeply you will understand something.

4822.80 4826.60 E: Search engines and video games do not provide that facility.

4826.60 4829.70 E: Nothing does other than your own brain.

4829.70 4837.48 E: And that was a quote from Susan Greenfield, who is an English scientist, a writer, a broadcaster and a member of the House of Lords.

4837.48 4838.48 E: She's very cool.

4838.48 4839.48 E: Yep, that's true.

4839.48 4840.48 GH: I agree with that.

4840.48 4841.48 GH: I totally agree with that.

4841.48 4863.00 S: So, one thing I do as a physician educator is teach people at every level, right, from before they're medical students, to first-, second-, third-, and fourth-year medical students, who are all extremely different, to interns, residents, and then fellow physicians.

4863.00 4868.24 S: It's so much easier to teach people the more they know, right?

4868.24 4869.96 S: And also it's easier to learn stuff.

4869.96 4871.56 S: You have more stuff to connect it to.

4871.56 4873.04 S: You have more context to put it in.

4873.04 4874.76 S: You have more stuff to hang it on.

4874.76 4875.76 S: You know what I mean?

4875.76 4876.76 S: So I absolutely agree.

4876.76 4880.32 S: The more you know, the easier it is to learn and things start to, they start to make sense.

4880.32 4882.08 S: You start to see more patterns.

4882.08 4888.76 S: It's really challenging to teach people who have a relative, you know, vacuum of knowledge about what you're trying to teach them.

4888.76 4893.96 S: Because they have a lot of misconceptions that you might not be aware of or they have gaps in their knowledge you might not be aware of.

4893.96 4895.44 S: They're just not prepared.

4895.44 4899.60 S: Medical school is all about preparing you to learn the next year, right?

4899.60 4911.76 S: The famous statistic is that 90% of what you're going to do in your practice, you learn in your fellowship, in the last bit of your training.

4911.76 4917.88 S: Everything else builds up to that; it's just preparing you to learn the stuff you're actually going to be doing.

4917.88 4927.20 S: And it's true because I always remember this one time where I was lecturing to first year medical students and I thought I was lecturing to second year medical students.

4927.20 4930.28 S: And there was such a difference, I was like, what is wrong with you people?

4930.28 4937.44 S: And it was like, I realized they were first year, like, oh, I'm totally aiming at the wrong level.

4937.44 4938.44 S: Seems like I got to dumb it down.

4938.44 4939.44 S: Yeah, true.

4939.44 4940.44 S: No, it's true.

4940.44 4941.44 S: It's just like, you just don't.

4941.44 4942.44 S: Yeah, they're like fresh out of college.

4942.44 4947.20 S: Yeah, they don't, they didn't have the background knowledge to learn what I was teaching them at the second year level.

4947.20 4951.12 S: But it's anyway, this, I liked, Evan pitched me this quote, I'm like, yeah, perfect.

4951.12 4953.60 S: I totally, totally agree with this.

4953.60 4956.32 S: So I think also you could learn anything.

4956.32 4963.08 S: I like to talk about the fact that I taught my daughter about birding when she was four, and you can learn about all of science through birding.

4963.08 4964.72 S: Does, you know, it's all there.

4964.72 4968.72 S: Categorization and, you know, the fallibility of eyewitness accounts.

4968.72 4970.92 S: It's all, you know, yeah, evolution.

4970.92 4971.92 S: It's all there.

4971.92 4977.12 S: Just learn about anything you're interested in and that knowledge will have a lot of downstream effects.

Signoff/Announcements (1:22:57)[edit]

S: —and until next week, this is your Skeptics' Guide to the Universe.

4977.12 4978.12 S: All right.

4978.12 4979.12 S: We're a little bit over time, but that was fun.

4979.12 4980.12 S: Did you guys enjoy the show?

4980.12 4981.12 S: You guys are great.

4981.12 4982.12 None George, always wonderful to have you on the show.

4982.12 4983.12 None Thank you.

4983.12 4984.12 None I'm so happy to be here.

4984.12 4985.12 None Thank you, George.

4985.12 4986.12 S: Thank you, George.

4986.12 4987.12 None Everyone else, thank you guys for joining me again.

4987.12 4988.12 S: Thank you for flying out to Denver with me.

4988.12 4989.72 S: We'll try to catch our breath so we can keep going.

S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.


Today I Learned[edit]

  • Fact/Description, possibly with an article reference[12]
  • Fact/Description
  • Fact/Description

Notes[edit]

References[edit]

Vocabulary[edit]

