SGU Episode 893

August 20th, 2022

Alex Jones' lawyer accidentally sent two years' worth of texts to plaintiffs' lawyers


Skeptical Rogues
S: Steven Novella

B: Bob Novella

J: Jay Novella

E: Evan Bernstein

Guests

AJR: Andrea Jones-Rooy, political, social, and data scientist

KB: Kelly Burke, from Guerrilla Skeptics

GH: George Hrab, NECSS emcee

IC: Ian Callanan, SGU tech guru

Quote of the Week

An educated person is one who has learned that information almost always turns out to be at best incomplete and very often false, misleading, fictitious, mendacious – just dead wrong.

Russell Baker, American journalist


Introduction, Live from NECSS, Book Update

  • Perry DeAngelis Memorial Episode

Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.

S: Hello and welcome to the Skeptics' Guide to the Universe. This is your host, Steven Novella and today is August 6th, 2022. Joining me this week are Bob Novella...

B: Hey, everybody!

S: Jay Novella...

J: Hey guys.

S: ...and Evan Bernstein.

E: Hello everyone.

S: And we have two in-studio guests, Kelly Burke.

KB: Hello.

S: Kelly, welcome to the SGU. This is your first time on the show.

KB: It is.

S: And Andrea Jones-Rooy. Andrea, welcome back to the SGU.

AJR: Hello. Thank you for having me.

S: Thank you all for joining me in the studio. You were here live, so we had to have you on the show. Now Cara was going to join us for this episode, she was going to join us remotely, but as you remember, she had surgery not too long ago and she's going through a bit of a rough patch. She did want us to say that Cara does struggle with depression and she's having a depressive episode partly due to hormones and the surgery and everything that's going on. She's dealing with it, but that has to be her priority this weekend to deal with that. And so she decided to not do the show. So we wish her well, she'll be back for next week's show. But we have six people instead of five to make up for it. So as you all know, this episode, every year, this is our Perry DeAngelis Memorial Episode and before it was NECSS, it was just the Perry DeAngelis Memorial SGU episode, then it basically morphed into NECSS and we kept it as the episode where we remember our lost rogue, Perry. Damn, that was 15 years ago, guys.

E: Oh my gosh.

S: He was with us for two years and it's been 15 years since he was on the show. It's just unbelievable. And of course, we remember many of the friends that we lost along the way, David Young, Michael Oreticelli, all lost too young, too soon, all really, really good friends of the SGU. So we like to remember them every year. Okay. So as you all know.

J: I have an announcement from our book publisher, if you don't mind.

S: Oh yes, go ahead, Jay. The actual, physical book.

J: That book is the result of an incredible amount of work. So this book is about, it's about science, it's about the history of science and it's about making predictions on future technology. Historically and modern day. And we had a lot of fun writing the book. It was really intense, but it's an archive now of incredible information about predictions that were made in the past, predictions that were made five, 10 years ago, predictions that are made today. We also wrote some science fiction for this to illustrate some interesting future concepts of technology.

S: Yeah. What I like is that it's also a time capsule. It's like our own time capsule for the future. So future generations can look back and see how we did. Just like we are looking back at the past future and see how they did.

B: I hope they don't laugh at us the way we've been laughing at them. Yeah.

AJR: We were talking about the Jetsons earlier.

E: Happy birthday, George.

J: If you go to skepticsguidetothefuturebook.com, is that right? skepticsguidetothefuturebook.com. And you fill out the form there and you put in the secret password, which is the word "future". Don't tell anybody. It's a secret.

AJR: That's clever.

S: [inaudible] come up with that.

J: And you will be entered into a giveaway of the very first signed copy of the book.

B: Wow.

J: So please go to skepticsguidetothefuturebook.com and the secret password, George, what's that secret password?

GH: Flabbing garbage.

J: Flabbing garbage. Or spelled as in non-George language, "future". The word is "future".

S: All right. So this slide is just to remind everybody that the theme of NECSS 2022, the 14th NECSS, is the misinformation apocalypse. You've had a lot of talk so far about it. That theme might crop up on the SGU show this weekend. But we've tried to focus on the positive, right guys? We don't just want to say how bad it is. We want to focus on what you can do about it. And there's a lot you can do about it. All the things that we try to do every week: understand science better, be better critical thinkers, be more humble, know how to communicate, be more positive when you communicate, understand how to access information over media, how that all works. We'll hit some of those themes during the show today as well.

J: Well, it's always awesome. As one of the people that organize NECSS, we talk to the speakers, but we don't get an incredible amount of detail about what their talks are going to be. Because we're just trusting professionals. There's some conversation, but it's not detailed.

S: Always a bit of a throw of the dice.

J: It has to be. It has to be the people and not the talk, basically. So when we get to hear the talk and we get to see how it folds into our theme and what information that they cover, it's always a fun discovery. Oh my god, that's cool. It's more relevant than I thought, or they went into an area I didn't expect them to go. I thought your talk this morning was a lot of fun that you had with Richard Wiseman.

S: Oh yeah. Richard Wiseman's─

AJR: That was really cool.

S: ─very easy to talk to about stuff like that.

J: Definitely.

S: Always a pleasure. Andrea, you just gave your talk on political science and misinformation, which is obviously a huge intersection there. So there's still a lot to learn. I think after doing this for 17 years, it doesn't amaze me, but it's always fascinating how much we still have to learn about something that we've been knee-deep in for a quarter of a century. We've been doing skepticism since '96, 26 years, the podcast for 17 of those years. But there's just so much depth, and it's getting deeper, which is the good thing: this is not a static field. It's dynamic. We are actually learning more, and we have to update ourselves.

E: Constantly.

J: Well, the world, I mean, look at what's happened since this podcast began. Look how much things have changed, like you were talking about time capsules. Just take a look at what happened in science, the skeptical community itself, and in politics. The world has changed so much, and we're running to try to keep up with it, if anything. It's not an easy thing to do.

S: Yeah we're focused on almost entirely different issues now than we were at the start. The things that are important or critical have been constantly evolving. And then some things come back and stay the same. We're doing UFOs again. Really, we're all at the same point. So some things are different, some things are the same. It's always interesting.

B: I'm just waiting for Bigfoot to become relevant again. How long before Bigfoot is like, we're talking about Bigfoot again?

S: Yeah, that's one of those eternal ones that never goes away.

AJR: I saw a Loch Ness Monster post on Twitter recently.

E: Oh, yeah.

AJR: New evidence.

E: Loch Ness Monster.

J: New evidence. I love that. New evidence.

E: Never goes away.

J: New blurry photos.

S: Even after it's been definitively debunked. The guy confessed: yeah, that was me, I made the first photo.

J: Steve, that's an admission that there's an Illuminati. That's what that is. Come on, man.

Special Segment: Chorizo Hoax (7:09)

S: All right. Let's get to some fun stuff. What do you guys think that is? I'm showing a picture on the screen there.

J: That is a meeple. I mean, whoever made it needs a little help, but we'll get there.

S: You're close. So a French scientist spread this picture on Twitter, claiming that it was─

AJR: Is it a Loch Ness Monster?

S: ─a close-up photo taken through, did he mention the telescope?

E: James Webb.

S: This is a James Webb close-up of Alpha Centauri.

B: Proxima Centauri.

S: Did he say Proxima?

B: I think he did.

E: I think he said Proxima.

S: Proxima Centauri, the closest star to the Earth. And it was pretty much believed by a lot of people. Turns out that is a picture of essentially a slice of chorizo. (laughter)

J: I love it. Oh, my god. It's so awesome.

AJR: Yeah, we're having chorizo after the conference is over? Because that's great.

S: Can we have some of that?

E: Look at all those solar swirls in that chorizo.

S: He must have looked at it and goes, you know what? This kind of looks like those blurry photos, close-ups of the sun. I wonder how many people I can get to believe that.

J: You know what I love about this? It's not even cropped. That is the shape of the piece of meat. That's it. That's the whole thing. That's great.

S: It is funny. There is a superficial resemblance. He later apologized, I'm not sure he had to, but we talk about this at times. He's a scientist, and he pulled a little prank. I thought it was pretty harmless, and the point was be a little bit more skeptical before you believe things online. But I do agree that it's problematic to have a scientist doing it, because then we're simultaneously saying consider the credentials of the person that you're listening to or that you're getting information from. If people are saying, hey, no, a scientist put this up. This wasn't just some random guy on the internet. This was a scientist saying, but he was pranking us. It may cause more harm than good. Kelly, you are a social media expert. What do you think?

KB: I was going to say, that's actually pretty tricky with the James Webb pictures, too, because I've noticed not all of them are coming from NASA, because the data is just out there and anybody can compile the pictures. So anytime I've seen something presented as a James Webb picture, I have to go and look into it, because it's not coming directly from NASA. So I could totally see why this took off.

B: My knee-jerk reaction is, wait a second, stars are point sources of light. You zoom in as much as you want. You're really not going to see the disk for the most part. That has been true for as long as astronomy has been around, until, of course, relatively recently. Now we can zoom in on certain stars, certain giant stars or stars that are close, and we can at least observe some of the disk itself. It's not just a point of light. And I think the number now is 23. 23 stars we have actually seen part of the disk or a little bit of the disk. Sometimes you can even see the convection cells. So it's not an outrageous thing to say I could see the disk of this nearby star.

E: It was not implausible, right.

B: And if you looked at some of them, we found one.

S: Well, yeah, I got it here. I do want to point out before we move off this picture, though, that while that's correct, you can see the grain of the meat. This is an in-focus photo. If he had just blurred it out, it would have been a hundred times more powerful.

KB: If you're looking on your phone, it's really tiny.

E: That could also be a bowling ball for all you know.

S: So this is the closest up picture I could find of Alpha Centauri A and B, including Proxima Centauri. Actually, Proxima C, I always forget. Is that the third star in the system?

B: I just call it Proxima, and they're messing around with the names of these stars.

S: Yeah, but this is Alpha 1 and 2, or A and B. And you can see they're basically point sources of light. You're not seeing really the surface of those stars. There's some flare, lens flare, but you're not seeing the surface. But Bob and I found not the best picture of Alpha Centauri, but just what's the best picture of any star ever, and there you go. And it looks pretty much like a blurry slice of chorizo.

J: It doesn't even look symmetrical, Steve.

S: Why would it?

AJR: It's kind of a fattier chorizo, this one, though, right?

S: So as I said, if you blur out that chorizo slice, you have a pretty good facsimile of a close-up picture of the star. Now this is, what was it, about 520 light years away?

B: Yeah, surprisingly far.

S: But it's a red supergiant, so it's massive.

B: So that helps.

S: Yeah, that helps a lot. Yeah, that was more plausible than I thought when I first saw it.
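
(A quick back-of-the-envelope check on this exchange, with assumed round numbers rather than figures from the show: a telescope resolves a star's disk only if the star's angular diameter beats the diffraction limit, roughly 1.22λ/D. A nearby red dwarf like Proxima loses badly; a red supergiant hundreds of light years away can still win, which is why that record photo exists.)

  import math

  # Back-of-the-envelope sketch (assumed round numbers, not from the episode):
  # a star shows a disk only if its angular diameter exceeds the telescope's
  # diffraction limit, theta ~ 1.22 * wavelength / aperture.
  RAD_TO_MAS = 180 / math.pi * 3600 * 1000   # radians -> milliarcseconds
  LY = 9.461e15                              # metres per light year
  R_SUN = 6.957e8                            # solar radius in metres

  def angular_diameter_mas(radius_rsun, dist_ly):
      return (2 * radius_rsun * R_SUN) / (dist_ly * LY) * RAD_TO_MAS

  def diffraction_limit_mas(wavelength_m, aperture_m):
      return 1.22 * wavelength_m / aperture_m * RAD_TO_MAS

  # A small red dwarf like Proxima (~0.15 solar radii, ~4.2 light years):
  print(f"red dwarf, close:      {angular_diameter_mas(0.15, 4.2):5.1f} mas")
  # A red supergiant like the one in the record photo (~760 R_sun, ~520 ly):
  print(f"red supergiant, far:   {angular_diameter_mas(760, 520):5.1f} mas")
  # An 8 m telescope observing visible light (550 nm):
  print(f"8 m diffraction limit: {diffraction_limit_mas(550e-9, 8.0):5.1f} mas")
  # ~1 mas vs ~44 mas against a ~17 mas limit: the supergiant is (barely)
  # resolvable, while Proxima stays a point source of light.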

AJR: I feel like the scary thing is that we're all so worried about misinformation that scientists can't make jokes. It's kind of where we're going to live. Not that this was the best joke of all time, but the idea of a prank is sort of, it feels irresponsible, and it's too bad that that's the case, because it's making science fun and engaging, and you could imagine he could do a fun quiz show, like Cartwheel Galaxy or Lollipop or whatever, right? Fallon could do a segment, but it feels like it would cause more harm than good, which is...

S: Right, unless you're transparent up front. If he did it as, like, you might think this is a close-up star, but it's actually a chorizo. Here's a close-up star, something like that.

KB: That might be a new thing for our social media. Close-up, what is this?

J: This one looks more like a pizza pie, though.

AJR: It's like that gimmick they did forever ago, where they were like, was this a famous painting, or did a gorilla paint this?

J: All right, so real quick, apparently the password that the publisher put up there, the space for the password only takes five characters, so just type in the first five characters of the word future. You can't get good help these days, George.

GH: Futter.

AJR: Futter.

J: I don't understand. The password field actually has a limit to how many characters it takes. How does that even happen?

E: You have to pay more for the six characters.

AJR: Most passwords require it to be way too long these days, and I can't fill it in.

GH: A third book, Jay, maybe you could have six characters.

J: Look, this is what I'll do. I'll call the publisher on Monday, and I'll tell them, forget the password, just whoever entered is going to be legit. So just put your info in there if you want to enter in.

S: All right, let's get to some news items. We have more fun bits coming up later, too, but first a couple news items.

News Items

Scientific Rigor (13:25)

S: That picture of a giant complex of buildings that I'm showing is the NIH, the National Institutes of Health. They are essentially the main biomedical research funding institution in the United States. They are a creature of Congress, as we like to say. They are created, funded by Congress. Essentially, if you do biomedical research in the U.S., you get your funding from the NIH, more likely than not. They're massively important for medical research. Recently, the NIH created an initiative. It's not a new office or anything. It's just an initiative. They're funding specific groups who are going to create an educational module to teach researchers how to do rigorous science. That sounds pretty good. That sounds pretty good to me.

J: That doesn't already exist, though?

AJR: That's my thought.

S: That's a good question. Right now, how do we teach researchers how to do good research methodology? Some universities may have courses on it. They may be required. They may be elective. They might be a statistics course or a research methodology course. You do get that, but not like, all right, here's how you do really rigorous research. Here's how you avoid p-hacking or how you avoid false positives, etc., etc. Clearly, that is needed for reasons that I've been talking about and writing about for the last 20 years. The other way that people learn that is through, essentially, individual mentorship. You work in somebody's lab, and they teach you how to do research, not only in their specific area, technically, but also just, this is what good science is. But it's not systematic, and it's not thorough enough. Clearly, there's a perception that there is a gap, a gap there. They want to fill that gap. Their goal is to fund the creation of this module to teach rigorous research design and to then make it freely available, basically. And then the hope is, so universities may require it. They might say, all right, if you're going to work at our university, this already happens. I work at Yale, and I have to do 20 different certifications every year on everything, like sexual harassment sensitivity or how not to burn your eyes out or whatever, all of these things.

E: That's a good one.

S: How to treat patients ethically, all good stuff. A lot of safety things all in there. But just adding one that's, here's how not to do fake research. Here's how not to accidentally commit research fraud. Or how to p-hack or whatever. It would be very easy to slip that into the existing system of getting certified for quality control. That's basically what this is. Now, the NIH, of course, they could require, if you apply to the NIH for a research grant, and they're not saying they're going to do this, but imagine if they said, all right, in order to get this grant, you've got to have certification that you took this module and you passed. Because again, they're interested in not wasting money. That's their primary interest. Obviously, they want to do good science. That's their goal. Their mission is to obviously do good science, but they have a finite budget, and they want to make the most use out of that money. That, again, is their mission. One of the biggest wastes in research is bad science. If you publish a study, and it's a false positive, let's say, you think that you have a result, but you did poor methodology, you p-hacked or whatever. You underpowered the study. Or the blinding was inadequate. Or your statistics were off, or whatever. And then when other people try to replicate that study, how many millions of dollars could be spent proving that your crappy study was crappy, when you could have filtered it out at the beginning by putting in some internal controls that you didn't know you should do? Or by tightening up your research methodology. The other goal here, beyond just doing good science, is to save money by weeding out the inefficiency in the system that comes from, not fraud, but just bad rigor in research design. It makes sense that once these modules are up and running, phase two would be: you've got to be certified in this before we'll give you any money. So that's one way to do it, and again, the NIH already does this for other things. For example, they now require, this has been going on for about 10 or 15 years or so, that if you get public money to do your research, you have to make the results of your research available to the public and accessible by the public. You have to say how you are going to explain your results to the people who are paying for your research, the public. So this would be another way: how can you assure the people who are funding your research that you're not wasting their money? By doing rigorous research design. And by the way, here is an educational module, and we could easily connect certification to that. That's awesome. I would like to see big science journals do the same thing. You want to get published in our journal? We require that the author, the lead author, or every author has certification. And of course, once either of those happens, like if the NIH says you need to have certification to get grant money, you better believe every university will make sure that it happens. They're not going to have any of their people not be able to get NIH grants. So it's very easy to make this systematic. So again, we're right at the very beginning of this, and everything I'm hearing and seeing is very, very good. We'll keep a close eye on it. And again, a lot of people react like you, Jay: really, why isn't this kind of thing already happening? I would say there's two things. One is people think it is happening, but it's just not happening enough.
The second one is that the science of doing rigorous science has been getting better. We're learning more and more subtle ways in which studies go awry or results can be tweaked or researchers can put their thumb on the scale. We talk about researcher degrees of freedom and researcher bias and publication bias and citation bias and all these things that can alter the utility and the rigor and the quality of science. Essentially, the old method of just relying upon some classic statistics class, and then whoever's lab you work in, they'll teach you how to do good science, is just not good enough anymore. It's got to be systematic, and everyone's got to go through it in order to absolutely minimize the waste in the system that comes from poor research design. So this is a massive move in the right direction. This is very, very encouraging.
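
(To put a number on the p-hacking Steve keeps mentioning, here is a minimal simulation, mine rather than anything cited in the episode, with an assumed ten outcome measures per study: under the null hypothesis every p-value is uniform on [0, 1], so a lab that quietly tests ten outcomes and reports whichever crosses p < 0.05 will "find" an effect about 40% of the time even when no effect exists.)

  import random

  # Minimal sketch (assumed numbers): each simulated study has NO real effect,
  # but the researcher tries several outcome measures and reports any that
  # reaches p < 0.05. Under the null, each p-value is uniform on [0, 1].
  random.seed(0)
  TRIALS = 100_000   # simulated null studies
  OUTCOMES = 10      # outcome measures "tried" per study (assumed)
  ALPHA = 0.05

  false_positives = sum(
      1 for _ in range(TRIALS)
      if min(random.random() for _ in range(OUTCOMES)) < ALPHA
  )

  print(f"Nominal false-positive rate: {ALPHA:.0%}")
  print(f"Rate with {OUTCOMES} hidden tries: {false_positives / TRIALS:.0%}")
  # Analytically: 1 - (1 - 0.05)**10, about 40% -- an eightfold inflation.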

J: Steve, where did you learn how to do it?

S: For me, well, it's been the whole science-based medicine initiative, which is, I've been reading about it, following the literature on it for 20 years and writing about it, trying to digest it. That's basically what we explore at science-based medicine: how to do rigorous science, the relationship between science and practice. How do we know what's true, what's not true? Where's the threshold of evidence before something should affect your practice? That's what we do. That's how I learned it. It was all basically just self-taught by reading the literature, talking to my colleagues, writing about it, engaging about it. But most researchers are not spending most of their time, their academic time, doing that. They're doing their research. They're trying to figure out what receptor is causing this disease or whatever. This is sort of part of that, but it's not their focus. That's why it needs to be done systematically. This is also, one final word and then we'll move on, part of a bigger trend that I've noticed, at least in medicine. Andrea, you can tell me if you think it's true in your field as well, that you're going away from the model of just counting on mentorship and counting on that people will learn what they need to learn and moving towards things that are way more systematic, that are verified, and also that there are checks in place rather than just trying to raise the quality by just over-educating people. You just have checks in place to make sure that they do it. Medicine is getting too complicated. Science is getting too complicated to rely upon methods that are not absolutely systematic. Is that something you find in academia from your end?

AJR: Definitely. I'm thinking about something that I think Jay brought up on a different live a while ago about the movement towards pre-registering your hypotheses. That's another way of just putting the system in place, because it turns out we can't rely on everyone to do great science even though we all like to think that we're doing it. Where I thought you were going, Steve, with that was peer review, which we can't rely on exclusively. Well, we still rely on it a lot, but peer review is not a perfect process. It's a strong process in a lot of ways and I don't have great ideas about what to do instead, but it's not like it's perfect. A lot of stuff gets through peer review, and so this is something that could help steer people. The only question I'm having, though, is whether you could imagine a world where the modules are sort of methodologically specific. I'm thinking of machine learning, where you have issues with overfitting your model. That would be totally irrelevant to someone running an experiment. I don't know what the future would look like. Ten years from now, are there different modules? Do we need different modules?

S: This is what exists currently in medicine. If I'm doing some quality control certification thing that I do every year, there's the first part of it, which is for everyone or maybe every physician, and then you say what your specialty is. I'm a neurologist. Then you get the neurology-specific stuff. You could do the same thing. Here's the generic rigors that everyone needs to know, and then what are you doing research in? Particle physics? Here's the particle physics part of the module for you for those specific issues. I could absolutely see that working that way.

AJR: I kind of like the idea of making a bunch of social scientists do the particle physics, just to keep us humble.

S: Absolutely.

More Space Debris (23:51)

S: Jay, tell us about crap falling from the sky.

J: Steve, there's crap. And it's falling from the goddamn sky.

S: Oh, my goodness.

J: This is about the fact that space agencies around the world are not doing a very good job of figuring out how to exactly de-orbit pieces of spacecraft that are left up there for one reason or another. There is a significant number of objects in low Earth orbit. NASA tracks anything from 2 inches or 5 centimeters and up. And there's 27,000 objects that are being tracked. And 70% of the tracked objects are in LEO, low Earth orbit, which is the orbit that's basically as close to the Earth as you could pretty much get.

S: Do they say LEO? I've only ever heard L-E-O.

AJR: I just thought you meant something astrology, Jay and I was like, I can't believe this is happening. I've got to go.

J: I'm blazing trails here. It's low Earth orbit. Every one of these objects that are up there, and that are going to be up there for a long time, is a hazard. They're dangerous. They actually have to plan accordingly. When anybody launches anything into outer space, they have to figure out the right time to do it and how to avoid these known objects. Because one of them could be traveling at such an incredible speed in relation to the ship that you're putting up there that it could destroy it. It could rip right through it. So this is a growing issue, and we have another issue that is a problem, which is that there are objects being left in low Earth orbit that are big, that are slowly de-orbiting over time, because there's a tiny, tiny, tiny, tiny bit of atmosphere in low Earth orbit. And that's just enough to slowly take something out of orbit and bring it back down to Earth. As an example, China had one of their Long March 5B rockets bring something up, and a week later it came out of orbit. It was only up for a week, but by that time there was enough inertia and everything to get it back down into the atmosphere, and pieces of it landed in Malaysia and Indonesia. And it landed right near a village where people were living. It is a real threat. And we're not talking about millions of people getting hurt, but it could kill people. It could kill handfuls of people now and again, which is something that we definitely want to avoid. It's also just not good practice. It's not keeping your shop clean. So getting back to the Long March 5B rocket, now this rocket is huge. China launched it on July 24th, and they were bringing up a new space station module to their Tiangong space station, which is a China-only space station. It's actually pretty cool, you should read up on it. Now this rocket is not designed to de-orbit itself. They don't send it up with the ability to do that. And in fact, the engines can't even restart after the engines are shut off. When it does its main push and gets all that weight up to the altitude that they need it to, and those engines shut off, they can't go back on. This ultimately means that there's no way for China to control the de-orbiting of this massive rocket. It's just going to fly back into the Earth's atmosphere, and I'm not even sure that they know where it's going to end up going. I don't even know if there's good physics that will really accurately predict where something willy-nilly de-orbiting like that is going to come back into the atmosphere. It could end up anywhere, which is the scary part. Believe me, I feel completely happy and thrilled and lucky that we're alive during a time when space exploration is starting to explode again. It's a great time.

S: Hopefully explode.

J: Yeah, you're right.

AJR: Pun intended Jay?

J: When all of these nations are launching new projects, how's that? Is that better?

E: Better.

J: What we don't have right now are proper rules of etiquette. There are things that people would like. NASA is making it known what information they would like, but in this instance, China didn't share any of the information about what trajectory their rocket was on and where they think it'll end up coming back into the atmosphere. The NASA administrator, by the name of Bill Nelson, said, and I'm quoting him: "All spacefaring nations should follow established best practices and do their part to share this type of information in advance to allow reliable predictions of potential debris impact risk, especially for heavy-lift vehicles like the Long March 5B, which carry a significant risk of loss of life and property. Doing so is critical to the responsible use of space and to ensure the safety of people here on Earth." I wish that I could have found some information on what would have happened if one of these pieces of larger debris ended up barreling into a city. Could it take a part of a building out? What's its velocity? How much mass does it have? I do know that SpaceX had a module, a piece of debris, come back down as recently as July 9th. Now, if you look at a picture of the Crew-1 module, there is a component that's right underneath it that is used to relay electricity to the module and all that, but it's also a cargo hold, right? A cargo hold that's not pressurized. This thing is about 3 meters long and it weighs 4 metric tons. That's an incredibly heavy object that hit the Earth at one point. It came back down on July 9th, and it took a year for it to deorbit. So that's just another thing that needs to be tracked. It could take time for these things to come back down, and then we have to try to figure out where they're going to go. But okay, let's say we know where it's going to go. So what? What if it's going to hit a major city somewhere? What are we going to do about it? The answer is: there's nothing. There's nothing we can do about it. Are we going to shoot rockets up to take out rockets that are coming down? The whole thing is crazy. So what we need to do is have these rules of etiquette where space agencies start to send up more fuel, have rocket engines that can deorbit themselves and not only have one turn-on cycle. These are pretty costly and probably very expensive engineering feats that need to become a part of all of these projects. And that's what NASA wants. But right now...

S: Just to make sure that the point is crystal clear, it's to control the deorbit so that we know where it comes down. We dump it in the middle of the Pacific so it doesn't hit Australia or whatever.

J: Exactly, yeah. So right now there's a couple of companies that are starting to, or space agencies that are starting to comply and build in this functionality into the new rockets that they're building. But let's face it, it's not a global thing. A lot of people aren't doing that. Some good things that we have are like SpaceX, which is leading the pack on this whole idea of reusability. That's fantastic. You want to reuse your rockets. You want your retro rockets to land themselves. You see it all the time. That's great. More reusability that we build into things means more control. More ability to bring things down safely, which is exactly what everybody needs to be doing. One, we don't want to pollute low Earth orbit any worse than it is. If anything, we want to get that stuff out of there, which no one has come up with a feasible economic way to do it yet. But I imagine at some point in the next 50 years, someone will come up with something that's making that move. But in the meantime, our goals are no more debris and absolutely no more craziness of things falling out of the sky without any predictability on where they're going to go or drivability, meaning we want them to go to a specific place. So what do you think about that, Steve?

S: Well, it wasn't too long ago. It was just a Science or Fiction item where an estimate was that in the next decade, there's actually something like a 10% chance of somebody getting hit by space debris.

AJR: Remember that. We all thought it was fiction.

S: Yeah, it's getting pretty significant now just because of the sheer volume of stuff that we're putting up there. So, yeah, it's, again, one of those things that we have to take a systematic approach to it rather than relying on individuals to all do the right thing.

J: How would we figure that out, Steve? Where would we come up with such an approach?

AJR: People aren't just going to automatically do the right thing on their own volition. It's just stunning.

S: I know.

AJR: I feel like we're going to have apps where you have, weather forecast, air pollution, space debris. What's the probability of that thing landing in Manhattan today?

S: Take your umbrella.

AJR: A steel umbrella.

E: 50% chance of rain, 5% chance of...

AJR: Low Earth orbit de-orbiting.

KB: Emily Calandrelli, who does a lot of space-related science communication, was following this one as it was coming down. And what shocked me about it was we really didn't know where it was going to be until, like, an hour before. Even days before, it was like half of the Earth was in the possible target area. But she did say, at least for this one, they thought, again, they didn't really know what exactly it was made of, but it would only take out a house or two.

J: A house or two.

S: Just a house or two.

AJR: It's all fun and games until it's your house.

KB: Since you suggested a city, a house was the better alternative.

S: Does space debris zero in on trailer parks like tornadoes do? I'm just wondering.

AJR: And lawn chairs and stuff.
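
(Kelly's "house or two" is about the right order of magnitude. A rough sketch, with every number assumed rather than taken from the episode: a dense chunk that survives reentry has long since slowed from orbital speed and hits at terminal velocity.)

  import math

  # Rough sketch (all numbers assumed, not from the episode) of why surviving
  # debris takes out "a house or two" rather than a city block: a chunk that
  # survives reentry falls at terminal velocity, not at orbital speed.
  RHO_AIR = 1.225   # sea-level air density, kg/m^3
  G = 9.81          # gravity, m/s^2

  def terminal_velocity(mass_kg, drag_coeff, area_m2):
      """Speed at which drag balances weight for a falling object."""
      return math.sqrt(2 * mass_kg * G / (RHO_AIR * drag_coeff * area_m2))

  # Assumed stand-in for something like the 4-tonne SpaceX trunk:
  mass = 4000.0   # kg
  area = 7.0      # m^2 frontal area (assumed, ~3 m cylinder)
  cd = 1.0        # blunt-body drag coefficient (assumed)

  v = terminal_velocity(mass, cd, area)
  ke = 0.5 * mass * v**2
  print(f"Terminal velocity: ~{v:.0f} m/s")
  print(f"Kinetic energy: ~{ke / 1e6:.0f} MJ (roughly {ke / 4.184e6:.0f} kg of TNT)")
  # ~96 m/s and ~18 MJ: devastating for a building, but nothing like the
  # ~7,800 m/s the same object had while it was still in orbit.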

J: But there are things to consider, though, because it's not just that. Could there be explosives in there? Could there be some leftover rocket fuel fumes? Or, I have no idea, like, what potential explosives.

S: They're probably out of fuel.

E: Probably burned out.

J: You'd hope.

S: Yeah, you'd hope.

J: Who knows? What about waste? What about dangerous gases and things like that?

E: Well, when Columbia broke up in 2003 and came down over the American South and Southeast, there was concern that they didn't know what sort of contamination, I think, there was in some of the materials, that people were finding and picking up, a piece of a helmet and things. They warned people to not go near them. So I don't know what sort of danger that...

S: I don't know. It always comes up whenever they're sending up any satellite or anything that has a nuclear battery in it. If that thing blows up or reenters, then we could be dumping nuclear waste.

AJR: Well, now I'm thinking Cold War Sputnik stuff, too, where it's like, what if it's not an accident? Not to be the conspiracy theorist of the group, but that would be a good way to, anyway, I'll stop with that one thought.

S: All right.

Auditory Pareidolia Again (34:16)

S: This is actually a couple of years old, but it's making the rounds again, and I saw it. I don't think we've ever played this on the show.

E: I missed it the first time around.

S: Take a look at this video, just listen to the sound. You don't have to see the video. So either think the word brainstorm or think the word green needle. And whatever you think, that's what you will hear. You don't even need to be cued with the actual words. You just have to think it. Isn't that bizarre? That's crazy.

KB: Although I'm hearing the green needle a lot more than I'm hearing the brainstorm. But it's distinctly green needle or not green needle.

S: Yeah, but I can flip both ways at will.

J: You would think, though, they seem like such different phrases phonetically and everything, but it's in there. There are things in there that will trick your brain for both of those.

E: It's uncanny.

AJR: It's not even the same number of syllables, which is surprising to me that it still works, right?

S: Yeah, it's one extra syllable.

E: Two versus three.

B: I think the distortion itself must be a critical component of the ability to switch between it from one to the other, perhaps. Otherwise, why make it sound so distorted?

IC: I believe it also works with brain needle and green storm as well, if you try it.

E: How did you stumble upon this?

S: It's one of the more dramatic examples of auditory pareidolia. This happens in a lot of our sensory streams, but it happens a lot with language. Your brains are wired to make the closest fit to phonemes that you know. It's constantly trying to make that fit between speech sound and words that you know. That's why you can misunderstand lyrics all the time and misunderstand what people say. It sounds like something close to it. This is just demonstrating that in a very dramatic way. It's amazing how well the priming works.

E: As far as the distortion, when Bob brought up the distortion, it reminded me of, we talked about it on the SGU, the doll that would talk. Pull-string dolls. It has a recording. It's a voice, but it's a crackly kind of voice. It has a bit of distortion to it. And people think they're hearing things that the doll is saying that it really isn't programmed to say, but they can't distinguish what it was programmed to say. They're hearing what they think it's saying instead. We've come across this before in other mediums.

AJR: Is this behind those Disney conspiracies too, where they're like, there are secret horrible messages in various cartoons?

E: Remember, [inaudible] is the light, that was one of the dolls that had it. That's not really what the doll was saying, but it spread virally, and that's what everyone started to hear it was saying, because it was suggested that that's what it was saying.

S: The backward masking on records.

J: I was just going to say that Steve. I've listened to Stairway to Heaven backwards. I really hear a lot of stuff in there that has a demonic connotation. The words that they're saying. It's probably because I've been priming myself since I was a teenager. When I hear that, every once in a while I'll listen to it because it's actually kind of interesting. I'm hearing, here's to my sweet Satan and all that stuff. It seems very clear to me. Again, your brain is trying to make sense out of chaos. Sometimes your brain concocts something that isn't actually there.

S: It's kind of like the dress.

AJR: I was just thinking about the dress.

KB: Or Laurel and Yanni.

S: Yeah, Laurel and Yanni. The internet will spit out more of these things. We'll share them with you. This was a particularly impressive one.

[commercial break]

The Alex Jones Saga (39:15)

S: All right. One thing that we can agree on, that is that Alex Jones is a giant douchebag.

J: Yeah. He is.

GH: You don't have my permission to use that photo. (laughter) I'm going to get your internet permission to not use that photo. Buy my vitamins.

S: I have a worse photo. All right, Kelly, give us an update on the Alex Jones saga.

KB: Yes, so I, like the insane person I am, have kind of had this on in the background for the last two weeks, and I was very glad to have an opportunity to put that to use. But in Steve fashion, I'm going to start with a question. So what percentage of Americans do you guys think question the Sandy Hook shooting?

E: 20%.

S: Four foot one.

J: 10%.

S: Question it? Probably I would say like 22%.

B: 22.1%.

AJR: 25%.

E: Wow.

KB: It depends on whether we're doing Price is Right rules or not, but I don't think we are because I didn't say it, so Andrea wins.

E: Oh, it's that high?

KB: So it's 24%.

B: That's horrible.

J: A quarter of the people polled.

E: Close but not close enough and that's sad.

B: Price is Right rules, I would have won.

KB: Granted, there are always issues with polling, but even if it's half that, that's absolutely insane, and it's almost single-handedly because of Alex Jones. So I'm going to talk more about the misinformation piece. I know everyone has seen all of the clips of his testimony and all of the perjury and all the fun stuff. But since this is a misinformation conference, I'm going to focus on that aspect of it. And I think as skeptics, we often hear the question, what's the harm? Especially with things like conspiracy theories or supplements. It's just easy to dismiss until it gets to this point. And Alex Jones took both of those things and ruined some families' lives. So, some background. The caricature that you think of as Alex Jones is pretty much accurate. He peddles all of the conspiracy theories, 9/11 truth, Pizzagate. Now he's talking about the globalists trying to bring about the New World Order. And when the Sandy Hook shooting happened, he almost immediately was questioning the narrative. And he's gone from saying it's a hoax to calling the parents crisis actors, and that's changed over time. His position has definitely evolved, but the consistent through line is that he's questioning the official story and doesn't think that the official story is true. And because of this, the families of the children who died have received death threats, they've been harassed, and they're dealing with this constantly circulating. So a bunch of the families have sued him, rightfully so. And so this trial was for the parents of Jesse Lewis, who was a six-year-old who died in Sandy Hook, for defamation and intentional infliction of emotional distress. And we're about to make fun of Alex Jones, but as we're doing it, keep in mind that this all sounds silly and ridiculous, but it's causing real harm to these families. And I don't want to make light of it, but at the same time, there's something really satisfying, especially in the misinformation apocalypse that we're in right now, about somebody who is this awful actually being held accountable. So we've got to at least appreciate that for a minute. Also, his lawyers are comically terrible. So that's just making it even better.

S: Just wonderful.

J: How can they be that? For a guy that has this much money, how could he─

S: Because he's a losing case.

KB: Because nobody wants to defend him.

S: He probably has been working his way down the ladder of terrible lawyers.

E: And you've had that experience.

KB: I mean, his lawyers were pretty terrible.

E: With your case, your opponent had that as well. He kept going through lawyers because nobody of quality would defend him.

S: Who wants to defend this guy?

B: Or, my theory is that they did it on purpose.

AJR: That's what I was thinking.

S: You think they're sandbagging?

AJR: Yeah. His morals got the better of him.

S: No. That's way unethical.

KB: That thought has been brought up. But the thing is, one, it's a civil case, so he can't get away with the whole, my lawyers were incompetent, so get out of it that way. But also, they cross-examined the parents. And I feel like if you were sandbagging it, you wouldn't want to inflict additional trauma on the parents. And some of the questions that he was asking them, I couldn't believe.

AJR: Have the lawyers made a statement about how it happened? Because it's hard to accidentally send a huge set of files or file. I always forget to send attachments.

S: Oh, the phone that's almost definitely going to the January 6th committee is like a whole story in itself. But basically, the one lawyer said, please disregard, after he accidentally sent the files, but didn't actually take the legal steps to pull back all that information. So they just got to use it after his ten days were up. This trial was specifically for damages, because Alex Jones didn't provide any of the documents or evidence that he was supposed to during the discovery phase, and he dragged things on for years, and so there was a default judgment. So it wasn't a question of if the defamation happened. The court had decided the defamation happened. This was just to decide how much he had to pay for it. And the trial was exactly as dramatic as the clips are portraying it to be, and I think this one exchange between Alex Jones and the judge is the epitome of his testimony at least. So I'm going to read that. I'm sorry, I don't have as good an Alex Jones impression as George. So the judge, after sending the jury out because Alex Jones was talking about things that he wasn't supposed to while he was on the stand, said: "You're already under oath to tell the truth. You've already violated that oath twice today." And granted, twice today. He had been on the stand for like 10 minutes by that point maybe. That might be an exaggeration, but it was the end of the day, and he had just gotten on the stand. "It seems absurd to instruct you that you must tell the truth while you testify, yet here I am. You must tell the truth when you testify. This is not your show." And then she explains some of the specifics, and she goes: "Do you understand what I have said?" And he goes: "I..." and she interrupts him and says: "Yes or no." He goes: "Yes, I believe what I said is true." And she cuts him off. She goes: "You believe everything you say is true, but it isn't. Your beliefs do not make something true. That's what we're doing here."

J: Oh my god.

AJR: Wow. And you should really watch that whole clip because there was so much more of it, but I couldn't go into the whole thing. And watch all the clips from his testimony because it is absolutely horrifying, but also really satisfying because he's an awful person and deserves every bit of that.

J: And I can't help, through all the things that I've consumed about this man, I can't help but think that this entire thing is an act.

AJR: I was thinking the same, Jay. I'm wondering what you all think about that. You think he knows what he's doing and he's just pretending?

J: Of course, I'm not 100% sure, but it just seems like it is all a money-making act. I don't think he's a real conspiracy theorist. I think he's─

S: I think you're right.

KB: He uses his conspiracies to sell supplements because he'll talk about the conspiracy theory to get the views and then he pivots into an ad for supplements or for shelf-stable food because the Great Reset is coming and so you need to have food, or gold because there's going to be one world currency, so you need gold.

E: And didn't he admit as much during his trial with his, what, divorce with his wife, effectively?

S: Custody.

E: Was it custody?

S: Yeah, Alex Jones is a character that he is playing. That was one of his lines of defense, which I think probably is accurate. Again, we can't read his mind. We don't really know what he believes or doesn't believe, but it certainly is plausible and it certainly fits everything I've seen about him, that this is a character he's playing. He did admit that, which means he doesn't necessarily have to believe anything.

KB: But he's still doing the same level of damage, whether or not.

S: Totally. Absolutely. People believe that he's real.

AJR: Well, and he's doing the character under oath, right?

S: Yes, that's the thing. That has consequences.

KB: It's been so interesting to watch because he's not used to being challenged on his show. He has control over the entire narrative. Now he has to be in reality. And so he started to do one of his ad pitches on the stand. He started talking about how great his supplements are and they get the best supplements.

E: He can't help it.

B: Oh, my god.

E: It's all he knows, effectively.

B: If he can make a few bucks on the stand, why not go for it, I guess, right?

S: It's always satisfying to see, because this is not the first time this has happened, there are cases where people who are con artists or pseudoscientists or whatever, and they find themselves in a court of law where there are rules of evidence. Not that courts are perfect, but they do have fairly rigorous rules of evidence and argument, et cetera. Judges, if they're competent, aren't going to let you get away with stuff. And just watching that disconnect, somebody like Alex Jones who's living in a fantasy world, whether he believes it or not, he is used to being in this con artist construct, and now he has to deal with reality and rules of evidence, and the clash is just wonderful to behold.

AJR: It's kind of reminding me, Jay, I think you talked about this on a live, SGU Live, maybe a year ago when Sanjay Gupta was on Joe Rogan and we all expected it to be kind of like that, but Joe Rogan just sort of steamrolled the whole thing. This is what I wish that had been like, because now we're in a place where the rules, reality has to hold for a second.

KB: Fun fact, Joe Rogan was on Infowars on 9/11. As he was spewing his─

AJR: One of the least fun-fun facts I've ever heard.

KB: As soon as 9/11 happened, he was already spewing conspiracy theories, and then he had Joe Rogan on.

J: Wait, wait, Joe Rogan was on Alex Jones' Infowars show? Well, that guy literally just dropped lower than I thought he would. That is ridiculous. So I read in the chat, somebody said something about Texas tort law that drops the 45 million down to 750,000.

E: I read that too.

KB: From what I saw from the plaintiff's lawyer, he was saying, so there was talk about a cap because it was divided into two sets of damages. So there were the compensatory damages and the punitive damages. The compensatory damages were 4.5 million, and then the punitive damages were 41 million. And while we were waiting to hear what the punitive damages were, people were talking about a cap because it had to be a certain multiple of the compensatory damages. But from the statement that the plaintiff's lawyer gave afterwards, that was more of a guideline, not a hard cap.

B: More of a guideline.

KB: I'm just going based on his statement. I don't know anything about Texas law, not a lawyer. But that was what I heard about that.

J: I was hoping to see them literally dismantle him and his company. Why wouldn't this guy see prison time?

S: It's a civil case not a criminal case.

E: It's a civil case, you don't go to prison.

J: I understand that, but it doesn't mean that he can't be put in prison legitimately. He did perjure himself.

KB: That would be a whole other story.

S: That would be something emerging from the trial itself. But it's hard to bring criminal charges against somebody for what they're saying in a public forum because of free speech laws, etc. But civil is different. Holding people liable for the damage that they knowingly and maliciously caused, the law allows for that.

KB: One more thing I did want to bring up is, in my opinion, one of the best witnesses that they had. Her name is Becca Lewis, and she does research in misinformation and disinformation and how it spreads. They had her on as an expert witness about misinformation. She talked about how and why it spreads faster than the truth, since it feeds into people's world views, the confirmation bias. The things that confirm their existing world views are going to circulate, especially once you start to have echo chambers like Infowars'. Also, Alex Jones platformed other conspiracy theorists. There was one she talked about whose content only had three views before Alex Jones started promoting it. It was something that nobody was going to see. But because of his platform, a lot of people saw it. Now we have 24% of the country questioning this main narrative. That was a lot of what the trial was about. He would claim, oh, I was just asking questions. I was just having these people on to get their opinion. Oh, my guest said it, but I didn't say it. But he provided that platform for them to get their views out. I think the most interesting thing she talked about was this idea of three degrees of Alex Jones. She said that you basically can't do misinformation research without encountering Infowars and Alex Jones. The common rule is that you're never more than three recommendations away from Alex Jones or Infowars videos.

S: Wow.

E: Ouch.

J: The way to restate that is you can't be more full of shit than Alex Jones.

KB: Yeah, basically. Jones' lawyer was trying to trip her up, and he was trying to use all of the things that a scientist or a skeptic would use. He's talking about sample size and bias and things like that because in any paper at the end, they're going to talk about all of the limitations and say, this is a potential limitation. This is a potential source of bias, but we tried to account for it as best we could. But she's a researcher, so she knew it a lot better than he did. So she'd stop and she'd be like, no, this is what that means. You have no idea what you're talking about.

J: Oh, that's great.

KB: Yeah, and he tried to say that she hated Alex Jones and things like that, and that that would bias her, but she didn't know who Alex Jones was before she started researching this. And she just goes, yes, that's correct. When he'd present something, she'd say, yes, that's correct, and it's based on hundreds of hours of research. It's not just her opinion. And so he kept trying to trip her up, and the best part was he was asking her questions and said that the poll that found 24% questioned Sandy Hook had a sample size under 1,000, and was trying to discredit it that way. And she's like, you can have statistical significance with a sample size of less than 1,000, trying to explain that. And then the plaintiff's lawyer comes up and hands her the actual study, and the Jones lawyer was full of shit because it was over 1,000.
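
(The statistics point is easy to check. A rough sketch, assuming simple random sampling rather than the poll's actual methodology: the 95% margin of error on a proportion is about 1.96·√(p(1−p)/n), so even a few hundred respondents pin a percentage down to within a few points.)

  import math

  # Rough sketch (assumes simple random sampling): 95% margin of error
  # for an estimated proportion p from n respondents.
  def margin_of_error(p, n, z=1.96):
      return z * math.sqrt(p * (1 - p) / n)

  for n in (250, 500, 1000, 2000):
      moe = margin_of_error(0.24, n)   # 24% questioning Sandy Hook
      print(f"n = {n:>4}: 24% +/- {moe:.1%}")
  # n = 1000 gives roughly +/- 2.6 points; even n = 500 is about +/- 3.7.
  # "Under 1,000 respondents" is not, by itself, a valid objection.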

J: So it wasn't even that, yeah. Even the lawyer is full of BS. We're really seeing this trend here with these crazy lawsuits. How do you defend Alex Jones legitimately? How do you do it? You literally have to try to slip through some cracks.

KB: Well, but you also don't have to defend him and say he's innocent. I mean, I know innocent and guilty isn't what's happening here because it's a civil case, but you don't have to say, oh, no, he didn't defame people. You can just try to mitigate the damage in an ethical way.

J: Detune it tight [inaudible].

S: But lawyers can give a defense they don't personally believe; they don't have to believe it. The ethics of law does not require that. It just has to be a legally responsible and viable argument. Their personal belief is actually not relevant to it. So as long as they are mounting an ethical defense, it's fine. But it's certainly reasonable to think that there isn't an ethical defense of somebody like Alex Jones, because it seems so obvious that he's guilty. But again, the law is based upon the notion that everybody deserves a defense. But that doesn't mean that lawyers can do unethical things on the stand. It also, I think, might speak to the quality of the lawyers, because, again, Jones clearly has the money. He could pay some high-priced legal firm to defend him. They probably don't want their reputation sullied with this. They don't want to go anywhere near it.

KB: Nobody wants to be the guy who defended Alex Jones.

AJR: Do we have any idea how much money, like what his net worth is? Like how ruinous is $41 million, $45 million?

KB: They were desperately trying to figure that out.

S: So officially, I'm sorry if you didn't notice, but officially his enterprise makes $200,000 a day. $200,000 a day.

B: Is that net?

S: But that's probably an underestimate. And in the phone records that were revealed, on some days they make up to $800,000.

J: That was their best day.

S: That was a good day, yeah.

AJR: You guys have got to sell supplements, man.

KB: We've got to switch sides. But they had a really hard time figuring that kind of stuff out because he didn't turn over all the documents that he was supposed to turn over.

E: Right, part of the problem.

KB: So they couldn't really get a solid answer on that.

J: What kind of bullshit is that? Okay, so you don't do that. You don't turn over the documents. Doesn't the law, doesn't the court have the ability to deliver some type of incredible smackdown?

KB: So that's what they did. That was why there was the default judgment. And so that's why this was just for damages because they already determined that he was liable for the defamation and for the infliction of emotional distress.

J: I get that they clicked into summary judgment. We have some experience with that.

S: But in a good way. But yeah.

J: Don't you get into legal trouble if you don't hand things over? Doesn't he have to deal with that now?

S: Well, you could be held in contempt, would be the legal remedy there. But just in a case like this, the remedy is you lose. You now lose the case and now we're going to talk about how much money you have to pay the plaintiff. So that was the remedy.

B: He was asked to turn over emails or texts where Sandy Hook was mentioned, and he said, I did a search on my phone and did not see any text that mentioned Sandy Hook. So I want to know, what did the court or the judge do at that point? Because afterwards, of course, they got two years of texts, and it's all over the place. So he was just flat out lying. But if they hadn't gotten that dump, what recourse would they have had to say, yeah, I don't believe you, I don't believe your phone doesn't have those?

S: They can get the info if they want to. They can appoint somebody to go through the phone and get the information they want. When I had to turn over my emails, I didn't do it myself; my lawyer hired an independent person to come in, go through all my emails, and find the ones that were relevant. My hands were not on it at all. All right, anything else you want to add before we move on?

KB: I will throw a quote out there from the lawyer today. So this was just the first of a few cases. And the plaintiff's lawyer said: "There's going to be a large set of plaintiffs dividing up the corpse of Infowars." And fingers crossed that that actually happens.

S: Yeah, that would be nice. Tiny slice of justice in this world.

AJR: The corpse of Infowars.

B: It's a nice sentence.

AJR: Add that to your Halloween display.

B: I would, I would.

Earth Spinning Faster (58:21)[edit]

S: All right, Bob. I understand that the Earth is supposed to be slowing down over long historical time. But maybe that's not 100% true.

B: Well, I don't want to get everybody concerned. But the Earth is now spinning faster than it ever has before in the age of atomic clocks.

E: I thought I felt something.

B: June 22nd, 2022. The shortest day ever recorded. And we're not sure why. Should we be scared? Should we be afraid? So what's going on here?

S: You mean the longest day ever recorded?

B: What did I say?

E: Shortest day.

B: Shortest day.

S: Because the Earth is spinning faster.

B: Faster, so it's short days, right?

S: Yeah, it's getting shorter.

E: Yeah, it'd be shorter.

B: So it all starts with a day. What is a day?

E: Yeah, what's a day?

B: If you ask anybody, what's a day?

E: 24 hours.

B: 24 hours. Steve, what is that in metric? Oh, never mind. So a mean solar day is 24 hours. That's what it is. But that's just the outermost onion layer, as we say. You get a little deeper and it's never exactly 24 hours. It runs a little shorter, a little longer, right around 24 hours, and 24 hours should be the average. But it varies, because you've got the interior of the Earth roiling around, you've got seismic activity, you've got the wind running across the surface of the Earth, causing friction and pushing against mountains. All those things conspire to make the day shorter or longer than 24 hours. And if you look at it over many decades, what you find is that the average day is actually a millisecond or two longer than 24 hours. So if somebody asks you how long a day is, you could say 24 hours and a millisecond, because that would be a little more accurate. But the problem here is that we have two ways to tell time. We have atomic time, which is extremely accurate, and we have solar time. And every day that the Earth runs a little slow or a little fast, solar time notches away from atomic time. They don't want that divergence to get beyond a certain limit, just under a second, so they throw in a leap second. That's what a leap second is. A leap second isn't added because the Earth is slowing and slowing and we need an extra second; it's added because of that divergence between atomic time and solar time. So why is there this general average slowing of the Earth? There are a bunch of reasons. The main and most fascinating one for me is tidal braking. It's that damn Moon. The Moon is doing it, through the tides. It's happening because of the tides.
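
To make that bookkeeping concrete, here is a minimal sketch (illustrative only, not from the episode; the 1 ms/day excess is an assumed average). The 0.9-second threshold is the actual IERS rule for keeping UTC within 0.9 s of Earth-rotation time (UT1):

```python
# Toy model: a solar day that runs ~1 ms longer than 86,400 atomic seconds
# slowly drags solar time behind atomic time; once the gap exceeds 0.9 s,
# a (positive) leap second is inserted to reset the ledger.

EXCESS_MS_PER_DAY = 1.0   # assumed average excess length of the solar day
THRESHOLD_S = 0.9         # IERS keeps |UT1 - UTC| below 0.9 seconds

offset_s = 0.0
for day in range(1, 2001):
    offset_s += EXCESS_MS_PER_DAY / 1000.0
    if abs(offset_s) > THRESHOLD_S:
        print(f"Day {day}: solar time lags by {offset_s:.3f} s -> leap second")
        offset_s -= 1.0   # the inserted second absorbs the accumulated lag
```

At 1 ms/day this fires roughly every two and a half years, which is the right order of magnitude for how often leap seconds have actually been inserted.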

S: The Moon is stealing our angular momentum.

B: Exactly. Exactly. Because of the way the Earth is rotating and the bulges created by the tides, the Moon is pulling on that bulge, which causes friction on the Earth, which slows the Earth, making our days longer. And the Moon is stealing our rotational energy, our angular momentum, because that has to be conserved, so it's moving into a higher orbit and getting farther and farther away. And eventually, if the solar system lasts long enough, which it won't, the Moon will get so far away that we'll be facing each other. The Moon and the Earth will be facing each other, tidally locked, like the Moon is to us right now. So that's just an interesting aside on why the Earth is slowing.
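
For readers who want Bob's point in symbols, here is a simplified two-body sketch (ignoring the Sun and orbital eccentricity): the Earth–Moon system's total angular momentum is approximately conserved,

$$ L_{\text{total}} \;\approx\; \underbrace{I_{\oplus}\,\omega_{\oplus}}_{\text{Earth's spin}} \;+\; \underbrace{m_{\text{Moon}}\sqrt{G M_{\oplus}\, a}}_{\text{Moon's orbit}} \;\approx\; \text{constant}, $$

so as tidal friction reduces the spin rate $\omega_{\oplus}$ (longer days), the orbital radius $a$ has to grow; the Moon is in fact receding at about 3.8 cm per year.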

J: Bob real quick. When and if that ever happens, does that mean that one side of the Earth would be getting Sun and the other side will not be getting sun?

B: No, it's all about the orientation of the Earth and the Moon.

J: Right. It's not tidally locked to the Sun. It's tidally locked to the Moon.

B: Right. Now, if we were like─

AJR: Would the whole thing rotate, basically?

B: Yes. We would always be facing each other. Our orbit would be like this, instead of now, where the Moon is locked and the Earth is rotating. So one side of the Earth would always see the Moon and the other side would never see the Moon. But that wouldn't happen, because we're going to burn up before we get to that point, I believe.

E: Oh, thank god.

AJR: Perfect.

B: But there are planets that have been tidally locked to their sun, because they're very big and they're very close to their parent star, so the tidal forces are strong enough to tidally lock them. But 2020, 2021, and 2022 were a little bit different, and it wasn't just because of that damn pandemic. These were the shortest days ever recorded: 2020 had 28 of the shortest days recorded since 1960.

J: What?

B: 28 days.

J: Why? (laughter)

B: 2021 also had a plethora of very, very short days. No dramatic records were broken in 2021, but they were still very, very short. Oh, and 2020, I think we all can agree that if the days in 2020 were shorter, that's a good thing because that year needed to be shorter than it was.

AJR: Literally the only good thing is this.

B: So 2022, we're not even done with it. We've already broken some good records. June 22nd was 1.59 milliseconds shorter than 24 hours.

J: Holy shit.

B: The shortest day.

J: Is that a lot? (laughter)

B: It's not a lot in absolute terms, but relative to history, it is a lot. 1.59 milliseconds, the shortest day ever recorded. And then in July, we had a day that was the second shortest. So something's happening. So why do we have three years where the average day was less than 24 hours, when over the past 30, 40, 50, 60, 70 years the average day has been a little bit longer than 24 hours? Why? What's going on? Well, we're not sure exactly, but of course there are lots of scientists with lots of ideas why. One idea is that glaciers are melting, and the poles don't have as much mass or weight near them as they used to. That's one idea that may be contributing.

S: So is that like a skater pulling in their arms?

B: Right. Yes. It's a redistribution of mass, like the skater pulling in their arms to go faster. That's definitely related. And related to that, Steve, another proposed cause of the speed-up is movements of the molten core of the planet, which could redistribute mass and speed up the Earth. Seismic activity is another option that they throw out. My theory is that it's the sheer mass of meatballs at Jay's house that is kind of screwing with our rotation.

E: That would do it.

B: Jay, I'm telling you, man. I've got two scientists who agree with me on that. But a lot of scientists will also throw out the Chandler wobble as one potential reason why the Earth is speeding up.

E: Is that a dance? What is it?

AJR: Is that a France thing?

B: Yes. That's the joke, and I couldn't think of a really good version of it. But I'll just describe what it is. It's essentially the varying wobble of Earth's axis of rotation. It's actually kind of complicated; I'm still trying to wrap my head around exactly what's going on with the Chandler wobble, but it's the axis of rotation varying, causing a shorter-term wobble. So that's as much as I'll say about it. Okay, so what does this mean? What's going to happen? What are some really bad things? It's the leap second that could be concerning here. We've had plenty of leap seconds, where you add an extra second to Coordinated Universal Time. That's been done, and nobody really thinks about it anymore, but it's problematic. In 2012, Reddit was taken down because a leap second was added that year. And if I had been into Reddit then like I am now, I would have been pissed that Reddit went down. But they've developed tricks. They've got something called leap smearing, where they apply microsecond slowdowns.

AJR: They need to rebrand.

B: Yes. (laughter) Over the course of a day, they might apply microsecond slowdowns leading up to the leap second, to make it a little more palatable, I guess.
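
As a toy illustration (our own sketch, assuming a linear 24-hour smear of the kind Google has described for its public NTP servers): instead of a sudden 23:59:60, the smeared clock runs at 86,400/86,401 of true speed, so each reported second lasts about 11.6 microseconds longer and the whole extra second is absorbed invisibly over the day.

```python
# Toy model of "leap smearing" a positive leap second over one day.
SMEAR_RATE = 86_400 / 86_401   # each reported second is ~11.6 us longer

def smeared_reading(true_elapsed_s: float) -> float:
    """Smeared clock reading after true_elapsed_s seconds of real time."""
    return true_elapsed_s * SMEAR_RATE

# After a day plus the leap second (86,401 true seconds), the smeared
# clock reads exactly one ordinary day: the extra second has vanished.
print(round(smeared_reading(86_401), 9))  # -> 86400.0
```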

J: Bob, but wait. I hate to cut in. But why does a fraction of a second matter in the world?

B: Well, it's not a fraction of a second, it's a full second. And think about it, Jay: a second is small, but it's important. Computer systems and GPS and satellites, lots of things are interrelated, and it took down Reddit. This can happen. Y2K is a related example of what can go wrong when you mess with something so fundamental. And I'll go into it in a little more detail in one second, Jay. So a normal leap second can be problematic, though perhaps not as problematic as it used to be. But if the Earth keeps spinning faster and faster, or if the average day stays shorter than 24 hours, then we may need to add a negative leap second, where you skip a second instead of adding one. And that's much more problematic than a regular leap second; it's tougher to do and more risky, for various technical reasons. For example─

S: This is going into the future.

AJR: This really sounds like a time travel episode. This is awesome.

B: Yeah, right? But smartphones, computers, and communication systems synchronize using something called the Network Time Protocol, and the Network Time Protocol is based on the number of seconds that have transpired since January 1st, 1970. So you throw out a second there and things can go a little wonky; it can cause some issues with these systems. Then there are the GPS satellites. GPS satellites don't account for changes in rotation; they're not really built to deal with that. So if the Earth is spinning faster, a GPS satellite will all of a sudden be over a specific area a little earlier than it would have been previously. Even if the Earth sped up by half a millisecond, that could mean a difference of 10 inches, or 26 centimeters. And that would compound, and eventually the GPS satellites could become essentially useless if we don't do anything, which we probably will. It's not like, oh my god, GPS is going to be worthless.
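
The GPS figure checks out as back-of-envelope arithmetic (our own sketch with standard values; only the 0.5 ms speedup is Bob's number): a point on the equator moves at roughly 465 m/s, so half a millisecond of unaccounted rotation shifts the ground by about 23 cm, the same ballpark as the quoted 26 centimeters:

```python
import math

EARTH_RADIUS_M = 6_371_000   # mean Earth radius
SIDEREAL_DAY_S = 86_164      # one full rotation relative to the stars

equator_speed = 2 * math.pi * EARTH_RADIUS_M / SIDEREAL_DAY_S  # ~465 m/s
offset_m = equator_speed * 0.5e-3  # ground displacement for a 0.5 ms speedup

print(f"{equator_speed:.0f} m/s -> {offset_m * 100:.0f} cm "
      f"(~{offset_m / 0.0254:.0f} inches)")
```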

J: And when you say do something, we've got to program this problem.

B: Yeah, I'm not sure what level of effort would be required, but I'm sure it won't be trivial. Some people say that this will be over soon, and that this increased rotation speed of the Earth isn't necessarily going to stay this way for years. Others are saying this could be the beginning of a 50-year scenario where the Earth keeps spinning fast, with days shorter than 24 hours, and we may absolutely need to throw in some of these negative leap seconds, which could cause some problems. So that's the story. It's interesting. I'm not too worried about it. But we'll see whether some negative leap seconds get thrown in; we might find out by the end of this year or the following year, if this keeps up.

J: So, Bob, are you angry about all this? (laughter)

B: No. It was just interesting research. It was actually tough to really get to fully understand all the nuances here, because you've got the sidereal day, the solar day, the mean solar day, and different websites had different takes on exactly what those mean. It was interesting to put it all together and understand exactly what was happening. So, yeah, I enjoyed this.

S: A great bar bet that we were talking about when we were talking about this before. So, Andrea, how many times does the Earth rotate on its axis in one year?

AJR: 365 and a quarter, isn't that it?

S: Wrong. 366 and a quarter, because in going around the Sun, it's got to rotate one extra time. One day is a full rotation plus about a degree, and that extra degree adds up over a year to one whole additional rotation.

B: 361 degrees is the mean solar day, 24 hours. A sidereal day is─

S: 23 hours and 56 minutes.

B: Exactly.

AJR: Wow.

B: 23 hours and 56 minutes. It's four minutes shorter. But there are also lots of variations.

AJR: You're going to leave work early and be like, I'm on a sidereal day.

J: That is such a skeptic thing. Wrong. 365. You know what I mean? Come on.

B: But the other nuance is that the day varies depending on where you are in the orbit, what season it is, and the tilt of the Earth. There are so many little factors that go in here, which makes it extra confusing.
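
The bar-bet arithmetic above, written out (standard values, not from the episode): relative to the stars the Earth makes one extra turn per year, so

$$ N_{\text{rotations}} = 365.25 + 1 = 366.25, \qquad T_{\text{sidereal}} = 24\,\mathrm{h} \times \frac{365.25}{366.25} \approx 23\,\mathrm{h}\ 56\,\mathrm{m}\ 4\,\mathrm{s}, $$

and each solar day the Earth turns through about $360^{\circ} + 360^{\circ}/365.25 \approx 361^{\circ}$, which is Bob's figure.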

J: So can't we help by having a party somewhere on Earth that will slow the rotation down? There must be some human configuration that we could do. We all go to the North Pole at the same time.

E: We all have to jump at the same time so that we can alleviate the pressure.

AJR: Should we do it now?

KB: It would be like in an elevator.

J: Andrea, it would be like an 80s movie, like the end of an 80s movie where we all jump.

AJR: And like slow motion and freeze and this is us saving the world.

S: Everyone needs to jump at the same time or we have a negative leap second. You choose.

AJR: Right.

B: All right.

S: All right. Thanks, Bob.

B: I'm glad that's over.

AJR: That's cool.

S: All right, guys. You know what time it is.

J: Science or fiction.

E: It's time.

S: It's time for science or fiction.

Science or Fiction (1:11:11)[edit]

Theme: Misinformation

Item #1: Reported trust in the media in 2021 was highest in China at 80%, and lowest in Russia at 29%, with the US in between at 39%.[6]
Item #2: Analysis of social media posts finds that bots are far more likely to spread false information and are responsible for as much as 90% of its spread on the most popular platforms.[7]
Item #3: Research shows that fake news spreads 6 times faster and 10 times farther on Twitter than true news, and that people are 70% more likely to share a false tweet than a truthful one.[8]

Answer Item
Fiction Bots spread 90% of disinfo
Science Reported trust in the media
Science Fake news faster & farther
Host Result
Steve win
Rogue Guess
Evan Reported trust in the media
Kelly Reported trust in the media
Jay Reported trust in the media
Bob Reported trust in the media
Andrea Bots spread 90% of disinfo

Voice-over: It's time for Science or Fiction.

S: I have three items here. There is a theme to these items: misinformation. Pretty obvious theme. These are things you may have heard before, and the details matter, because you know these things in broad brushstrokes; one of these may be wrong because of the details. So I'm going to warn you about that. All right, let's get going. Item #1: Reported trust in the media in 2021 was highest in China at 80%, and lowest in Russia at 29%, with the US in between at 39%. Item #2: Analysis of social media posts finds that bots are far more likely to spread false information and are responsible for as much as 90% of its spread on the most popular platforms. And item #3: Research shows that fake news spreads 6 times faster and 10 times farther on Twitter than true news, and that people are 70% more likely to share a false treat. Tweet. I like sharing false treats better, myself.

AJR: It's like we're going to put spinach in a brownie.

S: Than a truthful one. So, spinach-flavored candies.

J: That's going to be fun editing, Steve.

S: We're going to order this way. Evan, we're going to start with you.

Evan's Response[edit]

E: Okay. I guess.

J: Wait, wait, Steve. We have a guest.

S: I know. And the political scientist is going last.

AJR: I'm ready for this one.

S: Can you ask the audience to vote for which one they want? We'll get that at the end.

E: I'm going to take them in reverse order if that's okay.

S: Go ahead.

E: Let's see. On Twitter, six times faster and ten times farther, and people are 70% more likely to share a false tweet. I think those numbers line up correctly. Twitter has reach; its arms are deep and pervasive. So I'm not really surprised by those numbers. It's kind of disappointing, though. The second one, analysis of social media posts finds that these bots are more likely to spread false information and are responsible for as much as 90% of its spread on the most popular platforms. Wow. Well, there are bots, there's no doubt about that, but that degree even surprises me. Then again, if you think about it, depending on how you program the bot, you could do it in such a way that this would be possible, because it's basically a runaway train. So, yeah, it could go that fast. The first one is the one I'm not really in sync with. My gut is telling me it's the U.S. number here, 39%; if my memory serves, it's actually lower than that. So these numbers I think are out of sync. Perhaps it's the U.S. at 29, Russia at 39, China at 80, something like that, or they're all just entirely wrong. I think that one's the fiction.

S: Okay, Kelly. It's your first science fiction on the SGU. Don't blow it.

Kelly's Response[edit]

KB: I'm just glad I got out of the going-first thing. So I'm going to go in reverse order too, because I feel pretty good about number three: misinformation is designed to be more interesting and more shareable, so six times faster seems reasonable, if not low. Ten times farther makes sense; I see no problem with that. Number two, 90% sounds kind of high, but if the bots are just sharing misinformation rather than creating it, I could see that happening, because sharing is a lot easier on social media. So I guess that leaves me with number one. I'm not quite sure why I think number one is the fiction, but process of elimination.

S: Okay, Jay.

Jay's Response[edit]

J: I'm going to start with number two, because I think that one is the most likely to be science. Social media bots are absolutely real, they absolutely are spreading misinformation, and I'm not surprised to hear that they're spreading up to 90% of it. So that one seems pretty obvious to me. Going to number three, that fake news spreads six times faster and ten times farther, that sounds legitimate as well; it tracks with all the information in my head. The first one I think is the fiction, and I'll give you a couple of reasons why. One, that 80% number in China seems a little high to me for some reason. But the one that really gets me is Russia being as low as 29%. From my understanding, especially because of the Ukraine war, there is quite a bit of belief in the media among the citizens there, and I think it's much higher than 29%.

S: Okay, Bob.

Bob's Response[edit]

B: Wow. When I first read that first one, 80%, 29 and 39, it seemed fairly spot on to me, and now I'm questioning myself based on what everyone's saying here. The second one seems reasonable to me. 90% at first blush seemed pretty high, but that's what bots do; it doesn't take much for them to do that, and as Kelly was saying, they're not creating it, they're just spreading it, which is very easy for them to do. It's kind of what they're designed to do. And three makes a lot of sense to me here as well. Again, as Kelly said, this misinformation is designed to be enticing and clickable and spreadable, so that makes perfect sense. So yeah, there's just a lot of opportunity for this first one, trust in media; there's a lot of potential for one of those numbers to be off. So I'm going to agree with everyone and say that that's the fiction.

S: Okay. And Andrea.

E: Correct us all, Andrea.

Andrea's Response[edit]

AJR: All right. Well, I'm going to put – this is really going to be unfortunate when I get this wrong because I studied trust in media for some time and those numbers actually seem okay to me.

B: Can I change mine?

AJR: I'm really doubting it, because it's been a while since I looked, so I don't have 2021 numbers. And just like you said, and just like we see with Contagion and all of that, the more everyone else doubts one, I'm like, maybe, I don't know. But I think Russia seems a little low, and China is always sky high. And I like it because it's a good example of surveys as blunt instruments: getting someone to report that they trust the media versus whether they actually trust the media. There's a lot more nuance to it than that, but one of the findings is that it comes out that way, even though... anyway. So I'm going to say one is science. Two is the one that I'm hung up on, and I think I've fallen into this trap before, and this is one of the reasons I'm bad at science or fiction, but I think it's so vague. The 90% feels like a lot, but it could be that they're all just talking to each other and we just don't see it, or we're not following it. And on the most popular platforms, we're not great at necessarily identifying bots. I just feel like there's a lot of detail, but it's not as crystal clear as I would like it to be for me to believe that it's science. And then number three, I was pretty convinced was science, and then Kelly's social media expert take fully convinced me that it was science.

E: Yeah.

AJR: So I'm going to say one is... I'm sorry, two is fiction. Two is fiction.

S: Andrea is departing from the crowd.

Audience's Responses[edit]

S: Ian, do we have votes from the listeners, the audience out there?

IC: We sure do. So with 28 votes, number one is winning 68%. Number two has 14%, although it's still kind of moving. And number three is 19%.

S: Okay. So pretty close to the panel. So I guess I'll take this in reverse order since everyone agrees on the third one.

Steve Explains Item #3[edit]

S: Research shows that fake news spreads six times faster and ten times farther on Twitter than true news, and that people are 70% more likely to share a false tweet than a truthful one. Everyone on the panel thinks this one is science, most of the audience thinks this one is science, and this one is science.

AJR: I get so stressed out for this.

S: We actually talked about this study on the show, I think, a year or two ago when it came out. It was a massive review of millions of tweets over years. They evaluated tweets that were objectively true or objectively false, and the false ones spread much faster: they would average 10,000 retweets, whereas the true ones would rarely get above 1,000, so the false ones reached basically ten times deeper into Twitter. And people were 70% more likely to share a false tweet than a truthful one. As Kelly said, if you are unfettered from reality and facts and truthfulness, you can formulate your tweet to be more shareable and interesting. Also, other research shows that the factor that best predicts whether a tweet will be shared is novelty: it's something new that people haven't heard before. And again, that's more likely if the tweet itself is fake. You can craft a novel tweet, this chorizo is actually a sun, if it doesn't have to be true. I guess we'll keep going in reverse order.

Steve Explains Item #2[edit]

S: Number two, analysis of social media posts finds that bots are far more likely to spread false information and are responsible for as much as 90% of its spread on the most popular platforms. Andrea, you are pretty much out there on your own thinking that this one is the fiction. Everyone else thinks this one is science. The majority of the audience thinks this one is science. And this one is the fiction. (applause)

AJR: My whole field was on the line. There's also a lot of people in the chat being like, we changed our vote based on what you said.

E: Listen to the experts.

S: This was the gotcha one, because everyone thinks that bots are out there menacing social media and doing all this horrible stuff. People are far more likely to spread false tweets than bots are. In fact, the same study as in number three looked at this, and bots are actually as likely to spread a true tweet as a false one; there's no discrimination. They basically amplify tweets by automatically spreading them, but they're not good at discriminating, and they're not good at choosing what to retweet, because the AI is just not good enough to know which tweets are the good ones. People, on the other hand, are much more effective spreaders of tweets, and are far more biased toward spreading the false ones. I made it platform-nonspecific, just the popular ones. Again, this is not to say bots aren't a problem, or that they're not making the problem worse, but people are actually far more of a problem than bots, because people preferentially spread the false bits of news, whereas bots really don't discriminate.

AJR: I've got to follow more bots, then.

Steve Explains Item #1[edit]

S: Okay. All this means that reported trust in the media in 2021, Jay, before the war in Ukraine, was highest in China at 80%, and lowest in Russia at 29%, with the U.S. in between at 39%. This is science. As Andrea said, I was very careful to say reported trust in the media, because that's all we know, and for that 80% in China there's a lot of reason to think that that's what people are saying, but not necessarily what they believe. What was interesting to me here, because I immediately saw that, is that of course they say they trust the media; they don't feel secure enough to say that they don't. But then there's the disconnect with Russia being literally at the bottom. Wouldn't it be the same in Russia? Despite the fact that these are both authoritarian regimes, there's something fundamentally different about the experience of people in China and in Russia. This was 2021, so maybe it's different now because of the war in Ukraine. It was a good thought, Jay, to look at that detail.

B: Could it be Russia's access to the Internet? Because we know China locks down its Internet much more.

S: Yeah, I don't know, maybe that's the difference. I don't know how different that really is between China and Russia.

J: I was so sure I was right. It's really unbelievable how you just don't know anything.

AJR: This is a little bit outdated and certainly self-serving, but my dissertation was on censorship in China, and I mostly looked at what they covered in their national news, but I compared it to Russia, Venezuela, France, and the U.S. for a couple of big events. And with Russia, I mean, these are my ideas, this isn't necessarily settled science, but I'm trying my best, that's exactly right: there's much more interaction with the outside world. Basically, my whole argument is that you can't censor that much if information is flowing in from elsewhere. So Russia can't get away with that kind of thing, and therefore I'm not surprised to see the lower numbers.

J: Why didn't you say that? Why didn't you say that before?

S: The most important thing is…

AJR: I wasn't sure if the actual numbers were right, but China being higher than Russia is not surprising.

J: You know what I thought? I really did think number two was wrong, Steve. I really thought that one was the fiction. But I went with number one because I thought I was helping out Andrea.

S: That's very nice of you, Jay. The important thing to remember here is that my decision to have Andrea go last was the right one. It was absolutely spot on.

J: Yeah, you didn't get it now. I understand now, Steve.

S: Yeah, okay. All right.

Skeptical Quote of the Week (1:24:57)[edit]

An educated person is one who has learned that information almost always turns out to be at best incomplete and very often false, misleading, fictitious, mendacious – just dead wrong.
– Russell Baker (1925-2019), American journalist and Pulitzer Prize-winning writer, host of PBS' Masterpiece Theater

S: Evan, take us out with a quote.

E: "An educated person is one who has learned that information almost always turns out to be, at best, incomplete, and very often false, misleading, fictitious, mendacious, just dead wrong." Russell Baker, American Pulitzer Prize winning writer, host of PBS's Masterpiece Theater.

S: Very nice and very appropriate.

E: Very appropriate.

S: To the conference.

E: Absolutely.

S: Well, this was a ton of fun. Thank you all for joining me. Andrea, welcome back.

AJR: Thank you.

S: Always great to have you on the show. Kelly, welcome to your first SGU.

E: Well done.

KB: Thank you.

S: You did great.

E: Really good job.

S: Thanks for talking about Alex Jones with us. And remember, this is coming out soon: The Skeptics' Guide to the Future.

J: You can pre-order right now.

S: Pre-order it right now. Seriously, it does help us a lot if you pre-order; that will help promote the book tremendously. So please check it out. We really would appreciate it. Our first book is also still for sale, The Skeptics' Guide to the Universe.

E: Nice pair.

S: If you're new to the show and didn't realize we have a book out there, we do. Just check it out. It basically goes over all the stuff we talk about on the show, all the critical thinking skills and such. And again, thanks to George for hosting NECSS for us and for his kind introduction, and thanks to everybody working behind the scenes.

Signoff/Announcements[edit]

S: —and until next week, this is your Skeptics' Guide to the Universe.

S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.


GH: (in a Russian accent) I think is basically—I don't trust, I don't trust the media because I am inebriated. And feel free to talk about— (Rogues laugh.)


Today I Learned[edit]


Notes[edit]

References[edit]

Vocabulary[edit]

