SGU Episode 517



SGU Episode 517
June 6th 2015
Laser3.jpg
(brief caption for the episode icon)

SGU 516                      SGU 518

Skeptical Rogues
S: Steven Novella

B: Bob Novella

J: Jay Novella

E: Evan Bernstein

Quote of the Week

"Belief is so often the death of reason."

Anton Lesser as ex-maester Qyburn, Game of Thrones, Season 5, Episode 8

Links
Download Podcast
Show Notes
Forum Discussion


Introduction

  • Game of Thrones zombies

You're listening to the Skeptics' Guide to the Universe, your escape to reality.

Forgotten Superheroes of Science (3:48)

  • Margaret Hamilton: Saved the landing of the first Moon mission with her robust computer code

S: Bob, tell us about this week's Forgotten Superhero of Science.

B: All right, guys, this week, in Forgotten Superheroes of Science, is Margaret Hamilton - Not the witch. She was a software engineer – yes, the Wizard of Oz -

E: Now I get it.

B: She was a software engineering pioneer who, along with her team, helped prevent an aborted first landing on the Moon.

E: How'd she do that?

B: Well, you shall see. Hamilton was the director of the Software Engineering Division of the MIT Instrumentation Laboratory, and they worked on the onboard guidance software that helped us land on the Moon. And this software – I didn't realize this – this software was one of the first times that pure software was allowed to handle such a real-time, mission-critical task. It was really, you know, a little scary, 'cause they had never really done anything quite like this before.

Now, remember, this task that she and her team were given – this was before you could, almost literally, find any computer science or software engineering courses in any school. They just weren't really offered at that time. She was self-taught. She was a mathematician before that. And in fact, she coined the term “software engineering.” I did not know that.

She said, “I began to use the term software engineering to distinguish it from hardware, and other kinds of engineering. When I first started using this phrase, it was considered to be quite amusing. It was an ongoing joke for a long time,” which is funny, 'cause they figured, “Ah, a software engineering?” That was kind of oxymoronic. But no, it's not. It's really, of course, a legit branch of engineering. And it's a huge, gargantuan industry today.

In this role that she played, she was among the first to develop critical concepts like asynchronous software, priority scheduling, end-to-end testing, and human-in-the-loop decision capability. And much of that later became the bedrock for ultra-reliable software design used in critical, life-dependent systems. Notably, it was used in subsequent NASA programs, including Skylab as well. So she was a real pioneer.

I'm assuming a lot of you guys have probably heard that the landing module was almost running out of fuel before they landed. I think that's a fairly common story, that they cut it really close. But did you know that three minutes before touchdown, the software that this team and Margaret worked on triggered multiple alerts that freaked everybody out?

E: Uh huh

B: They were getting these nasty errors, and they're like, “Holy crap! What does this mean? And should we abort because of it?” So it turned out afterwards, after the dust settled, they finally found out what happened. What happened was that Buzz Aldrin had a checklist of things to do. Bam bam bam, one, two, three; in order, critical things that he had to do. But that checklist was in error. There was a mistake in that checklist.

It asked him to turn on the rendezvous radar system, which was not needed in order to land. So it was placed too early in that checklist. But, of course, he was following the checklist, and he turned it on. And that almost ended in disaster, because, when that radar was turned on, it assaulted the computer with too much extra work, which caused overflow errors. A lot of computers, even today, would not be happy with that; they'd seize up, or stop working totally.

S: Bob, did he get the equivalent of the blue screen of death?

B: (Chuckles) No,

S: Can you imagine that when you're trying to land on the Moon?

B: Oh my god, no. That would be scary. No, but it really was just, like, these twelve-oh-two errors and things, but there were beeping noises, and they were

E: Alarms

B: people were a little freaked out, especially when you're minutes away from landing on the Moon. Hello!

E: Yeah

B: It's the last thing you want to hear. So, what happened was that Hamilton and her team had designed the system to be so robust that this is what it did: it basically ignored those low-level requests and focused on the top-priority tasks at hand that the navigation computer was doing, namely, landing the craft. So that pretty much saved the day. And also, this software could, oxymoronically, immediately reboot to flush the data. I mean, those are two words you don't hear together, right? Immediate, and rebooting? But this was able to reboot itself,

E: Wow!

B: almost instantaneously. So they quickly realized, when this was happening, that these were the types of errors that they could ignore and not worry about, 'cause they were going to be discarded by the computer to focus on the important stuff. And that they could then proceed with the landing, preventing an aborted landing, or worse. I mean, who knows, that's it! So remember Margaret Hamilton, the Wizard of Oz witch, and the one I was just talkin' about. Mention her to your friends, perhaps when discussing the role of bit line wires in core rope memory.
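The behavior Bob describes, shedding low-priority work so the highest-priority job keeps running, is the essence of priority scheduling. What follows is only a rough, hypothetical sketch of that idea in Python; it is not the actual Apollo Guidance Computer code, which was hand-written assembly for custom hardware, and the class and job names are invented for illustration.

import heapq
import itertools

class PriorityExecutive:
    """Toy overload-shedding scheduler: run the most important queued jobs first,
    and drop the least important job when the queue overflows."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._queue = []               # min-heap of (priority, seq, job); lower number = more important
        self._seq = itertools.count()  # tiebreaker so two jobs are never compared directly

    def submit(self, priority, job):
        heapq.heappush(self._queue, (priority, next(self._seq), job))
        if len(self._queue) > self.capacity:
            # Overload: discard the least important queued job instead of stalling everything.
            self._queue.remove(max(self._queue))
            heapq.heapify(self._queue)

    def run(self):
        while self._queue:
            _, _, job = heapq.heappop(self._queue)
            job()

# Hypothetical usage: the guidance update always survives; excess radar chatter is shed.
executive = PriorityExecutive(capacity=3)
executive.submit(0, lambda: print("update descent guidance"))
for _ in range(10):
    executive.submit(9, lambda: print("process rendezvous radar data"))
executive.run()

The real system also had the fast-restart behavior Bob mentions, which this sketch does not attempt; the point is simply that under overload the landing task, not the radar data, got the computer's attention.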

Steve's Computer Rant (8:43)

S: What's amazing to me, is that forty-plus years later, computer programs are not smart enough to do what she figured out back then, you know what I mean? You guys have had this experience, right? Where you tell the computer to do something, like copy files, or whatever. You know it's gonna take an hour or two. And it runs into a problem.

B: Ah!

S: Like, do you really want to move this file, or whatever. And it

E: And it stops the whole thing!

S: stops! It stops! I can – not smart enough, they haven't figured out to just put that aside, complete the task that you're doing, and then at the end of it, you can ask me, “Hey, did you want to do one file? I put it aside for you.” You know? That doesn't seem so amazing,

B: Right

S: that a computer shouldn't be able to do it. But I just don't understand, Jay, this is what you do for a living, right? I know

(Evan laughs)

S: you're involved with computers. Why, that little stuff like that drives me crazy, and I don't know

J: So you're, are you saying that when the computer locks up, and only lets you – all it lets you do is make that seemingly non-important decision

S: Yeah

J: and everything else stops?

S: Yes

J: Yeah, I mean, I don't know, Steve. It's a hard thing to crack, because, for some reason, the people that made the operating system – you know, a lot of this is built into the OS.

S: It's like, it's exactly like saying that we cannot take off the ship until we have our full complement of lemon soaked napkins.

(Loud laughter)

B: Oh my god! What a great analogy! That's great!

S: That's what it is! It's like, we have this little thing; I'm gonna stop the whole process, until you answer this stupid question about this one

B: Right

S: insignificant file, or whatever it is.
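Steve's complaint amounts to asking the copy routine to defer non-critical questions instead of blocking on them. Here is a minimal, hypothetical Python sketch of that approach; it is not how any particular operating system actually implements file copying, and the function name is made up.

import shutil
from pathlib import Path

def copy_batch(files, dest_dir):
    """Copy every file it can; collect problems and questions instead of halting the batch."""
    dest_dir = Path(dest_dir)
    dest_dir.mkdir(parents=True, exist_ok=True)
    deferred = []                       # things to ask the user about at the end
    for src in map(Path, files):
        target = dest_dir / src.name
        if target.exists():
            deferred.append(f"Skipped {src}: {target} already exists. Overwrite?")
            continue
        try:
            shutil.copy2(src, target)
        except OSError as err:
            deferred.append(f"Failed to copy {src}: {err}")
    return deferred                     # present all of these once the batch is done

# Hypothetical usage:
# for question in copy_batch(["a.txt", "b.txt", "missing.txt"], "backup/"):
#     print(question)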

B: Now, Steve, don't forget, that's a good point. And I totally agree, absolutely. But don't forget, what we're talkin' about here, we're talking about computer systems software that lives depend on.

S: Oh, I know!

B: So these are hyper critical, hyper redundant. These go above and beyond what your average OS is gonna do.

J: Yeah

E: Sure

B: But point taken. To me, that's a no-brainer. Like, that should have been solved. And I'm sure there are lots of OSes that do do that, but the ones that I frequently use do not.

S: I just, it does amaze me that no one at Microsoft, for example, has figured out (Evan laughs) that this is insane, that this is a serious end user issue that they should fix.

E: It's only been around for fifteen years, though. (Laughs)

S: It's still the way it works. It still comes up.

E: Yeah

S: I'm amazed every time. Like, “Really? No one has figured out

B: Yep

S: that this is a dumb idea?”

B: No, it's happened to me a million times. It gets so frustrating.

S: I think, in general – I mean, obviously, some pieces of software are fantastic, and I know it's really complicated – but (Evan laughs) using lots of different software all day, there are lots of programs, really expensive, high-end programs, that have little end-user issues that drive me crazy. And I do think there's some problem with how software is created. Like, the engineers, or whoever is designing the interface, aren't really thinking in a sophisticated way about the experience of the end user, you know?

Don't get me started on my electronic medical record system at work. It is the worst piece of shit interface you could imagine.

E: How many people use that?

S: Mind boggling. It's like, it's a huge segment of the health care industry.

E: Oh, wow! Is it really?

S: Yes.

E: Oh, that's bad.

B: But Steve, that's funny, I experienced maybe something similar to what you're talking about. But from an end user perspective, I've gone to this doctor now, every month for whatever, for five or six months, and I missed an appointment, and I had to pay a fee. And every time I go to the doctor now, for the past three times, they're saying, “Oh, you have to pay this fee.” And I keep saying, “I already paid it. Please take that off the record. I already paid this. I mean, how many times do I have to say or pay for something

S: Yeah

B: before you'll stop getting this pop-up saying 'Oh, by the way, Bob did not pay this late fee'?” Like, I did it! And that's minor, I'm sure, compared to what you're dealing with.

S: It's just, you know, it's obviously a very complicated piece of software. And it all works on the back end. It's just the user interface is thoughtless. It's like, no one is managing the experience of the end user. And that's a huge difference, just managing that experience versus not managing that experience. Huge difference. You can tell when it's been done, and when it hasn't been done.

B: Yes

S: You know, I'm constantly working for the software, instead of the software working for me, you know what I mean? It's like, I'm jumping through hoops, and doing stuff to make it work, rather than it making my life easier!

E: Yeah, who's the machine here? Who's the master?

S: (Chuckles) Right. Who's working for whom? It's bad. It actually costs a lot of money, because, on average, (this is not my statistic, this is what they've even told us), on average, when the system gets used in an office, it drops productivity by twenty percent!

B: Ow! Twenty!

S: Twenty percent hit because

J: Easily

B: Ohh! That's huge!

S: That – something is wrong when that's happening.

J: This is a really old issue for internet development. You know, go back, I think, about ten or fifteen years, someone wrote a book called “Don't Make Me Think.” It says interfaces should never stop the user from moving forward,

S: Right

J: Or not slow them

E: Right

J: down too much, right? You know

E: Yeah

J: you just don't want the person to sit there and have to go, “Oh my god! Where is this? How do I do that?” That's

E: Yeah

J: so notorious for Microsoft products. You know, you ever use Excel or Word, and you just, you are looking for things quite often.

S: Yeah

J: You know, it's just not there. And then, the other thing that blows my mind about Microsoft is, then they will completely reinvent their interface!

(Chuckling)

J: I know Outlook

E: That does happen.

J: 2005, I think it was, and now the latest version, like, the ribbon version they have now, it's like I'm in high school again.

S: We got frustrated with Skype, remember? We've been using Skype for this show for the last ten years, and it seems like every time they do a major upgrade, they make it worse, because they just, they're taking an interface which has a certain intuitiveness about it, and then you get used to it. And then they change it. They don't make it better, they just

J: Yep

S: change it. And oftentimes, they make it less intuitive! And you're right, that's a perfect title. Don't Make Me Think. That's exactly right. Software should be designed so that you don't have to do a lot of mental work to navigate the software. Like, basic things like make it easy for me to find what it is I'm trying to interact with right now, instead of burying it in all of this extra data that I don't need right now. And you should know I don't need it! 'Cause I just asked you for this one piece of data. Please just give me what I'm looking for, and make it really obvious and easy for me to find. Don't make me hunt around my screen for the thing I just asked you for, you know?

J: It's really expensive to develop good interfaces. There's lots of techniques. Like, as an example, there's something called split testing, where they'll use, I like to think, like two somewhat similar interfaces. One difference could be the “Okay” button is on the left, versus on the right, or whatever, right? They do things like that, and then they watch user experience. They'll actually video tape people

S: Yeah

J: using it, or interview people using it, or just track what they're doing with like, a heat map type of thing. They could see what people are doing, and what they click, and how long it takes, and to do what they gotta do. And then they make iterative advancements 'till they literally get it to the point

S: Yeah,

J: where

S: that's good!

J: it's as good as it's gonna get.
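Jay's "split testing" is usually called A/B testing: each user is randomly but consistently assigned to one interface variant, and their behavior is logged per variant for comparison. Below is a minimal, hypothetical sketch, not the API of any real testing product; the experiment name and metrics are invented for illustration.

import hashlib

def assign_variant(user_id, experiment="ok-button-position", variants=("left", "right")):
    """Deterministically bucket a user by hashing their ID, so the same user
    always sees the same interface variant."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def log_event(metrics, user_id, event):
    """Record an interaction under the user's variant; later compare task time or clicks."""
    metrics.setdefault(assign_variant(user_id), []).append(event)

# Hypothetical usage:
metrics = {}
for uid in ["alice", "bob", "carol"]:
    log_event(metrics, uid, {"clicked_ok": True, "seconds_to_complete": 12})
print({variant: len(events) for variant, events in metrics.items()})

Hashing rather than flipping a coin on every visit is the standard trick for keeping a returning user in the same bucket, which is what makes the before-and-after comparisons Jay describes meaningful.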

S: You know what else is expensive? Twenty percent drop in physician productivity, that's really expensive too.

J: Yeah, but that doesn't hit the people who are selling the software, the people that are selling the software just want to make the sale, you know what I mean? And it takes, like I said, it's a lot of work, and it takes a lot of expertise to be

S: I know! I

J: really good at doing this type of thing.

S: I believe that. I don't think this, what I'm talking about, I mean, I think there is some low hanging fruit that they're ignoring. But I understand that the really sophisticated stuff, like knowing where I'm looking on the screen, and not making me hunt around for something, could be complicated to make it work well. But it's worth the investment when you have a piece of software this important, and this widely used, there's no excuse, you know, for the level of crappy user interface that it currently has.

J: Well, the last thing you want is your doctor, between every office visit, getting furious with his computer.

S: Exactly! I'm doing a lot of mental work, you know, managing my patient, thinking about their complicated diagnoses, and my treatment options. You don't want me using my brain to try to figure out a friggin' piece of software that I'm interacting with. You want, that should be so simple, it's not taking any attention away from practicing medicine, which is how the software is designed to function.

E: You just need a lot of minions. You know, like in Despicable Me.

S: Yeah

E: A bunch of those little guys to do all that kind of crap.

S: Yeah, but you really can't, as a physician, 'cause there are certain things you just have to do yourself.

J: I really do think a lot of these problems are gonna be solved when we get software that can operate, you know, it doesn't have to be artificial intelligence, but it'll have intelligence-like qualities.

S: Yeah

E: Watson!

J: Yeah, but I'm thinking of it more on the fly, like wouldn't it be wonderful if you're using a piece of software, and then you could just tell the software, “I can't find this. Where is it?” And it knows what you're talking about.

S: And then a little paperclip shows up on the lower right?

(Laughter)

E: With a cartoon bubble, that's

B: Clippie!! Clippie!!

E: Hello, enter username, I am Clippie.

S: All right, we've ranted enough for one week, I think.

B: Yeah

S: Let's move on.

News Items

Laser Weapons (18:18)

(Commercial at 29:30)

Chocolate Science Sting (30:42)

How Many Species? (41:44)

Quickie with Bob: Proton Spin (48:22)

In Memoriam: Dr. Wallace Sampson (51:24)

Who's That Noisy (53:54)

  • Answer to last week: talking piano

(Commercial at 56:56)

Questions and Emails

Question #1: Rape Statistics (58:38)

S: So, we got a couple of quick emails. These are follow ups to topics that we covered on previous episodes, and good feedback I want to talk about. The first one comes from Max, right here in Hartford, right around the corner. And Max writes,

In a recent Science or Fiction, Steve included a news item suggesting that about twenty percent of college females had been victims of rape or some other form of sexual assault. The rogues were almost incredulous upon hearing such a high figure, as was I; not just because it seemed incredibly high, but because late last year, I’d read an article reporting that the number was around 0.61 percent of college females. The article claimed to have gotten this figure from a recent DOJ report on sexual assaults. I was able to dig around for the report, which can be found in the following link from the Bureau of Justice Statistics website: http://www.bjs.gov/content/pub/pdf/rsavcaf9513.pdf Honestly, I’m not even sure what to believe anymore regarding this topic, but I trust the SGU more than I do most media outlets. So if you say the twenty percent figure is the more accurate figure, then I’ll be more inclined to dismiss the DOJ’s report. If not, then it seems you guys are due for a correction. Thank you for the work that you do.

Thank you, Max. That's a good question. We got a number of questions about that. Not as many as I was expecting, actually. Whenever we touch on a big topic, like for Science or Fiction, and I do a quick treatment of it, it's hard to cover this whole issue in the very few minutes we have for a Science or Fiction wrap up. But that's why I'm taking a second bite at it now, to give a little bit more of the background.

So I tried to add enough caveats into my original discussion to make sure it was understood that that was one survey, at an upstate New York university, saying that 18.9 percent of the women interviewed said that in their freshman year, they were the victim of either a rape or an attempted rape. That's a high figure. And it was definitely surprising.

This issue has – I've been paying attention to this issue for a while, and trying to really wrap my head around these statistics. I think the bottom line is that we really don't know what the number is. We don't have sufficient data to really nail down what the actual figure is.

So here's what we do have, right? We have a couple of big surveys from two large universities, one in the Midwest, one in the South. This was the survey that has been widely reported. It was referenced by Vice President Biden in a speech. This is the one-in-five figure that has become sort of the “go to” for people saying that rape on college campuses is a huge problem.

So, the problem with those surveys – they showed about twenty percent, right? About one in five. The problem with those surveys is that they were self-selective. It was a web-based survey, and people would choose whether to take the survey. So that means it's self-selective. And the response rate was fairly low, which means that self-selection bias can have a huge impact on the actual numbers. So it's hard to use that as a single source for saying “This is what the number is.”

Max referred to the Department of Justice statistics, and yes, that was a different kind of survey that showed their number was 6.1 per one thousand. By the way, it was 7.6 per one thousand for women who are not students. So the risk of being the victim of sexual assault was actually higher if you were not in college at the same age.

But in any case, now, that's probably an underestimate for a number of reasons. One is that this was a household survey. So it's quite possible that they would call somebody up, and then that person would give the responses for the household. So they may have been talking to a parent, saying “Has anyone in your household been the victim of rape?” And their child didn't disclose that to them, they wouldn't know, you know, so that's been criticized as likely being an underestimate of the actual number.

So we have some methods which probably underestimate the number, showing this 0.6 percent. Other surveys are probably overestimating the number – again, this is rapes and attempted rapes, or sexual assault. It kind of depends on exactly how you ask the question, which is true of all surveys. But we're getting eighteen, nineteen percent at the high end.

One of the surveys that I think has the best methodology, actually – my wife heard about this, 'cause this kind of stuff is what she does for a living; she works in a college counseling office – she said, “Oh, the best statistics come from the National College Health Assessment.” And their statistics, let me just read you what they report.

So for females reporting that they were the victims of sexual touching without their consent, 9.7 percent. Sexual penetration attempt without their consent: 4.3 percent. And sexual penetration without their consent: 2.7 percent.

So, those are kind of in the middle, which is interesting. So we have data ranging from a little bit less than one percent to almost twenty percent, and then some kind of in the middle, you know. If you take all that together, it's around ten percent or so. That's what we have. And we don't have any rock-solid surveys that have really high rates of response, with really good methodology. Again, I think the best methodology of all those studies is probably the National College Health Assessment, the one that was sort of in the middle.

I do think that it'd be a good idea to try to find a way of compensating for the self-reporting and the under-reporting, to try to get the estimates a little bit closer. The other advantage of the Health Assessment is that many colleges participate in it. Whereas, of the other surveys, one was at two colleges, and the one that we were reporting on in Science or Fiction was at one college. So it's hard to extrapolate to all colleges, or all universities. We don't know how size plays a role – the size of the university, or the part of the country, or whatever. I mean, it's just hard to extrapolate from that. That's not necessarily generalizable.

But the point that we made when we were talking about this, is even if it's an over-estimate by an order of magnitude, it's still a huge problem. I mean, I don't think that debate over, yeah, what the actual number is, I mean, it's important. Do the science. Let's figure out what the actual numbers are. Let's figure out, have the best data that we can.

But I don't think that the debate about what the actual number is should distract from the fact that whatever, anywhere in that range is too high, you know what I mean? I think that it's still a problem. I think it shouldn't keep us from focusing on how are we gonna make campuses safer, and just in society in general. How are we gonna make it so that this is much less common? I think that's the more important thing, while we collect the data. You know, do that too. But let's not make that a distraction.

Question #2: Charities and Effectiveness (1:06:03)

S: Jay, the next one is about the item you talked about, charities. And we got a few people making the same point. Let me just read it. This one comes from Michael, and he writes,

”Regarding your recent piece on charities with high overhead costs and low transparency: You may be interested in following this piece up with some information about why it's not just the overhead costs and transparency that can be problems in charities, but also their effectiveness. Some charities/projects actually have negative impacts on communities, and some charities are hundreds or even thousands of times more effective than others. I think this is an important addition and SGU is a great platform to share it. The work of Peter Singer and organisations such as GiveWell and The Life You Can Save are good places to start, and I'm also happy to provide some more information and ideas, being involved in this space myself.”

So, that was pretty typical of the feedback we got. Some people were saying that we were unfair to focus on efficiency, and that some charities may have a very high effectiveness even though their efficiency, ten percent or whatever, is low. Some people thought it was unfair that we said that using professional fundraisers was a red flag for a questionable charity, because legitimate charities might use professional fundraisers also.

Jay, you could tell me what your take on this was, but my take was that, yeah, it's a really good point. Effectiveness is another way to look at how valuable a charity is over all. Efficiency and effectiveness, I think, are two complementary ways of measuring that. I don't think though, I wasn't convinced by the argument that a reasonable effectiveness trumps a really low efficiency. If ten cents on my donated dollar is going to the thing that I think it's going to, even if they're being effective with that ten cents, it still bothers me that there's a ninety

E: Oh yeah

S: percent inefficiency built into the system. I don't think that effectiveness completely trumps efficiency. I think the two are complementary. You have to consider both, you know? And I just don't see how there's any excuse for an efficiency that's that low.

B: Yeah, that seems like a no-brainer. Ten percent's a joke. I don't care how good they are with that ten percent.

J: Yeah, at that point, why even bother?

S: Well, I mean, they could be doing good work, but I think that's not an excuse to accept a really low efficiency. You have to work on that problem too. And if it's the professional fundraisers that are causing the inefficiency, that's a problem! I don't think that should just be accepted, that only ten or twenty percent or whatever is going to the thing that people are donating to.
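The efficiency-versus-effectiveness distinction Steve is drawing can be made concrete with a little arithmetic. This is a hypothetical sketch with invented numbers, not figures from any real charity's filings:

def charity_metrics(total_raised, program_spending, outcome_units):
    """Efficiency: share of each donated dollar that reaches programs.
    Effectiveness: outcomes produced per dollar actually spent on programs."""
    efficiency = program_spending / total_raised
    effectiveness = outcome_units / program_spending
    impact_per_donated_dollar = efficiency * effectiveness  # both factors matter
    return efficiency, effectiveness, impact_per_donated_dollar

# Hypothetical comparison: same effectiveness per program dollar, very different
# return on each donated dollar once efficiency is factored in.
print(charity_metrics(total_raised=1_000_000, program_spending=100_000, outcome_units=500))
print(charity_metrics(total_raised=1_000_000, program_spending=900_000, outcome_units=4_500))

In other words, a charity that passes along only ten cents on the dollar would need to be nine times more effective with each program dollar just to match a ninety-percent-efficient one per donated dollar, which is the point Steve is making.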

E: At least, not without proper disclosure, I mean,

S: Yeah, you gotta have transparency.

E: You really do.

S: And the thing is, they don't want to do that, because who the hell's gonna donate? “Ten cents on your dollar goes to,” you know, who's gonna donate to the charity who discloses that up front?

E: What the hell, though? You're otherwise in a sense being lied to in a certain

S: Yeah, yeah yeah yeah.

E: context.

S: So

J: What do you think about the idea that they're not legally allowed to call themselves a charity unless they hit a certain percentage benchmark?

S: Yeah, I think it's a good idea. Or, have tax-exempt status, you know? You shouldn't have tax-exempt status unless a certain percentage of the money you're collecting is going to the charity that you say it's supposed to go to. Anyway, yeah, that was, but, yep, thanks, that definitely added a needed dimension to the discussion.

Also, a few people mentioned, I think this got edited out, 'cause I remember bringing this up. But it didn't make the edit. There is a good resource called the Charity Navigator. So I think it's just CharityNavigator.com. You go there, and it'll rank a charity. So if you want to check out a charity, to know what its efficiency and effectiveness and transparency is, you could go there, and it will tell you about it before you donate any significant amount of money.

Announcement (1:09:54)

  • TAM coming up

Science or Fiction (1:11:52)

Item #1: A new study finds that an impaired ability to identify odors is associated with a significant increase in the risk of death over a four year follow up.
Item #2: Researchers find that higher socioeconomic status correlates with a significantly increased breadth of taste in music.
Item #3: A new study changes the estimate for the evolution of eukaryotes from a maximum of 2.8 billion years ago to 1.5 billion years ago, 1.3 billion years later than previously thought.

Skeptical Quote of the Week (1:25:30)

"Belief is so often the death of reason." Actor Anton Lesser as disgraced ex-maester Qyburn, 13 minutes & 36 seconds into Season 5, Episode 8 “Hardhome” of Game of Thrones (First Aired May 31, 2015)

S: The Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information on this and other episodes, please visit our website at theskepticsguide.org, where you will find the show notes as well as links to our blogs, videos, online forum, and other content. You can send us feedback or questions to info@theskepticsguide.org. Also, please consider supporting the SGU by visiting the store page on our website, where you will find merchandise, premium content, and subscription information. Our listeners are what make SGU possible.


Today I Learned

  • Steve gives a long rant about his Electronic Medical Record System in this episode. It generated controversy over the following weeks.
