[0:08] Well it's that time of the week again. It's time for Chit Chat Across the Pond. This is episode number 766 for April 22nd, 2023, and I'm your host, Allison Sheridan. This week our guest is your favorite psychological scientist, Dr. Marianne Geary of the University of Waikato in New Zealand.
How are you doing today, Marianne? I'm fine, Alison, but I want to know if I'm your favorite psychological scientist. Well, wait a minute though. Isn't Mrs. Geary also a psychological scientist? Yeah, but who's so? You're gonna make me choose between the two of you?
Choose wisely. Who are you interviewing? There is that. Hey, so I wanted to get Marianne on here because she, along with one of her grad students and some of her colleagues, published a paper in the Royal Society about a very, very interesting topic. And I want to tease Marianne first, because she sends me these incredibly long-form things. I mean, like 30 pages of reading to do, when my attention span is about as long as a Mastodon toot nowadays. It's just so hard to read so much. But this one was fantastic. I'm fascinated by this study. I think it's really, really interesting. And let me just give the...
The 30,000-foot view of this is that a lot of people believe that, given the opportunity,
[1:34] or given the circumstance of a pilot becoming incapacitated and them being the only one available on the plane, a large percentage of people actually think that they could land the plane.
[1:46] When I knew I was gonna be talking about this study, I mentioned it to my brother Grant, and he immediately went, oh yeah, I can do it.
And I just thought that was hilarious. But this has been talked about before, this concept, and this paper and research, you took it, you and this team of people, took it a lot further.
So maybe talk a little bit about what we already knew and where you went with this.
[2:10] Yeah, so here's what we knew before, and by we, I mean the scientific community.
Um, what we knew is that some people just think they can do all sorts of things they can't, right? And we usually think of these as features of a person, right? So, for instance, some people are just straight-up narcissists. Now, I know that word gets kicked around in everyday parlance, but, you know, it's reasonably accurate, like some little blowhards who think they can do all kinds of things. So that's a kind of stable personality characteristic of somebody. Wherever they go, there they are. There's a name for that, right? That's the Dunning-Kruger effect I've heard about? No, that's a different thing.
Oh, that's different. Okay.
Yeah, so Dunning-Kruger, and I have to say.
[3:05] Ironically, Dunning-Kruger is typically misreported by people, so...
Okay, well, set that aside then.
Well, Dunning-Kruger is very interesting because everybody knows about it, right?
It shows up in everything, except that what it means is that the people who have the least amount of skill tend to be the worst calibrated when you ask them to talk about their actual skill, right?
So, because you're an expert in what you do, you have pretty dialed-in awareness of your own abilities, and also awareness of what you can't do in the area in which you work, right? So, like, let's say tech stuff or coding or something, right? But someone who's just starting tends to be the worst calibrated and thinks they can do more than they can, typically. And as you learn more, you know, expertise is sometimes described as the process of learning what you don't know.
So, you start to become more and more dialed in. So, Dunning-Kruger refers to this chunk of people down towards the bottom that they're just least dialed in to their actual skills and abilities.
And the same guy, Dave Dunning, identified that there's this period when
[4:21] you're first starting a skill, you have what is sometimes called the beginner's bubble, where you think you have things figured out, but that's only because you don't know a lot about it, right?
And so as you start to learn more, you start to become more dialed in and be like, oh, well, that's not true. That's not true. That's not true.
So basically what...
[4:39] What we knew before this paper was that there was a lot of evidence that some people are just,
[4:47] let's just call them blowhards, overconfident for various reasons. The thing is, we also know that most people, in specific areas, will think they're unusually good at what they do. The majority of drivers, for instance, think they're a better-than-average driver. So this is called the above-average effect, right?
Oh, yeah. We've talked about that together on Chit Chat Across the Pond.
I always thought that was fascinating. There's some number larger than 50% of drivers.
90% of drivers think they're better than average.
I must be a truly terrible driver because I'm pretty sure I'm below average.
So I must be in the bottom 3%.
Yeah, yeah. 40% of software engineers in firms think they're in the top 5% of the software engineers in their firm. Oh, really?
Oh, I like that one too. Most students think they're above average and 70% of professors think they're above average, right?
Or no, they think they're actually in the top chunk of professors. So it's just funny.
So that can't really be explained by rampant narcissism, it's just like, what is that about?
So most of us, or many of us, at least in specific, we'll call them domains, areas, right, are just not dialed in to how good or bad we are.
[6:09] Okay. So, we thought, you know, we don't really do personality stuff in my lab or stable features of, you know, clinical characteristics like narcissism. We don't do any of that in my lab.
But what we do know is that some characteristics of the task that you're doing, like the environment you find yourself in right in that moment, can turn people into temporary narcissists, can turn people into temporary blowhards.
[6:36] Make them really confident about things, make them think that something they're reading is true, or easier than it is. And this is a phenomenon related to the stuff I've talked to you about before in previous chitchats, about people remembering things that never happened to them.
And we thought, well, maybe there's a role here for creating situations like that in which people think, oh, well, this is easy for me to imagine doing, for instance.
So something you could insert that would make them think they could do something they can't.
Yeah, just ordinary people, right? Just ordinary people, not like just your random, like specific blowhards, but ordinary people.
So, could we take, for instance, you know, the 90% of drivers who think they're above average, and create a situation like that out of anybody? We thought, well, we need a really preposterous skill, something that everyone should know they can't do.
We thought about a whole lot of things.
[7:39] So one of the skills that we tinkered with was eye surgery. Oh! Yeah, exactly.
Please tell me people don't think they can do that.
Well, we didn't actually go down that road because it was kind of gross, I mean, unsurprisingly, the videos of people doing like corneal repair, I was like, eww.
Oh, I know the one you should have done. You know how if you've watched MASH, you're pretty sure that you could open an airway by punching a hole in somebody's throat and sticking a pen in it?
Oh, yeah. Yeah, we did think of that one. Did you really? You did think of that one.
I'm positive I could do that, and I would know to do it in the right circumstance. I know it.
Yeah. Yeah, we did think of that one, but it didn't have enough steps.
And also, when we tried to find a video, when you see it on YouTube, it's like a "here's what to do if" demonstration, you know, and it's not a real person, obviously.
So we rejected those things, and then we thought, what about those people who sometimes say,
I could land a plane in an emergency? Because you hear about this, like, apocryphally.
What is it? MythBusters even did this.
They did a show on MythBusters. First we set out to say, well, is it really a myth?
It turns out MythBusters did this show. I forget the guys' names on MythBusters because I don't watch it.
[9:02] But I watched this episode, and the guys went in, not in an actual plane, but in a flight simulator, like they train pilots on.
And they tried to land a plane. And one of the guys, I think he landed in the woods, 10 miles away from where he was supposed to go and like killed everybody on board.
And the other guy- Well, they're testing it on themselves.
They're having themselves try to do it. Okay.
Yeah, yeah. In the flight simulator, right? And the other guy, I think, landed the plane, but it did a whole lot of damage.
I mean, shocking, I know. And then they had another part where they did it again, but they wouldn't let them talk to the tower.
And they just both crashed and burned, crashed and burned, crashed and burned.
So the MythBusters' conclusion was: no, random person, you can't land a plane.
And most pilots will tell you, they'll just snort and say things like, I'm really tired of hearing people say that they can do this.
And, you know, yeah, like I'm sure everyone's gonna write to me, so let's just preempt this right now. And instead you can write to Allison saying, well, what about that guy in Florida who landed the plane? And I'm just like, yeah, okay.
He, I think, had used a flight simulator.
[10:17] You know, that game, he played that game a fair bit. And also he- So maybe he knew the terminology to be able to communicate.
Yeah, and he was talked down by the tower.
Okay. He was talked down by the tower. So I'm not going to take away from what the guy did, right?
But it's just like, it's a particularly- He's remarkably lucky.
Yeah, and it was a particularly unique set of circumstances, right?
So in general, no, you can't land a plane.
Probably people listening to this right now are saying to themselves, I could land that plane and maybe even saying it out loud to us in their ears, I could land that plane and we call these people men, but we'll get back to that in a minute.
I was wondering how long it would take us to get to the men bashing, but yeah, stand by.
It's science. No women listening to this are gonna be like, no, that's shocking. So here's what we did. We conducted two experiments, almost 800 people, and we asked some of them, but not others, to watch a video of a pilot landing a plane. And they were commercial pilots.
[11:23] I forget the name of the aircraft. In the video. Yeah, in the video. You can't really see them doing anything. There's a whole bank of controls. And their hands are covering the controls.
So it's like from the back.
Yeah, it's from the back, like you're just peering in the cockpit.
So you see them doing, like, basically it looks like they're driving a fancy car, and then the plane lands. Well, they take it over hills and mountains and over houses and then onto the tarmac, right? And they land the plane.
So there's literally no instructional value to this, right? No.
None at all. No. In fact, right, yeah. Rachel Zajonc at the University of Otago, her dad was a pilot for Air New Zealand for years and years and years and was involved in training pilots and stuff.
And he watched this video. So Rachel said, dad, look at this video.
Would it help someone learn how to land a plane? He said, this video is absolutely useless.
[12:18] Okay, good. Oh, I also want to jump in. The way you chose the people for this was to use Mechanical Turk, which is a tool where you can pay people minuscule amounts of money to do little tiny tasks.
And so you eliminated anyone who, you asked them something like, what kind of pilot are you? Or something like that.
So if they responded, well, I'm this kind of pilot, they were out. Their responses didn't count.
And you did ask their sex or gender, I don't know which, but whichever, you knew who was male and female in this test.
We asked them how they identified gender-wise. Okay. What language they speak,
[13:03] their age, and we eliminated accordingly. We also asked them questions about whether they played, like, flight simulators, and we got rid of those people. So these are people who don't...
They got to do it, but you ignored their results.
Yeah, yeah, yeah. And, you know, the mean age of people was 40, which is pretty good, right? So it's not like we're getting just, let's say, college students, who are not as calibrated about their own skills as, frankly, they should be, across a number of dimensions. All right, so these are actually, like, legit grown-ups.
[13:41] And a good chunk of people, 40 plus or minus, so that's good. Yeah, so watching this video...
You don't show the video to all of them? You show the video to half of them?
No, some of them, but not others. Yeah, because we don't show the video to the controls, right?
It's a little less than four minutes, this video, as you said, no sound.
So you see from the back of the flight deck, you see the view of the flight deck.
So I assume you're going to put this article in the show notes, because it's open access.
So if people want to go and click on the link: in the article we point to the materials and the data, in case you want to totally nerd out, so you can see exactly what this is.
So you can see their hands and what they're doing, but it's somewhat obstructed just because the angle it's shot at. So it doesn't teach anybody to do anything, right?
It's really just you watching someone sort of glide in and land the plane.
So the people who see the video, they see the video. And the people who don't, obviously, they don't. And then...
[14:46] Then they're asked these two questions, and one is, um, how confident are you that you could land the plane without dying? And how confident are you that you could land the plane as well as a pilot could?
[14:57] And they have to answer on a scale, and we varied it; sometimes we do the scale from zero to a hundred, like not at all confident to very confident, that kind of thing. Um, and we ask them at the end, for instance, have you ever flown a plane before? Have you ever landed a plane before?
Uh, and then, here, this is important: we asked everybody, we always ask everybody, how much expertise do you think is involved in doing this task? So here, like in landing a plane, from no expertise to a great deal of expertise, on a scale, right?
Do you ask them that before they answer the question of how confident are they?
Afterwards. Afterwards. Afterwards. Yeah, afterwards. And everybody all the time is on what we say in, you know, in data analysis, the ceiling. So, they're up banging their head on the ceiling of, it takes a lot of expertise to land a plane. And that's important for what I'm going to tell you about. So, they're not delusional about what it takes. Yes, exactly. They're just delusional about their own abilities. Yeah, exactly. Because our results would be far less interesting if people thought, eh, it's just easy to land a plane. Because you could just say, well, you've got a bunch of dumb asses in your experiment. We just had like actually ordinary people in our experiment. So anyway, then we, you know, we're gonna, we asked them these questions.
[16:19] Right afterwards. And we assume that what people are doing is operating on a kind of gut feel or hunch. And, and so what we know when people operate, make these kind of gut hunch sort of decisions is that they make them on the basis of how easy it is to bring to mind thoughts and images and feelings of doing the task. Right? So if you go, if you think this through and you say, well, people have seen the video now, it's easier for them to bring to mind thoughts and images and feelings of maybe them doing the task, whether they see themselves in the situation or it's just, the task itself, which is landing the plane successfully, then you would predict then that people who saw this video, even though it was just four minutes and not instructional by any measure, would be more confident that they could land the plane. And that's what we saw.
[17:17] No, remember there are two questions, right? So could you land a plane as well as a pilot could or could you land the plane without dying?
So let's call this without dying question the low bar and as well as a pilot could the high bar.
[17:31] And so, almost thankfully, we get different responses here. People are less confident that they could do it as well as a pilot could. So at least they're throwing the pilots a bone, which is, you know, I guess nice. But they're still more confident that they could do it if they see the video than if they don't.
And that's what I thought was pretty amazing, right?
And you know, since we foreshadowed this, and because it's fun, men are more confident than women.
By how much? Even if they don't see the video. What kind of margin? I mean, on average?
Do you have any numbers on that? I looked for that in the paper, and I didn't see it.
But as I recall, in the news articles I heard about this study, that it was a pretty...
Or it might not have been about this study, but in other studies, that it was a pretty significant margin difference.
Yeah, I think it depends how you calculate it, and I don't remember it exactly.
But if you're thinking about maybe 20, 30% higher, more confident, is that what you...
Yeah, right. I think that's what it was.
Yeah, right. That they were, compared to women in the same condition.
So yeah, it's very interesting and fits with some other work.
[18:55] And even just survey stuff that men tend to be more confident about their abilities than women are, to the surprise of no woman anywhere.
You know, when I got to that part, and talking to you about it too, it started to make me think about two angles to that. One is, in the workforce, as a woman, if you are not portraying as much confidence as the men, who you know you're at least as good as, possibly better than, you're gonna be perceived as not as good at it, because they're showing this confidence and you're not.
And it's a pretty common thing to hear that women don't project themselves above what they think they can do, and men can blowhard it better than women can.
So as a woman, there's learning that skill of pretending you know a little more than you do, which sounds really hard. And then there's the flip side: as a leader, looking at your employees, looking at two people who say they can do the task, or who are trying to get a promotion or whatever, take into account the fact that the women are probably not overstating what they can do as much as the men are, on average.
[20:11] Yeah, there's a whole literature, and I mean, I don't work in this area, but I know there's a whole literature, for instance, in social psychology, particularly as applied to the workforce, about how women, for instance, are terrible, terrible negotiators.
And I would urge every woman to read a book called Women Don't Ask, written by a proper social psychologist who's got expertise in, you know, the research about negotiating.
And a related idea is that, you know, men are much more comfortable trying things on than women are. So what we don't know here is whether
[20:58] men are somehow responding to this question differently for different reasons. Like, are they just trying it on, or kind of blowharding, and women are more calibrated? So you don't know; it's kind of interesting, like, well, who's more accurate? I don't know what the calibration issue here is. But it's interesting because this finding fits with, we have this in the paper, that YouGov survey, which found that 12% of men thought they could win a point against Serena Williams in a game.
12%? 12% of men. How many women? And 3% of women, who I want to kick out of the club, made the same claim. But at least it's, you know, 12% of men, only 3% of women. And then there was another YouGov survey. I always show these whenever I present this work in a talk, right? They asked men and women to identify which animals they could beat in a fight. Really? Like cobras, bears, and eagles, yeah. Cobras, bears, and eagles?
Yeah, more men than women claim they could beat every single animal.
[22:10] So, you know, there is another, uh, another angle to this. Maybe they can, Marianne. Maybe they can land the plane. Oh, yeah. Maybe that's it.
That's right. Maybe the men are just more accurate and women are underestimating their abilities, or maybe men are just more capable. That's probably, you know, we got to take that into account, right?
That's probably right. That's probably right. But it is, you know, it is interesting.
[22:38] That we see this gap. And like I said, it's not my area, but it's always the thing that people want to talk about. Right, right, right. It is interesting. One of the things, back on the study, looking at video and no video and the comparisons: I know this is nerdy to ask about, but I was really intrigued by the way this is graphed in the article.
So... Oh, the violin plots, yeah. Yeah, they're called violin plots. I call them bowling pin plots, but violin plots. So, on the vertical axis is the confidence from zero to a hundred, and the width of the bar, let's call it the violin bar, changes as you go up to show you what the distribution looks like. So instead of just a horizontal axis and a single line, you know, it's in the width of it.
And I thought that was a really neat way to graph it because you can look at it, if you look, for example, on the video versus no video on can you do it as well as a pilot could, without the video, basically the bar is really fat down at the bottom, and there's a little slight bulge above 50% where the crazy people live.
[23:56] But then if you look at it after taking the video, that bottom isn't super fat anymore.
It's pretty narrow, it's like half-width, and then it stays pretty high, up around 25.
The mean is about 25 out of 100: that's how confident people are that they can do it as well as a pilot after seeing the video.
Right, right. So these are nice, these plots, because as you say, they show the distribution, compared to a bar graph, which is at best fairly useless and at worst conveys the idea that all the points are the same, that the distribution is the same throughout the bar, right?
And so it's only relatively recently in my career, I would say five, ten years, that these have become easier and easier to do.
So R, which is a data-analytic language that's free and is very popular now, swept around the world. It's just the letter R; the name of it is just capital R. And ironically, quite ironically, although no one here likes when I say this, but I don't care, Allison, because I'm talking to you and it's just us: R was developed at the University of Auckland, so it was developed in New Zealand. And I always say, who would pick the one letter that New Zealanders can't say and name their software after it? It's just...
[25:22] Do you ask Mrs. Geary to say it, just for the comedy? Ah. Ugghhh. Yeah, okay, okay. That's pretty funny. Anyway, yeah, and it's great because if you use R, you can write R code, and then of course ChatGPT can help you. You can do all kinds of things. And you can also go to an employer and say, why are you paying for SPSS, which costs a squillion dollars a year, when I can do this for free, right? So you can make these nice plots. And then, if listeners want to just look at these figures, like figure one, figure two, you can think of the no-video control conditions as, like, a balloon of a certain shape, and then you can imagine that the video condition is that balloon squeezed differently, so you're moving people around, and you can kind of then visualize the effect of the video. Yeah, I got excited about these graphs, and I tried to do it in Excel, and then I did some searching, and it said, okay, there's this plugin called Excel Stat, you can try that. Okay, I put the plugin in, I couldn't find the thing that said how to do it, and it was going to use Python. And then I went online and I found...
[26:28] Oh man, I just, I went down about 42 rabbit holes. I probably spent two hours trying to do this.
And Marianne kept saying, use ChatGPT, but when I was asking ChatGPT, I didn't know what the language was, and apparently it was R. I never did succeed at it. Like she said, you can download the data from this paper, the source data, and then try to run R against it. I did not succeed, and I'm disappointed in myself. I may keep trying. That's okay.
I wanted to make those graphs on men and women. That's what I wanted to see.
Oh, right. Yeah. So you can start by getting an app called RStudio. I think it's free.
Yeah, RStudio. It's just a nice front end for R.
Oh, okay. So you can see the coding, you can do whatever. And, you know, it just runs R with a front end. And so you'll be asked at some point to install different libraries, basically, that do certain kinds of things. When you run some code, if it doesn't have a library it needs, it will say that it needs to go get it, and then you just allow it to go get it. So it's pretty good. I'm pretty sure I did install R using Homebrew, but I didn't know what to do next. So by that time, I'd spent two hours. And that's why the article that I was writing yesterday and the day before and the day before isn't done yet.
[27:51] Well, this should be your next thing that you do, where you teach people how to do R, and that way you can learn to do it too.
My next obsession, right? Yeah, your next obsession.
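In case anyone else wants to try what Allison was attempting without wrestling with R, here's a minimal sketch of a violin plot in Python with matplotlib, one of the routes her searching was heading down. The confidence numbers below are synthetic placeholders invented purely for illustration, not the study's data, and the output file name is arbitrary:

```python
# A rough sketch of a violin plot like the ones in the paper,
# drawn with Python/matplotlib instead of R.
# The ratings here are made up, NOT the study's data.
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)

# Fake 0-100 confidence ratings: a "no video" group bunched near
# the bottom, and a "video" group shifted upward.
no_video = np.clip(rng.normal(15, 12, 400), 0, 100)
video = np.clip(rng.normal(28, 18, 400), 0, 100)

fig, ax = plt.subplots()
ax.violinplot([no_video, video], showmeans=True)
ax.set_xticks([1, 2])
ax.set_xticklabels(["No video", "Video"])
ax.set_ylabel("Confidence (0-100)")
ax.set_title("Synthetic illustration of a violin plot")
fig.savefig("violins.png")
```

Each shape's width at a given height shows how many people gave that confidence rating, which is the "balloon squeezed differently" effect Marianne describes.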
So, there were actually two separate studies, and I didn't quite understand why this was done twice.
You did it with a small sample size. Is it the same study twice, just two different sample sizes?
[28:17] Uh, it was the same study with a larger sample. Okay.
Uh, but the first time... and that's good, because we want to replicate; science needs to replicate, right? So we did experiment one, and then experiment two with a much larger sample, to make sure that we could replicate the basic pattern. But also, with a larger sample size, you get a more precise estimate of the size of the effect. Those error bars that you see in the plots are around the mean, and these error bars, which are called confidence intervals, give you an idea of the plausible range of values in the population, because what we're always trying to do is estimate the actual value in the population. So that's one reason to replicate something; science always needs to replicate. But what we had noticed in the first experiment is that we had this effect where, if you were asked the as-well-as-a-pilot-could question first, it kind of
[29:21] Threw cold water on your confidence, and we're like, what is that about? So then we tried to manipulate it; we did it again to make sure that that really was a thing. And it turns out it was a thing. So it's almost like being asked the question, could you do this as well as a pilot could, before you were asked, could you land a plane, makes people have some kind of reality check, and they calibrate better, and their confidence isn't boosted as much.
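A quick aside for the nerds: the confidence intervals Marianne describes can be sketched in a few lines of Python. This is a generic illustration of the usual normal-approximation interval around a mean, with made-up ratings and arbitrary sample sizes, not the paper's actual analysis:

```python
# Why a bigger sample gives a tighter confidence interval.
# The ratings are synthetic, not the study's data.
import numpy as np

def mean_ci(x, z=1.96):
    """Return (mean, lower, upper) for an approximate 95% CI."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    # z times the standard error of the mean
    half = z * x.std(ddof=1) / np.sqrt(len(x))
    return m, m - half, m + half

rng = np.random.default_rng(1)
small = rng.uniform(0, 100, 100)   # a smaller first sample
large = rng.uniform(0, 100, 700)   # a larger replication sample

m1, lo1, hi1 = mean_ci(small)
m2, lo2, hi2 = mean_ci(large)
print(f"small n: mean {m1:.1f}, CI ({lo1:.1f}, {hi1:.1f})")
print(f"large n: mean {m2:.1f}, CI ({lo2:.1f}, {hi2:.1f})")
```

The second interval comes out narrower: the larger replication sample pins down the plausible range for the population value more precisely, which is the point about running experiment two with more people.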
Well, hang on. So that's kind of cool. I'm looking at the graphs, and it looks to me like it's the other way around.
So if the question asked first was, as well as a pilot could, the confidence bars are
[30:02] higher. Look at the top. Look at the top on page eight, figure three on top.
So the question asked first, without dying.
So if you're asked the without dying question first, now your confidence.
So now look at those, let me see, how do I say this?
Look at that left panel, the left side of the dashed line.
So, what you see is the confidence is higher than it is if you go over to the without dying question when that comes second, which is on the right side of the panel.
Hmm. I must, I don't know if I'm looking at, I think I'm looking at the same graph, but to me, like if we just look at asking without dying first.
Yep. Oh, I see.
Boy, this is hard. So what Marianne and I are looking at, which you can't even see, which makes it even harder to follow, is eight of these violins.
The four on the left are question-asked-first: without dying.
The four on the right are question-asked-first: as well as a pilot could.
So within the without-dying set, there are four violins: video and no video, crossed with the without-dying question and the as-well-as-a-pilot-could question.
The results of that.
[31:22] Well, what you see is when you're asked first about, could you land this plane as well as a pilot could, which is the bunch of plots on the right-hand side, you see that there's no... I mean, there's confidence, but there's nothing really going on.
So it's as though people thought, hang on, you know, like, could I, am I really as good as a pilot?
So that's as opposed to just, could you do this without dying, which we consider to be, and I say this with some irony, an easier task, right?
And so it's as though the as-well-as-a-pilot-could question first makes them go, hang on, could I really do this as well
as the pilot could? It makes them recalibrate everything. So it's the only thing we've found that kind of arrests that overconfidence. And it doesn't totally, right?
Because if you look, you just see on a hundred, on a hundred point scale, people are still roughly at about 25, which is not great.
Yeah, that's astonishing.
[32:25] So yeah, what do you do next with results like this? I assume
it makes you ask other questions. There's something else you're gonna want to know.
[32:39] Yeah, um, well, first of all, I just want to say I'm happy to crush this dream, because this is a dangerous dream, right? I don't want someone on my plane thinking that, and I'll just use this pronoun as the placeholder, he could land the plane, right?
Just a placeholder. Well, see, I do.
Just by the way. I do, because if there's a person on the plane who thinks they can do it, does that increase the probability that they'll be able to do it? Oh, and you're all gonna, yeah.
Well, if there's no other option, yeah, I suppose. If there's no other option, I think you should try and land the plane.
And look, just to circle back here to this, why? Why are people overconfident?
In general, if you think about it, overconfidence probably has some adaptive benefit, like back when we were just cave people.
So you had to be able to run with a pack that wanted you to run with it.
And so, it was probably beneficial to think that or to present yourself as someone who could do things that we might today call stretch goals.
Yeah. So, like, I can do this thing. Yeah. A stretch goal.
[33:56] Yeah. So, you could, right? I wouldn't say that landing a plane is a stretch goal, but maybe, maybe, right?
But it's just the same kind of mechanism.
So that, you know, it's good to present yourself as more confident if you want to be able to have a pack, the safety and security of a pack that's going to protect you like in prehistoric times.
Because if you, you know, back in those days, if you didn't have a pack that you could run with, you didn't have access to resources like food and safety, and you could be picked off from the pack and killed.
So it's good to appear confident.
So this is why, the idea goes, people have a disposition to be confident about things that they can't yet do.
And then also, if you think about the relationship between confidence and maybe optimism, you want to get out of bed in the morning, wake up every morning thinking that you can do some things, that maybe you haven't done before, right?
So all this stuff is probably an adaptive characteristic.
[35:05] But we try to push on it and take it to its extreme. Because sometimes people say, why would people think this? And the fact is, it's like most of these weird phenomena.
Beliefs or tendencies or claims, or even the way memory works: when we really push on them and take them to extremes, what we're doing is taking something that we have probably because it serves some adaptive function most of the time, and we're distorting it. So this is a distortion of things that probably, most of the time, are good.
[35:38] Right, so where are we gonna take this next? Women need to get more of it. I think women need to get more of it. Yeah. Well, again, I mean, I'll tell you what.
If you read this, I don't know if you have read it, and now that you're retired, I guess you don't need to, but this Women Don't Ask book.
I read this, my friend told me to read this once, because I was negotiating for something, and my friend told me to read this book, and she said, just be careful, because you're going to get angry.
And I read it on the plane, on my iPad, and if it had been an actual book, I would have thrown it across the plane a half a dozen times.
Oh no, because you didn't agree with it, or it just made you angry because you knew it was true?
It made me angry that I didn't know that I could be behaving that way.
Like, in a good way, like, here's things I should be doing that I'm not doing.
Here's things men tend to do that I don't do, that a lot of women tend not to do.
And that made me angry. It made me angry that I, cause I don't think of myself as like a pushover, right?
It made me angry that I didn't know these things.
And so that's why I now proselytize and tell all sorts of women to read that book, Women Don't Ask, read it, read it, read it. And there's a follow-up called Ask For It, but I think Women Don't Ask is just fantastic.
[37:00] And yeah, so I think women could do with more confidence. Like I'm not a social psychologist.
I'm just a cognitive psychologist. So I do this kind of stuff like skills and memories and problem-solving and whatever.
And what we're really talking about is women and men, in the workplace and negotiations and whatnot.
This is social psychology, so not really my area. First and foremost, I don't have any social skills, but I'm not a social psychologist.
But I've read some of this work, and not being an expert, I will say this book is great, and in general, you've hit on a real problem. But now that that digression is over, where are we going next? Where are you going?
Well, you know, we've just submitted a paper, a manuscript, to a different journal, with many of the same set of authors, led again by Kayla Jordan, who's now Dr. Kayla Jordan. Woohoo!
I know, great. So what we did is...
[38:10] Well, this is at least an overconfidence that is not fatal, which is good.
And I noticed this weird thing when we were all under our shelter-in-place orders, right?
In the before times, well, or the after times.
[38:30] So I was watching Netflix, and I was watching this Danish TV series, a political drama called Borgen. And it's subtitled, and it has to be, because Danish sounds like, I don't know, Dr. Seuss tripping on acid. It just does. I mean, if you're Danish, I'm sorry, and I tell all my Danish friends this too, but you can't really understand anything they're saying. So after about the third episode, I thought, I'm learning Danish. This is a great side benefit, because when I go back to Denmark, which I do sometimes because they have a great autobiographical memory center, I'll be able to have better conversations with people and they won't have to speak English. And so then I turned off the subtitles to quiz myself, and within 10 seconds I thought, oh, I don't know what is happening, I don't understand any of this. And I thought, this is a bad scene. So I turned the subtitles back on, and went on and off like this for about 15 minutes, until I was finally convinced that I wasn't learning Danish.
And there's some kind of illusion going on.
Oh, so the subtitles are like watching a video of somebody land a plane.
[39:44] Yeah, yeah, that's what I wondered. So we tried this, we turned this into an experiment and we showed people short video clips from different shows.
One was this political drama and the other one was a show, great show on Netflix called Rita, still Danish, about a school teacher. And you either saw the clip or you didn't.
[40:07] And we had to have different clips so that the effect I'm going to tell you about isn't tied to a specific clip. So we showed you the clip either with subtitles or without subtitles. And it's like a minute, right? It's not long, it's shorter than the plane video, if I remember correctly. And then we asked you questions like, how confident are you that you could follow Danish instructions in an emergency, or read the weather forecast, or make friends with people who spoke only Danish? And if you saw the subtitles, it really boosted your confidence that you could do these things. Time and time again. It was fantastic.
[40:50] Oh, nice, nice. It was significant? Oh, yeah, totally. And then we gave you a Danish quiz, because the obvious objection is, well, maybe you are learning something from the subtitles. And it had words that were in the scene, but also the most common words in Danish, like "the". We asked people just to write down what they thought the words meant. And the two groups were, first of all, A, the same, and B, on the floor. So that was pretty fun.
By the same, you mean the people who didn't watch the subtitles and the people who did?
Yeah, yeah, yeah. The subtitles weren't doing jack, except making you more confident that you could do this thing.
Are the people angry afterwards?
No, I don't think so. When they realize, or is it just more like, oh, that's interesting?
Yeah, I think people always think those kinds of things are interesting, right?
I think they always think those kinds of illusions are interesting.
I think people like to learn that about themselves.
I think because they haven't, you know, they haven't. Maybe.
They don't do anything where they really make an ass of themselves. They don't hurt anybody else, right?
They don't embarrass themselves in front of anyone. So it's a private response. And in fact, we don't talk to them personally; they get everything in writing, so they can have their own private reaction. But usually people just write, this is interesting.
[42:18] Oh, that's good, that's good. Yeah, but this is crushing of dreams, because I think we all thought we could fly the plane. Yeah, but like I said, that's a dream, and I'm okay to crush that one.
I'm okay to crush that dream, yeah. Be nice to the pilot, don't hurt her, she needs to fly the plane because trust me, nobody on the plane can.
[42:43] Yeah, exactly. You know, where I want to take this, the obvious application of this kind of work, the area we need to move into, is to see what happens in education, because there's this idea in education that you start off with things that are easier to build confidence, and then you bring in more and more difficult or exceptional examples.
And it is possible that by doing that kind of thing, what you're doing is creating this illusion of overconfidence, and then it miscalibrates you as a student, thinking about how much time you maybe need to put into something, or how well you know what you're talking about, and so on. So we want to repeat these kinds of studies. For instance, adding subtitles to a lecture, that's a very common thing: you watch a recorded lecture with subtitles there. And I mean an English lecture with English subtitles, so same-language subtitles. Would you think you were learning more about that topic than you were actually learning?
So I think that's a really interesting question, and I think we're going to try and investigate that soon. Yeah.
[43:55] Yeah. That is interesting. I know it changes the way I think about what I'm watching. Does it?
Well, probably the biggest thing is I hate it on comedies, because it ruins the joke, because the timing is lost.
Oh, because the timing's off. Yeah, yeah, yeah.
A little angry, but there's certain shows where it's like, I can't understand them, I need the subtitles, and then I, or closed caption, I forget which, I think we may be misusing the word there, but it's... Yeah, subtitles is a generic term.
Okay, okay. Right, and when they're in your language, it's a caption.
Yeah, that'll be interesting to see what that does. That sort of circles back to work you've talked about before, of whether you retain information better if you take notes or not.
Oh, yeah. And also if you retain information better, if it's a little more difficult.
[44:54] Oh, right. If it's a little bit more challenging. So one of the ideas is that when you're taking notes, you have to work in the moment to distill out the essence, because you can't transcribe everything, right? So it injects a little bit of challenge into the task, a little bit of difficulty, effort, let's call it effort. So there's this idea in cognitive psychology called desirable difficulties: injecting a little bit of challenge or effort into a task, like with note-taking, makes you remember that thing better. And I think that's an interesting idea.
[45:34] I do like the idea of a little bit of effort. Have you ever been in a class where, I'm sure everybody listening has, where you're in a class and you start to get behind and you're really concentrating because it's difficult, and you're really working on it, but you reach a point where you realize you're never going to catch up, like you're over the hump, you're on the other side, and you kind of throw up your hands.
You may even giggle like, I got nothing, never going to get this.
I saw that happen in an entire class one time.
[46:06] It was a quantum mechanics class taught by Professor Van Hoven, and the entire class just started laughing like, what is this guy talking about?
Nobody knew what he was talking about. And Steve was in the class, and he developed a theory. He said, I think the Russians, this was during the Cold War, have infiltrated our university system and they're teaching absolute gibberish to the engineering students, and our entire society is going to collapse as a result in 20 or 30 years. That's how bad this was.
Did the guy ever know? Did he get it? Did he realize that he was losing it?
I don't know. I have no idea. Quantum mechanics is out of the question.
It's really interesting you say this. When I was a grad student, one of the things that I wanted to study was that feeling in a student. And it was inspired by this Far Side cartoon I had seen, you've probably seen it, where the kid's in the class and he has his hand up and he says, can I go now, Mr. Whatever-his-name-is? My brain is full. And I wanted to study that feeling when you just think, that's it, I can't hold on anymore.
It's just running out like a glass that's running over. Buffer overflow.
[47:18] Yeah, that's it. Yeah, exactly. Exactly. And I just thought that would be like it because people have that feeling and where does that feeling like how accurate is it?
And what are the consequences of it? And like what can you come back from it?
[47:37] Yeah, I think it, my guess is, I would say is that it depends on the instructor, the person trying to pour the information in.
And the example I'd give is Bart and me with Programming by Stealth: we started doing video for Programming by Stealth.
Even though we don't record the video, he needs to see when I'm slamming my head on the desk because I have no clue what he's talking about, when he's lost me.
Oh, yeah. And it really, really helps. And he can adjust.
He'll say, you've got that look on your face right now, you know, and so I'll say, okay, I lost you back like 15 minutes ago when you said blah, blah, blah.
And so he'll go back and he'll explain it a different way. But if you don't have somebody who's willing to do that, or able to, because there's 600 people in the lecture hall or something, it can be highly dependent on that, whether it can be fixed, whether you can open it back up, scrape out the goop that didn't make any sense, and start pouring it in again.
I don't know.
[48:33] Yeah, it's interesting. It's fun research. There's a whole lot of things to be investigated here. One of the other things we're doing right now, we're just in the early days of this, so I don't really know how it's going to shake out, but I was fascinated on one of my trips to Japan when one of my colleagues explained to me that the kanji often look like the thing they mean. What's a kanji? They're the Japanese characters. Oh, okay. They come from Chinese. So the kanji that means tree in Japanese looks like a tree, and if you Google it, you'll see what I mean. And the kanji for fire looks like a campfire. And somehow I took away from this that all the kanji look like the thing that they mean, which is manifestly not true, because even Japanese people only learn some kanji; the rest of them are preposterous, and it's not like you can recognize them.
[49:47] So, I thought, oh, once I realized that some kanji, only some kanji look like the thing that they mean, then I thought, I wonder if you could get people to think kanji, the same kind of thing as the plane, right?
Sure I could learn kanji, or sure I could understand Japanese just by showing them easier kanji first, and then harder kanji.
And that's what we're playing around with now. You've got more people to torture?
Yeah, more people to torture. Yeah, I mean, I am interested in these. This idea where you step back from yourself and think about what you know and what you don't know, like we were talking about at the beginning of our conversation, is called metacognition.
[50:33] And it's one of the secrets of learning new things. Like when you work with Bart, you have to be able to monitor what you know and what you don't know, and at least say, I'm lost, or, I understand this, but that's where it stops.
Or sometimes you relate something, like you do this a lot, you relate something to something you know; that's another kind of metacognitive act.
And metacognition is crucial for learning to do something new.
And some people are better at it than others.
So that's what we're interested in here: these situations in which these illusions pop up, how they arise, how you can maybe repair them, and what their consequences are.
There, I was thinking about something I saw on TikTok recently, somebody saying, the problem with stupidity is that you don't know that you're stupid.
And is there, I guess what you're poking at there a little bit is, is there a way to get people to step back and realize what they don't know?
[51:38] Yeah, well, that would be really, that would be really important.
Yeah, it would be useful.
I mean, sometimes you can, like in a classroom, which is easier to deal with than real life because it's a controlled environment.
So in a classroom, I can ask you questions and reveal what you know and what you don't know. I can give you a quiz and you'll take away from that what you know and what you don't know.
Get feedback on an assignment. But in real life, you would see people, I'm sure, on TikTok, who are making it clear that they don't know what they don't know, and they're just blustering.
And then the question is, how do they learn? How do they come to learn that they're wrong?
[52:24] Yeah, you know, that's a terrible place to leave this conversation.
Yeah, yeah, I know, I know, but that'll make me happy that I have crushed some dreams then.
That's your favorite thing. All right. Well, I am going to leave us on that depressing ending note.
If people want to follow you online, let's see, you've been doing some mastodon tooting there.
[52:49] I have, but not very much. I'm Dr. Lambchop everywhere, except at work, where I'm not.
All right, well, if you have any complaints about what Dr. Geary said, you're supposed to send them to Allison.
Send them to Allison, allison at podfeet.com. All right, well, thanks for coming on the show again, Marianne.
This was great.
Thank you. I hope you enjoyed this episode of Chit Chat Across the Pond Lite.
Did you notice there weren't any ads in the show?
That's because this show is not ad supported; it's supported by you. If you learned something, or maybe you were just entertained, consider contributing to the Podfeet Podcast. You can do that by going over to podfeet.com and looking for the big red button that says Support the Show. When you click that button, you're going to find different ways to contribute. If you'd like to do a one-time donation, you can click the PayPal button. If you want to make a recurring contribution, click the weekly Patreon button.
You're only charged when I publish an episode of the NosillaCast, which, let's face it, is every single week, so I don't charge Patreon for Chit Chat Across the Pond Lite or Programming by Stealth episodes. Another way to contribute is to record a listener contribution. It's a great way to help the NosillaCastaways learn from you, and it takes a little bit of the load off of me doing all the work. If you want to contact me for any reason, you can email me at allison at podfeet.com, and I really encourage you to follow me on Mastodon at Pufferfish.