tv The Knowledge Illusion CSPAN August 5, 2017 2:30pm-3:23pm EDT
steven is a cognitive scientist, he's been a close friend of mine for many years, and he's also an editor of the very prestigious journal cognition. and he's joined in conversation today with a professor of management science and economics at mit's sloan school, who also holds joint appointments in the economics department at mit as well as the department of brain and cognitive sciences. before we begin, i ask that you please silence your phones if you haven't already. i should introduce myself, i'm the director of the mit press, my name is amy brand. it's a pleasure to welcome you. we just started doing this series of events, and they've been incredibly successful. if you're enjoying this, we do events about every two weeks. the next one coming up is on may 23rd, the book "dream chasers: immigration and the american
backlash." so tonight's event will last for approximately, you know, 45, 50 minutes before there's the book signing, so we'll start off with steven talking about the book, then some questions for him, and we'll moderate a conversation with the audience before we actually do the book signing. i think i covered everything i was supposed to say. oh, yes, and then after the presentation, at the signing, books will be available for a 20% discount. >> maybe i'll buy one. [laughter] okay, i don't think i need this mic. >> no, you don't. >> well, thank you so much, amy, and thanks, all of you, for coming out. let me start by telling you a little anecdote from the book. so in the '40s, atomic
physicists were still trying to perfect the atomic bomb, and eight of them -- eight atomic physicists, right? these are people who know atomic physics better than anybody. they were developing the bomb, and so eight of them coalesced in this room in order to run an experiment. it's an experiment that the famous physicist richard feynman called tickling the dragon's tail. it involved taking two hemispheres and bringing them closer and closer together, and then neutrons would start shuffling back and forth between the hemispheres, and it was really dangerous because you could have a lot of radioactivity. so the main physicist running this experiment, a guy named louie, was keeping these hemispheres of beryllium separated by a flathead
screwdriver. so what happened? the screwdriver slipped, the hemispheres came together, radiation filled the room, and louie was dead within nine days. and the other physicists died young, probably from the effects of this radioactivity. so the question posed by the book is how can such smart people be so stupid? so the first claim made by the book is that people are relatively ignorant. and that's a fact in the sense that, you know, 25% of americans don't know that the earth revolves around the sun, and 50% don't know that penicillin kills bacteria and not viruses. radio talk show hosts make fun of this fact all the
time. many people can't name the vice president of the united states. but the real point of the book is that people think they understand things better than they do. right? so the main form of evidence for this originated in the lab of a great psychologist named frank keil at yale, and what he did was ask people to think about simple, everyday objects like zippers, ball-point pens, toilets. he asked people how well they understood these things, and people felt they had a sense of understanding, right? on a seven-point scale, they would say four or five. he'd say, okay, how do they work? and what people discovered was, for the most part, they had nothing to say. they just didn't understand how these simple, everyday objects worked. when he again asked them, the ratings were lowered. the act of trying to explain
punctured their illusion of understanding such that they lost some of their hubris. and what i've done with some colleagues, todd rogers at harvard and craig fox at ucla and my co-author phil, is to take this paradigm and run it in the context of political policy. so we take political policies like should there be unilateral sanctions on iran, should there be cap and trade policies on carbon emissions, and we ask people how well they understand them. people have a sense of understanding, and then we say, how does it work, right? explain how this thing works. and people just don't know. and so when we again ask them what their sense of understanding is, it's lower. we puncture their illusion. not only do we puncture their illusion of understanding, but we also puncture their confidence, their confidence in
their attitude, we reduce polarization in the group, right? simply by asking for an explanation. so this is something that probably is true for only certain kinds of political issues; namely, those that really rest on a consequentialist foundation. those in which it's the causal mechanism by which the policy operates that really counts, right? like for a cap and trade policy, what matters is how it affects people's willingness to put carbon in the air. there are other issues like abortion or assisted suicide which really aren't about the consequences so much as basic values. and, in fact, we have some data suggesting the same thing would not happen for those kinds of issues. so the next thing we do in the book is try to explain why this
is the case, why it is that people live in this illusion of understanding, or what keil calls an illusion of explanatory depth. and the answer we offer is that we confuse what we know with what other people know. other people know how ball point pens work, and therefore, we think we do. other people know how cap and trade policies work and, therefore, we think we do. we fail to distinguish the knowledge that's in our head from the knowledge that's in other people's heads. in other words, the claim is we live in a community of knowledge, right? and so we're built, thought itself is a kind of collaborative process. it's a process which involves a team and does not solely go on inside the skull. so let me quickly describe an experiment that i ran with an undergrad at brown which tried to make
this point directly. we told people about a scientific phenomenon, something we made up. so, for instance, we told them about a system of helium rain, which my real scientific friends tell me is not actually possible -- [laughter] but we had made it up, and we said scientists discovered this thing. they haven't explained it yet. they don't understand how it works, but they've discovered it. how well do you understand how it works? [laughter] and not surprisingly, people said i don't understand at all. on a one to seven scale, they chose one, essentially. another group, same thing, scientists have discovered this system of helium rain. they understand how it works. they fully explained it. how well do you understand how it works? and now people say two, right? so it's not like they feel they fully understand helium rain, but there's a little bump in understanding that's attributable to the fact that other people understand.
we ourselves give them no information about it, right? all we say is other people understand, and now they understand. so think about this in the context of, say, to pick a random example, the last election, right? imagine everyone around you thinks they understand why hillary is crooked or, you know, let's be fair, maybe everyone around you thinks they understand why diversity is effective, right? merely the fact that everybody else understands -- if understanding is contagious in the way we're suggesting -- is going to give you a sense of understanding. and if everybody's sense of understanding is a result of everybody else, everybody around them having a sense of understanding, then we can have a lot of confidence and belief based on nothing, right? confidence that's essentially a house of cards. so finally, the last thing we do in the book is we draw out the implications of these ideas for
a number of things: science literacy, our understanding of intelligence, decision making, technology, and one or two other things that i'm not remembering at the moment. that's what the book's about. >> very good. thank you, steven. so i just want to say, as a reader of the book and as someone who has worked on some related things, that i really found much that was new and that led me to think about different problems and some familiar things in a different way. and even if you are not in this area, it's great fun to read as well. >> thank you. >> now, maybe we can put you on the couch here as an author of the book. i think it's always interesting, especially when someone writes a book that builds on scientific research that goes through the
editorial process and so on, to think about the deeper motivations for -- [laughter] for it. so i'm going to start with one innocent question, then i have a follow-up question as well. [laughter] but the innocent question is: is there a reformist impulse, in the sense that the knowledge illusion is a bad thing? because it's not so obvious that it is a bad thing. one could say that the knowledge illusion shows that we are a communitarian species, and we rely on others, and if you strip that away, we'll become locked into our own individual selves. so is it a bad thing that there is a knowledge illusion? >> thank god that was the question, because i was sure you were going to ask me about my childhood. [laughter] no. i think you're absolutely right. for the most part, when it comes to ball point pens and toilets and zippers, it's not a bad thing at all that there's this illusion. and, in fact, in many other
domains of life, in our spiritual lives, for instance, there's no problem with the fact that we think together, that we think as a team, that we think collaboratively. and, indeed, you know, it's a great way to solve problems. in fact, it sort of relieves us of a kind of burden, right? we don't have to understand everything. if we accept this fact about ourselves, our own ignorance, then life is easier, right? there's less pressure on us. in fact, we got an e-mail from this one guy who said that he had suffered his whole life with some kind of mental disorder -- he didn't elaborate what it was -- and that it was such a relief for him to learn that it was okay to be ignorant. that everybody's ignorant. so in that regard, i agree completely. i just think that there are certain domains in which there
are ill effects. and, you know, i think populism in politics is something that can cause a lot of damage. i'm worried about the state of the world right now. and i think it has something to do with the fact that we're living with this set of beliefs, this ideology that doesn't really have a firm ground. you know, there are other ill effects too. i think that teams could probably work better together if the individuals in the teams had more respect for the point of view of others. so both on the large scale and the small, you know, i do think that there are prescriptions that could be derived from these insights. but overall i agree with you completely, there's nothing, there's not a problem, there's not an inherent problem with it. >> so i have a related question, and this has to do with the
general topic of collective knowledge, which has been in the popular imagination and in the scientific imagination the last, i would say, 10, 15 years. and my question is motivated by: is there a particular political preference? and i'll raise it with an example that's not in your book, but a discussion of crowd wisdom and prediction markets. it turns out that prediction markets are partly a new instrument for aggregating information, but also much of the interest there, at least knowing the individuals who are pushing this, comes from a kind of libertarian, anti-elitist agenda where you don't want to trust credentialed experts; you want everybody in a democratic way to be able to contribute an opinion and influence the outcome. now, in your book it seems that one of the themes is that we
claim too much knowledge on our own part and that our contribution is smaller than we think. and you mentioned the case of scientific and artistic achievements where people don't realize how much they stand on the shoulders of giants. so that seems to have a perspective that's not libertarian; it's, rather, communitarian, or that we should understand to what extent we're really a small part of the whole. is that a motive or a theme of the book? >> it's certainly a theme, and it's not in the background, it's in the foreground. it's not a motive, it's a fact. [laughter] so, yes. no, that's definitely an implication of the book, and it's an implication that i would be willing to defend. in the domain of prediction markets, i actually didn't -- i
mean, i'm certainly aware of prediction markets. we talk about them a little bit in the book. i wasn't aware that they came out of a libertarian perspective. it seems to me that much of the value of prediction markets is that they're a means of sussing out the mavens in a community; the people who are willing to risk the most on their predictions, presumably, should be the people who know the most and have the most knowledge to bring to bear, and they will have the most accurate predictions. and i've always thought that's the reason that prediction markets are successful. it's actually kind of surprising, because there's other evidence showing that the people who are most confident actually often know the least, not the most. so david dunning has done a lot of work showing this, simply, you know, asking people what their views are on various issues and
then measuring their knowledge about the backgrounds of those issues, and finding the people who do the worst on his general knowledge tests are the ones who are most confident about their opinions on the issue. there was this great experiment we mentioned in the book in which a bunch of researchers asked a group to locate the ukraine on a map. and most people were way off. some people got it pretty close. they also asked these people how confident they were that the u.s. should intervene in the war in crimea, right? should get involved in the war in the ukraine. the people who were the most off in locating the ukraine on the map were the ones most confident that the u.s. should intervene in crimea. [laughter] so, i mean, that's an example of the converse, right? how knowledge actually leads to lack of confidence, to a sort of lack of hubris. but in the case of prediction
markets, i've always assumed it was true expertise that the prediction market was picking out. >> yeah -- i'm just keeping track of the time. no, i was thinking that the idea in prediction markets is that you would not need a ph.d. or credentials or an m.d., actually, to vote on what should be done with a patient and so on. i don't want to take too much time, so maybe just one more quick question, which goes more to the research. could you -- it's known, for example, that people confuse their own confidence with how much they think some information is generally shared. confidence and consensus are confused. but you have results that it really matters whether you believe someone else knows something, that that is somehow sucked into a sense that you know it as well. but it also has to be accessible, is that right? the knowledge. >> right -- >> it can't be sealed. >> so in this contagious
understanding experiment, the helium rain example that i described earlier, we had a condition in which the researchers worked for darpa, for the defense intelligence agency. and in one condition, they understood the phenomenon, but it was secret. so people had no access to it, right? and the question was would knowing that knowledge was out there, but also knowing that you, the judge, had no access to that knowledge, would that also increase your sense of understanding. and the answer is it didn't. right? so knowing that others understand -- or believing that others understand something -- increases your sense of understanding, but only if you can access that knowledge. and, in fact, it doesn't even have to be other people. it can be machines, right? there's evidence that if you've been searching google for answers to questions, then you feel like you're a
better question answerer. >> so i think we should give the audience a chance now to ask questions. yeah. sure. >> [inaudible] >> you actually do need the mic, because they're taping. >> all right. i'd like to know -- you're kind of, like, promoting your book, so i'm wondering why i should buy your book. in other words, what am i going to benefit from reading your book? because you told me certain things, but i already knew them. i'm being serious. >> well, if you already knew them, then don't buy the book, you know? [laughter] i guess you already know what we have to offer. i don't know what to say. >> [inaudible] you hope you will -- >> well, i thought i -- so there are two lessons, two main lessons. there are lots of lessons, but there are two main lessons that i think the book draws.
one is most people have an inflated sense of their own understanding. no one in this room, i'm sure. but outside this room. and second, that the reason for this is because we should think about thought as a communal enterprise, not something that's going on inside the head. now, i'm glad you already know that, and it's true that it's not a completely novel idea by any means. but it's also antithetical to what a lot of scientists assume, right? so i'm a cognitive scientist, and i can tell you that cognitive scientists, for the most part, assume that thought does go on inside the head, and that's what we study, and that's how we talk about it. >> very good. yeah. if you could just hand this one down.
>> hi. i'm going to be the annoying person who asks what it means that we understand things. because, all right, everybody in this room probably agrees that, you know, dna is made of four bases, the double helix, that there are 46 chromosomes for humans and so on. probably almost nobody in this room has actually replicated the experiments that caused us to know these things. so what does it mean to say that we know them rather than just that we have this illusion that we know it? >> so it seems like you're reiterating my point, right? that we feel we understand these things and, actually, if we took a poll, my guess is fewer people in this room know those things than you think. right? >> [inaudible] >> well, okay. so maybe i'm wrong, but even mit can be surprising. >> [inaudible]
haven't replicated the experiment. >> no, so exactly. so the point of -- one point of communal knowledge is that we depend on work done by others, right? that most of the things we do as lay people and as scientists depend entirely on knowledge that sits in other people's heads. and that seems to be the point you're making, and that's exactly the point of the book, right? so most of the methodologies that we use are methodologies that have been, you know, developed and demonstrated effective elsewhere and, right, we don't replicate them. if i use someone else's theorem, rarely do i go out and reprove the theorem. i depend on what other people know. so, yeah, i couldn't agree more. as far as defining knowledge, that's a separate issue that maybe we can talk about later. >> more questions.
>> hi. >> hi. [laughter] >> so i was thinking about what you were saying, and i was wondering, do you have any thoughts as to the fact that different groups of people may have different sorts of assumed knowledge from each other? so i'm thinking about people in, for instance, academia, and how you might get different fields, and people may be sort of siloed, and they may, even if they're approaching a similar problem, approach it in a different way. so i'm actually thinking of this book i read recently called "inventology," where the author talked about some really interesting inventors. and among other things, she talks about how some of the people who come up with really good solutions to things may come from a totally different background and maybe a different knowledge base from, you know, people who sort of
traditionally -- you know, from people in fields that traditionally solve the problem. so i guess i'm just thinking, if you have any thoughts as to how we might sort of leverage this, you know, ability that we have to think collaboratively, but also avoid some of the pitfalls that that can, you know, produce. >> so i think what you're pointing out so eloquently is that there is a division of cognitive labor, that we all are experts in only one or two very narrow areas, and that to accomplish anything we need expertise that's distributed across the range of possible fields. you know, i -- look, i think society is structured to a large degree in accordance with what i'm talking about, in
accordance with the fact that there's a community of knowledge that exists because we each have our own narrow area of expertise. in fact, to accomplish most things of any substance we have to divide up cognitive labor, right? to build a ball point pen you need someone who knows about fluid dynamics and a metallurgist, and so on. what you're pointing out is society has taken that into account. and the reason we're able to build iphones and send people to the moon and have wonderful bookstores is precisely because we take advantage of that distribution of expertise. so in a sense, we're already doing it for the most part. you also asked what lessons there are about how we can do this better. and, you know, the little bit i would say about that is we could have a little less hubris about our expertise in areas beyond
our own narrow fields, right? we should better appreciate exactly what you're saying, that other people do all kinds of useful things and have all kinds of useful knowledge that we don't have direct access to. the diversity of perspectives is a good thing. >> so my question is regarding -- >> microphone. >> i just started reading the book, and i appreciate all you're trying to say -- do you hear me? oh, yeah. so i think about this knowledge as a bubble, and your book is breaking that bubble; it's an awareness that it's there. and my question is regarding application. let's say i'm a leader and i have a team, and i want them to understand this or be aware of this. i want you to tell me or give me
a suggestion of how i can tell them this is important when taking very big decisions, you know? as you said, there are things that we don't need to know, but there are others that are important, that people in a team should know -- that this is there, and maybe we should read more or, you know -- >> should buy my book. >> yeah. [laughter] you know, i'm not going to give them a test and tell them, hey, you know less than you think you know. how can i convey this to a team in a way that doesn't require them to read your book or take a test?
>> two related questions. i would like to know whether you have anything to contribute to understanding why fake news is spreading so successfully -- you need to have trust in what you believe or what you do not believe, and apparently there are many people believing this news. on the other hand, i would like to know whether you think that the phenomenon you describe is changing -- there is a general perception that there is a polarization in the country -- and does this increase confidence in opinions and increase ignorance in some ways?
>> so, the basic idea that we live in a community of knowledge and that the mind is built for collaboration, i take to be basic facts about humanity that have been true since the beginning of time, since we were hunters and gatherers. what is changing, obviously, is our mode of interaction, and it seems to me that america, especially america, hasn't faced the dynamics that it's facing now. the answer i'm going to give you is kind of self-evident, i think: it's because of the internet, and the internet has changed everything. right? and it's not only that we now live in these bubbles that have firmer walls because we're cut off from the people who live next door, who
might have a different political perspective, and instead are reaching out to people in serbia or bosnia who have the same political perspective. that in turn is made worse by the fact that much of our news is individualized. we're seeing only what we want to see; facebook and google deliver what we want to see, which tends to be stuff we already agree with. so our bubbles are getting firmer and firmer for that reason, but at the same time, the old style leadership is disappearing. so, we know people don't go to church as much, and so they don't hear this common voice that delivers the same perspective on the news, regardless of the members' political persuasion. and there have been a lot of people who have analyzed changes in demographics, suggesting that
intellectuals are moving more to the city and there's less intellectual leadership outside the city, and as a result people are appealing more to their bubbles in order to get their perspectives, both to have them validated and to just acquire them in the first place. one way to characterize the point of the book is that rather than thinking of people as rational processors of information, we should think about people as channeling their communities. so, what does this imply about fake news? there's good news and there's bad news about fake news. the bad news is that it is the means by which people acquire their beliefs to a large degree. if we just channel our community and our community is telling us all this stuff that's not true, it's going to make a big
difference to what we believe. the good news is that i'm not sure it matters so much, right, in the sense that whether the news is real or fake, people are going to believe what their community tells them to believe anyway. so it's not obvious to me that fake news is having a huge effect on the distribution of belief in society. so, a lot to be said about that. >> you seem very oriented towards thinking about how american adults think about things, so i want to ask you about two different populations: nonamericans and nonadults. particularly people from very different cultures -- do you think that bushmen in africa would show this illusion? do you think little children
would show a similar illusion? >> great questions, and the data are not in, and i'll follow the data. i have no reason to think that they wouldn't. i mean, i certainly have no reason to think that bushmen in africa live more inside their own individual minds than anybody else, right? i assume they're also collaborators and team players, and that, i think, is the source of the illusion. so, my prediction would be they would indeed show the illusion. kids, too. it's harder to show with those populations, and that is important work to be done. >> i would like to ask you about reasoning and the difference between individual and group reasoning. a recent theory of reasoning
suggests that reason is something that developed evolutionarily as a social activity, and that the point is to argue rather than to find the truth individually. one of the predictions of the theory is that group reasoning should be somehow more powerful or more effective in finding the truth or some kind of optimal solution to a problem. but on the other hand, there are other findings in psychology that suggest when people start to persuade each other, there is a backfire effect: you try to convince someone and you end up mobilizing them, which leaves them with their original belief. what would be your call? is group reasoning somehow more powerful than individual reasoning? >> so i haven't yet read the book, although it's on my bedside table.
clearly i've got to answer yes to your fine question. i think it's got to be more powerful, because that is the nature of reasoning, and i think thinking about what goes on inside the head as a conversation with somebody else is probably the best way to think about that form of deliberation. so, i should say i'm a two-system theorist, and i do believe that we have strong intuitions that we can think of as forms of reasoning. so, we will see patterns and make predictions about the future and explain things by essentially a sophisticated pattern completion process that is intuitive in the sense that we see what the outcome is, although we don't see what the process is that delivers the outcome. distinguish that from deliberation or deliberative reasoning.
i love the word "deliberation" because it plays on two different senses of deliberation. on one hand, deliberation you can think of as a cognitive process, the process of thought, this essential process that goes on inside the head. but deliberation is also something that happens between people, right? juries deliberate to come to a conclusion. and we do that, but i agree that we do that more effectively at a group level. the other assumption of their argument, that the mind evolved to argue, i have more trouble with. i think the mind evolved to do all kinds of things. i think storytelling is more important than argumentation. so, there are all kinds of formal systems that we learn
culturally, that we pass on and discuss and develop culturally, but i'm not sure that thinking about reasoning as a process of argumentation is in the end going to lead us in the best direction. >> i just wanted to ask a question because i think it follows up on this. it occurred to me there's some similarity between your example -- show me how a zipper works -- and the socratic method, which is an old trick of asking a person questions and showing them they don't really understand what they mean when they're using some term. so that's one comment; i don't know if you want to comment on that. the other is more a question of rhetoric: a standard device in persuasion is to show the other person you can recreate their argument. you can recapitulate
their reasons. does that -- >> if you want to put me in the same camp as socrates, i don't mind that. a lot of these ideas have a long history. when keil originally did this work on the illusion of explanatory depth, he showed that people suffer from the illusion with regard to their understanding of how things work. right? mechanisms. but they don't show it with respect to narratives or scripts or facts like state capitals, so it's a very specific kind of illusion, or was in a sense. but the one domain in which his students showed it also obtains is the domain of logical justification. so people do apparently have the
sense that they can justify their beliefs to a greater extent than they can. could -- >> the comments from your own knowledge -- >> i think someone has a question. >> how do people know what confidence you're asking about when you say, do you understand helium rain? if my uncle asks, do you understand helium rain, i would think, yeah, i could probably muddle through a conversation about a world that has helium rain. but if a student asks, do you understand helium rain, i won't go there. if my broker asks, do you understand helium rain, that's a different context. so this is an experimental question. when you ask these questions, what do you think these people are thinking? >> so, when we run the experiment, we give a little
3:13 pm
sheet of instructions on what we mean by understanding. so we provide the scale, and we say 1 means you're completely unfamiliar with it, and 7 means you're the world expert and you can answer all possible questions, and then we -- so we do try to define understanding. i take your point to be that in an experimental context, people can understand what you're saying in lots of ways, and the word "understanding" is vague, so they could understand it to mean many different things. that's true. i don't think that actually affects our conclusions. right? so in this contagious understanding experiment, what we show is that simply saying other people understand causes people to put a higher number down on the scale. so what exactly does that number mean? i don't know exactly. your point is well taken. but i can tell you this: it's higher than the number that
3:14 pm
the people put down when the scientists don't understand. so, whatever "understanding" means, when other people understand it, people have a greater sense of it. >> after listening to your talk, i'm compelled to bring up the topic of the argument over climate change in the political arena. i am -- i know there's climate change, and we have to go to alternative energy, but if you asked me to explain it, i don't think i could either. why should i be so proud of myself, when my opponents will say, you don't know what you're talking about? how would you respond, given what you know? >> well, i'd respond in the same way i did to this gentleman earlier: we all rely on expertise, and really the question is, which experts are we going to
3:15 pm
believe? and in the case of climate change, i'm with you; i believe the upwards of 90% of scientists or climatologists who say we should worry about it. but someone might say it's a conspiracy, and these scientists only believe in climate change because it's the only way to get published or to get grant money. the truth is, all i can do is say, that's not the culture of science that i know. but it's a problem. right? there is some uncertainty that comes from the fact that we have to trust our experts. look, we have to trust our experts. it's all we can do. right? i can't fix my toilet by myself. i have to trust the expert. and i can't eat potato chips if i don't trust the expert who
3:16 pm
makes them. i think it's a hard problem, a very hard problem, but it's a hell of a lot easier than trying to explain how everything works to every individual. >> i was wondering if you could comment on the people who are most wrong, most often. is there any consistency across those people? and maybe the extent to which academics might be part of that group, in that they're subject matter experts and might believe that their expertise in one area makes them experts in a large variety of areas. >> so i can't speak to that last issue, because i haven't studied it and haven't heard of any studies of it. i can tell you that the people who really suffer from the knowledge illusion are people who are not reflective. so, some of you may be familiar
3:17 pm
with the cognitive reflection test, a simple three-item test developed by shane frederick at yale. essentially it asks whether people tell you what is on their minds immediately, or whether they verify before telling you what is on their minds. so, this is not from the test, but it's a nice example that gives you the flavor of it. if i ask you, how many animals of each kind did moses put on the ark, some people say two, and other people say, moses didn't put any animals on the ark, it was noah who did. the difference is that generating the response is nonreflective. two came to mind, and two comes to everybody's mind, right? so some people utter it. other people verify before uttering and realize that's not
3:18 pm
right. it wasn't moses at all. so, people who suffer from the knowledge illusion tend to be the nonreflective type. in fact, if you test people who are reflective, you don't see any evidence for it, presumably because they explain before they respond. when you first ask how well they understand how ballpoint pens work, people who are reflective think and try to explain how ballpoint pens work before they even put down their response the first time. so, are scientists more reflective? on average, yes. but you'd be surprised how many nonreflective scientists there are. shane ran this cognitive reflection test on -- i hate to say it -- m.i.t. students and harvard students and others. the m.i.t. students were the best, but it's still the case that well over half turned out not to be
3:19 pm
reflective. >> do it on some in this room. >> we -- >> that's the best i can do. >> all right. >> in the financial markets, we keep listening to the experts, but at some level all of them are -- there's nobody standing on the side and -- [inaudible] -- like in the big short movie, the only people who stood out and saved their capital were five guys, but they were irrational thinkers. could irrational thinking -- for scientists or mathematical people -- they did not listen their entire lives. so, could irrationality, or being skeptical, be a part of learning? >> absolutely. it should be part of it.
3:20 pm
i'd never argue that we shouldn't develop critical reasoning skills. i bought into that part of the judgment and decision-making program a long time ago, and i've tried to learn statistics, and i try not to commit the various cognitive fallacies that i know people commit. one of them is following the herd, and we shouldn't necessarily do it. and, yeah, we can reap great advantage by not doing so, as your example demonstrates. but don't mistake not following the herd for always being right. right? so, if the herd is wrong, then you're brilliant if you don't follow the herd. but often the herd is right. and i wouldn't necessarily call somebody rational just because
3:21 pm
they don't follow the herd. i would call someone irrational if there's no basis for their beliefs, or if their beliefs don't correspond with reality. clearly, following the herd in financial markets can earn you a lot of money, at least for a while. >> we've run out of time. thank you very much to our speaker. [applause] >> we have a short book-signing
3:22 pm