Book TV: Weapons of Math Destruction. C-SPAN, October 16, 2016, 8:15am-9:31am EDT
have thriving examples in the states of education programs and services that are working for students, their families, and taxpayers. There are 61 school choice programs in 30 states and the District of Columbia: 26 voucher programs, 21 tax-credit scholarship programs, nine individual tax credit and deduction programs, and five ESA, or education savings account, programs. Together these programs are helping more than a million schoolchildren and families, not to mention the millions more students attending public district, charter, home, and online schools of their parents' choice. D.C. didn't build any of those programs; citizens in the states did. And these programs are improving student achievement and introducing competition for students at a fraction of what we're told we should be spending. More than 30 years after the creation of the U.S. Department of Education, students, taxpayers, and the country are not better off. But we can be. After decades of waiving the constitutional barrier to a federal role in education under the guise of partnering with state governments, it is time to dissolve that partnership and abolish the U.S. Department of Education once and for all.
>> You can watch this and other programs online at booktv.org.
[inaudible]
>> Hello everyone, thank you so much for coming out tonight. My name is Davis, and I'm honored to welcome you all out for a new book, Weapons of Math Destruction. I'm really excited to get into this conversation, but I do have a few housekeeping notes. Before we get started, if you would take a moment to silence your cell phones, so we have no interruptions. I also want to mention the format of this event: you'll have the opportunity to ask questions, and since we have C-SPAN recording the event tonight, I would really appreciate it if you would raise your hand when you have a question so I can bring the microphone to you; that way all of the questions are recorded and everyone can hear and get in on the conversation. Thank you very much for that. We'll be doing a book signing after the event is over, right here where the white tablecloth is; she'll be there to sign your book. You can't pay for it here, but you can pay for it at the front of the bookstore, so either head out to check out and come back with your copy, or get it signed here before you go. I also want to mention that this is part of an event series that has been growing and developing over the past year and a half; we're located here, and if you look at the calendar you can see events throughout the city.
So, you know, I first learned what an algorithm was when I solved a Rubik's cube. But we live in the age of the algorithm. Increasingly, the decisions that affect our lives, where we go to school, whether we get a car loan, how much we pay for health insurance, are being made not by humans but by mathematical models. In theory, everyone is judged according to the same rules and bias is eliminated. But as Cathy O'Neil reveals in this urgent and necessary book, the opposite is true. The models being used today are opaque and unregulated, even when they're wrong. Most troubling, they reinforce discrimination. O'Neil is the ideal person to write this book: an academic who turned Wall Street quant and then data scientist, who has been involved in Occupy Wall Street, and who recently started an algorithmic auditing company. She is one of the strongest voices speaking out about the limits of algorithms, and against the notion that a decision cannot be unjust because it is implemented by an unemotional machine. One review described it as a clear look at the models now governing aspects of our lives. She came out of the MIT math department and college teaching, switched over to the private sector to work for a hedge fund and then for a risk-metrics software company, left that in 2011 to work as a data scientist at a New York startup, and launched the Lede Program in data journalism in 2014. She's a weekly guest on the Slate Money podcast, and you may know her from her blog. O'Neil will be joined in conversation by a professor of computer science at the University of Maryland. So please join me in welcoming Cathy O'Neil.
[applause]
>> So yeah, I'm the computer science professor, so you've got an algorithm geek out here for y'all. [laughter] Maybe we can jump right in and get some definitions and ideas out there, and also talk about one of the examples in your book. We were chatting in the back before you came in, and I was saying one of the projects I've done in my lab is to take data from Twitter and Facebook and use it to predict your personality traits. How many of you have taken one of those personality tests? Businesses want you to take those, and you don't need to, because we have algorithms that look at what you leave behind, find those traits out automatically, and use them in ways that you may or may not like.
>> I wouldn't like that. [laughter] Yeah, so we're going to talk about what a weapon of math destruction is, and we thought we might start with an example, so we can draw out the characteristics that make something a weapon of math destruction. By that I mean there are some algorithms I care about and most I don't care about. The ones I care about, which I think are potentially really destructive, I call weapons of math destruction. But let's start with an example to make it real, if you don't mind me starting there. There's this guy named Kyle Behm. He was a college student in Atlanta, and he wanted to get a part-time job at a Kroger grocery store. He had a friend who worked there who was leaving, and the friend said, I'll talk to my manager, you've got the job, you just have to fill out the paperwork online. So he started to fill out the paperwork, and, like about half of the job applicants in this country, he was required to take an online personality test before he got the interview. If he failed this personality test, which he did, he wouldn't get to the interview process. This is standard procedure for lots of jobs, minimum wage jobs especially, and usually people don't find out that they failed; they just never get called back. But his friend told him he got red-lighted, which is what it's called. And Kyle was unusual in a second way, which is that his father is a lawyer, and most people applying to minimum wage jobs don't have a father who is a lawyer. His father asked him what kind of questions were on this test, because, he said, you're very, very qualified for a job as a grocery bagger at a grocery store, you got straight A's in high school, what's the problem? And Kyle said, well, Dad, the questions were a lot like the ones I got at the hospital when I was being treated for bipolar disorder. A mental health assessment.
>> Do you know about that? It's actually the kind of thing we can predict.
>> So the test looks at things like, are you an introvert, and it measures how emotionally stable you are. If you get anxious or angry easily, you get a high score on that.
>> I do.
>> Or if you're super mellow and nothing bothers you, you score low on it. It's easy to find that out through a test, but those signals also show up in all kinds of other data we leave behind.
>> Oh my god, that's so frightening.
>> So what happened was, he talked to his father, and his father was like, that's illegal. You can't make someone take a health exam, including a mental health exam, to get a job. That's the Americans with Disabilities Act, which is trying to prevent exactly this sort of thing: the creation of an underclass of people with disabilities and mental health problems. So his father is suing, a lawsuit on behalf of anyone who ever took this test. And I'll add, by the way, that Kyle went ahead and applied for jobs at six other large companies in the Atlanta area, got the same personality test, was red-lighted by all of them, and was prevented from getting a job in his area. So that's the first example I wanted to talk about. Do you want to add anything?
>> Let's see if that takes us toward the definition question, because here you were talking about a test, but in fact a lot of these kinds of insights about people come from models that are built from algorithms and a bunch of data. They work opaquely, and they can hammer people, or have bias, even though on the surface it looks like they don't have bias. That flows from the nature of your weapon of math destruction concept.
>> Sure, let me go ahead and define a weapon of math destruction. It has three properties. First, it is widespread and impactful. So it matters. Algorithms like the ones I build in my basement, because I'm a data scientist and I build them all the time, nobody cares about those; they don't matter. People start caring, and should start caring, when an algorithm affects a lot of people in important ways because it is widely used. The people who built this algorithm, located in Boston, sold it to companies to use instead of their HR people, so it's widespread and impactful. Second, it is secret. Kyle did not understand how he was being scored, and the people who took this personality test did not understand that they were being scored. They didn't know this was a hoop they had to jump through. And finally, it is destructive.
So it destroyed Kyle's chances of getting a job. But it is actually destructive in a larger sense. A characteristic of a weapon of math destruction, and of terrible algorithms generally, is that they create and reinforce a feedback loop that is destructive to society as a whole. Here the larger destructive feedback loop is systematically refusing employment to people with certain disorders.
>> I think one of the examples I liked most in the book is the one you open with, which is a local one. It highlights the really secretive, black-box nature of these algorithms, including, by the way, what a computer scientist would call really crappy ways of validating them. That was their use in the D.C. school system for getting teachers fired. So talk a little about that.
>> Sure, second example. Michelle Rhee was schools chancellor for some time, and she instituted a policy whereby some people would get fired if they had bad teacher assessments, and some people would get bonuses if they had really good assessments. Now, assessments are complicated, but the short version is that most of the ways teachers are assessed don't have a lot of spread: most people get "acceptable" or "very good." The people who really want to discriminate between bad and good teachers are frustrated with that. They want more spread in the scoring system; they want some people rated terrible, then poor, fair, good, and excellent. They're frustrated that so few of the ways we now assess teachers have any kind of spread, which they think of as information. So they instituted a new kind of assessment called the growth score, or the value-added score, of a teacher. I won't go into it too technically, but the very broad way of thinking about it is that the teacher is on the hook for the difference between what their students should have gotten and what they actually got. There's an underlying model that estimates what each student in a class should get. So say you're all in my class, I'm teaching you fifth grade, you're all fifth graders right now. At the end of fourth grade you each got a score on your standardized test; let's say you got a 75 out of 100. Then you would be expected to get a 75 on the fifth grade test. Now let's say you actually got an 80: I would be given credit for the five points you got beyond what was expected. Okay, does that make sense? Note that I haven't explained how the expected score is actually determined per student. It is really complicated, and it's a source of uncertainty.
And there's also another uncertainty in what you actually got, or will get, at the end of fifth grade, because after all, tests vary every year, and kids get different scores depending on whether it's morning or afternoon, or whether they had breakfast before the test, that kind of thing. So there are two uncertainties, and the teacher, I remind you, is on the hook for the difference between those two things. Okay. Now, if you think about this, the difference between the expected score and the actual score is called the error term, also sometimes called noise. So the teachers are being assessed based on these error terms, which, if you're not a statistician, don't be confused: we in statistics consider this stuff essentially meaningless. And in fact, that's the way the scores actually came out: almost meaningless. So anyway, going back to Washington, D.C.: I interviewed a woman who got fired because her assessments were too low. It wasn't entirely due to the value-added score; that was 50% of the assessment, but the other 50% was those measures that don't have any spread.
So most of the information in her score was this terrible value-added score. Okay, one more thing to tell you about this. She was a fifth grade teacher, and a lot of her incoming students had gotten very good scores at the end of fourth grade, but when they arrived in her class they couldn't read and they couldn't write. So she was suspicious of those very good scores. In fact, she had every reason to believe that the teachers of those kids had cheated on the tests at the end of the year in order to get better value-added scores for themselves: their kids did better than expected, so those teachers got good scores, and that set her students up to do worse than expected. Does that make sense? She has good reason to believe this: there were all sorts of unusual erasures. I don't know if you remember that scandal in the district, but the unusual erasures were never systematically investigated; they got hushed up, basically. In any case, she got fired.
Let's go back over the three characteristics of a weapon of math destruction. It's widespread: these scores are being used in more than half the states, mostly in urban school districts. It's secret: Sarah didn't understand her score, and she was suspicious of it, but when she appealed, they said, no, it's fair because it's mathematical. And finally, it's destructive. Not only did it destroy Sarah's life, because she got fired; I should say she got rehired the following week in a sort of affluent suburb that doesn't use the scoring system. That gets to the final thing, which is "destructive." If you know about Michelle Rhee and why she was brought in, there's a nationwide kind of war on teachers going on, the idea being: it's bad teachers, get rid of them, and we will fix education. There's been a bunch of presidents who wanted to be the president that fixed education. We can talk about whether that even makes sense, but in any case, the idea was to get rid of the bad teachers. The larger destructive feedback loop engendered by the value-added model is that it hasn't gotten rid of bad teachers; it's been getting rid of good teachers, because good teachers have been quitting, retiring early, and most of all moving to affluent suburbs that don't have this scoring system. Right now we have a nationwide shortage of teachers, and I would argue this regime of the value-added model has been very, very bad.
>> I think something that is especially insidious about this: when we talk about algorithms, they can sound techy and foreign, things we don't interact with. But a lot of the algorithms we're talking about, that we both work on, have close cousins, sometimes identical under the hood, in the algorithms at Netflix that recommend shows for you, or at Amazon that recommend books to buy. We totally know how to handle those. Amazon, most of the time, is super boring and not insightful: you bought a Stephen King book, how about these other Stephen King books? It's right, but not helpful, right? Sometimes it's totally wrong and you just go, where did this come from? And sometimes you get these beautiful glimmers of insight. I bought this survival guide, and it recommended I buy an 18-inch knife that folds out at the handle. I never would have thought of that; that was a great insight. But you don't just buy everything Amazon recommends, and you don't watch every show Netflix tells you to watch. With the algorithms you are talking about, though, people don't treat them as one input in a human decision-making process. They treat them like an oracle of truth, even when they're poorly validated, even when the people who use them don't understand them. That makes them more destructive, because you override all of our great human intuition with this little bit of math.
>> Great point. And I would argue it's worse than what you just said. Let me give you two examples. You know how the Facebook trending stories feature has been messing up lately, with weird fake stories? Guess what: we notice, and we can give feedback to Facebook, saying that's not a real story, right? That is something that doesn't happen with the teacher value-added model. There was no ground truth for those teachers. There was no secondary assessment of those teachers to compare their scores to.
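To see why the lack of ground truth matters, here is a hedged little simulation of the point being made: if the gap being averaged is mostly test-day noise, then a perfectly average teacher's score will bounce around from year to year for no real reason. The class size and noise level below are invented numbers, not estimates from any actual scoring system.

```python
import random

def yearly_vam(n_students, noise_sd, rng):
    """One year's 'value added' for a teacher who truly adds nothing:
    the class average of pure test-day noise."""
    return sum(rng.gauss(0, noise_sd) for _ in range(n_students)) / n_students

rng = random.Random(42)
# Ten years of scores for the same perfectly average teacher:
# 25 students per class, noise with standard deviation 15 points.
scores = [round(yearly_vam(25, 15, rng), 1) for _ in range(10)]
print(scores)  # the scores differ year to year purely by chance
```

Nothing about the teacher changes between years; only the noise does, which is the sense in which statisticians would call these scores close to meaningless.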
They were told: this is your score. The guy who got a 6 out of 100 was shamed, and the next year he got a 96 out of 100, and he was like, what, that means I went from being a terrible teacher to a great teacher? There's no feedback mechanism. Teachers can't say, actually I'm a really good teacher, so please update your model. That would be the equivalent of saying to Netflix, I don't want to see that movie because it's awful, please don't show it to me.
>> They need a thumbs-down button: please don't recommend that.
>> Teachers do not have access to anything like that. Even if a parent or the administration knew they were good, they couldn't go in and override it.
>> Exactly. And the thing I want to agree with is that people really did just trust the scores because they were mathematical. The way I came across this algorithm is that my friend, who is a principal at a Brooklyn high school in New York City, started complaining to me about her teachers getting these scores. I said, can you tell me how they're being scored? What is the algorithm? She said, I asked my contact at the Department of Education, and they told me, you wouldn't understand it, it's math. And I was like, that's not good enough. As a mathematician, let me just say: mathematics exists to clarify, not to confuse. That's not a thing we do with math; that's weaponizing math, right? It's not okay. I asked her to keep pushing, and she got through three different layers, each layer saying to her, you wouldn't understand it, it's math. Three times. She finally got this white paper, which was unreadable, even to me. I could not understand what the white paper was saying. So what I did was file a Freedom of Information Act request to get the source code for the algorithm. It was denied. By the way, I should say the reason I thought of that is that the New York Post had filed a Freedom of Information request and successfully gotten all the teachers' names and scores, and published them as an act of public shaming of the teachers. Really awful. I figured if they can get the scores, I should be able to get the way the scores are made. Makes sense? I was denied. Then I contacted someone at the institute, the data institute that built the scores, a friend of a friend, and he explained to me that I would never get that source code, because they had a contract with the City of New York under which no one in New York City would get the source code, whatever that stuff was made of. It was secret from them, which is to say secret from the officials in the Department of Education, so they can't explain to the teachers how they're being scored.
>> Part of this book, really the book as a whole, is a manifesto about how math is being weaponized. I thought maybe you could give us an overview of how you came to this place. You were building these models for finance for a while, and then you kind of became the enemy of finance. My words, not yours. But joining Occupy Wall Street is probably not something your hedge fund would have smiled upon, right?
>> Definitely.
>> So walk us through those years of your life.
>> So I joined Occupy in 2011. I had joined my hedge fund in early 2007, and headed straight into the crisis: walked in, and then, boom, crisis. I became really disillusioned very quickly, because the people I thought were experts really didn't seem to know much
about what was going on. They didn't really understand the market, which is not to say I understood it perfectly, but they didn't seem to understand it much better than I did. And at the heart of it, by the way, the person who introduced us talked about Rubik's cubes. That's why I fell in love with math: Rubik's cubes. How beautiful are Rubik's cubes? This idea that mathematics is this pure, clean, beautiful, almost artistic endeavor. And then, back to the financial crisis: at the heart of the financial crisis was a mathematical lie, the opposite of the beauty of a Rubik's cube. The AAA ratings for mortgage-backed securities. You might not think of them as mathematics, but they are. They were promises, made by people who were really good at math, with Ph.D.'s, in the back crunching the numbers, that these mortgages were not going to go bad. And they weren't really doing the math. What they were doing was basically creating a lie and selling it for money. The AAA ratings, and the scale of the market for mortgage-backed securities, which was very, very large, are among the reasons it was such a big deal. One of the things I realized was that it was a weaponization of people's trust in mathematics. People trusted math. They trusted mathematicians. And math itself wasn't the problem; the people were basically corrupt, and they were shielding their corruption from people's prying eyes by calling it math and saying, don't look here, you wouldn't understand this, it's math, and it's right, you have to trust us. That's the big thing I realized: this isn't the right way to deal with math. Then I left finance and became a data scientist, and then I was, like, doing almost the same thing as I did in finance. Instead of predicting markets, I'm predicting people. That's what data scientists do: they use historical data to build algorithms to predict people. You are predicting personalities. That didn't seem so bad, until I came across the teacher value-added model and things like it, and I was like, that's not good. And then, I think, the moment I decided to quit my job and write this book was caused by one interaction I had at my startup, when this venture capitalist came to visit. He was thinking of investing in our company. We all sat and listened to him talk about the future of tailored advertising, which is what I was working on. I was working on advertising in the travel industry, like on Expedia. I thought of it as relatively benign: you offer some people a hotel room, some people you don't offer the hotel room. You are giving opportunities to some people and not giving opportunities to others. A very mild way of segregating society. Does that make sense? You guys deserve these offers, you guys don't. I don't think it was particularly evil. But then this guy gave us his idea of what his dream for the future of the internet was. He was like, I have this dream that someday... and it didn't sound like Martin Luther King, let me put it that way. He said, this is what I hope to see in tailored advertising: that I get offered trips to Aruba and jet skis, and never again have to see another University of Phoenix ad, because those are not for people like me. And everyone around him laughed. And I was like, what?
Whatever happened to the democratizing force of the internet? This is the goal of the people constructing the modern internet: to segregate people into silos by class, so that we, the technologists, the lucky people who are getting scored well, get the opportunities and the nice things to play with, while the people on the other side of the spectrum can be preyed upon.
>> I make money off of you from your vacations, and I make money off of them by exploiting them with universities that give worthless degrees, right? Either way I can make money off of them.
>> Right. If you think about the way tailored advertising works, there's an auction for the eyeballs of whomever you're talking about, and it's very efficient, because it's all sorted by demographics.
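The "auction for eyeballs" can be sketched roughly like this. It is a deliberately simplified model: the advertiser names, the value predictions, and the single-round highest-bid rule are all invented for illustration, and real ad exchanges are far more elaborate (typically second-price or variants).

```python
def run_auction(user, advertisers):
    """Each advertiser bids what it predicts this user is worth to it;
    the highest bidder wins the impression (simplified first-price)."""
    bids = [(adv["predict_value"](user), adv["name"]) for adv in advertisers]
    bid, winner = max(bids)
    return winner, bid

# Invented example: a yarn shop values hobbyist signals modestly,
# while a for-profit college values financial-distress signals highly.
advertisers = [
    {"name": "yarn_shop",
     "predict_value": lambda u: 2.0 if "knitting" in u["interests"] else 0.1},
    {"name": "forprofit_college",
     "predict_value": lambda u: 8.0 if u["searched_loans"] else 0.2},
]

user = {"interests": ["knitting"], "searched_loans": True}
print(run_auction(user, advertisers))  # -> ('forprofit_college', 8.0)
```

The design point is the one made in the talk: the winner is whoever predicts the most profit from you, which is not at all the same as whoever has your interests at heart.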
And again, if you target someone like me, they are very good at finding me and showing me yarn ads, because I like, you know, knitting with alpaca. They have my number, right? That's okay, because I love yarn, and I want that yarn. But for some people, the most profitable thing is to get that person to take out a federal loan that goes straight to a for-profit college, saddles them with debt, and doesn't give them an actual education. We have seen some for-profit colleges get closed down recently, which is great news, but it was an enormous, and still is a relatively large, industry. At the time that venture capitalist talked to us, I had never seen a University of Phoenix ad. I didn't even know what it was. This was in 2012. I looked it up, and I saw that Apollo Group, the parent company of the University of Phoenix, was the single biggest Google advertiser that quarter. That's when I realized: I'm not going to see the failure of this kind of algorithm. Everyone here saw the failure of the AAA ratings on mortgages. Everyone in the world saw it, because it was so loud when it exploded. But who is going to see the failure of this kind of algorithm? Because we segregate ourselves online to such an extent that when someone falls prey to that false trap, we do not see it. In fact, we blame them for it.
>> How are we doing on time? I can keep going? Great. Where do I jump in? I've got like ten questions; I'm going to squish them all into one. One of the things you talk about in your book is potentially having the people who are building these algorithms take something equivalent to a Hippocratic oath. There certainly ought to be ethics training for people like us; it makes people accountable, and in the computer science department we do talk about ethics, I think. That training is a good start, but the question I came away with is: if there's money to be made by exploiting people, it seems like that's going to happen, as it did with the mortgage-backed securities, as we saw with Enron. Back in 2001, Enron traders, if you remember, shut down the power grid in California just to make money on energy futures, and they were joking about the old people who were getting screwed because they had to pay these enormous rates for electricity. It seems like the market is set up so that if we can use these algorithms, if we can use anything that is even vaguely within
the rules to exploit people to make money, we will do it. I wonder what your thoughts are on what would change that part of the system.
>> So I'm not expecting all corruption to end, period, right? But I do want one particular form of corruption to be challenged, and that is the corruption where people claim that because something is an algorithm, it is beyond morals. Every algorithm has values embedded in it, and the people who create the algorithms, who own them, are the ones who get to decide what those morals are. We need to start challenging that. I have a very specific reason for defining the algorithms the way I do, as weapons of math destruction: I'm performing triage. This is the service I've done for the world: I performed triage on all these algorithms, and I'm telling you where to focus our attention. Of those algorithms, many are accidents, accidentally terrible algorithms, and some are terrible because people who are greedy will always want to do that. So I do think that for the algorithms that rise to the level of a potential weapon of math destruction, we need more scrutiny. There need to be laws about this. For a lot of this stuff, as you mentioned, there will be no free market solution. The free market will not solve this problem, because it's profitable to be discriminatory. You can't ask someone to please stop being discriminatory for the good of society when they make more money doing it; that's just not going to happen. That's why we have laws about things like fair lending. We have the Fair Credit Reporting Act, we have the Equal Credit Opportunity Act, and we have anti-discrimination laws written in the 1970s that essentially regulate things like FICO scores. They're not perfect, but they are a hell of a lot better than what we have for what's going on in big data. So I am basically arguing, one, for the public: stop trusting algorithms. Push back, demand accountability. With secrecy we get no accountability. Especially when you are being assessed for a job, you should be able to know how that is happening. How am I being assessed, exactly? You should be able to ask that question. And I'm also asking for rules and regulations around things that rise to the level of widespread and impactful.
>> I think, just to close this out, one point you made in the book, which I've been thinking a lot about in another domain, is that if you have an algorithm that seems unfair, or whose implications work badly, that's because of the humans, basically, right? If you have an algorithm that's discriminating, and it doesn't consider race but it does consider zip code, that's because whoever programmed it didn't think about the fact that using zip code can discriminate based on race. If you have a chatbot that starts doing Holocaust denial and Nazi stuff, it's because whoever programmed it didn't think that maybe we need to think about minorities and women, who get harassed online all the time. If you have an industry made up of people who are greedy, or who are ignorant of the problems of people not in their class, they can create all sorts of other problems with the algorithms they build. So this accountability piece also gets to us as developers: we need to think about these problems, and to know about them, and not just optimize for profit or the output.
>> That's well said. I want to add one last thing.
There have been recent, publicized examples of spectacular failures of algorithms, like the Twitter bot and the Facebook algorithm stuff. But here's what we need to see: most of the bad algorithms that are ruining people's lives are never seen by the public. They are business-to-business dealings, like the one behind the personality test for the Kroger supermarket. If the things that are under public scrutiny are that bad, just imagine what's going on where there is no public to scrutinize these things.
>> I think you're taking my mic now, right?
>> Thank you.
[applause]
>> We will start in the back.
8:54 am
>> thank you very much. this is incredibly interesting. you are talking about the algorithm as a device for establishing or maintaining a control system, and the algorithm, this device that none of us understands, is presented as valid. it reminds me of other things we have in society that accomplish the same function, like perhaps constitutional law: things that no one really understands but that legitimize a structure we probably ought to be getting to the bottom of. those other things usually have a priestly class, the only people who understand them. you guys come from this class in algorithms. what is this class, what are they thinking about this, and are you a renegade from this class, or is everybody thinking about it? what's the deal? because i don't know any of these people, really.
8:55 am
>> thank you. yes, it is a priestly class, and much to my shame, a lot of the people who are in this class with me prefer it that way. it's actually pretty flattering to be told you are magical because you are a mathematician, right? not enough of us are worried about our impact on the larger society. but i do want to say, when i started this project four years ago, when i quit my job to write this book, i was really panicking because i didn't see anyone around me who was worried about this. now, four years later, i have an entire community of people, including computer scientists, sociologists, anthropologists, and technologists, who are all super concerned with this stuff. in particular i've been hanging out with some people who, like me, are interested in developing auditing tools, specifically to audit the
8:56 am
black box, to look in the black box in some sense. i don't want to make that mysterious. what i mean is an experiment we are borrowing from sociology. if sociologists want to see whether a hiring practice is racist, they send a bunch of applications with similar qualifications, some with stereotypically black names and some with white names, and see if the white names get more callbacks for interviews. you can do similar things with an algorithm. it's crude, and it's only the first generation of auditing tools for algorithms, but it is the type of thing we need to start doing. long story long, we have work to do, and we have a priestly class of technologists who work with the data, but we also have a growing group of people who want to think about this.
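the paired-application audit borrowed from sociology can be sketched in a few lines of code: run identical profiles through a scoring model, varying only the attribute under test, and compare acceptance rates. everything here, the `score_applicant` model and its fields, is a hypothetical stand-in, not any real hiring system:

```python
import random

def audit_paired(model, base_profile, attribute, value_a, value_b, trials=1000):
    """Crude paired audit: score many randomized profiles twice,
    differing only in the attribute under test, and compare acceptance rates."""
    hits_a = hits_b = 0
    for _ in range(trials):
        # randomize the legitimate qualifications so we test many profiles
        profile = dict(base_profile, years_experience=random.randint(0, 20))
        a = dict(profile); a[attribute] = value_a
        b = dict(profile); b[attribute] = value_b
        hits_a += model(a)
        hits_b += model(b)
    return hits_a / trials, hits_b / trials

# hypothetical black-box screen that (illegally) keys on the name field
def score_applicant(profile):
    return int(profile["years_experience"] >= 5 and profile["name"] != "Jamal")

rate_greg, rate_jamal = audit_paired(score_applicant, {"name": None},
                                     "name", "Greg", "Jamal")
print(rate_greg, rate_jamal)  # a large gap between the rates flags disparate treatment
```

the point of the design is that the auditor never needs to see inside the model: identical qualifications with different names should produce identical acceptance rates, and any persistent gap is evidence of disparate treatment.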
8:57 am
and by the way, this is also a field that should be developed, right? i feel like in 20 years there will be conferences about this kind of thing. right now there's nothing; as far as i know there's not even a journal interested in publishing things like this, but there will be. and one last thing i would say: the beginning of this field has already taken root in data journalism. look at propublica's recent auditing work. we talked about recidivism models, these models that judges use to help decide whether to send people to jail, and whether they are racist. one of the reporting teams at propublica obtained the data using foia requests, audited one of these models, and found it to be racist. so this work is being done, in its early stages.
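a propublica-style audit works on outcomes rather than inputs: given records of predictions and actual results, compare error rates across groups. here is a minimal sketch on invented records; the field names are assumptions for illustration, not the actual schema of any foia'd data:

```python
def false_positive_rate(records, group):
    """FPR: share of people in the group who did NOT reoffend
    but were nonetheless labeled high risk by the model."""
    negatives = [r for r in records if r["group"] == group and not r["reoffended"]]
    flagged = [r for r in negatives if r["predicted_high_risk"]]
    return len(flagged) / len(negatives)

# invented records standing in for audited risk-score data
records = [
    {"group": "A", "reoffended": False, "predicted_high_risk": True},
    {"group": "A", "reoffended": False, "predicted_high_risk": False},
    {"group": "A", "reoffended": True,  "predicted_high_risk": True},
    {"group": "B", "reoffended": False, "predicted_high_risk": False},
    {"group": "B", "reoffended": False, "predicted_high_risk": False},
    {"group": "B", "reoffended": True,  "predicted_high_risk": False},
]
print(false_positive_rate(records, "A"))  # 0.5
print(false_positive_rate(records, "B"))  # 0.0
```

a model can look accurate overall while its mistakes fall unevenly: here group a's innocent members are flagged half the time and group b's never, which is exactly the kind of disparity an outcome audit surfaces.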
8:58 am
>> thank you. this is wonderful. i'm a physician, a little bit late to the discussion, but i wanted to get to areas in which this might have something to do with medicine. there's tremendous confidence in the medical community -- >> could you speak up? i'm sorry. >> i'm a physician. big data comes up in two ways. the first is that the medical community and the business community are convinced that all the data going into electronic medical records is going to give us brand-new insight into taking care of people and health. i know it's not your field, but i wonder what you think of this? i'm skeptical. the second thing is, another big push in medicine is to force individual physicians to take on risk for the patients they care for. that is to say, if a patient doesn't require much care,
8:59 am
you do well. if the patient requires more care than you anticipated, then you are screwed. but we are not insurance companies. i wonder if algorithms will be developed so that physicians and insurance companies, even though everything is supposed to be on the up and up and we don't do preexisting conditions anymore, can use information that is publicly available, or recently became publicly available, to figure out who is at risk and who is not? >> great question. i'm not an expert in your field, as you said, but i do want to say that i worry, as you do, about all the hype around how much data can do in the medical field. if i were you, even though i am a data person, i would focus on accountability and on testing.
9:00 am
9:01 am
an algorithm can seem really good when it agrees with you, but in the face of what you know is true you can start talking as if you are not sure of yourself. on testing, there is an example i wanted to throw out there. imagine we had good algorithms that could predict future illnesses. that is not necessarily a good thing, and that is what people get wrong when they talk about precision medicine and all this stuff. in the hands of your doctor, trying to help you stay well, it would be wonderful. but in the hands of an insurance company that can charge you more for future illnesses, it is a terrible thing. that is something not addressed by
9:02 am
obamacare; it covers preexisting conditions, not future conditions. or imagine it in the hands of walmart, and i am not saying they do this, but a large employer that gets to choose whom to hire or not hire based on future insurance costs. that would be really bad. there is no guarantee, even if the algorithm is accurate, that it is used for good. >> i am in this space, building those algorithms, and they terrify me. as we were talking about before, we published a paper where we looked at people who are starting treatment in aa, at their first alcoholics anonymous meeting. we analyzed twitter profiles from before they started going to aa, and with 85 to 90 percent accuracy we can predict whether they will stay sober. there is also work analyzing twitter accounts over the
9:03 am
course of a pregnancy up to the day of birth, making predictions with about 85 percent accuracy about postpartum depression. that could be great for your doctor: you go to your ob/gyn and she can push a button that monitors you for this. but in all these cases it is also really scary. there is work in this community looking at things like diabetes, obesity, and heart disease, and we are getting very good. the science is really interesting, the fact that we can do this. on the other hand, the implications, and the fact that there is no regulation of these algorithms, that no one can even know if you are doing this, are terrifying. i spend a lot of time talking to big companies saying, you need to be careful about this or bad things will happen. it is a scary space at this point, being down there building
9:04 am
and looking at these algorithms. i have a very dystopian view. >> we are all terrified, right? i didn't even know you could do that with twitter. that is awful. >> talking about the medical profession reminds me of hipaa. in relation to vocational testing, why isn't that information treated as worthy of being protected by law, the way medical information is protected? >> thank you for saying that. that is one of the things i called for in my book. but it's even better if you read
9:05 am
the book. >> [inaudible] >> i am a reporter covering higher education. there were potential benefits in the cases you cited. for instance, with the teacher example, it seems it was just the use of the information, or the wrong analysis, but it also seemed to help identify an instance of cheating. i wonder whether the problem is the algorithms themselves or the way they are used. and what do you say to the businesses in
9:06 am
the second example, using personality tests? for them it is a matter of saving time at the end of the hiring process. i am sure you have heard what their response is to that. >> on the first question: the only reason there was cheating in the first place was that the structure of the model itself created the incentives. when you have high-stakes incentives in place, it causes cheating. so it is circular reasoning to defend this regime because it detects cheating. i am not an education expert, and i don't have a solution for education. but i want to point out that this is the lowest
9:07 am
bar of all my examples, because it is simply a bad model. i will tell you my evidence for this, beyond anecdotes. the scores were obtained through foia requests by news organizations including the new york post, and a high school math teacher in new york city did something really clever. he couldn't look inside the model, but he found teachers who taught, say, both seventh grade math and eighth grade math in the same year. the score is supposed to be an overall measure of a teacher, so a given teacher's two scores should be consistent with each other.
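that consistency check is easy to reproduce: if the model measures a stable quality of the teacher, the same teacher's scores in two classes should correlate. a minimal sketch with invented scores, not the actual new york city data:

```python
import statistics

def correlation(xs, ys):
    """Pearson correlation between paired score lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# invented scores: each column position is one teacher, rated in the
# two classes they taught in the same year
seventh_grade = [90, 12, 55, 78, 33, 60, 95, 41]
eighth_grade  = [63, 58, 13, 43, 58, 83, 48, 58]

r = correlation(seventh_grade, eighth_grade)
print(round(r, 2))  # roughly -0.08: close to a random number generator
```

a correlation near zero between a teacher's own two scores is the signature of the problem described above: whatever the model is measuring, it is not a stable property of the teacher.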
9:08 am
what he found was practically a uniform distribution: a teacher was about as likely to get a 90 as a 40, scores all over the map, almost a random number generator. so there is no reason to think the score is informative. it was something like a 24 percent correlation, which is to say you couldn't possibly take a given teacher and say, this is your fault; the evidence is not strong enough. it is not a 0 percent correlation either, so at the district level there may be useful information in it, but that is information about test scores, and test scores alone are probably not enough information. i am not an education expert, but i
9:09 am
don't think so. in terms of the hiring practices and the personality test, the problem was that it was illegal and secret. we have gathered evidence that it is effectively a mental health assessment, which we think it is. you can't verify that directly, but assuming it is true, it is against the law: you can't discriminate on mental health when you hire. i don't have a problem with using big data to help you filter resumes, but the filters have to be legal, at the very least. they should be transparent as
9:10 am
well. >> you paint a very challenging and frightening picture of these algorithms, but the average facebook user has no idea what an algorithm is. the people most affected by the housing crisis don't know any algorithm had anything to do with it at all. how do you get this information out? that is important, because people have no idea of the impact of algorithms on their lives. >> most people, when a survey was done, didn't realize facebook had an algorithm.
9:11 am
they are not aware of the algorithm. with the discussion around the trending news stories, people did not know they are not seeing everything. and some populations have figured out tricks to get their friends to see their posts. if you post something, your friends might not see it, so teenagers, for example, will tag a brand like coke in a post so it gets pushed up in the ranking. there is a long-term public project here to make people aware of the algorithm. >> one interesting point i have spent a lot of time looking at: there is a pew
9:12 am
study, a great representative study, showing that awareness of the algorithm, and also concern about its impact, is clearly correlated with socioeconomic status and race. people who are poor are less worried about these algorithms and less aware of them. so when we talk about being exploited by these algorithms, the people most likely to be exploited are the least aware; i won't be exploited that much, but i am the one who is aware, while people likely to be exploited assume nothing bad is going to happen. there are interesting things in that data. your book is a great way to get this out there, and massive education efforts are so
9:13 am
powerful. it is a literacy we need to start developing, to at least know the algorithms are there even if we don't know the details of how they work. >> it is not enough to say everyone should be aware and everyone should protect themselves, because of what jen just said: the reason i wrote this book was that i saw this, time and time again, as a class issue. these are tools of social control, with people at the top controlling people at the bottom. telling people 'you should have been aware' is not good enough; we need rules about this. >> [inaudible]
9:14 am
>> i am a big fan of your blog. i work in healthcare payment, using these kinds of algorithms to determine the things you were talking about. is it okay to be using these algorithms, for instance in hiring practices? and can the solution come from the algorithms themselves, from data scientists dealing with inequitable effects when they rise to that level? >> the thing about the promise of
9:15 am
data, the promise of big data, is that we can make things more fair. i am not holding my breath, but i think we can do that. part of it is the realization that we are not done; part is developing tools to audit an algorithm, see whether it is currently fair, and then ask, how do we fix the algorithm, how do we make it fair? here is the good news. compare a company with discriminatory hiring practices to an algorithm with discriminatory hiring practices. the problem with the company is that if you interview people and ask them how they choose whom to hire, they may lie, or they may not realize they are being discriminatory. whereas an algorithm will not
9:16 am
lie. and that is good. but you have to have sufficient access to the algorithm to be able to do a good job of auditing it. >> can you hear me? my partner and i did a fair amount of campaign work earlier this year, and we experienced exasperation and frustration at the fact that bernie's campaign and other democratic campaigns use data from a company called [inaudible]. what we found was that the data was sending us, in terms of canvassing, to a lot of neighborhoods that didn't match up with the demographics. so on our end it was frustrating that the data didn't seem to be working, and the higher-ups didn't see that just
9:17 am
because there was data on a certain demographic didn't mean that was the best demographic to target. but beyond that, on a broader scale, i am a bit concerned that it seemed to be contacting a certain type of voter and not a different type of voter, going to wealthy neighborhoods and not a lot of less affluent neighborhoods, so some people are reminded to vote and others aren't. have you looked at that at all? in terms of the algorithm, the existence of it may not be the best thing for society. i don't think there should be a private company that owns the data on voters and licenses it out to campaigns. what can we do about that? where is the power struggle?
9:18 am
>> i have a chapter on politics. the subtitle of my book is 'how big data increases inequality and threatens democracy,' and this is the second part. we were just talking about exacerbating inequality; a lot of people have confusing and badly scheduled lives and don't have time to be engaged. but the threat to democracy comes from the data, from the way the targeting is uneven. you are absolutely right: the targeting goes to people who either donate a lot of money, so they have bigger voices, or people in swing states who are swing
9:19 am
voters. it exacerbates feedback loops: campaigns don't spend money on people who will definitely vote, they spend money on people who may vote. having said that, if somebody who is expected to vote doesn't vote, that gets corrected, and they spend money on that person next time around. but somebody who isn't expected to vote, and doesn't vote, is continually ignored. so there is a socioeconomic feedback loop in terms of voting. the larger issue is this: what is efficient for campaigns is inefficient for democracy. what is efficient for campaigns is profiling everybody here perfectly and showing each of you whatever the campaign
9:20 am
wants you to care about. that is not what is good for us as a group. what is good for us as a group is public discussion of the various issues, but that is not what is happening. if rand paul targeted me, he would find the things i agree with him on, like breaking up the big banks. if i went to his webpage to see what he thinks about other issues, he would follow me with a cookie and say, let's show her that. the campaigns are controlling the information i have about them. that is not democratic. we want information about candidates to be utterly open, so we have more information about them and about what is happening.
9:21 am
>> on understanding who people will vote for: we are pretty good at it. in the last election we had 97 percent accuracy. but here is the scary part. remember, in the last election, something you may have seen on facebook: 'here are your friends who have voted.' about half a percent more people voted if they were shown that than if they weren't. now imagine facebook wanted to find the people they think will vote for donald trump, show them the things that get people to vote, and not show them to people who will vote for somebody else. suddenly they are having a significant impact on the election itself. the 'i voted' message seems relatively
9:22 am
apolitical. you are giving me a skeptical look, and i am calling facebook out specifically, but the fact is that the companies controlling what you see with these algorithms have the option to have real impact on elections, by understanding what is going to make you go vote and by hiding that from other people. i am not saying there is evidence of it happening, but it is a real possibility with the technology we have today, and one of the things that most concerns me in this political space. >> the only reason i look skeptical is that the population of people on facebook is already not politically balanced. even if they showed it to everyone, which would be fair, showing everyone would still have more impact on the democratic side of the election. >> time for one more question.
9:23 am
>> i have looked at online advertising on political websites over the course of the last year. a lot of major advertisers have stopped running ads as american political life increasingly radicalizes. some have noble intentions, saying, in effect, we don't sell t-shirts to people who -- but what happens whenever mainstream advertisers pull out of a market, in this particular case, is that the remaining online advertisers are the ones selling weapons and videos about how hillary clinton
9:24 am
did this or that. so there is ideological segregation: as advertisers pull out of right-wing websites, the feedback loop steadily increases and the sites get really biased, while the big ad dollars go to the really big sites. and this is largely done through algorithms rather than account managers; advertising rates across hundreds of thousands of sites are set algorithmically. i wonder if you can comment on that. >> that is a larger issue than the ones in my book, and i don't know how to address it;
9:25 am
the growing partisan divide in our country is not due to algorithms alone. i agree that the way we live on facebook and talk to each other is part of it, but it is not the only part. i don't know how to solve that problem. >> thank you so much for coming out, and thank you to our speakers for a great talk and a great conversation. we will have books available for purchase in the bookstore, and a signing afterwards. [applause]
9:26 am
9:27 am
our look at this week's most borrowed books at the nashville public library continues with anderson cooper and his mother's dual memoir, the rainbow comes and goes, followed by dave ramsey's the total money makeover, where he provides tips on how to manage money and alleviate debt, and elizabeth gilbert's latest, big magic. the list concludes with john lewis's remembrance of the civil rights movement, march, the first installment of a three-part graphic novel series. that is a look at the most recently borrowed books from the nashville public library. tune in for coverage of the southern festival of books this weekend. >> this is an unprecedented level -- all those moments, in 1999, speaking specifically to the idea of the explosion of
9:28 am
hip-hop and the global export of it, and what it means for those particular images being projected to the world. how are we wrestling with the truth of the artist's expression while also recognizing that it is born of a form of oppression, of stereotype, of a flattening of identities? how can we reconcile and respect someone's art and still know that it is influenced by systems you may not be a part of?
9:29 am
and the necessity of blackness to the american identity: in order for the united states to operate the way it does, it needs a bottom. a system of hierarchies and capitalism needs an exploitable class, and blackness provides that bottom class, consistently. there is always that bottom class, and blackness is always there. we create and demonize others too: we have latinos, muslims, and at different points other ethnic groups have been part of that, italians, irish, germans, all exploited classes of
9:30 am
people. but their escape was the fact that whiteness needed to reproduce itself and form strongholds, so they could be embraced into that fold, assuring that blackness is always at the bottom. there is a need to flatten those identities, to demonize, to make invisible the humanity of black people, from the minstrel shows to postcards depicting black men being lynched, watermelon imagery and exaggerated features, calling us lazy. all these different things do the work of erasing the humanity of black people, and need to be seen in terms of housin