
Virginia Eubanks, "Automating Inequality" (CSPAN, April 28, 2018, 6:45pm-8:01pm EDT)

6:45 pm
hillary clinton. she offers her thoughts on free speech. that happens on booktv, 48 hours of nonfiction books every weekend. now: how high-tech tools punish the poor. [inaudible] >> my name is -- and i'm happy to be here tonight. we have three or four events per week, and our season is full. there's always a few that we care especially about.
6:46 pm
tonight's conversation is one of those. we're pleased they're here to join us. please shut off your phones. if you didn't get the invitation for this event from us and would like to get future invitations, there's a sign-up sheet on the counter. next week we have friedman coming; he has a new autobiography. -- with a book about the history of the black national anthem, also a great event. you can check that out on thursday. after my brief introduction i will hand things over to our guests.
6:47 pm
then a q&a. please just give me a second to come to you with the microphone. c-span is here, so this will allow everyone to hear your question. with that: virginia eubanks has been analyzing and reporting on inequities in our technology-driven culture since her first book, which we discussed when it came out in 2011. i'm a fan of her work, and i'm thrilled to have her back to discuss her new book, "automating inequality: how high-tech tools profile, police, and punish the poor." she's both a scholar and a committed activist and organizer.
6:48 pm
she's an associate professor of political science and a founding member of the our data bodies project. she is joined tonight by kathryn; kathy's work shines a bright light on the under-discussed and dark corners of the lives of the poor. kathy is one of the nation's leading poverty researchers and wrote the book "$2.00 a day," about the art of living on virtually nothing. she has just joined the faculty here, coming from johns hopkins. she works on behalf of the poor both inside and outside the halls of academia.
6:49 pm
we're thrilled that she agreed to join us for the discussion. virginia will give us an overview before kathy leads us in discussion. the book combines stories of the far-reaching consequences of data-driven discrimination with, on the other hand, cultural analysis: as long as poverty is seen as individual failings resulting from bad decisions, rather than poorly distributed resources, we will never write the algorithms to increase the self-determination of the poor and give them a shot at a better life. we need to get our souls right around poverty. she shows us the ways in which
6:50 pm
our algorithms are written for the government by private, for-profit companies. if, any time we made a mistake when filling out an online form, it was considered a failure to comply and we then lost our health insurance or social security, we would have a measure of the automated injustice and we would not stand for it. the poor are powerless not to stand for it. but here they have tireless advocates. please join me in welcoming them. [applause] >> i want to thank dorothea, who has been an incredible host. she's been witness to the development of this work, and
6:51 pm
kathy's "$2.00 a day" really was a stylistic touchstone for me. i thought if i could only write half as well as they did, i would be happy. thanks to all of you for coming out this evening to be part of the conversation. what i hope will happen is, assuming you have not read the book, i'll give you a little overview of the three cases i talk about. i will try to bring in the voices of the brave, courageous, smart people i spent a lot of time with in my reporting. they took great risks going on the record with their stories and made
6:52 pm
themselves vulnerable. i want to make sure their voices are in the room. then we'll have a conversation, which will lead into a question and answer. if you want to buy a book, i'd be happy to sign it. >> people ask me how i got into this work. i come from a long background, maybe strangely for my work now, in welfare rights and economic justice. in the year 2000 i was sitting in low-income housing in new york with a woman who goes by the name dorothy allen. we had been working with a community of women to build a technology center.
6:53 pm
we were just sitting around shooting the breeze about technology, and we started talking about her electronic benefits transfer card, which is like a debit card on which you receive your public assistance. i said to her, i hear a lot of people like it better because there's less stigma than pulling paper food stamps out of your bag. and she said, yes, and also my caseworker uses it to track my food purchases. i must have had a look on my face that was shock and awe. she got quiet and very generously said, you all should be paying attention to what's happening to those of us on public assistance, because they are
6:54 pm
coming for you next. that was 18 years ago, and i think her prediction has largely come true. we do not all experience algorithmic tools in the same way; they don't have the same impact on everyone, but we all experience the tools that are used to shape decisions about us. whether it's algorithmic justice or artificial intelligence, we're having a great conversation right now about how these tools work -- even just in the next couple of weeks,
6:55 pm
these are things we're talking a lot about. one thing that's been frustrating to me is the tendency to be philosophical and very abstract in how we talk about it. the question is, like, if there's a box of puppies and a pedestrian, which will the automated car decide to hit? that's important, but it's very forward-looking and futuristic. it's not what's happening right now. because of my experience, i understand that poor and working people are already living in that future. so failing to acknowledge that
6:56 pm
robs us, as a community, of important insights into how these systems work, particularly when they're developed and tested in communities with low expectations that their rights will be respected -- what i think of as low-rights environments. what i want to do is tell you about the three different systems that i talk about in the book. i'm not going to go into the technical details, but kathy may push me, so we'll do that later. but i wanted you to hear about the families who are the targets of these tools and who have important information about how they work. >> the three stories i tell: the
6:57 pm
first is an attempt in indiana to automate and privatize all eligibility processes for the state's welfare programs -- so that was food stamps, cash assistance and medicaid. the governor at the time, a mentor of our current vice president, signed what eventually became a $1.4 billion contract. the governor tended to see relationships between front-line caseworkers and the families they served as invitations to collusion and fraud, and part of the design of the system was intended to sever that relationship. it moved 1,500 caseworkers who
6:58 pm
had been in local county offices into regional call centers, and moved many application processes online. imagine: if you had a problem with an application, when you called you would get a new person and have to explain the case from the beginning every time. from the caseworker's side, they were no longer responsible for families; they were responsible for tasks that dropped into their workflow. no one was responsible for a case from beginning to end, just pieces. the result of that was a million benefit denials in the first three years of the project, and, in an almost cute phrase, most of
6:59 pm
those denials were for failure to cooperate in establishing eligibility. i want to give you a sense of who got that result and how it impacted their lives. in the fall of 2008, omega young missed an appointment because she was in the hospital suffering from terminal cancer. the cancer that began in her ovaries had spread to her kidneys, breasts and liver. her chemotherapy left her weak. she struggled to meet the new system's requirements. . .
7:00 pm
>> and so it's not surprising that in that context a system that they call the match.com of homeless services has been appealing, right? so the idea of coordinated entry is to match the most vulnerable
7:01 pm
unhoused people with the most appropriate available housing resource. one of the issues around this system is that it involves the use of this incredibly intensive and invasive survey with a horrible acronym: the vulnerability index and service prioritization decision assistance tool, the vi-spdat. it's not my first time saying that out loud. and the vi-spdat asks questions like: are you currently having unprotected sex; are you running drugs for somebody; does somebody out there think you owe them money; do you have an open warrant; have you thought about harming yourself or someone else recently. these are actually good questions to develop a picture, a good picture, of how vulnerable
7:02 pm
an unhoused person is to really bad outcomes like death, like a mental health breakdown or institutionalization or incarceration. the problem is that los angeles county, even though it's made some real strides in recent years, just doesn't have anything like the kinds of resources it would need to legitimately address the housing crisis. so of the 39,000 people who have taken this survey, about 9,000 of those people have been served with any kind of resource at all. and that's not housing, right? that's not like they got into apartments. that could be a little bit of help with an eviction or a little bit of help with a deposit. so the 30,000 people who have done this survey, often multiple times, but haven't received any kind of benefit at all, often feel like they're being asked to incriminate themselves in exchange for a slightly higher
7:03 pm
lottery ticket that might get them access to resources. so i want to tell you just briefly about gary boatwright, who is one of those people. he goes by the nickname uncle gary. when i met him, he had been living in a tent on east 6th street on the edge of skid row. he's a funny, straight-talking man with thinning white hair and santa claus blue eyes. he's had a dozen careers: welder, mason, paralegal, door-to-door salesman, law student and, most recently, a document processor for a wholesale mortgage lender which actually later got popped for its role in creating the subprime mortgage crisis, which made a lot of people in los angeles homeless. so gary has filled out the vi-spdat three times, and he's really lost patience with the process. he doesn't think he scored very
7:04 pm
high on the scale of 0-17. he said he's 64, and other than, like, a little bit of high blood pressure and a hearing problem, he's mostly healthy. he has a mental health file in orange county, but he only found out about it when a judge told him, and he doesn't know what's in it. so the problem, as gary sees it, is not his comparative vulnerability, it's simple math. there's not enough housing for the city's 58,000 unhoused people. people like me, who are somewhat higher functioning, are not getting housing, he said. it's another way of kicking the can down the road. in order to house the homeless, you have to have the available units. show me the units, otherwise you're lying. so in november of 2016, gary was arrested, and he was charged with breaking the window of a public bus with a 99 cent store
7:05 pm
plastic broom, something he told me is physically impossible. when he got out a few months ago, he had lost everything. he lost his tent, his paperwork, his relationships with local organizations and the networks of friends who kept him safe on the street. and when and if he takes the vi-spdat again, he will actually score lower, because it counts incarceration as being housed. so the model will see him as less vulnerable, and his priority score will slip even lower. so that's los angeles.
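
the mechanics gary is describing can be put very plainly. the sketch below is a minimal, purely hypothetical illustration of coordinated-entry style triage: score people on a 0-17 vulnerability scale, rank them, and hand the few available units to the top of the list. the names, scores, and scoring logic are invented stand-ins, not the actual vi-spdat instrument or any los angeles data.

```python
# Hypothetical illustration of coordinated-entry triage: rank by a
# vulnerability score and serve only as many people as there are units.
# Names, scores, and the 0-17 scale usage here are illustrative only.
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    score: int  # vulnerability score on a 0-17 scale, higher = more vulnerable

def prioritize(people, available_units):
    """Sort by score (descending) and split into housed vs. still waiting."""
    ranked = sorted(people, key=lambda p: p.score, reverse=True)
    return ranked[:available_units], ranked[available_units:]

people = [Person("A", 14), Person("B", 9), Person("C", 6), Person("D", 12)]
housed, waiting = prioritize(people, available_units=1)  # scarcity: one unit
print([p.name for p in housed])   # ['A']
print([p.name for p in waiting])  # ['D', 'B', 'C'] -- most people stay in line
```

however the scoring is tuned, the last line is the point: with one unit and many people, almost everyone stays on the waiting list, which is exactly gary's simple-math objection.
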
7:06 pm
and finally, i want to spend just a few minutes on the third case -- i think we're going to talk mostly about this case together, so i won't spend a lot of time on it. it's the case that i looked at in allegheny county: a tool called the allegheny family screening tool, which has gotten quite a lot of attention. in fact, there was a pretty lengthy article about it in "the new york times" magazine in january, and so i wanted to make sure that we mention it. so the allegheny family screening tool is a statistical model that's supposed to be able to predict which children might be victims of abuse or neglect in the future in allegheny county, which is the county where pittsburgh is located in pennsylvania. now, one of the things that's really interesting about this model is that if you look back into its history, its point of origin was a data warehouse that was developed by the county in 1999. that data warehouse now holds a billion records, which is about 800 records for every individual who lives in allegheny county. but it only holds records on people who have reached out to public programs for support.
7:07 pm
so if you ask for county mental health support or you ask for medicaid or food stamps, if you ask for some kind of county childcare, you will end up in this database, because the data warehouse collects regular data extracts from all of these programs. also it collects records from juvenile and adult probation, 20 public schools, the police, the jails and a number of -- and child protective services as well. it's important to note that professional middle class families are probably asking for the same amount of support in their own lives, but because they're able to pay for it privately, they're not ending up in the data warehouse, right? so if you're asking for childcare support from a nanny or an au pair or a babysitter, you don't go in the data
7:08 pm
warehouse. if you ask for mental health support but you pay for it with private insurance, you don't end up in the data warehouse. and that's the sort of primordial soup out of which this predictive tool arises. it's important for two reasons. one is a false positives problem, right? so i believe that this system confuses parenting while poor with poor parenting. and because it does that, it oversurveils poor families. and because there is more attention on poor and working class families, it identifies more problems, and it becomes a self-reinforcing feedback loop, right? so that's a false positive problem. but there's also, and this is what actually intake workers in the call center of this program reminded me, there's also a false negatives problem which is because it only has information about poor and working class
7:09 pm
families, it's likely that the model is actually wrong, because it's missing a whole universe of variables that might be related to child maltreatment in professional, middle class and economic elite families, right? for example, there's some really good research that suggests that geographic isolation and child maltreatment are related to each other. but that's not going to show up in this data warehouse, because all the people looking for public services in allegheny county live in dense neighborhoods or in the impoverished suburbs that ring pittsburgh. so that's a false negatives problem: a problem of not identifying maltreatment or trouble where there might be some, because of the biases that are implicit in the model.
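
to make the false positives side concrete, here is a tiny, purely hypothetical simulation of the feedback loop described above: two groups with the same underlying rate of problems, one of which is far more visible to the data warehouse. every number and the update rule are invented for illustration and are not drawn from the county's system.

```python
# Hypothetical feedback-loop simulation: recorded "problems" justify more
# scrutiny, which produces more records, even when the true rate is fixed.
def simulate(surveillance, true_problem_rate=0.05, rounds=4):
    """surveillance: fraction of incidents that get observed and recorded."""
    recorded_rates = []
    for _ in range(rounds):
        recorded = true_problem_rate * surveillance  # what the warehouse sees
        recorded_rates.append(round(recorded, 4))
        # each round, recorded problems feed back into more scrutiny
        surveillance = min(1.0, surveillance * (1 + 10 * recorded))
    return recorded_rates

poor_families = simulate(surveillance=0.5)      # heavily covered by public programs
insured_families = simulate(surveillance=0.05)  # same true rate, little visibility
print(poor_families)     # rises round after round as scrutiny compounds
print(insured_families)  # stays near zero despite an identical true rate
```

the two groups are given an identical true rate; only their visibility differs, and the recorded rates diverge anyway.
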
7:10 pm
so i just want you to hear a little bit about one of the families that i talked to in pittsburgh: angel shepard and patrick reid. i met them at the duquesne family support center, which is a hub where families access programs and sort of create peer support networks with each other. and they didn't really stand out right away, because their experience is so utterly average. it's so characteristic of the routine, mundane indignities experienced by the white working class. so they've struggled with low-wage, dangerous work, poor quality public schools and predatory online education, poor health and community violence. despite that, they're really creative and involved parents. they're helping to care for both angel's daughter harriet and patrick's granddaughter desiree, and the two girls are about the same age. so when they bicker, which they do a lot, patrick and angel make them put on what they call the get-along shirt. and the get-along shirt -- so patrick is, like, a big guy, he's like a brick house of a man. like a buddhist ex-biker.
7:11 pm
so they make the two girls get in one of his really big button-down shirts, and each girl puts one arm through one of the arms of the shirt and the other arm around the other's waist, and they're forced to stay in the shirt until they get along. and they have to do it even if they have to go to the bathroom, and he says that always works. so despite their, like, pretty incredible parenting, angel and patrick have racked up really a lifetime of interactions with children, youth and family services in pittsburgh, in allegheny county. patrick was investigated for medical neglect in the early 2000s when he was unable to afford his daughter's antibiotic prescription after an emergency room visit. and when harriet was 5, somebody phoned in a string of reports to the child abuse and neglect hotline -- like, an anonymous tipster -- charging that harriet was running around the neighborhood unsupervised, that she was down the block teasing a
7:12 pm
dog, she wasn't being properly clothed, fed or bathed, and she wasn't getting the medication she needed. so for each call, an investigator came out to the house, interviewed harriet and patrick, angel and tabitha and requested access to the family's medical records. and then each time, finding no evidence of maltreatment, they closed the case. but each of these interactions was, of course, entered into the data warehouse which feeds the allegheny family screening tool. so patrick and angel are really aware that their entire pasts and every time they reach out for support goes into a risk score that can make their family vulnerable. you feel like a prisoner, said angel to me. you feel trapped. it's like no matter what you do, it's not good enough for them. my daughter's now 9, and i'm still afraid that they're going to come up one day and see her out by herself, pick her up and
7:13 pm
say you can't have her anymore. so i want to just, like, maybe take a minute to take a breath, because there's a lot to carry. these stories are pretty heavy and pretty deep. and i think, i think it's probably best to sort of move into conversation. because i have lots of things to say, but i feel like it will emerge from us talking. yeah. so thank you for your attention to these stories. i just want to say, again, that the folks who spoke to me really took extraordinary risks going on the record, right? so their families are being investigated by child protective services or, you know, risk losing their access to food or housing. and so it's really important to me that their presences and their experiences are already -- are always in the room before we start conversations like -- the more abstract conversation about what it all means.
7:14 pm
>> so thank you. >> thank you. >> full disclosure, when i was at johns hopkins, i led a signature initiative called -- what was it called? memory escapes me. i think i'm repressing it. but the institute for the american city attempted to link researchers to city officials in ways that would promote unique solutions to city problems. and a lot of the research we funded was sort of big data research. and i have to say the tools are compelling. so just to give you a few very simple examples of what predictive analytics -- which is sort of the most pernicious of your examples -- can do for cities, two from baltimore and one from louisville: in baltimore a
7:15 pm
couple of years ago there was a wind gust, and one of the dilapidated houses fell on someone's car, killing the occupant. so the city came to us, and we used predictive analytics -- using wind patterns and housing records -- to predict which houses were likely to collapse. and the city was then able to use those data to deploy very limited demolition dollars in ways that could have potentially saved lives. a second example: one of the biggest causes of immobility among the elderly in all cities, including the city of baltimore, is trips and falls, slips and falls. and it turns out that many times these happen in the house, but they also happen on sidewalks. and the city of baltimore has a very limited budget to fix sidewalks. so what josh got a grant to do is to identify those blocks in
7:16 pm
which there were the most vulnerable elderly people, so the city was able to deploy resources to fix those sidewalks first. i think the example from louisville is kind of the most exciting. kids' asthma inhalers were equipped with a gps chip, so every time an inhaler was used the city could see where. the city was able to clean up toxic waste dumps in those areas, and they were able to show that school attendance improved significantly as a result. so these are amazing tools. and, frankly, i had never quite -- you know, this book is really compelling, you should definitely read it. those who are watching booktv should go right out and buy this book. it is a great read. it'll keep you up at night. it'll frighten you. [laughter] all of those things we love. but i hadn't quite thought so
7:17 pm
systematically about how, when applied to a group of americans who have always been denigrated and despised, these tools can shroud bias in ways that extend it and maybe even deepen it rather than reduce it while limiting any sense of personal accountability. i mean, the case of ibm is maddening. they simply didn't seem to care that they were completely destroying people's lives. so that's my disclaimer. i believe these tools can be used for good or ill, and these are cases in which they were used for ill. but let's begin with welfare in indiana. so i've spent most of my career studying the welfare system, and
7:18 pm
in "$2.00 a day" we document the decline of tanf over the last few decades. we know the tanf-to-poverty ratio has fallen from about 80% in the late 1980s to only about 20% today. that means there are hundreds of thousands, even millions, of families who are desperately needy who are not getting aid they ought to be legally entitled to. and i've always wondered -- the midwest is not the worst actor, it's kind of average, but there's one state that stands out as having one of the lowest tanf-to-poverty ratios in the nation, and it's the only one in the upper midwest, and that is indiana. so i read this book and i thought, oh, this is it. [laughter] you know? this ibm et al. consortium screwed this up. i thank you for solving the mystery. but on page 77 you write: automated eligibility was based on the assumption that it is
7:19 pm
better for ten eligible applicants to be denied public benefits than for one ineligible person to receive them. tell me about that. >> yeah. so that's not even my quote. a really brilliant medicaid attorney named chris holly said that to me. and i so appreciated his insight, because one of the big arguments of the book is that it not only matters that we may be diverting people from resources that they are legally entitled to -- that they have a right to. like, i mean, imagine if 80% of the country was due a tax refund and only 20% of the country got it. right? like, there'd be flaming pitchforks, right? there would be change, right? but
7:20 pm
going from 80% of people receiving the tanf help that they are eligible for to 20%, you know, it's a human rights violation that we continue to underwrite, partially because of the sort of crazy stories we tell about poverty in the united states. and so one of the things that's really important about these systems is not just that they often stand as barriers between people and the resources that they have a right to, but that they also teach people lessons about how government works, what their value is culturally, what their entitlement is to the shared public resources of our communities. and so i just want to tell, like, one very specific, concrete story. one of the people i spoke to at great length is this incredibly brave woman named lindsay kidwell. lindsay was on medicaid, and just two months after the birth of her first child, maddox, she
7:21 pm
was due to recertify which is something that happens really regularly. but she was due to recertify right after this automation happened. and so, you know, she got them all the paperwork that she thought that they needed. the only thing that was a problem is that her partner had a job where he wasn't paid by a traditional paycheck. he was paid by a bank check. and the only thing that the automated system recognized as proof of income was a pay stub. he didn't have a pay stub, because he had a bank check. so they're like, okay. we'll call the help center, we'll find out what they need because we don't have pay stubs. they call the help center, and the help center says, okay, ask his boss to make out a list of his last six months of paychecks, fax it into the document center, everything will be fine. so they do it -- they have ten days to do it, it's right around christmas. i'm sure it was a nightmare, but they manage to do it.
7:22 pm
then they get one of these failure to cooperate notices, and now they have another ten days to fix it. so they call the help center again and, of course, they're talking to a new person at the help center, and the help center person says, we don't have any evidence that you've given us, you know, proof of income. and she's like, yeah, we've already had this conversation, he doesn't have pay stubs, we sent in this list. and this person's like, oh, no, we don't need that. we actually need six months of canceled paychecks. so find six months of canceled paychecks, photocopy them, they'll scan them, everything will be fine. so heroically, two months after giving birth to her first child, right, they find all of these checks, they photocopy them, they send them in, and they get another failure to cooperate notice. so they call the help center, and the help center says, we have no evidence -- [laughter] you've had these conversations, right? we have no evidence that you have given us any kind of proof of income. and she says, you know what?
7:23 pm
at this point she's talked to an advocate, and the advocate is like, man, ask for a fair hearing. fair hearings are, basically, a tool within the welfare system, established by the national welfare rights organization in the late '60s and early '70s, that says you have a right to ask a decision maker outside the welfare system whether or not that decision is fair. so she's like, you know what? i want a fair hearing. and they're like, fine. and three weeks go by, and she gets another phone call from a new person who says, oh, i'm calling you to schedule your fair hearing, but i've got to tell you, i'm looking in the computer, and the computer has no evidence that you've ever provided evidence of your income. this is what the judge is going to see, right? so i'm advising you to give up your fair hearing. like, i think you should cancel it right now and not go. >> incredible. incredible story. >> right? so it's not just that these systems act as barriers; they're also actively taking
7:24 pm
away constitutional rights that poor and working people have established. >> yes. >> and then i spoke to lindsay again in 2017, right as i was doing the final fact checking for the book, and i was like, yeah, how are you doing, what's going on? she'd married her partner, they'd gotten off public assistance, they'd done well for a while, but she had recently gotten divorced. and she said, you know what? i'm working, i'm a single mom, i'm doing the best i can, and i know i'm probably eligible for public benefits again, but i will never, ever apply again, because i cried every day through that process. they made me feel so horrible about myself that i will do anything before i do that again. which means she's saying she would potentially rather her child go without health insurance than go through this process again. and that's a fundamental -- like, we should be horrified that we're allowing these automated systems, which get talked about as just sort of
7:25 pm
increasing efficiency or increasing administrative oversight or easing the burden of paperwork, to make these political decisions for us. and so that's really one of the big arguments of the book: we're smuggling political decisions into these systems with little or no democratic conversation about what their assumptions and values are. so that's, i think, one of the big lessons of indiana. >> so it is so interesting. i remember with both bush administrations this emphasis on a thousand points of light and this idea that the private sector could do it better, more efficiently. and here we have this horrific case from the business sector, and the overconcern with fraud, which has been a theme in our history over the last 400 years -- you know, we're so concerned about shirkers that we make poverty as miserable as we possibly can,
7:26 pm
thereby potentially creating poverty traps. a great, great story about indiana -- >> can i talk about that history? you gave me a really -- >> no, not yet, not yet, nope. [laughter] that was a case of diversion. now we're going to talk about a case of classification, and that comes from the homeless in l.a. and i have to say this is a very balanced story. i think you can really see the good intentions of the folks that created the system, and you can see the positive results in some cases of the system. but you write about this system, called coordinated entry for short: homelessness is not a systems engineering problem, it is a carpentry problem. so tell me more about that analogy. >> again, all my best lines come from other people. so that line is from gary blasi,
7:27 pm
who's really an extraordinary homeless advocate and attorney in los angeles. yeah. so it's interesting, because in your introduction to sort of this section of the conversation, you were talking a lot about the way that these tools work to sort of mete out really limited resources. >> right. >> and one of the things that was common in all of the cases i looked at in the book is that the administrators and the data scientists and computer scientists who were developing these tools all talk about them as systems for triage, right? like, we have incredibly limited resources, we have a catastrophic problem, and we have to decide who is most deserving of the limited resources we have. and i really understand that incredible pressure that particularly front-line homeless
7:28 pm
service workers are under. you have a hundred people you see a day, you have one resource. i don't want to make that decision, that's a horrifying decision. but my concern with these tools is that we are using them as kind of empathy overrides, right? >> yes. >> they allow us to avoid making what are fundamentally inhuman decisions, right? which is, like, who stays on the street and who deserves housing. >> the carpentry solution would be to build more housing. >> would be to build a lot more housing. that's what gary would say, build a lot more housing. >> so we're kind of fooling ourselves in thinking we can solve the problem by prioritizing scarce resources, and that may distract us from actually advocating for the resources we need. >> i will say that in this system -- the architects of this system have used some of the finer -- >> that's right. >> -- smarter data that they've
7:29 pm
been able to collect through this system to argue for building more housing. so there has been some, there have been some real successes there. again, in l.a. and in pittsburgh the intentions are really good, the people are really smart. but my big fear with these systems is that we're using them as ways to outsource these incredibly difficult decisions that have deep structural roots that we need to be addressing directly. so this idea of triage, that we have to do triage, right? triage is only appropriate in a natural disaster, and there's nothing natural about the disaster of the housing crisis in the united states. that is a set of policy decisions that we made about public housing, about affordable housing. there is not enough affordable housing in any county in the united states. and that's because of a set of policy decisions we made. and so there's this brilliant, i'm going to space out on his
7:30 pm
name. it'll come to me. there's a brilliant writer, "coming to terms with chance." oh, what's his name? it'll come to me. and one of the things he points out about the word "triage" is that it actually comes from a french root which means to cull or pick over marketable produce. and he says, not to push the metaphor, but basically we're deciding what's worth saving and who it's okay to discard, right? >> yeah. >> and fundamentally, in other places this question of who gets housing and who stays on the street would be seen as a human rights violation. and that here we're seeing it as a systems engineering problem says something really deep about who we are as a culture and where we are around poverty. >> so let's go to allegheny county. you use some -- there's a section in the book that i really want to encourage readers
7:31 pm
to focus on, and you're going to be tempted to skip over it. [laughter] because there are labels in the book like outcome variables, predictive variables and validation data. but the story in allegheny county is not about diversion or classification, it is about prediction. and i think the reason prediction stands out as most sinister in this story is because of something you call proxies. so there's one section in the book where you refer to a specific prediction based on proxies that only gets you about halfway between a flip-of-the-coin decision and perfect information. it's a prediction of .74, i
7:32 pm
think? >> .76%. >> .76? so tell me the story about proxies and how they can use very imperfect data to make, to really decide between in some cases life and death. >> yeah. so here's a piece of advice that starts to shade into solutions, which i'm sure we'll talk about towards the end of our time. one of the things i want people really to do is be a aware of what people call math washing which is making something overly complex so that we think to ourselves i can't possibly understand that. and i want to say that many of these systems are very complicated, but many of them aren't. and the case inial a gainey county -- in allegheny county, for example, gets written about a lot as if it's the most sophisticated machine learning in the world.
7:33 pm
it is, in fact, just a static statistical model based on a -- [inaudible] and it's not actually super complicated. but sometimes you need to sort of break it down a little bit, and so that's what i try to do -- hence these slightly scary terms like outcome variables and validation data. so there are sort of three things that you need to understand about predictive models to understand how to ask questions about not only how accurate they are, but what kind of impact they'll have on inequities and injustices. one is about outcome variables. many of the things we want to measure or predict, we don't actually have enough data to measure. so in allegheny county, the thing they most want to predict is very damaging harm to children. >> yeah. >> particularly fatalities and near fatalities. and that data is collected on a form called child
7:34 pm
fatality and near fatality reports in the state of pennsylvania. and thankfully, there are only a handful of them a year. three, four, five a year. which is great for children but terrible for data science, right? because it's not enough data points to make predictions. >> yeah. >> so rather than modeling actual harm to children, the designers of the system had to choose stand-in variables which are called outcome variables. and in this case, they chose something besides the actual harm they were wanting to model. so they use what are called proxies which just means stand-ins of the thing you actually want to measure or model. so they used two proxy variables to stand in for child maltreatment, and it's actually very important to understand what they are. one is called child re-referral. and all that means is that a phone call came in to the abuse and neglect hotline, it was screened out by intake workers who didn't think there was
7:35 pm
actually a problem there, and then there's a second call on that family within two years. so that is actually how this model defines maltreatment, right? and then there's a second one, which is called child placement. and child placement, all that means is that there was a call on a family, it was screened in, and that child was later taken out of their family and put into foster care. so the thing that's important to understand about this is that the model is actually modeling decisions that are made by the community -- who gets called on -- and decisions that are made by the agency and judges -- who gets investigated and then pulled out of the family. it doesn't measure actual maltreatment; it measures these decisions in the community and in the agency. >> so all the bias gets baked into the algorithm.
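
as a purely illustrative sketch of what modeling a proxy means in code, the toy example below fits a classifier whose target column is re-referral-or-placement rather than observed harm. every column name and every number is made up; this is not the allegheny model or its data.

```python
# Hypothetical proxy-label training: the model never sees actual
# maltreatment, only decisions (calls and placements) made about families.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Invented features of the kind a public-services warehouse might hold.
features = rng.integers(0, 2, size=(n, 3))   # e.g., benefit receipt, prior call, probation
re_referred = rng.integers(0, 2, size=n)     # second hotline call within two years
placed = rng.integers(0, 2, size=n)          # child later placed in foster care

# The training target is a proxy, not observed harm.
proxy_label = np.maximum(re_referred, placed)

model = LogisticRegression().fit(features, proxy_label)
risk_scores = model.predict_proba(features)[:, 1]

# Whatever shaped who got called on and who got placed is now what the
# score reproduces; families absent from the warehouse are invisible to it.
```

nothing in the fit step asks whether the calls or placements were warranted; the score can only echo them.
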
7:36 pm
>> and one of the things that's really important about that is that while part of the reason for building this system was to keep an eye on the discretionary excesses of people working in this call center, the county's own research shows that the great majority of racial disproportionality in the system actually comes in from the point at which the community calls, right? so it's three and a half times more. >> and not just the community, but these mandated reporters -- >> yeah. >> -- who exercise their own bias. >> yep. yeah. >> middle class assumptions about appropriate parenting. >> yep. >> and so on. >> so there's actually a definition of bias that is encoded into the system that is really important to understand. for this system, bias is only an individual making a decision based on an unconscious or implicit or explicit discriminatory belief, right? but, in fact, biases are
7:37 pm
encoded systemically into our society, so when you look at something like why mandated or anonymous reporters call on black and interracial families three and a half times more often than white families, that has everything to do with our cultural beliefs and norms around what an appropriate family looks like and, you know, the deep legacy of white supremacy in our country. but that's not a data-amenable problem, right? so, like, that's not where we're putting the time and effort. >> but maybe even, you know, less defensible is the fact that receipt of public benefits in and of itself is regarded as a risk factor. >> yeah. so -- yeah. so this is something that -- >> i was -- >> we've had some arguments with allegheny county about. so one of the things that it's important to understand about these second two cases, about los angeles and allegheny county, is that if i wanted to write a book that was really,
7:38 pm
really terrifying, these are not the cases i would have chosen, right? so there's a very similar system to coordinated entry in the bay area that was created by palantir, which is a scary company that does all the drone stuff. we know nothing about what's in it, we have no idea how it works. it's called -- ooh, i forget what it's called. it's okay. home link. it's called home link. if i wanted to write the big brother book, i would have chosen the palantir book. in florida they're using a tool from eckerd and mindshare -- again, a private company where people know very little about what's on the inside of that model, and it has huge impacts on families. my goal in this book was actually to present people with much more complicated cases than that. so for sort of progressive critics of algorithms, the things we tend
7:39 pm
to ask for are transparency -- we want to know what's in the model, right? -- accountability -- some kind of public ownership and accountability for the model -- and then, in very, very radical cases, participatory design, right? >> yeah. >> like, that there's some kind of input from the people who are going to be using the system or are targets of the system in the design of the system. and in both los angeles and allegheny county, they did all three of those things, right? so these are actually some of the best systems. >> that's right. >> and that's why i think it's important to look at them. that's why it's important to understand these systems in historical context. >> oh, so this is the -- >> yeah, i'm providing a segue. >> okay. well, i actually think we should probably go to the audience. >> yeah, sure. >> but if i could plant a question in your minds, you may want to ask virginia why she makes such a strong analogy between what's going on now and the poorhouse of the 18th and
7:40 pm
19th century. she says that this represents the new digital poorhouse. so you might want to ask her what's constant historically and what's really unique about the moment we find ourselves in. but i think dorothea is going to bring the microphone around in just a minute? okay, so go ahead and talk a little bit about that segue. >> so the reason that i chose these really good examples is because i wanted to sort of leave people with the question of: what if you do everything right? what if the designers of these systems do everything we can ask them to, what if the model is technically accurate -- >> but you shouldn't give indiana any credit. >> well, no. indiana's a pretty black hat situation. but, like, what if the models are technically accurate and we still produce systems that profile, police and punish poor and working class families? because i think that really
7:41 pm
forces us to get beyond thinking about building better tools and instead get at the sort of deep cultural narrative that we have around poverty and how that influences all of the tools we create, whether that's, like, the register of the poorhouse in 1800 -- the, like, big leather book -- or the algorithm that runs the child welfare prediction in allegheny county. >> do you want to talk just a -- are we ready? okay. >> yeah. >> so if you have a question, raise your hand, and a microphone will come to you. >> magically appear. >> i know it's tempting to want to just shout it out, but we need you on tape. >> so tell us about the history that you were just -- [laughter] i'm very interested in that myself. >> softball pitch, right? [laughter] thank you so much for that. i love that question. thank you so much for that
7:42 pm
question. yeah, so i use the metaphor in the book of the digital poorhouse. i say we're building a digital poorhouse, and i do that really intentionally, because there's this moment in american history in the early 1800s, in the wake of a huge economic dislocation, the depression of 1819, where poor and working people were doing some incredible organizing that was very threatening to the economic status quo. and economic elites of the time decided that they needed to figure out what was really wrong, right? and so they commissioned all of these reports about what was known as the pauper problem. and they decided that there was a distinction between poverty, which is the state of not having enough resources to meet your needs, and pauperism, which is dependence on public benefits. and so, maybe not surprisingly, they decided the problem was not poverty, the problem was
7:43 pm
pauperism. and what they needed to do to solve the poverty problem in the wake of the 1819 depression was to build physical institutions, one in every county, whose primary purpose was to incarcerate the poor. and what that meant for people was that if at the time you had the right to marry or vote, you had to give up your right to marry or vote. you were split off from your children; your children were taken from you. and the death rates at some of these institutions were up in the, like, 30% range, meaning -- >> annually. >> annually. a third of the people who came into these institutions every year died. there's a famous poorhouse in massachusetts where every single foundling, every orphan they took in for 30 years, died. so these were horrifying
7:44 pm
institutions. and the intention was to make the conditions of receipt of public assistance so horrifying that you would do anything but go into the poorhouse, right? and the reason that this is a really important moment for understanding the tools we're building now is because it's the moment at which we decided that our public service systems in the united states are, should be more about moral diagnosis than putting a universal floor under everyone. right? so the intention of these institutions and the energy and the resources that went into these institutions were largely about deciding whether or not your poverty was your own fault rather than saying, like, you know, you might need some scaffolding for us to be able to release your sort of limitless -- i'm sorry, limitless human potential. and i believe that that's part of the story that is built into sort of the deep social programming of these systems, right? so right now in the united
7:45 pm
states the way we establish success in welfare programs is if people are no longer using them, right? so it should not surprise us, and that doesn't say anything about whether they're doing better than they were before, whether they've become employed, whether they've gotten above the poverty line. the success is just marked at like you're no longer using public assistance, so that's good. so it shouldn't surprise us that building a more efficient tool to do that is going to be really bad for people who are trying to access public assistance, right? i kind of say, well, if you plant sunflower seeds -- [laughter] in a place that you've always grown sunflowers and a sunflower comes up, you shouldn't be surprised. i really believe we're planting these new seeds of these new technologies in this very old soil of these punitive and disciplinary systems. so it's not actually that the system is broken, it's just working faster, and it's scaling
7:46 pm
faster. and that's something that really should challenge us to think about these fundamental understandings we have of what public services should do and what poverty looks like in the united states. yeah. >> when you talk about problems with somebody sending a receipt and collecting checks and stuff and then the recipient saying they don't have it, how does that differentiate between the classic problem of bureaucracy and what you're talking about? >> yeah, that's a really great question. so the question was, like, how is this different from, you know, all of the sort of classic means of diversion classification and other things that go with sort of standard bureaucratic procedures.
7:47 pm
and one of the things that i say in the book is that, right, many times these technologies are replacing human processes that largely were already very machine-like. right? but i do believe that we should be concerned about these systems. and i do think that even though i use this metaphor of the digital poorhouse and i think it's a good way to place these systems in their historical context, it only goes so far. and there are some things about these systems that are quite different from the sort of human-centered processes that came before -- >> can i just offer an example real quick? >> yeah, please. >> for example, you tell stories of caseworkers who really have extensive experience and a deep knowledge of the case doubting their own conclusions because of the score that an algorithm produces. >> yeah. >> so, you know -- >> yeah, that's a great, it's a
7:48 pm
great example. so what this forces us to really grapple with is the role of human discretion in public service programs. and that's actually a huge, difficult question, right? because human discretion in the past has been one of the things that has kept people of color and never-married women from receiving benefits that they're entitled to and deserve. and that's real, and we cannot pretend that did not happen. like, that is an important part of this story. that said, in a system that's set up not necessarily to help you succeed, one of the only things that in my 15 years as a welfare rights advocate and activist, one of the only things that meant that you had a good outcome in these systems is a caseworker who was maybe willing to bend the rules for you a little bit. >> or also, just someone -- you tell the story of how someone gets, there are two cases. one gets a score of 7 and the
7:49 pm
other 17. >> yeah. >> for fairly arbitrary reasons that have nothing to do with the case. and the caseworker has to sort of deny common sense to believe the algorithm. >> yeah. so the case that kathy's talking about is in allegheny county. i spent, like, a full day in the intake center with the intake call center workers, who are the people who, after doing a lot of research, decide whether to screen a case in or screen a case out when a call comes into the hotline or a report comes in from a mandated reporter. and one of the things that was really interesting about the time that i was there is that it was just as the system was rolling out, so people were still not entirely sure how they felt about it. and i spent time with pat gordon, this incredible caseworker with, like, 20 years of experience, talking to her about how the system worked. and she said, you know, there are definitely these times where
7:50 pm
it's counterintuitive to me. i try to sort of stay open to this. but i have some real concerns about the accuracy of this system. and so i went then to talk to her manager, and i said, hey, what happens when you have a really smart caseworker like pat who says, like, we should be concerned about this, and then the score comes in really low or vice versa? like, what happens if pat is not concerned about a case and the score comes in at, like, the top of the -- it's like a thermometer that says you must screen this in. the thing that was really interesting is that her supervisor said, oh, if the score and the caseworker's decision contradict each other, then that's a really good learning opportunity for our caseworkers, because that means -- >> [inaudible] >> not for the algorithm. >> no. because that means that we have to back piece the puzzle because they've done something wrong. and the thing that's really interesting is it says on the screen that gives you the score,
7:51 pm
it says this tool should not be used for making child welfare decisions, right? because it's just supposed to supplement the human decision making of the caseworkers. but, in fact, this conversation with the supervisor made it really clear that the tool is actually training the caseworkers rather than the other way around. and i think that's profoundly troubling. i don't think it's accidental that we're seeing these tools at the same time that we're seeing an attack on public workers and that these tools are aimed at those people in public service offices who are the most female, the most working class and often the most -- >> underpaid? >> yeah, underpaid and often the most diverse of anywhere else in the work force. so i have a great, smart friend named joe who's a political scientist, and he says discretion is like energy. it can never be created or destroyed, it can only be moved.
7:52 pm
and one of the things i ask people to really think about is that though we talk about these systems as removing bias or removing discretion, they're actually just moving bias or moving discretion. so the discretion of this caseworker with 20 years of experience, an african-american native of pittsburgh who works in the call center intake office, has now been replaced by the discretion of a team of economists from new zealand and a team of data scientists who have their own sets of biases and assumptions, right? so we're not actually removing that bias or removing that discretion, we're just moving it to another place. and so i think that's an important tool that the book provides, to say, wait, we're probably not getting rid of that, we're probably moving it somewhere else. why might we be moving it somewhere else? i think that's an important question to ask. thanks for asking that question. >> so we have time for one more. it's back here.
7:53 pm
>> i'm sorry to squander the last question with something that's not a question. >> just make it really good. >> i just wanted to point out that one of the ways this country is really wonderful at solving the housing crisis with a carpentry solution is that we build prisons. and there's a really interesting move afoot now, which is that sentencing judges are given the opportunity to have what you call an override, an empathy override, with an algorithm now to sentence people according to risk factors that are based upon similarly flawed sociological studies. >> yeah. there's some fascinating work about those algorithms. one of the things i wanted to do in this book -- there's actually been some great conversation about the way these tools work in law enforcement and in criminal justice and in sentencing and parole, and that work's really important. one of the invitations i wanted this
7:54 pm
book to offer is to think about policing that happens outside the law enforcement context, right? so, yes, we should be concerned about police abuses and law enforcement abuses, but we should also be concerned about these other areas in which public services act to police people's lives in these very profound ways. so there's been some incredible work on the algorithms that you're mentioning, including the team at propublica that wrote an incredible series of stories called machine bias that looks at that work specifically. but i want to actually mention a historical point which i think is really important, which is that often people talk about these systems as being able to identify bias in front-line workers and as actually being ways to create more just systems. and often what i remind them of
7:55 pm
is that we had the exact same conversation in the 1980s around mandatory sentencing guidelines. and that was a really interesting political moment, where you had sort of law-and-order folks who were, like, just throw everybody in jail, and also progressive civil rights folks who said, like, so many judges are racist that it's actually better to have kind of a written-out algorithm for them to follow. but we know what the results of that were: in part, a no less discriminatory but much bigger incarceration state, right? so i think that keeping that history in mind should be part of the conversations that we're having. but fundamentally, right, what i think a lot of this comes down to is finding ways to interrupt the dehumanization of people that i saw in these systems.
7:56 pm
not just the folks who are asking for support and help, but also the folks on the other side of the desk, right? one of the most sort of frightening moments for the welfare system in the united states was 1968, when front-line caseworkers went on strike not to protect their own employment contract, but to open up access to welfare benefits for the people they served. and so i think that we're living in a time where we are placing electronic barriers between people who are actually really close to each other, and there's all this space for affiliation and solidarity there. and that's one of my great fears with these systems. so i think we have a lot of rehumanization work to do, we have a lot of cultural work to do around what the story of poverty actually is rather than the false story we tell ourselves. and then we have a lot of political work to do to create systems that are fundamentally
7:57 pm
less punitive and less policing, and that are more geared toward social justice and see people as full humans rather than data points. >> it seems like one of the tasks now is to get your book into the hands of policymakers and the people writing the algorithms. i just want to thank you both for your work and your time tonight. join me in thanking our guests. [applause] [inaudible conversations] >> you're watching booktv on c-span2, with top nonfiction books and authors every weekend. booktv, television for serious
7:58 pm
readers. >> the two main measures, if you were trying to figure out, as a political scientist, who's the most influential justice: one thing you might say is who writes the most majority opinions in the significant cases. and this is actually a political science measure, the cases that make it to the front page of "the new york times." as long as we have a paper, we're going to use that measure. [laughter] by that measure, scalia was not the most influential justice. he didn't write them; often they were written by others. >> and he was quite senior for a long time. so you would think he would get those opinions assigned to him, except the chief justice, either rehnquist or roberts, maybe didn't think he would be able to hold on to five? >> i'm sorry, you wanted -- >> i'm sorry, no. >> so i think either it was what he was trying to say or how he was trying to say it that didn't
7:59 pm
let him get to five. the other way you might measure influence is by looking at who's the swing justice, who's the one whose vote is the fifth vote required, and i think that ties into why scalia didn't write a lot of these decisions. you're going to have to write something that's going to appeal to them, and as sue said about his lack of compromise, he was not going to compromise what he thought was the right way to decide the case in order to get the fifth vote. he pretty literally says that in obergefell. as i suggested earlier, he was very influential but through his sheer force of his writing and his methodology he kind of moved the middle of the court, got people talking about statutes and the constitution in a way they weren't before. you can't write a brief today trying to construe what a statute means without first going through words of the statute, the, you know, what a
8:00 pm
dictionary might say, and then you can go on through legislative history and purpose and all that. but there's kind of a different way of talking about things, and if you go back to the eclectic way these things would go, where courts would go on for 5 or 10 pages about what a statute meant without ever quoting the statute -- so he's very influential, but not in the way we talk about influential justices. >> you can watch this and other programs online at booktv.org.
