Book TV: We Are Data. C-SPAN, July 29, 2017, 3:15pm-4:21pm EDT
3:15 pm
>> Book TV wants to know what you are reading. Send us your summer reading list via Twitter at @BookTV or Instagram at book_tv, or post it to our Facebook page, facebook.com/booktv. >> Book TV on C-SPAN2: television for serious readers. >> I'm John, I'm a bookseller here, and I have the honor of introducing tonight's author. Thank you for being here, and please welcome John Cheney-Lippold. His book "We Are Data" is for sale in the back, and John will be happy to sign copies. >> If we run out of copies, there are bookplates he can sign, and we can order a book for you. >> So find me if you get a book and you want a signature. >> I want to thank C-SPAN for
3:16 pm
being here. >> One useful thing to keep in mind is the events we offer here. >> Along those lines, I'll point out our full calendar of literary events; we also have some flyers up front. >> There is also a calendar downstairs, next to the register. >> Now, if you took out your phone and opened up your camera and pointed it directly at me, >> you might notice a yellow box appear around my face. >> We were worried about the batteries dying.
3:17 pm
>> Okay. It's about how we use these technologies — Apple, Samsung — your camera sees my face. As he points out early on in his book, "We Are Data: Algorithms and the Making of Our Digital Selves," most importantly: computers don't see. Computers compute. Cutting through marketing speak, such a statement might seem like a reassuring reminder. You are not being watched, per se; some task is merely being performed. Just as my face isn't being seen but areas of contrast are being measured — pixel values meeting true or false conditions — the same goes for a banner ad on one of the sites where I haven't disabled them, for a jacket I might like, drawn from my history or keywords in emails. Perhaps this seems more
3:18 pm
pernicious and even invasive, but it's a condition we've had to accept in contemporary life, and we prefer to think it doesn't shape that life. After all, you might laugh at some of the tailored ads you receive and the way your real human complexity has fooled the server farms. But it's not just the traces of our online lives: Cheney-Lippold notes that we are not claiming that we individually are data. Rather, we are temporary members of different emergent categories. These categories are what he calls measurable types. The processes that create them — the algorithms, of which there are countless, most proprietary or clandestine — are constantly at work in almost every function of contemporary life. They, quote, transpose onto us categories like gender, race, class and
3:19 pm
citizenship in a quantitative, measurable form. This is not overgeneralizing; it characterizes us. In other words, these categories fail to capture anything other than stripped-down, categorical similes of us — and that is not a limit on their power, or even a flaw in their design. They govern much of our interconnected world, and they do so on their own terms. To borrow another of his references, measurable types are extrapolated toward a future — perhaps not a progressive but a regressive one — that reduces us to a machine ontology. Within this framework, there is room for scholarship and intervention between our regular world and the invisible world of control that is increasingly shaping and governing it.
3:20 pm
He is a professor of American culture and digital studies at the University of Michigan. Allow me to welcome John Cheney-Lippold. >> All right, thank you for coming, before anything else. I'm happy to be here at Literati, and I'm thankful that a lot of my friends, family, and people that I want to meet are here as well. So thank you, and hopefully we will keep talking about this in the coming weeks, months, years. >> So the book is called "We Are Data," and John did a good job of explaining it. It's not that we individually are made of data — I look at my hands and see not ones and zeros, not sets of data. Rather, the idea that we are data comes to us from one single thesis: that you are not who you think you are in terms of data, that we are not who we think we are in terms of data. But there's a sub-thesis
3:21 pm
that I want to suggest here, which is that the terms of data are not the terms of our reality. >> When we talk about this, we talk about gender, we talk about sexuality and race, even who is a terrorist or not. These categories, I think, structure our lives directly; they organize ourselves and asymmetrically distribute resources. But that's not the way data negotiates them. When the government interprets you as a terrorist, it is not trying to put you through legal procedures, and it is not trying to understand terrorism. It's just trying to find a functionalist database category that effectively serves a drone program or some sort of targeting mechanism, in the case of the US, for the CIA or JSOC. >> What we think is real about who we are, though, is not what the government thinks is real — not what governments think is real or useful for them in terms of data.
3:22 pm
One way to see this in a much more lighthearted way is through the idea of gender and marketing. In interviews and in my classrooms, I often ask my students and the people listening in to google the question: what gender does Google think I am? And they go through their Google settings and find out that who Google thinks you are is maybe not who you think you are. In my case, as of last week, Google thinks I'm a 65-year-old woman. >> It's funny, but it's also not necessarily inaccurate, because you can see at the top it says that I like beauty and fitness. I have no idea how that developed, but maybe. >> I don't know. >> Which is to say that the connection they're trying to make is between my patterns of behavior and browsing and the actual identities — the measurable types that I talk about
3:23 pm
in the book. They're just trying to find a commercially functional gender category. They're not trying to understand how women exist in a patriarchy or how gender performance is lived in a real way. Rather, it's a category of data that your data fits into or does not. So is this wrong or right? I argue that it is neither. It's an entirely new layer, an entirely new conception of gender and age, and in this case also of race, class, sexuality and others. Google's algorithmic gender is not based on the actual, normal things we think of as gender. It's not even regulated by the regimes of gender performance, of self-identification, of discourse. And this is the case not just at Google: another company, the analytics firm Quantcast, tries to show how things like college-grad attributes and male attributes are understood, which produces the idea of knowing someone as a male or knowing someone as a college graduate. This information is useful not just in the act of
3:24 pm
identification, or even in the ability for somebody to maybe jump identities — someone who never went to college gets identified as a college grad. Rather, these are used for extraordinarily powerful metrics that then determine how companies, how websites, how analytics itself understands the demographics of large populations. For example, during my graduate years in Los Angeles I used to work on a progressive political news site. It was fun, wonderful, but I had first-hand experience with these metrics, because what we would do is look at the breakdown of who visits our site and who doesn't. This could determine what kinds of stories we would write. It would show that we had a lot of white readers, fewer Hispanic readers. >> You could do one of two things. You could say: we're going to direct our resources and write more articles on immigration, >> on the plights of Hispanic-
3:25 pm
identified people within the US, or Latinos in general. Or you could say: that's a lost category, we're not going to bring them in; we should focus on our bread and butter, which is rich white men. In the case of this website we tried to do both. Which is to say, my being a 65-year-old woman would be participatory in this creation of an audience: I would be counted as a woman, and that count would shape the stories being written. >> I'm going to read three passages from the book, to give you a sense of the prose, but also because, doing all these interviews, I've realized I have a soft way of talking about this stuff and that I do it better in the book. Who would've thought that what you write is better than what you say? So I'm going to read three things — and we're going to watch a short video after the second one — and I'm going to try to bring you to a sense of how these things impact us, or maybe the strangeness of how identity works.
3:26 pm
>> In the book "The Googlization of Everything," the scholar Siva Vaidhyanathan writes that while the internet giants might proclaim that users control their use of the system, as long as control over personal information or profiles is granted at the pleasure of commercial companies, such choices mean little. There is simply no consistency, reciprocity or accountability. >> The flexibility of algorithmic identification normalizes this inconsistency at every step. "We Are Data" is an attempt to continue an analytical tradition and interrogate how information becomes the terms of our emergent subjectivity. The move is not to displace the empirical from our understanding of ourselves, nor to celebrate the cyborg in a way that erases the body or re-forms a dualism of mind and body. Rather, I want to explore how companies, governments and researchers use algorithms to layer our posthuman selves with new data and identifications. Our algorithmic identities are made through data and only data.
3:27 pm
Our profiles are gleaned from databases; our social ties are data and our bodies are data; and our presents and futures are conditioned on the basis of a dynamic interpretation of that data. What mediates here is, as Friedrich Kittler argued, the network of technologies that actively produces the discourse through which we as subjects speak or remain silent — pointing to what I argue is the kernel of algorithmic power. What is new is how these dispositions are now produced from a real-time statistical commonality model, not from lived, perspectival experience in a community. And here we encounter the politics of data. We lack the vocabulary to found a politics around our algorithmic identities; the vernacular of computer science cannot express this power. If we think politically about algorithmic gender and algorithmic class, we touch on something unknowable and changing. We can't think about algorithmic gender in the same
3:28 pm
way we think about gender. We can't think about race in the same way either. But we can appreciate the changes that happen when these definitions are reconfigured through an algorithmic logic. >> In this way I'm talking about gender, I'm talking about class, I'm talking about race — the trinity of identity — but I want to suggest that it's something more than just the big three, the normal axes of identity. Here is the idea of the algorithmic citizen. This came up for me in 2013, when the Snowden documents were released. In the PRISM documents from the NSA, there was a very fascinating little thing, reported in a Washington Post article, which said that an NSA analyst can surveil a user as long as the analyst is 51 percent confident that the user is foreign. And for me that was a red flag. What does that mean? Normally I thought citizenship is binary: either you are
3:29 pm
or you are not. You hold the proper documentation or you don't. Citizenship happens when you authenticate yourself to the state. This becomes problematic when you are trying to understand who deserves the right to privacy in the digital world. Who deserves the right to privacy is a question that normally requires a body, or a user, or a file with a social security number — some sort of documentation. Online we are just Google accounts, just IP addresses attached to computers moving around the internet. So what the NSA did was create a new index for citizenship, to make their dragnet legal: they created this algorithmic citizen. James Bridle, a UK artist, made a web plug-in called Citizen Ex, at citizen-ex.com, which looks at information about what sites you visit and what sites you
3:30 pm
are coming from in order to determine your algorithmic citizenship. It shows me to be 75 percent American and 25 percent German, and in the parlance of the NSA I would be a citizen. It changes with everything you do: you become a citizen, then a foreigner, then a citizen, then a foreigner again. And the information used to evaluate this is not based on legal metrics; it's based on a weirdly racist and xenophobic conception. If you speak English, you are more likely to be a citizen; if you speak another language, you're more likely to be foreign. >> If your connection is in the US, you're more likely to be a citizen; >> outside the US, more likely to be foreign. >> If you use American communication services, you're more likely to be a citizen; if you use foreign communication services, more likely to be foreign. And even if you talk to people who are believed to be foreign — which might be family outside the United States, or might be somebody who at the moment is seen to be a foreigner — that is a suggested quality of your being a foreigner as well. >> This is the way the NSA
3:31 pm
determines citizenship, but it also lets us understand how the NSA determines who is a terrorist. I'm going to read again. I like this quotation from the epigraph of the first chapter, by Christian Rudder, the cofounder of OkCupid — one of these data-science people who is interested not in critically studying us but in understanding the power of data in producing new forms of identity as well as new forms of research. He writes that the ability to take a real phenomenon and make it something a microchip can understand is, I think, the most important skill anyone can have these days. You use sentences to tell a story to a person; you use algorithms to tell a story to a computer. >> We kill people based on metadata. Metadata is data about data: data about where you are when you send a text message, where that message is sent. Data that identify the time of
3:32 pm
day, the subject line of the email, and even the device you used to send it. This data flows openly through global fiber-optic networks, easily pulled from the ether and connected together, and this data, when processed algorithmically, is spoken for in ways you might not want it to speak. In the quotation that begins the chapter, former NSA director Michael Hayden alerts us to how metadata can be spoken for as if it was produced by a terrorist. That is, one's metadata can be compared against a pattern — a signature, in the parlance of the US intelligence community — and if that metadata fits within the signature of a terrorist template, one might find oneself at the receiving end of a Predator drone strike. This database attack is the signature strike: a strike that requires no target identification but rather an identification of, quote, men who bear certain signatures, or defining characteristics associated with terrorist activity, but whose identities are not known. We might choose to be a bit more specific and call
3:33 pm
these people algorithmic terrorists, identified on the basis of metadata. At the start of the US drone program in the 2000s, a strike would target an individual whom intelligence officials had identified through their voice, their name, or on-the-ground reconnaissance. Then a drone operator would launch a missile at where the individual was believed to be. But in 2008, following frustration with the constraints imposed by military policy, the US loosened its wartime drone guidelines. Now a terrorist is not just who the US claims is a terrorist but also who the US considers a database terrorist. While the US continued to differentiate between targeted and signature strikes, one consequence of the shift was a spike in the frequency of drone attacks: there were 49 strikes between 2004 and 2008, and 372 in the seven years between 2009 and 2015. This loosening of legal restrictions rewrote terrorist into algorithmic terrorist: a pre-identified signature of behavior that the US ties to militant activity. What the US now targets with drones is not just
3:34 pm
individual people but patterns of data. Telephone data that looks as if it belongs to targets the US wants to kill — that is a terrorist. >> Foreseeably, this attitude has misidentified more than one wedding party, and hundreds of civilians have since died as the probable result. Some propose that the obliterated wedding party may be the true signature of the post-9/11 era of American warmaking. These strikes will rarely remind Americans of the war on terror; they remain in distant lands. The unintentional targeting of wedding parties — individuals whose cell phones congregate outside city centers, whose data reads as if it were a terrorist's — produces a condition of permanent uncertainty in the geographic areas where these strikes happen: even those who are not on a US kill list have the potential to be identified as if they were. That precariousness of life is a terror in and of itself. >> This operationalization of terrorist, as an algorithmic categorization of
3:35 pm
metadata, reframes who we are in terms of data. In our networked world, our data selves can have their patterns analyzed again and again, and we are assigned identities without attention to our own historical particularity. As media scholar Mark Andrejevic writes, the signature strike is not interested in biographical profiles or backstories; it does not deal in desires or motivations. It is, in this sense, unburdened by the virtues of narrative — an abandonment of anthropocentric stories in favor of worldly patterns. Yet even without anthropocentric narratives, the data is algorithmically spoken for. We strategically fictionalize. As Hans Vaihinger writes in his book "The Philosophy of 'As If'," quote, the purpose of the world of ideas is not the portrayal of reality but to provide us with an instrument for finding our way about more easily in this world. Importantly, those who use our data to create these ideas
3:36 pm
have the power to tell those stories for us, while we find not our way but their way. >> In this story of discovery, it is data that drives the plot. >> As he described it, our species was putting more of its knowledge out there in ones and zeros than at any time in its existence. We are putting human knowledge out there in a form that is legible to machine intelligence. In a world inundated with data, traditional analysis fails to capture the rich, worldly detail of an NSA wiretap. From this data perspective, making sense of vast quantities of data requires a move away from the targeted individual; evaluating the world as a pattern becomes the logical next step. Of course, an uncritical acceptance that the world is increasingly data-driven might miss the fact that an algorithmic terrorist still looks
3:37 pm
and sounds a lot like who the US has already declared to be a terrorist. >> Both most likely live in the same regions. Both most likely speak Arabic. >> Both most likely are not white, and both most likely practice Islam. >> This construction of terrorism in the US carries baggage that scholars of Muslim American studies have described: the racial and sexual uncanny of the terrorist monster. The monstrous other — a terrorist subject that is inherently violent — links easily to Barack Obama's defense of his drone program: let's kill the people who are trying to kill us. This monstrosity both defines enemyship and expands the conditions for who could be considered a terrorist. Here the truism that one man's terrorist is another man's freedom fighter is reinforced by the fact that this identification always bends toward geopolitical needs. An algorithmic terrorist
3:38 pm
passes through the prism of the terrorist monster, one whose killing is pre-justified; an already dehumanizing protocol regulating targeted assassination can be further dehumanized. A terrorist is only ever going to be a data signature, not a human being. As an anonymous US official told the Wall Street Journal, quote, you don't necessarily have to know the guy's name, you don't have to have a dossier on him; you have to know the activities the person has been engaged in. The legal requirement to identify a single individual is rerouted; the ontological status of the terrorist is rewritten. Rather than being more adept or accurate, the US's algorithmic terrorist is dictated by the logic of simple strategic convenience. It's a functional category, appropriate to the database logic of the NSA. Rephrased in these functionalist terms, the question of who is a terrorist is answered in the logical vernacular of engineering.
3:39 pm
The creator of the encryption software PGP describes it this way: quote, the problem is, they are mathematicians, scientists, engineers. They will find a way to turn these problems into engineering problems, because engineering problems you can solve. The NSA has an incredible capacity to turn things into engineering problems. Knowledge about who we are is constructed on what it refers to as, quote, small patterns of data — or what one political theorist would call patterns of correlation. The NSA's algorithmic terrorist doesn't replace the concept of the terrorist; it adds another layer. The family of terrorist now includes its algorithmic cousins. >> Now I also want us to think about something that really impacted me early on when I was doing my research. I'm always taken by corporate culture — material that is not intended to be used for critical purposes. So as one example, I'm going to show you a recruiting video that Microsoft produced in the late '90s in order to let people know who they were. It uses
3:40 pm
this interesting, sincere kind of mantra of "inside us there is a code." So we're going to watch this two-and-a-half-minute video, then I want to read something about it, and then we will have time for questions. >> Inside us all there is a code. >> A code that represents not what we are but who we might become. It is our potential. >> 25 years ago, a company started based on the idea that technology was the means to unlocking human potential. >> So Microsoft launched a revolution. >> Fueled by the idea that software could open minds and break down barriers. >> People asked: what can I do?
3:41 pm
>> We have inspired them to pursue their dreams. >> Workers asked: how much can I build? We gave them the tools to express their ideas. >> Why compete? >> We empowered them to do more than the previous generation. >> What's the next big thing? We enabled them to perform. >> The more we do our job, the more potential we unleash, and the greater the achievements we inspire. >> Moving forward, we embrace the power of an open world and the unexpected
3:42 pm
opportunities it brings. >> To take on the world's greatest challenges, to advance the achievements of the human race — the potential that has been encoded in us all. >> Microsoft. >> I love it so much. And it's because it is so techno-utopian, but it's also so '90s: Moby in the background, ones and zeros everywhere. But I use this video to begin the conclusion, so we can think about the book as a whole. We are made of data, but we only really are made when
3:43 pm
that data is made useful. This configuration is best described by the technologist's dream of the epigraph — the cyber-utopians who believed this stuff, that we could transcend our bodies and actualize ourselves through technology. Or, more appropriately, our merely rudimentary cyborg selves are laid bare by Microsoft's rhetoric. Inside us is a code; that code represents not just us but our potential; and despite the code being inside us, we find our potential must be unlocked by Microsoft technology. The recruiting video is so wonderful because it transparently celebrates a world where corporate profit goes hand in hand with technology, which goes hand in hand with liberation, which goes hand in hand with the colonization of our own futures by the metaphor of code. One might ask: does this code ever rest, does it have fun, go to a soccer match or drink a glass of wine — or does it only function to work and use Microsoft products? >> The code seems to be the kind that labors at the
3:44 pm
office, to the detriment of friendship, partnership, or possibly a child's birthday. Yet Microsoft's rhetoric only superficially grazes the more substantial consequence of "inside us is a code." If we are made of code, the world fundamentally changes. This codification of life operates at the level of production — how the digital world is being organized; at the level of knowledge — how that world is being defined; and at the level of circulation — how that world is connected. We don't have to use Microsoft products to increase our potential. Microsoft, the iconic corporate monopoly, works here as a critique of a world where technology is not just representing us but functionally determining who we might become. Representation then plays second fiddle, as Microsoft can rewrite the codes of race, class, gender, and sexuality; rewriting those codes transposes their meaning onto Microsoft's terms. We are no longer the sole authors of who we are. Asymmetry defines this type of emergent power: data capital
3:45 pm
classifies us, and those classifications carry immense weight, determining who possesses the rights of a citizen, what race and gender mean, and even who your friends are. The essential argument of this book is that who we are is no longer expressed in terms we can fully understand. Our world has become data, algorithmically interpreted and configured, so I can never say "I am John" with full comprehension of what that means. The concrete, subjective agent John still exists — but the algorithmic John can be controlled. This paradox fits with the double slogan that defined George Orwell's political dystopia 1984: who controls the past controls the future; who controls the present controls the past. To control the present is to control everything that comes before and everything that happens after. But here one doesn't just control the present; one constructs it. Novelist Zadie Smith supplies us with an answer in her 2010 essay on the film about Facebook, The Social Network. She writes that when a human being becomes a set of data
3:46 pm
on a website, he or she is reduced. Everything shrinks: the individual character, friendships, language, sensibility. In a way it's a transcendent experience — we lose our bodies, our messy feelings, our desires, our fears. It reminds me, she writes, that those of us who turn in disgust from what we consider an overinflated sense of self should be careful what we wish for: our denuded networked selves don't look more free, they just look more owned. On Facebook, our selves are not more free; they are more open, and they are owned, because we are now made of data. There's an example from Facebook where researchers have operationalized the very idea of inappropriateness with a feature meant to introduce more decorum into our digital lives. It can evaluate photos, status updates, videos and interactions for what Facebook thinks is inappropriate content. >> When you post something Facebook considers unbecoming, your assistant will ask: this is being posted publicly — are you sure you want your brother or your boss to see this? That Facebook will know inappropriate from appropriate rests on the same assumption by which
3:47 pm
Quantcast knows a Caucasian, Google knows a celebrity, and the NSA knows a terrorist. They don't. All of these are made up, constructed on ontological inconsistencies. Despite this error, the capacity for control is powerful. Facebook's gentility assistant, if it ever gets implemented, will likely have more input in framing acceptable behavior than any etiquette book ever had. These outcomes may unlock your potential, in the case of Facebook, but not in the way that Microsoft wants you to believe. Microsoft suggests a static DNA hardcoded into your system that is just waiting to succeed in the marketplace. Trust me — ask anyone who isn't your parents — you don't have one. One brief look at celebratory capitalism will demonstrate its terrifying competence to colonize our lives with its own logic. What these algorithms unlock is the ability to make your life useful for the algorithms' owners. Imagine a world where a company like Google
3:48 pm
doesn't just organize information but controls how we describe the world. This is the problem we encounter when knowledge is privatized, although not in the way that people usually talk about privatization. Yes, public libraries are losing funding and closing, and even academic journals are protecting copyright on the articles that scholars write for free. That's one type of privatization. But when Google has the ability to make "celebrity," or Facebook to make "inappropriate," Google and Facebook are on their way to privatizing meaning itself. >> That's my presentation. I'm going to cycle through some images from the book automatically, and I invite you to ask any question related to the book. >> Thank you for coming. [applause] >> Thank you. >> How is your book
3:49 pm
structured? >> My book is structured as a sequence of definitions. I start with a chapter on categorization: how it happens, how these ideas get fabricated into identities like terrorist, or woman, or man. >> Then I move on to how they control people, how they produce a relationship of control, using philosophers of control but also a lot of gender theory, to think about how even things we might not necessarily notice exert power over our behaviors and our activities. Then I theorize a new type of subject connected to algorithmic identities. And then I think about privacy: if surveillance ubiquitously prevails, what is privacy in a world where it's something you can't hide from, and what does privacy mean if we still want to use it as a concept? >> I've never been asked that. >> Please.
3:50 pm
>> So I wonder if, in the context of the terrorist example, you can — I guess I sort of have a two-part question before I get to something coherent. Can you maybe talk about the nature of error and uncertainty, and how it differentiates human judgment from algorithmic judgment? I ask in the terrorist context because you said a couple of things about how identifying a terrorist and then launching a drone strike might be different when an algorithm does it versus when a human does it. >> And I think one of the differentiators, the way you talked about it, was the type of mistakes computers might be making. >> One of the things I'm trying to distinguish in my mind is the difference between a human saying: I looked at the data
3:51 pm
on this person and decided with some reasonable certainty that they are a terrorist, which justifies launching a missile — versus an algorithm saying: I looked at the data on this person and decided, with a probability above some threshold, that this is a terrorist and we should watch or strike them. I wonder what your thoughts are on that. >> I think there's something very important to say first: it's not that human decision-making is infallible. We've seen targeted drone strikes fail, and the book is dedicated to one woman who was killed that way. >> In terms of algorithmic terrorism, though, I think there's something useful in understanding the data being used. A human would use on-the-ground intelligence: I know this person is here in this moment because I saw him walk in, I see him through the window. It's getting creepy, but that's how it works. So it would be on-the-ground intelligence, or the interpretation of, say, a drone operator looking at a person, rationalizing and
3:52 pm
identifying them. The data that the NSA in particular uses is often just metadata, so there's no proof that any of it is connected to a particular person; the metadata just suggests they are like a terrorist. That's why a lot of wedding parties get hit: 25 people outside the city center, in a rural area with nobody else around, looks like a terrorist meeting, and because of that, the data is used to kill them. Cell phones are targeted instead of persons. So we've moved away from human evaluation, but there's also way too much faith in the idea that they have a valid algorithm for terrorist evaluation. There's always going to be error, but with signature strikes you see much more of it, precisely because if you want to see a terrorist, you're going to see one anywhere — and if you look only at data and not at on-the-ground intelligence, you're more likely to see one with a drone attack. >> Is that a good enough answer? >> [inaudible]
3:53 pm
>> killing a wedding party versus, i don't know, the errors that are typically made by a human intelligence officer. i imagine the type of people who are accidentally killed in that case are different from wedding parties. >> that's a good point -- you're going to have different civilian categories. overall, the civilians killed in targeted attacks tended to be people affiliated with the family of the person who was wanted, while in signature attacks it's often people who are entirely unconnected, or who maybe just have bad luck. after the taliban realized the u.s. was following the metadata from their cell phones, they would meet every week, put all the sim cards in a bag, shake the bag, and reallocate the sim cards to disrupt the pattern
3:54 pm
being used. so imagine you pick up the sim card of a person the u.s. is following and then you meet with two people outside the city center: suddenly it looks like you are the one. other questions? that was a very good one. >> one thing i've been noticing a lot on facebook, which maybe you've seen -- i've had a couple of events recently -- is the memorialization on facebook of people who have died. i'm wondering what your take is on that datafication: how do you interpret that datafication of memory, or of people's deaths, given your work? >> that's a deeply philosophical question, and my book is very much about this. for example, i think the facebook reasoning in identifying these events is to fix the problem of: what do you do when this useful user is no longer useful, when they might be deceased? i think the memorialization is interesting because the way to memorialize is set by facebook's terms, according to facebook's structure. so even if you
3:55 pm
might want to commemorate online, you have to do it through a facebook venue, and that gives facebook the ability to monitor you, to manufacture some sort of profile according to whether you like that stuff, whether you comment on it, whether it increases your engagement with these other people. it will determine how close people on facebook are to each other. so it lives in this weird -- i don't want to call it commercialization, but this weird relationship that is useful for facebook. you would think that someone's death would be a safe space away from classification, but because it's on facebook, everything is commodified. >> other questions? >> i want to get back to the point about drones and human decision-making. we lived, 35 years ago, in a world of reputation in which people could tell information
3:56 pm
about you as an individual that you didn't control. they interpreted a form of you -- as a gender or whatever -- something similar to what happens with algorithms now. so you have two networks: a human network of reputation, and a data network which is embodied in silicon. what do you think about that? >> i actually use althusser as a model to understand what subjectivity is: how do we understand a relationship to power that might not be immediately reflective? the normal assumption is usually theorized through interpellation: a cop says 'hey, you,' and you recognize yourself, and you face the power, and you become a subject because there's a hailing of you as a person. but how do we exist in this world where someone might just see you across the street and think, that person doesn't look too good, or that person is whatever? what we understand as gender and race has a lot of import on us even outside algorithms. the big thing that's different is that with gender and race in the social world
3:57 pm
there are ways we can talk back. gender is a social construct, and that construction has to proceed through us. i reference this in the book: george costanza stands up yelling, 'we're living in a society! there are rules here!' which suggests we have a participatory quality, even if it's oppressive and might affect us in ways we don't choose. conversely, something like these algorithmic categories is proprietary -- they are owned, in this case, by comcast -- so 'caucasian' here is never going to interact with racial privilege. it is going to be a commercial category, and they want to keep it secret. they want to make it not about the social construct of race; it is going to be about a functional category, comparable to 'terrorist.' if they exposed the model, the way the nsa's has been exposed, it would make it available to be hacked, available to be gamed. is that an effective answer?
3:58 pm
i like the question. yes, please. >> i'm curious: when it comes to the construction and deployment of state power around drone strikes and terrorism, how much of the use of algorithms and identification is essentially to remove the subjective actor, the human subjective actor, and also to take away the blame being placed on an individual or an organization or institution? essentially: we are going to engineer a problem and a solution that nobody can take blame for, so the state can do what they want without having any blame. the second question is a more lighthearted one. i'm sure you've seen the movie eagle eye. can you comment on it? >> no, i haven't seen it. >> you should, because it's perfect for what he's talking about. >> i think with state power it's compelling, because we saw during the obama administration, and during bush
3:59 pm
as well, how they reconfigured what a 'militant' was in order to facilitate legal death. so if they bombed people and killed 18-year-old men, even men only loosely affiliated with terrorist groups or simply living under the auspices of terrorist groups, those men were counted as militants precisely because of their age and gender. it is one way that categorization allows us to avoid the hard questions of who deserves death and who doesn't. in terms of the algorithmic side, i think there are ways that even drone pilots are forced to distance themselves. as for the legal apparatus, i would think
4:00 pm
algorithmically, it is going to confound things. the existing legal framework treats a citizen flagged as a terrorist just like a foreign terrorist -- a target. having defined targets by signature, there is no way to maintain a difference between the two; they are handled as if they were the same thing. and they increasingly use more data-driven patterns to facilitate it. as you have heard from yemen and pakistan, families are upset: their parents were killed, they did nothing to warrant being killed, and there are no repercussions, no way to get back at anyone. other questions? yes? >> you mentioned heidegger in
4:01 pm
the final chapter, and he famously said democracy is only a half measure against technology. i am curious whether, in your conclusion, you think democracy could be a full measure against big uses of technology, or less than a half measure? >> in the past 5 or 6 months i have become -- not hopeless, but cynical about the prospect of democracy, because i am looking at a lot of examples, especially from the defense department's minerva project, where they are trying to understand how to upend and create revolution in foreign countries not through cia agents or covert action but through using data to profile people and create an idea of when a state is vulnerable: at what point could an uprising occur? there are firms that predict when civil unrest will happen, and
4:02 pm
there is a market for it. the fact that this stuff is so attuned to populations moving around reminds me of an example in the book about a usaid program: a fake twitter, made by a shell company in the cayman islands, that would bring cubans onto it as a twitter-like service over their phone connection and have that data saved to a database. people in costa rica who were employed by usaid would categorize users as pro-revolution or anti-revolution, so that at one moment they could send an update to bring people out onto the streets against the castro regime in a 'cuban spring.' that is the foreign-policy way it works, but i know from different reporting that a lot of people say this kind of profiling was part of the
4:03 pm
brexit campaign too -- how a lot of democracy can be curtailed by something like the filter bubble. there is no necessary public sphere happening, just fragmented commercial pockets of news that keep you on the site longer, versus actual civil discourse. at the moment i am not that optimistic about the future, but hopefully it will change; i am hopeful we will do something about it. other questions? >> the question is, what do we do? >> one chapter is about what do we do. i try to think of privacy as a different idea. i take writing from 1880 by thomas cooley, at the university of michigan, who theorized the idea of the right to be let alone.
4:04 pm
that was taken up in 1890, ten years after, as 'the right to be let alone,' because -- i learned this a couple of weeks ago -- brandeis was having these parties and people would take photos from the trees of his parties. he himself was of high society and wanted privacy, so the question was how to curb a new technology's invasion of the private sphere. it was even more provocative because he said we should have a legal defense against not just somebody striking us but somebody making us recoil: when somebody raises their fist to strike, you tense up, you worry about what will happen next, and in a civil society we should prevent not just the assault but the stress before it happens. the idea of being let alone is provocative, and through it i started to
4:05 pm
think about how the algorithmic stuff produces a mode of control that is moving your opinions and curtailing potentially aberrant behavior toward normal behavior. what is the right to be let alone from these manipulating forces? the liberal right to privacy -- shut my door -- doesn't work, because even if i go home, my computer is being watched. my phone is in my house, and if somebody comes to my house and our phones are together, verizon might know we are in the same house and assume something based on that. so i follow two scholars who suggest that instead of protecting yourself through privacy, you practice obfuscation: you produce enough noise that there is not enough signal to suggest that is who you are, because most of these patterns are based on 'i
4:06 pm
know you are male with 92% confidence' -- that is the internet marketing association's baseline stat. if you can keep it from reaching that 92%, if you are believed to be a gender at only 50%, the algorithm believes you are outside its apparatus. for me, i am hopeful about that strategy, but it is a cat and mouse game. a lot of the questions are: how do you engage in freedom, or practice freedom, where everything is being watched, where everything might be seen by the state? and a lot of the answers are: this was good four years ago but is no longer good; today it might be good but tomorrow it might not be. it is like producing a paranoia, in the best way. other questions? yes? >> my question is about the book writing process, and i am doing the same
4:07 pm
thing. does it get better? at what point did you realize there was a light at the end of the tunnel? >> for me, it was when i had the first draft of the second revision, when i found not just the content but the tone and voice i wanted to have. because what i had been trying to do was parrot a scholar, to parrot a kind of tone, instead of just being myself. once i gave myself that acceptance, i had the tools to do it, and i was able to go from the beginning to the end and see the lines that existed across the
4:08 pm
chapters. it took several years, which is a horrible thing to say when you are deep in the process, but it does get better, as anyone will tell you. other questions? >> i don't know if you talked about it, but what do you think about the technologies being offered to make noise? where do you see that going? will they be advanced enough, or will the companies keep fighting against them? you have programs that run a lot of google searches for you, that generate noise for you, but then google learns to recognize that noise, so it is always going to be cat and mouse. >> one program i forgot to mention was produced by nyu researchers. it takes headlines from
4:09 pm
the top newspapers, and every 6 seconds it sends a random search query drawn from that word bank, so your real searches are hidden in the cacophony. when i was writing the text i used it as an example; i ran it against aol.com. but to the extent the noise can be predicted, it can be filtered: the search engine can go to the same sites the plug-in is using, see what data the plug-in is accessing, create the same word bank, and parse incoming searches to see if they are legitimate or not. there are more things you can do, but it is cat and mouse. we shouldn't rest on our laurels. we want a quick fix.
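The NYU program described above sounds like the TrackMeNot browser extension. The sketch below is a loose, illustrative reconstruction of the mechanism as the speaker describes it -- a word bank seeded from headlines, a random decoy query fired on a timer -- not the extension's actual code. The headlines, function names, and term counts here are all invented for the example; a real tool would refresh the headline feed continuously and submit each decoy as an actual search.

```python
import random

# Invented seed headlines, standing in for a live feed of
# top-newspaper headlines.
HEADLINES = [
    "markets rally as rates hold steady",
    "city council debates transit funding",
    "storm season arrives early this year",
]

def build_word_bank(headlines):
    """Split headlines into a bag of plausible query terms."""
    words = {w for h in headlines for w in h.split() if len(w) > 3}
    return sorted(words)

def noise_query(bank, rng, n_terms=2):
    """Compose one decoy search from random word-bank terms."""
    return " ".join(rng.sample(bank, n_terms))

rng = random.Random(0)  # seeded so the sketch is reproducible
bank = build_word_bank(HEADLINES)
# The real plug-in would run this on a timer (the talk says every
# six seconds) and send each decoy to the search engine.
for _ in range(3):
    print(noise_query(bank, rng))
```

As the answer notes, this is cat and mouse: a search engine that reconstructs the same word bank can compare incoming queries against it and discard the ones that match, which is exactly the counter-move the speaker describes.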
4:10 pm
there is a propensity for that: i want it all to be settled and okay. but it is almost like we are put into this world of liquid modernity, where everything is a changing idea and you can never be sure, because everything is moving so quickly. that is becoming more and more the case, something we are not quite able to adapt to, but we are getting better. >> how do you create awareness around obfuscation practices? in an academic setting you can talk about it. >> a lot of my students in the classroom are agnostic about a lot of these things; it is hard for them to understand why they should care -- 'i don't care if they are targeting me.' so what i bring up, in class as in these interviews, is
4:11 pm
orbitz.com, which used things like your browser and your window size to front-load a more expensive plane ticket or hotel because you seemed to be a wealthy person. speaking of obfuscation, how do you avoid that? no one wants to pay $50 more or have differential pricing be the case, and then they start thinking about how uber does the same thing: going from one wealthy neighborhood to another you get a higher price, and going from a poor neighborhood to a poor neighborhood a lower one. there is a suite of class-related problems that come up with this, but with obfuscation there is something you can put into day-to-day life: you think about it, and you save the $15 by walking a couple of blocks over. outside the classroom, when i talk to a lot of people at readings like this or
4:12 pm
interviews, there are the rollbacks that have happened, with the trump administration rolling back fcc guidelines so that providers can take 100% of your internet traffic and use it for surveillance or marketing. a lot of it is going to be a legal question: how can you limit a company's ability to have 100% access to your communications? google flu trends ran from
4:14 pm
2009 to 2016. what were people searching in ann arbor when there was a historic flu epidemic? it predicted, 6 weeks before the centers for disease control, when a flu epidemic would come, because people weren't going to the doctor yet but were searching for their symptoms, or nyquil, or something like that. there is a slide that shows, more or less, that they created a bucket of 140 terms that seemed to be good indicators of flu activity. internationally, they figured out how to predict, through this big-data learning environment, when outbreaks would happen in certain areas, and could tell local hospitals or schools to have kids wash their hands or be aware, because they might have the ability to shut an epidemic down. there is a lot of good that can happen. larry page said if they were
4:15 pm
allowed to mine healthcare data they could save 100,000 lives, which is a lot, and it is also potentially useful to think of how we could understand, say, handwashing patterns that wouldn't have been visible through a small sample but are visible through a database-scale sample. maybe algorithms can do real good there. >> doesn't that make the case for the google home device and embedded skin sensors? >> i knew somebody would say this. it does, in a way, if it were regulated, and i would be more a fan of anonymous, voluntary contribution to a large data set not owned by a large company -- a company that, by the way, took away its 'don't be evil' mantra a couple of years ago. google no longer has that. but that is a good point. a lot of the companies that have access to these data sets have
4:16 pm
real reasons to keep them private, and also reasons to keep their use asymmetrical. we should be thinking about how this data could be put to broader use. >> i want to follow up on part of your response about what we should do, where you talked about obfuscation -- sending out noise activity so that various algorithmic venues don't categorize you. i'm still stuck on what we should do. as the world becomes more technologically advanced, digital personalization becomes an essential part of life. there will be less and less of: i like this green thing and it will always be green. instead there is this thing that can come in a billion forms, and the way it comes in
4:17 pm
the form is based on data collected from everything we do in our lives. if you send out obfuscating data, doesn't that, somewhere down the line in a more technologically advanced world, basically create a disharmony between your true self and your environment? what if the strategy you employ to confuse these systems results in something akin to not having a smartphone -- you are on the fringe of society? >> there is a minimum of technological stuff you have to engage in authentically to maintain competency, to get a job and things like that; it is part of being in society. people ask me about being on facebook: why are you
4:18 pm
doing this? you need these structures, problematic as they are, to be a member of society. i'm also a fan of a lot of these technologies; the privacy problems are extraordinarily rich and problematic, but i am keen on what they can do. in a class i teach, we look at the ways companies like facebook try to keep you in their world as long as possible, by giving you advantages, keeping you in facebook in order to facilitate a good relationship, so the algorithms can understand you and supposedly serve you better. i am wary of those, because it is not for you; it is for them, to keep you in sight. it is like: i will give you $50 to open a checking account, but the bank fees are $200. eventually you get screwed over.
4:19 pm
i do think the question stands: how do we understand social obligation, or social use, at the same time? by putting a lot of obfuscating information in, you are ruining the experience for everybody: we could have had a cure for cancer, or caught a lot of jaundiced newborns -- that is an important health issue -- but some people were being idiots. it is something to talk about. to go back to the initial question of what is new about this, the important part is who controls these categories. i'm not against algorithmic gender per se, just the unidirectionality -- the black box society where we don't have the ability to talk back, or to make these categories social and about how they affect the public. okay. >> other questions?
4:20 pm
>> okay -- thank you all for coming. [applause] >> every year, booktv asks members of congress what they are reading over the summer. here is a look at some of the books house speaker paul ryan has on his summer list. he is reading washington: a life by best-selling biographer ron chernow, an in-depth look at the first president of the united states. also on his list is strong fathers, strong daughters by pediatrician dr. meg meeker, who highlights the importance of the father/daughter relationship. >> booktv wants to know what you are reading. send your summer reading list via twitter at booktv or instagram at book_tv, or post it to our facebook page, facebook.com/booktv.