
tv   Key Capitol Hill Hearings  CSPAN  September 7, 2015 10:00am-11:01am EDT

10:00 am
live from austin with the texas book festival. at the end of the month we will be at the wisconsin book festival from madison at the end of the weekend. on the east coast, the boston book festival. in november we will be in portland, oregon, for wordstock, followed by the national book awards in new york city. in november we are live for the 18th year in a row at the miami book fair international. that is a few of the book festivals this fall on c-span2's book tv. >> now, a discussion on what data is being collected about you online and how it is used. it was the focus of a forum in seattle with technology and policy experts from homeland security, the justice department, amazon, and zillow. this is about 30 minutes.
10:01 am
[applause] >> it is my pleasure to welcome you to tonight's event. it is about how technology impacts civil liberties, a topic that i tackle in my own work at nyu law school and at cdt. a recent book entitled "intellectual privacy" argues that our most basic privacy is the freedom of thought: to ponder, write, talk about new ideas, including ideas that are unpopular or controversial. it is at the core of our civil liberties. without it there is little substance to other cherished liberties: freedom of speech, but also the right to vote, assemble, protest, and influence government decision-making. technology has an obvious impact on the freedom of thought. where we once relied on a small circle of friends and acquaintances
10:02 am
to try out ideas, we are now hard-pressed to imagine intellectual activities of any kind that do not depend on the internet, search engines, large databases, the list goes on. in light of these and other new technologies we clearly need a discourse about how technology impacts our daily lives and civil liberties. i would like to thank town hall for this expansive slate of civic issues, a model i would hope to see in more cities across america. cdt is a perfect partner for town hall in this endeavor. cdt, in my completely unbiased opinion as a board member, is the leading american nonprofit translating democratic values for the digital age. you will get a chance to hear from its president, nuala o'connor. what most impresses me is her
10:03 am
ability to find her way through the issues in this combative landscape in washington, d.c. this is genuine civil discourse that must include the citizen, government, and business sectors. before i introduce your moderator for tonight and turn the program over to our incredibly distinguished group, i briefly want to address one issue that i believe is among the most important as we consider the impact of technology on our core civil liberties: the issue of voter privacy. it is now commonplace for political campaigns to assemble and maintain extraordinarily detailed dossiers on every american voter. these contain hundreds, even thousands of political data points. not surprisingly, some of this information is derived from voter registration records, but it also brings in data from commercial databases.
10:04 am
they use cookies and other commercial tracking methods to profile their supporters. did you know that in the last presidential election both campaigns hired marketing firms to help them match cookie pools, databases of active and targetable cookies, with political dossiers to identify the most swayable voters and send them targeted ads anywhere they might show up on the web? or that the obama technical team developed a targeted sharing outreach program that matched political data with facebook profiles so that obama supporters could persuade their friends to vote for obama? the jury is still out on whether these techniques are effective, but many americans might be shocked to learn how candidates target them without their knowledge or consent. in my own work i have raised two fundamental questions about these practices. first, does this relentless profiling and micro-targeting of
10:05 am
american voters invade their personal privacy? second, how do these new practices affect democracy? many have observed how privacy, understood as a legally protected zone or preserve in which individuals can think their own thoughts, have their own secrets, live their own lives, and reveal only what they want to the outside world, is vital to the working of the democratic process. so, what happens to democracy when campaigns routinely invade this zone by subjecting voters to a form of surveillance in which their preferences are tracked? this is just one of many profound questions we need to address when thinking about the implications of new technologies. there are other things to consider, ranging from discriminatory practices and the surreptitious collection of data and use of data analytics to so-called predictive policing, with a move
10:06 am
towards police departments identifying and stopping suspects based less on specific information about an individual's conduct at a time and place, and more on algorithms that claim to reveal imminent or future criminal conduct. i leave it to the panel to take this discussion forward. it is my pleasure to introduce tonight's moderator, jenny durkan, and the rest of the panel, who will be seated as i complete introductions. jenny durkan, washington native and graduate of uw law school, served as the united states attorney for the western district of washington through 2014. in this role she was the chief federal law enforcement officer for western washington, responsible for criminal prosecutions and coordination of various federal investigative agencies.
10:07 am
she also chaired key department of justice committees and workgroups charged with crafting strategies for cybercrime, intellectual property enforcement, and consumer privacy. her leadership for the doj on these fronts included testifying before congress on national cyber issues. she has continued to practice law as a partner in the washington, d.c. office of quinn emanuel. please join me in welcoming jenny durkan. [applause] ms. durkan: thank you very much. thank you for that great introduction, but mostly for your leadership in this space. you have been a true national leader on many of these issues. and we will not be tracking you all night. [laughter] just most of the night. what a great pleasure it is to be here at town hall. this is one of the best forums we have in seattle to talk about these things.
10:08 am
i could not be more honored to moderate this forum on what is one of the most critical issues that we face as a society. i think that the intersection of technology with all parts of our lives will raise fundamental questions about the role that privacy plays in our lives and what is important to protect. i could not ask for a better panel to do this with. to my immediate right is ryan calo. he comes by way of stanford and is probably one of the leading people in the areas of privacy and robotics, just an exceptional thinker. when i was u.s. attorney and i wanted to turn to the people i looked to in academia to figure out those complicated questions, i would regularly have him on my speed dial. thank you for being here. [applause] and we are really lucky to have raquel russell, not just on this panel but also as a recent relocation
10:09 am
from the other washington to this washington. she has really focused on urban issues. when i had the good fortune to travel back and forth between seattle, d.c., and the doj, whenever there was an issue related to this, people would say -- there is this woman in the white house who can really help you. she relocated to seattle and is now at zillow. she can really look at the policy implications around these issues and we are fortunate to have her. [applause] and the woman who we could spend a lot of time introducing, nuala o'connor, has accomplished so much, you would think she has got to be 80. [laughter] she is the president of cdt and is taking it to the next natural iteration, to be a thought leader and one of those places where all people can sit at the table to
10:10 am
decide the right policy initiatives to undertake in these areas. she has got great experience at one of our great companies here. she took her experience from that sector and applied it to homeland security. she has seen the range of issues. cdt is one of those places -- i was on a number of inter-agency panels in washington and they were trying to decide how to broker these tough issues in terms of what access government should have to technology to intercept communications and the like. no one even talks to each other sometimes on these issues, and cdt plays a critical role in letting people broker those things and having this real vision of the importance of technology, so thank you for being here. [applause] hopefully this will be a kind of conversation style, but when we kick it off i want to ask
10:11 am
different questions, bouncing back and forth. we will reserve some time to take questions from the audience. thank you for being here. we know it is warm. one of the first things: this is not jeopardy. i don't get to have the music. but first we will do a lightning round. i am going to ask each of the panelists to name what they think is one of the most exciting new technologies, and then also say whether they think the biggest threat to privacy is coming from government or from private enterprise. so, let's start with the most exciting new technology. prof calo: you will not be surprised to hear that i think that robotics is the most exciting new technology. [laughter] i will give you my thinking on that. if you think about our time now on the internet and you think about how the internet came to be so dominant, think of its history. arpanet, a u.s.
10:12 am
military initiative that wound up funding the initial internet, and then it was a public-private partnership. that very same group of people have moved over to darpa and are investigating robotics heavily. i think we are on that same initial curve. >> i'm fascinated by the quick advancement of social media, generally. facebook started barely a decade ago, in the blink of an eye, and now we have snapchat, periscope. the way that the newer generations are using these social media platforms to connect is a fascinating thing. it is so hard to keep track of. >> at cdt we are interested in technology and your daily life. i will pick not only a technology, but a device. as a single working mother, my
10:13 am
hands are literally and figuratively full. i wanted to set a timer for what i was cooking, and i thought, i need my amazon echo. who has one? thank you. in my completely unbiased opinion -- i have a little bit of bias, having worked on the product when i was there -- it is the kernel of your star trek computer for your home. if anyone is a big star trek fan, as i am, my vision for technology is a positive one. we will move towards integration and tolerance, fueled and supported by technology that supports democratic values, but on a more literal level, i love this device. it is a great home manager. it talks to me. at least there is someone at home that will listen to me now. it has never happened before, right? i commend to you the privacy
10:14 am
policies; i had something to do with those. ms. durkan: i will say that the thing that has changed my life the most recently is the starbucks app. it is critical for survival, being able to skip the line. i said i would talk about the government, but let's first talk about what is, in your mind, the single biggest thing -- you can say more later -- with the biggest potential to erode our privacy without us knowing it? >> anything? the creeping intrusion, or presumption, that all of your data should be easily accessible by your government, federal or otherwise. i have worked on this issue of the digital self, the sense that one has boundaries between you and the other, you and your institutions -- religious, educational, governmental. and i believe that we have gone
10:15 am
too far in the balance of power between the self and the state. my biggest concern -- yes, there are real concerns about governments, companies, we can talk all about them -- but this erosion of the sense of a clear boundary around my digital self in relationship to my government is very concerning to me. if you look at the history of list making and governments knowing who you are and what you are thinking -- i love that reference to richards' work -- i need to have a safe and sacred space, a place where i can express myself without assuming that it will end up in the hands of the government. >> raquel, what do you think? ms. russell: it is the culture of what is ok to be shared and not ok to be shared.
10:16 am
it is constantly shifting, and shifting at a high rate at this point. to me, as that baseline shifts, the question is: will policy be able to keep up with the cultural shift? prof calo: mine is much more pedestrian. i think that facial recognition changes everything. i don't know if you guys noticed, i took a flash photo of you. i apologize for that, and i am going to tweet it later and say that it went well at town hall. if you think about the fact that facebook has technology that can recognize your face, even if it is obscured, suddenly the government can now scour the entire internet and private databases and just find you. find you where you are. even if you yourself did not upload anything. even if you took pains to obscure your face. i think that facial recognition is a big, big deal. i know it is something that cdt has worked on quite a lot.
10:17 am
>> good segue. one thing that is really important is the surrender of yourself to government and what hangs in the balance. i was at a panel once with brad smith and i said -- look, brad, you want us to just have to buy it from you. so, what is the bigger threat to personal privacy right now? the information that government is collecting? or the information from every device that you use that gets aggregated? prof calo: the classic arguments are two of them. one, the government has a monopoly on violence. no matter what facebook does to you, it turns out they cannot kill you or throw you in jail. but i wonder if that is, at the end of the day, not just disingenuous but whether it is a full
10:18 am
enough answer. corporations can make decisions that affect your lives in deep and wide ways. it is conceivable. the second argument that you always hear is that from corporations you understand their incentives -- they just want to make money -- whereas the government has this mercurial, mixed incentive where you are not totally sure how they are going to do things, perhaps differently in a new regime. personally, i don't know which is the greater threat. >> i will selfishly say, as someone who was a government employee for over 10 years, and with the recent data breach for many government employees, i would have to say the government. it is really scary to think that they have so much information on me personally, given the job that i had and the information i had to provide to get those jobs. for all of those government employees whose information was stolen, that's a pretty scary thing.
10:19 am
i'm going to show my age here, having worked at three major companies i can remember right now, but the answer without a doubt to me is the federal government. let me tell you why. i think you are right, we have all talked about it for a long time. i was doing deeper thinking earlier today, knowing that you would ask this, and the flippant answer is because the government can put you in jail, because of the monopoly on violence and deprivation of rights, liberty, access to services, and so on. your recourse against government activity invading privacy is to vote with your feet and change countries, which seems to me a very draconian response. your recourse against a company's practice is voting by not doing business with that company. but there is another answer as well that i had not thought of before, and this is not to let
10:20 am
companies off the hook. some are doing good work, others are not, but there is a government backstop. when there is an outlier or misbehavior by a company, there is the ftc, the state attorneys general. i have been on the receiving end of 12 attorney general investigations, 21 class actions, and an ftc investigation over a data merger that never happened. i certainly know that in a lot of the rest of the world there are not only company and market forces and customer distrust, but there are government backstops and legal actions that can be taken to protest company behavior. but again, i really do go to the deprivation of rights and liberties. you are absolutely right, the opportunity for hidden decision-making and opportunities lost by people not knowing about the data presented about them is very real, and those are just some of
10:21 am
the issues we are working on at cdt. prof calo: by way of support on those things, those companies can make promises to you about privacy. they will not do x, y, and z. and if they do what they said they would not, the federal trade commission will enforce it. what the ftc is not going to police, however, is getting your information to the government, which some companies say they don't do, right? whether they fulfill that promise or not, the government is not going to police against it. give it to the doj, it's fine. the ftc is not going to enforce a promise by a company against itself, essentially. >> just to build on that violent agreement all night long -- and we should mix it up and probably will -- the opportunity for an individual, again, to correct that wrong or
10:22 am
complain about that -- it is very hard to make that case to the government, as you point out, when they have got the data and it is not known and not transparent. i was also a victim of the opm breach, as many of us were. the compelling issue is the one of government collection of data and the impact on your relationship to the government and the company. it is the bleeding and blurring of lines, the potential lack of clarity or boundaries around private sector data sets that were collected mostly with your knowledge. that would be a big theme in your direct-to-company advocacy. it is relevant to your relationship to the company, and it is the blurring of the lines that erodes trust in the companies as well. that issue of government data collection erodes the whole ecosystem of the government and
10:23 am
trust in my ability to transact with companies and speak my mind. ms. durkan: i'm going to moderate and jump in a little bit. when i was chief federal law enforcement officer, the government used its powers to collect information in the criminal setting. i really believe that the generation coming up is different, as raquel says, from my generation in a number of ways and for a number of reasons. i think they are being conditioned from every angle to not care about privacy and to think that privacy doesn't matter. i think that that is due mostly to the monetization of the lack of privacy. that's where i think that maybe it is not the actual collection of data, though we will talk about that later as i think there are a lot of dangers, but it is the constant mantra of -- privacy doesn't matter. we are going to talk about this in a little bit, but everyone
10:24 am
sitting on this panel knows that people do not read privacy policies before they click on i agree. people do not know how their information is being used; it's not transparent. it is like the days of the 19th century when there were contracts of adhesion, and they changed it to not have the small print. i think that they will change that as well. government needs to be circumscribed. we always have to fight that battle. but we always have to keep an eye on what private industry is doing. what can they do and what should they do? we have talked a lot about interconnectedness. you drive up to a toll and you have your automatic pass. you drive across the bridge and it automatically sends something to your bank. everyone is connected. it is more difficult to do things today and not leave a digital trace.
10:25 am
do consumers really understand what information is being collected about them? ms. o'connor: no. i don't want to say that the consumer is not knowledgeable, but it is so hard to keep up with the level of technological advancement. still, i am more hopeful, despite what i have seen in government and the private sector -- i think i am the most hopeful voice on this panel, for a number of reasons. i think there are people in the government and private sector both trying to get this right. good companies are simply sparking the dialogue about customer expectation and trust; they are doing the right thing and trying to get this right. i agree with your point about privacy policies completely, having written more of them than anyone in this room. i am the mark twain, the faulkner, the author of privacy policies no one has ever read. but this is not to say that the fair information practice principles and tenets espoused in this country are dead, but that they need to be reworked
10:26 am
and reimagined for the digital age, for the internet of everything and everywhere. i will go back to my amazon echo example, because i am so proud of it, having worked on it. [laughter] can you tell? the only choices are alexa and amazon, but nevertheless i want to call it something inappropriate for this audience. i want it to have a name, like my personal assistant. but that device has privacy protection not only built into the hardware and software of the programming, but literally in the device. it lights up when it's listening, when it is actively collecting data. there are other ways to signify those nonverbal cues. look at the airport signs that say baggage. i was pointing this out to my five-year-old. he cannot quite read yet, but he can read the sign that says airplane or luggage or whatever.
10:27 am
the theme that applies here is that we can no longer rely on 14-page privacy policies stuck at the bottom of a website, or those papers that you get from your bank at the end of the year. i have written those too, by the way, early in my career. no one reads them. having this technology embedded in the device -- these are exciting opportunities. ms. durkan: we need to stop and talk about that age thing. i don't agree with the conventional wisdom that young people today do not care about privacy. ms. russell: i'm going to jump on that and say that i completely agree with your last statement. i think that people and consumers do know about the data that they are giving up. how can you not when you are on gmail? you get targeted advertising based on an e-mail that you sent four hours ago. you know that it is being consumed. maybe we don't know the extent as the average consumer, but i think that we know that, for
10:28 am
younger generations, including myself, we weigh the pros and cons, and the ease that technology brings to my life outweighs the fact that you are collecting data on me. that is a fact for myself and for many of my friends and peers. the ease that those technologies bring completely outweighs the concern. turbotax is a great example. i am not 100% convinced, nor have i ever done any kind of due diligence to make sure, that all of that information given to turbotax is going to be safe, but i would rather do that than fill it out by paper and pen, so i do. ms. durkan: ryan, what do you think? prof calo: i get beat up a lot on whether privacy disclosure notices work. i wrote a paper a few years ago that argued against the skepticism that says notice never works. ms. durkan: that is the short version. [laughter] prof calo: i argue that we should be doing more of what
10:29 am
nuala said, where it is built into the product, like it is in the echo. there is a lot of opportunity for that. i also think that there is a lot of opportunity to have a privacy disclosure for the more sophisticated entity that is very full -- so, for the cdt, the ftc, the journalist -- and then have more visceral notice for everyone else. i think we should be taking a two-track approach to notice. i will say that at the end of the day, and i hope we talk about this more, it is really about asymmetry: the fact that the company knows an unbelievable amount about you and you hardly know anything about them. there is not only information asymmetry, but there is asymmetry about the asymmetry. they know that they know more about you, and you cannot even
10:30 am
begin to scope how much they know about you. that asymmetry matters and i want to talk more about that. i am sort of hopeful about the potential of smart design to convey information. ms. durkan: how much do we leave to the market and how much do we leave to the law? that is always a tension of balance, but at what point should consumers have some absolute right to know, for example, what has been collected and aggregated, and to be able to look at it and edit it? what do you think, ryan? prof calo: i think that i agree with commissioner brill at the federal trade commission. consumers should have much more access to the information in the hands of data brokers. i commend you to her speeches about the reclaim your name initiative. she is using a mixture, in her
10:31 am
position as a regulator, of the megaphone but also sort of the idea that if we do not get this ability to access our data, at some point the federal trade commission will come along and do something about it. a lot of competition on privacy happens against the backdrop of the threat of regulation. it is very difficult to say precisely the effect; it is not just the actual act of regulation but the possibility of it, and the people who have regulatory authority, who in turn have soft power because of that regulatory authority. that did not at all answer your question, but i'm glad i said it. [laughter] ms. durkan: we are here to talk, not answer questions. ms. russell: i do think that there needs to be a checks and balances
10:32 am
system, but i don't think we can rely entirely on government. government does not move as fast as technology. if we rely on government to create policies to protect our privacy, especially in the technology space, i don't see how that works. the technology is constantly changing. i just don't see how policies will be able to keep up with that to the full degree. i don't think that we can rely wholly on policy and government in order to protect our privacy. maybe this is the idealist in me, but i am a believer that education is key in this: educating consumers on the privacy of the data being collected on them, and on what we would maybe consider privacy infringements. they might choose to say -- that's ok. to some degree the market will play a role in that. there is going to be a checks
10:33 am
and balances there. i don't think that all consumers will be as concerned with the conversation and issues we are talking about today, but i do think that education of the consumer will be really important. ms. durkan: just to understand you correctly, education can keep up with technology, but the law cannot? ms. russell: i'm not sure that anything can keep up with technology. i'm only six months into working in the technology sector, but to say that things move at lightning speed compared to the government is an understatement. i don't know how government and laws alone can keep up with the pace of technology. ms. o'connor: i think that education of the consumer by the companies deploying these technologies is key. corporate social responsibility in the digital world means the responsible use of data, the responsible use of
10:34 am
information as part of the transaction with the customer. i saw this first in the dark ages, before any of you were born apparently, in the early days before the first dot-com bust, at doubleclick. ms. russell: it was single click. [laughter] ms. o'connor: exactly. it was dial-up modems when i started. senior leadership had gotten themselves into hot water over a potential merger of offline and online databases. again, it didn't matter, because the offline data was on punch cards and there was no way to match them. i know that you all get what a cookie is and what tracking looks like, and that that is fairly benign because you are trying to serve people ads about, like, tennis shoes when they are on the sports illustrated site, which is not that different from a contextual ad in a magazine, but it freaked people out. my opinion is that if there is fear and loathing of the technology, and you want to be the promoter and promulgator of a new technology, think not
10:35 am
only about how to use it responsibly but also educate the consumer so they can make informed choices about what they are doing. this is not to argue only for self-regulation. i could make those arguments all day long. but there is absolutely a need for the ftc in general. i have come full circle on this, on federal privacy laws in this country, because the market is not just this country -- the market is global. be thoughtful about where we are in the ecosystem of privacy: we are not even on the playing field. right? we are considered the great unwashed in the rest of the world on how we treat individual customers' data. i was just telling this story recently. i last gave a speech at the international commissioners conference 15 years ago.
10:36 am
we have a lot of u.s. privacy laws: the children's online privacy protection act, the video privacy protection act, and if you go through each one it takes you a long time, especially speaking slowly for a multinational audience. it did not go over well with the europeans, who have just one law that is easier to explain. if you want to be on the playing field with international consumers who, instead of being confronted with this ridiculous byzantine system of attorneys general, have just one place to go in the country to complain -- you see the elegance in that. i'm not here to say that the europeans have it right, because i think that there are lots and lots of flaws, but there is this phrase -- habeas data. my data, myself.
10:37 am
i choose to engage with this company and they have some rights. i used to use this example before i worked there at amazon: you have to give them the address so that they can deliver the books. that's part of the deal. it's profoundly impossible to do that unless i tell you where i live so that you can deliver the book. some data has to be transacted. but it is still my address, still part of where i live, my data. i like that construct far better than property, something i sell or barter, or the draconian structure around the european rights, which gets to less enforcement -- but this is mine and part of me. ms. durkan: i think you are right. if you look at the areas where we have tried to legislate around technology, by the time that we get it enacted, it is obsolete. but are there not fundamental
10:38 am
principles? my first entree into the privacy realm was 19 years ago, when i ordered a pizza and they asked for my phone number, and i said, you don't need to call me, just give me the pizza. they refused to give me the pizza unless i gave them my number. i was complaining to one of my friends at the state attorney general's office and he said, great, you are in charge of my new consumer privacy task force. we had meetings over the course of a year. the tech industry was burgeoning. there were two parts of it. one was consumer privacy. it was strict opt-in: you cannot collect information unless they agree to it. you collect the information that you need for that transaction and only use it for that transaction. then you have to be done with it. the second part was the first identity theft legislation in the country. we dropped the bill, which is
10:39 am
what they call it when they introduce it in olympia. that day, 85 new business lobbyists registered. it was amazing, and they killed the consumer privacy part of it. quite happily i can say that we got the first identity theft bill passed and it became a national model, but everyone understood that there was a way to monetize this data, to take from it and aggregate it in ways that people do not understand. are there fundamental principles where people should be able to say -- i give you my address but i only want you to use that to send me the book? you cannot use it for anything else. what do you think, ryan? prof calo: i have this argument all the time -- the law cannot keep up with technology. take the federal aviation administration approving a drone for testing delivery by amazon,
10:40 am
right? by the time they got the application, read it, and said yes, amazon had moved on to a new technology. they had to apply again. it was this sisyphean task. but when you look at what the federal trade commission enforces, that mattered in 1982 and it matters in 2015, and it is what we call a standard instead of a rule. i guess i don't totally understand why the government could not be more responsive. one issue that i will immediately concede is that the government by and large does not have adequate technical expertise. something that i think you get -- speaking of violent agreement all around -- the federal trade commission, for example, is trying to address that by hiring
10:41 am
good technologists. they just hired someone to run the technology shop there. that is a big problem; i am sure you have encountered it, right? but i don't know that the law intrinsically cannot keep up in some deep sense -- the same way the fourth amendment keeps up, you know? ms. russell: there is an intricacy to those privacy policies, and that is the thing i'm not sure the government can keep up with. you can keep a general standard that things should be fair and just, but what does that mean? the devil is in the details. that is where the concern is. that is not to say that there should not be policies, or that government should not have a role to play, but i don't think we can rely wholly on the government, given that the government works so slowly.
10:42 am
prof calo: fair enough. ms. durkan: you wanted to jump in on this? ms. o'connor: i despair of this glacial congress. you are right, we are thinking about this a lot. on a clean sheet of paper, what a privacy law looks like in this country would be a very short list of high-level principles. once you create a particular technology standard or security standard, you create a chilling effect and a target for hackers to shoot at, right? suddenly you have chilled innovation. but i think the high-level standards -- the ftc is the right example -- fairness, fair dealing, with some more granularity around best practices. companies are largely already doing this in practice. it would be a short bill. we would have
10:43 am
all the challenges that you experienced when dealing with constituencies that clearly care about data. prof calo: very quickly -- right now, as it happens, the federal communications commission is trying to decide how it is going to regulate privacy. there is a long history of having these technology-specific, lumbering kinds of rules that are exceedingly slow. i don't want to get too technical, but there is now an opportunity for them to write the privacy rules, essentially, for internet service providers. one of these options will be a technically specific thing that becomes outdated, and the other will be a less specific approach of looking to broader standards. sometimes government makes the wrong choice. ms. durkan: i was struck by something that ira said during
10:44 am
the introduction about freedom of thought and technology. whether you are the teenager curious about sexually transmitted diseases or a person struggling with alcohol or drug abuse, where do you go without being tracked or leaving a record? maybe it goes back to that intersection between government and private companies, but try to research something anonymously right now. ms. o'connor: go to mozilla, firefox. [laughter] ms. durkan: seriously, what do we do for those who go online now? even with mozilla there will be tracking of searches. it was very transparent of facebook and google to say, do you want to see all your searches? on the other hand, people have been using it a lot like a library. i want to find out about this, think about that.
10:45 am
in a search, they never knew that someone was maintaining that forever. what do we do about the ability to have that anonymity, to make sure that there is that ability for freedom of thought? no thoughts on freedom of thought? [laughter] ms. russell: this is not a perfect answer -- it is barely an answer at all -- but i do think that the level of expectation is changing. i expect to be tracked. i expect someone to track that data. but what i also expect is that they will not use it against me, that they will not abuse that power. i expect them to monetize it. let's say that there is an ad that comes up from something i googled; the next time i open my laptop it comes up in chrome -- what i use, but whatever you use -- i expect that. i just expect it not to be used
10:46 am
against me, not abused. i do think that that level of expectation is changing, and i think that to some degree it is generational, but i do not think it is completely generational. i have friends that will not get on gmail, for the example i used earlier. i have a friend who -- at the grocery store they give you those cards where you can save some money and they swipe it each time -- she refuses to get one because they are tracking what she is buying. she has made that choice and i have made a different choice, right? i don't know how you get that kind of anonymity in the information age. i don't know how you do that. ms. durkan: does that not hurt us as a society? i can tell you that i had one case where someone came to me and they were worried about government misconduct. they wanted to report it and remain a
10:47 am
whistleblower, and they literally could not find a route to do it. does that hurt us as a society? ms. o'connor: there are tools, many tools, and i have spoken to folks in the privacy-enhancing and security spaces, and they despair of the fact that they are not widely adopted. those interfaces are clunky and hard to use. so, they have not taken off, and it does reflect somewhat their own lack of good ui, but also that there is not a widespread desire. there are small percentages on either extreme -- people who don't care about anybody knowing, and people who don't want anyone to know anything -- but the vast majority of consumers are somewhere in the middle. i thought that raquel's answer was beautiful: i don't want it to be used against me. that is a beautiful and incredibly important sentiment, but it is not just the data,
10:48 am
it is the decision. our digital decisions project is looking at the unintended disparate impacts of data collection in a variety of sectors. engineers and designers should be stress-testing new devices and new technologies for the unintended or hidden impacts that may simply not be something they were thinking about when they were designing, that may not be a value proposition they were considering. having those values of equality, of democracy, of equal impact embedded at the design stage is the way to get this right. i get to quote the great jeff bezos: we are still at day one of the internet. many people say that it is too late, that there is too much data out there. no, we are in the early days of
10:49 am
getting this right, but the decisions that we make now will last for decades, for centuries. we need to do the hard work around fairness and equality, and embed those values in these systems. ms. durkan: i think that's great. earlier we were talking about the very concrete ways. so much information has been aggregated that when you log on they can tell how much you have bought, what kind of car you drive, and the result is reverse marketing -- trying to get you to buy stuff, or to give you a cheaper mortgage. built into those algorithms could basically be digital redlining. what kind of steps do you think we can take to protect against that? prof calo: speaking of million-dollar questions, that is so hard. it is not intentional in the way that we think of intentional. i cannot imagine that folks are out to redline on purpose
10:50 am
the way that we once did. ms. durkan: it's happening. [laughter] prof calo: let me say it a slightly different way. can you imagine that the folks at google or facebook would be thinking that they want to deny opportunities to people based on their color or race? i don't think that they are intentionally doing that. [inaudible] maybe some people do it intentionally, yes. it is something that i think about and write about all the time. but i guess what i am trying to say is that so much of the mischief is going to be unintentional. it is interesting -- you guys have a different intuition. i would like to see it that way, but i don't think i can.
10:51 am
ms. durkan: go back to the housing crisis and the predatory lending that was happening. i don't know how much data they had about communities of color in this country, but i can say that if they had had it, they would have used it, and they would not have done it unintentionally. prof calo: fair enough. that's further than i would go. ms. russell: i don't know that they were using that technology, or that they were doing it intentionally, but if they had the technology to do it intentionally, why wouldn't they? prof calo: i take it back, honestly. from my perspective, it's that companies do crappy things in order to make money. for example, why is it that when you call any major business you don't talk to a person anymore, you talk to an automated system? they know you don't like that. it saves them money. why did banks face a class action from people making
10:52 am
purchases on debit cards, where the banks were holding back purchases and putting them in a certain order to trigger as many overdraft fees as possible? why does something cost $9.99? do you know what i mean? companies are always trying to extract as much as they can, to use the economic terminology. when they are armed with an immense amount of information about you, they are going to leverage it. it's precisely what you said: people doing bad things will now be doing bad things with lots of data. and as you said, it's bad enough that these things are going to fall into what is known in constitutional law as disparate impact. even if you did not design the test to keep people of color out of the fire department, the fact that it does is enough
10:53 am
to look at it and say that this is not what we should tolerate. maybe that is right, and people are doing it on purpose. excuse me. ms. durkan: that question was phrased differently than he was prepared for. i will take part of that responsibility. prof calo: your fault. [laughter] ms. durkan: it's always the moderator's fault. let's take it back a little bit more. we already see a huge digital divide in our country between those who have ready access to technology and those who don't. what are some of the things we can do to address that and teach about the importance of privacy, while at the same time making sure there is an equal playing field? what do you think about that, nuala? ms. o'connor: we look at access online in other parts of the world as a critical juncture -- people without access to computers -- but the reality is
10:54 am
that that is happening in our own country. you see a lot of advocacy that is frankly condescending to the rest of the world. the reality is that there are kids in this country that do not have internet access at home, that kind of thing. privacy seems like a nice-to-have for people who don't have the actual tools to get to the information that they need. that is the threshold question. we talk about net neutrality. we talk about other principles of the open internet. but as a mom i am also concerned about fairness at the earliest possible age. we risk reinforcing and increasing that divide over time. digital literacy has to start at the early stages. i would like to see not only real thought about computer science and coding, but also about the literacy to be a good
10:55 am
consumer, a discerning consumer. i love the issues around coding and computer science as a science in the curriculum in this country. again, there is still a hierarchy of sciences that are considered adequate to be included in secondary education in this country. knowing and understanding the code, being able to decide for ourselves whether we like the logic of a particular algorithm, knowing what is behind the curtain -- those are skills that kids 30 years from now are going to need. ms. russell: i heard something very interesting from one of the founders of code.org. most of us took biology in school. i dissected a frog and knew all along that i was never going to be a doctor, but i still had to do that. kids these days
10:56 am
need to learn computer science. it is too much a part of our day-to-day. on the issue of equality when it comes to internet access, there is a lot of inequality in this country, and i think we all know it. it is hard to get by in this country without being connected somehow to the internet. those of us who are better off are extremely connected to the internet. i don't know what i would do if i left my phone at home. i would literally be lost, because i am new to seattle. i would actually be lost. but the fact that there are children in homes that cannot afford internet access, or whose homes are not hardwired -- they literally cannot get access in their homes even if they could afford it -- that should not be, in a country where the information age is king. period. prof calo: the other thing that
10:57 am
we can say with confidence is that technologies are vetted against the mainstream population. one of the things that my lab does, at the direction of one of my co-directors, is that we have actually created panels of people that are nonmainstream in order to do analysis of the technologies that we look at. for instance, we will look at gender. we will look at disability. one recent panel that we did was on people who have had contact with the prison system. what we see is that the definition of technology and utility actually changes, sometimes kind of dramatically. just one quick example of that: augmented reality. the idea of a layer of reality over what you see -- we think of
10:58 am
it as -- i thought of it as being augmented. so, more reality. i never thought of it being a problem if you were to shut off augmented reality when you were in the bathroom or a boardroom, right? but when we talked to people with disabilities, they were using augmented reality as a substitute for a sense they did not have. it was deeply problematic to turn it off in a boardroom or a bathroom. that is an insight we would not have had if we had not done that. that's another layer: can we really vet our technology that way? this gets to the pipeline issues, not just in the early stages but also in the tech industry. we were talking about diversity on the panel; you see these devices created by non-diverse segments not actually working for and serving customers. ms. o'connor: it is not a profound point --
10:59 am
women were largely buying the goods, and putting women on the team meant people who could speak for the customer. in technology we think we invented everything. we might be better at creating these issues, and hopefully at solving them, but take a particular device -- oculus -- it literally did not work. it is a visual viewer. prof calo: according to danah boyd, it made women nauseated. ms. o'connor: the viewfinder was not set at the width for a woman's eyes, reflecting the fact that there were probably no women on the design team. they have done the studies over and over again -- women on boards, and so on -- but women on the design team: that is one example of diversity in the pipeline for creating a
11:00 am
device. we will wind up with better devices, hardware and software, in the future. ms. durkan: we are getting close to q&a time. two more lightning rounds and then questions from the audience. lightning round question: as nuala has mentioned, the eu has a different approach to privacy; they are much more privacy-centric. i don't know if people know that there was a recent ruling that you can write to google and say, when someone searches, i want to be out of the search results. there has been a follow-up that says that does not just apply in europe; it applies anywhere they have a search engine. it can't show up in the united states, africa, anywhere. is that a realistic thing? how would you call it?