
Politics Public Policy Today, CSPAN, November 21, 2014, 11:00am-1:01pm EST

11:00 am
but when i look around, is anybody really ready to replace her? i mean, it's a hard job. and i give her a lot of credit for what she's been able to do. but i think it's time that the leaders, you know, start looking at who's going to fill my spot? you know, we're all replaceable. that there might be some bumps in the road but i do always believe that it's time for younger people to take our spots with fresh ideas and new ways of doing things. i see nothing wrong with that. that's progression. that's a normal progression. >> that entire interview with carolyn mccarthy, and one with a republican from wisconsin who is also retiring, can be seen at 8:00 monday on c-span. earlier this month the privacy and civil liberties oversight board held a day-long forum on technology. technology experts and open government advocates
11:01 am
participated. this first panel looks at the concept of privacy. the privacy and civil liberties oversight board is an agency in the executive branch recommended by the 9/11 commission, and it's appointed by the president. >> good morning. welcome to the privacy and civil liberties oversight board's public meeting on defining privacy. it's 8:30 a.m. on november 12th, 2014. and we're meeting in the west end ballroom in the washington marriott georgetown hotel in washington, d.c. this hearing was announced in the federal register on october 21st, 2014. as chairman, i will be the presiding officer. all five board members are present and there's a quorum. rachel brand, james dempsey, patricia wald, and elisabeth collins cook. i will now call the hearing to order. all in favor please say aye. proceed. so what is privacy?
11:02 am
the right to be left alone? a desire for independence of personal activity? the right to make decisions regarding one's private matters? space for intellectual development? anonymity or obscurity? freedom from public attention? freedom from being disturbed by others? freedom from intrusion into one's solitude? freedom from publicity which places you in a false light? freedom from appropriation of your name or likeness? control of how one's personal information is collected and used? freedom from surveillance? these are just a few definitions that have been given to privacy in the past. i expect during the course of today's discussions that we'll hear others. the meeting today and the comments we receive will inform the board's approach to privacy issues within its statutory mandate. there will be four panels today.
11:03 am
the first will focus on defining privacy interests. the second will consider privacy interests in the counterterrorism context and the impact of technology. next we will hear from government privacy officials regarding privacy interests that have been identified and addressed. and the final panel will see how lessons learned from the private sector can be applied in the counterterrorism context. each panel will be moderated by a different board member. after the board member poses questions, others will have an opportunity to pose questions. afterward, members of the audience are invited to submit written questions. peter wen has cards and people can get a card from him and submit questions, time permitting, for the moderator to pose to the panelists. i want to thank the panelists who have agreed to appear here today on this panel and others. i also want to note that we have a strict time keeper, joe kelly, sitting in front, and so panelists are encouraged to keep their remarks brief so we can have a more extensive
11:04 am
discussion. we'll take a lunch break between 12:00 p.m. and 1:15. today's proceeding is being recorded. a transcript will be prepared and put on our website in a week or so. written comments from members of the public are also welcomed and may be submitted through regulations.gov through the end of the year. i want to thank the board's staff -- sharon, shannon, simone, lynne, renee, peter wen, joe kelly -- for their efforts in making today's events possible. so i will now turn to the first panel moderated by judge wald. >> thank you. panel one will attempt to explore -- i think it would be too ambitious to say define -- privacy and the many separate individual and societal interests that the notion of privacy encompasses. the novelist jonathan franzen
11:05 am
remarked, privacy is like the cheshire cat of values, not much substance there but a very winning smile. legally the concept is a mess. that's a quote. that may be unduly pessimistic. most commentators do agree there are aspects of privacy that go way back to the most ancient civilizations and that our own founding fathers enshrined several of them in the bill of rights. but the concept of privacy has been a receptacle for a conglomerate of interests or values that individuals and society care about, but which to varying degrees they are willing to balance with competing values such as national security. thus the law of privacy consists mainly of a series of situations in which courts, legislatures or government officials have decided to recognize a privacy interest or not, and to protect that interest or not against a competing value. so, our panelists today will identify the varied individual and societal interests that travel under the rubric of
11:06 am
privacy and discuss how far and under what conditions our laws do or should legitimize claims that are based upon those particular interests. now, our format will be for each panelist to talk initially for seven minutes, and the gentleman in the front row will in turn give you a yellow card two minutes before the end, and a green card will mean it's time to wind up quickly. then at the end of their initial speeches, i will question them as the moderator for about 20 minutes. that will be followed by another 20 minutes of questions by my fellow board members. after that, i hope there will be some time left for the written questions which members of the audience are invited to send to the people who will circulate and collect them, and i will
11:07 am
question -- discuss some of those questions with the people on the panel. you already, i think, have bios of your illustrious panelists, but i'll identify them very briefly before they speak. so we'll get right on. liza goitein, liberty and national security program director. that's enough to identify you. >> thanks very much, judge wald. and i apologize in advance. i have a cold. my voice kind of comes and goes, but thank you to all the board members for inviting me to participate in today's discussion. if there's one thing i've learned from my own involvement in privacy issues over the last few years, privacy is different things to different people. david gave a very comprehensive list of some of the things that privacy is. i'm not sure what i would add to that except to say that i think for those who are outside the ideological mainstream in this country, privacy vis-a-vis government can be critical to
11:08 am
effectuate other rights such as the freedom of religion, speech, and collective association. so collectively as a society, we value all of those aspects of privacy, even if some of us value only some of them or none of them. so, what does that mean for our analysis? i think it's interesting for us to think about different definitions of privacy and it's helpful insofar as it shows the range of definitions that are out there. but i'm not at all convinced that congress or the courts or this board should be in the business of attempting a granular definition of privacy or its importance. look at the freedom of religion by way of comparison. courts don't probe what religion is or why it's important. that's not because the definition of religion is obvious by any means. it's at least in part because of the opposite, because religion
11:09 am
is different things to different people. so what the court does is it adopts a concept of religion that's broad enough to encompass the many different roles that religion plays in people's lives, and then the court protects it except in the rare circumstance where there's an overriding governmental interest. and congress has followed the same approach. when it comes to information privacy, the best working concept of privacy, the concept that best encompasses all of the important interests that privacy serves, is control of information. this avoids the what and the why of privacy and focuses instead on the how: how privacy is realized as a practical matter. and it also has the additional advantage of matching up quite well with the text of the fourth amendment. if a person controls her papers,
11:10 am
she is secure in them. if a person does not control her papers, she is not secure in them. what are some of the ramifications of this concept of privacy? well, first, controlling one's information means controlling not only what one shares but with whom and under what circumstances. i may share certain information with my mother or with a close childhood friend, but that doesn't mean that i have chosen to share that information with the entire world, including the nsa. sure, there's a chance my mother might rat me out. there's a chance that my childhood friend has a tax problem i didn't know about and could be pressured by the government into becoming an informant. but to equate this outside risk that my confidence may be misplaced with a willing disclosure to everyone in the world is a legal fiction of the worst kind. that's really what the third party doctrine is in my view.
11:11 am
second, you don't, in fact, relinquish all control over information about your public activities by virtue of walking out your front door. there is such a thing, functionally speaking, as privacy in public. and this is something that's well understood in the foia context, the freedom of information act context. it allows the government to withhold information if releasing it would unduly compromise personal privacy. the supreme court held in 1989 that a rap sheet would be covered by this exemption despite the fact that all of the information in a rap sheet is available by virtue of a diligent door-to-door combing of court records. so why was the rap sheet still private? because, the court held, while the information in it was publicly available, it was practically obscure.
11:12 am
this is such a commonsense concept, and deserves a home in fourth amendment jurisprudence. the sum total of a person's movements in public over extended periods of time may be publicly available information, but using normal powers of human observation, it is practically obscure. so when the government uses drones or stingrays or gps technology to pierce that obscurity, it has compromised the control that the person would otherwise exercise over this information, and that's a privacy violation. third, privacy violations happen at the point information is collected. we've heard intelligence officials recently telling us that we don't have to worry about the nsa's bulk collection of telephone records because nobody looks at the records unless they have reason to suspect some kind of terrorist link. that is the government telling you what aspects of privacy you should value.
11:13 am
many people won't care if the government collects but doesn't look. other people won't care if the government looks but doesn't prosecute. but the point at which the government collects the information is the point at which you've lost control. and for plenty of people, that loss of control itself produces harm. it produces a feeling of vulnerability. it causes people to change their behavior. in 2014, there was a poll after the snowden disclosures showing that 47% of the respondents had changed their online behavior after those disclosures. there was another survey of 520 american writers showing that one out of six authors after the snowden disclosures refrained from writing about certain topics because they feared surveillance. after news stories broke about the nypd's infiltration of muslim student associations, attendance in those associations dropped. in some ways these are some of the worst harms that come from
11:14 am
privacy violations because they're society wide. they impact the way we function as a society. they impoverish our social discourse. they cause people to censor themselves and not put ideas out there. one last ramification of this concept of privacy -- if i have time? i can't believe i have time -- is young people. so, i hear it said quite often that young people don't care about privacy. and it's certainly true that many young people go on facebook and share incredibly personal information with 622 friends. but they don't share that information with 623 friends. what they share and the number of people they share it with may very well have changed. it certainly appears so. but they still control the sharing, or at least they think they do. and my impression, based on a totally unscientific survey of all the young people in my life, is that they still value that control. so -- the red card. i knew it was coming. all right. i'll stop there.
11:15 am
>> thank you. professor daniel solove is the john marshall harlan research professor of law at the george washington university law school. >> good morning. i would like to make five brief points this morning. the first point is that privacy is much more than hiding bad secrets. one of the common arguments about -- that people often make about privacy is that people shouldn't worry if they have nothing to hide. and i hear this argument all the time. this argument and many other arguments about privacy are based on a conception of privacy, a conception of privacy that's very narrow, that sees privacy as hiding bad or discreditable things. well, privacy is much more than that.
11:16 am
privacy isn't just one thing. it's many different things. privacy involves keeping people's data secure. it involves the responsible use of data. it involves making sure that when data is kept, it's kept accurately. it's making sure that people who keep the data are responsible stewards of that data. that people have rights in that data. and some participation in the way the data is used. all these things have nothing to do with "nothing to hide." they have nothing to do with secrets and everything to do with how their information is kept, collected, stored, et cetera. i think that if we see privacy broadly, we can move away from and abandon these very narrow views of privacy. the second point i would like to make is that privacy is a societal interest, not just an individual one. when balancing privacy and security, privacy is often seen as an individual right and then security is often seen as a
11:17 am
social right. and when they're balanced, society generally wins out over the individual. i think this actually skews the balance to the society side, the security side. in fact, privacy isn't just an individual interest. it doesn't just affect the individual. it's a societal interest. we protect privacy because we want to protect society. we want to shape the kind of society we want to live in. privacy doesn't just protect the individual for the individual's sake. it protects the individual for the society's sake. because we want a free society where people are free to think and speak without worrying about negative consequences from that. third point i would like to make is that the collection of personal data through surveillance and other means of government information gathering can cause significant problems. data collection and surveillance
11:18 am
aren't inherently bad. but, just as industrial activity causes pollution, government surveillance and data gathering can cause problems. and these problems must be mitigated. they must be addressed when they clash with important interests. some of the problems include, one, that this activity can chill people's expression. it can chill people's exploration of ideas. it can chill people in many different ways. either they might not say something or they might say something slightly differently or they might act differently or do things differently. and we don't want that chilling when it comes to legal activity. the other thing -- the other problem is that surveillance gives a lot of power to the watchers. there's a lot of things that can be done with a vast repository of data beyond a particular aim that it might have been collected for.
11:19 am
data has a way of often being used in other manners, in other ways. i think that another issue, too, is the level of accountability and oversight that goes into this. because it's about the structure of our government and the relation of government to the people that we're talking about here. what kind of accountability will the government have when it gathers all this information? what limits will there be on the information gathered and used? how long will the information be kept? in a free society, people are free to act as they want to act as long as it's within the bounds of the law without having to justify themselves. they don't have to go and explain their actions to a bureaucrat sitting in a room full of television monitors
11:20 am
about what they're doing. they don't have to go and explain themselves when a computer's lights are blinking red because of something that they said and it could be misinterpreted. people don't have to worry about that. they can act freely without having to worry about how suspicious their actions might look. that is a key component to freedom. the fourth point i would like to make is that we can't adequately balance privacy and security without a reasonable amount of transparency. there's an overarching principle that this nation was founded upon. it is that we the people are the boss. the government is our agent. we can't evaluate what government officials are doing if we don't know what's going on. this doesn't mean there should be absolute transparency, but it does mean that we need to know something, enough to be able to evaluate government surveillance, because ultimately the choice about the proper level of surveillance isn't the
11:21 am
nsa's to make, it's not the president's to make. it's the people's choice. we can't forget that. it's the people's choice, and the people must be given sufficient information to make that choice. my last point is that the government must get buy-in from the people for its surveillance measures. without buy-in, people are going to start to take self-help measures, which is something we see happening now. we see that companies are providing people with ways to encrypt their data, to protect it from snooping government entities. this is the market speaking. this is something that people want. this is something being sold to people that they're going to buy. it's something in demand. why? why are people demanding this? because they have lost trust. because the laws regulating government surveillance are weak and do not provide adequate oversight or accountability. this is why strong privacy protections aren't necessarily bad for security.
11:22 am
in fact, they ensure that the people are comfortable, that there is adequate oversight and accountability for that surveillance, and that they're comfortable and know that they have the information that they need to continually evaluate what's going on, and if they can evaluate what's going on and buy into what's going on, things will be a lot better when it comes to balancing privacy and security. thank you. >> paul is the founder of red branch consulting and a senior adviser to the chertoff group, and he was formerly deputy assistant secretary for policy at the department of homeland security. >> thank you, judge. thank you, mr. chairman, members of the board. i appreciate the opportunity to speak with you today. it's really entirely appropriate for the board to begin a
11:23 am
conversation about privacy in this technological age. it's essential. and the reason for that is one that puts me in some disagreement with my fellow panelists. i think our conceptions of privacy, founded as they were back in the 1970s with the fipps, are somewhat outdated antiques that don't survive the technological challenges we face. the 1973 thunderbird was a marvelous car, but we would not think of holding it out today as the state of automotive engineering, and nor do i think we should hold out the fipps as the state of privacy thinking. we need, in fact, a tesla for privacy today. what would that look like? well, there are many ways to answer that question. and i think to answer it you have to begin by thinking about what sort of value privacy is. and here again i think i find myself in some disagreement with other members on the panel and perhaps with members of the board. i do not think that privacy is an ontological value. i don't think it's akin to
11:24 am
religion. it's not an inherent right or the product of some natural law. in my judgment, privacy is an instrumental value, one that acts in the service of other societal values. it's a utilitarian value that derives its worth only insofar, in my judgment, as it fosters other positive social gains. privacy for its own sake is just an assertion of autonomy from society. it is valuable insofar as it advances other objectives. now, let me kind of put some salt on that. the problem is that buried in the word privacy are many different social values that are too many to catalog, though the chairman did a good job of trying to start. for example, we often see, in the discussion here, privacy as enhancing our freedom from government observation. that's probably the use that's most salient to what the board does. but it also enables democracy. that's why we keep the ballot
11:25 am
private. it fosters personal morality. that's why we keep the confessional private. privacy is also about restraining government misbehavior, which is why we see privacy values in the fourth amendment and other procedural limitations on government action. another way in which privacy is obviously relevant to this. it's also, as dan said, sometimes about transparency, in the sense that we have privacy rules so that i know what you know about me. it can be about control, about control of my own image, and it's sometimes also about simply shame, since one ground of privacy is enabling me to keep from the world things that i'm not proud i did, of which there are far too many, i fear. what's important to note is that in all these instances the value that we're protecting that underlies privacy is different from the privacy itself. and that in turn suggests to me that the way to think about privacy is to think about what operational activities would protect the underlying value
11:26 am
most. it means we need to go to a micro level to understand in general the nuance that arises from the particular interest that is at the core of the privacy that we're talking about. for example, we protect the confidentiality of attorney/client communications. why? because we think we need to foster candor in the discussion between a client and an attorney. that's something that we feel so strongly about that the instances in which we permit that privacy to be violated are few and far between, and they come only with the highest level of judicial scrutiny. the fourth amendment itself reflects a similar utilitarian value of the security of our persons, places, and things against intrusion. once again, we impose a high bar, a probable cause requirement, and a strong, independent, outside adjudicator, a judge issuing a warrant. those aren't the only mechanisms
11:27 am
by which we can protect privacy. we have a series of administrative processes that are often adequate to protect and restrain government observation. they're embedded in many of the internal reviews that are very common in the i.c., in the intelligence community, that you spend your time reviewing. they're common in virtually every institution of government that we have, at least at the federal level that i'm familiar with, where we think that administrative review, internal oversights, inspectors general, intelligence committee oversights are adequate alternate administrative mechanisms. so what does that mean for some of the things that you think about? let me look at the two programs that you've written about and just kind of express something there. the 215 program is one that directly impacts issues of government abuse or potential abuse, because of the
11:28 am
pervasiveness of the collection that was undertaken. it strikes me that that sort of pervasive collection is one that would require a strong independent review mechanism because of the comprehensiveness of its activity. by contrast, the 702 program, which seems from what i've read from the outside, from your reports, more narrowly focused, is one in which fewer error correction mechanisms are necessary, and there is less likelihood of inadvertent abuse, so those -- if you press on what is being protected, you get a sense of a better way to protect it. let me say one brief word more about transparency. i completely agree with others on the board that transparency is essential to control conduct and misconduct. but the critical question is what type of transparency? and for me, again, this requires us to ask what transparency is for. it's the ground of oversight and audit.
11:29 am
transparency without that ground is just voyeurism. but absolute transparency can't be squared with the need for secrecy in operational programs. i think sometimes some calls for transparency, not by members of the panel or on the board, are really just coded efforts to discontinue surveillance programs altogether. the truth is that if we believe in absolute transparency, we've gone a long way to the view that democracies can't have secrets, a view which i think is untenable in the modern world. with my last 30 seconds, let me offer one last thought about the role of the board and the multivaried nature of privacy. because i think that privacy is many things and has many applications in many different contexts, i also think that the most appropriate ground for making judgments about privacy is not in boards or judiciaries but in the most representative bodies that we have available to us, in this instance, congress. i realize that's perhaps leaning rather heavily on a body that is
11:30 am
not held in the highest regard at this time, but nonetheless that is the mechanism in a democracy for accumulating diverse preferences, weighing them in the balance and reaching a judgment for a broader societal interest. >> okay. >> thank you. >> okay. all right. >> my apologies. >> ed felten is professor of computer science and public affairs at princeton and the founder of princeton's center for information technology policy. i think he'll give us a somewhat different lens through which to view privacy. >> thanks for the opportunity to testify. today i'd like to offer a perspective as a computer scientist on changing data practices and how they've affected how we think about privacy. we can think of today's data practices in terms of a three-stage pipeline. first, collect data, second, merge data items, and, third, analyze the data to infer facts about people. the first stage is collection.
11:31 am
in our daily lives we disclose information directly to people and organizations. but even when we're not disclosing information explicitly, more and more of what we do online and off is recorded. online services often attach unique identifiers to these recordings which are used to link them up again later. the second stage of the pipeline merges the data. if two data files can be determined to correspond to the same person, for example, because they both contain the same unique identifier, then those files can be merged. and merging can create an avalanche effect, because merged files convey more precise information about identity and behavior, and that precision in turn allows further merging. one file might contain detailed information about behavior, and another might contain precise identity information. merging those files links behavior and identity together. the third stage of the pipeline uses big data methods such as
11:32 am
predictive analytics to infer facts about people. one famous example is when the retailer target used purchases of products such as skin lotion to infer pregnancy. today's machine learning methods often enable sensitive information to be inferred from seemingly less sensitive data. inferences also can have an avalanche effect because each inference becomes another data point to be used in making further inferences. predictive analytics are most effective in inferring a status when many positive and negative examples are available. target used examples of pregnant and nonpregnant women to build a predictive model. by contrast, a predictive model that tried to identify terrorists from everyday behavioral data would be expected to have much less success because there are few examples of known terrorists in the u.s. population. with that technical background, let me discuss a few implications for privacy. first, the consequences of collecting a data item can be very difficult to predict.
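the merge-and-infer stages, and the base-rate point about rare statuses, can be made concrete with a short sketch. this is an illustration under stated assumptions rather than code from the testimony: the loyalty-card records, the accuracy figures, and the counts below are all hypothetical, chosen only to make the arithmetic visible.

```python
# illustrative sketch of stages two and three of the pipeline described
# above; all names and numbers are hypothetical.

# stage 2, merging: two files that share a unique identifier (a made-up
# loyalty-card number) can be joined, linking behavior to identity.
behavior = {"card-7731": ["skin lotion", "unscented soap"]}      # purchase log
identity = {"card-7731": {"name": "Jane Doe", "zip": "08540"}}   # account file

merged = {card: {"who": identity[card], "bought": items}
          for card, items in behavior.items() if card in identity}
# the merged record says who bought what -- more than either file conveyed
# alone, and itself a new key for further merging (the avalanche effect).

# stage 3, inference: even a very accurate predictive model flags mostly
# innocent people when the status it predicts is rare in the population.
population = 320_000_000    # rough u.s. population
actual_cases = 1_000        # hypothetical number of true targets
detection_rate = 0.99       # model catches 99% of actual cases
false_positive_rate = 0.01  # and wrongly flags 1% of everyone else

flagged_true = actual_cases * detection_rate
flagged_false = (population - actual_cases) * false_positive_rate
precision = flagged_true / (flagged_true + flagged_false)

print(merged["card-7731"])
print(f"people flagged: {flagged_true + flagged_false:,.0f}")
print(f"share of flagged who are actual cases: {precision:.3%}")
# roughly 0.03%: about 3.2 million people flagged, nearly all false positives.
```

the arithmetic shows why rarity matters: even an implausibly accurate model is swamped by false positives when the target status is rare, which compounds the training problem of having few positive examples to learn from.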
11:33 am
even if an item on its face doesn't seem to convey identifying information and even if the contents seem harmless in isolation, the collection could have substantial downstream effects. we have to account for the mosaic effect, in which isolated, seemingly unremarkable data items combine to paint an individual-specific picture. indeed, one of the main lessons of recent technical scholarship on privacy is the power of the mosaic effect. to understand what follows from collecting an item, we have to think about how that item can be merged with other available data and how the merged data can in turn be used to infer information about people. we have to take into account the avalanche effects that can occur both in merging and inference. for example, the information that the holder of a certain loyalty card account number purchased skin lotion on a certain date might turn out to be the key fact that unlocks an inference that a particular identifiable woman is pregnant. similarly, phone call metadata, when collected and analyzed in large volume, has been shown to
11:34 am
enable predictions about social status, affiliation, employment, health, and personality. the second implication is that data handling systems have gotten much more complicated, especially in the merging and analysis phases, that is, the phases after collection. the sheer complexity of these systems makes it very difficult to understand, to predict, and to control how they behave. even the people who build and run these systems often fail to understand fully how they work in practice, and this leads to unpleasant surprises such as compliance failures or data breaches. complexity frustrates oversight, it frustrates compliance, and it makes failure more likely. despite all best intentions, organizations will often find themselves out of compliance with their own policies and their own obligations. complex systems will often fail to perform as desired. complex rules also make compliance more difficult. it's sometimes argued we should abandon controls on collection and focus only on regulating use. limits on use do offer more
11:35 am
flexibility and precision in theory and sometimes in practice. but collection limits have important advantages too. for example, it's easier to comply with a rule that limits collection than one that allows collection and then puts elaborate limits on usage afterward. and collection limits make oversight and enforcement easier. limiting collection can also nudge agencies to develop innovative approaches that meet their analytic needs while collecting less information. the third implication is the synergy between commercial and government data practices. as an example, commercial services put unique identifiers into most website accesses. an eavesdropper collecting traffic can use these identifiers to link a user's activity across time and different online sites, and an eavesdropper can connect those activities to identifying information. even if the user switches devices and locations, identifiers can reconstruct 60% to 75% of what a user does
11:36 am
online and can usually link that data to a user's identity. my final point is that technology offers more options beyond the most obvious technological approach of collecting all the data, aggregating it in a single large data center, and analyzing it later. here i think paul's analogy to the 1973 thunderbird is a good one. we would no longer accept the safety technologies that were available on that vehicle. nowadays we expect air bags, we expect anti-lock brakes, we expect crumple zones, we expect the latest technology to be used to make the vehicle safer and to reduce risk. and we should ask for the same when it comes to privacy. we should ask agencies to use advanced technologies to limit how much information they collect, to use cryptography to limit undesirable flows of information. there's a large emerging literature on this. determining whether collection of particular data is truly necessary, whether data retention is truly needed and what can be inferred from a
11:37 am
particular analysis, these are deeply technical questions. in the same way that the board asks probing legal and policy questions of the agencies you oversee, i hope you'll build a capacity to ask equally probing technical questions. legal and policy oversight are most effective when they're combined with sophisticated and accurate technical analysis. and many independent technical experts and groups are able and willing to help you build this capacity. thank you for your time, and i look forward to your questions. >> thank you. okay. for the next 20 minutes or so, i'm going to pose some questions to the members of the panel, and i'll pose them to a particular member, but then if one of the other members has something cogent, as i'm sure everything you say is cogent, then feel free to contribute. let me start with you. our constitution enshrines certain aspects of privacy in the fourth amendment -- security
11:38 am
of one's home and papers from unreasonable search and seizure -- excuse me -- and protection from general warrants. but are there other aspects of privacy that the advocacy community believes deserve legal recognition and judicial oversight, or can they all be encompassed within the bounds of the constitutional guarantees? >> sure. >> and if so, you know, what are the ones you think ought to be specifically recognized, protected, you know -- >> sure. okay. so to start with i suppose the obvious, the fourth amendment applies only to the government. it's a restriction on the government. it's not a restriction on private parties. and i think there's absolutely a place for regulation of private entities and how they acquire and control people's information because the market doesn't always do a great job of many things, although it does a
11:39 am
great job of many things, but we certainly know that people are not 100% satisfied with the privacy protections that have been provided in the private sector, and that obviously falls outside of the fourth amendment but is deserving of regulation. >> i got you there. >> sure. >> before you go on, there was another question but it leads directly -- follows directly from this. we hear an awful lot about the commercial acquisition of so much personal information, what they do with it, and in fact, the argument is sometimes made, don't worry so much about the government, some of the private, google, some of the communications, the internet have great masses of data. do you think that there's any significant difference in the risks to privacy that are posed by the holding of so much personal information by the government as opposed to private entities? or is it, like, two big kahunas? >> i do think there is a
11:40 am
difference. i think that difference may be getting smaller, but i think there is and remains a difference, which is that private companies do not have the same coercive power over the individual that the government has, and private companies and private entities don't have the same motivations for -- to persecute people based on ideology or religion. these are things that we have seen in the history of this country, unfortunately. we have seen people targeted because -- for surveillance because they were political enemies of the reigning administration. so what i would say is that private entities have neither the ability nor the motive to throw people in jail on pretext because they are politically opposed to the current administration. that said, i think companies -- the line between big companies in this country and governance is getting thinner and thinner. and, you know, certainly
11:41 am
companies might have some political axes to grind with respect to the workforce and, you know, they certainly have access to people's information. i am not in the least bit unconcerned with the private accumulation of information. but i remain more concerned with privacy vis-a-vis the government. >> okay. let me try the professor. now, you wrote in -- something in an article called "conceptualizing privacy" that there are -- you went into a little in your prior remarks, there are 16 kinds of activities that represent privacy risks. privacy itself has six overall conceptions. they're all defined too broadly and they're all defined too narrowly. and so you concluded, i think, if i read it correctly, that we should concentrate on specific types of disruptions to those interests and what should be done about that. can you apply that kind of
11:42 am
framework to the kinds of collection protection, i'm sorry, that we need in national security data and surveillance programs in collection, processing, identification, secondary use, all of the other things that you talked about in your article? >> yes. actually, in what i wrote, i talked about privacy not being just one thing or having a common denominator but being a pool of common characteristics, and what i laid out was a taxonomy of privacy covering various types of problems. and i wanted to focus on the problems, or areas where certain activities cause disruption. they have caused problems. and we want to mitigate those problems, and what are those problems, because that's where we want to step in and say, hey, we should regulate this, we should do something about this, we should address these problems. it doesn't mean that the activities that caused them are bad. but it does mean that they do
11:43 am
cause the problems we need to address. some of these problems that relate to government data gathering include aggregation: you can take a lot of different pieces of data, each one being particularly innocuous, not really saying a whole lot about somebody, but when you combine them together you can learn new facts about somebody. this is what data mining is all about, and data analytics. the whole becomes greater than the parts. it starts to create a mosaic, a portrait of somebody. this then leads to the revelation of information that someone might not have expected or wanted when they gave out little pieces of information here and there. and i think this causes a problem. it disrupts people's privacy expectations. it can lead to knowledge of information that people don't want exposed or that society might not want exposed. and so i think we need to address that problem, and oftentimes conceptions of
11:44 am
privacy will ignore aggregation because they'll say, well, the information was all different facts gathered from public information, so there's no privacy interest. but i don't think that's true. i think we really want to look at what the problems are. if we look at the problems, there's a problem here. another aspect is a problem i call exclusion, which is the fact that people lack an ability in a lot of cases to have any say in how that information might be used in decisions about them, any right to correct that information or to make sure it's accurate. and i think a key component of a lot of privacy laws is a right for people to make sure that proper decisions are being made about them based on their information. i can't go through all 16. i can hit some others. one is identification, the fact that this involves linking a body of data, what i call a digital dossier, to a particular
11:45 am
individual. by identifying them, you are connecting them to data that then can be used to make decisions about their lives. some decisions could be good, but some decisions in fact could be harmful to an individual. security is another issue that i see as related and part of my taxonomy of privacy, and that's keeping data secure. and when data isn't kept secure, it creates risks and vulnerabilities for people that could expose them to a lot of harm if, in fact, the data is leaked improperly. and that happens all the time, and we're all at risk when all this data is gathered together in a big repository. there are a lot of other things, but i'll stop here in the interest of time. these are just some of the ways that the taxonomy addresses this problem. i think it's important to think of -- the overarching point is
11:46 am
don't start with some platonic concept of privacy and see, you know, what fits in it and what doesn't. i think it's better to look at things from the bottom up and say where are the problems here, what are the problems and harms that are caused by these activities and how do we address those harms? >> could i just -- >> yes. go ahead. >> i would agree with everything that dan said, but i would also say also look at what the benefits are. you know, the president's report on big data looked at the increase in the volume, velocity, and variety of data and championed the idea that large-scale aggregation creates serendipitous new knowledge that is of value to society as well. so it brings with it harm, but it also brings with it benefit, and that is why, you know, i see it as a kind of cost/benefit utilitarian analysis. >> okay. yes. go ahead. >> could i just say one thing
11:47 am
quickly about the idea that it's a utilitarian value, not a human right. it is a human right. it's listed in the iccpr and other treaties and protocols that the united states has signed and that have the force of customary international law. so whatever one's personal feelings about that, i don't think this board has the latitude to decide that it's -- that all these treaties we've signed declaring it as a human right are void. >> of course defining what's included in that is one of the problems. >> may i make one small point? >> go ahead. >> and that is i think i totally agree about the benefits of big data and the use of these things, but i think often the balance is wrongly cast between, okay, let's take the benefits and let's weigh it against the harms, because protecting privacy doesn't mean getting rid of big data or not engaging in surveillance or not doing a search. the fourth amendment allows searches and allows surveillance, for example. it just requires certain
11:48 am
oversight. so we need to look at what we're balancing is not all the benefits of big data against privacy. we need to look at to what extent do oversight, accountability, and these protections on it, to what extent do they diminish some of those benefits? and that difference, that diminishment is what gets put on the scale against privacy, not all of big data's benefits. i think if we weigh that appropriately, then i think we've got a better balance. >> very quickly. >> very briefly. we don't get to weigh these things de novo when it comes to the fourth amendment. the balance has been struck. the government can't say we want to do searches in people's houses. we have a really good reason, we don't have a warrant, but we have a really good reason, let's do this balance anew. that balance was struck by the drafters of the fourth amendment. in the vast majority of cases, there are some narrow delineated exceptions, but you need a warrant based on probable cause to do those criminal searches. this is not starting from scratch. >> all right. your approach -- you talked a
11:49 am
little bit about this. your approach for balancing privacy and national security has i think been termed, whether you call it instrumental or consequential, but in one of your articles you talked about limiting the right of somebody to complain or to go to court, et cetera, to intervene, to when they are suffering a tangible harm like a warrant or getting called before the grand jury, as opposed to professor solove's view of privacy as a kind of foundational value, recognizable in its own right. yet you also recognize in some of your other works the significance of some aspects of privacy to a democratic society. now, all of you have talked about it isn't just an individual right, it's a right that an open society needs, starting with even the necessity
11:50 am
for people developing their personalities in an atmosphere in which they feel free to experiment a little bit, to have relationships, without feeling they're constantly being judged by the government or society. i'm wondering how you reconcile your recognition of the aspects of privacy that are necessary to a democratic, open society with this notion that we really shouldn't start intervening until somebody's suffering some tangible harm? >> well, thank you for the question. i think that the -- i don't see them as irreconcilable because i see the question about the adversity of consequence and the error correction mechanisms as critical to the first part of your question, the inherency of the value. sometimes i use a thought
11:51 am
experiment. what if in some hypothetical world, which i assure you does not exist, the government never abused anybody. never actually misused the data that was collected. never went after -- had no lists of enemies, no persecution, and never made a mistake. now, granted, that's an impossible -- an impossible standard, but if that were the instance, then in the long run the values that underlie it, the democratic values, would be supported, and people would no longer fear the collection because the adverse consequences would, by hypothesis, have gone away. so to my mind, the way to support the values we see in the underlying democratic sphere is to build the error correction mechanisms, the audits, the oversights in a way that
11:52 am
reassures society as much as possible that we're driving down the errors -- of, frankly, both types, the false-positives and the false-negatives, but this board is principally concerned with the false-positives -- driving down the errors as much as we humanly can. we don't eliminate government programs because of the possibility of error, because every government program, every human endeavor has the possibility of error. we arm police officers, even though we know they will sometimes misuse their weapons. we don't eliminate that. we try and drive down the error rate as much as possible so that we engender people's confidence in the police. and we see, sadly these days, exactly what happens when people's confidence in the police is not maintained, when our error correction mechanisms are deemed by society inadequate. and i think we're sort of seeing some of the same thing in response to the snowden disclosures as well. but that suggests to me that the
11:53 am
way to support the underlying values goes back to how we fix the error correction mechanisms. >> let me pursue one thing you brought up earlier, and that is this -- which has come up in some of our past reports and is bound to come up in future ones, i think. and that is, at what stages, if you would go in a little bit more into the point at which you think an independent review of decisions, outside of the government's internal auditing and processes, is necessary to ensure that you have this kind of trust by the people that the government is not taking risks with their privacy. and in terms of what you have suggested, history has got some lessons for us, you know, on the trust-us aspect.
11:54 am
>> well, i certainly don't dispute that we've had failures in the past. anybody who would dispute that hasn't read history. i would say that there is no one-size-fits-all answer, that it really depends on the harms involved and the nature of what you anticipate the failure mode would be. i'll give you two examples. on the one side, we have the current tsa inspection programs at the airport. probably a fairly significant error rate of false-positives, pulling people aside for secondary inspection. on the other hand, a comparatively modest intrusion, and i say that knowing many people think it's a very large intrusion, but nonetheless comparatively modest compared to the coercive nature of being put in jail, for example. so in that instance, we seem reasonably happy with a principally administrative methodology that doesn't require any outside check because
11:55 am
individual liberty is not at issue, long-term confinement is not at issue, the degree of harm is small. by contrast, you will infer that i certainly think that independent review is essential whenever people's liberty is at stake, when significant aspects of livelihood are at stake. i think one of the strangest things i see in privacy debates today is we seem to get all wrapped up about things like the tsa screening and we don't look at how government databases are used to deny employment to people. you can't get a job in the transportation industry with a record, even if the record is itself rife with error, because of the transportation rules. so i think we have that backwards a little. that's an instance where an independent review of some sort is warranted, for someone who is denied employment in the nuclear industry or the transportation industry. so putting that in this context,
11:56 am
i certainly think that any time there is an adverse consequence to an individual, we get to the point where there is a -- is there room for a judicial intervention, an independent intervention. that's why i sort of like what the president has done in adding the reasonable articulable suspicion standard for the querying of the 215 database, because that is the point at which some individual becomes, you know, out of the mass, pulled out for individual scrutiny, and that's when he begins to suffer the adverse -- inaccurate adverse consequences. so i sort of like that as a transition point. >> good. mr. felten, you talked about the tendency of institutions to build ever larger databases and then aggregate them. and i think you've said either today or in some of your
11:57 am
writings that there are inherent risks to privacy interests when the databases get larger and larger, and especially when they aggregate them. so, i guess my question to you is -- we are all taking notes on this. what are the principles that you recommend as a computer expert for protecting privacy amid the increasing use of technology in this field, all the way from collection and aggregation to adequate concern for security? what would they look like? >> sure. well, i think the first principle would be to -- to try to look beyond the most brute
11:58 am
force technical approach, which is to collect all the data that might turn out to be useful and retain it all in a single large database for as long as you can. the more data you have, the more you collect, the greater the adverse consequences could be, the greater the risk, and the more of a target it is, either for abuse or for breach. the first principle is to try to fit the practices to the -- to the specific need. to think in terms of what kind of analysis it is that you know you need to do, and figure out which data you can collect and how you can structure a system in a way that can do that analysis while collecting less data, holding the data more separately and preprocessing or minimizing the data first.
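one way to picture the collect-less, preprocess-first principle described here is the sketch below. it is a hypothetical illustration, not a system from the testimony: the field names, the rotating-secret pseudonym scheme, and the choice of per-site daily counts are all assumptions.

```python
# hypothetical sketch of minimizing at collection: if the analytic need is
# only a count of events per site per day, store that aggregate instead of
# the raw, identifiable event stream. all field names are invented.

import hashlib
from collections import Counter

raw_events = [
    {"user": "alice@example.com", "site": "news.example",   "day": "2014-11-12"},
    {"user": "bob@example.com",   "site": "news.example",   "day": "2014-11-12"},
    {"user": "alice@example.com", "site": "health.example", "day": "2014-11-12"},
]

# option 1: drop identity entirely and keep only the aggregate that the
# stated analysis actually needs.
daily_counts = Counter((e["site"], e["day"]) for e in raw_events)

# option 2: if per-user linkage is genuinely required, retain a keyed,
# periodically rotated pseudonym rather than the raw identifier, so records
# can only be linked to the same person within one rotation period.
def pseudonymize(user: str, secret: str = "rotate-me-daily") -> str:
    return hashlib.sha256((secret + user).encode()).hexdigest()[:12]

minimized = [{"user": pseudonymize(e["user"]), "site": e["site"], "day": e["day"]}
             for e in raw_events]

print(daily_counts)
print(minimized)
```

the design choice being illustrated: if the stated analytic need is satisfied by the aggregate, the identifiable stream never has to be stored at all, and oversight only has to verify what was collected rather than police every later use.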
11:59 am
and there's a growing array of technical methods that can do this. and, unfortunately, this becomes a technical problem. so the key principle here is just simply -- to insist that that technical work be done, to try to architect a system to collect and hold a minimum of data. >> who should do that? the government or private industry or -- >> in my view, if -- if, say, a -- if a government agency wants to argue that they have a need to collect and use certain data, there should be some onus on them to justify the technical practices they're using to justify the amount of data collected, the way they're using it and so on. that those who would argue for collection and use of data should be prepared to discuss these issues and offer a
12:00 pm
technical justification. when it comes to private parties, that's a more complicated discussion. i think that the best practice in industry ought to be to do that as well, although obviously the legal and market mechanisms that drive that relationship are very different. >> the last question i'm going to throw out and you can all take a whack at it if you want to, but several of you, i think you especially, ms. goitein, have talked about the element of control of information as being so essential. but then some other people who have written in the field have said, well, that certainly can't be an absolute value. there's got to be balance. we wouldn't be able to have any kind of national security programs if, indeed, everybody said, well, i'm keeping control over that piece of information
12:01 pm
because i don't want anybody to have it, et cetera. so, how do you -- what kind of principles would you apply? because i assume you recognize that some balance, and even in the -- as paul pointed out, that even in the fourth amendment there's an unreasonableness clause which gives you a kind of balancing fulcrum to talk about it? how would you handle that? and everybody can take a whack at it and all the panelists can take a whack at it. >> i think in the vast majority of circumstances, the way it should be anyway, the drafters of the fourth amendment did that balancing for us and gave us the -- what the government had to do to override the privacy right, and that is to show probable cause of criminal activity. there are some very narrow exceptions the supreme court has recognized, some of which are controversial, some of which are not. and, again, so we're not starting
12:02 pm
from scratch, we have to follow the supreme court case law here and ask whether this particular search was reasonable even though there wasn't a warrant, whether it falls within the delineated exceptions. within those exceptions there is room for balancing. what i would say about that is, first of all, the courts do their balancing when they do a review, but congress has a role as well. and when congress does the balancing on behalf of the people, i would agree with what i believe dan said, which is that this is a choice for the public to make. this needs to be a public choice and an informed choice. not a choice made in secret by a small number of officials, but by the public, because this is a democracy. so we need to have the information about what this security threat is, how that threat could be mitigated by the collection of this information and what exactly is going to be the effect on either side. the other quick point i would
12:03 pm
make is that in balancing tests, national security is too often a trump card. the words are uttered and we're done. julian sanchez from the cato institute made an excellent point: when you look at how courts weigh national security against the individual interest in question, they tend to weigh national security writ large over that particular person's interests in that information. that's not the right comparison. you either need to weigh that person's particular interests in that particular information against the incremental threat to national security in that case, or you need to address national security at large -- i mean, weigh national security at large against the values that privacy serves in our society. when you think of it that way, national security really shouldn't be a trump card. you know, these -- we talk about these values as being in competition. i think the evidence for the most part shows that targeted surveillance is more effective than dragnet surveillance. but when they are in conflict, there needs to be a fair and public balancing.
12:04 pm
>> thank you. i'll let everyone have a shot at this. we can go down the line. we'll start at that end with you. >> thanks. when you're thinking about these issues of control, it's important to recognize the ways in which people try to reassert control even if they don't have it legally. i'm referring specifically to self-help measures people use to try to limit the flows of information, to try to obfuscate information and behavior, to deliberately do certain things to project a certain kind of image to whoever they worry is looking at their data. and these things have a substantial cost. if you're going to do a utilitarian balancing, like paul was talking about, you need to take into account the ways in which resources are spent, and
12:05 pm
sometimes really wasted, in a kind of arms race between self-help and strategic behavior on the one hand and attempts to overcome that on the other side. and those costs can often be substantial. just ask any teenager about their online use, and what you'll hear about privacy is an elaborate story about technical countermeasures and strategic behavior. >> paul? >> i think your point is generally well taken, which is to say that fundamentally the notion of control is at odds with government collection of information, whether it's for the purpose of imposing a tax under the irs or law enforcement or national security. that doesn't mean that it's not an important value. it is one that many would advance, and i see no reason to discount that at all. but in some ways, if you advance that as the touchstone of what
12:06 pm
you mean by privacy, you're setting privacy in opposition to government action in a host of areas where people might reasonably want control -- you know, i'm sitting here as a republican on the panel thinking of all the friends i have who are second amendment people who think that the government should not collect any information about their gun ownership. and, you know, that's a perfectly reasonable position for them to have. it's just not one that we currently accept in society. and the last point i would make, which is just in response to liza because she's mentioned it twice: when i was last in government, the percentage of searches that were conducted without warrants was actually quite high, on the order of 50%. now, i don't know if that's changed much, because it's been a while since i've been a prosecutor, but many, if not most, of our typical interactions
12:07 pm
with law enforcement are adjudicated on an ex post reasonableness standard rather than an ex ante one. i don't have the data, but i seem to recall it's often a post-activity, as opposed to a pre-activity, judicial review. >> dan, you have the last word. >> sure. a few really quick points. first of all, even if you can't always give people total control, there are certain partial forms of control you can give them. the other thing is that it's not just people being in control; it's that the uses and gathering of the information are under control. that's another important piece: appropriate oversight and accountability and controls on that gathering, too. on the fourth amendment, i think that it would be wrong just to track existing supreme court interpretations of the fourth amendment, which i think are flawed in a lot of cases.
12:08 pm
i think there are a lot of exceptions to the warrant requirement, a lot of instances where the fourth amendment doesn't even get applied at all, because the court has this platonic conception of privacy that is incredibly narrow. that's how we get the third-party doctrine, and how we get a lot of bodies of fourth amendment law that oftentimes take the fourth amendment away from any kind of balancing approach. you know, the fourth amendment is, i think, a utilitarian balance. it doesn't say privacy. it says the right to be secure against unreasonable searches and seizures. and i think that means that any time the government is engaging in searches and surveillance and gathering information, it is unreasonable if it's creating problems that are not adequately dealt with through the right amount of oversight and accountability. that's really what the fourth
12:09 pm
amendment is trying to impose there: either justification to gather information, such as a warrant and probable cause, or appropriate oversight to make sure an independent judicial body looks at what the government wants to do and evaluates it. i think it's very important that we conduct the balance between privacy and security appropriately. i'm not a privacy absolutist. i think there should be a balance. but i think it's very important that when we balance, we balance correctly and don't skew the balance too much to the security side, overweighing the security interests -- because it's not the entire security interest on the scale. it's the marginal difference between the security interest without certain kinds of oversight and accountability and the security interest with oversight and accountability. and i think all branches have a role to play in this. congress in the 1970s had a
12:10 pm
church committee which did an extensive review of intelligence agencies and produced a very illuminating public report. congress hasn't done anything like that since. i think it should. i think the judiciary has a role to play. i think this body has a role to play. and i think the people ultimately are the key to all this; they have a role to play. >> thank you. we're now going to have 20 minutes of questioning from my fellow board members. and i think i'll start with the chair and then i'll -- >> thank you. liza raised a question about the proper standard for privacy and referenced the katz decision on expectation of privacy, and in some ways people rely on practical obscurity because it is too complex or burdensome for the government to gather the information. in some ways in the computer age, we're beyond that: the court file that was gathering dust is now easily
12:11 pm
accessible and public. the question is, how should we look at privacy issues when public databases are so readily available? there's also a reference to the fact that the line between government and commercial databases isn't always clear, and the government can access commercial databases. how do we look at privacy when the information is out there, is publicly available, but yet, as ed pointed out, you combine it into a mosaic and it can create a very detailed profile? and should the government be collecting that information? so, what standard should we apply in this context? what's the katz of 2014, as far as how the government ought to recognize privacy issues? i'm happy to go down the line. >> well, whoever wants to take it. >> go ahead. >> i might note that timewise we're going to have about five minutes. so, if you could keep your
12:12 pm
answer -- your comments relatively brief, we can make sure everybody gets their full complement of time. >> i'll be super brief. i think that with what's become known as the mosaic theory, which we see in the concurrences in the jones case, the supreme court is starting to look at this very question. i can't really answer it in a few seconds, but i think the approach is to look at, when we combine various pieces of data, what are the implications of that? when does the combining of that data reveal new information that can create certain problems and harms to people? that's where we want to step in.
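a minimal sketch of the kind of combination dan describes, assuming two toy record sets keyed by the same person; every name and field here is invented for illustration:

# two individually innocuous record sets about the same person
locations = {"alice": ["clinic parking lot", "pharmacy", "office"]}
purchases = {"alice": ["vitamins", "medical supplies"]}

def mosaic_profile(person):
    # join independent records into one profile; neither source
    # says anything sensitive on its own, but the combination can
    return {
        "locations": locations.get(person, []),
        "purchases": purchases.get(person, []),
    }

print(mosaic_profile("alice"))
# the joined profile supports an inference about health that neither
# dataset states explicitly -- the "new information" at issue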
12:13 pm
>> i would make two quick points. the first is, of course, that practical obscurity is itself a sort of post-industrial concept. if you were in a medieval village back in the 1200s, there was no practical obscurity. it was limited to who you knew, and they knew everything about you, pretty much.
12:14 pm
as data aggregation systems advance, practical obscurity becomes something we value more, and i agree with that. i think dan is exactly right. the mosaic is real. to deny that is to deny the reality of the science that ed knows. and so it strikes me that the most likely points of intervention are either at the collection of the data or at its aggregation. i tend to think you can't do it at collection, because the databases are already there. it's so big. it's impossible to stop; unless you're going to stop google from collecting, we're going to have big data collection. it's got to be when the government chooses to aggregate, or chooses to act upon the aggregation. >> i agree with what's been said about the mosaic theory, and the other way to look at it is that the information being gathered by the government is, in fact, information that, using normal powers of human observation, would be in a person's control and would not be something the government would have access to. the one thing i would say is that i don't agree that the point of collection is a moot point, because the mere fact that google has all of this information, facebook has all of this information, does not mean the government has all of this information. there are burgeoning new technologies whose use has not been decided, such as uavs and how the government will be allowed to deploy them. so there's still plenty of room to regulate at the collection phase, and for all the reasons we discussed earlier about chilling effects and about what privacy means to different people, i think that is the point at which the privacy interests arise. >> i'll be very brief as well. along with what the other panelists have said, i would also point out that much of the information that is in corporate
12:15 pm
databases is information that was observed and not disclosed. and there is not always consent, or often the consent is very thin, from the person the data is about. so i don't think you can always infer that there was awareness: you can't infer that a user was aware the information was collected, or aware it might go to the government and be used for government purposes. >> and, therefore, should the government not collect the information under those circumstances? >> well, i hesitate to offer a legal opinion here, not being a lawyer. but i should say, as a policy matter, i get very nervous when it appears there is a legal fiction that something has
12:16 pm
happened when it's clearly not happening. so a fiction of consent, or a fiction that the mosaic effect does not exist, is troubling. my time has expired. >> rachael brand? >> thank you. thank you all for being here, first of all. going back to this notion of control that judge wald was asking about, you went to the fourth amendment concept. i'm interested in whether the notion of control that's embodied in the fipps, which is more of an individual participation concept, can apply in the national security surveillance context. as one of you noted, maybe no individual would say, yes, i consent to being surveilled by the fbi or anybody else, and if that were the standard, then you couldn't have surveillance programs.
12:17 pm
so, how do you -- the fipps, layered on top, would impose that on governmental agencies. can that notion survive at all in the national security context? what's your opinion on that? >> i think it can apply, but i'm just sort of pausing because i'm thinking about some premises of the question. it's not the case that you couldn't have surveillance programs if people did not consent to the disclosure of their information. the government can obtain your information with a warrant based on probable cause. >> no, no, my point is we're beyond the fourth amendment now. we're layering on top of the fourth amendment the fipps' kind of individual participation. the reason i ask is, for example, when the nsa published their report on targeted data collection under executive order 12333, they said they were applying the fipps, but then they turned around and said the individual participation concept doesn't apply, so we're not applying that part of it. what i'm wondering is if the fipps is just not the right
12:18 pm
framework to apply, or does this individual participation element just not apply, and should we look for some other framework or standard to use? that's what i'm getting at. >> if you don't mind, i'd like to think about that question, and maybe i can put it in writing along with my testimony. >> okay. >> i have a thought on it. i think that the fipps model has some flaws. a lot of times people don't read the privacy policies of companies, and i'm not sure that just providing a notice is effective. so we do need to think about what works in this context. i think the key is that in certain cases we might want individuals to play a greater role. take the tsa: if you're on the no-fly list, i think you should have a right to be heard. there should be rights of redress there, and a right to challenge your being on the
12:19 pm
list. so i think there the fipps make sense. some of the fipps, like security, make a lot of sense; others might not. but i think the larger component of all this is that there's adequate control and accountability, which is also part of the fipps. not everything within the fipps, such as individualized notice of everything that's collected, is really feasible. there is a broader transparency right in the fipps, too: not that individuals get notified of every collection about them, but that there is public accountability and generalized exposure of what's going on. >> i thought that the acknowledgment in the nsa report that some of the fipps principles couldn't be fully implemented in the context of a national security surveillance program was an absolutely accurate acknowledgment of reality. you can't provide notice and error correction in all circumstances. i certainly agree that -- i mean, i was talking more about
12:20 pm
the secondary screening on the no-fly list, where we do have more robust rights. but the challenge for you is going to be trying to figure out what the underlying values are and how to get at those. in this context, i think the underlying value is prevention of governmental abuse. that's what animates everybody in this sphere -- that, and government surveillance modifying behavior. and the types of accountability and transparency that you have to help build are ones that match the operational needs of the national security system while providing protections against abuse. we tried that with the intelligence committees and the post-church committee modifications, something that we might call kind of delegated transparency, where we all trust the congress to do it right. it seems as though we're less willing to do that now. personally, i'm not so certain
12:21 pm
that that's a good impulse, but it seems that's where we are. so maybe it's this board. maybe it's a judicial panel with a cleared advocate in front of it. there are lots of mechanisms short of the complete transparency, accountability and individual participation parts of the fipps that could be imagined, ones that would achieve the objective of controlling against governmental abuse and misuse while not completely frustrating the operational necessities that most of us see as remaining regnant. so, i think a lot of it would be things that are more in ed's bailiwick, which are thinking about the use-case scenarios in advance and building in enhanced privacy protections at a technological level. and then you can have as much of your cake as you want and still get to eat some of it.
12:22 pm
>> it seems there should then be a greater obligation to further the goals of the principle you can't implement. so, for example, if you can't offer the right to control or correct errors in the data, you could imagine asking for greater effort to ensure the correctness of the data as it stands, or extra safeguards ex post regarding the possibility of error. >> thank you. >> did you collect any thoughts that you -- okay, very briefly. >> yes. and i think i would agree with ed. i mean, part of what i was struggling with is how much we are giving up on regulating collection, which i'm not willing to do by talking only about post-collection; that's why i wanted to go back to that issue
12:23 pm
of surveillance and control of information. i still want to go back and look more. this is honestly something i haven't thought about enough, and i want to go back and look through the fipps, which we used all the time on the hill to craft our privacy amendments. but it sounds to me like the best approach is -- i want to look more closely. >> thank you. we'll be glad to see any later submissions from any of the panelists. before we go on to beth cook's questions, i want to remind the audience: if you have any questions, write them down and they'll bring them up to me, and then i will -- okay, they're coming. that's good to know. okay. ms. cook? >> so, thank you all for what i have found to be a very, very interesting panel, and i hope it bodes well for the rest of the day. in fact, i think
12:24 pm
panel three will be dealing with exactly how you deal with the fipps. i was also struck by the numerous mentions of the mosaic theory. obviously it has other implications, one under transparency: to the extent we are transparent in seemingly discrete ways, our adversaries are also looking to aggregate information. and i think there is an argument that the mosaic theory is critical -- you need to understand the mosaic theory to understand collection, to understand exactly how the national security apparatus works. they have to be able to aggregate information. you can agree or disagree, but i was struck by the different implications of the mosaic theory. so i wanted to start with you, professor felten. i was really interested in your notion of moving away from the brute
12:25 pm
force collection mechanism. and i think the section 215 program is one where the government has essentially made the argument that they need the brute force collection, they need to have the retention, in order to identify previously unknown links and information. have you given thought to whether there are technological options available to limit collection for a program like section 215? and if you haven't, then more generally, could you speak to collection options? >> yes. well, with respect to section 215, the data, of course, is collected initially by the phone companies. right? then there's the question of whether the information needs to be transferred in bulk to the intelligence community in order for them to be able to do their analysis. and i think it's pretty clear that as a technical matter, the kinds of linking, looking for,
12:26 pm
say, multihop links that the intelligence agencies want to do, can be done technically while the information is still held by third parties, such as the phone companies. this requires a modest amount of technical coordination between the entities holding the data and the entities doing the analysis. so there are opportunities to match, to look for whether there are paths of two hops or three hops from point "a" to point "b," et cetera, and then to reach in and extract just the data items of those individuals or phone numbers that are highlighted by that kind of analysis. that's the kind of thing that can be done.
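a minimal sketch of the kind of bounded hop query felten describes, assuming the call records stay in a simple in-memory structure at the carrier; the function, numbers and hop limit are illustrative, not any agency's actual method:

from collections import deque

def contacts_within_hops(call_log, seed, max_hops=2):
    # breadth-first search over the carrier's call graph, stopping at
    # max_hops; only the matching numbers ever leave the carrier
    graph = {}
    for a, b in call_log:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    seen = {seed: 0}
    queue = deque([seed])
    while queue:
        number = queue.popleft()
        if seen[number] == max_hops:
            continue  # don't expand past the hop limit
        for neighbor in graph.get(number, ()):
            if neighbor not in seen:
                seen[neighbor] = seen[number] + 1
                queue.append(neighbor)
    seen.pop(seed)
    return seen  # number -> hop distance from the approved seed

log = [("555-0101", "555-0202"), ("555-0202", "555-0303"),
       ("555-0404", "555-0505")]
print(contacts_within_hops(log, "555-0101"))
# {'555-0202': 1, '555-0303': 2} -- the rest of the database stays put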
12:27 pm
there's further work that is more technical, which goes to questions of whether you can use, say, advanced cryptography to allow that same analysis while not disclosing to the phone company information about which numbers are being searched or linked. those sorts of methods are, i'd say, developing, and there's been some interest in the technical problem of how to do this in the independent research community, in light of what we've learned publicly about the section 215 program. one of the lessons is that methods are often available or developable when you have a specific technical problem like this.
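one illustration of the developing methods felten alludes to is a textbook diffie-hellman-style private set intersection, sketched below with toy parameters; this is a generic construction for exposition, not the research he refers to, and nothing here is production-grade cryptography:

import hashlib, secrets

P = 2**127 - 1  # prime modulus (toy choice; real protocols use vetted groups)

def h(x):
    # hash an identifier into the multiplicative group mod P
    return int.from_bytes(hashlib.sha256(x.encode()).digest(), "big") % P

a = secrets.randbelow(P - 2) + 1  # agency's secret exponent
b = secrets.randbelow(P - 2) + 1  # carrier's secret exponent

queries = ["555-0101", "555-0777"]                  # agency's selectors
subscribers = ["555-0101", "555-0202", "555-0303"]  # carrier's records

# 1. the agency blinds its selectors; the carrier never sees them in the clear
blinded = [pow(h(q), a, P) for q in queries]
# 2. the carrier re-blinds the agency's values (order preserved) and
#    blinds its own set, returning both
double_blinded = [pow(x, b, P) for x in blinded]
carrier_set = [pow(h(s), b, P) for s in subscribers]
# 3. the agency finishes the exponentiation and looks for matches
carrier_final = {pow(y, a, P) for y in carrier_set}
hits = [q for q, d in zip(queries, double_blinded) if d in carrier_final]
print(hits)  # ['555-0101'] -- a match is found without exposing the selectors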
12:28 pm
>> i think our biggest challenge is taking the concepts that we're talking about today and developing practical, feasible recommendations that can actually be implemented. so, the more concrete and the more specific we can be in terms of recommendations, the more likely they are to be implemented. briefly, to both of the professors in the middle: you both talked a little about risk mitigation and, assuming there are going to be harms, how you mitigate those harms. past the collection stage, what have you found to be the most effective mechanisms for mitigating risk? is it retention periods? is it access control? is it audit trails? what can the government do concretely to start mitigating risks? >> well, i think it's not really just one thing i could point to and say, that is it. all of those things are very valuable to do. everything from mechanisms to ensure that the information is accurate: when information is grabbed from one context to another, you know, what's accurate enough for the purposes of amazon.com to recommend books for you is not the same level of accuracy we might want from the government. amazon makes a mistake and recommends the wrong book to you -- big deal. it doesn't need 100% accuracy for that. but the required level of accuracy differs as the context differs. we need to have mechanisms to ensure that information taken from one context and put
12:29 pm
into another is appropriately accurate for that particular context. we need an analysis of how long we keep the data, audit trails to make sure it is not being improperly accessed, appropriate accountability to make sure it is kept adequately secure, and controls on its use so it can't be used for just any purpose ten years from now. so we need all these different things, and oversight from a lot of different bodies, i think. it is actually a complex thing with many, many parts. >> there are certainly many moving parts. but from my perspective, both from outside and when i was inside, since governmental misuse is the primary threat we're talking about, the principal factors i would focus on, the ones that seem to be effective, are ones that focus on individual government actors: training in the first instance so they know the rules, a culture of compliance with
12:30 pm
pre-error mechanisms built in, then obviously a lot of audit and compliance work from outside inspectors general and/or congress. and then finally -- and this is perhaps where we fall down the most -- the willingness to impose at least administrative sanctions on people who vary from the accepted rules, at least in a willful context and perhaps even in a negligent one. you know, nothing attracts the attention of a government employee so much as the prospect of losing his job or being suspended for a term of months. so that would be where i would focus. >> i just have one comment. >> sure. >> if we look at the failures of compliance that have been acknowledged, we see some of them that are individual employees doing things they
12:31 pm
shouldn't. but we've also seen some that are failures of the technical systems to behave consistently with the internal policies. and this is a case where oversight can operate without needing to get deeply into the nuts and bolts of the technology: just the question of what processes are in place to make sure that your technology does what your general counsel says it should do. i think there's an opportunity to push on oversight in that area.
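a sketch of the kind of mechanical check felten gestures at: access goes through a wrapper that enforces the stated policy and logs every attempt, and the log itself can then be audited against the same policy. the retention window and authorization table are assumptions made up for illustration:

import time

RETENTION_SECONDS = 180 * 24 * 3600          # assumed 180-day retention rule
AUTHORIZED = {("analyst_a", "ct_inquiry")}   # (user, purpose) pairs approved in advance
AUDIT_LOG = []

def fetch_record(store, key, user, purpose):
    # release a record only if the (user, purpose) pair is authorized and
    # the record is inside its retention window; log every attempt
    now = time.time()
    record = store.get(key)
    granted = ((user, purpose) in AUTHORIZED
               and record is not None
               and now - record["collected_at"] < RETENTION_SECONDS)
    AUDIT_LOG.append({"ts": now, "user": user, "purpose": purpose,
                      "key": key, "granted": granted})
    return record if granted else None

def audit(log):
    # after-the-fact check that the system did what the policy says:
    # flag any granted access the authorization table doesn't cover
    return [e for e in log
            if e["granted"] and (e["user"], e["purpose"]) not in AUTHORIZED]

store = {"r1": {"collected_at": time.time(), "payload": "..."}}
fetch_record(store, "r1", "analyst_a", "ct_inquiry")  # granted, logged
fetch_record(store, "r1", "analyst_b", "curiosity")   # denied, logged
print(audit(AUDIT_LOG))  # [] -- no granted access outside the stated policy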
12:32 pm
>> thank you. i think we'll move to jim dempsey now. >> thank you. and thank you to all of the witnesses. i think it's very important as we wrap up this panel to highlight what i at least heard as an awful lot of commonality, because i think it's important for the board. i think all of you agree that privacy is an umbrella term that covers many different values, many different interests. and i also heard agreement that the mosaic theory, even if it hasn't been accepted by the courts, is real. it's real both from the privacy perspective and it's real from the governmental perspective. >> let the record reflect you're nodding.
12:33 pm
>> and thirdly, i think i heard unanimity on what the law refers to as the third-party doctrine: the doctrine that by giving information to one person, you lose all privacy interests in that information -- that disclosure to one surrenders your rights with respect to disclosure for any other purpose. again, if i heard right, there was agreement that that concept, that disclosure to one is disclosure to all, is not valid -- constitutional questions aside -- for modern-day reality. that doctrine just doesn't fit with the way we view information and the way we view privacy. and dan is nodding. paul, would you agree that disclosure to one is not a surrender of all interest in the
12:34 pm
information? >> i would say that, given the way people interact today, it would be inappropriate to infer a concept of universal disclosure from disclosure to a single person, yes. i'm not sure i would agree with what's implicit in your question, which is that it necessarily follows that that is a matter of either constitutional significance or legally cognizable significance that should animate this board. i want to think about that. but i would certainly accept the premise of human experience that if i tell dan a secret, i'm not expecting him to tell everybody. >> in fact, taking the instrumental approach, there's an instrumental value here: the disclosure of your medical records to the doctor is specifically premised on the
12:35 pm
notion that thereby you have not surrendered your privacy rights. in fact, we want people to accurately disclose information to their doctors. therefore we promise them that their disclosure to the doctor is not disclosure to all. >> that's true. that's a wonderful example, because we accept statements made to a doctor as an exception from the hearsay rule precisely because we think that when you talk to a doctor in an emergency situation -- i was shot, doc -- you're motivated to actually be telling the truth. so the doctor can in some circumstances be compelled to -- so those realities work both ways. >> can be compelled but not -- >> cannot be collected -- not collected under -- >> yes. >> also, several witnesses mentioned the fipps. first of all, it's important to say we're talking about the fair information practice principles, of which there's actually no definitive version, but there is a
12:36 pm
version that was adopted by the department of homeland security in 2008, which is as good as any, i think. and it seemed to me also that there was agreement that the fipps provide the framework for asking the questions. they are nowhere perfectly implemented, nowhere fully implemented, but they are relevant as a framework for asking how you deal with information. and then you decide: do you adjust it? does it work? if it doesn't work, do you compensate with more emphasis on other principles? is that again a fair -- paul, you're making a somewhat skeptical face. you at least would say that it is a framework for asking the question? >> it's a framework -- a starting point -- for asking the questions.
12:37 pm
but i think that many of those questions don't withstand the technological transitions we're going through. and so i accept it as a leaping-off point, but i think i'm probably more willing than some of the others to discard some of them as inoperable under current circumstances. >> and what would you replace them with? >> well, as ed said, emphasis on the remaining aspects, and then, to my mind, a kind of more granular analysis of the underlying interests at stake, thinking about what the mechanisms are for protecting the privacy interests we're talking about. because, you know, the fipps is kind of one-size-fits-all, and i just don't think it covers the range of privacy interests that the chairman outlined so ably. >> so ably. >> earlier in the day. >> okay, thank you.
12:38 pm
>> okay. we have a couple of questions from the audience. i'm not sure we'll get through all of them, so i'll just be arbitrary and direct them to particular panel members -- and please keep your answers as brief as you possibly can. the first one the writer wanted directed toward you, ms. goitein: when the government draws data from private databases, which point requires more regulation -- the private entity's collection, or the government's collection from the private entity? >> when you say -- >> it's a yes or no. >> i don't think i can answer that question. it depends on what you mean by more regulation. obviously when you disclose certain information to your telephone company, you are in a contract with that company, and that contract regulates your dealings with the company. i think one of the
12:39 pm
problems with the metadata program under section 215 of the patriot act is that there was no reading of the contract that would enable any person to know what they were consenting to -- to know that their information would go to the nsa. >> so the answer is both? >> both, just different types of regulation. there's the contractual regulation; there is some degree -- i mean, the stored communications act is regulation of when you can get certain kinds of information from the telephone company. and then for the government there's the fourth amendment and all manner of law. so lots of regulation everywhere. >> okay. for you, dan. the writer says private companies have no incentive to coerce or imprison people, and perhaps that's why the risks of injury might be greater from the government than from private companies. but the writer asks, does that take into account the homeland
12:40 pm
security and prisons industry? the nsa couldn't do what it does without 484 contractors providing i.t. technical support. are there risks inherent in the increasing commercialization of national security interests? >> well, yes, i definitely think problems can come from anywhere. i don't think there are inherent things that can be said about where problems will be caused. i think we want to look at, you know, when does collection and the amassing of data by the private sector cause problems? when does access by the government cause problems? increasingly we see the cooperation of an industry -- a private sector that has grown up to perform government functions, to help gather data, help analyze data and then share data with the government. i think all these things create various problems that we need to address. and so i think we keep our eye
12:41 pm
on the problems rather than looking elsewhere; we just look at the problems and address them wherever they may happen. i think that's the best approach. >> okay. here are two more -- i think this one must be for you, ed felten. could the panelists discuss what they think the tesla -- i had to ask what that was -- of today is? what technologies of data flow analysis could or should be built in? i know you've covered a great deal of this before, so if you could just give us a one- or two-sentence summary, that would be fine. >> in a sense, the question is asking me to sum up a whole area of knowledge in a few seconds. >> yes, i understand. >> which i won't try to do. i would simply say that as with cars -- as with the tesla, you know, some sort of high-end car -- you should think in terms of
12:42 pm
which technologies are available and reasonably practical to use to minimize, control or limit the risk of a given information practice, and then ask that those be there. you should ask that an entity that wants to collect and use the information be willing to justify the choices it has made, and be willing to justify why it did not use some accepted privacy-preserving technical method if one seems to be available. >> okay. the last one is -- paul, i don't think this is your natural bailiwick, but what about the operation of privacy in quasi-federal organizations like the postal service or, if i can remember back to my old judicial background, the pbgc -- that's something-benefits corporation -- >> pension benefits. >> pension benefits. how are they impacted by the
12:43 pm
fourth amendment? are there issues of concern for privacy in those organizations? >> i suppose the honest answer would be: i'm not sure. but my understanding is that the fourth amendment applies to those institutions insofar as they are exercising governmental authority and acting as agents for the government. so i assume that postal service employees can't open your mail willy-nilly just because they're pseudo-private actors. i may be wrong about that, but since they don't open my mail -- jim's nodding no. i'm right. so that's good. i think the implication of the question, which is the most interesting part of it -- so i'll transition to something i do want to talk about -- is that it
12:44 pm
emphasizes the point liza made, which is that the line between commercial collection and government collection is increasingly blurring. and, you know, the idea that we regulate the government's collection but not google's sits in dissonance, and there are these places that are halfway in between. for me, that suggests one set of answers, because i'm unwilling to think about wholesale government regulation of corporate business practices. i think there's something there, but it certainly emphasizes the confluence between them. >> okay. well, that ends my part of the panel, unless the chair has some parting words? >> thank you very much. >> thank you very much. you've been extremely forthcoming.
12:45 pm
>> thanks to the panel and for the audience questions. we'll take a 10- to 15-minute break and resume with the technology panel. >> we'll return to the privacy and civil liberties oversight board in just a moment; it is ending its first panel here on privacy in government counterterrorism programs. we'll be showing the second portion of the meeting coming up. first, though, a reminder that president obama will be in las vegas today speaking about immigration at del sol high school, where he appeared last year to talk about his vision of immigration reform. we'll have the president's remarks on companion network c-span, scheduled to start just before 4 p.m. eastern. we'll also be taking your calls. this morning house speaker john boehner responded to the president's announcement last night on immigration. here are his remarks.
12:46 pm
>> morning, everyone. our nation's immigration system is broken, and i think we need to work together to fix it. but fixing it starts with a commitment to working through the democratic process and enforcing the laws that the president is sworn faithfully to execute. all year long i've warned the president that by taking unilateral action on matters such as his health care law or by threatening action repeatedly on immigration, he was making it impossible to build the trust necessary to work together. as i warned the president, you can't ask the elected representatives of the people to trust you to enforce the law if you're constantly demonstrating that you can't be trusted to enforce the law. the president never listened. by his action, he's refused to listen to the american people. the president has taken actions
12:47 pm
that he himself had said are those of a king or an emperor, not an american president. and he's doing this at a time when americans want nothing more than for both parties to focus on solving the biggest problems in our country, starting with our still-struggling economy. and the action by the president yesterday will only encourage more people to come here illegally, putting their lives at risk. we saw the humanitarian crisis at our border last summer. how horrific it was. well, next summer, it could be worse. and this action also punishes those who have obeyed the law. by this action the president has chosen to deliberately sabotage any chance of enacting the bipartisan reforms that he claims to seek. and as i told the president yesterday, he's damaging the presidency itself.
12:48 pm
president obama has turned a deaf ear to the people that he was elected -- and we were elected -- to serve. but we will not do that. in the days ahead, the people's house will rise to this challenge. we will not stand idle as the president undermines the rule of law in our country and places lives at risk. we'll listen to the american people, work with our members and work to protect the constitution of the united states. >> reporter: speaker, the president says -- >> do you want to know what the rules are? >> reporter: speaker, the president says you could have prevented this, mr. speaker, by showing that the house was going to take action on comprehensive immigration reform. did you miss the boat? >> the president made 38 unilateral changes to the affordable care act. the president repeatedly suggested that he was going to unilaterally change immigration law. and he created an environment
12:49 pm
where the members would not trust him, and trying to find a way to work together was virtually impossible. and i warned the president over and over that his actions were making it impossible for me to do what he wanted me to do. >> reporter: mr. speaker, can you tell us how you plan to respond? how the house plans to respond? when the house will respond? and whether or not you agree with chairman rogers that, given how these are funded, the power of the purse is not a constitutional avenue to fight this? >> we're working with our members and looking at the options that are available to us. but i will say to you, the house will, in fact, act. >> reporter: speaker, you started a piecemeal process last year, or earlier this year. can you reinvigorate that, or are you going to start from scratch? >> as i said at the beginning of my remarks, we have a broken immigration
12:50 pm
system, and the american people expect us to work together to fix it. we ought to do it through the democratic process, moving bills through the people's house, through the senate and to the president's desk. thank you. happy thanksgiving. >> tonight, an encore presentation: we will have mr. osnos' interview tonight at 8:00 eastern on c-span. on capitol hill, the house oversight committee is asking former obama administration
12:51 pm
adviser jonathan gruber to testify. the chair of the committee sent a letter to mr. gruber yesterday, two weeks after videos surfaced. the chair has also invited the director of the centers for medicare and medicaid services, after it was found that the obama administration had inflated the health care enrollment numbers by counting dental plans, helping lift the number of enrollees past the 7 million mark. the hill says gruber has been fired by north carolina state auditors. read more at the hill. up next, the privacy and civil liberties oversight board continues, with a discussion of oversight for private companies and the federal government.
12:52 pm
>> good afternoon. welcome back to our meeting on defining privacy. we will continue with our afternoon session with government panelists, moderated by a member of the board. >> welcome to those who were here this morning and those who were not. one thing we noticed this morning -- alex, it will be relevant for you -- make sure the microphone is actually pointed in the direction you are talking, so that even if you pull it in front of you and then turn to talk to us, the microphone is picking you up. the panelists were having problems this morning, and we have all been gently reminded as well. this panel is about the privacy interests addressed by government privacy officials. obviously, in the
12:53 pm
counterterrorism context, defining and articulating individual privacy interests while balancing the needs of national security is extremely challenging. as we discussed a bit this morning, widely accepted privacy frameworks like the fair information practice principles may very well be in tension with the necessity to protect information regarding the operation of a particular counterterrorism program. by the same token, some counterterrorism programs could be better served by greater transparency about what information is being collected, about the statutory or other authorities pursuant to which programs are being operated, and about what protections the government utilizes to minimize the negative impacts on individuals' privacy. the panel we have today for this forum is, i think, uniquely
12:54 pm
situated to discuss the privacy issues that arise in the context of federal counterterrorism. they have been pioneers, many of them, in the practice of working proactively within the agencies to ensure privacy and civil liberties concerns are taken into consideration from the beginning of programs. and if that were not enough of their duties, they are learning to live with us and work with us. joining me today are three individuals -- unfortunately, dhs was not able to make anyone available for this, as it turned out. so we have three folks. they will have ten minutes each, given that they have a little bit of extra time, but we will follow the same framework. i will then ask a series of questions for a period of time and then invite my fellow
12:55 pm
board members and the public to submit questions as well. leading us off is alex joel, who is the civil liberties protection officer for the office of the director of national intelligence. do you fit that on one card? >> yes, i do. >> it's amazing. in that capacity, he leads the odni's civil liberties and privacy office. he reports to the director of national intelligence. prior to joining the government -- and this is relevant based on our other panels -- alex served as the privacy, technology and e-commerce attorney for marriott, where he helped establish their global privacy compliance program, including their first privacy officer position. alex, did you want to kick us off? >> thank you. i want to thank the board for -- >> there's a stoplight function going on. green, good to go. yellow, start wrapping up.
12:56 pm
red: stop, in the front row. >> okay. i want to thank the board for inviting us here to address the public in this very important hearing. as you said, the board does work very closely with us. we feel that the board's role in providing both transparency and oversight, as well as advice to the intelligence community, has been extremely valuable and is a critical part of how the intelligence community protects privacy. i want to thank the board for the careful way it has exercised its functions, which i think has been critically important. this topic is, of course, one that consumes all of us -- not specifically how to define privacy, but how to apply the protections required to safeguard privacy in the context of our activities, and in particular in
12:57 pm
counterterrorism activity. i would like to get to what i think of as the heart of the matter from an intelligence community perspective, which is that we operate by necessity within a sphere of secrecy. we have to be able to maintain secrets to be effective. the more publicly transparent an intelligence service is, the more it informs adversaries of how it is collecting information, and the better able they are to avoid detection. a fully transparent intelligence service is by definition ineffective. the key for us, then, is how, within the sphere of necessary secrecy, you make sure that the intelligence agencies are acting appropriately and in a way that protects people's privacy consistent with the values of the nation.
12:58 pm
in the past what we have done, as you know, is focused on ensuring that we are providing full transparency to our oversight entities. our oversight system is something i would characterize as a system of many layers with many players. we have not only, within each agency, offices of general counsel and inspectors general as well as newly created privacy and civil liberties offices, but also, outside the agencies, entities like the department of justice, which is responsible on a government-wide basis for exercising some of the authorities and oversight controls. we have, of course, newly created entities like the privacy and civil liberties oversight board -- perhaps not that new anymore -- which again is designed to make sure that there is a secure place for information to be disclosed and discussed, so that the oversight institutions are satisfied that the activities
12:59 pm
being conducted are proper ones. and then, of course, we have congress and the judiciary, both of which exercise robust oversight. i would mention, for example, that the congressional oversight committees, which were established after the church committee hearings in the 1970s particularly to provide this level of oversight over intelligence activities, have been very effective in my view in providing careful oversight of what we do. that's sort of the oversight part of the equation. i think what we have now more fully realized is the need to enhance transparency. so if you think of it -- i was just thinking about this before i started talking, which is always dangerous -- but if you think of it as operating within a sphere of secrecy, one approach is to make sure the rules and oversight structure within the sphere are robust enough to
1:00 pm
make sure that privacy is being protected. and then there's the other way of approaching this, which we're focusing on doing, which is reducing that sphere: providing greater transparency into what goes on inside intelligence agencies, so the public at large can get reassurance and can provide input and feedback into how we conduct these activities. if i could just continue along this theme, there are two aspects in particular of what regulates our activities that i think are of interest. one is the rules that we follow. the other is the oversight framework and mechanisms designed to make sure we are following the rules. on the former -- what are the rules that we follow -- we can and should provide greater transparency. but a lot of the rules are being debated and discussed. you can think of some of the reform mechanisms as attempts to modify the rules. you have activity going on
