Key Capitol Hill Hearings  CSPAN  November 12, 2014 9:00pm-11:01pm EST

9:00 pm
because of a wiretap on the prime minister. that's what happens when you don't have encryption and security by default. to think that the terrorists aren't going to do the same thing, i think, is naive. >> alvaro, you talked about the expectation of privacy and, if i heard you correctly -- tell me if i'm wrong -- you're suggesting that we talk not about what people expect their privacy to be, because i can put up a sign saying i'm conducting video surveillance and destroy that, but about their expectation of what privacy should be? >> i'm not saying that. that's a separate wonderful, powerful argument. what i'm saying is that technology is making us realize that we do expect privacy in scenarios that didn't exist 10 or 15 years ago. so i think technology can expand your notion of privacy, but i also think the fourth amendment doesn't protect me and you, it protects us as a society and sets a baseline for the relationship between a government and its citizens that also needs to be
9:01 pm
protected. >> fourth amendment, mike, you talked about the balance between government requests and your customer's privacy, do you think the government should have a warrant every time it accesses your customer's records? particularly if they're american customers. >> yeah. certainly in the law enforcement context we've advocated for a reform of that that would in effect require a warrant for access to any content regardless of the age to precise location information, other sensitive data. you know, i'm not sure we gould so far to say that a warrant is required for every single case for every single data type. we certainly need to update the rules so that there is appropriate judicial review of surveillance programs and specific requests that we get for data. >> in terms of third-party doctrine, would you then essentially not have it be an absolute exception to the fourth
9:02 pm
amendment but essentially have it evolve to provide some protection, though not necessarily full warrant protection? >> yeah. the laws that we deal with in the law enforcement context provide a sliding scale in effect. they provide some reasonable oversight and protection, something below warrant and probable cause. and we've taken the position that that's appropriate for some types of subscriber data, et cetera. >> thanks. heidi, you talked about -- i want to put this in the context of how much information should be collected. you talked about enforceable rules for collection, but you also said that collection is going to be faster and cheaper, we're all going to be more connected, that attacks will increase, and that even compliance with rules may be more difficult. professor felten talked about potential abuse of information and increased possibilities of breach. how would you strike the balance
9:03 pm
between collection rules and essentially use rules? >> that's a very good question, a very difficult one. i don't know -- on the technology side of the house, i don't know if we really know where the balance is. we take a look at the attacks, we look at the system, we look at the capabilities, we look at the mere fact that all of these attacks and exploits are becoming so advanced. to give you one concrete example: i used to need to be physically around things that you touched to be able to lift your fingerprint, then have access to your phone and use that fingerprint to mount an attack. with the resolution of the cameras that we have these days -- a very high resolution camera -- i just need to have your picture that was taken somewhere in china to be
9:04 pm
able to zoom in, zoom in, zoom in and lift your fingerprint and mount an attack. how do you reflect things like this? should we build systems that, whenever there's a fingerprint in an image, smudge it so we don't expose it? there are things like that, encompassing all those cases, that should be buildable. what i'm trying to get across is that coming up with the rules that define those capabilities, or the things that should or shouldn't be done, is a very complex problem. >> thank you. >> so, thank you, guys, for another excellent panel. my first question -- and this goes back to what i had said on a previous panel -- is that i view our job to be translating these ideas, these concepts, these concerns into practical recommendations. so, starting with you, what have you found effective as a privacy officer to ensure your very large work force, your
9:05 pm
complicated work force dealing with emerging issues takes privacy seriously, your rules are enforced, and that from the beginning privacy is a part of your culture? this is free advice to the new privacy officer over at nsa. >> well, becky, as i alluded to in my opening remarks, you know, one, there's no silver bullet. you need to take a number of approaches. and we've taken a number of approaches to drive awareness and sensitivity around privacy throughout our work force, through a number of steps: mandatory training that's required for all employees that covers a range of ethical and compliance issues; deeper role-based training that's specific to software engineers, that's specific to sales and marketing people,
9:06 pm
that's specific to different roles that people play in a company that impact customer privacy. we have -- as i mentioned, not just sort of told people what the rules are and then crossed our fingers and hope they abide by them. we have put in check points in the way that we have developed our internal systems, the way you've developed a software and get it out the door that has to go through certain check points and reviews to ensure that privacy issues aren't missed or overlooked. there's a number of things that we've done along those lines to make sure that people are aware and have the tools available to them to do privacy right. but then there's also different checks along the way to ensure that mistakes don't get made. nothing is perfect, of course, but we try to do a multi-facetted approach or multi-layered approach to make sure we catch those things. >> let me follow up on this and it's a somewhat specific example but hypothetical.
9:07 pm
have you found training to be more effective or effective enough in the absence of pairing with mechanisms and processes. that was a horrible question. so i'm just going to start over again and say, 702 that program has certain legal requirements. in the private sector, would you train to those legal requirements or would you also have, for example, when an analyst is sitting there attempting to target or select or whatever they're going to do, also have at each stage of the screen or the process or however they're doing it rules reflected in the computer system that they're attempting to use? >> we do both. to the extent that you can use technology to enforce policy, that's always super effective because you get past -- or you reduce the potential for human error. but that's not always possible.
9:08 pm
you can't completely prevent mistakes, oversight, or intentional bad acts. so you need to do more than that. you have to have -- you have to build the awareness so that the inadvertent stuff is reduced. you have to build in the technology tools to prevent that from happening. then you need some level of checks to make sure that everything went right. if you have somebody trying to circumvent a policy for whatever reason, that there's some way to catch that before it creates a negative impact. >> and so i think i have time for one other quick question, in the section 215 program, one of the features was, in fact, not all of the call detail records went to the government. in fact, names are not provided originally to the government, subscriber information, simply numbers to numbers, would that
9:09 pm
be an example of de-identification and anonymization. that was my only question. >> i have a couple of very sort of brief questions, which i think you can answer very quickly. that way i'll get them all in. okay. i'll begin with -- you talked about how it would be good for us, and we already do have technologists on board -- based upon your knowledge here, does the government have technologists who worry at all about privacy? i know they have technologists, obviously, but based on your observations and study of the field, is this something that they consult the technologists about -- hey, we
9:10 pm
need this kind of information for national security, but we would like to get it or as much as we can? what's the balance off? does any of that kind of thing go on inside the government with technologists? >> right. so having worked a lot with the government, i know that they consult technologists greatly with security, with privacy, with compliance issues and how do we engineer software that takes all of that into consideration. i think if we look at the past five years or so or six years or so that you'll see that the nsa was really, really focussed on compliance. i think the results of the reports and the oversight have shown that they've done a really good job with that. when there's been an issue, they've dealt with it. i think someone mentioned the new cpo at nsa. what we'll see different now is not only are we complying with law going to be something that's factored into all of the software that's developed and all of the tools and techniques
9:11 pm
and procedures but also now, well, just because it complies with law, should we really be doing it? and what's the extra step we're going to take to really consider privacy at the on set? >> so you sound reasonably satisfied with the fact that they're taking it seriously and doing the best they can? >> i absolutely do. i wish -- i actually feel very comforted by the fact that the government has a ton of oversight and a ton of laws to comply with. and i personally am much more worried about the large collection and amount of collection that's taking place in industry that people don't really understand. >> all right. so i can get on to my next -- mr. bedoya, you talked about how important it was to limit collection to what was necessary or purposeful, et cetera. but in light of so many of the experts on both panels talked about almost like an inevitable momentum of collection, collection, collection, where
9:12 pm
would you look -- what part of the government or where would you look for the mechanism to try and limit the collection or get that kind of impediment or balance done? >> certainly. so i think folks have been saying that it's inevitable that industry is going to collect all this data. i don't think folks have been saying it's inevitable that government will collect it. taking that as a given, i think the question is about reconstructing the fire wall between government and industry with respect to data collection. and so i would be surprised if anyone on the panel thinks or previous panels think that it's inevitable the government will collect all this data. one other quick point on your previous question, i believe that the congressional committees that conduct oversight on fisa and on foreign intelligence certainly senate judiciary committee lacks
9:13 pm
technologists. >> we talked a little bit about that in our first report on fisa reform there. okay. mr. hinz, you talked earlier, you said one of your principles was there shouldn't be any bulk data collections. now, terminology is varied all over the place, it would help me if i knew what you meant by bulk collection. at a gathering i was at, they talked about the great importance of public health data, especially for when epidemics come along or that sort of stuff, so wouldn't some of that come under your ban against all bulk data collection? >> i was talking specifically about government surveillance programs. >> okay. i just wanted to clarify that. and what do you mean by -- give us an example of what you call bulk data. this has been a debate whether
9:14 pm
this program or that program falls under bulk data. >> certainly. i had in mind the 215 program in particular where government goes -- >> it's not targeted. >> yes, it's not targeted. correct. >> okay. i think that's all i have right now. >> we may be able to go back to board members for additional questions. i would like to continue with this panel up until the top of the hour. we have one question from the audience which i will read and we welcome others if others want to post questions, in 2005, the national academy of sciences studied whether patterned-base data mining can anticipate who is likely to be a future terrorist. it concluded that this wasn't feasible. the question is pattern-based
9:15 pm
data mining in the terrorism context, is it feasible today and will it be feasible ten years from now? would anybody like to address that? heidi? >> i don't know specifically about terrorism, mindful of what ed mentioned, that we have limited data. but there is a program that has been running in las vegas and with the lapd. we may not necessarily be able to identify specific criminals yet, but our predictive modelling systems have been at work. they're able to make a reasonably good prediction about where criminal activities are more likely. it is not precisely the question that you're asking, but i can assure you that it is just becoming better. i can assure you that any service provider that has the amount of data that we are generating -- and
9:16 pm
more and more of it is being generated -- is honing and fine tuning and polishing their models. whether it's going to be applicable to anti-terrorism methods, i don't know. i think all of these models are heavily data driven, so one would need a lot of data. but to the point that these models, this predictive modelling, are able to predict things that may relate indirectly to terrorism or criminal activities, the systems are suggesting that we are going that way. >> other thoughts on that question? there's a system in chicago that the chicago police department has deployed, which has been both touted and criticized, but it does make something like neighborhood or block level predictions as to criminal
9:17 pm
activity, as well as, i understand, individual-level predictions identifying people who may be either victims of crimes or perpetrators of crimes. again, both touted and highly criticized. any thoughts or comments? >> one quick one: the risk of creating a feedback loop. you see every crime that occurs on corner x -- to draw an overdrawn example -- because you thought it was really dangerous, which then reinforces the prediction. that's the main one from my perspective. >> so this is certainly not necessarily my area of expertise, however, predicting is different from being able to reconstruct after the fact. and so can we use these things to then -- when something has happened, go back and find whether we missed certain people that are still involved? yes, i do believe that's the
9:18 pm
case. in terms of prediction, i think we have a ways to go. by the same token, if i get a crime report of all the crime in my area, i can predict on a weekly basis where there's going to be crime in my neighborhood. so, we're getting there. >> but i mean, at some level that's just compstat all over again, the systems that have been available to police for decades. >> sure. >> one question and i'll go down the row again -- i'll pose a question and i think we can just go down the row, with additional board members following if they have additional questions. i had said in talking to each of the panelists that i didn't want this to be a panel about going dark and the implications of encryption, but several of you have alluded to encryption and
9:19 pm
its significance here. and i would ask any of you who would to comment on the following, which is there is a growing trend towards more and more devices, cheaper and cheaper wearables and the internet of things. more and more data collection occurring. there's also, it seems, a trend towards more encryption by default, whether it's at the device level or as mike was referring to in terms of the encryption of data flowing between data centers. so it seems to me like we have two things going on at once, which is not unushlg. somebody referred to the modern era, the era of the internet of
9:20 pm
things and big data and ubiquitous data flows as the golden age of surveillance. it seems to me that both trends will always be there: more and more information available both to the private sector and possibly to the government, and increasing pervasiveness, or at least increasing diffusion, of encryption. comments on that as a premise first of all -- the premise of my question, am i right? and secondly, where does that leave the government, and would you agree with my assumption that there will still be huge amounts of information available both to the private sector for its purposes as well as to the government? i guess let's go right down the
9:21 pm
row. professor? >> so, i believe that there will still be a lot of data that's available to the government. when i say that i really support encryption by default, i also really think that our country really -- we were the code breakers. and it was really critical in world war ii. and i think that instead of just kind of taking the lazy approach and saying, oh, leave us a back door, we should just get better at cracking the code, because they're getting smarter and we need to get smarter, too. so i leave it to the lawyers what the legality of when you can actually apply that or break into a system is, but being satisfied with just having a back door means that we're not advancing our state of the craft and our tradecraft here in this country, and we'll be left behind as a result.
9:22 pm
>> i'll actually pass. >> your thoughts on these two trends that seem to be occurring simultaneously? >> yeah. we're certainly seeing an expanded use of encryption. encryption between customers and the service provider, encryption between data centers, encryption on devices, et cetera, and that's being driven by customer demand. customers are concerned about the security of their data, and they're not just concerned about the security of their data vis-a-vis hackers and bad guys, they're increasingly concerned about their data vis-a-vis the government. that's driving customer demand for these security features, and companies will continue to invest in that. does that mean that there will be no data available? i don't think so. the nature of many cloud services requires service
9:23 pm
provider access to it. you can't run an effective e-mail system without being able to filter the content for spam and malware. and so there will be a point in the communication chain where data is available. if it's available to a service provider, it's available to a government through lawful demands. >> yeah. any thoughts on this, and then i'll yield? >> first off, i want to agree with dr. anton's point: we should just get better. we cannot ask industry, oh, don't encrypt, don't do anything. i would love to follow that if the chinese and the russians also followed it as well. so that's just not going to work. i'm very respectful of the problems that law enforcement agencies have with the current state of affairs. we just have to get better. and if it works, at the end it's going to work better for us as a
9:24 pm
nation. so that's number one. i fully agree. some of the things that -- so, going dark, i don't know if it's going dark. i know that we were, until recently, used to thinking a certain way about system design, about system security, about maintaining privacy. that world has changed. the world and the industry have changed rapidly. the rest of us are catching up. so i think it pays dividends if we take some time and figure out what are the rules of this new world, where we don't necessarily need to rely on encryption. i'm a big fan of encryption. it's one of the tools that security professionals and others have. but there are others. the fact that some data is encrypted is not on its own necessarily the end of the world. i mean -- i know michael mentioned that we are overusing this notion of
9:25 pm
metadata. it is meaningful when you see some encrypted data being accessed a little bit more than other data; one could discern and learn things from that. once we start learning how to deal with this system, then we could maintain encryption, and maintain stronger encryption, and also deal with the cases where we don't have access to the cleartext. i think our law enforcement and our government, and i think our legal system, i think us as a society, are in the process of learning how to deal with this new world where things that we knew in the past no longer apply. lastly, the new generation have figured it out. i think they're doing a lot better. they're figuring out that you cannot expect everything is going to be fully protected for you. they're figuring out ways to live in a world where they're posting a lot of things on facebook that -- those of us of an older generation probably
9:26 pm
won't do. they're trying to learn how to deal with a system that you may not have the capabilities of asserting your privacy the way that our generation did, but still have an expectations about their rights. >> does a particular board member have a question? yes. >> several of you have mentioned -- >> i'm sorry. >> several of you have referred to oversight in one way or another. and i just want to ask a question about that. in my view, oversight is especially important in the intelligence context because of the necessary level of secrecy. but at the same time, when you start to layer on paperwork, there's a point of diminishing returns. you don't deter misconduct. do any of you have thoughts on principles for what's effective oversight?
9:27 pm
as opposed to just another box-checking exercise? >> i certainly have a few thoughts for the legislature. there's been a lot of soul searching around the executive. one of them is the technologist issue and the other is clearances. i can say with moderate to high confidence that most united states senators lack a staffer with ts/sci clearance. i hope i'm wrong, i don't think i am. and the fact is that all of the key briefings for these senators are conducted at that level, and as a staffer, you might send your boss into a meeting about soybeans alone -- you don't need a staffer for that. you don't send him into a meeting on an issue that is anything but easy without a staffer. a lot of these folks are going in unstaffed. thankfully folks on judiciary
9:28 pm
and intel have dedicated folks they can rely on, but outside of those committees you're often flying -- i don't want to say blind, but you don't have the resources you need to conduct that oversight. >> followup questions for professor anton. >> go ahead. >> on de-identification. one is you commented earlier that phone numbers without names associated with them would be de-identified information. >> it's actually not de-identified. i stand corrected on that. >> obviously the availability of the directory makes that -- >> absolutely. >> you commented earlier that, by analogy, having a lock on your door is a pretty good protection against burglars but obviously not a perfect protection. the question is, in the context of a massive database, burglars
9:29 pm
may not have the incentive or wherewithal to break into everyone's home in a community, but with a massive database and a brute force attack, you might be able to get a very valuable return on it. so does that suggest that de-identification needs to be stronger, or may not be sufficient -- as you pointed out in the netflix example, and professors have written articles about the ability to re-identify. is it a useful tool in some instances and not others? >> well, i think it's better than nothing. it makes us work harder to get access to it, right, to really be able to understand it. and that's going to help us with the high school kid who is just trying to tinker around, right? but i think this is another example where encryption is really, really important -- very strong encryption. and so i think it's a blend of
9:30 pm
both. >> thank you. >> just on the issue of de-identification and anonymization, i had understood it as a concept that could apply in varying degrees. for a period of time it has been delinked, and now they have to go to court to reassociate it with the identifying information. so i don't think i was asking you to say that it had been permanently de-identified or anonymized. this question is for mr. bedoya. to the extent that we're looking at evolving standards or evolving notions of expectations of privacy, how do you quantify it? is it because 51% of folks in a washington post poll said i care about this but i'm still using facebook? do you look at conduct? do you look at the fact that people inside the beltway really care, people in the ivy leagues really care? i struggle with how --
9:31 pm
what is a good way to identify emerging notions of expectation of privacy? >> i'm not going to pretend to know the right answer to that question, it's a really, really hard question. i certainly think that looking at conduct is extremely valuable and there's been a lot of discussion about the third-party doctrine. the fact is it doesn't remotely represent what the american people think about privacy. you know, if your social network had public or only me that was the only option, people would say this is ridiculous. and i do think it sounds strange to say it but we do have something to learn from the best practices of these social networks in that they very much see the world as a series of segments and they respect the fact that sometimes you want to share something with segment a but not segment b. i would say that's certainly valuable. i don't have a good test about identifying a reasonable
9:32 pm
expectation of privacy. i will repeat myself. i think we need to see that as a standard that can expand and contract. >> okay. >> if i could quickly add, after the snowden leaks, there's an anonymous search engine, duckduckgo -- the number of people doing searches on it increased by over 100%. that's one way to watch people's actions and conduct. >> one quick add-on to that. it's not a binary thing -- they care about privacy but use facebook. you have to look at how they're using facebook, whether they're using the privacy controls, how they're engaging with those services. if you look deeper, you see some pretty sophisticated choices people are making in ways to protect their privacy; it's not simply that because you're using a social network, you must not care about privacy. >> i have a question. between the two panels, the
9:33 pm
first panel and the second, i heard i hope correctly that there is some difference of opinion on a couple of things or maybe slight. you, i think, ms. anton, you suggested to answer to a prior question of mine, you thought the government was trying to build privacy into the technological aspects in some of the programs. on the other hand, earlier you said that in threat modelling, very little privacy considerations were going into this. other people said that it wasn't inevitable that the government would keep collecting more and more information, but i think i got that impression that maybe it seemed to be going that way from mr. fenton on the earlier panel. so my question is basically very briefly if there were one area of priority, if you were running the government's overall privacy
9:34 pm
protection, that you would suggest they concentrate on, and that could perhaps improve privacy protection without endangering national security, what would it be? you can do it very quickly. >> i think that we really need to work more on privacy standards -- privacy standards globally, and also ones that aren't rigged in some way to help some government or sector of industry. i think that's the number one challenge right now. >> other people? >> yeah. i would say it's ending programs that involve the bulk collection of americans' data. >> i couldn't hear the end. >> ending programs that involve the bulk collection of americans' data. >> okay. >> do you have any in mind except 215? >> i don't have the ts/sci clearance, so i don't know. >> all right. >> okay, mr. chairman? >> wait a minute, there was somebody --
9:35 pm
>> i'm sorry. yes, please. >> one last thing. i don't know if this is the elephant in the room, but one thing i would put as an item of priority is that our systems and technology are very much built as one-way. i would introduce the notion of revocation. if something goes bad right now, if i'm releasing all of this information, there is no way for a user, for a citizen, to go ahead and push a button somewhere and say, revoke all the rights i gave to x, y, z service providers, i want to go ahead and clear everything. defining what that revocation means, what the ramifications of it are, and how to crystallize it as a requirement for the industry would go a long way toward things that we could build. >> that would go primarily to industry. that wouldn't affect the government. if i gave the government some information under some program that would benefit me, and later on it turned out it was being used in a different way, would your revocation principle apply
9:36 pm
there? >> if i have the right to revoke whatever government collected about me and i knew things that our government in the position of government and i was able to revoke that, perhaps that would be helpful. >> thank you. >> so this concludes our second panel concludes our morning session. we will reconvene at 1:15 with a panel of government privacy officers. defense secretary chuck hagel and joint chiefs of staff chair general martin dempsey will testify about u.s. strategy and military action in combatting the militant group isis. live coverage from the senate armed services committee starts at 10:00 a.m. eastern here on
9:37 pm
c-span3. here are just a few of the comments we've recently received from our viewers. >> just calling to tell you how much i enjoy q&a at 5:00 a.m. on sunday in the west coast, everything stops at our house. it's the most enjoyable hour on television. >> yesterday was very informative, gave good opinions, i enjoyed listening to him and the comments that was done today. me, myself as living over in the middle east, he was very accurate and he was on point, not -- he was not using his own personal innuendos. i greatly enjoyed it. i hope you have more guests like that. but he was right on target this morning. >> i'm calling to say that i think, like many people, c-span is wonderful, but as to
9:38 pm
criticisms, i almost have none. and i'm a very partisan kind of person. the reason i almost have none is i think you do a tremendous job of showing just about every side of everything and the way people look at things in d.c. and elsewhere. i take my hat off to you. thank you very much. >> and continue to let us know what you think about the programs you watching. call us at 202-626-3400, e-mail us at comments @c-span.org. join the c-span conversation, like us on facebook, follow us on twitter. now more from the privacy and civil liberties oversight board. we'll hear from officials from the justice department, the nsa and the office of the director of national intelligence.
9:39 pm
good afternoon. the privacy and civil liberties oversight board's meeting on defining privacy will continue with our afternoon session with government panelists, moderated by a member, beth cook. >> so welcome back to folks who were here earlier, or welcome to those who were not here. just one quick piece of housekeeping. what we noticed this morning is -- make sure, alex, this will be particularly relevant for you -- make sure the microphone is actually facing the direction you're talking. even if you pull it in front of you but turn to talk to us, make sure the microphone is picking you up. we've all been gently reminded as well. all right, so this panel is about the privacy interests identified and addressed by government privacy officials. obviously, in the
9:40 pm
counterterrorism context, defining and expressly articulating individual privacy interests while balancing the needs of national security is an extremely challenging task. as we discussed a bit this morning, widely accepted privacy frameworks like the fair information practice principles or traditional privacy impact assessments may very well be in tension with the necessity to protect information regarding the operation of a particular counterterrorism program. by the same token, some counterterrorism programs could be better served with greater transparency about what information is being collected, about the statutory authorities or the authorities pursuant to which programs are being operated, and about what protections the government utilizes to minimize the negative impacts on individuals' privacy. so the panel that we have assembled today for this forum is uniquely situated to discuss
9:41 pm
these privacy issues that arise in the context of federal counterterrorism programs. these officials not only assess the privacy impacts of a full spectrum of counterterrorism programs, they have also been pioneers in the practice of working proactively within the agencies to assure privacy are taking into consideration from the beginning of programs. if that were not enough of their duties, they also are learning to live with us and work with us. joining me today are three individuals, unfortunately dhs was not able to make anyone available for this as it turned out. so we have three folks. they will have ten minutes given that they have little bit of extra time with fewer folks but we'll follow the same basic framework. i'll then ask a series of questions for a period of time and invite my fellow panelists and the public to submit
9:42 pm
questions as well. so, leading us off is alex joel, who is the civil liberties protection officer for the office of the director of national intelligence. do you fit that on one card? >> yes, i do. >> it's amazing. so in that capacity, he leads the odni civil liberties and privacy office and reports directly to the director of national intelligence. prior to joining the government -- i think this is also relevant based on our other panels -- alex served as the privacy, technology and ecommerce attorney for marriott international, where he helped establish and implement their privacy program, including the creation of marriott's first privacy officer position. so, alex, did you want to kick us off? >> yes, thank you. i want to thank the board for -- >> oh, i'm sorry. there's a stoplight function going on here. green, good to go; yellow, start wrapping up; red, stop. it's in the
9:43 pm
front row. >> okay. >> i want to thank the board for inviting us here to address the public in this very important hearing. and as you said, the board does work very closely with us. we feel that the board -- the board's role in providing both transparency and oversight, as well as advice to the intelligence community, has been extremely valuable and is a critical part of how the intelligence community protects privacy and civil liberties. i want to thank the board for holding this hearing and for the board's very diligent and careful efforts. it's been critically important. this topic is, of course, one that consumes all of us -- not specifically how to define privacy, but how to apply the protections required to protect privacy in the context of our activities and in particular the context of counterterrorism
9:44 pm
activities. i would like to just get to what i think of as the heart of the matter from an intelligence community perspective, which is that we operate by necessity within a sphere of secrecy. we have to be able to maintain secrets in order to be effective. the more publicly transparent an intelligence service is, the more it informs adversaries of how the agencies are collecting information, and the better able those adversaries are to avoid detection. so, as i have said in the past, a fully transparent intelligence service is by definition an ineffective one. so the key for us then is, how within this sphere of necessary secrecy do you make sure that the intelligence agencies are acting appropriately, lawfully, and in a way that protects people's civil liberties, consistent with the values of the nation? in the past, what we have
9:45 pm
done -- as you know -- is focused on ensuring that we are providing full transparency to our oversight entities. our oversight system is something that we would characterize as one of many layers with many players. we have general counsel and inspectors general as well as privacy and civil liberties offices. we also have, outside of the agency, entities like the department of justice, which is responsible on a government-wide basis for exercising some of these authorities and oversight controls. we have, of course, newly created entities like the privacy and civil liberties oversight board -- perhaps not that new anymore -- which, again, is designed to make sure there's a secure place for information to be disclosed and discussed so that the oversight institutions are satisfied that the activities being conducted are proper ones.
9:46 pm
and then, of course, we have congress and the judiciary, both of which exercise robust oversight. i would mention that, for example, the congressional oversight committees this ch were established particularly after the church committee hearings in the 1970s to provide this granular level of oversight over activities have been in effect -- has been very effective in my view in providing careful oversight of what we do. so, that's the -- that's sort of the oversight part of the equation. i think what we have now more fully realized is the need to enhance transparency. so, if you think of it -- i mean, i was just thinking about this before i started talking which is always dangerous, but if you think of it as operating within a sphere of secrecy, one way is to make sure that the mechanisms, the structure within that sphere are robust enough to make sure that privacy interests
9:47 pm
and civil liberty interests are being adequately protected. then there's the other way of approaching this, which we are also focusing on doing, which is reducing that sphere: providing greater transparency about what goes on inside the intelligence agencies so the public at large can get reassurance and provide input and feedback into how we conduct these activities. if i could just continue along this theme, there are two aspects in particular of what goes on to regulate our activities that i think are of interest. one is the rules that we follow, and the other is the oversight framework and mechanisms designed to make sure we're following those rules. so on the former -- what are the rules that we follow -- we can and should provide greater transparency, but a lot of those rules are currently being debated and discussed. you look at the reforms to modify those rules. you have the activity going on in congress.
9:48 pm
for example, the usa freedom act and similar legislative initiatives. as part of that, there is also the proposal to create an advocate of some kind, an adversarial mechanism for the foreign intelligence surveillance court. here again, in my view, is an attempt to influence or affect what the rules are that the intelligence agencies are expected to follow. and then a different part of that question is, what oversight mechanisms, what assurances do we have that the agencies are, in fact, following those rules? you're part of that. i already mentioned the congressional committees and the layers within the executive branch itself at the intelligence community and the department of justice level. so, i think -- i hope that the public discussion has been shifting a bit from whether or not we're following the rules. i think what i have perceived in the public discussion is a greater acceptance that we are, in fact, trying our best to follow the rules. we're not perfect and we make
9:49 pm
mistakes, but we're trying to follow those rules as best we can. and now the discussion has been shifting to, well, what should those rules be? what are the rules, and what should those rules be? i think we can and must provide greater transparency into both sides of that equation. we're working on that. i would also mention another thing that i know the board has been pursuing, which is the recommendation that the board made in the 702 report regarding efficacy. to what extent are the counterterrorism programs and measures effective, and to what extent do they provide value? that is a key part, in my view, of the transparency equation as well. we have to figure out ways to identify the specific value associated with particular programs and activities and then be more transparent about that, so that the american people can render a judgment, as well as everyone else, on the need or
9:50 pm
desirability of a particular kind of program. it is very difficult to do all of this and still maintain secrets. the intelligence community is not built for transparency. i've said this before: it's built for exactly the opposite, of course. we train, provide policies and systems and reminders to our work force on the importance of maintaining secrets -- the sources and methods that the community uses to carry out its activities. and this is vital. we have to do that, and we're reminded of that need all the time. but at the same time we have to find ways to enhance transparency. it's going to involve some changes, some culture change and training, a look at policies and processes within the intelligence community, and i know that you may want to ask questions about that. so i look forward to that discussion. so thank you again. i appreciate it. >> so, turning now to erica brown lee, she is the chief
9:51 pm
privacy and civil liberties officer of the department of justice. in that capacity, she is the principal advisor to the attorney general on privacy and civil liberties matters affecting the department's missions and operations. and as part of the office of the deputy attorney general, ms. brown lee oversees the privacy and civil liberties programs and initiatives implemented by department components and component privacy and civil liberties officials. she also heads the office of privacy and civil liberties, which reviews and evaluates department programs and initiatives and provides department-wide legal advice and guidance to ensure compliance with privacy laws and policies, including the privacy act. thank you for coming. >> thank you. and thank you to the board for inviting me here to talk about what is a very important topic. you asked about private sector experience and other government experience; i also come from the federal trade commission --
9:52 pm
in particular, the division of privacy and identity protection. of course, the federal trade commission has a very different orientation, toward the commercial side of privacy, but that is nonetheless an important perspective and an interesting one to bring to this position. but counterterrorism is a significant part of the department's mission. since my colleagues on the dais today will be talking from more of an intelligence lens, i thought i would orient my remarks more towards the department's efforts to fight terrorism from within the criminal law enforcement context. the department has an elaborate architecture that protects privacy in our counterterrorism work. and since i only have a few minutes, i'll focus on the lead agency, which is the fbi, and focus a little bit more on the efforts with their counterterrorism activities. but stepping back for a minute,
9:53 pm
of course, as we know, after 9/11 it was recognized that in order to address the current threat environment, the fbi's functions needed to be expanded, but it was not intended that the expansions would come at a cost to civil liberties. and so, in 2008, the department issued the attorney general's guidelines for domestic fbi operations, the agg-dom, and later that year the fbi issued the domestic investigations and operations guide. combined, those two documents provide significant guidance for fbi activities. but what i wanted to talk about -- and i know i don't have enough time to get too far into the weeds -- is just to explain how privacy is sort of embedded throughout the stages of an investigation, from the initial phase throughout the process.
9:54 pm
so for example, one of the key tenets of both documents is the least intrusive method. in other words, in any activity that the fbi engages in, that's the baseline, but of course within the counterterrorism context, it's got to be calibrated against the threat to national security, in which case more intrusive methods would be used. but in terms of a little bit of detail from an operational context: when the fbi conducts an assessment, for example, which necessarily -- well, not necessarily, but oftentimes -- is proactive, that doesn't require a factual predication, but it does require a clearly defined objective, and the least intrusive method in that context would be starting with publicly available information or voluntarily provided information in that
9:55 pm
perspective. and then moving up from there, with regard to predicated investigations -- which, as their title implies, require a factual predication to open the investigation -- that has to have supervisory approval. and both types of investigative activities, whether assessments or predicated investigations, require or are subject to oversight on the intelligence side, but also on the law enforcement side for counterterrorism. the department's national security division has oversight authority for those kinds of activities. now, beth mentioned and asked us to sort of talk about or think about how the fips apply. if you're looking for the acronym,
9:56 pm
it's not there, but the principles are embedded throughout. if you think about the transparency objective, the diog, with its 700 pages, is available for a little light reading. for anyone interested, it's on the web with certain redactions. but also we have privacy impact assessments that are available, and one that i wanted to mention in particular regards the eguardian system. that is a specific incident reporting system that is designed as a platform to share terrorism-related information across law enforcement -- you know, federal, state, local, tribal, territorial jurisdictions. eguardian -- i don't have time to go into much detail about it. it has an entire architecture of
9:57 pm
privacy protections governing how information comes into eguardian, how it's shared across those entities, how it's stored and how it's retained. individual participation -- obviously that's more of a challenge in the law enforcement context. it's not realistic to be able to obtain consent in order to pursue investigations, but nonetheless, the privacy act provides some measure of review in the sense that if access or amendment to records is denied, there is judicial review of the agency's decision, and subject to court order, records may be ordered to be amended or access granted. on the minimization side, i mentioned the least intrusive means already with the diog. there's also a prescriptive
9:58 pm
measure in the diog regarding evidence collected: if the evidence collected through an assessment or through a predicated investigation has no foreseeable future evidentiary or intelligence value, that governs the disposition of that piece of evidence. otherwise, you know, information is retained according to the schedule set by nara, the national archives and records administration, which the department of justice seeks approval for. with regard to use, i think that's also a challenge. on the criminal side, of course, wrongful disclosures of protected information are not something that any agency can exempt
9:59 pm
themselves from and to the extent that information is released that's not subject to a routine use or other permitted disclosure, and of course, you know, routine uses are subject to a compatibility standard, that tracks the language. if the information is disclosed or even shared, in violation of that, that's potentially a wrongful disclosure, subject to not only civil damages, but criminal penalties. and then as terms of accountability, i mentioned oversight from the national security division. but also the fbi has the national security review, sorry, national security law branch which conducts national security reviews, and that's a significant review process in that they go out to all of the field offices and review the investigative activities i mentioned, the assessments, the
10:00 pm
predicated investigations, and look to see whether in fact supervisory approval was obtained, whether in fact there was a clearly defined objective for the assessment, and it's written up into a report. that report actually comes through fbi channels, of course, but then also comes for review by the chief privacy and civil liberties officer, and i look at those, obviously, through a privacy and civil liberties lens. so as alex was mentioning, there are lots of layers that are applicable, and i know i don't have much time remaining, but in conclusion, i guess i would just like to leave you with a couple of takeaways. one is that the fips, quite to the contrary of certain statements, are not dead, they're just embedded. and i would also say that the
10:01 pm
processes can always be improved, certainly i work with the components, there are over 40 components in doj, each has a senior component official for privacy and i host regular meetings. in fact, we're having a privacy forum next week that will cover privacy-related activities, focussing on law enforcement, but other, other components as well. activities, common privacy issues across components. it is internal though, so none of you are actually invited. unless you happen to get a job by monday. at the doj. but that's, that's also something that is a way to improve and i would also say that, you know, while privacy impact assessments are very important and a critical part of
10:02 pm
a program, because they're sort of this tangible proof that we actually evaluate privacy, that we mitigate the risks, that we take into account security and accountability, they really only form a part of the architecture for the department of justice's privacy program. and i welcome your comments. >> thank you, erica, for that nice education about the department's operations, the fbi in particular. so becky richards is the national security agency's civil liberties and privacy officer. in this, i think, relatively new role -- i think it's fair to say -- she provides expert advice to the director of nsa on all issues pertaining to privacy and civil liberties protections. and she conducts oversight of nsa's civil liberties and privacy related activities. she also develops measures, which i hope she will talk about, to further strengthen nsa's privacy
10:03 pm
protections. prior to joining the national security agency, she worked as a senior director for privacy compliance at the department of homeland security. >> thank you. and thank you for hosting us. i am very honored to have been selected to be nsa's first civil liberties and privacy officer. this is an exciting time to be a member of the civil liberties and privacy community. our community is growing and evolving and will help inform the debate as the nation reshapes its expectations for and limitations on intelligence community activities. changes in the nature of the threat to our national security, alongside rapid advances of technology, as was discussed earlier, make my job both interesting and challenging. technology provides us with both opportunities and challenges, but ultimately, we must guide and shape its use to ensure the fundamental rights we hold dear as a nation are maintained. today, i'd like to take a little time to describe nsa's civil liberties and privacy programs in the past, the present, and a few
10:04 pm
thoughts on the future. part of nsa's mission is to obtain foreign intelligence worth knowing, derived from foreign communications, in response to requirements validated and levied upon us by the executive branch. one such priority is counterterrorism, but there are other threats to the nation, such as the spread of nuclear, chemical or biological weapons, or cyber attacks. nsa also works directly with and supports our troops and allies by performing intelligence for military operations abroad. as we consider nsa's privacy programs over the past 62 years, it's important to think about how the threat and the technological landscape in which nsa conducts its mission have changed. first, the threat has changed. nsa previously operated in a cold war era when it was directed at nation states, structured military units, and foreign intelligence services. while threats remain from nation
10:05 pm
states, they now also come from non-state actors, which requires nsa to look at more, smaller, and decentralized targets to protect the nation. second, the technology has changed. nsa again previously operated in an environment where the communications between targets were frequently conducted over isolated, government owned and operated communication channels and equipment. now, foreign target communications are interspersed with ordinary commercial and personal communications. additionally, the sheer volume of big data and the ability to analyze and manipulate it, which occurred as a result of significant advances in information technology, can expose information of a personal nature that may not have been previously discoverable and may not be of any interest. third, how society thinks about civil liberties and privacy has changed. we have come a long and positive way in thinking about what ought to be private. personally identifiable information was not a mainstream issue 25 years ago. for example, social security
10:06 pm
numbers were routinely put on student id cards, and there was no thought of hipaa. so with that, i'd like to give some historical perspective. nsa's civil liberties and privacy protections have been driven primarily by the fourth amendment analysis which is reflected in nsa's authorities, such as executive order 12333. this analysis framed nsa's protection program by asking where and how the data was collected, i.e., usually overseas, and the status of the individual or entity being targeted, i.e., is it a u.s. person or not. nsa answers these types of questions as it considers new types of collection. it has built a strong compliance program based on these, with compliance activities embedded in our technologies and systems. as i've learned more about nsa and its compliance regime, it became clear that while this is certainly one way to address
10:07 pm
privacy concerns, it is different from how concerns are addressed outside of nsa. over the last 15 years, congress has pass at variety of laws to protect privacy and other parts of government and the commercial sector. these policies and laws focus more on the nature and content of the data, not where it was -- i'm sorry the nature and content of the data and how it was used. i believe we have an opportunity to bring together nsa's current civil liberties and privacy analysis with a broader approach to privacy and civil liberties. this is also supports the president's ppd 28 mandate to recognize that signals intelligence activities must take into account that all persons should be treated with dignity and respect regardless of their nationality and that all persons have legitimate privacy interests in handling their personal information. to address a broader set of civil liberties, i am testing a privacy assessment process.
10:08 pm
this expands nsa's views to include consideration of frameworks that the private sector and non-intelligence elements of the government use to assess privacy. for example, for the first time in its history, nsa is using the fair information practice principles as a framework for considering privacy -- for example, how will the information be used? as a start, we have designed an initial standardized template and will ensure we're building a helpful process to identify and mitigate civil liberties and privacy risks.
10:09 pm
a critical part of the assessment process is to make sure we're not merely checking off boxes but fundamentally weighing the risks. in essence, we're asking: should nsa conduct a given activity, given its civil liberties and privacy risks? as part of the assessment process, nsa is documenting both standard protections, such as minimization and controlling who has access, as well as tools designed to protect civil liberties and privacy. much like what is performed in the private sector and other parts of the government, we're using the fips as the basis for analyzing what protections are in place. as we look to the future, i'd like to spend a little bit of time talking about blending the art and science of privacy. historically, privacy tends to be a bit of an art form. several of us stand around and think about how we're going to do the analysis. this can be difficult when we're beginning to think about big data and the complexity that was
10:10 pm
being discussed this morning. nsa is fundamentally a technologically centric organization. we have contributed and will continue to contribute to advancing the science and research to protect civil liberties and privacy. today it has made notable strides to promote privacy, such as unique encryption capabilities, digital rights management, and trustworthy computing. work is also being done on encoding privacy policies so that technology supports only specified uses. but civil liberties and privacy protections need to blend the art and science of privacy if we're going to harness the potential of technology and incorporate our core values as a nation into this era of big data. despite significant progress, basic privacy principles founded on a scientific basis have largely proven elusive. if we can better understand what constitutes personal information and how such information is used, we believe it will be possible
10:11 pm
to determine whether we can develop more practical approaches to evaluating the inherent risk to the privacy of the individual. we have five building blocks, to introduce the concept of some very difficult math into what is otherwise a very nice liberal arts discussion of privacy. the first one is to categorize personal information. we would like to determine if it's possible to identify and categorize different types of personal information, and what the risk is to privacy. now, we've heard different discussions today, but we want to push folks to think about whether certain types of data are more risky to privacy, say health data, than other information, say your address, and can we think about those risks? if we can do that, then next we would like to determine if it's possible to identify and categorize different types of use. if we take both of these
10:12 pm
together, that is, if it's possible to develop a categorization of both personal information and of the uses of that personal information, it should be possible to develop a scientific process to assess risk. this process could evaluate the risk of the use of individual types of personal information for different uses, as well as aggregated uses of personal information. now, with these three building blocks being more of the scientific aspect, i would suggest we would move to an art form that looks at how we build on that to identify what needs to have additional privacy impact analysis conducted, so that we're looking at that across the board. with all four of these together, we would then look to see if we could build a responsible use framework that holds data collectors and users accountable for any harm caused. building a technical means, based on these methodologies, to support the identification of civil liberties and privacy risks can help us better protect civil liberties in a fluid world of big data. success is dependent upon input from a range of disciplines: technologists, social scientists, attorneys, and computer scientists, to name a few.
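what is being described here can be pictured as a simple scoring exercise. the following is a minimal sketch in python, with invented data categories, use categories, weights, and review threshold; it is not nsa's actual process, only an illustration of how categorized data types and categorized uses might combine into a risk score that flags an activity for a fuller privacy impact analysis.

# a minimal sketch, not nsa's actual process: categorize data types,
# categorize uses, and combine the two into a risk score that flags
# an activity for a fuller privacy impact analysis.
# all categories, weights, and the threshold below are invented.

DATA_SENSITIVITY = {
    "health_record": 5,
    "precise_location": 4,
    "financial_account": 4,
    "home_address": 2,
    "public_directory_listing": 1,
}

USE_RISK = {
    "aggregate_statistics": 1,
    "broad_retention": 3,
    "individual_targeting": 4,
    "external_dissemination": 5,
}

REVIEW_THRESHOLD = 12  # at or above this, require a full privacy review


def assess(data_types, uses):
    """combine the most sensitive data type with the riskiest use."""
    data_score = max(DATA_SENSITIVITY.get(d, 3) for d in data_types)
    use_score = max(USE_RISK.get(u, 3) for u in uses)
    score = data_score * use_score
    return score, score >= REVIEW_THRESHOLD


if __name__ == "__main__":
    score, needs_review = assess(["health_record", "home_address"],
                                 ["broad_retention"])
    print(f"risk score {score}, full review needed: {needs_review}")

multiplying the two scores is only one of many ways the combination could be weighted; the point is that the riskiest pairing, not a checked box, is what triggers the deeper review.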
10:13 pm
we would welcome the opportunity to discuss this in detail at a later date. with that, i thank you for the opportunity, and i'm happy to answer what i'm sure are a couple of questions. >> thank you all for your opening remarks. becky, i wanted to stick with you for just a second. when we go and meet with y'all and when we talk with y'all, there is someone from the general counsel's office and someone from your office. what are you doing that is different than the general counsel's office and a compliance shop? >> that's a great question. the office at nsa is the focal point for questions surrounding civil liberties and privacy. it's been brought to a senior leadership position at nsa in order to focus on those efforts. so, generally speaking, our general counsel will answer the
10:14 pm
legal question, is this legally permissible? and they will often then work with compliance on what the rules are. but we haven't had a person asking some of these more difficult questions of, should we be doing this? frequently the oversight folks were playing that role, and i don't want you to take away the idea that those questions weren't asked, but it's important to have that role inside the building, where you are working with the operators and the technologists and can spend a great deal of time understanding what we're trying to do and bring those questions to them. >> erica, similar question for you. the fbi, for example, has its own privacy officer, its own general counsel, its own compliance shop. what is your relationship, and what is your ability to provide recommendations or to actually impose requirements on the fbi? >> so, also a very interesting question.
10:15 pm
my role and position is department-wide. so of course i have oversight over the compliance for doj as a whole. each component, as i mentioned, has a senior official for privacy, but in addition has a general counsel's office that has a significant footprint in privacy. so at fbi, they have their privacy and civil liberties unit that's headed by a chief. i work quite significantly with the person in that office to specifically address compliance issues and to specifically address privacy initiatives that i feel are important for the bureau to consider. ultimately, it is somewhat of a reporting structure; in other words, if there is a recommendation or a particular policy or statutory obligation,
10:16 pm
fbi has the responsibility to comply. part of my job is to advocate and make sure that that is occurring on a regular basis, and to look for ways that i can improve the process. for example, i talked about privacy impact assessments; if you look at the e-government act, it's written fairly broadly. i take a particularly broad view of what i think should have assessments as part of compliance, and so that's what i work on in particular with fbi. >> so alex, a related but different question for you. how do you ensure that you have access to what the various agencies are doing, or do you find yourself periodically reading about new programs, alleged new programs, on the front page of the new york times?
10:17 pm
>> i'm surprised by that question. information sharing is perfect everywhere in government. >> i'm also seeking free advice, because obviously one of our biggest challenges is going to be knowing what the agencies are doing. you can't conduct oversight of something you don't know is happening. >> right. i think that is a major challenge for all of us. as you said, it's something that you're focused on, and i know it's a challenge for everybody. it's a matter of personally understanding the information flows within your own agency and trying to put in place markers for where it's important for you to be consulted. the main way that i have practically done it, and i've been doing this for about a decade now, starting when it was just me and building a small staff over time, has been to form trusted relationships inside the intelligence community, and to make sure that the people that i'm working with and that are in positions of influence
10:18 pm
and authority to make decisions on programs and activities understand the importance of consulting with a civil liberties and privacy professional. my own personal experience working within the intelligence community has been that, when i first joined, i was very pleasantly surprised that people were so focused on compliance, on protecting privacy and civil liberties, doing the right thing, following the right directives, and even when they might feel legally permitted to do something, they still gave voice to their own doubts as to whether they should be doing it. and so i did not personally experience an uphill battle in trying to persuade intelligence officers, hey, it's important for you to pay attention to civil liberties and privacy. in fact it was sort of the opposite, where many people felt they were doing that and it was their job to focus on that. for example, you mentioned the office of general counsel, where i was before coming to this job, and we certainly felt when i was there that was part of our job.
10:19 pm
we needed to look out for privacy and civil liberties, not just what the law allowed, but what was the underlying intent and what should we be doing in that light? i certainly didn't want to take away that sense of responsibility from anybody inside the intelligence community. my approach had always been, it's all of our jobs, it's part of our oath to support and defend the constitution. and there are some offices that are particularly focused on that: the office of inspector general, intelligence oversight offices, and now the offices we're creating. i do think we add value because it is our full-time job to focus on civil liberties and privacy. we bring focus, we bring an external perspective, and we have specific expertise and training and experience that we can bring to bear. then we become a voice, as erica said, an internal advocate for civil liberties and privacy. i think agencies
10:20 pm
will find different ways of doing it. we're a small organization, and we have mechanisms for understanding what's going on across the intelligence community. so when a particular program or activity bubbles up to the point of a decision, either it comes automatically through my office or somebody will understand that i need to see it and route it to me. >> so a follow-up, particularly to you, alex, and erica. both of you have fairly small staffs, considering the breadth of your responsibilities, and we talked a lot this morning about the increasing technological complexity of what you are assessing. do you have the technical resources? i mean that in terms of assessing on the front end whether systems should go live, or, to the extent there are restrictions, for example if they put a restriction in place
10:21 pm
on a particular program, ensuring that those restrictions are actually functioning? >> so i think that's a good point. as i mentioned earlier, there are a variety of roles in the department that have oversight, particularly with regard to counterterrorism. my office is fairly small given the large footprint of the department of justice, but we work incredibly hard and diligently with all of the components to ensure compliance, and we rely quite a bit on internal component work to produce information about what the privacy compliance is, and then also with regard to auditing and making sure that the privacy activities are
10:22 pm
actually effective. but i would also say, just to again stress it, that some of the oversight isn't just through my office; the national security division and fbi also have their own branches, and so we work very collaboratively. and like alex, i have found that within the department there are a lot of people who care very deeply about these issues. it's not specifically in a privacy role as a title, but they have oversight and i think meaningful insight as to how the activities should be consistent with privacy initiatives. but it is something that i take into account, and that's part of the reason why we have these internal conferences and what not that i'm trying to do to build upon that. >> and alex, what do you do to make sure -- the old adage is
10:23 pm
trust, but verify. what do you do to make sure you actually understand the programs and the systems? >> right. so it's a variety of things. one is, although i am not personally a technologist, i've been dealing with issues associated with technology for much of my professional career. so i was a privacy, commerce, and i.t. lawyer there, and before that, i was at a law firm in downtown d.c. focused on technology, large-scale technology transactions. that doesn't make me a specialist in technology, but it does enable me to ask the right questions and make sure that information is explained to me appropriately. i don't have the staff resources to engage a full-time technologist; i think that would be helpful. i do think that you have to be a little bit careful with that because what you really want in essence is a technology generalist. there are so many different aspects to technology,
10:24 pm
as you know, that's just a word that almost lacks meaning these days because we use it so frequently, but what nsa does for one particular type of activity will differ significantly from what fbi does, will differ from what other agencies do in terms of database management. you have database issues, surveillance technologies, understanding communications technologies, understanding all kinds of different aspects to that issue. and then of course the engineers and technologists, as we know, speak a different language from lawyers, and so sometimes it's hard for everyone to speak to each other. so what i have been doing is making sure that the information is clearly presented, that i see the documentation, that i personally understand it, and that i trust that the people who are providing me that information are giving me a complete picture. and then we also leverage technical experts in the particular field that we have access to, either directly or through the agency. so if something comes up that
10:25 pm
we don't quite understand, we can reach out to somebody to have them help us understand it. i think with a larger staff i would try to have more full-time technical expertise. >> becky, you had mentioned you have a couple of pilots, experiments going, and you mentioned also new technologies that may or may not be available. how are you working with the private sector to leverage the great thinking that is being done there? has consideration been given to that? >> i'll start with procurement. i started with a theory on procurement. nsa is a technology company; it has a huge research portion and also a huge technology division. they're two different parts. i have a technical director on staff who's here, david marcos,
10:26 pm
and he and i have been working on how we look at what's out in the world, what is out there right now. and they're conducting that right now so we can get a sense, from both a policy and a technical perspective, of what's going on, as opposed to things we may know from knowing different people or from carnegie, to make sure we had a broad breadth. they're doing that; we're working on that right now. and we're working with our research people. the procurement process is not how things happen, and i think each agency has its own culture and its own aspects. a lot of what i've been doing is taking the learning and sort of shifting it to make sure that building the program within nsa works for how nsa works, and so
10:27 pm
that means our privacy program is based on how the organization functions and where those key decisions are being made. and so we're working through that, but it turns out procurement really isn't quite the right place. we're looking, with both the technology and research directors and others, to make sure we understand what those touch points are. and that's why we're beta testing the processes. >> i have one more question, alex, for you. you explicitly pointed to congressional oversight as one of the things that the american people should be aware of, that this is happening, it's robust, it's real. a previous panelist pointed out that there is potentially one significant gap, and that is staff. what has your perception been? yes, i am going to ask you to
10:28 pm
comment on congress, on whether consideration should be given to broadening the range of individuals. i think there is some comfort level with, i think someone called it, delegated oversight within the congress. but when some significant majority of decision-makers in a representative democracy don't have cleared staff, how is the oversight nonetheless sufficiently robust? >> so the intelligence oversight committees have very substantial cleared staff, and they of course have access to secure compartmented information. they have to review all the classified information, and we have many, many meetings, briefings, and reports with our oversight committees. i guess my first response, as a matter of principle, is yes, congress should have the degree of cleared staff it needs in order to perform its oversight functions.
10:29 pm
i think the intelligence community assumption had been that by clearing the staff of the oversight committees, that function was being fulfilled. i think some staff members are also cleared for some of the other committees. i don't have all of that information in front of me; i believe judiciary has cleared staffers. whether or not that's enough staff to be cleared, i don't know. from my personal perspective, it would be helpful if congress figured out for itself which committees are performing which function and which staff members need to be cleared in order to oversee our activities, and then we can assess it. i would certainly support a desire to make sure that there are enough cleared staff to perform oversight, absolutely. >> so, transitioning to the member questions. and while this is happening, just a reminder, there are folks with cards if you in the public have questions that you'd like to submit. and to keep everyone on their toes, this time i'm going to start with pat.
10:30 pm
>> okay. you may be sorry about that choice. i might not be, they might be. this is somewhat of a loaded question, but it's one that's sort of in the background of so much of the work we have done and we'll continue to do. i laud all of becky's attempts and your attempts and erica's attempts to inject privacy -- thank you -- into all of the various phases of intelligence. but drawing upon what some of the people in the first panel said this morning, let me just pose sort of a question. for instance, several of the panel members thought collection was a primary focus of trying to enhance privacy interests by eliminating collection, leaving apart any debate about whether or not
10:31 pm
collection alone can be an injury to privacy. i guess, here, that's at the collection stage. also, another expert talked about the risks to privacy from aggregation. we found, for instance, in the 702 report we did, that when you got to retention of data, the analyst might look at it and say, well, i don't see any foreign intelligence purpose to this piece of data, if it came from an innocent person who's not the target, but it's conceivable there might be one down the line, or for some other person or another agency. so therefore, you know, i've got to keep it to make sure of the security interest. so it seems to me, one of the basic problems here will be, what's the tipping point?
10:32 pm
in other words, assuming good faith on both sides, there really is a national security interest when you have to make a choice between privacy and national security, but the real question is, how much and at what point? when we were doing 215, we were told many times, we need a big haystack in order to find the needle, and the bigger the haystack, the more likely we are to find the needle. the policy judgment has to be made at some point: at this point, yes, we're going to lose some national security things, but privacy is more important. i guess i want to know what your thoughts are about how that decision, which is a basic policy decision but seems to come up in every program that we look at, is made or how it should be made, even at the most general level. you can all take --
10:33 pm
>> okay. so i'll start. i'll offer some general observations. on the collection and use and retention point, i would say that it's very important to look at each phase of that, and that's in fact how the intelligence community structures its determinations in many ways: there's collection, there's retention, and there's dissemination. and of course when you aggregate data, you create risk. if your concern is to protect privacy, and you're worried about what the government's going to do with your data, it's always better for the government not to have the data. that's the best protection. if the government doesn't have the data, there is no risk to privacy from the government because they don't actually have it. so that's why i think it's appropriate of course to focus on collection. once a determination is made that the government really needs this data in order to carry out an important function, then you're shifting to retention,
10:34 pm
and so there -- >> let me interrupt you. sorry to do this. old habit of mine. >> yes, your honor. >> when you say really needs, that's where the rubber hits the road, because sure, it's going to be useful. so where is the line between something which generally will be useful for you but will be more of a privacy risk, and the thing that is really necessary? and we all know it's going to be drawn differently in different case situations; that's what it always comes down to. i'm wondering if you have any thoughts about how that -- which is a policy. >> so i would, this is where, and i know -- >> okay. >> before we -- you used the term tipping point, which i think is a very helpful term, and sometimes people think of this as a balance or as a scale. the way that i think of the balance metaphor as it might apply here is not that you're saying, well, that tips it over
10:35 pm
here, so therefore we're going to do it, or therefore we're not going to do it. to some extent of course that happens. the way i think of it is that if you're going to do something new, a new or different collection program, you ask the following questions. a, is it lawful? of course it has to be lawful. is it justified? what is the purpose, you know, going to a fips analysis. what is the purpose for it? is this collection focused on a valid purpose that we feel should be pursued, that is important, whatever the phrasing should be? and is your activity tailored to that purpose? are there less intrusive ways of doing it? is this the appropriate way to go about obtaining this information? and then what are the risks, so that, without going to the other side of the scale, how do you guard against those risks? how do you mitigate the risks? this is the way i've always
10:36 pm
thought of it, and it actually fits into some fips kinds of models. if you look at that overall picture, it then helps inform you, either on the art or the science side, i don't know, becky can tell us which one that is. it helps inform the decision about whether this is the right thing to do. and i think you have to look at that totality. if you're going to do a program and it's lawful, but there are major risks and you can't figure out how to adequately mitigate those risks, that'll tell you one thing about the overall risk of doing that activity. >> alex, if we could, we're trying to keep -- >> i'm sorry. >> something specific you wanted to say. >> the only thing i would say is, we've been asking some different questions to try and tease out some of this conversation as we go through different programs. and the questions we've been circling around, which are a little bit different than, you know, is this lawful, are more: what is the type of data, how
10:37 pm
intrusive is the data? how broad is the collection? in other words, am i obtaining a lot of people who are sort of accidental collection, or not part of the target? and what are the stated uses or future uses? we've been using those three questions to get at an overall risk, because we want to stop the government from doing bad things to good people. and so, looking through those different lenses helps us do that analysis. >> so thank you. david -- >> i'm sorry. >> just because you wanted -- >> i know. i was just going to sort of follow up on the comments in that i think the forcing mechanism of having ongoing vetting and ongoing evaluation by the right people is the way to go, because you're looking for sort of, you know, the meaningful relationships and developing those, as opposed to, you know, retaining the isolated pieces.
10:38 pm
so i would just say that trying to force that mechanism of vetting is really important. >> one of the reasons for having the forum today is to get an understanding of what privacy interests are being protected by your offices and your agencies. alex and erica, both of you have been in either the private sector or at the ftc with a private sector focus. how would you compare the privacy work? what are the similarities and what are the differences? >> i actually think there are a lot of similarities, but there are of course important differences as well. on the similarity side, i think privacy officers and people in all kinds of organizations, be they private sector or other government agencies, share a similar challenge or problem set, which is: your organization wants to do something, either for a business purpose or for an authorized statutory purpose, and in order to do that you need information.
10:39 pm
and for businesses that is typically information about customers or potential customers, and then you want to do something with that information to carry out your lawful activity. so it's a given that your organization will be obtaining and using personal information in many cases. and so the privacy officer's challenge is making sure that that activity is conducted in a way that maintains your key trust relationships. there are different ways of framing it, but i think that's generally speaking what happens. and so from a business perspective, what you want to make sure you're doing is delivering value to your customer, and that you're not using that information for inappropriate means or in ways that are going to essentially get your customer upset and have your customer take his business elsewhere. so a lot of those things are similar. i think that the key distinction for a business is of course that it has the ability to disclose a lot about what it's doing in terms of obtaining that
10:40 pm
information, and the value that it's providing is also something that should be immediately apparent to the customer. to the extent that the value is further down the chain and the customer doesn't see it that much, but is aware the information is being collected, that impacts the trust the customer has with the business. from an intelligence community perspective, it's hard to demonstrate the value. what are we doing with the information? and so as a result, when people are worried about information being obtained by the intelligence community, the value to them seems inchoate, but the risks seem real: my freedom could be impacted if the government misuses this information. we can assure people we are making sure that the information will not be misused, but i think we need to do a better job of that. but i think the other side of that equation is we have to better show what we're doing with the information. and of course for intelligence agencies, some of the most tightly held secrets are the successful uses of intelligence, because we don't want
10:41 pm
adversaries to know that that method was successful. >> so just to quickly answer your question, i was also in the private sector at a law firm practicing privacy, and here's where they're similar. whether it's clients or even from a government perspective, people tend to be reactive to privacy, and one of the things that i find the biggest challenge is to be proactive. and it means sometimes taking unpopular positions, whether it's with clients or internally within my organization, but having principled reasons for doing that and, if not forcing it, putting forward very strong arguments to do what you think is the right thing. that's where it's similar and where it's hard, but interesting. >> becky, you talked about categorizing information as being sensitive. in our prior morning discussion, there was talk about the mosaic theory, where there are individual bits of information that are
10:42 pm
innocuous on their face, but in combination they present a profile of someone's activities, thoughts, and so forth. do you lose something if you focus on what seems to be sensitive information and not take into account the potential combinations of information? >> so actually the goal is to take into account all those combinations. the idea, and where we've been looking at it, is that it's very difficult. we want to push folks, and i will say this is an uncomfortable place to be as a privacy person. this is where i'm like, it'll depend. but if we look at where big data is today, there is a lot of data, and it's very voluminous now. if we can start to define it, which is what we heard in the second panel, and this is where i think we're going to try to push nsa, if we can start to define it and put mathematics behind it, so that if for example you have vaguely anonymous or slightly identifiable data over here and over here, and the
10:43 pm
computers start to put them together, we would want the system to then pop something up to say, hey, look at this before you decide to go forward. so the idea is technology supporting the privacy analysis, but looking at whether or not the math underneath it can work. so you're going to have to make hard choices. do i think health data is more risky than my address? then you have cases like violence against women or something along those lines, but at some level, if we deal with only the edge cases, we're not going to move forward. i think we will be losing some of the value, both from a privacy perspective as well as from a technical perspective, because we're sort of in this art form of looking at each individual case, and i recognize that at nsa i'm not going to look at every single little thing. we want a system to be able to identify the things that need additional analysis and additional judgment. what i don't want to have happen is where the system is doing things that we will find unacceptable because we didn't build something in to help with that.
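the "pop something up" idea can be sketched in the same spirit as the earlier scoring example. this is a hypothetical python illustration, with invented attribute weights and an invented threshold, of checking a proposed join of two weakly identifying datasets and flagging it for human review when the combined attributes become too identifying.

# hypothetical sketch: before two datasets are joined, estimate how
# identifying the combined attributes are and require a human review
# if the combination crosses a threshold. the weights and threshold
# are invented for illustration, not drawn from any real system.

IDENTIFIABILITY = {
    "zip_code": 2,
    "birth_date": 3,
    "gender": 1,
    "device_id": 4,
    "full_name": 5,
}

REVIEW_THRESHOLD = 6  # combined weight at which an analyst must look first


def combined_risk(attrs_a, attrs_b):
    """sum identifiability weights over the union of joined attributes."""
    combined = set(attrs_a) | set(attrs_b)
    return sum(IDENTIFIABILITY.get(a, 1) for a in combined)


def check_join(attrs_a, attrs_b):
    risk = combined_risk(attrs_a, attrs_b)
    if risk >= REVIEW_THRESHOLD:
        print(f"join flagged (risk {risk}): review before proceeding")
        return False
    print(f"join allowed (risk {risk})")
    return True


if __name__ == "__main__":
    # each dataset alone is weakly identifying; together they cross the line
    check_join({"zip_code", "gender"}, {"birth_date", "gender"})

the point of the sketch is only that the judgment call gets surfaced to a person before the combination goes forward, rather than after the fact.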
10:44 pm
>> thank you. >> thank you, rachel. >> thank you all for -- let's see here. thank you all for being here. for those of you who have been here all day, you'll know that this is a little bit of a hobby horse of mine. i want to ask about the fips and why you are purporting to apply them although you can't fully apply them. ms. richards, this is directed to you initially, and i commend you for publishing the paper on targeted collection under 12333. you said that you were applying the fips, and i gather you were talking about the 2008 dhs iteration of the fips. but then you said that the individual participation fip can't really apply to your activities, and the transparency one can apply only in a very limited way. i guess i'm wondering whether
10:45 pm
it doesn't make sense to come up with a new set of principles that applies to government surveillance. if you look at the dhs fips, the transparency principle cannot apply because it's talking about providing notice to the individual regarding collection, which obviously is not going to take place. individual participation can't apply at all. some are very, very important: purpose specification is important, minimization, some are important. but yet this doesn't at all address things like thresholds, evidentiary thresholds for collection, which are required obviously by law; but if you're talking about principles that are supposed to sit on top of the legal requirements, you should talk about thresholds, and there are other principles that don't come into play here. i'd be interested in knowing why you decided to apply the fips and if you've given some thought to coming up with new principles. i don't mean to criticize this for dhs's purposes, because with dhs there
10:46 pm
is voluntary interaction with the government, where this makes a lot of sense. but you're in a different position than that, obviously. >> so i guess what i would say is, it's a beginning place, and i've stated that a couple of different times because i wanted to start with something. and so, from my perspective, i want to take the parts that work well, which would be basically the bottom six of the dhs ones, and look at how we can work those through. what i would say is, sometimes there's analysis that needs to be done at an enterprise level. so it was useful for me, walking into the agency, and this may be readily apparent, but it was useful to go through the process and say, hey, here is one framework that we think about for privacy, and as an enterprise, we don't do the first two. one of the questions that led me to ask, in some of the conversations i've had with academics and advocates, is: we
10:47 pm
don't do transparency in the individual participation sense, so is there some proxy? is there some additional thing that we should be doing given that? and i think that gets to your question of, well, are there other things? and that's where we're working through the questions. so i think it was very beneficial to start with that as the beginning one, and then use the remaining six principles as the basis for some of the questions. part of the problem, though, i'll tell you, with the fips is they don't give you a judgment, this is good enough or that's bad enough. and that's the place where we are trying to then look at the data and what the risks to the data are. we spend time talking about what the exact risks to the program and to civil liberties are. we're still working through those, and having a lot of really fun and intellectually stimulating conversations about what are the right questions and how do we do that for an intelligence agency? for nsa, i would say, it was a
10:48 pm
beginning place. i don't think that it's necessarily the ending place, but it was a place to start. i didn't want to throw everything out and start with, i don't know, you know, you have to start somewhere. >> okay. did the other panelists want to say anything about that? alex. >> i would just add that even though the first two do not directly apply, certainly not as written by dhs, they provide useful measures for us to determine to what extent does this raise privacy issues and in what areas. i think that's very helpful to use as a guide, in the way that becky has been using them at nsa. i like the idea of developing a statement of principles that would apply to the intelligence community, so i'll take that back. >> i think i don't have time for another question, but i would just suggest, if you're going to engage in that exercise, that you look at the thresholds question and you also look at oversight, because the fips talk about accountability and auditing, but creating a paper trail is not the same thing, and
10:49 pm
obviously, as i said in the previous panel, oversight is extremely important in this context. just food for thought. >> i think it's important that you don't have a check box. part of the problem i think with the fips also is that they lend themselves to a check-box exercise: yep, privacy statement; okay, am i doing everything; okay, i can do that; as opposed to, should i be doing that? that's where having an individual at the agency whose focal point is this benefits the agency, because it can quickly devolve to, i checked it off, i'm good. you have no privacy, but i'm good. >> and i would just say that the oversight perspective also has to be changing, because as technology allows us to collect more data in different ways and different data points, the oversight of it has less meaning if you're not also adapting on that side as fast as we are adapting to the technological
10:50 pm
changes. >> did you have questions? >> i would have some. thank you to the members of the panel. i would have questions that i want to ask, but there were a lot of audience questions. was there one or two that stood out particularly? because technically we only have five more minutes on the panel, i'm happy to have one or two audience questions. >> i think you should know you have won the jackpot thus far on audience questions. this one goes to you and draws on a remark from a previous panel. why can't the ic inform the american people about how many phone records were collected pursuant to 215 and make disclosures regarding u.s. person collection under 702 and eo 12333, understanding you're not targeting u.s. persons specifically, but the u.s. person
10:51 pm
incidental collection. >> so that's a good question. i don't want to duck it. i will say, i'm going to in a certain way, but no, i don't want to -- i guess i'm not getting into the specifics of, like, 215 or 702, et cetera. what i will say is there are two challenges. i understand the interest and i understand the importance. one is technical capability: can you in fact count it? and for some things, some activities, you should be able to count, but others inherently involve challenges. one of the pclob recommendations in the 702 report was in fact a count of some of the 702 collection that involves u.s. persons. there are inherent challenges in doing that. from a national security perspective, what i'll say is, what i have heard internally as we have pursued these kinds of questions is that providing that kind of information can in fact put at risk some kinds of collection, especially if
10:52 pm
you track it over time. an adversary can put the information together in terms of a volume of collection in one particular area and draw conclusions about what specifically is being obtained, what specific channels are being watched, and therefore change behavior. so our job from a transparency perspective is to continue to discuss that internally and see, well, are there ways of mitigating that, and what can we in fact disclose, because of the strong interest. >> so erica, i'll direct this next one to you, because you mentioned that part of the civil liberties and privacy protections are protections from wrongdoing. so the question is, in the case of privacy violations, sufficient remedial measures are critical. what, if anything, do you think needs to be done, either statutorily or otherwise, to strengthen existing remedial schemes? >> so, yeah.
10:53 pm
i do think that the remedies for privacy act violations, or for, you know, privacy violations generally, are -- you know, as i said in my remarks, everything could be examined and looked at for improvement. i was focusing my remarks on the fbi, so of course they have their own investigative unit. so if there is any particular activity that an agent engages in, for example, that is collecting information in violation of the rules or specifically because of first amendment purposes, that's subject to review and disciplinary action. with regard to individuals, i agree. we talked about how the fips aren't really as meaningful a guide for law enforcement either.
10:54 pm
i think it is not something that i can do, but certainly it has been attempted before to remedy the privacy act or to amend it. the administration is looking to expand judicial redress for non-u.s. persons, and dhs has a policy of doing so administratively, but statutorily i think it is a hurdle, and it is something i would be willing to have a conversation about to further that. >> just to keep this even across the board, becky, this one is for you. i think implicit in this question is a very interesting premise. do you anticipate that wide swaths of data will no longer be collected now that you are asking questions about whether they are really needed and about the civil liberties downsides? so i would say the premise is that it is your job to shut it down, which i think is a widely shared
10:55 pm
premise, and the basic question is, do you think you're effective? >> this also starts with the premise that we're collecting too much information today. and i think what i would say is that what we're working on is sort of a premise that, since nsa is filled with a lot of people who do math for a living, we're in the process of third grade math, where folks need to show their work. so they need to show why they're doing what they're doing so we can then have those conversations. i don't want to presuppose we will do more or less either way. but i do think that what we haven't done well is explain what we're doing. and if you consider that nsa has a long history of saying absolutely nothing to anyone, and in the last year and a half we've had to create a voice for ourselves to explain what it is
10:56 pm
we do, and recognize that there are a lot of ph.d.s in math at nsa who don't necessarily take well to speaking in public, it's a work in progress. and so my hope here is not to be judged by how much we turn on or turn off, but by demonstrating what the value is to the country in terms of what we're doing and demonstrating civil liberties and privacy protections. >> so thank you all for your remarks and your active back and forth on the questions. >> and we'll be taking a 10 or 15-minute break. we will resume with the private sector view on these issues. thanks. >> thanks. >> thank you. >> on the next washington journal, congressman tim ryan on what the democrats hope to
10:57 pm
accomplish in the next congress and the current lame duck session. then dennis hastert on the republicans' congressional agenda, and a look at the president's call for new internet provisions for net neutrality. we will talk to bloomberg bna. washington journal is on c-span. >> with live coverage of the u.s. house on c-span and the senate on c-span 2, on c-span 3 we show you the most relevant hearings. on the weekends, c-span 3 is home to american history tv, including six unique series: the civil war's 150th anniversary, visiting battlefields and key events; american artifacts, touring museums and historic sites to discover what artifacts reveal about america's past; history bookshelf, with best-known history writers; the presidency, looking at the policies and legacies of our commanders
10:58 pm
in chief; lectures in history, with top college professors delving into america's past; and real america, featuring archival government and educational films from the 1930s through the '70s. c-span 3, created by the cable tv industry and funded by your local cable or satellite provider. watch us in hd, like us on facebook, and follow us on twitter. more from the privacy and civil liberties oversight board. enhancing oversight in the private sector. this is 1 hour 25 minutes. >> thank you for everyone's endurance today. this is the final panel, an important panel on what the private sector has learned about privacy and how that might
10:59 pm
relate to considerations of the national security issues. and rachel brand is moderating -- >> okay -- >> all right, thank you, david. thank you to the panelists for being here. we are exploring what interests are underlying privacy. the second panel had to do with technology. the third panel was the government panel. and this last panel is supposed to be focused on solutions, and particularly those solutions that folks in the private sector might be able to suggest. what we will do here logistically: each panelist has seven minutes of remarks. sam kaplan is in the front row here with yellow and red cards, so when he holds up the yellow card you know you have two minutes left. please pay attention, and the red card means your time is up. at that point, as moderator, i will ask about 20 minutes of questions, and each of my fellow
11:00 pm
board members will have five minutes of questions. and then we will open it up to questions from the audience. as with the previous panels, when i start to ask questions, some of our staff members will stand up in the back and hold up cards, and you can get yourself a card, write down a question, and then the staff will pass it up here. so we will just go down the row and start with professor cate. i'm not going to go into length on the biographies because i think they are all available to you. professor cate is a professor at the indiana university school of law. he's been on a number of previous boards and commissions on privacy. so professor cate, we will start with you. >> thank you very much. this is the time, i think, to say i'm color blind, so i will have no idea what cards are held up. perhaps you will wave them in a definitive way and i will pay attention. so let me just -- first of all, i'm sorry not to be here this morning, but the last panel
