tv Key Capitol Hill Hearings CSPAN November 13, 2014 5:00am-7:01am EST
5:00 am
be deployed in a way that protects privacy. That is something the founders of the company instilled from day one, and it is why my job as a civil liberties engineer exists. One thing I learned, and this is certainly different from the Hill: when you walk into a room and say to engineers, "I'm worried about this thing you're building; it creates a privacy problem," the response is, "Oh, okay, how do I fix it?" That is not often what you get when you raise these things in other places. So it is our job as an engineering team to come up with suggestions for how to fix it. I'm a lawyer, as you may have guessed, so I do not necessarily possess a lot of technical skill. The main role for us is to translate between the lawyers and the engineers. What I want to focus on today is some of the technology at a high level, and then some suggestions for moving forward that I think are actually fairly low-hanging
5:01 am
fruit. So just briefly, to provide context: as I said, we do data management and data analytics. We're not dealing with the collection of data; this gets more to Professor Cate's point about the use of data. We have two high-level categories of technology that deal with managing or protecting privacy in the use of data: access controls and oversight mechanisms. I want to start by pointing out, and this is something to keep in mind, that just as technology has expanded the power of surveillance and the amount of data collected, it has also significantly expanded the level of privacy protection that is available at the agencies. Imagine an FBI file 50 years ago: probably pieces of paper in a redweld, sitting on a desk somewhere, or maybe locked in a desk drawer. Hopefully locked. Or maybe in a dusty basement archive. And there was probably limited
5:02 am
tracking of where the file was, maybe a log book. Anyone accessing the file could see whatever was in the redweld; you could just rifle through it and see everything, even material not directly relevant to what you needed. Auditing would be nonexistent: you couldn't see who added information to the file or who deleted information from it. And deletion was, hopefully, a burn bag or a shredder, but probably just crumpling it up into the trash, or a black magic marker redacting a few pieces of information. Today we can do much more data management and oversight, and management at a far more granular level. That is the access control point: you can now build access controls that manage data very precisely, on a data-point-by-data-point basis. You can do it in a more nuanced way; you don't have to choose between access and no access. You can make the access controls dynamic. So there are a lot of options, and the many options you have to
5:03 am
configure the access controls give you a near-infinite variety of options in how to manage data: who can see it and what they can do with it. The other point is oversight mechanisms. Here you think a lot about audit logging, and also about using electronic workflows to control exactly how data flows around an organization, who can see it, and what kinds of analysis they can do with it, or hard-wiring an approval chain for uses of data. These can be very detailed. The hard-wired approval process can be very complex and involve multiple actors, and the auditing of how data is used can itself be incredibly granular and incredibly detailed. And I want to get to the other point: these two capabilities are
5:04 am
a significant improvement over what existed before, and they can get us a long way. These are things that exist today. Now, I'm obligated to say that Palantir does this best, but this is not exclusive to Palantir; these capabilities can be deployed and used in a lot of different contexts. So what is the problem today? Why aren't these capabilities being used more? A couple of things. One, an issue of technical awareness: lawyers don't know technology, and engineers don't know law, and you need people who know both to be able to make the decisions about how to use these technologies and how to incorporate them into programs. Two, lack of resources: you need people who can actually manage the data. You talked about this in the earlier panel. Alex Joel has a very small staff. Erika has a very small staff.
5:05 am
They need resources and infrastructure to do this. Resourcing is hard: how do you use an audit log, and use it effectively? How do you use access controls, especially when you are dealing with massive amounts of data? The last one is death by anecdote. The debate, the cost-benefit analysis, tends to be the national security sector saying, "one time we caught this bad guy using this information," and the civil liberties community saying, "one time this unjust thing happened to a person because of this program." You can't just make this argument on anecdote. You have to look at data, where you can find much more specifically how these programs are working and how effective they are. So, solutions; I suggested some of them in listing the problems. Education: Palantir sponsors scholarships
5:06 am
to make sure lawyers can learn technology and engineers can learn law. It should be a requirement for engineers to have an ethics program: they will build technology that will hit the streets months or years before the law catches up, so shouldn't engineers know how what they are building affects privacy, and think about these things? Infrastructure: if privacy is an important value for us as a society, then we need to invest in the infrastructure to support it. Concrete guidance: we actually need to go beyond just saying systems should have use limitation. We need to tell people how they are going to do that (I can dig into that more if people have questions): really specific guidance, rather than just "you need to have notice and consent; you should think about use limitation" and things like that. And last, everything in the world can be datafied these days, including how effective these programs are. We can do analysis and start
5:07 am
analyzing data to figure out: is this effective or not? Is it having negative effects? Is it creating bias in the analysis? Thanks very much. >> Thank you. Our next panelist is Chris, a venture partner at Paladin Group. >> I spend most of my time teaching at the Naval Academy. I, like the other panelists, am grateful you established this venue for what I think is an important dialogue. I would like to make four quick points, then get to questions and answers. First and foremost, I absolutely agree with the premise that the framers of the Constitution did not intend for security and privacy to be in mortal combat, and that we should try to figure out how to achieve both. It may very well be that we cannot trade one for the other. I think that's right, but we have to work harder to achieve both, and I think technology and practice from the private sector can be
5:08 am
helpful there. Two, I agree that government is different, not simply in the powers or tools it might bring to bear on the citizenry or others, and that it therefore should be constrained. The government alone has the requirement to meet the standards of the First, Fourth, and Tenth Amendments of the Constitution, and from my NSA experience the most significant of those essentially says: unless you have the authority to do something, you should not. The so-called backdoor searches, or the Section 215 program under NSA's interpretation, were both specifically permitted under court-approved procedures, and were interpretations of the law that went through three branches of government. I think that's right and proper. That doesn't necessarily justify them; they may be bad policy at the end of the day, but the rule of law has to pertain to how the government gets things done. Point three: I would say I largely agree with what John had
5:09 am
to say; indeed, I wholly agree with what John had to say: that aspects of technology and law are at odds with each other, because they are perceived as exerting independent bias on any particular solution. I would add a third element, which is that what typically plays out in any one of these systems is that you are trying to reconcile technology, law, and the operational practice of those who make sense of the technology, and the unsurprising result is that they change at very different rates. Keeping them reconciled or synchronized from moment to moment is really hard. Therefore one-time mechanisms are not likely to satisfy the need; what you need are threads, systemic solutions that you pull through, and you take both art and science in the process of trying to figure out some solution here. I will wholly agree with John that education is absolutely essential.
5:10 am
At NSA, when we found ourselves with compliance incidents, in which no one intentionally made a mistake, we had to sit down and figure out how to find a horizontal joint between all who were trying to achieve something slightly different but were ultimately invested in the same problem. The last point I would make is that I do believe there is a role for big data, sometimes called mass collection. But the principles should be the same as for surgical data, which are necessity and proportionality. The government should be able to justify on what basis collection is necessary, such that it could then argue not for an encroachment upon civil liberties or privacy but for how we work harder to achieve their sustainment, and it should collect only in proportion to that need. All those comments aside, I would say that the private sector probably has a lot of experience
5:11 am
in this regard that the government can take advantage of. My own sense is that the government collects far less information than is perceived by the public, and certainly far less than the private sector does. I don't excuse the government for that; it should be held to account. But it can bring in technologies that might well scale quite well for the government's purposes, because it would have to scale them down as opposed to scaling them up. I'm open to any questions you might have. >> Thank you. >> Just a reminder to the audience: there are staffers in the back with cards, and if you would like to direct a written question to the panelists, hold up your hand, find one of them, and write down your question. And for the benefit of the audience and the cameras, panelists, when you're answering a question, if you wouldn't mind moving the mic back and forth. I'm sorry we don't have as many mics as we probably should. I would like to start by asking about oversight,
5:12 am
and Mr. Grant, I would like to direct this question to you first. Both in your oral statement and in the written statement you submitted to us, you talked about a wide range of mechanisms: paper trails, electronic workflows, and things like that. Frankly, when I read the written statement, it seems like an overwhelming array of different ways to engage in oversight. I think for a couple of reasons you need to choose your oversight mechanisms. One is that the agency will have limited resources to dedicate, and second, as I mentioned on a previous panel, there may come a point of diminishing returns on oversight; you need to leave the agency to do its job and not have it answering to oversight mechanisms all day long. So have you given some thought to what constitutes an effective oversight mechanism? How do you rank different mechanisms in terms of their effectiveness?
5:13 am
>> Yeah. I think we should actually think about oversight as a big data problem, and apply the same thinking to it that we would apply to analyzing intelligence or analyzing huge amounts of transactional data for marketing. It's a similar issue: you have a huge amount of data. There are massive amounts of audit logs, for example, in an organization like the NSA, and that's a lot of information, but you can use technology and analytic tools to make sense of that information and derive the insights you're looking for. Part of the issue is, first, that you need to do it. We see this all the time, and I know other organizations see it as well: everybody checks the box for audit logs. We've got audit logs, and we will go through an enormous number of hoops to make sure the system is logging exactly the information it is supposed to.
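The idea of treating oversight itself as a big data problem, proactively combing audit logs with analytic tools rather than letting them sit unread, can be sketched in a few lines. This is an illustrative toy only, not Palantir's or any agency's actual tooling; the log schema, field names, and thresholds are all invented assumptions:

```python
from collections import Counter
from datetime import datetime

# Hypothetical audit-log records; the schema (user/action/record_id/ts)
# is invented for illustration, not any real system's format.
AUDIT_LOG = [
    {"user": "analyst_a", "action": "query",  "record_id": "r1", "ts": "2014-11-01T09:00:00"},
    {"user": "analyst_a", "action": "query",  "record_id": "r2", "ts": "2014-11-01T09:05:00"},
    {"user": "analyst_b", "action": "query",  "record_id": "r1", "ts": "2014-11-01T10:00:00"},
    {"user": "analyst_b", "action": "export", "record_id": "r9", "ts": "2014-11-02T02:00:00"},
    {"user": "analyst_b", "action": "export", "record_id": "r9", "ts": "2014-11-02T02:01:00"},
]

def flag_unusual_activity(log, export_threshold=1, night_hours=range(0, 6)):
    """Proactively comb the log: flag users with bulk exports or
    off-hours access, instead of waiting for a complaint to trigger
    a manual review. Thresholds here are arbitrary examples."""
    flags = {}
    # Bulk exports: count export actions per user.
    exports = Counter(entry["user"] for entry in log if entry["action"] == "export")
    for user, count in exports.items():
        if count > export_threshold:
            flags.setdefault(user, []).append(f"{count} exports")
    # Off-hours activity: any action during the configured night window.
    for entry in log:
        hour = datetime.fromisoformat(entry["ts"]).hour
        if hour in night_hours:
            flags.setdefault(entry["user"], []).append("off-hours access")
    return flags
```

Run against the sample log, this surfaces only `analyst_b` (two exports, both at 2 a.m.), which is the point: the reviewer starts from a short list of leads instead of the raw log.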
5:14 am
We get fewer requests to actually look at the audit logs once the auditing mechanisms are turned on. There aren't many laws, as far as I can tell, that tell anyone they have to look at audit logs. It's the Seinfeld joke about renting a car: anyone can take the reservation; the important part is holding the reservation, that is, actually using the information. So to me, that's how you make oversight more effective: you use these analytic techniques. And that's another thing: the oversight people and the information security people should be as good as your analysts; you need good people who are doing the analysis and conducting the oversight. To get to your last question, which mechanism is most effective: I think it is using that auditing data, using the big data you've got, and having a team of people who can proactively comb through it. Not only are you looking for people doing something wrong, but you can also ask questions such as: does the data retention policy
5:15 am
make sense? You can look at the data and say: it turns out we keep this data set for five years, but no one uses anything older than three years, so let's change the policy to match the actual use of the data. >> Okay. I would especially like to get your thoughts from your time in government: what did you view as an effective oversight mechanism? >> First and foremost, if there is an authority granted, there is a burden imposed; they come hand in glove, and that's not a one-time thing. There cannot be a repurposing somewhere later once you have gotten past that threshold. The events might be collection, processing of data, analysis of data, and dissemination of that data, with the burden imposed at every step according to the authorities that were granted for the acquisition of that data in the first place. What we found in trying to achieve that is that data is aggregated,
5:16 am
synthesized; the iconic analytic effort doesn't simply use data from one source, it uses data from many sources. If there are different expectations attached to each source, it is hard to keep straight in your head what you are allowed to do with each of them. So the focus has to be: how do you bind the attributes for a particular data element at the moment it comes into being? >> Could you pull the mic a little closer? >> At the moment you collect a piece of data, how do you bind attributes to that data: what is the authority under which it was collected, what are the burdens and constraints that come along with it, and what are the prescriptions, if any, that come with it? Those should stay bound to that data through its life: through the life of collection, processing, analysis, and dissemination. Now, at some point there is a second-order use of that data, where someone reads a broad
5:17 am
swath of material, synthesizes it in their head, and constructs a document across an air gap. That gets hard. But at least in that primary use, if you have a systemic view from start to finish, you make the auditor's or compliance overseer's job much, much easier. You can then, in your system, in your technology, essentially impose a constraint or a check every time something exercises a privilege against that data, whether at collection, analysis, processing, or dissemination. That makes the auditor's job much easier and frankly has a nice deterrent effect, because everyone knows that at every moment they are held to account. But in my experience in government, it is not so much the deterrent as the very rule-laden environment. A typical counterterrorism analyst at NSA would often deal with hundreds of constraints on the data sets available to them, because various orders of the court, interpretations of the
5:18 am
court, and sharing arrangements with various other nations would all come along with their independent assessments of how the data can or should be used. So the bottom line is that technology can help us by essentially doing an atomic bind, meaning the provenance is organic to the data itself and should never be lost through the history of that system. >> Thank you. I would like to turn to the FIPPs. Mr. Geiger, I was happy that you recognized those, and Professor Cate as well, so I would like to direct this question first to the two of you. Mr. Geiger, I notice that in the written statement you sent us you talked about the FIPPs, but you didn't really talk about the individual participation principle. And when I talk about the FIPPs, I'm referring primarily to the DHS version. You said in your oral statement just now that the FIPPs are not a smorgasbord; they are a framework, and you can't just pick and choose among them. If you have to employ that principle, how can that work in a
5:19 am
surveillance context? >> That's the toughest principle to apply in this context. One way to do it, which is not viable or good policy, is to bring suit for violations of law. But my more reasoned answer, I think, is that if one principle is lacking in the national security context, then the rest of the framework has to work overtime to compensate, and that includes data minimization, which is why I emphasized data collection, and transparency, as well as the rest of the framework. I absolutely recognize the challenges for individual participation, but this is one area, again, where government is different from the private sector, and I think that difference should express itself in particular in the data minimization principle. >> Professor Cate, do you have thoughts on that? I would ask also: a lot has been written and said in
5:20 am
public recently about how the consent and individual notice principle really doesn't work well in the private sector, because nobody really understands what they are consenting to. They have to consent to get service, and it is a meaningless exercise. Do you have thoughts on that, and on whether individual participation can work in this process? >> Thank you. I do have thoughts, especially as one of the people who has written some of that. I think the challenge of the FIPPs is that they often lead us in the wrong direction. And I think this is a real challenge; I'm not in any way trying to make it sound easy or as if there is a simple answer here. But for example, if we think of the classic 1980 FIPPs, we are talking about consent, use limitation to the purpose specified, and then we add things like data minimization and
5:21 am
individual participation, and frankly almost all of these seem challenged in a modern data environment, private sector or public sector. In other words, how does that really work? There are 60 people in the room, all with cell phones, recording devices, video, audio. I don't have a consent statement from any of them. I don't know about my individual participation rights; I suspect they would look down on me wanting to interview each of them about it. The issue is an important one, which is how to protect privacy, but shifting the burden to the individual, which is what the FIPPs largely have the effect of doing, is a very difficult way to approach it. I think it is an important question in the public sector environment, but it may also lead to completely wrong results. In other words, one of the surprising things to me, and I can't believe I'm saying this in a place that's recorded, about Section 215 is that NSA collected all this data and did so little with it.
5:22 am
It was astonishing. So you would like to say, when people talk about atomically binding limits on what you can do with the data: if there is something new we might do with the data that might have a major effect on national security, we would have a process for some sort of risk analysis. What's the benefit? What's the risk? What are the processes in place to protect it? Now let's do that thing. Data has real value, in the national security environment and in the private environment. I think we need to think about approaches here that aren't binding everyone to some mythical transaction, where in the FIPPs world we say the individual agreed to this even though I can't think of a case in which the individual actually agreed to it or the consent was meaningful, and where in the national security world we just overlook that and say we think it was important, without, again, doing a clear and well-documented type of risk
5:23 am
assessment using clearly articulated benefits and harms. >> So it does sometimes lead programs in the wrong direction. The FIPPs are a useful framework for evaluating privacy protection, but in the application of the FIPPs, what you are actually doing with the program, you may pass muster under your privacy impact assessment, and yet the way the program operates on the ground may not be privacy protective. So I don't think the FIPPs are a silver bullet, but the principles themselves, I think, are very useful for the evaluation of a program. Second, there has been a long-standing controversy about notice and consent being inadequate, but that is why I said at the outset that the FIPPs are a framework: each principle is dependent on the others. This came up clearly in the health context. People don't know what they are consenting to when they receive a notice from their doctor; they don't know what the privacy notice says or means, or what
5:24 am
HIPAA does, which is why there have to be a lot of additional privacy protections in place to actually, meaningfully protect that individual's privacy. Then lastly, the FIPPs are not the only framework. I think it is a useful, indispensable framework, but there are other frameworks that can be applied, and should be applied, to data collection at large. >> Though this is the subject of the first panel and not this one, I want to ask anyway, and I apologize if I'm springing this on you: I want you to say what privacy is. You, I assume, spent time thinking about how to protect privacy and civil liberties. What does that mean? What interest were you trying to protect? >> I would say I don't think that has changed over time.
5:25 am
The fundamental question always comes back to two things. One, from the perspective of the individual: is there a reasonable expectation of privacy for fill-in-the-blank, whatever that information might be? That's the stuff of great legal debate, but operators think about it as well, particularly operators inside the government, because they are constrained by the Tenth Amendment to think about what else there is to do. But the second way to think about the issue of privacy is: what might you learn if you take these discrete data sets and combine them in a way that might give you some insight into things that were not self-evident from any one of the discrete data sets? You have to think about aggregation, synthesis, downstream use. Again, you might have thresholds you have to think your way through, and you have to go beyond that particular point in time. I would tell you that at the
5:26 am
National Security Agency, ethos is as important as compliance rules, FIPPs mechanisms, and things of that sort. Science alone will lead you astray; science alone cannot help you navigate the challenge, the question of how you achieve both security and privacy in a world where they are massively converged in a place called the internet. >> Professor Cate, do you have a thought on the nature of privacy? >> You were running out of time before you got to me on this. This is an area where I think the distinction between public and private sector is important, and it has to be kept clearly in mind. In the private sector, I think of privacy mainly in terms of harms or impacts on individuals or groups of individuals, whether that is the way we think about it in the Fair Credit Reporting Act, like a higher price for credit or denying someone a benefit, or whether it
5:27 am
is some other way in which we think about an individual being manipulated or charged a higher price. In the public sector, I think that is also true, but there is something more. In the public sector, privacy, I think from the very beginning of the constitutional debate, was seen as something about the balance of power between individuals and their government, between the citizenry and the government. And there is something quite striking there; I completely agree with Harley that the more the government knows about individuals, the greater the risk that that information will be used in a way that alters that balance of power, that makes the government more powerful and the individual less powerful. And, in a widely observed but ironic twist, as we've gotten into the 21st century, there is less transparency to the citizen about the government and more
5:28 am
transparency about the citizen to the government. That is a clear alteration in that relationship, that power relationship or that oversight relationship. And in that sense, that's why, again, focusing on collection or use: collection may be a not-so-significant matter, but at the end of the day it is use that matters. It is knowing how the government can use this information in a way that might affect me, as opposed to asking whether the information is out there, to which the answer now always seems to be yes. >> Mr. Grant? >> I don't necessarily have an answer, but I think I have a sort of framework for thinking about it: think about it in social media terms, and in terms of how younger people are viewing privacy. If you ask, most of our engineers appear to be about 14.
5:29 am
We had a discussion internally about whether we should look at LinkedIn, Facebook, and the like as part of the ways to detect phishing and things of that sort. They vigorously objected. And they would say: you tweeted that, which means people are going to read it; it is a tool for communication to the world. They still felt, yes, it is publicly available, anybody can Google it, but they still had an objection to the government collecting it or reading it, or their employer reading it, things like that. I don't know what that means in terms of coming up with a final definition of privacy, but it suggests there is a different view of it, and that even in public information there is somehow still privacy inherent. Like I said, I think talking
5:30 am
through attitudes toward social media, and understanding those, could help us figure out what the newer conception of privacy is in this technological age. >> Do you have something to say, Mr. Geiger? >> Sure. I said most of it in my opening remarks. I view privacy as an individual's ability to control information about herself, but then also as the control that the entity holding the information can exercise over individuals. And I think it is very important not to just look at privacy harms, or privacy interests, to the extent that privacy is about control over an individual or their decisions, in the context of today's technology. It is important to look out over the next couple of decades and see what is coming down the pike, and there are very pervasive, very privacy-intrusive technologies that we
5:31 am
will see in our homes, or on ourselves, in our lifetimes, and certainly in our children's lifetimes. The laws haven't kept pace, and without a change in the law, again, I reiterate that internal protections on use and access, while important, are not sufficient, because they can change; they have changed. When we talk about protecting privacy, I think we should look, as I said, to what we are protecting several generations down the line. >> Professor Cate, we've been talking about that throughout, and about how the private sector might have solutions the government could learn from. Private companies are obviously doing something to control the use of the information they collect; they have to. They have a privacy policy that says what they will do with your information, and they have to comply with it. Are there mechanisms they use for enforcing limitations that are effective, that the
5:32 am
government might learn from? Mr. Grant, do you have a view on that? >> We see this a lot with our customers who hold data. Honestly, they use the same basic mechanisms I described in my testimony, and often have the same basic weaknesses. Do they have the infrastructure to manage access controls? A lot of them do not, and it costs money and takes time. Are they conducting oversight of data use? Probably more than some, but again, because of limited resources, they are probably still not doing it at the level you would hope. And one thing I notice is that for a lot of them, even in Europe, where you have more commercial privacy law and more commercial privacy compliance
5:33 am
requirements, a lot of times it's best guess. For example, one issue we have been running into recently is looking at cybersecurity and data exfiltration risk in the private sector. These giant companies are trying to deal with privacy laws that are all over the map. They are asking questions like: if a German employee sends an e-mail to a U.S. employee, what privacy rules apply to the content of that e-mail? In Germany, you have to tell people, "I'm going to monitor your e-mail"; in the United States, companies can basically do what they want. There are differences in what the privacy laws are trying to do, but I think they are facing a lot of the same problems, related to scale and to lack of understanding of what the rules should be, as the government.
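The cross-border question described here can be made concrete with a toy sketch: encode each jurisdiction's rule as data, then resolve a conflict by applying the stricter rule. The rule table below is a deliberate oversimplification for illustration, not a statement of German or U.S. law, and "stricter rule wins" is just one possible compliance posture:

```python
# Illustrative only: toy per-jurisdiction rules for one activity.
# These entries are simplified assumptions, not legal advice.
RULES = {
    "DE": {"email_monitoring": "notice_required"},
    "US": {"email_monitoring": "permitted"},
}

# Ordered from least to most restrictive; used to pick the stricter rule.
STRICTNESS = ["permitted", "notice_required", "prohibited"]

def applicable_rule(sender_country, recipient_country, activity="email_monitoring"):
    """Resolve a cross-border conflict by applying the stricter of the
    two jurisdictions' rules (one common, conservative compliance posture)."""
    candidates = (
        RULES[sender_country][activity],
        RULES[recipient_country][activity],
    )
    return max(candidates, key=STRICTNESS.index)
```

Under this toy model, a German-to-U.S. e-mail resolves to `"notice_required"`: the point is that once rules are encoded as data, the same resolution logic applies uniformly at scale instead of being re-argued e-mail by e-mail.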
5:34 am
>> There is probably a lot of great technology out there that can be used, but any technology can fall into the wrong hands without the right process. The process to consider might be this: first and foremost, before you acquire any capability, whether in the government or the private sector, you think about the proportionality question: is this necessary, and have I done this only to the degree it is necessary? And what we are trying to achieve is not simply a balance of privacy and security but transparency, because without it you won't often believe the balance of the first two has been achieved. That drives accountability in the government: the need to acquire explicit authority that comes with
5:35 am
constraints, constraints that are bound to that authority, and some measure of accountability for those constraints. The process elements that are then implemented to pull that off should have the aspect of continuous compliance, not discrete compliance: you think about it all the time, first, middle, and last. A stretched analogy is cybersecurity: part of the problem there is that we think of it as a bolt-on, when we should operate systems continuously with it foremost in mind as a primary attribute; otherwise it will break our hearts. Next, there is an external component as well as an internal component: you have to hold people accountable internally to the system, or you can wind up with mismatched expectations, or the system might in fact go rogue. And third, there has to be, at various phase points, required reporting, which is important
5:36 am
because that is the synthesis and retrospective that says, how do we aggregate our experience? do we need to invest time and energy in the process itself? absent that, you find that you're the frog in the beaker, and it is just getting a degree hotter moment by moment, and all of a sudden you're the boiled frog. you didn't realize, until you step back and take a hard look, that you got off course a little bit. time to go. >> thank you. i think my time is up. we will go to mr. dempsey and go on down the row. >> thank you. thank you, members of the panel, for giving us your time today. in a way, building off of something that chris said, or at least what i heard, you are saying that we need the technology controls. we need to build the technology in a way that implements the
5:37 am
controls, but at the same time you need the policies that surround it. you need the legal rules, et cetera. i think, john, my first question is to you. you talked a lot about the potential of the technology in terms of tagging information and audit controls and permission controls, but just to state the obvious, that's no substitute for legal rules and policies. >> absolutely not. we try to say, even when we talk about privacy capabilities, if you think you're buying a switch that you can flick that protects privacy, it is not going to happen. it is not possible. you have to be able to respond dynamically to changing situations. you have to be able to make human-driven, nuanced decisions about data and how it is
5:38 am
used appropriately. that is just not something that machines can do. there is no find-the-terrorist button; you need a human at the top of the analysis chain. nor is there a button that says, don't worry about it, we've got privacy covered. so the goal for technologists should be to ask: what kinds of tools do policy makers need? the oversight boards and the civil liberties protection officers, what do they need, and what makes their job easier or possible, especially when you are dealing with data at scale? an easy example: there's a lot of work, a lot of research, going into improving access control interfaces. when you're dealing with terabytes of information in the cybersecurity space, how can you create technological shortcuts to allow a human to make the decisions about how to manage that data?
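The data-point-level, dynamic access controls described above can be sketched as a tag-based check: each data point carries tags, and a user sees it only if their clearances cover every tag. This is a minimal sketch of the general technique (attribute-based access control), not any particular agency's or vendor's implementation; the class and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataPoint:
    value: str
    tags: frozenset        # e.g. frozenset({"cyber", "us-person"})

@dataclass(frozen=True)
class User:
    name: str
    clearances: frozenset  # tags this user is cleared to see

def can_access(user: User, point: DataPoint) -> bool:
    # dynamic, per-data-point check: every tag on the data point must be
    # covered by the user's clearances -- no all-or-nothing file access.
    return point.tags <= user.clearances

analyst = User("analyst", frozenset({"cyber"}))
flow = DataPoint("suspicious flow record", frozenset({"cyber", "us-person"}))
print(can_access(analyst, flow))   # False: not cleared for "us-person" data
```

Unlike the paper-file-in-a-redwell model, the decision here is made point by point, so one record in a file can be visible while the record next to it is not.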
5:39 am
and that is how you do it. you think about how you support the policy, not how you replace the policy. >> let me go to fred cate. fred, totally accepting your point about the limitations of the fips, and totally accepting your point about the importance of focusing on risk and focusing on use, you're not saying that collection is irrelevant. obviously the fourth amendment is in some way a collection limitation, and, you know, in the commercial context, the company that had the flashlight app that was collecting data from the -- nobody even got to the harms analysis; the collection was inappropriate in and of itself. >> right. >> you are absolutely right. and i agree completely. in other words, i'm not
5:40 am
suggesting collection is irrelevant. but we have made collection the end of the story, so once you cross the -- you know, it's like a spillway at a dam. once you're over the collection limit, then anything else goes. >> the ironic thing is that at nsa, as chris inglis said, their view is they never thought of it that way. they thought that you have your collection authorization, which is critical, your retention, your use, your dissemination, your retention limits. that each one of those -- >> if i can just respond to that, i think there's something of a mismatch here, and i'm not in any way doubting either what nsa is doing or what chris is saying. but one of the astonishing things, for example, when i read the section 215 report that came out from the nsa civil liberties office -- a well-written report -- it was full of all of
5:41 am
the limits on what they were doing and the incredible, what can only be described as bureaucracy. and what struck the american people is, how was the authorization obtained in the first place? we had a law that said relevant to a specific investigation. 99 out of 100 people thought it might be focused on specific individuals. apparently the 1 out of 100 that didn't was a fisa judge, and he had other members along with him, members of congress. so i think one of the critical issues when thinking about going forward is, if this were the private sector, there would have been immediate -- you know, that policy that says we will collect information for limited purposes, and that means we will collect everything. then there is customer reaction. what can we create that will
5:42 am
mimic that in the classified environment? maybe that's literally having somebody outside of the agency but focused on privacy and civil liberties who says, we understand the challenge, but we think you have got the wrong end of the stick. but i think it has been an overly narrow focus on the fourth amendment that creates this problem. as you well know, the fisa court just dismissed it by saying third-party doctrine, no problem at all, let's go ahead. someone should have said, wait, you are talking about collecting data on everybody. and that would have focused the discussion in a way that all of the technological controls and all of the bureaucratic controls, now well documented in the agency, somehow never did. >> i don't want to -- that's very helpful. i don't want to further rehash 215 and the history of 215. and anyhow, i have a red card, so i guess that's the end. thank you.
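The "continuous compliance" idea raised earlier in the panel, together with the audit controls mentioned in the questioning, can be sketched as an append-only access log plus a recurring scan that flags permitted accesses whose stated purpose falls outside the authorization. This is a minimal sketch under assumed names (`record_access`, `continuous_compliance_report`, the purpose strings); it illustrates the technique, not any agency's actual system.

```python
from datetime import datetime, timezone

AUDIT_LOG = []  # append-only record of every access decision

def record_access(user: str, resource: str, purpose: str, allowed: bool) -> None:
    # log every access, allowed or not, so oversight can review after the fact
    AUDIT_LOG.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "resource": resource,
        "purpose": purpose,
        "allowed": allowed,
    })

def continuous_compliance_report(authorized_purposes: set) -> list:
    # "continuous, not discrete" compliance: rescan the whole log and flag
    # any permitted access whose purpose falls outside the authorization
    return [entry for entry in AUDIT_LOG
            if entry["allowed"] and entry["purpose"] not in authorized_purposes]

record_access("analyst1", "file-42", "counterterrorism", allowed=True)
record_access("analyst2", "file-42", "curiosity", allowed=True)
flags = continuous_compliance_report({"counterterrorism"})
print(len(flags))  # 1: the "curiosity" access is flagged for review
```

The point of the design is that compliance checking runs over the full history on every pass, rather than as a one-time sign-off at acquisition.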
5:43 am
>> so let me just follow up quickly on that point. maybe what we need to do is supplement the fips with the omg standard. which is, you know, in private practice, i could have a client and say, everything you propose to do is perfectly legal, but are you nuts? how do we embed that -- stepping back and saying, okay, the lawyers have technically signed off, everyone has technically signed off, but this is a crazy thing to be doing? >> one positive step is adding someone like ms. richards and an office to support her within the agency. i think that's one way. so you have people not just thinking about the law but people who say, i understand legal clearance is taken care of, but i still have the oh-my-god response. am i allowed to refer to god at a hearing? >> free speech. you can say what you want. >> i'm nervous about that. so there are rules, not necessarily identical, but
5:44 am
outside of the agency. there is where i would say, though this may reflect my naivete, that we would not have secret law. so if something is interpreted to mean the opposite of what it appears to mean, someone would feel the need to signal that, as opposed to going out of their way to say, no, it doesn't mean what you think it means; it means only what we think it means. so we would build in avenues for transparency about the law, so that at least we all knew what the rules were going into it. and i think that's a huge problem when the law itself is effectively classified because of the way in which the interpretive process works. >> i'm sure jorge posada can just jump in on that. engineers and technologists think of things as, does it work or not work.