Key Capitol Hill Hearings CSPAN November 14, 2014 7:30am-9:31am EST
7:30 am
translate the fifth? is it the right translation? does it work in the government context? i was struck by the mosaic theory because there are other implications. transparency, to the extent that we are transparent, comes in discrete ways; our adversaries are looking to aggregate information about sources and methods. the other side is that the national security establishment would argue the mosaic theory is critical -- you need to understand collection and what the apparatus is, and that they have to aggregate information. you can agree or disagree, but i was struck by the different implications of the mosaic. i wanted to start with you, professor. i was really interested in your notion of moving away from the brute force collection mechanism
7:31 am
and i think the section 215 program is one where the government has made the argument that they need brute force collection, that they need the retention to identify previously unknown links in information. have we given thought to whether there are technological options available to limit collection for a program like section 215? if you have thoughts more generally about collection options, you can be more specific. >> with respect to section 215, the data is collected initially by the phone company. the question is whether the information needs to be transferred in bulk to the intelligence community in order for them to do their analysis, and it is pretty clear that as a technical matter, the kinds of linking, looking for multi-hop
7:32 am
links that the intelligence agencies want to do, can be done technically while the information is held by a third party such as the phone companies. this requires a modest amount of technical coordination between the entities holding the data and the entities that are doing the analysis. there are opportunities to match, to look for whether there are paths of two hops or three hops from point a to point b, and reach in and extract the data of those individuals or phone numbers that are highlighted by that analysis. that is the kind of thing that can be done. there is further work that is more technical that goes to questions of whether you can use advanced cryptography to allow the same analysis without disclosing information to the phone company.
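[editor's note: the following sketch is not part of the hearing. it is a minimal python illustration of the two-to-three-hop "contact chaining" query described above, run while the call records stay with the carrier; all numbers are hypothetical.]

```python
from collections import defaultdict, deque

# a toy set of call-detail records held by the phone company: (caller, callee)
CDRS = [("555-0001", "555-0002"), ("555-0002", "555-0003"),
        ("555-0003", "555-0004"), ("555-0001", "555-0005")]

def build_graph(cdrs):
    graph = defaultdict(set)
    for a, b in cdrs:              # treat each call as an undirected link
        graph[a].add(b)
        graph[b].add(a)
    return graph

def contacts_within_hops(graph, seed, max_hops=2):
    """return every number reachable from `seed` in at most max_hops calls."""
    seen, queue = {seed: 0}, deque([seed])
    while queue:
        cur = queue.popleft()
        if seen[cur] == max_hops:
            continue
        for nxt in graph[cur]:
            if nxt not in seen:
                seen[nxt] = seen[cur] + 1
                queue.append(nxt)
    del seen[seed]
    return seen                    # number -> hop distance from the seed

graph = build_graph(CDRS)
print(contacts_within_hops(graph, "555-0001", max_hops=2))
# only the numbers highlighted by this analysis would be extracted and handed
# over; the rest of the records never leave the carrier.
```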
7:33 am
there has been interest in this technical problem, how to do this, in the independent research community in light of what we have learned about the section 215 program. one of the lessons is that methods are available and can be developed when you have a specific technical problem like this. >> the biggest challenge is taking the concepts we are talking about and developing practical, feasible recommendations. briefly, to the professors in the middle -- you both talk about risk mitigation, assuming there are going to be harms and asking how you mitigate those harms. what about the collection stage? what have you found to be the
7:34 am
most effective mechanisms for mitigating risk? is it retention periods? is it access controls? is it audit trails? what can the government do to start mitigating risk? >> it is not really just one thing that i can point to and say that is it. all of those things are very valuable to do, everything from mechanisms to assure the information is accurate. when information is grabbed from one context to another, what is accurate enough for the purposes of amazon.com to recommend books for you is not the same as what we may want from the government. if amazon makes a mistake and recommends the wrong book to you, big deal. it doesn't need 100% accuracy, but the level of accuracy differs as contexts differ, so we need mechanisms to make sure information that might be taken
7:35 am
from one context and put into another is appropriately accurate for that context. we need an analysis of how long to keep the data, and to ensure it is not being improperly accessed, that it is adequately secured, and how it is being used -- controls on its use so it can't be misused ten years from now. we need all these different things and oversight from a lot of different bodies, so it is a complex thing with many parts. >> there are many moving parts, but from my perspective, from outside and inside, since the threat we are talking about is government, the principal factors i would focus on, that seem to have been effective, are those that focus on individual government actors: training in the first instance, inculcating
7:36 am
a culture of compliance -- those are pre-error mechanisms -- and finally, and this is perhaps where we fall down the most, the willingness to impose at least administrative sanctions on people who vary from accepted rules, at least in willful contexts and even in negligent contexts. nothing attracts the attention of a government employee so much as the prospect of losing his job or being suspended for a term of months, so that is where i would focus. >> if we look at the failures of compliance that have been acknowledged, we see some individual employees doing
7:37 am
things they shouldn't, but we have also seen some that are failures of the technical systems to behave consistently with the internal policies. this is a case where oversight can operate without needing to get into the nuts and bolts of the technology. the question is what processes are in place to make sure your technology does what your general counsel says it should do. there's an opportunity to push on oversight in that area. >> to james dempsey now. >> thank you to all the witnesses. it is important as we wrap up this panel to highlight what i at least heard, an awful lot of commonality, because it is important for the board and the public debate moving forward not to end up with the proposition
7:38 am
that this is confusing or disparate or just many different views. i heard a lot of commonality among the witnesses, starting with this point: i think you all agree, whether you start from the premise that privacy is a human right or from the premise that it is an instrumental right, that it is an umbrella term that covers many different values, many different interests. i also heard agreement that the mosaic theory, even if it hasn't been accepted by courts, is real. it is real both from the privacy perspective and from the governmental perspective. >> let the record reflect that.
7:39 am
>> i heard unanimity on the third party doctrine, the doctrine that by giving information to one person you lose all interest, all privacy interest -- that disclosure to one surrenders your rights with respect to disclosure for any other purpose. there was agreement that "disclosure to one is disclosure to all" is not valid constitutionally; set aside modern-day reality, that doctrine just doesn't fit with the way we view information and privacy. would you agree that disclosure to one is not a surrender of all
7:40 am
interests in the information? >> i would say that given the way people interact today, it would be inappropriate to infer consent to universal disclosure from explicit consent to disclosure to a single person, yes. i am not sure that i would agree with what is implicit in your question, which is that it necessarily follows that this is of constitutional significance. i would accept the premise about human experience -- if i tell something to one person, i don't expect him to tell everybody. >> taking the instrumental approach, there is an instrumental value: the disclosure of medical records to the doctor is specifically premised on the notion that you have not surrendered your
7:41 am
privacy rights, and in fact we want people to disclose information to their doctors, and we promise them that their medical disclosure to the doctor is not disclosure to all. >> that is a wonderful example because we accept statements made to doctors as an exception from the hearsay rule: when you talk to a doctor in an emergency situation you are motivated to be telling the truth. i was shocked that the doctor can in some circumstances be compelled, so the realities work both ways. >> can be compelled to -- >> we cannot go on. >> also, as several witnesses mentioned, it is important to say we are talking about the fair information practice principles, of which there is actually no definitive version, but there is a
7:42 am
version that was adopted by the department of homeland security in 2008 which is as good as any. it seems to me there was agreement that the principles provide the framework, the questions. they are nowhere perfectly implemented, nowhere fully implemented, but they are relevant as a framework for asking how you deal with information, and you decide: do you adjust them, do they work? if one doesn't, you compensate with more emphasis on other principles. is that fair? you are making a skeptical face, but you at least would say that it is the framework for asking the questions? >> it is the framework for a starting point.
7:43 am
but i think many of those questions don't withstand the technological transitions we are going through. so i accept it as a leading-off point, but i am more willing than other members of the panel to discard some of them as inoperable under current circumstances. >> what would you replace them with? >> emphasis on the remaining aspects and, to my mind, a more granular analysis of the underlying interests -- thinking about what the mechanisms are, rather than treating "the privacy interest we are talking about" as the thing we have to protect, because that is one size fits all and i don't think it covers the range of privacy interests.
7:44 am
>> we have a couple of questions from the audience and i am not sure we will get through them all. they are directed to particular panel members; please keep your answers as brief as you can. the first one, directed to you: when the government draws data from private databases, such as telephone metadata, at which point of collection is more regulation required -- private entity collection, or government collection from private entities? yes or no. >> i am not going to answer yes or no to that question. obviously when you disclose information to the telephone company you are in a contract with that company, and that regulates the
7:45 am
company. one of the problems with section 215 of the patriot act is the contract: people did not know what they were consenting to, that their information would then go to the nsa. there are different types of regulation -- contractual regulation, stored communications regulation, government regulation. when the government gets certain types of information from the telephone company there is the fourth amendment, so there are lots of regimes at work. >> private companies have no incentive to coerce or imprison people, so the risks of injury might be greater from the government than from private companies, but the writer asks: does that
7:46 am
take into account the homeland security and prison industries? they can't do that without contractors providing technical support. are there risks in the increasing commercialization of national security interests? >> problems can come from anywhere, but there are things that can be said about how problems are caused: when does the amassing of data by the private sector cause problems? when does access by the government create problems? increasingly we see cooperation, or an industry in the private sector that has grown up to perform government functions and gather and analyze data. all of these things create various problems we need to address.
7:47 am
keep an eye on the problems, stop looking elsewhere, look at the problems and address those problems -- that is the best approach. >> there are two more. the panelists are asked to discuss what privacy protections they think the tesla asked about today should provide: what technology or data analysis could or should be built in. >> in a sense the question is asking me to sum up an area of knowledge in a few seconds, which i won't try to do. i will simply say, as with congress, so with a tesla, a high-end car: you should think in terms of which
7:48 am
technologies are available and reasonably practical to use to minimize or control or limit the risks of an information practice, and ask that those be there. and you should ask that an entity that wants to collect and use the information be willing to justify the choices it made and to justify why it did not use some accepted technical privacy measure. >> the last one, i don't think -- what is the application of privacy in organizations like the postal service? you may remember from my old judicial background those guaranteed interests. >> pension benefits. how are they impacted by the
7:49 am
fourth amendment? are there issues and concerns for privacy in those organizations? >> the honest answer would be i am not sure. my understanding is the fourth amendment applies to those institutions insofar as they exercise governmental authority and act as agents for the government, so i assume postal service employees can't open your mail willy-nilly just because they are pseudo-private actors. i may be wrong about that, but since they don't open my mail -- kim is nodding, no, i am right. i think that the implication of the question, which is the most interesting part of it, and lets me transition to something i want to talk about, is that it emphasizes
7:50 am
the point that was made, which is that the line between commercial collection and government collection is increasingly blurring, and that the idea of heavy regulation of the government and no regulation of google's collection sits in dissonance, along with the places in between. for me, that suggests one set of answers, because i am willing to think about wholesale government regulation and some level of regulation of corporate business practice, but it emphasizes the difference between -- >> that ends my part of the panel. >> thank you very much. thanks to the panel and for the audience questions.
7:51 am
7:52 am
[inaudible conversations] >> james dempsey will moderate this panel. >> thank you, and good morning to members of the audience, and particularly good morning to our second panel. the title of our panel is privacy interests in the counterterrorism context and the impact of technology. i have no opening statement of my own, so i can go straight to the opening statements by witnesses. i will introduce each of them in turn. we will go right down the row, which happens also to be alphabetical order. i remind the witnesses that we would ask them to keep their
7:53 am
opening remarks to seven minutes. there is a timekeeper, whom you might not have seen, in the front row here; rene will be holding up a yellow card for your two-minute warning and a red card for time up. after a round of questioning by board members, there is the possibility of questions submitted by members of the audience. staff members throughout the audience have little index cards, so if during the course of the panel a question occurs to you, raise your hand and someone will bring you a little 3 x 5 card. our first speaker is annie anton, professor and chair of the school of interactive computing at georgia
7:54 am
tech. she has a ph.d. in computer science and is one of the country's leading experts on issues at the intersection of technology and policy. please. >> good morning and thank you for the opportunity. let's try that again. good morning and thanks for the opportunity to testify. we are in an ever-changing world where terrorists and criminals are getting smarter and more sophisticated. their offensive techniques are surpassing our ability to protect our nation. providing strong technical protection for privacy and civil liberties is a counterterrorism weapon. i focus primarily on three technology considerations. first, strong encryption is an essential technology for fighting terrorism. second, de-identification, while
7:55 am
not perfect, may be a reasonable approach given a thorough risk analysis. third, improved privacy threat modeling is critical for counterterrorism. our national cyber infrastructure must be resilient to attack from foreign powers, terrorists and criminals. requiring government back doors in commercial products, stockpiling exploits and weakening software security standards are all practices that hurt our nation's cybersecurity posture and make it easier for attackers to infiltrate these systems for nefarious purposes. the latest apple and google phones build in encryption by default. both companies are configuring this encryption such that they cannot decrypt the information for anyone, including law enforcement. these measures have been criticized by the director of the fbi and the attorney general.
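[editor's note: a minimal sketch, not from the hearing, of why encryption by default with a key derived on the device from the user's passcode means the vendor has nothing to hand over. it uses the third-party python `cryptography` package; all values are hypothetical.]

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_key(passcode: bytes, salt: bytes) -> bytes:
    # the key is derived on-device from the user's passcode and never stored
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return kdf.derive(passcode)

salt, nonce = os.urandom(16), os.urandom(12)
key = derive_key(b"user-passcode", salt)
ciphertext = AESGCM(key).encrypt(nonce, b"private user data", None)

# the vendor keeps only salt, nonce and ciphertext; without the passcode
# there is no key to produce for anyone, law enforcement included.
recovered = AESGCM(derive_key(b"user-passcode", salt)).decrypt(
    nonce, ciphertext, None)
assert recovered == b"private user data"
```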
7:56 am
as a technologist i can assert that applying security best practices such as encryption by default strengthens our security; requiring companies to provide back doors for law enforcement harms national security, individual privacy and overall security. moreover, the security benefits are questionable at best, because sophisticated terrorists and criminals will use international products or more secure, less convenient alternatives. technology and policy scholars are debating the merits of de-identification and anonymization techniques. the issue is critical because privacy rules only apply to identifiable data. technology scholars emphasize there is no way to mathematically prove that a data set cannot be re-identified.
7:57 am
in contrast, privacy scholars -- policy scholars -- believe de-identification provides practical protection to most of the people most of the time. consider that the locks on your door at home are pretty good but not good enough to keep a determined intruder at bay. that is the idea behind this. there are some cases where it is critical to protect a person's identity; for example, for victims of domestic abuse we need to ensure their location is protected and they cannot be identified by an abuser. however, in many settings, if we apply effective but not perfect de-identification procedures, overall privacy protection may be increased and the data may be more useful. in some cases the perfect should not be the enemy of the good.
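[editor's note: a minimal sketch, not from the hearing, of one "effective but not perfect" de-identification measure, k-anonymity: generalize the quasi-identifiers and check the smallest group size. the records are hypothetical.]

```python
from collections import Counter

# hypothetical records: (zip code, age)
records = [("20001", 34), ("20002", 36), ("20001", 35), ("20002", 37)]

def generalize(zip_code, age):
    # truncate the zip code and bucket the age by decade
    return (zip_code[:3] + "**", (age // 10) * 10)

groups = Counter(generalize(z, a) for z, a in records)
k = min(groups.values())
print(f"dataset is {k}-anonymous on (zip, age)")   # here: 4-anonymous

# like the door lock in the analogy above, this deters casual
# re-identification; a determined adversary with auxiliary data may still
# break it, which is why the risk analysis matters.
```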
7:58 am
the board might consider how to determine when agencies should insist on technically strict de-identification versus when not-perfect de-identification may address the bulk of the risk. finally, threat modeling is critical for counterterrorism and we must improve it to achieve two goals. we must develop privacy-oriented threat models; most threat modeling techniques were developed entirely in a security context with little privacy consideration. the latter is crucial given the rise of analytics and other techniques. as a nation we do not want insiders leaking state secrets to foreign journalists to become a common way to influence public policy decisions and debates. insiders with access to information must be considered as potential threats simply because of the extreme damage a leak could do, either in direct costs by providing useful information to enemies or indirect costs through public erosion of trust.
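[editor's note: a toy sketch, not from the hearing, of a privacy-oriented threat model of the kind the witness describes, listing insiders alongside external attackers; the entries and scores are hypothetical placeholders, not real assessments.]

```python
# asset -> list of (adversary, threat, likelihood 1-5, impact 1-5)
THREATS = {
    "bulk analytics store": [
        ("external attacker", "breach of retained records", 3, 5),
        ("insider",           "leak of classified material", 2, 5),
        ("insider",           "query abuse / snooping",      3, 3),
    ],
}

for asset, entries in THREATS.items():
    # rank by a simple likelihood x impact risk score
    for adversary, threat, likelihood, impact in sorted(
            entries, key=lambda e: e[2] * e[3], reverse=True):
        print(f"{asset}: {adversary} -> {threat} (risk {likelihood * impact})")
```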
7:59 am
a good threat model makes risk analysis usable for any organization. in closing, as a technologist i believe we should encourage strong encryption by default, use de-identification technologies now rather than wait for theoretically perfect solutions, and expand threat modeling to include privacy as well as security. in addition, on the importance of having technologists in the room: i can't help but note the review group did not have a technologist. i appreciate all that you are doing, but when there isn't a technologist on the panel, i would like to see more technologists involved in these decisions. i would like to thank the privacy and civil liberties oversight board for its commitment to finding ways for the government to protect
8:00 am
privacy while also meeting our critical security needs as a nation. >> thank you for your testimony, but we have a technologist in the second row and outside as well, so we do value the role of technologists and have two. >> i look forward to meeting them. >> our second witness is the executive director of the center on privacy and technology at georgetown university law center, and was previously chief counsel of the senate judiciary subcommittee on privacy, technology and the law. >> good morning and thank you for the opportunity to speak with you today. we have a problem in privacy; it is a problem for government and
8:01 am
8:02 am
under the new model you collect as much information as possible and you protect privacy through after-the-fact, post-collection use restrictions. i'm here to encourage you to resist this new model. i make four points. the first is that collection still matters; collection impacts a person's right to privacy regardless of what happens to the data after the fact. second, this was discussed on the first panel, but there's a misconception that the fipps are only useful for commercial privacy. i talk about the fact that the fipps remain a critical benchmark by which to measure the privacy impact of counterterrorism policy. i will add, given the previous discussion, that literally since their inception in 1973, the committee that wrote the report dedicated a section to talking about how not all of the fipps can apply in an intelligence context, but clearly some of them must, because the risk is too high.
8:03 am
third, i am testifying that we need to remember privacy is about taking -- pardon me, it's about taking and not about sharing. fourth and finally, i think americans do expect a degree of privacy in public. given my limited time i want to focus my oral testimony on the first point, collection. after the snowden disclosures on the telephone records program last summer, the first line of argument was: we may collect a lot of information, but we look at only a tiny part. the problem with this is that it is not how people think about privacy. if a police officer knocks on your door and says give me a list of every person you have spoken with in the last week, and says don't worry, we are probably never going to look at this stuff, would that reassure you? i think most people would say no. this highlights the fact that the forcible collection of sensitive data invades what this board
8:04 am
has called the core concept of information privacy. and it's not just a concept. in my mind the single biggest reason to resist the privacy model that primarily relies on post-collection use restrictions is the disparate impact that model might have on vulnerable communities. in the use restriction model, you collect everything and protect privacy by banning harmful uses of data after it's been collected. the problem is there's basically what i will call a moral lag in the way we treat data. we as a society are often very slow to realize that a particular use of data is harmful, especially when it involves data of racial and ethnic minorities, lgbt people and people of little political power. the two most prominent examples
8:05 am
of this moral lag involve the department of defense, or formerly the department of war. during world war ii, japanese-americans volunteered detailed information about themselves and their families. they volunteered under a statutory promise from the federal government that that data would remain confidential. this was a use restriction. what happened? in 1942 congress waived those provisions, and the department used this detailed sensitive data to monitor and relocate japanese-americans to internment camps. after world war ii a similar story unfolded for gay and lesbian servicemembers. they were prohibited from serving openly, and even after "don't ask, don't tell" the military used confidentially collected data to out and dishonorably discharge lgbt servicemembers. today, with the benefit of hindsight, we recognize these events as discrimination. but at the time the picture was less clear for a lot of people, and that took a long time to change.
8:06 am
the census bureau only acknowledged the full extent of its involvement in 2007. congress only repealed the ban on openly serving gay and lesbian servicemembers in 2011, three years ago. so let me be clear: my point is not to cast aspersions on the department of defense. rather, my point is that all of us as a society are consistently slow to recognize what's a harmful use of data when it comes to vulnerable communities. it takes us decades to figure that out. far too often, today's recognized discrimination was yesterday's national security measure. what this means for our data and privacy is that we cannot solely rely on use restrictions. what this means is that collection matters, and that the simplest and most powerful way to protect privacy is to limit data collection, particularly for government. i urge you to continue to protect that core right. thank you. >> thank you very much. our next witness is mike hintze,
8:07 am
who is chief privacy counsel at microsoft, where he's been for 16 and a half years, at the epicenter of the evolution of technology and privacy. mike? >> thank you. thanks for the opportunity to speak with you today and participate in this important discussion. i come to this discussion from the perspective of advising on and managing privacy and related issues in the private sector. i've done that for nearly two decades. at microsoft we approach the issue of privacy from a core belief that privacy is an essential value, both to us and to our customers. we are strongly committed to privacy because we recognize that customer trust is critical to the adoption of online and cloud services.
8:08 am
our customers, from individual consumers to large enterprises, will not use our products and services unless they trust them -- unless they trust that their private data will remain private. we seek to build trust with our customers through a robust set of policies and standards. these guide how we do business and how we design our products and services in ways that protect customer privacy: choice about collection and use of data, strong security to ensure the data is protected, accountability to ensure that we're living up to our commitments. these standards are not just a rulebook we created and expect all employees to follow; we built them into the processes we use to operate our business. for example, they're built into the tools that are used in the
8:09 am
software development lifecycle, and there are checkpoints to prevent a product or service from shipping without a privacy sign-off. we have taken what's often referred to as a privacy by design approach to how we operate our company and how we develop and run our services. this approach is supported by a mature privacy program that includes dedicated personnel with privacy expertise who sit in both centralized roles and are embedded throughout the business. the program includes incident management response and escalation processes. further, we've developed and deployed comprehensive role-based training for engineers, sales and marketing personnel, as well as those in h.r., customer service and other roles that touch and handle personal data. our program includes executive-level accountability for privacy compliance. but the investment in privacy and the trust we work to build are undermined if our customers believe that the government can freely access their data. concern about government access
8:10 am
to data collected by the private sector fosters a lack of trust in private sector services, and when those concerns are focused on access to data by the u.s. government, that lack of trust becomes focused on u.s. companies. that's why we have been vocal about the need for surveillance reform in the united states. there have been positive steps in this regard in the last year, but there's more that needs to be done. we've laid out several things the u.s. government should do to restore the trust that's been damaged by last year's revelations. first, bulk data collection programs should end. we have been clear that we have not received any bulk orders, any orders for bulk data collection, but we strongly feel that surveillance should be focused on specific targets rather than bulk collection of data related to ordinary people's activities and communications. the recommendations of this board on the section 215 program were encouraging, as were the comments of the president, and we await changes to the existing program; we urge congress to enact prohibitions
8:11 am
on any such orders in the future. second, we should do more to increase transparency. it is a key element of any program for protecting privacy; it facilitates accountability and enables public debate around policies and programs. we have seen positive developments. in particular, the government has declassified far more information about its programs and the workings of the fisa court. we and other companies filed lawsuits last year against the u.s. government arguing we have a legal and constitutional right to disclose more detailed information about the demands we have received under u.s. national security laws. earlier this year we came to an agreement with the government enabling us to publish some aggregated data about the fisa orders and national security letters we have received. it was a good step toward fostering a better understanding of the type and volume of such orders service providers receive. we believe there can and should be more detailed reporting permitted. third, we support reforms to how
8:12 am
the fisa court operates, so that surveillance programs are properly balanced against privacy and individual rights. surveillance activities must be subject to judicial oversight. we need a continued increase in the transparency of fisa court proceedings and rulings, but effective judicial review requires a true adversarial process where more than one side is heard. we urge congress to act on fisa reform. fourth, government should provide assurances that it will not attempt to hack into data centers and cables. we believe the constitution requires that the government seek information from american companies within the rule of law and not through unauthorized
8:13 am
access, and we've taken steps to prevent such attempts by increasing the strength of our use of encryption across our networks and services. nevertheless, we and others will press for clear government assurances. fifth, although recent revelations have focused mainly on the u.s. government, and many of the subsequent debates have focused on the privacy rights of u.s. persons, we must recognize this is a global issue. as we seek to sell our products and services to customers around the world, discussions that focus exclusively on the rights of u.s. persons are not enough. many people do view privacy as a fundamental human right, and they have a very real concern about whether and how governments can access their data. we appreciate the steps president obama announced in january, which acknowledged the need to address the concerns of non-u.s. citizens. along those lines, in the law enforcement context, we've challenged in u.s. courts a federal warrant seeking customer e-mail content
8:14 am
held in a data center in ireland. and we've called for governments to come together to create a new international legal framework that allows for streamlined processes for cross-border data access that can supplement existing rules. none of this should be taken to suggest that we don't value and appreciate the critical work that our law enforcement and security agencies do every day to keep us all safe. in fact, we work closely with the u.s. and other governments to fight cyber crime and other threats. this balance is rarely an easy one. as chief justice roberts recognized in the case of riley versus california, privacy comes at a cost. the court in that decision made clear that privacy is an inherent and enduring value that must be protected. while there is not always a perfect
8:15 am
analogy between protecting privacy in the private sector and in the law enforcement and national security contexts, we in the private sector do deal with questions of striking the right balance between privacy and other needs. in each of these contexts, as technology evolves, we need to reevaluate that balance. >> mike, could you wrap up? >> i will end there. >> thanks. we'll come back to some of those issues in the questions. our final witness, the last member of this panel, is hadi nahari, chief security architect at nvidia, a company that designs and builds high-performance computer systems. hadi is a cryptographer and computer scientist. welcome and please proceed. >> thanks for the opportunity to testify today. i appreciate it.
8:16 am
i'm here as a technologist, not as a lawyer. in silicon valley we say the "i'm not a lawyer" rule applies. our concern is about building systems that are buildable and creating rules that are enforceable. so i wish to provide some technology background to the panel and to the conversation. from our perspective, security is to systems what harmony is to music. providing security as a foundation for establishing rules of privacy is our model. we build systems that are enabled by and are able to enforce rules, and that is the context of security as we see it. security is at the intersection of technology and civil liberties, and we deal with issues such as
8:17 am
trust and active adversaries in a system. this is how we build and design systems. our world used to be simpler, and sometimes i bring samples of that simple world. you all remember this as a mobile phone. this is from the time when phones were actually doing just that: they were phones. some of these devices were statements of class. you remember this, right? this was a phone, this was a mobile phone. i worked at this company. one of my favorites in the collection: this used to send and receive text messages. some of these -- oh, this was your personal digital assistant. i have some others -- oh, yeah.
8:18 am
this was one of the darlings of the valley. this is also a very important device that everyone carried. this is from the time the world was very simple. we built systems that did three basic things. as thomas friedman put it: facebook didn't exist, twitter was still a sound, the cloud was still in the sky, 4g was a parking place, linkedin was a prison, applications were what you sent to college, and skype was a typo. then on june 29, 2007, the iphone was
8:19 am
introduced. the world changed. the world for us technologists changed, and probably for everybody else in the room, non-technologists and technologists alike, it also changed. we are dealing with devices that are not as simple as what we used to carry, and it was only seven and a half years ago. i don't believe there's any other event in history that in so short an amount of time has gone through everything and changed everything, down to the foundations of our society. in the old, pre-2007 world we said things like: you cannot enumerate all the attacks.
8:20 am
cryptography is hard, and because of state-space explosion you cannot define the secure state of a system. it was difficult back then to secure these devices, and it has just become worse. we didn't know anything about the future, but a couple of things i could guarantee, a couple of things i could guarantee right here: things would only get faster. we're going to build things that are faster. they're going to become smaller, a lot smaller. they're going to become cheaper, and these devices are going to become more abundant. we no longer care about building some of these devices to be usable for a long period of time; it's a lot more economical to build devices that are basically throwaway devices. that's the concept that we're following. they are becoming more
8:21 am
connected. everything is becoming more connected. you have things such as iot, the internet of things, or as i call them, things internet. everything is becoming very talkative. all of these devices are very chatty; they talk a lot. you all have phones, smartphones, in your pockets. from the time that i started, which is about five minutes ago, until now, each one of those devices has sent and received about half a megabyte of data without you even touching them. this abundance of information that is generated without you interacting has a lot of ramifications. we heard a lot about how data only keeps accumulating; it's not going away. we are generating more data than we can manage. a hundred hours of video, a
8:22 am
hundred hours of video is uploaded to youtube, and youtube is not the only recipient of this kind of material. hundreds of hours of video are uploaded to youtube every single minute. every single minute. we are building systems to manage, compartmentalize and define these data, and to create and work with these systems. these data, as we've heard, are not going away. they are not disappearing. in the new world, maintaining security is even harder. as a citizen, i'm very carefully following what is happening with this esteemed board, as to what the ramifications of the decisions we are making are and whether they are enforceable,
8:23 am
and whether we can build such systems. being the person who insists on doable and enforceable security is as unpopular as being an atheist in jerusalem; no one likes you. so i am hoping that we can come up with a system that is also buildable. and lastly, as i close my remarks, and i'm looking forward to the questions, the one thing that i can guarantee is that attacks are going to increase. they're going to become simpler and easier to mount. and by one measure the number of attacks in 2013, counting only those affecting private information, was 3 trillion, at an average cost of $27.30 per attack,
8:24 am
about $100 billion, the total cost of these attacks. this data is from 2013; none of the target or home depot attacks, none of that information, is included here. so with that i close my remarks and i look forward to answering questions. thank you. >> we will now go through a round of questioning, and board members as well will be subject to time limits here. i think i have 20 minutes, and then each board member will have five minutes, and then there is still the possibility of questions from members of the audience. i wanted to build my first question off of the point that i think hadi was making at the end there, which is that there seems to be this inexorable
8:25 am
trend towards more sophisticated devices collecting, generating, sharing, emitting -- autonomously, automatically -- disclosing more and more information. and i think i'll go to professor anton first and then maybe come back to hadi with this, but looking at that phenomenon and the seeming inevitability of it, first on the technology design side and then on the policy side: on the technology design side, what do you see as any potential at all for limiting the growth, controlling the flow of that information?
8:26 am
you talked to some extent about the possibility of technology protecting privacy. how does that square with this tremendous ongoing growth of information? >> thank you. so as was mentioned in the earlier panel, systems are getting more and more complex, which makes compliance more and more difficult as well. i really hope that we don't limit growth and limit the ingenuity of new technologies that might have really great applications in the future and solve wonderful, really important problems. by the same token there's a lot of work that's been done, especially work that's
8:27 am
being done at georgia tech, in fact, on how we design the internet of things, or internet devices, such that we're taking privacy and security into consideration given all of the possible outputs and inputs. and engineers simply need better tools for how to do that. privacy by design is thinking about these things early on rather than after the fact. in terms of controlling information, i think what we want is to secure the flow of information but not limit the flow of information. these are all things that researchers are actively working on in universities and research labs and in industry as well.
8:28 am
same time, at some level i just don't see it happening. >> so -- >> let me put it this way. i do see it happening in places, and i take mike hintze's point, privacy by design is a corporate concept, but there are these other hugely dominant trends that almost seem to be overwhelming. >> so within the context of counterterrorism, i think there are a lot of policies and laws that are in place. when i mentioned earlier that i'd like to see more technologists in the room, it's not just to study things after the fact but to be involved in forming the policy. because a lot of times the policy and law are written in such a way that we can't implement them well. and so what i'd like to see is more technologists involved in the discussion up front, really
8:29 am
informing the decisions about laws that are going to be passed, about policies we're going to adopt. because we could write them in a way that makes it a lot easier to comply with the law. >> do you have an example in mind? >> excuse me? >> do you have an example in mind? >> i work a lot in health care, for instance. we have hipaa. i had one ph.d. student who was working actively on how we predict what the changes will be, because once a decision is made we have very little time to implement the change in systems to be able to make sure we're compliant. had we had more technologists involved in the process, we would be able to quickly adapt our systems and we would have a better community of practice about how to establish those laws and how to implement systems to make sure only the right people
8:30 am
have access to the right information at the right time. >> and certainly you would agree we need both better, clearer laws as well as more mindful technology design? >> absolutely. >> it's not that one or the other would solve this problem? >> absolutely, we need both. >> i want to go to alvaro. there was one point in your written testimony which you didn't mention but i want to talk about now because i think it's very, very important. a lot of our constitutional law of privacy is based upon the concept of the reasonable expectation of privacy. there's a lot of worry and legitimate concern that with these changes in technology, our expectations of privacy diminish. you talked about the fact that, in fact, with changes in
8:31 am
technology our expectation of privacy may actually be growing. >> exactly right. the point here is that it cuts both ways. usually the courts, since katz, talk about whether society is becoming inured to surveillance, the idea that we are surrendering privacy. i think technology is helping people learn about what they think privacy is. the best examples of this, i think, are location technology and facial recognition technology. previously people had no occasion to develop an opinion on whether or not they expected the sum total of their movements to be compiled in a profile. but all of a sudden it is becoming radically cheaper to conduct that surveillance. i think, in the same way that you only realize what you had when you start losing it, for the first time a reasonable expectation of privacy in public is crystallizing in people's
8:32 am
minds. when i drive down the street or go to work, i expect my colleagues at work to see me, my neighbors to see me. but i really don't expect anyone to know that i'm at all those places at all times, no matter where i go. and so i do think technology can expand our expectation of privacy. >> mike hintze, over the past 15 or 16 years that you've been at microsoft, do you think it's fair to say that your customers have become less interested and less concerned about privacy, or do they expect more of microsoft and other companies when it comes to privacy? >> i think they expect more. i agree that expectations of privacy in some ways have increased. they have certainly changed as technology evolves.
8:33 am
people learn about it and adapt. there's data sharing going on that people wouldn't have contemplated or accepted a number of years ago. but that doesn't mean people don't care about privacy anymore. it's very clear to us that our customers care about privacy now more than ever. you see that in the amount of resources and attention and focus that we've put on privacy. it really is one of the top legal issues we are dealing with. it's one of the top customer issues we are dealing with. we hear every day from customers who have questions about how their data is being treated, how it's being protected, how it's used. people's expectations of privacy are not fading away. >> just to put sort of a nail in the coffin here, i think the government argues, and there's supreme court precedent to support it, that a person surrenders his privacy rights
8:34 am
when he discloses information to a third party, such as microsoft, in the course of using microsoft products or services. but it seems to me from what you're saying that microsoft does not believe that its customers have surrendered their privacy rights when they have used a microsoft product or service and microsoft has thereby acquired information -- microsoft does not believe that that information carries zero privacy interest, does it? >> absolutely not. on the contrary. to the extent the third-party doctrine ever made any sense, it doesn't make sense today. people are putting all the information they used to keep in their homes under lock and key online, in cloud services. as recent court decisions have recognized, particularly in riley, it's even more data. there's more data in the cloud, more data being created, that
8:35 am
reveal the most private and intimate details of people's lives, held in cloud services, in the hands of third parties, more so than was ever in people's homes. the expectations of privacy around that data are quite profound. >> that's true, in your view, both of content, so to speak, and of noncontent or metadata or transactional data? there is sensitivity in both categories? >> absolutely. i don't like the term metadata because it encompasses too much. we should talk about what we're actually talking about. there's a broad range of data that's collected or created or inferred through the use of online services, and some of it is fairly benign. we put the metadata label on things like the amount of storage you are using in your online storage account or the
8:36 am
average file size. there is little sensitivity around that data. but as you go up the scale, with content being at the end as the most private, the stuff in which people have the highest expectation of privacy, other things about you, such as who you are communicating with, are right up there, right up against content in terms of what they can reveal about people's relationships, associations, thoughts, beliefs, et cetera. there are very important privacy implications there as well, i think. >> you mentioned the transborder issues and the fact that people around the world recognize privacy as an interest, in many
8:37 am
cases as a human right. where do things stand, and what are you aware of, or what do you know about -- is there any progress being made multilaterally or bilaterally in terms of developing standards for transborder surveillance and transborder government access? anything in the works there that we should be made aware of? >> not that i'm aware of specifically. there are certainly more discussions happening in recent years than there have been in the past among a number of constituents and interested parties on privacy around the globe. the chairman and i were at a conference where the issues were loudly and vigorously discussed and debated, and so that
8:38 am
dialogue is happening. but in terms of actual progress towards developing an international framework for this stuff, there's certainly a lot more work to be done. >> i would ask you and others, as well as members of the audience, if and when you become aware of things that are making progress, please let us know. obviously, we remain interested in that, in the transborder question. hadi nahari, you know, we talked about privacy by design. in your experience, do technologists give adequate consideration to privacy as they design products? and what more could be done to encourage or promote privacy by
8:39 am
design? >> in technology we build things that are reasonably well defined. i recognize that in the previous panel there was a discussion that you don't necessarily need to define privacy to be able to enforce it. on the technology side, if we are able to build a model that represents a need, then we are very good at building to it. part of the issue is mapping a very human, very societal concept such as privacy into the devices that we build, the services that we build and use; sometimes it's simple, sometimes it's not. to answer your question, i see a great deal of attention, a great deal of interest in the notion of privacy: privacy by design, secure by design, trustworthy by
8:40 am
design, and especially in the field that we're in, where our model is the security of the device when we release it and it goes into the field, into a mutually distrusting system. so you don't really know. it's one thing to build a server that resides in someone's data center, where you have full control over the actual device and you control the flow of information, the software that is there and how it's used. it's another thing to build a device and leave it in the hands of the users, guessing what they want to do. and then it's one thing to have a notion of privacy, as we do, and build a system based on that. it's another thing when you take a look at, should i call this a generation gap -- there's this company called snapchat, and the promise is that whatever picture you take is going to disappear. anyone who works in technology knows things like this are not
8:41 am
possible. you could still take a picture of that device. but we call it job security. then when they realized that this is not really possible, they announced it, and they are under the oversight of the government for about 20 years to make sure they do things right. i know they're paying a lot of attention to make sure they get things right. but then you take a look at the users. a statistic was released last week or the week before: they asked college students, and more than 50% of college students said yeah, we will still use snapchat. they are aware. they understand. i don't know how to reconcile that. there is a new generation that has, i don't know whether it's more or less, but certainly a different expectation and definition of privacy. there is a vagueness about what
8:42 am
that means in terms of a system that could be built. once the rules are in a reasonable state, we are really good at building systems that satisfy them -- hence my opening remarks as to our model in the industry and the technology. we understand the rules, and we're very good at taking those rules and building systems, devices, services and everything that enforces them, but the rules have to be buildable and they have to be enforceable. >> the rules have to be clear, and if they're not, you don't know what to build to? >> semi-clear will do. we used to live in a world, before 2007, where everything had to be really, really well defined. that no longer exists, but we have a new generation of hackers that do not abide by the rules. therefore, we have to create systems that are almost adaptive. we are seeing it in programming
8:43 am
languages, in the design of systems, and in self-correcting systems. sometimes, somehow, somewhat accurate will do. >> annie, did you want to respond to that in the minute remaining? >> this reminds me of what i was talking about, practical encryption and anonymization. i think there are times, in certain applications, where accepting some risk is fine. there are other instances where it's not. that's where guidance from the pclob can be very helpful, in defining what the risk profiles are, when we can have pretty good rules and when we have to have very tight, accurate, 100% certainty kinds of rules. >> okay, thank you. at this point other members of the board will pose questions under a five minute rule, and we will go in sort of reverse order down the line starting with
8:44 am
rachel brand. >> thank you, jim, and thanks to all of you for being here. that's a good segue, because the first question i wanted to ask was: doctor anton, i was interested in what you were saying, that in the domestic violence context you want de-identification to be perfect, perhaps. can you explain what you mean by that, with an example of a de-identification method that might be good enough but perhaps not perfect? i am not a technologist, as you know, so if you could help me out that would be great. >> there are certain case studies that have been done. for instance, when netflix put their data online, researchers went and looked at the internet movie database to see whether they could re-identify people. they had resources, readily available information, and in this context i don't think
8:45 am
anyone was personally hurt by it, but there might be cases where that kind of re-identification could be extremely damaging. we talked earlier about aggregation of databases and how the ability to link different kinds of information across different kinds of databases could actually be detrimental. it can also help us find the bad guys. that's the tension, right? so when is it okay and when is it not okay? there are instances, for instance with netflix or something that's available online, where what you are protecting is not very important, and it may not be really necessary to worry much about it. but in the context of a group that is actively trying to mount a terrorist attack, it's really important.
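[editor's note: a minimal sketch, not from the hearing, of the linkage re-identification described above: an "anonymized" ratings table is joined to a public profile on (movie, date) quasi-identifiers. all data is hypothetical.]

```python
# pseudonym -> set of (movie, date) ratings from the "anonymized" release
anon_ratings = {
    "user_17": {("movie a", "2005-03-01"), ("movie b", "2005-03-04")},
    "user_42": {("movie c", "2005-06-11"), ("movie d", "2005-07-02")},
}
# ratings posted under a real name, e.g. on a public movie site
public_profile = {
    "jane doe": {("movie a", "2005-03-01"), ("movie b", "2005-03-04")},
}

for name, public in public_profile.items():
    for pseudonym, rated in anon_ratings.items():
        overlap = len(public & rated)
        if overlap >= 2:  # a few matching ratings already narrow it down
            print(f"{pseudonym} is probably {name} ({overlap} matches)")
```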
8:46 am
>> i guess it makes sense in terms of when it's important and when it's not important. but how do you do it? how do you do it perfectly in the domestic violence context? >> i think it's very difficult. i think the technology we have is pretty good but not perfect. the idea is, do you keep the data unencrypted and easily accessible because it's not very important, or do you actually encrypt it and then use reasonable, practicable anonymization on top of that? it just depends. i think this is one of those cases where technologists would welcome guidance in helping us figure out what the risk profiles are. technologists sometimes don't have access to what the risks are within a counterterrorism context. >> for mr. bedoya, you said
8:47 am
something along the lines of: in a national security context, some of the fipps must apply even if they all cannot. can you elaborate on that? >> sure. the first is a historical point, which is that when the report was issued -- i was just reading a couple of pages -- they said: we have just set up these standards; clearly not all of them can apply to all intelligence records, but some of them must apply, because the risk is too high without some protection. to put it more concretely, the more difficult ones are individual participation and transparency. i think there are ways to address these, at least on an aggregate level, that would be really powerful. i think in the 702 context, the board has -- and to take a step back, i think it is shocking that a year and a half after the snowden
8:48 am
disclosures the american public doesn't have even a rough sense of how many of them have had their information collected. take the telephone records program: people think it's everyone, but there are news reports saying 30% of calls are actually collected. so in the 702 context the board has recommended measures to identify the scope. in all my time in the senate i never saw anything that would lead me to believe that it would be impossible to produce an estimate, based on statistical sampling, of the number of u.s. persons collected in 702 data. in the transparency context there are other things you could do to quantify scope; one of them could be releasing the number of queries done on the data. i think there are ways to address these principles. >> does anybody else have a thought on that? >> i do, in terms of transparency. this is another way in which, for instance, a technologist at the fisc could be helpful.
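[editor's note: a minimal sketch, not from the hearing, of the kind of statistical sampling estimate described above: review a random sample of records, count the u.s. persons, and extrapolate with a simple 95% confidence interval. the corpus here is simulated.]

```python
import math, random

random.seed(1)
# simulated corpus standing in for a collection of intercepted records
corpus = [{"id": i, "us_person": random.random() < 0.3}
          for i in range(100_000)]

sample = random.sample(corpus, 5_000)    # the records a reviewer examines
p = sum(r["us_person"] for r in sample) / len(sample)
margin = 1.96 * math.sqrt(p * (1 - p) / len(sample))

low, high = (p - margin) * len(corpus), (p + margin) * len(corpus)
print(f"estimated u.s.-person records: {p * len(corpus):,.0f} "
      f"(95% ci {low:,.0f} to {high:,.0f})")
```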
8:49 am
it's like the game of telephone: heidi whispers in my ear, "i spoke with jim about the panel," and by the time it gets to jim it's going to be "i spoke to jim about wearing flannel." when you get lawyers from the nsa and the fisc talking together about technology, and you don't have a technologist there to ask questions or make suggestions -- have you thought about including this metric, collecting this data, or instrumenting the software in certain ways? -- you miss a chance to improve the ability to have more transparency and more oversight. >> i'm going to try to get a question in for each panelist, and i would appreciate brief responses. annie, you said that encryption is good for counterterrorism. i guess i would like to understand more. i understand mandating a backdoor weakens protections, but
8:50 am
it would seem that if terrorists can hide their communications, encryption would be detrimental to counterterrorism. >> it's a better world when everyone can hide their information. there was a case in greece where there was a phone network, and someone was able to spy because of the backdoors, and they were able to listen to the conversations -- basically to wiretap the prime minister. that's what happens when you don't do encryption and security by default. to think that the terrorists aren't going to do the same thing, i think, is naïve. >> alvaro, you talked about expectations of privacy. if i heard you correctly -- and tell me if i'm wrong -- you're suggesting we should not -- i can put up a sign saying i'm conducting surveillance and i can destroy the expectation, but what the expectation of privacy should be is something i cannot change.
8:51 am
>> that's a separate, wonderful, powerful argument. what i'm saying is that technology is making us realize we do expect privacy in scenarios that didn't exist 10 or 15 years ago. i think technology can expand your notion of privacy, but i also think the fourth amendment doesn't just protect me and you. it protects us as a society, and at its base the relationship between the government and its citizens also needs to be protected. >> on the fourth amendment -- mike, you talked about the balance between government requests and your customers' privacy. do you think the government should have a warrant every time it accesses your customers' records? >> certainly in the law enforcement context we've advocated for reform that would in effect require a warrant for access to any content regardless of its age, to precise location information, and to other sensitive data.
8:52 am
i'm not sure we would go so far as to say that a warrant is required in every single case for every single data type. we certainly need to update the rules so that there is appropriate judicial review of surveillance programs and the specific requests that we get for data. >> in terms of the third-party doctrine, would you then essentially not have an absolute exception to the fourth amendment -- but then where would you go with it to provide some protection? >> the laws that we deal with in the law enforcement context provide a sliding scale, in effect. they provide some reasonable oversight and protection, something below a warrant and probable cause. we've taken the position that that's appropriate for some types of subscriber data, et cetera. >> thanks. and hadi, you talked about -- i
8:53 am
want to put this in the context of how much information should be collected. you talked about enforceable rules for collection, but you also said that collection is going to be faster and cheaper, that we're all going to be more connected, that attacks will increase, and that compliance with rules may be more difficult. professor felten talked about potential mission abuse and also the increased possibilities of breach. how would you strike the balance between collection rules and, essentially, use rules? >> that's a very good question, a very difficult one. i don't know -- on the technology side of the house, i don't know if we really know where the balance is. when you take a look at the attacks, at the systems that collect, at the capabilities, at the mere fact that all of these attacks, all of these exploits, are becoming so advanced -- let me give you one concrete example. i used to need to be physically
8:54 am
near things that you touched to be able to lift your fingerprint, and then use that fingerprint to mount an attack on your phone and abuse your biometric. with the resolution of the cameras that we have these days -- very high resolution cameras -- i just need to have your picture that was taken somewhere in china to be able to zoom in, zoom in, zoom in, and then lift your fingerprint and mount an attack. how do you reflect things like this? should we build systems that, whenever there's a fingerprint in a photo, smudge it so we don't expose it? there are things like this that should be buildable, but what i'm trying to get across is that coming up with the rules that define those capabilities -- the things that should be done and shouldn't be done -- is a very complex problem. >> thank you.
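a toy sketch of the smudging countermeasure hadi floats, assuming the Pillow imaging library and a caller-supplied bounding box; a real system would need a detector to find fingertip regions on its own, which is not shown.

```python
# toy sketch of the "smudge the fingerprint before sharing" idea using the
# Pillow imaging library. the bounding box is supplied by the caller here;
# a real system would need a detector to find fingertip regions itself.
from PIL import Image, ImageFilter

def smudge_region(path_in, path_out, box, radius=12):
    """blur one rectangular region (left, top, right, bottom) of an image."""
    img = Image.open(path_in)
    region = img.crop(box)
    img.paste(region.filter(ImageFilter.GaussianBlur(radius)), box)
    img.save(path_out)

# hypothetical usage: coordinates of a fingertip found by some detector
# smudge_region("photo.jpg", "photo_safe.jpg", box=(400, 300, 480, 380))
```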
8:55 am
>> thank you, guys, for another excellent panel. my first question -- and this goes back to what i said on the previous panel, which is that i see our job as translating these ideas, these concepts, these concerns into practical recommendations. starting with you, mr. hintze: what have you found effective, as a privacy officer, to ensure your very large, complicated workforce, even with emerging issues, takes privacy seriously, that your rules are enforced, and that from beginning to end privacy is a part of your culture? think of this as free advice to the new privacy officer over at nsa. >> well, thank you. [laughter] as i alluded to in my opening remarks, one, there's no silver bullet. you need to take a number of approaches. we've taken a number of
8:56 am
approaches to drive awareness of and sensitivity to privacy throughout our workforce, through a number of steps: mandatory training that's required for all employees and covers a range of ethical and compliance issues; deeper role-based training that's specific to software engineers, specific to sales and marketing people, specific to the different roles people fill in the company that impact customer privacy. and we have, as i mentioned, not just sort of told people what the rules are and then crossed our fingers and hoped they abide by them. we have put in checkpoints in the way we've developed our internal systems, the way we develop software and get it out the door: it has to go through certain checkpoints and reviews to ensure that privacy issues are not missed or overlooked. so there's a number of things we've done along those lines to make sure that people are aware
8:57 am
and have the tools available to them to do privacy right. but then there are also different checks along the way to ensure that mistakes don't get made. nothing is perfect, of course, but we've tried to take a multifaceted approach, or a multilayered approach, to make sure we catch those things. >> let me follow up on this with a specific example, but a hypothetical one. have you found training to be more effective, or effective enough, in the absence of pairing it with mechanisms -- that's a horrible question, i'm just going to start over again and say: so 702, the program, has certain legal requirements. in the private sector, would you train to those legal requirements but also have, for example, when an analyst is sitting there attempting to target or select or whatever they're going to do, at each stage on the
8:58 am
screen, for the process or however they're doing it, the rules reflected in the computer system that they're attempting to use? >> we do both. to the extent you can use technology to enforce policy, that's always super effective, because it reduces the potential for human error. but that's not always possible, and you can't completely prevent mistakes, oversights, or intentional bad acts. so you need to do more than that. you have to build the awareness so that the inadvertent stuff is reduced. you have to build in the technology tools to prevent that from happening. then you need some level of checks to make sure that everything went right, and, if somebody was intentionally trying to circumvent a policy for whatever reason, some way to
8:59 am
catch that before it creates a negative impact. >> so i think i have time for one other quick question. in the section 215 program, one of the features was that, in fact, not all of the call detail records went to the government. names were not provided to the government originally -- no subscriber information, just numbers to numbers. would that be an example of de-identification and anonymization? that was my only question. >> sure.
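a schematic toy of what "numbers to numbers" records still allow -- invented data, not the actual 215 system: the links alone support multi-hop contact chaining, while attaching a name is a separate, access-controlled lookup.

```python
# schematic sketch (not the actual 215 system): call detail records carry
# phone numbers but no subscriber names. number-to-number chaining still
# works; attaching a name requires a separate, access-controlled lookup.
from collections import defaultdict

cdrs = [  # invented records: (calling number, called number)
    ("202-555-0101", "202-555-0144"),
    ("202-555-0144", "202-555-0199"),
]

# separate table; reading it stands in here for the extra legal step
names = {"202-555-0199": "subscriber c (hypothetical)"}

graph = defaultdict(set)
for a, b in cdrs:
    graph[a].add(b)
    graph[b].add(a)

def contacts_within(seed, hops):
    """numbers reachable from a seed number in at most `hops` calls."""
    frontier, seen = {seed}, {seed}
    for _ in range(hops):
        frontier = {n for f in frontier for n in graph[f]} - seen
        seen |= frontier
    return seen - {seed}

print(contacts_within("202-555-0101", 2))  # chaining works on numbers alone
print(names.get("202-555-0199"))           # the name lives in a separate store
```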
9:00 am
>> is this something that they consult with the technologists about -- hey, we need this kind of information for national security, but how do we get it, or as much of it as we can, and what's the balance? does any of that kind of thing go on inside the government with technologists? >> right. so, having worked a lot with the government, i know that they consult technologists greatly on security, on privacy, on compliance issues, and on how to engineer software that takes all of that into consideration. i think if we look at the past five years or so, six years or so, you'll see that the nsa was really, really focused on
9:01 am
compliance. and i think the results of the reports and the oversight have shown that they've done a really good job with that. when there's been an issue, they've dealt with it. i think someone mentioned the new cpo at nsa. i think what we'll see right now is that not only is complying with law going to be factored into the tools and techniques and procedures, but also: should we really be doing it just because it complies with law, and what's the extra step we're going to take to consider privacy at the outset? >> so you sound reasonably satisfied with the fact that they're taking it seriously and doing the best they can? >> i absolutely do. i actually feel very comforted by the fact that the government has a ton of oversight and a ton of laws to comply with, and i personally am much more worried about the large amount of collection that's taking place in industry that people don't really understand. >> all right. so i can get on to my next --
9:02 am
mr. bedoya, you talked about how important it was to limit collection to what was necessary or purposeful, et cetera. but in light of the fact that so many of the experts on both panels have talked about collection, collection, collection, where would you look for the mechanism to try and limit the collection, or to get that kind of impediment or balance in focus? >> certainly. so i think folks have been saying that it's inevitable that industry is going to collect all this data. i don't think folks have been saying it's inevitable that government will collect it. and i, for one, don't actually think it's inevitable that industry will collect it, but taking that as a given, i think the question is about reconstructing the firewall between government and industry
9:03 am
with respect to data collection. so i'd be surprised if anyone on this panel or on the previous panel thinks it's inevitable that the government will collect all this data. on your previous question, i believe that the congressional committees that conduct oversight on fisa and on foreign intelligence -- certainly the senate judiciary committee -- lack a technologist. >> i think we talked about that in our first report, on fisa reform. okay. and mr. hintze, you talked earlier -- you said one of the principles was there shouldn't be any bulk data collection. now, terminology is varied all over the place, so it would help me if you said what you meant by bulk collection there. and let me say, at one gathering with public health people, they talked about the great importance of public health data, you know, especially when epidemics come along or that
9:04 am
sort of stuff. so wouldn't some of that come under your ban against all bulk data collection? >> yeah, i was talking specifically about government surveillance programs. >> okay. i just wanted to clarify that. and what do you mean by -- give us an example of what you'd call bulk data? this has been a debate, whether this program or that program falls under bulk data. >> certainly. i had in mind the 215 program in particular, where government -- >> it's not targeted. >> yes, it's not targeted. correct. >> okay. i think that's all i have right now. >> we may be able to go back to board members for additional questions. i would like to continue with this panel up until the top of the hour. we have one question from the audience, which i will read, and we welcome others if others want
9:05 am
to pose questions. in 2005 the national academy of sciences studied whether pattern-based data mining can anticipate who is likely to be a future terrorist. it concluded that this wasn't feasible. the question is: is pattern-based data mining feasible in the terrorism context today, and will it be feasible ten years from now? would anybody like to address that? hadi? >> i don't know specifically about terrorism, mindful of what ed mentioned as to our having limited data. but there is a program that has been running in los angeles at lapd. we may still not necessarily be able to identify specific criminals, but our predictive modeling systems have been at
9:06 am
work. they're able to make a reasonably good prediction about where criminal activity is likely. it is not precisely the question that you're asking, but i can assure you that it is just becoming better. i can assure you that any service provider that has the amount of data that, you know, we are generating -- and more and more is being generated -- is honing and fine-tuning and polishing their models. whether it's going to be applicable to anti-terrorism methods, i don't know. i think that all of these models are heavily data-driven, so one would need a lot of data. but as to whether these predictive models are able to predict things that may relate indirectly to terrorism or criminal activities, the systems are suggesting that we are going
9:07 am
that way. >> other thoughts on that question? there's a system in chicago that the chicago police department has deployed which has been both touted and criticized, but it does something like neighborhood- or block-level predictions as to criminal activity, as well as, i understand, individual-level identification of people who may be either victims of crimes or perpetrators of crimes. again, both touted and highly criticized. any thoughts or comments? >> one quick one, which is the risk of creating a feedback loop, you know? if you predict there will be crime on corner x, you watch it like a hawk, you see every crime and, therefore, draw an overrepresented sample of crimes at corner x, reinforcing your prior conviction that you
9:08 am
thought corner x was dangerous. so that's one risk, from my perspective. >> so this is certainly not entirely my area of expertise; however, predicting is different from being able to reconstruct after the fact. can we use these things -- when something has happened -- to go back and find whether we missed certain people that are involved? yes, i do believe that's the case. in terms of prediction, i think we have a ways to go. by the same token, every morning i get a crime report for all the crime in my area, and i can predict whether there's going to be crime in my neighborhood. so we're getting there. >> i mean, at some level that's just compstat all over again, the system that has been available to police for decades.
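a small simulation of the feedback loop just described, with invented numbers: two corners have the same true crime rate, but a crime is only recorded where police are watching, and a naive predictor keeps shifting patrols toward whichever corner has more recorded crime.

```python
# small simulation of the predictive-policing feedback loop described above.
# both corners have the SAME true crime rate; only observation differs.
# all numbers are invented for illustration.
import random

random.seed(1)
TRUE_RATE = 0.3                      # identical underlying crime everywhere
patrol_share = {"corner_x": 0.5, "corner_y": 0.5}
observed = {"corner_x": 0, "corner_y": 0}

for day in range(365):
    for corner in observed:
        crime_happened = random.random() < TRUE_RATE
        # a crime is only *recorded* if police happen to be watching
        if crime_happened and random.random() < patrol_share[corner]:
            observed[corner] += 1
    # naive "predictive" reallocation: patrol in proportion to recorded crime
    # (+1 smoothing so neither corner is ever entirely unwatched)
    total = observed["corner_x"] + observed["corner_y"] + 2
    patrol_share["corner_x"] = (observed["corner_x"] + 1) / total
    patrol_share["corner_y"] = (observed["corner_y"] + 1) / total

print("recorded crimes:", observed)
print("final patrol shares:", {k: round(v, 2) for k, v in patrol_share.items()})
# the truth is identical at both corners; the recorded totals can still diverge
```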
9:09 am
one more question, and then i'll go down the row with board members if they have additional questions. i had said, in talking to each of the panelists, that i didn't want this to be a panel about going dark and the implications of encryption, but several of you have alluded to encryption and its significance here. and i would ask any of you to comment on the following: there is a growing trend towards more and more devices, cheaper and cheaper wearables and the internet of things -- more and more data collection occurring. there's also, it seems, a trend towards more encryption by default, whether it's at the device level or, as mike hintze
9:10 am
was referring to, the encryption of data flowing into data centers. so it seems to me we have two things going on at once, which is not unusual. somebody referred to the modern era -- the era of the internet of things and big data and ubiquitous data flows -- as the golden age of surveillance. and it seems to me that both trends will always be there: more and more information available both to the private sector and possibly to the government, and increasing pervasiveness, or at least increasing diffusion if not comprehensive diffusion, of encryption.
9:11 am
comments on that as a premise, first of all -- the premise of my question. am i right? and then secondly, where does that leave the government? and would you agree with my assumption that there will still be huge amounts of information available both to the private sector for its purposes as well as to the government? i guess let's go right down the row. professor anton? >> so i believe that there will still be a lot of data that's available to the government. when i say that i really support encryption by default, i also really think that our country -- we were the code breakers, and that was really critical in world war ii. and i think that instead of just kind of taking a lazy approach and saying, oh, leave us a back door, we should just get better at cracking the codes, because they're getting smarter, and we
9:12 am
need to get smarter too. so i leave it to the lawyers to decide the legality of when you can actually apply that or break into a system. but being satisfied with just having a back door means that we're not advancing our state of the art and our tradecraft in this country, and we're going to be left behind as a result. >> i'll actually pass. >> yeah. mike, two trends seem to be occurring simultaneously. >> i mean, we're certainly seeing an expanded use of encryption: encryption between customers and the service provider, encryption between data centers, encryption on devices, et cetera. and that's being driven by customer demand. i mean, customers are concerned about the security of their data, and they're not just concerned about the security of their data vis-a-vis hackers and bad guys; they're increasingly
9:13 am
concerned about the security of their data vis-a-vis government. and to the extent that there is that concern out there, that's driving customer demand for these security features, and companies will continue to invest in that. does that mean that there will be no data available? i don't think so. i mean, the nature of many cloud services requires service provider access to the data. you can't run an effective e-mail system without being able to filter the content for spam and malware. and so there will be a point in the communications chain where data is available, and that means if it's available to a service provider, it's available to a government through lawful demand. >> hadi, any thoughts on this, and then i'll yield. >> first off, i want to agree with dr. anton's point, which is
9:14 am
to get better. we cannot ask industry, oh, don't encrypt, don't do anything. i would love to follow that if the chinese and the russians would also follow it, but that's just not going to work. i'm very respectful of the problems that law enforcement agencies have with the current state of affairs. we just have to get better, and in the end it's going to work better for us as a nation. so that's number one; i fully agree. now, going dark -- i don't know if it's going dark. i know that we were in a state where we were able to think a certain way about system design, about system security, about maintaining privacy. that world has changed. the world and the industry have changed rapidly, and the rest of us are catching up. so i think it pays dividends if we take our time and figure out what the rules of this new world are, where we don't
9:15 am
necessarily need to rely only on encryption. i'm a big fan of encryption; i think it's one of the tools that security professionals and everyone else have. but there are others. the fact that some data is encrypted is not on its own necessarily the end of the world. i know michael mentioned that, you know, we are overusing this notion of metadata. but if you think about metadata as something about the data, it is meaningful when you see some encrypted data being accessed a little bit more than the rest -- one could discern, one could learn things about it. once we start learning how to deal with the system, then we can maintain encryption, then we can maintain stronger encryption, and we can also deal with the cases where we don't have access to the cleartext. i think our law enforcement, our government, our legal system -- i think we as a society are in the process of learning how to deal with this new world, where things that we knew in the past no longer apply.
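a toy illustration of that metadata point: without decrypting anything, unusually frequent access to one ciphertext stands out. the log entries and the flagging threshold are invented.

```python
# toy sketch: learning from access patterns without decrypting anything.
# each entry is (encrypted_blob_id, hour_of_access); all values invented.
from collections import Counter

access_log = [
    ("blob_a", 9), ("blob_a", 9), ("blob_a", 10), ("blob_a", 23),
    ("blob_a", 23), ("blob_a", 23), ("blob_b", 11), ("blob_c", 14),
]

counts = Counter(blob for blob, _ in access_log)
mean = sum(counts.values()) / len(counts)

for blob, n in counts.most_common():
    flag = "  <-- accessed far more than average" if n > 2 * mean else ""
    print(f"{blob}: {n} accesses{flag}")
```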
9:16 am
lastly, the new generation figured it out. i think they're doing a lot better. they're figuring out that you cannot expect everything to be fully protected for you. they're figuring out ways to live in a world where they're posting a lot of things on facebook -- learning how to deal with a system where, you know, you may not have the capability of asserting your privacy the way that our generation did, while still having expectations about their rights. >> does a particular board member have a question? >> i do. >> yes. >> several of you have referred to oversight in one way or another, and i just want to ask a question about that. in my view, oversight is especially important in the intelligence context because of the necessary level of secrecy. it's important in all areas of
9:17 am
government, but especially here. but at the same time, when you start to layer on box-checking exercises and paperwork, there's a point of diminishing returns, and you sort of have oversight for its own sake that doesn't actually deter misconduct or ensure compliance with the rules. do any of you have thoughts on principles for what's effective oversight, as opposed to just another box-checking exercise? >> so i certainly have a few thoughts for the legislature. i think that there's been a lot of soul-searching around how the executive needs to change its practices with respect to internal oversight, but i think there are some pretty serious problems at the legislature. one of them is the technologist issue that i mentioned, and another is clearances. i can say with moderate to high confidence that most united states senators lack a staffer with ts/sci clearance. i hope i'm wrong; i don't think i am. and the fact is that all of the
9:18 am
key briefings for these senators are conducted at that level. and as a former staffer -- i know there are a lot of staffers in the room -- you don't send your boss into a meeting about soybeans, sorry, you don't send them into a meeting on an issue that seems very easy, without a staffer. and a lot of these folks are going in unstaffed. thankfully, folks on judiciary and intel have dedicated committee staff that they can rely on, but outside of those committees you're often flying -- i don't want to say flying blind, but you don't have the resources you need to actually conduct that serious oversight. >> i have two -- [inaudible conversations] -- follow-up questions for professor anton. >> go ahead. >> on de-identification. one is, you commented earlier that phone numbers associated with names would be identified information, but would you -- >> [inaudible]
9:19 am
>> okay. obviously, the availability of directories makes that -- >> [inaudible] >> then i guess you also had commented earlier that having a lock on your door was pretty good protection against burglars but, obviously, not perfect protection. and i guess the question is, in the context of a massive database: burglars may not have the incentive or wherewithal to break into everyone's home in a community, but with a massive database, with a brute force attack, you might be able to get a very valuable return. so does that suggest that de-identification needs to be, essentially, stronger -- or may not even be sufficient? as you pointed out in the netflix example, two professors have written articles about the ability to re-identify. is it a useful tool in some instances but not others? and where it's useful, does it have to be a pretty enhanced
9:20 am
form of de-identification? >> well, i think it's better than nothing. an attacker has to work harder at it to get access to it, right? to really be able to understand it. but that's going to help us with, you know, the high school kid who's just trying to tinker around, right? so i think this is another example where encryption -- really, really strong encryption -- is important. so i think it's a blend of both. >> thank you.
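a compact sketch of why weak de-identification can fall to brute force at database scale: the space of phone numbers is small enough to enumerate, so an unkeyed hash pseudonym can be reversed, while a keyed hash (an hmac under a secret key) resists enumeration by anyone without the key. the records and the key are invented.

```python
# sketch: why naive pseudonyms fall to brute force at scale. a plain hash of
# a phone number can be reversed by enumerating the (small) number space;
# an hmac with a secret key cannot be enumerated without the key.
# the records and key below are invented for illustration.
import hashlib
import hmac

def weak_pseudonym(number: str) -> str:
    return hashlib.sha256(number.encode()).hexdigest()

def keyed_pseudonym(number: str, key: bytes) -> str:
    return hmac.new(key, number.encode(), hashlib.sha256).hexdigest()

target = weak_pseudonym("202-555-0147")  # a "de-identified" record

# brute force: enumerate a (toy) slice of the number space
for i in range(10_000):
    candidate = f"202-555-{i:04d}"
    if weak_pseudonym(candidate) == target:
        print("re-identified:", candidate)
        break

# the same enumeration fails against the keyed version without the secret
secret = b"held-by-the-data-custodian"
print(keyed_pseudonym("202-555-0147", secret))
```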
9:21 am
[inaudible conversations] >> just on the issue of de-identification and anonymization: i had understood it as a concept that could apply in varying degrees, so that for a period of time the data has been divorced from the identifying information, and now they have to go to court in order to reassociate it. i don't think i was asking you to say it had been permanently de-identified or anonymized. to the extent that we're looking at evolving standards or evolving notions of expectations of privacy, how do you quantify it? is it because 51% of folks in a washington post poll said "i care about this" but are still using facebook? do you look at conduct? do you look at the fact that people inside the beltway really care? that people at the ivy leagues care? what is a good way to identify emerging notions of expectations of privacy? >> i'm not going to pretend to know the right answer to that question. it's a really, really hard question. i certainly think that looking at conduct is extremely valuable, and there's been a lot of discussion about the third-party doctrine; the fact is it doesn't remotely represent what the american people think about privacy. you know, if your social network only had the settings of "public" and "only me" -- if that was the only option -- people would say this is ridiculous. and i do think it sounds strange
9:22 am
to say it, but we do have something to learn from the best practices of these social networks, in that they very much see the world as a series of segments, and they respect the fact that sometimes you want to share something with segment a and not segment b. that's certainly valuable. i don't know a good test for, you know, identifying a reasonable expectation of privacy. i'll just repeat myself: i think we need to see that as a standard that can expand and contract. >> okay. >> if i could quickly add: after the snowden leaks there's an anonymous search engine called duckduckgo, and the number of people doing searches on that search engine increased, i think, by over 100%. so that's one way you can watch people's actions. >> just one very, very quick add-on to that. it's not a binary thing. you can't say that, oh, people say they care about privacy, but they continue to use facebook. you have to look deeper.
9:23 am
you have to look at how they're using facebook, whether they're using the privacy controls, how they're engaging with those services. if you look deeper, you see some pretty sophisticated choices people are making to protect their privacy that are not apparent from the fact that, oh, you're using a social network, you must not care about that. >> i have a question. between the first panel and the second, i heard -- i hope correctly -- that there is some difference of opinion on a couple of things, or maybe a slight one. you, i think, ms. anton -- you suggested in answering a prior question of mine that you thought the government was, indeed, involved in trying to build privacy into the technological aspects of some of the programs. on the other hand, earlier i think you said that in threat modeling very few privacy considerations were going into that. other people said that it wasn't
9:24 am
inevitable that the government would keep collecting more and more information, but i think i got the impression that maybe it seemed to be going that way from mr. felten on the earlier panel. so my question is, basically, very briefly: if there were one area of priority, if you were running the government's overall privacy protection, that you would suggest they concentrate on -- one that could perhaps improve privacy protection without endangering national security -- what would it be? you can do it very quickly. [laughter] >> i think that we really need to work more on privacy standards globally, and on standards that aren't rigged in some way to help some government or sector of industry. i think that's the number one challenge right now. >> okay. other people? >> yeah. i would say it's ending programs that involve the bulk collection
9:25 am
of americans' data. >> i couldn't hear the -- >> ending programs that involve the bulk collection of americans' data. >> okay. do you have in mind any except 215? [laughter] >> i didn't have ts/sci clearance, so i don't know. [laughter] >> okay. mr. chairman? >> wait a minute, somebody -- >> oh, i'm sorry. yes, please. >> one last thing, and i don't know if this is the elephant in the room, but if there is one thing i would put as an item of priority, it's our systems and technology. they're very much built as one-way, so i would introduce the notion of revocation. if something goes bad right now, if i'm releasing all of this information, there is no way for a user, for a citizen, to go and push a button somewhere and say: revoke all the rights that i gave to x, y, z service providers, and i want to go ahead and clear everything. so defining what that revocation means, the ramifications of that, and how to crystallize it as a requirement for the industry would go a long way.
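a minimal sketch of that revocation notion under a made-up registry api -- nothing here reflects an existing system: every grant is recorded, and a single revoke call both ends future access and triggers downstream cleanup.

```python
# minimal sketch of the revocation idea: a consent registry where a user can
# revoke, in one call, every right granted to a provider. the api and the
# deletion callback are invented; no existing system is implied.
from dataclasses import dataclass, field

@dataclass
class ConsentRegistry:
    grants: dict = field(default_factory=dict)  # (user, provider) -> purposes

    def grant(self, user, provider, purposes):
        self.grants[(user, provider)] = set(purposes)

    def allowed(self, user, provider, purpose):
        return purpose in self.grants.get((user, provider), set())

    def revoke_all(self, user, provider, on_revoke=None):
        """the 'push a button' operation: end access and trigger cleanup."""
        self.grants.pop((user, provider), None)
        if on_revoke:
            on_revoke(user, provider)  # e.g. schedule downstream deletion

registry = ConsentRegistry()
registry.grant("alice", "service_xyz", ["ads", "analytics"])
print(registry.allowed("alice", "service_xyz", "ads"))   # True
registry.revoke_all("alice", "service_xyz",
                    on_revoke=lambda u, p: print(f"deleting {u}'s data at {p}"))
print(registry.allowed("alice", "service_xyz", "ads"))   # False
```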
9:26 am
>> that's -- that was primarily about industry. that wouldn't affect government. i mean, if i gave the government some information under some program which i thought was going to benefit me, and later on it turned out it was being used in a different way, would your revocation principle apply there? >> if i had the right to revoke whatever the government had collected about me, and i knew the things that the government had in its possession, and i was able to revoke that, perhaps that would be helpful. >> thank you. >> so this concludes our second panel, and it concludes our morning session. we will reconvene at 1:15 with a panel of government privacy officers. [inaudible conversations] >> this weekend on c-span,
9:27 am
author and president of arabs for israel. >> i had arrived late at night; on september 11th in the morning i was at my home in los angeles, and i woke up at six a.m. l.a. time to see the second airplane hitting the twin tower live. and i was traumatized, because that was when i knew that this is terrorism. it's not one airplane accident. so i ran to the phone, and i called many people in egypt. i wanted them to comfort me, especially after i learned that mohamed atta, the leader of the 19 terrorists, was from cairo, the same city i came from. and i called around eight people, and they all said the same thing, even though
9:28 am
some of them don't know the others. they told me: how dare you say that this was done by arabs or muslims. don't you know this is a jewish conspiracy? the jews did it. and i hung up the phone and wept. i suddenly felt i could not relate to my culture of origin anymore. and this is a very hard feeling, when you can't relate to the people you love and were brought up with for many, many years of your life, when they don't see the reality as it should be seen. >> her entire interview, sunday evening at 8 p.m. eastern on c-span's "q&a." on booktv we're featuring new releases: best-selling author karen armstrong on religion and
9:29 am
conflict; president george w. bush on his biography of his father. and on american history tv on c-span3, our all-day live coverage of the world war i centennial symposium from norfolk, starting at 9:35 a.m. eastern. let us know what you think about the programs you're watching. call us at 202-626-3400, e-mail us at comments@c-span.org, or send us a tweet @cspan. join the conversation: like us on facebook, follow us on twitter. >> the c-span bus tour travels to u.s. cities to learn about their history and literary life. and this weekend we partnered with charter communications for a visit to madison, wisconsin. >> bob la follette is probably the most important political figure in wisconsin history and
9:30 am
one of the most important in the history of the 20th century in the united states. he was a reforming governor. he defined what progressivism is. he was one of the first to use the term "progressive" to self-identify. he was a united states senator who was recognized by his peers in the 1950s as one of the five greatest senators in american history. he was an opponent of world war i, stood his ground advocating for free speech. above all, bob la follette was about the people. so he spent the later part of the 1890s giving speeches all over wisconsin. if you wanted a speaker for your club or your group, bob la follette would give a speech. he went to county fairs, he went to every kind of event that you could imagine and built a reputation for himself. by 1900 he was ready to run