
Key Capitol Hill Hearings   CSPAN   November 17, 2014 9:00pm-11:01pm EST

9:00 pm
marketplaces. so we don't know yet how that's all going to play out, how many people are really going to switch, but people really need that kind of information to make informed decisions based on price and knowing whether their same network of doctors will be available on a different plan. >> great point, sara. just last month we put out a consumer guide to networks. you can find it online. we put a lot of time and effort into this. we worked with literacy experts. it's a very helpful guide and educational tool that consumers can use when they navigate different plan network configurations, and it provides them with the kind of information they need to make sure they're making the right choice. and we'll continue to focus on transparency because consumers have to know what plan is best for them. it's all part of the value in making sure consumers get the
9:01 pm
care they need. >> i would just add quickly that the national association of insurance commissioners is currently reworking its model act on network adequacy, which includes addressing the question of provider directories, and is holding 2 1/2 hours of conference calls every week, which i think is unprecedented in my experience. the state regulators who are ultimately responsible for this are very aware that there is an issue here that needs to be addressed. >> commercial alert. the alliance will be holding webinars looking at the new model regulations that the naic is developing in the area of networks, a draft of which has already been released. and that is on the 18th, is it? the 19th. you'll be getting -- you'll be getting a notification soon.
9:02 pm
yes. go right ahead, ma'am. >> good afternoon. i'm an attorney trying to break into health policy. my question builds upon several of the questions asked, but is specific to consumer confusion in anticipating their costs as it relates to the tax credits that they've been given. so as mr. jost mentioned, the supreme court did grant cert on this issue. while we may understand what that means in terms of the impact of the decision and when it will take effect, many other individuals may think, that means i don't get a tax credit. in addition to the fact that the data that they had with regard to their taxes may change. so i was just wondering what efforts have been made to address this specific issue. >> well, i think there are representatives of the media in this room, and i hope they help get this message out that the effect is going to be
9:03 pm
prospective rather than retroactive. people should go ahead and enroll. as to what the supreme court will do, i'm hoping, obviously, that they conclude that the irs has properly interpreted the statute and that solves the problem. if they don't, there are very serious problems that face not only millions of americans who receive premium tax credits now and who would lose those, but really the non-group market in two thirds of the states. because of all the carry-on ramifications of this decision, i think it's not just low-income people, it's not even just moderate-income people who could lose access to health insurance. it's virtually anybody who purchases health insurance that they don't get through their employer or a government program. i think it's very important that people understand that this is very serious business. there is no easy fix.
9:04 pm
the administration, i believe, has properly interpreted the statute, but if the supreme court disagrees, this is going to be a national crisis, which i think congress will have to fix. >> if i could just follow up on that, too, and just ask tim to comment: there's been a lot of talk that if the decision were to go in favor of the plaintiffs, states that have federally operated exchanges could just go ahead and set up their own marketplaces. >> well, how much time do we have? i mean, in the first place, it would probably take legislation. in some states it may be able to be done by executive act. there are a few states in which that's already been done. states would have to affirmatively embrace setting up their own exchanges for that to happen. secondly, under current regulations a state has to give the federal government six and a
9:05 pm
half months' notice before it sets up a state exchange. and under current guidance, it has to do it by really the first of may and have it approved by june 15th. so, if a state -- if the supreme court delivers its decision by the end of june, it could be 2015. states would have to come up with funding. they would have to appoint a board. they may or may not be able to contract with the federal website, but an exchange is a lot more than a website. it's a navigator program. it's certifying plans. it's a heavy lift. and it's a heavy lift that would demand a serious political commitment, which i think is not there in quite a number of the states. so, again, i think that if the supreme court decides this case in favor of the plaintiffs, which i hope it won't, it's going to mean massive disruption of insurance markets that is not easily fixed.
9:06 pm
>> thank you. >> thank you. >> thank you. john graham for the national center for policy analysis. dr. seshamani, your slide had 7.3 million; the other report said 7.1. at the end of the first open enrollment it was 8.1 million. where did those 1 million people who left the exchanges go? the second question, which is kind of related, is on network adequacy: how confident should i be that if i sign up for 2015, the network that my insurer -- cigna or a blue plan, say -- gives me now will be exactly the same all the way through 2015, or will doctors get fired or drop out? thank you. >> well, some of them may die. >> what are you going to do about that, meena?
9:07 pm
>> i think that the reduction is pretty obvious -- any insurance company will have attrition over the year. actually, a 90% retention rate is pretty good. also, you know, you have people who get a job and get employer coverage. you have people who lose a job and get medicaid. there's just a huge turnover over the course of a year, and i was frankly surprised we had 7.1 million in. that's a very good retention rate. in terms of providers, that's a problem now with employer plans too that have networks. that's a problem with any network. and you can't be sure that your doctor is going to be there. >> so i would just add that the 8.1 million and the 7.1, it's not the same group of people. insurance is constantly -- it's dynamic. people come in and buy.
9:08 pm
people leave. people get married. they get divorced. they have children. they get a job. there are any number of reasons that would lead to a natural churn in the marketplace. it's not that you had 8.1 million and then 1 million people left and you ended up with 7.1 million. the composition could be different because of all of these different factors. that's the one point that i would add. >> i want to follow up on that question, too, to understand what we're projecting this year. both tim and meena mentioned the numbers of people that are expected to enroll this year and what the total number will be, 9 to 10 million. can you unpack that a little bit and what that is comprised of? >> sure. since we had experience from this past year, there was some data that could be used to look at the addressable market, so you have people who are currently enrolled and then you have people who would come in as
9:09 pm
newly enrolled. and for the newly enrolled, we're able to take the experience from this past year in terms of take-up rates -- the percentage of people with various characteristics who took up coverage -- to get a sense of how many people may come in newly enrolled. and for the re-enrolling population, there are several estimates out there, including from the issuer industry, and we're able to use those to get a sense of how many of the people currently in the marketplace would stay in the marketplace. basically, those two figures together give kind of a bottom-up approach to what we may be expecting this year, using data that was not available when the initial cbo estimates were done. the other way to look at it is that you have a market that is growing, and there's a ramp-up that's associated with the growth of any market.
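a rough sketch of that bottom-up arithmetic, using hypothetical take-up and retention figures rather than hhs's actual inputs, looks something like this:

```python
# bottom-up enrollment projection: new enrollees plus returning enrollees.
# all rates and the addressable-market size below are hypothetical
# placeholders, not the figures hhs actually used.

current_enrollees = 7_100_000        # enrollment figure cited in the discussion
addressable_uninsured = 15_000_000   # hypothetical eligible but uninsured population
takeup_rate = 0.25                   # hypothetical share expected to newly enroll
retention_rate = 0.85                # hypothetical share expected to re-enroll

newly_enrolled = addressable_uninsured * takeup_rate
re_enrolled = current_enrollees * retention_rate
projected_total = newly_enrolled + re_enrolled

print(f"projected enrollment: {projected_total / 1e6:.1f} million")
# with these placeholder inputs, the total lands near the 9-to-10-million
# range discussed by the panel.
```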
9:10 pm
from experience with chip and with the medicaid expansion, if you move from a ramp-up of three years to a ramp-up of five years, that leads to a different trajectory. and doing each of those approaches kind of leads you to the same place, in this nine to 9.9 million range. >> okay. >> joyce frieden. as you all are aware, open enrollment last year went better in some of the state-run exchanges than in others -- in connecticut it went really well, in maryland not so much. so i wondered if the panelists could talk about particular states that they might be looking at and what would be signs that enrollment is going better than before. >> it's got to be better in oregon. >> yes. >> they've switched to the
9:11 pm
federal exchange, as has nevada, and maryland, i understand, has picked up the connecticut software or website, so i think there are some states where we will clearly see improvement. but we'll have to see. >> there's a related question on a card that panelists might want to address. the specifics of the question have to do with the reenrollment process, and i wonder more broadly what the application of the principle is. that is, is that reenrollment process going to look the same in a state-operated exchange as it will in the federally facilitated ones? for example, are they going to use auto enrollment? tim was describing the sequence of how and when one would have to reenroll in order not to be automatically reenrolled.
9:12 pm
are all the state exchanges going to do the same thing? are they required to do the same thing? how many of the federal rules apply to them? >> state exchanges can do their reenrollment differently. basically, in our regulation on this issue, states can provide their information on how they would like to do it, and get people to do it that way. >> okay. >> and in some states, as you said earlier, everybody is going to reenroll. >> correct. nevada and oregon. >> i have a question from the audience on what is contributing to the differences we're seeing in premiums this year. why have premiums increased moderately in so many states and gone down in others? what do you think is driving those changes?
9:13 pm
even if you look at employer-based plans, to take it more broadly, we've seen a great moderation in premiums in employer-based plans. maybe dan would like to address that? >> sure. there are a number of underlying influences when it comes to premiums, and i talked a lot about it in my presentation. in areas where there is substantial provider market consolidation, plans are unable to negotiate better rates for their consumers. you tend to see a correlation: in those areas where there isn't market competition among hospitals, you have higher premiums. and that situation has to change if we're going to deliver value to consumers. so that's part of the equation. it goes right into the underlying cost of care.
9:14 pm
it's similar with prescription drugs. look at sovaldi: plans had to submit their filings and rates back in the spring of 2013. they were locked in. sovaldi came on the market very late in 2013, and so when plans set their premiums, they had no idea it would be priced at $1,000 a pill, and they didn't have that in their pharmacy budget, because there's no transparency there. health plans have to submit their rates for review and there's back and forth before they get approval. there is no such thing for prescription drugs. you don't know the price until it's launched. and, you know, our rates are set well in advance through a very transparent process. so that also has a significant impact, and with the pipeline, even more so. so, you've got the underlying cost of care that can contribute, but
9:15 pm
you also have a competitive marketplace. and health plans are competing. they're competing based on value, and value comes from low price and high quality. and that's why we have choice in this marketplace, because competition helps consumers. and so you do see variation by state. you see variation within the 501 market areas across the country because of these different types of factors that go into premium rate setting. >> i guess i would just follow up, too, and maybe ask tim this: how important do we think the three rs -- the reinsurance, risk corridors and risk adjustment provisions in the law -- have been in keeping premiums moderate this year? >> oh, i think they've been very important. quickly, the reinsurance program provides reinsurance for high-cost cases for any plan in
9:16 pm
the individual market. the risk adjustment program moves resources from plans that end up with a low-risk population to those with a high-risk population. and then the risk corridors provide kind of a flywheel, so that if one plan gets its premiums way off in one direction, it may either compensate or be compensated by plans that got their premiums off in the other direction. but, i mean, i think dan said this -- last year health plans were kind of throwing a dart at a wall. they didn't know exactly what the population was going to be that would show up and who they would end up insuring. and some big commercial insurers didn't enter the market under those conditions. this year, we're having a number of big commercial insurers entering lots of markets and that's increasing competition. but at the same time, some plans that set their premiums too low last year are raising them. there is a tremendous amount of
9:17 pm
movement in the market and it's, again, very important for consumers to go back and shop to make sure that they know where their plan is at. >> health care lobbyist. i had a question for you, meena, on the coverage numbers. your slides suggested roughly 7.5 million people enrolled through exchanges and 8 1/2 million through medicaid, and the later slides suggest only 10 million newly covered individuals. what happened to the 6 million people? were they previously insured and headed to the rolls of the uninsured? can you fill in the blanks on the coverage? >> part of it is that the 10 million number is not as recent as the other numbers that i provided. but there are also various aspects of the insurance market, when you include employer-sponsored coverage, that could be playing a
9:18 pm
role, but probably it's just that there's a difference in the timing of the numbers. >> so it would be accurate to say that 16 million people were benefited by the aca in terms of acquiring coverage? >> well, i mean -- i would venture to say that far more than 16 million benefit, because there are so many other consumer protection provisions that affect people who already have insurance, get insurance through their job, et cetera. i do not think the number of people enrolled in the marketplace is the one and only measure of success for the affordable care act. it is much broader than that. looking at, for instance, the drop in the uninsured is important, and there are many other ways that people are obtaining insurance as well. >> i'll just follow up on that, too. in the commonwealth surveys, we asked people who newly enrolled in marketplace plans and people who newly enrolled in medicaid. about 60% with new plans had not
9:19 pm
had health insurance before they enrolled. that gives you a sense of the share of people who were without health insurance before enrolling. >> yeah. that's the other good point -- some of these people had insurance before and now likely have better insurance. >> right. people shifted from individual plans -- and a substantial share, probably 20% in some of our data, were shifting from employer-based plans. i think right now, during this open enrollment period, people who are also in their open enrollment period for their employer plans may be looking at whether they're paying a lot in their employer-based plans and decide to check out the marketplaces and see what's on offer -- whether they could, in fact, be eligible for a tax credit if they're paying too much of their income for employer plans. so you might see some shifting from employer-based plans this open enrollment period. >> okay.
9:20 pm
we have about ten minutes left. we're going to try to get to as many of the questions on the green cards that we have in front of us as we can. i don't think we're going to succeed in getting to all of them. if you have something you positively have to have asked, you might want to appear at one of the microphones. in the meantime, let me just turn to one of these questions. meena, a lot of commentators -- and tim mentioned it as well -- have suggested that what happened last year was a fairly large enrollment of people who were motivated to get insurance for one reason or another. the famous low-hanging fruit. so, you might want to describe what, if anything, hhs is doing to reach and convince those middle-hanging fruit people that they need to enroll as year two
9:21 pm
begins. >> sure. well, with our very active outreach program -- i mean, i think that there are still many people out there who are eligible for premium tax credits, who may be in plans and may not realize what opportunities are available, that we can reach. and so it comes back to some of the points i made previously, that we are working very closely with our partners, with local media, radio, tv, digital media, events, and also reaching out across provider groups, churches, any number of community organizations, to really get the message out. and, again, i think having had the experience of the first year, and having a lot of people who are enrolled, who are happy with their plans, helps to get more people who may have been reluctant or hesitant or
9:22 pm
doubtful, et cetera, to see there are people like me who were able to get quality coverage. i think that that is a very strong message as well. >> i wonder if the message is going to include at any level, whether it's with your partners or with some of the work you're doing directly, the sticks as well as the carrots. we were talking to a member of congress yesterday who pointed out that the penalty -- that's not the proper term, the supreme court called it a tax -- whatever the fee is, was going to go not to $95 but to a maximum of 2% of the person's income if they didn't have insurance coverage. i wonder if that word is getting out to people as a way of convincing them that they need to look seriously at getting insurance for the first time. anybody talking about the carrot?
9:23 pm
>> one thing i will say on that: congress wrote this so it would be an educational process. the first year you get a tap, the next year it's a bigger tap, and by the third year it's a pretty big tap. people will be getting their w-2s in january and they're going to work on their taxes, and they're going to realize sometime in february, march or april that not only do they owe 1% of their income for this year, but they're also going to owe $325 per adult, almost $1,000 per family if it's a large family, and/or 2% of their income above the filing limit for next year. however, the exchanges close their doors on february 15th. so, people will not have a way of avoiding that penalty unless they qualify for a special enrollment period. i think that hhs has the authority to declare a special enrollment period for people who
9:24 pm
are going to owe the penalty for next year or who owe it for this year, and to leave that open a bit longer. and i don't see how that would hurt anybody. i think it would be a way to bring a whole lot more people into coverage. >> there's something to consider in your deliberations at the department. yes, carl. >> carl, independent consultant. you already answered half of my question. so, the irs is really going to motivate a lot of the low-hanging fruit, the way the law is designed. a lady in my church looks to me as sort of a navigator, you know, because i know a little bit about health policy. she works for h&r block, and she says she hasn't received any guidance yet about this penalty or this thing that she's supposed to deal with. and i wonder how the irs is going to deal with that, on two grounds. one is, you already talked about, you know, they might be easy on people, but there's a certain disclosure element of it.
9:25 pm
is there going to be a box i'm going to check to declare, you know, if my coverage is creditable or whatever the word is, and then how is that enforced? because certainly people will make a lot of mistakes. there will be a lot of people making mistakes about what do you mean by this coverage. how are they going to enforce that? what if i put the wrong information in there? on what basis will i be penalized? >> the forms are already online. the instructions for the forms are already online. the 1040 does have a box. you check that box if you've had continuous coverage. if so, you just proceed. if you haven't, you will fill out a form called an 8965. the instructions are there online. and i've put up a couple of blog posts explaining in some detail how it works. the people at h&r block are very aware of this. if she hasn't gotten instructions yet, she will, because they are going to be major players in helping people
9:26 pm
understand how all of this works. they and jackson hewitt and turbotax and other people. but it's not easy. and the first year, a lot is going to have to be taken on trust because we don't have employer reporting in place yet, and there are some other issues. >> how about health plans? do they have any responsibility to notify their policyholders that, yes, the insurance that you get from us is a qualified health plan and meets the requirements of the statute? >> that comes from the exchange. so that's the exchange's responsibility. >> i see. okay. >> sara? >> i have a question on -- i had mentioned how important the deductibles are in my presentation, that people should really pay attention to what the deductibles are like. will consumers have access to the information that they need
9:27 pm
about the implications of the cost sharing provisions of plans available on the exchanges? will you be able to tell what your potential out-of-pocket costs might be over the course of the year, other than, say, the simple deductibles or co-pays? >> one of the provisions of the affordable care act made requirements for what health plans must provide, and in that they have to provide certain coverage examples that detail the kinds of costs you could expect -- with the cost sharing in the plan that you're considering, what would your out-of-pocket costs be. >> a shoutout to the d.c. exchange. our operations director was pointing out to me this morning, they have a couple of very good examples in their materials of just what you were describing. how much is it going to cost if you have a baby in this period? how much is it going to cost if you're managing type ii
9:28 pm
diabetes, that sort of thing. it is very helpful to the people in our office in trying to decide, as they did. we've come to the end of our time. and i apologize to those of you who have written some very good questions that we haven't had a chance to get to. but there are a lot of questions in this area. we will try to keep up with this debate as it goes on and try to schedule programming that might be able to answer more questions. in the meantime, as you're putting on your winter coats to go outside, also take out the blue evaluation form and fill it out, if you will, to give us some feedback. thanks to commonwealth and sara for helping us put together, and being part of, a very useful program. i would ask you to join me in thanking the panel for a very enlightening discussion.
9:29 pm
[ applause ]. the ebola outbreak will be the topic of two congressional hearings tomorrow. at 10:00 a.m. eastern, the house foreign affairs subcommittee on africa and global health will look at the international response and the ongoing need for doctors in areas stricken by the disease. that will be live here on c-span3. and later in the day, the house energy and commerce subcommittee on oversight will hear from cdc director dr. thomas frieden about steps the u.s. is taking to prevent the spread of ebola. you can watch that live here on c-span3 starting at 1:00 p.m. eastern. the 2015 c-span studentcam video competition is under way.
9:30 pm
open to all middle and high school students to create a documentary on the theme the three branches and you, showing how a policy, law or action by the executive, legislative or judicial branch of the federal government has affected you or your community. there are 200 cash prizes for students and teachers totaling $100,000. for the list of rules and how to get started, go to studentcam.org. the privacy and civil liberties oversight board recently held an all-day seminar examining the intersection of privacy and technology. the board is an agency in the executive branch that was recommended by the 9/11 commission. in this portion of the seminar, the board's members looked at how the concept of privacy is defined, hearing from a panel that included security and technology experts, and a george washington university law professor. this is 1:45. >> public meeting on defining
9:31 pm
privacy. it's 8:30 a.m. on november 12th, 2014, and we're meeting in the west end ballroom in the washington marriott georgetown hotel in washington, d.c. this hearing was announced in the federal register on october 21st, 2014. as chairman, i will be the presiding officer. all five board members are present and there's a quorum. rachel brand, james dempsey and patricia wald. i will now call the hearing to order. all in favor please say aye. proceed. >> so what is privacy? the right to be left alone? a desire for independence of personal activity? the right to make decisions regarding one's private matters? space for intellectual development? anonymity or obscurity? freedom from public attention? freedom from being disturbed by
9:32 pm
others? freedom from intrusion into one's solitude? freedom from publicity which places you in a false light? freedom from appropriation of your name or likeness? control of how one's personal information is collected and used? freedom from surveillance? these are just a few definitions that have been given to privacy in the past. i expect during the course of today's discussions that we'll hear others. the meeting today and the comments we receive will inform the board's approach to privacy issues within its statutory mandate. there will be four panels today. the first will focus on defining privacy interests. the second will consider privacy interests in the counterterrorism context and the impact of technology. next we will hear from government privacy officials regarding privacy interests that have been identified and addressed. and the final panel will see how lessons learned from the private sector can be applied in the counterterrorism context.
9:33 pm
each panel will be moderated by a different board member. after the moderator poses questions, other members will have an opportunity to pose questions. members of the audience are invited to submit written questions. peter wen has cards; people can get a card from him and submit questions, time permitting, for the moderator to pose to the panelists. i want to thank the panelists who agreed to appear here today. i also want to note that we have a strict timekeeper, joe kelly, sitting in front, and so panelists are encouraged to keep their remarks brief so we can have a more extensive discussion. we'll take a lunch break between 12:00 p.m. and 1:15. today's proceedings are being recorded. written comments from members of the public are also welcomed and may be submitted through regulations.gov through the end of the year.
9:34 pm
i want to thank the board's staff, sharon, shannon, simone, renee, peter wen, joe kelly, for their efforts in making today's events possible. so i will now turn to the first panel, moderated by judge wald. >> thank you. panel one will attempt to explore -- i think it would be too ambitious to say define -- privacy and the many separate individual and societal interests that the notion of privacy encompasses. the novelist jonathan franzen remarked, privacy is the cheshire cat of values: not much substance there, but a very winning smile. legally the concept is a mess. that's a quote. that may be unduly pessimistic. most commentators do agree there are aspects of privacy that go way back to the most ancient civilizations and that our own founding fathers enshrined several of them in the bill of rights. the concept of privacy has been
9:35 pm
a receptacle for a conglomerate of interests or values that individuals and society care about, but which to varying degrees they are willing to balance against competing values such as national security. thus the law of privacy consists mainly of a series of situations in which courts, legislatures or government officials have decided to recognize a privacy interest or not, and to protect that interest or not against a competing value. so, our panelists today will identify the varied individual and societal interests that travel under the rubric of privacy and discuss how far and under what conditions our laws do or should legitimate claims that are based upon those particular interests. now, our format will be for each panelist to talk initially for seven minutes, and the gentleman in the front row will in turn give
9:36 pm
you a yellow card two minutes before, and a green card will mean it's time to wind up quickly. then at the end of their initial speeches, i will question them as the moderator for about 20 minutes. that will be followed by another 20 minutes of questions by my fellow board members. after that, i hope there will be some time left for the written questions, which members of the audience are invited to send to the people who will circulate and collect them, and i will discuss some of those questions with the people on the panel. you already, i think, have bios of your illustrious panelists, but i'll identify them very briefly before they speak. so we'll get right on. liza goitein, liberty and national security program director. that's enough to identify you. >> thanks very much, judge wald.
9:37 pm
and i apologize in advance. i have a cold. my voice kind of comes and goes, but thank you to all the board members for inviting me to participate in today's discussion. if there's one thing i've learned from my own involvement in privacy issues over the last few years, it's that privacy is different things to different people. david gave a very comprehensive list of some of the things that privacy is. i'm not sure what i would add to that except to say that i think for those who are outside the ideological mainstream in this country, privacy vis-a-vis the government can be critical to effectuating other rights. so collectively as a society, we value all of those aspects of privacy, even if some of us value only some of them or none of them. so, what does that mean for our analysis? i think it's interesting for us to think about different
9:38 pm
definitions of privacy, and it's helpful insofar as it shows the range of definitions that are out there. but i'm not at all convinced that congress or the courts or this board should be in the business of attempting a granular definition of privacy or its importance. look at the freedom of religion by way of comparison. courts don't probe what religion is or why it's important. that's not because the definition of religion is obvious by any means. it's at least in part because of the opposite -- because religion is different things to different people. so what the court does is it adopts a concept of religion that's broad enough to encompass the many different roles that religion plays in people's lives, and the court protects it except in the rare circumstance where there's an overriding governmental interest. and congress has followed the same
9:39 pm
approach. when it comes to information privacy, the best working concept of privacy, the concept that best encompasses all of the important interests that privacy serves, is control of information. this avoids the what and the why of privacy and focuses instead on the how -- how privacy is realized as a practical matter. it also has the additional advantage of matching up quite well with the text of the fourth amendment. if a person controls her papers, she is secure in them. if a person does not control them, she is not secure in them. what are some of the ramifications of this concept of privacy? well, first, controlling one's information means controlling not only what one shares but with whom and under what circumstances. i may share certain information
9:40 pm
with my mother or with a close childhood friend, but that doesn't mean that i have chosen to share that information with the entire world, including the nsa. sure, there's a chance my mother might rat me out. there's a chance that my childhood friend has a tax problem i didn't know about and could be pressured by the government into becoming an informant. but to equate this outside risk that my confidence may be misplaced with a willing disclosure to everyone in the world is a legal fiction of the worst kind. that's really what the third party doctrine is, in my view. second, you don't, in fact, relinquish all control over information about your public activities by virtue of walking out your front door. there is such a thing, functionally speaking, as privacy in public. this is something that's well understood in the foia -- freedom of information act --
9:41 pm
context. there's a privacy exemption that allows the government to withhold information if releasing it would unduly compromise personal privacy. the supreme court held in 1989 that a rap sheet would be covered by this exemption despite the fact that all of the information in a rap sheet is available by virtue of a diligent door-to-door combing of court records. so why was the rap sheet still private? because the court held that while the information in it was publicly available, it was practically obscure. this is such a common-sense concept, and it deserves a home in fourth amendment jurisprudence. the sum total of a person's movements in public over extended periods of time may be publicly available information, but using normal powers of human observation, it is practically obscure. so when the government uses
9:42 pm
drones or stingrays or gps technology to pierce that obscurity, it has compromised the control that a person would otherwise exercise over the privacy of this information. third, privacy violations happen at the point information is collected. we're told we don't have to worry about the nsa's bulk collection of telephone records because nobody looks at the records unless they have reason to suspect some kind of terrorist link. that is the government telling you what aspects of privacy you should value. many people won't care if the government collects but doesn't look. other people won't care if the government looks but doesn't prosecute. but the point at which the government collects the information is the point at which you've lost control. and for plenty of people, that loss of control itself produces harm. it produces a feeling of vulnerability. it causes people to change their
9:43 pm
behavior. in 2014, there was a poll after the snowden disclosures showing that 47% of respondents had changed their online behavior because of those disclosures. there was another survey, of 520 american writers, showing that one out of six authors, after the snowden disclosures, refrained from writing about certain topics because they feared surveillance. after news stories broke about the nypd's infiltration of muslim student associations, attendance in those associations dropped. in some ways these are some of the worst harms that come from privacy violations, because they're society-wide. they impact the way we act as a society. they cause people to censor themselves and not put ideas out there. one last ramification of this concept of privacy, if i have time? i can't believe i have time. is young people. so, i hear it said quite often
9:44 pm
that young people don't care about privacy. it's certainly true that many young people go on facebook and share incredibly personal information with 622 friends. but they don't share that information with 623 friends. what they share and the number of people they share it with may very well have changed. it certainly appears so. but they still control the sharing, or at least they think they do. and my impression, based on a totally unscientific survey of all the young people in my life, is that they still value that control. so -- the red card. i knew it was coming. all right. i'll stop there. >> thank you. >> professor daniel solove is the john marshall harlan research professor at the george washington university law school. >> good morning. i would like to make five brief points this morning. the first point is that privacy
9:45 pm
is much more than hiding bad secrets. one of the common arguments that people often make about privacy is that people shouldn't worry if they have nothing to hide. i hear this argument all the time. this argument and many other arguments about privacy are based on a conception of privacy that's very narrow, one that sees privacy as hiding bad or discreditable things. well, privacy is much more than that. privacy isn't just one thing. it's many different things. privacy involves keeping people's data secure. it involves the responsible use of data. it involves making sure that when data is kept, it's kept accurately. it's making sure that the people who keep the data are responsible stewards of that data. that people have rights in that data, and some participation in the way the data is used.
9:46 pm
all these things have nothing to do with nothing to hide. they have nothing to do with secrets and everything to do with how people's information is kept, collected, stored, et cetera. i think that if we see privacy broadly, we can move away from and abandon these very narrow views of privacy. the second point i would like to make is that privacy is a societal interest, not just an individual one. when balancing privacy and security, privacy is often seen as an individual right and security is often seen as a social one. when they're balanced, society generally wins out over the individual. i think this actually skews the balance to the society side. in fact, privacy isn't just an individual interest. it doesn't just affect the individual. it's a societal interest. we protect privacy because we
9:47 pm
want to protect society. we want to shape the kind of society we want to live in. privacy doesn't just protect the individual for the individual's sake. it protects the individual for the society's sake. because we want a free society where people are free to think and speak without worrying about negative consequences from that. third point i would like to make is that the collection of personal data through surveillance and other means of government information gathering can cause significant problems. data collection and surveillance aren't inherently bad, but just as industrial activity causes pollution, government surveillance and data gathering can cause problems. and these problems must be mitigated. they must be addressed when they clash with important interests. some of the problems include, one, that this activity can
9:48 pm
chill people's expression. it can chill people's exploration of ideas. it can chill people in many different ways. either they might not say something, or they might say something slightly differently, or they might act differently or do things differently. and we don't want that chilling when it comes to legal activity. the other problem is that surveillance gives a lot of power to the watchers. there are a lot of things that can be done with a vast repository of data beyond the particular aim it might have been collected for. data has a way of often being used in other manners, in other ways. i think that another issue, too, is the level of accountability and oversight that goes into this. because it's about the structure of our government and the
9:49 pm
relation of the government to the people that we're talking about here. what kind of accountability will the government have when it gathers all this information? what limits will there be on the information gathered and used? how long will the information be kept? in a free society, people are free to act as they want to act, as long as it's within the bounds of the law, without having to justify themselves. they don't have to go and explain their actions to a bureaucrat sitting in a room full of television monitors about what they're doing. they don't have to go and explain themselves when a computer's lights are blinking red because of something that they said that could be misinterpreted. people don't have to worry about that. they can act freely without having to worry about how suspicious their actions might look. that is a key component of freedom.
9:50 pm
the fourth point i would like to make is that we can't adequately balance privacy and security without a reasonable amount of transparency. there's an overarching principle that this nation was founded upon. it's that we, the people, are the boss. the government is our agent. we can't evaluate what government officials are doing if we don't know what's going on. this doesn't mean there should be absolute transparency, but it does mean that we need to know something, enough to be able to evaluate government surveillance. because ultimately the choice about the proper level of surveillance isn't the nsa's to make, it's not the president's to make. it's the people's choice. we can't forget that. it's the people's choice, and the people must be given sufficient information to make that choice. my last point is that the government must get buy-in from the people for its surveillance measures. without buy-in, people are going
9:51 pm
to start to take self-help measures, which is something we see happening now. we see that companies are providing people with ways to encrypt their data, to protect it from snooping government entities. this is the market speaking. this is something that people want. why are people demanding this? because the government has lost trust. people don't believe there is adequate oversight and accountability. that's why strong protections aren't necessarily bad for security. in fact, they ensure that people are comfortable, that there is adequate oversight and accountability for that surveillance, and that people have the information to know what's going on, and if
9:52 pm
they can evaluate what's going on, things will be a lot better when it comes to balancing privacy and security. thank you. >> paul rosenzweig is the founder of red branch consulting, and he was deputy assistant secretary for policy at the department of homeland security. >> thank you, members of the board. i appreciate the opportunity to speak with you today. it's really entirely appropriate in this technological age, and the reason for that is one that puts me in some disagreement with my fellow panelists. i think that the old conceptions of privacy, as acceptable as they were, are somehow outdated, and they don't survive the technological challenges we
9:53 pm
face. the 1973 thunderbird was a marvelous car, but we wouldn't think of holding it out today as the model. we need a new test for privacy today. what would that look like? there are many ways to answer that question, and i think to answer it, you have to begin by thinking about what sort of value privacy is. and here again, i think i find myself in some disagreement with other members of the panel, and perhaps with members of the board. i do not think that privacy is an ontological value. it's not an inherent human right. rather, in my judgment, privacy is an inherently instrumental value, one that acts in the service of other societal values. it's a utilitarian value with positive gains.
9:54 pm
privacy is just an assertion of autonomy from society. it is valuable insofar as it advances other objectives. let me kind of put some flesh on that. the problem is that buried in the word privacy are many different social values that we're fostering -- too many, really, to catalog, though the chairman did a good job of trying to start. for example, in the discussion here, privacy is about enhancing freedom from government observation. that's probably the use that's most salient to what this board does. but it also fosters personal morality. that's why we keep the confessional private. privacy is about restraining government misbehavior, which is why we see privacy values in the fourth amendment and other procedural limitations on government action -- another way in which privacy is obviously relevant to this board. and it's also, as dan said,
9:55 pm
sometimes about transparency, in the sense that we have privacy rules so i know what you know about me. it can be about control, about control of my own image, and it's sometimes also about simply shame, since one ground of privacy is enabling me to keep from the world things i am not proud of. what's important to note is that in all these instances, the value we're protecting that underlies privacy is different from the privacy itself. that in turn suggests to me that the way to think about privacy is to think about what operational activities would protect the underlying value most. it means we need to go to a micro level to understand the nuance that arises from the particular interest that is at the core of the privacy that we're talking about. for example, we protect the confidentiality of attorney-client communications. why? because we think we need to foster candor in the discussion
9:56 pm
between a client and an attorney. that's something that we feel so strongly about that the instances in which we permit that privacy to be violated are few and far between, and they come only with the highest level of judicial scrutiny. the fourth amendment itself reflects a similar utilitarian value -- it protects the security of our people, places and things against intrusion. once again, we impose a high bar, a probable cause requirement and a strong, independent, outside adjudicator, a judge issuing a warrant. but those aren't the only mechanisms by which we can protect privacy. we have a series of administrative processes that are often adequate to protect against and restrain government observation. they're embedded in many of the internal reviews that are very common in the i.c., in the intelligence community that you spend your time reviewing.
9:57 pm
they're common in virtually every institution of government that we have, at least at the federal level that i'm familiar with, where we think administrative review, internal oversight, inspectors general and intelligence committee oversight are adequate, alternate administrative mechanisms. what does that mean for what you've written about? the 215 program is one that directly implicates issues of government abuse or potential abuse because of the pervasiveness of the collection that was there. it strikes me that that sort of pervasive collection is one that would require a strong independent review mechanism because of the comprehensiveness of its activity. by contrast, the 702 program is one in which there is less likelihood
9:58 pm
of inadvertent abuse. if you press on what is being protected, you get a sense of a better way to protect it. let me say one brief word more about transparency. the critical question is what type of transparency. this requires us to ask what transparency is for. it's the ground of oversight and audit. transparency without that ground is just voyeurism. but absolute transparency, as dan said, can't be squared with the need for secrecy in operational programs. i sometimes think that some calls for transparency -- though i hasten to say not by any panelists or members of the board -- are really calls to discontinue the programs altogether. the problem is, if we believe in
9:59 pm
absolute transparency, we've gone a long way toward saying that democracies can't have secrets, a view that i think is untenable in the modern world. with my last 38 seconds, let me offer one last thing about the nature of the board and the need for privacy. because i think that privacy is many things and has many applications in many different contexts, i also think the most appropriate ground for making judgments about privacy is not in boards or judiciaries but in the most representative of bodies we have available to us -- in this instance, congress. i realize that's perhaps leaning very heavily on a body that's not held in the highest regard at this time, but nonetheless, that is the mechanism in a democracy for accumulating diverse preferences, weighing them in the balance and reaching a judgment for a broader societal interest. thank you.
10:00 pm
>> professor of computer science and public affairs at princeton, the founder of princeton's center for information technology policy. i think he'll give us a somewhat different lens with which to view privacy. >> thanks for the opportunity to testify. today i'd like to offer a perspective as a computer scientist on changing data practices and how they've affected how we think about privacy. we can think of today's data practices in terms of a three-stage pipeline. first, collect data. second, merge data items. and third, analyze the data to infer facts about people. the first stage is collection. in our daily lives, we disclose information directly to people and organizations. but even when we're not disclosing information explicitly, more and more of what we do online and off is recorded.
10:01 pm
online services link up with these recordings. the second stage of the pipeline merges the data. if two data files can be determined to correspond to the same person, for example because they both contain the same unique identifier, then those files can be merged. and merging can create an avalanche effect, because merged files contain more precise information about identity and behavior, and that precision in turn allows further merging. one file might contain detailed information about behavior and another might contain precise identity information. merging those files links behavior and identity together. the third stage of the pipeline uses data analysis methods such as analytics. today's learning methods often enable sensitive information to be lifted from seemingly insensitive data. this too can have an avalanche effect
10:02 pm
because each inference becomes another data point to be used in making further inferences. data analysis is most effective when there are many examples of the status being predicted. for instance, target used many examples of pregnant and non-pregnant women to build its predictive model. a predictive model that tried to identify terrorists from everyday behavior data would expect much less success, because there are far fewer examples of known terrorists in the u.s. population. with that technical background, let me discuss a few implications for privacy. first, the consequences of collecting a data item can be very difficult to predict. even if an item on its face doesn't seem to convey identifying information, and even if the contents seem harmless in isolation, the collection could have substantial downstream effects. we have to account for the mosaic effect, in which seemingly innocuous pieces of data combine to make a vivid picture.
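a minimal sketch of that merge step -- with hypothetical field names and records, not any actual data system -- shows how two separately unremarkable files, joined on a shared identifier, yield a far more revealing combined record:

```python
# two separately collected data files keyed by the same loyalty-card number.
# all identifiers and records here are hypothetical.
purchases = {
    "card-4471": ["prenatal vitamins", "unscented lotion"],
    "card-9038": ["cat litter", "batteries"],
}
identities = {
    "card-4471": {"name": "jane doe", "zip": "08540"},
}

# merging on the shared key links behavior to identity.
merged = {
    card: {**identities[card], "purchases": purchases[card]}
    for card in purchases.keys() & identities.keys()
}
print(merged)
# {'card-4471': {'name': 'jane doe', 'zip': '08540',
#                'purchases': ['prenatal vitamins', 'unscented lotion']}}
# each merged record can now seed further joins against other data sets --
# the avalanche effect described above.
```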
10:03 pm
the power of data analysis is the power of the mosaic effect. to understand what follows from collecting an item, we have to think about how that item can be merged with other available data and how that in turn can be used to merge other data. for example, the information that the holder of a certain loyalty card account number purchased skin lotion on a certain date might turn out to be the key fact that unlocks an inference that a particular identifiable woman is pregnant. similarly, phone call metadata, when collected and analyzed in large volume, has been shown to unlock predictions about personal status and personality. the second implication is that data handling systems have gotten much more complicated, especially in the merging and analysis phases -- that is, the phases after collection. the sheer complexity of these systems makes it extremely difficult to predict and control
10:04 pm
how they behave. even the people who build and run these systems often fail to understand fully how they work in practice, and this leads to unpleasant surprises. it frustrates oversight, it frustrates compliance, and it makes failure more likely. despite all best intentions, organizations will often find themselves out of compliance with their own policies and their own obligations. complex systems will often fail to perform as desired. complex rules also make compliance more difficult. it is sometimes argued that we should abandon controls on collection and focus only on regulating use. limits on use do provide protection in theory and sometimes in practice. but collection limits have important advantages, too. for example, it is easier to comply with a rule that limits collection than with one that allows collection and puts limits on use afterward. and collection limits make oversight
10:05 pm
easier. there are approaches that can accomplish major goals while collecting less information. the third implication is the synergy between commercial and government data practices. as an example, commercial entities put unique identifiers into most website accesses. an eavesdropper collecting traffic can use these to link a user's activities across different online sites, and can link those activities to identifying information. our research shows that even if the user shifts locations and devices, as many users do, these identifiers can be used to reconstruct 50% of what the user does online and can usually link that data to the user's identity. my final point is that technology offers many more options than the approach of collecting all the data, aggregating it in a single large data center and analyzing it later. here i think paul's analogy to the 1973 thunderbird is a good
10:06 pm
one. we would no longer accept the safety technologies that were available on that vehicle. nowadays we expect airbags, we expect anti-lock brakes, we expect technology to make the automobile safer and reduce risk. we should expect the same when it comes to privacy. we should ask agencies to use advanced technologies to limit how much information they collect, and to use encryption. determining whether collection of particular data is truly necessary, whether data retention is truly needed, and what can be inferred from a particular analysis -- these are deeply technical questions. i hope you'll build the capacity to ask equally probing technical questions. legal oversight is most effective when it is combined with sophisticated and active
10:07 pm
technical analysis. and many groups are able and willing to help you build this capacity. thank you for your time, and i look forward to your questions. >> thank you. for the next 20 minutes or so, i'm going to pose some questions to the members of the panel, and i'll pose them to a particular member, but then if one of the other members has something cogent -- and i'm sure everything is cogent -- feel free to include it. let me start with you. we have certain aspects of privacy in the fourth amendment: security of one's home and papers from search and seizure, and protection from general warrants. but are there other aspects of privacy that the advocacy community believes deserve legal recognition and judicial oversight, or can they all be encompassed within those bounds?
10:08 pm
and if so, which ones do you think should be specifically recognized or protected? >> sure. to start with, i suppose, the obvious: the fourth amendment applies only to the government. it's a restriction on the government; it's not a restriction on private parties. i think there is absolutely a place for regulation of private entities and how they acquire and control people's information, because the market doesn't always do a great job of many things, although it does a great job of other things, and we know people aren't entirely satisfied with things in the private sector. that falls outside the fourth amendment but is worthy of recognition. >> there's another one that falls directly under this. we hear so much about private
10:09 pm
acquisition of so much personal information and what they do with it. the argument is sometimes made, don't worry so much about the government -- google has your communications, the internet has great masses of data. do you think there is any significant difference in the risks to privacy that are posed by the holdings of so much personal information by private entities, or is it like two big kahunas? >> i think there is a difference. that is that private companies don't have the same coercive power as the government, and private entities don't have the same motivations to persecute people based on
10:10 pm
ideology or anything. these are things we have seen in the history of this country, unfortunately. we have seen people targeted for surveillance because they were political enemies of the reigning administration. what i would say is private entities have neither the ability nor the motive to throw people in jail on a pretext because they are politically opposed to the current administration. that said, the line between big companies in this country and government is getting thinner and thinner. certainly companies might have some political axes to grind with respect to their work force, and they certainly have access to people's information. i am not unconcerned about the private accumulation of information, but i am more concerned with privacy vis-a-vis the government.
10:11 pm
>> let me try -- now, you wrote something in an article called conceptualizing privacy, and you went into it a little bit. there are 16 kinds of activities that represent privacy risks, and privacy itself has six aspects; they're all defined too broadly or they're all defined too narrowly. you concluded, i think, if i read it correctly, that we should concentrate on specific types of disruptions to those interests and on what should be done about them. can you apply that kind of framework to the kinds of protection, i'm sorry, that we need in national security data collection and surveillance programs -- collection, processing, identification, secondary use, all the other things you talked about in your article? >> yes.
10:12 pm
in what i wrote, i talked about privacy not just being one thing or having a common denominator, but being a pool of common characteristics, and i laid out various types of problems. i wanted to focus on the problems, the areas where certain activities cause disruption. they have caused problems, and we want to mitigate those problems. what are those problems? that's where we want to step in and say, hey, we should regulate this, we should do something about this, we should address these problems. it doesn't mean the activities that caused them are bad, but it does mean they cause problems we need to address. some of the problems that relate to government data gathering include, first, aggregation of particular pieces of data. when you combine them together, you can learn new facts about
10:13 pm
somebody. this is what data mining and data analytics are all about. this then leads to revelations about someone that they might not have expected or wanted when they gave out little pieces of information here and there. and i think this causes a problem. it disrupts people's privacy expectations, and it can lead to knowledge of information that people don't want exposed or that society might not want exposed. some say, well, if the information is all different facts that were gathered from public information, there's no privacy problem. i don't think that's true. i think we want to look at what the problems are, and if we look, there is a problem here. the other problem is exclusion,
10:14 pm
which is the fact that people lack an ability in a lot of cases to have any say in how that information might be used against them -- any right to correct that information or to make sure that it's accurate. and i think that's a key component of a lot of privacy laws: a right for people to make sure their information is accurate. another problem is identification, where we link data to a particular individual. by identifying them, you are connecting them to data that can then be used to make decisions about their lives. some of those decisions could be good, but some decisions could, in fact, be harmful to an individual. security is another issue that i see as related and part of my
10:15 pm
taxonomy of privacy. when data isn't kept secure, it creates risks and vulnerabilities for people that could expose them to a lot of harm if, in fact, the data is leaked improperly. that happens all the time. we're all at risk when all this data is gathered together in a big repository. there are a lot of other things, but i'll stop here in the interest of time. these are just some of the ways that the taxonomy addresses this problem. i think it's important to keep in mind the overarching point, which is: don't start with some platonic concept of privacy and see what fits in it and what doesn't. i think it's better to look at things from the bottom up and say, where are the problems here?
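A toy illustration of the aggregation problem described above; the datasets and the inference here are invented for the example, not taken from the panel:

    # Two datasets that look innocuous on their own can reveal a new fact
    # when combined. All records are hypothetical.

    # dataset 1: badge swipes (person, building, hour)
    badge_swipes = [("person-7", "clinic-annex", 9), ("person-7", "main-office", 14)]

    # dataset 2: visit logs with no names (service, building, hour)
    visit_logs = [("oncology-consult", "clinic-annex", 9), ("cafeteria", "main-office", 12)]

    # joining on (building, hour) links a named person to a sensitive service
    for person, b1, h1 in badge_swipes:
        for service, b2, h2 in visit_logs:
            if (b1, h1) == (b2, h2):
                print(f"inference: {person} likely attended {service}")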
10:16 pm
>> the big data debate has looked at the increase in volume, velocity, and variety of data and championed the idea that it creates serendipitous new knowledge that is of value to society as well. it brings with it harms, but it also brings benefits, and that is why i see it as a kind of cost-benefit, utilitarian analysis. >> let me say something about this utilitarian framing. privacy is a right. it's in the treaties that the united states has signed; it's customary international law. whatever people's personal feelings about that, i don't think this board has the latitude to decide that it's not, given
10:17 pm
all these treaties, a human right. >> we're defining things in a way that will make a lot of problems. >> may i make one small point? >> yes. >> i totally agree about the benefits of big data and the use of these things. but i think the balance is often wrongly cast as: let's take the benefits and weigh them against the harms. protecting privacy doesn't mean getting rid of big data or not engaging in surveillance or not doing a search. the fourth amendment allows searches and allows surveillance, for example; it just requires certain oversight. so when we're balancing, we shouldn't weigh all the benefits of big data against privacy. we need to look at the extent to which oversight, accountability, and these protections diminish some of those benefits -- and weigh that difference.
10:18 pm
>> the balance has been struck. the government can't say, we want to do searches of people's houses, we have a really good reason; we don't have a warrant, but we have a really good reason. that balance was struck by the drafters of the fourth amendment. in the vast majority of cases, apart from some narrow, delineated exceptions, this is not starting from scratch. >> mr. rosenzweig, your approach to balancing privacy and national security has, i think, been termed -- would you call it instrumental or consequential? in one of your articles you talked about limiting the right of somebody to complain to situations where
10:19 pm
they are suffering a tangible harm, like a warrant or being called before the grand jury, as opposed to the professor's use of privacy as a kind of foundational value, recognizable in its own right. yet you also recognize in some of your other work the significance of some aspects of privacy to a democratic society. now, all of you have talked about how it isn't just an individual right; it's a right that an open society needs, starting even with the necessity for people to develop their personalities in an atmosphere in which they feel free to experiment a little bit, to have relationships, to talk without feeling they're constantly being judged by the government or by
10:20 pm
society. i'm wondering how you reconcile the idea of privacy as necessary to a democratic society with the requirement of a tangible harm. >> i don't see them as irreconcilable, because i see the question about adverse consequences and error correction mechanisms as critical to the first part of your question, the inherency of the value. what if, in some hypothetical world, which i assure you does not exist, the government never abused anybody, never actually misused the data that was collected, never went after -- had no lists of enemies, no persecution, and never made a mistake? now, granted, that's an
10:21 pm
impossible thing, but in that world the harms that underlie the value would, by hypothesis, have gone away. so to my mind, the way to support the values we see underlying the democratic sphere is to build the error correction mechanisms, the audits, the oversight, in a way that reassures the public that we are driving down the errors, the false positives and false negatives, as much as we humanly can. we don't eliminate government programs because of the possibility of error, because
10:22 pm
every government program, every human endeavor, has the possibility of error. we arm police officers even though we know they will sometimes misuse their weapons. we don't eliminate that; we try to drive down the error rate as much as possible so that we engender people's confidence in the police. we see, sadly, these days exactly what happens when people's confidence in the police is not maintained, when our error correction mechanisms are deemed by society inadequate, and i think we're seeing some of the same thing in response to the snowden disclosures as well. that suggests to me that the way to support the underlying values goes back to fixing the underlying error correction mechanisms. >> let me pursue one thing you brought up earlier, something which has come up in some of our past reports and is bound to come up in future ones, i think. and that is, if
10:23 pm
you would go a little bit more into it, at what point do you think an independent review of decisions, outside of the government's internal auditing and processes, is necessary to ensure that you have this kind of trust from the people -- that the government is not taking risks with their privacy -- given that, as you yourself suggested, history has some lessons for us. you know, the trust aspect. >> i certainly don't dispute that we've had failures in the past. anyone who would dispute that hasn't read history. i would say that there is no one-size-fits-all answer; it really depends on the harms involved and the nature of what you anticipate the failure mode
10:24 pm
would be. we have tsa screening systems at the airport, with probably a fairly significant error rate of false positives, pulling people aside for secondary inspection. on the other hand, that is a comparatively modest intrusion -- and i say that knowing many people think it's a very large intrusion, but it is comparatively modest compared to, for example, being put in jail. in that instance, we seem reasonably happy with a principally administrative methodology that doesn't require any outside check, because individual liberty is not at issue, long-term confinement is not at issue, and the degree of harm is small. by contrast, i certainly think that independent review is essential whenever people's liberty is at stake, or when significant aspects of livelihood are at stake. i think one of the strangest things that i see in the privacy
10:25 pm
debates today is that we get all wrapped up about things like tsa screening and we don't look at how government databases are used to deny employment to people. you can't get a job in the transportation industry with a record, even if the record is itself riddled with error, because of the transportation rules. those priorities would seem backwards to someone who has been denied employment in the nuclear industry or the transportation industry. so putting that in this context, i certainly think that any time there is an adverse consequence to an individual, we get to the point where there is room for a judicial intervention, an independent intervention. that's why i sort of liked what the president has done in adding
10:26 pm
the reasonable articulable suspicion standard for querying the 215 database, because that is the point at which some individual is pulled out of the mass for individual scrutiny, and that's when adverse consequences begin. i sort of like that as a transition point. >> mr. felten, you talked about institutions building ever larger databases and then aggregating them. i think you said today that there are inherent risks when the databases get larger and larger, and especially when they are aggregated. so i guess my basic question to you is, what are the principles -- we'll all take notes on this -- what are the principles that you recommend as a computer expert
10:27 pm
for protecting privacy amid the increasing use of technology in this field, all the way from collection on? what should they do? >> i think the first thing would be to try to look beyond the most brute force approach, which would be to collect all the data that might be useful and retain it all as long as you can. the more data you have, the more you collect, the greater the adverse consequences could be,
10:28 pm
the greater the target it is for abuse or for breach. the first principle is to try to fit the practices to the mission -- to think about the way you can construct things to do the needed analysis while collecting less data, holding data more separately, and preprocessing or minimizing the data first. there is a growing array of technology that can do this. unfortunately, this becomes a technical problem. so the key principle here is simply to insist that that technical work be done, to try to architect the system to collect and hold a minimum of data.
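As one hedged sketch of what architecting a system to "collect and hold a minimum of data" can look like in practice -- the schema, key handling, and coarsening choices below are assumptions of this example, not anything the witness described:

    # A minimal data-minimization sketch: pseudonymize identifiers with a keyed hash,
    # coarsen timestamps, and keep only aggregate counts instead of raw event logs.
    import hmac, hashlib
    from collections import Counter

    SECRET_KEY = b"rotate-me-regularly"   # hypothetical key held by the collecting system

    def pseudonymize(identifier: str) -> str:
        # keyed hash so raw identifiers never need to be stored
        return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

    def minimize(raw_events):
        # raw_events: iterable of (identifier, hour_index, category) -- hypothetical schema
        counts = Counter()
        for identifier, hour, category in raw_events:
            day = hour // 24                       # coarsen time to the day
            counts[(pseudonymize(identifier), day, category)] += 1
        return counts                              # store only this, not the raw events

    events = [("user-a", 100, "login"), ("user-a", 101, "login"), ("user-b", 130, "upload")]
    print(minimize(events))

The point of the sketch is that the analysis the agency says it needs (counts by pseudonym, day, and category) survives, while the raw identifiers and fine-grained timestamps are never retained.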
10:29 pm
>> who would do that? >> in my view, if a government agency wants to argue that they have a need to collect and use certain data, there should be some onus on them to justify the technical practices they're using -- the amount of data collected, the way they're organizing it, and so on. they should be prepared to discuss these issues and offer a technical justification. when it comes to private parties, that's a more complicated discussion. i think that the best practice in industry ought to be to do that as well, although obviously the legal and market mechanisms that drive that relationship are very
10:30 pm
different. >> you have talked about the element of control being so essential, but other people who have written in the field have said that it certainly can't be an absolute; there has to be balance. we wouldn't be able to have any kind of national security programs if we read everybody's control that broadly -- i'm keeping control of that piece of information because i don't want anybody to have it. how do you -- you pointed out that even in the fourth amendment there is a reasonableness clause
10:31 pm
which gives you a balancing framework to talk about. how would you handle that? and then everybody can take a whack at it. >> the drafters of the fourth amendment did that balancing for us and told us what the government has to show to override the privacy right, and that is probable cause of criminal activity. apart from some very narrow exceptions that the supreme court has recognized, some of which are controversial and some of which are not, you have to get a warrant based on probable cause. within those exceptions, there is room for balancing, and that's part of the reasonableness analysis. what i would say about that is, first of all, the courts do their balancing when they do a
10:32 pm
review, but congress has a role as well. and when congress does the balancing on behalf of the people, i would agree with what i believe dan said, which is that this is a choice for the public to make. it needs to be a public choice, and it needs to be an informed choice -- not a choice that's made in secret by a small number of officials, but one made by the public, because this is a democracy. so we need to have the information about what the security threat is, how that threat could be mitigated by the collection of this information, and what exactly the effect is going to be on either side. the other quick point i would make is that in balancing tests, national security is too often a trump card: the words are uttered and we're done. julian sanchez from the cato institute made an excellent point, which is that when you look at how courts weigh national security against the individual interest in question, they tend to weigh national security at large against that
10:33 pm
person's particular interest in that information. that's not the right comparison. you either need to weigh that person's particular interest in that particular information against the incremental threat to national security in that case, or you need to weigh national security at large against the values that privacy serves in our society. when you think of it that way, national security really shouldn't be a trump card. if we talk about these values as being in competition, i think the evidence for the most part shows that targeted surveillance is more effective than dragnet surveillance. but when they are in conflict, there needs to be a fair and public balancing. >> in thinking about these issues of control, it's important to recognize the ways in which people try to reassert control even if they don't have it legally. i'm
10:34 pm
thinking of protective measures people use to try to obfuscate certain behavior, or the way people deliberately do certain things to project a certain kind of image to whoever they worry is looking at their data. you need to take a look at the way resources are spent, and sometimes really wasted, in a kind of arms race between self-help and strategic behavior on one hand and attempts to overcome that on the other side. those costs can often be substantial. just ask any teenager about their on-line use, and what you'll hear about privacy is an elaborate story about technical
10:35 pm
counter-measures and strategic behavior. >> paul? >> control is in tension with government collection, whether it's a rule imposed by the irs or law enforcement or national security. that doesn't mean it's not an important value, but it is one that, when you set it as a touchstone, puts you in tension with government action in a host of areas. i'm sitting here as a republican on the panel thinking of all the friends i have who are second amendment people, who think the government should not collect any information about their gun ownership, and, you
10:36 pm
know, that's a perfectly reasonable position for them to have. it's just not one that the government -- that we currently accept in society. the proportion of production without warrants was actually high, on the order of 50%. i don't know if that's changed much, because it's been a while since i was a prosecutor, but that was true of many, if not most, of our typical cases. and i seem to recall that judicial review is not always a pre- as opposed to a post-collection activity. >> dan, you have the last word.
10:37 pm
>> a few really quick points. first of all, even if you can't always give people total control, there are certain partial forms of control you can give people. the other thing is that it's not just about people being in control; it's about the uses and the gathering of the information being under control. that's another important thing -- appropriate oversight and accountability and controls on that gathering, too. on the fourth amendment, i think it would be wrong just to track existing supreme court interpretations of the fourth amendment, which i think are flawed in a lot of cases. there are a lot of exceptions to the warrant requirement, a lot of instances where the fourth amendment doesn't even get applied at all, because the court has this platonic conception, and that's where we get a lot of bodies of law
10:38 pm
that take the fourth amendment away from that kind of problem-focused approach. i think it involves a utilitarian kind of balancing. the text speaks of the right against unreasonable searches and seizures. that means that any time the government is engaging in searches and surveillance and gathering information, it is unreasonable if it's creating problems that are not adequately dealt with by the right amount of oversight and accountability. that's really what the fourth amendment is trying to impose there: a warrant and probable cause, or appropriate oversight to make sure that an independent judicial body looks at what the government wants to do and evaluates it. i think it's very important that we conduct the balance between privacy and security appropriately. i'm not a privacy absolutist.
10:39 pm
i think there should be a balance. but i think it's very important that when we balance, we balance correctly and don't skew the balance too much toward security by weighing the entire security interest, because it's not the entire security interest that goes on the scale. it's the marginal difference between the security interest without certain kinds of oversight and accountability and the security interest with oversight and accountability. the church committee produced a very informative public report about that. congress hasn't done anything like it since. i think it should. i think the judiciary has a role to play, i think this body
10:40 pm
has a role to play, and the people are the key to all this; they have a role to play. >> thank you. we'll now have 20 minutes of questioning from my fellow board members. i'll start with the chair and then i'll move down. >> thank you. liza raised a question earlier that touches on this. it's mostly on the question of privacy and how some people rely on practical obscurity -- it used to be too complex or burdensome for the government to gather the information. in some ways, in the computer age, we're beyond that: the court file that was gathering dust is now easily accessible, public records are widely available, and the government can access commercial databases. how do we look at privacy when
10:41 pm
the information is out there, it's publicly available, and yet you can combine it into a mosaic and it can create a very detailed profile? should the government be collecting that information? what standard should we apply in this context? what's the katz 2014 version of how the government should recognize privacy issues? i'm happy to just go down the line. >> whoever wants to take it. i might note that, timewise, we're going to have about five minutes per person. if you could keep your comments relatively brief, we can make sure everybody gets the full complement of time. >> to be brief, i think right now the mosaic theory that we see in the concurrences in the jones case in the supreme court is starting to get at this very
10:42 pm
question. i can't really answer it in a few seconds, but when we combine certain pieces of data, what are the implications of that, and when does the combining of that data reveal new information that can create certain harms to people? that's where we want to step in. >> i would make two quick points. the first is, of course, that practical obscurity is itself a sort of post-industrial concept. if you were in the medieval village, there wasn't practical obscurity: everyone knew who you were, and they knew everything about you, pretty much. the data aggregation systems change that. we're advancing a value that has come to be something we value more, and one that i agree with. i think dan is exactly right. the mosaic is real. to deny that is to deny the reality of the science that ed knows. and so it strikes me that regulation has to come at the
10:43 pm
use of the aggregated data. you can't do it at collection, because the uses are so vague it's impossible to stop, unless you're going to stop google from collecting. it has to be when the government chooses to aggregate it, or perhaps chooses to act upon the aggregation. >> i agree with what's been said about the mosaic theory, and the other way to look at it is that the information being gathered by the government is, in fact, information that, using normal powers of human observation, would be in a person's control and would not be something the government would have access to. the one thing i would say is that i don't agree that the point of collection is a moot point,
10:44 pm
because the mere fact that google has all this information, that facebook has all this information, doesn't mean the government has all this information. the uses have not all been decided -- such as uavs and how the government will be able to deploy uavs -- so there is plenty of room to regulate at collection. all the discussions we had earlier about chilling effects and what privacy means to different people -- i think that's where the privacy interests arise. >> i'll be very brief as well. along with what the other panelists have said, i would also point out that much of the information that is in corporate databases is information that was observed rather than disclosed, and there is not always consent, or often the consent is very thin, from the person the data is about. and so i don't think you can always infer that there was
10:45 pm
awareness. you can't infer that a user was aware that the data was collected, or that they were aware it might go to the government and be used for government purposes. >> and, therefore, should the government not collect the information under those circumstances? >> well, i hesitate to offer a legal opinion here, not being a lawyer. but i would say, as a policy matter, i get very nervous when it appears there is a legal fiction that something has happened when it clearly has not happened. so a fiction of consent, or a fiction that the mosaic effect does not exist, is troubling. my time has expired. >> rachel brand?
10:46 pm
>> thank you all for being here, first of all. going back to this notion of control that was talked about -- you went to the fourth amendment concept -- i'm interested in whether the notion of control that's embodied in the fipps, which is more of an individual participation concept, can apply in the national security surveillance context. i just wanted to know -- one of you said individuals are unlikely to say, i consent to being investigated by the fbi or anyone else, and if that were the standard, you couldn't have surveillance programs. the fipps sit on top, obviously, of whatever the fourth amendment's baseline is; the fipps would impose restrictions on the government agencies. can that apply at all in the national security context? what's your view on that? >> i think it can apply, but i'm just sort of pausing because i'm thinking about
10:47 pm
some premises of the question. it's not the case that you couldn't have surveillance programs if people did not consent to the disclosure of their information; the government can obtain your information with a warrant based on probable cause. >> my point is we're beyond the fourth amendment now. we're layering on top of the fourth amendment the fipps kind of individual participation. the reason i ask is, for example, when the nsa published their report on targeted data collection under 12333, they said they were applying the fipps, but then they turned around and said the individual participation concept does not apply, so we're not applying that part of it. what i'm wondering is whether the fipps are just not the right framework to apply, or does this individual participation element just not apply, or should we look to another framework? >> if you don't mind, i'd like to think about that question, and maybe i can put it in writing along with my testimony. >> okay.
10:48 pm
>> i have a thought on it. i think the fipps model has some flaws. a lot of times people don't read companies' privacy policies, and i'm not sure just providing a notice is effective. so we do need to think about what works in this context. i think the key is that in certain cases we might want individuals to play a greater role. with the tsa, if you're on the no fly list, i think you should have a right to be heard; there should be rights of redress there and a way to challenge your being on that list. so there, some of the fipps make a lot of sense. some of the fipps, like security, make sense; others might not. but i think the larger component of all this is that there's adequate control and accountability, which is also part of the fipps. so everything within the fipps, such as individualized notice of everything that's collected, is not really
10:49 pm
feasible. there is a broader transparency right in the fipps, too -- not that individuals get notified of every collection about them, but that there is public accountability and generalized exposure of what's going on. >> i thought that the acknowledgment in the nsa report that some of the fipps principles couldn't be fully implemented in the context of a national security surveillance program was an absolutely accurate statement of reality. you can't provide notice and individual error correction in every instance. i was talking more about the secondary screening and the no fly list, where we do have more robust rights. but the challenge for you is going to be trying to figure out what the underlying values are and how to get at those. so in this context, i think the underlying value is prevention of governmental abuse.
10:50 pm
that's the fear that animates everybody in this sphere -- government surveillance modifying behavior. and the types of accountability and transparency that you have to help build are ones that meet the needs of the national security system while providing protection. we tried that with the intelligence committees and the post-church committee modifications, something we might call delegated transparency, where we all trust the congress to do it right. it seems as though we're less willing to do that now. personally, i'm not so certain that that's a good impulse, but things like that. so maybe it's this board. maybe it's a judicial panel with a cleared advocate in front of it. lots of mechanisms for participation could be imagined that would achieve the
10:51 pm
objective of controlling against governmental abuse in this realm while not completely frustrating the necessities that most of us share. so i think a lot of it would be thinking through, in advance, which use case scenarios are legitimate, and hence which privacy protections can be made technological. it seems like there should be a way to further those goals. so, for example, if you don't have the right to control or correct
10:52 pm
data, you can imagine asking for greater effort to ensure the correctness of the data as it is, or extra safeguards ex post regarding the possibility of error. >> thank you. >> did you select -- >> very briefly. >> part of what i am struggling with is how much we are giving up on collection, which i'm not quite willing to do, and wanting to go back to that issue of surveillance and control of information. i still want to go back and look at something i haven't thought about enough. we see it all the time on the hill, so i want to go back and look at privacy. it sounds to me like the best approach is -- [ inaudible ] but i want to remark --
10:53 pm
[ inaudible ] >> we're glad to accept later submissions to any of the panelists. before we go on to beth cook's questions, i want to remind the audience: if you have any questions, write them down and bring them up to me, and then i will -- okay, they are coming. good to know. >> thank you all for what i thought was a very, very interesting panel, and i hope it bodes well for the rest of the day. in fact, i think a lot of the later panels will be dealing with exactly how you translate this -- is that the right translation, does that really work in the government context. i was also struck by the numerous mentions of the mosaic theory. obviously there are other implications, one under transparency, which is that to the extent we are transparent in seemingly discrete ways, our adversaries are also looking
10:54 pm
to aggregate information. and i think there is an argument that the mosaic theory is critical there: you need to understand collection to understand exactly how the national security apparatus works, and to do that they have to be able to aggregate information. you can agree or disagree, but i was struck by the different implications. so i wanted to start with you, professor. i was really interested in your notion of moving away from the brute force mechanism, and i think the section 215 program is one where the government has made the argument, essentially, that they need the brute force collection and they need the retention in order to identify previously unknown links. have you given thought to whether or not there are technological options available
10:55 pm
to limit collection for a program like section 215? and if you haven't, then more generally, could you be more specific about collection options? >> sure. yes. well, with respect to section 215, the data of course is collected initially by the phone company. right. and then there is a question as to whether the information needs to be transferred in bulk to the government, and i think it is clear as a technical matter that the kind of looking for links that the intelligence agencies want to do can be done, technically, while the information is still held by third parties such as the phone companies. this requires a modest amount of technical coordination between the companies -- the entities holding the data -- and the entities that
10:56 pm
are doing the analysis. so there are opportunities to match, to look for whether there are paths -- hops from point a to point b, et cetera -- and then to reach in and extract the phone numbers that are highlighted by that analysis. that's the kind of thing that can be done. there is further work that is more technical, which goes to questions of how you can use, say, advanced cryptography to do that same analysis while not disclosing to the phone company information about which numbers are searched or linked. those sorts of methods are, i'd say, developing, and there has been some interest in the technical problem of how to do this in the independent research community, in light of what we have learned publicly about the section 215 program. and methods can often be developed for a specific problem like this.
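A minimal sketch of the kind of bounded hop analysis described here, with the records staying at the carrier and only matched numbers leaving; the data, class names, and query interface are invented for illustration, not a description of any actual system:

    # The carrier keeps the call records and returns only the numbers within a
    # bounded number of hops of an approved seed, rather than exporting the
    # whole database in bulk.
    from collections import defaultdict

    class CarrierRecords:
        def __init__(self, call_pairs):
            # adjacency list of who called whom; held by the carrier
            self.graph = defaultdict(set)
            for a, b in call_pairs:
                self.graph[a].add(b)
                self.graph[b].add(a)

        def contacts_within(self, seed, max_hops):
            # breadth-first search limited to max_hops; only matched numbers leave
            frontier, seen = {seed}, {seed}
            for _ in range(max_hops):
                frontier = {n for x in frontier for n in self.graph[x]} - seen
                seen |= frontier
            return seen - {seed}

    carrier = CarrierRecords([("555-0001", "555-0002"), ("555-0002", "555-0003"),
                              ("555-0003", "555-0004")])
    # the analyst submits an approved seed and receives only the two-hop results
    print(carrier.contacts_within("555-0001", max_hops=2))   # {'555-0002', '555-0003'}

The cryptographic variants mentioned in the testimony would go further, hiding even the seed number from the carrier; that is beyond this sketch.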
10:57 pm
>> i think our biggest challenge is taking the concepts we are talking about today and developing practical, feasible recommendations that can actually be implemented. the more concrete and the more specific we can be in our recommendations, the more likely they are to be implemented. briefly, to both of the professors in the middle, i would ask: you both talked a little bit about risk mitigation, and assuming that there are going to be harms, how you mitigate them past the collection stage. what have you found to be the most effective mechanisms for mitigating risk? is it retention periods? is it access control? is it audit trails? what can the government do concretely to start mitigating risk? >> i think it's not really just one thing you can point to and say, that's it.
10:58 pm
all of those things are very valuable to do -- everything from mechanisms to ensure that the information is accurate. when data is grabbed from one context and used in another, the accuracy that's good enough for amazon.com to recommend books for you is not the same accuracy we might want from the government. if amazon makes a mistake and recommends the wrong book to you, big deal; it doesn't need a hundred percent accuracy for that. but the level of accuracy we need differs as the context differs, so we need mechanisms to ensure that when information is taken from one context and put to use in another, it is appropriately accurate for that particular context. we need analysis of how long we keep data, audit trails to make sure it is not improperly accessed, appropriate accountability to make sure it is kept adequately secure, and controls on how it is being used, so it can't be used for just any purpose ten years from now.
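A small sketch combining several of the mechanisms just listed -- a retention limit, a purpose check, and an audit trail written on every access attempt. The policy values, purposes, and record format are hypothetical, not drawn from any actual program:

    import time

    RETENTION_SECONDS = 180 * 24 * 3600            # hypothetical 180-day retention period
    ALLOWED_PURPOSES = {"counterterrorism-query"}  # hypothetical approved purposes
    audit_log = []                                 # in a real system, append-only storage

    def access_record(record, analyst, purpose):
        now = time.time()
        expired = now - record["collected_at"] > RETENTION_SECONDS
        allowed = purpose in ALLOWED_PURPOSES and not expired
        audit_log.append({"when": now, "who": analyst, "purpose": purpose, "granted": allowed})
        if not allowed:
            return None                            # deny: wrong purpose or past retention
        return record["data"]

    record = {"collected_at": time.time() - 10 * 24 * 3600, "data": "call summary"}
    print(access_record(record, "analyst-1", "counterterrorism-query"))  # granted
    print(access_record(record, "analyst-2", "marketing"))               # denied
    print(len(audit_log), "audited access attempts")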
10:59 pm
so we need all of these different things, and oversight from a lot of different bodies, i think. it is actually a complex thing with many, many parts. >> there are certainly many moving parts. >> there is a culture of compliance -- that is, pre-error mechanisms -- then obviously a lot of audit and compliance work, outside inspectors general, and then congress. and finally, and this is perhaps where we fall down the most, a willingness to impose at least
11:00 pm
administrative sanctions on people who vary from the accepted rules, at least in a willful context and perhaps even in a negligent context. nothing attracts the attention of a government employee so much as the prospect of losing his job or being, you know, suspended for a term of months. so that would be where i would focus. >> if we look at the failures of compliance that have been acknowledged, we see that many are failures to make the technical systems behave consistently with internal policies, and this is a case where oversight can operate without needing to get deeply into the nuts and bolts of technology -- asking what processes are in place and what
