tv   Key Capitol Hill Hearings  CSPAN  March 17, 2014 10:00am-12:01pm EDT

10:00 am
we've got snow here in the nation's capital covering much of the mid-atlantic region, here in washington, d.c., between 5 and 10 inches. most of our live programs were canceled. this happens to be right before a planned recess; members will return for legislative work next week. in news today, we're learning that president obama signed an executive order imposing sanctions on russian officials within the government responsible for the situation in ukraine. the white house released a statement saying today's actions send a strong message to the russian government that there are consequences for their actions that violate the sovereignty and territorial integrity of ukraine, including actions supporting the referendum for crimean separation. president obama is also welcoming a guest at the white house.
10:01 am
they're expected to talk about israeli/palestinian peace negotiations. we'll bring you any on-camera remarks should they become available later this morning. coming up on c-span in a couple of minutes, a discussion hosted by stanford university on the balance between privacy and security concerns, and then we'll join a panel talking about world hunger and obesity. later, a house hearing on u.s. policy toward taiwan. >> for st. patrick's day, we turn to the video library to see how u.s. presidents mark the holiday. >> despite all of this, tip wanted me here. he said since it was march 17 it was only fitting that someone drop by who actually had known
10:02 am
saint patrick. [applause] [laughter] that is true. i did know saint patrick. in fact, we both changed to the same political party at about the same time. once you have had a glass of guinness with a man in ireland, as i have with brian monahan, you are friends. >> more than 35 million americans claim irish ancestry.
10:03 am
america is richer. >> happy st. patrick's day to everybody. tens of millions of americans trace their roots back to the emerald isle. on st. patrick's day, many millions more claim it too. >> i have concluded; i will be glad to yield. >> were you on the floor at that time? i looked at both sides. the only gentleman on that side that even made a move was mr.
10:04 am
walker. he did not rise. if anyone had risen, i would have recognized him. i did not mean to suggest you were not acting with fairness. i was suggesting that we had a deal, that you had made it with the president of the united states. you said i had a fast gavel. it was not. [indiscernible] there was not a man on either side of the aisle that stood. >> i suggest that i did not mean to offend you. >> you have offended me. i will accept your apology. >> see 35 years of house floor coverage on our web page.
10:05 am
created by america's cable companies 35 years ago and brought to you today by your cable service provider. next, a discussion on balancing privacy and security, concerns about data collection by the government and private sector, and the regulatory issues that arise in today's environment. speakers include an official from the justice department in california and a professor at princeton university, as well as attorneys who represent target, whose data breach is expected to have affected 70 million customers. this was hosted by stanford law school. it is about two hours. >> we want to give you one reminder. this is being taped for c-span. that explains the lights.
10:06 am
if you'd like to interject, we will try to get a microphone to you to help facilitate the recording. keep that in mind; because of it you may want to think twice about what you're saying before you say it. [laughter] we are delighted to have a great panel. i am here in california. i'm a technology lawyer and a certified privacy professional. enough about myself. joining us from new jersey, our first panelist has a pretty interesting blog on his website. i will let him introduce himself a little more properly. to my immediate left is someone from the
10:07 am
department of justice here in california, the head of policy and training for privacy issues. she knows more about privacy and policy issues than i could ever get into my skull. to her left is michelle. she is the chief privacy officer of mcafee and also a certified privacy professional. she is one of the authors of a brand-new book that came out this week, which i am sure she will plug at some point: getting from policy to code to value, available from your nearest amazon.com website. next is my partner and colleague. he is a litigator with our firm, doing litigation on
10:08 am
everything from data breaches to the sony cases and others. i learned how to do some of the things i do on a daily basis from him when i was a young associate. each of you will start out with five minutes of discussion to set the stage and to explain your perspective a little more. then i'm going to shut up and ask them to take it away. >> great. thank you. i am sorry i cannot be with you there in person. let me give a very brief introduction to give you some context on what i will be talking about.
10:09 am
i am a professor at princeton in computer science and in the woodrow wilson school, which is our public policy school. i'm a computer scientist by training. i have done work on privacy from both a technology side and a policy side. obviously, recent events have drawn a lot more attention to issues of privacy and big data. we hear about data breaches, leaks of data from corporate data centers. we hear about leaks of data from apps and other kinds of technologies that we use. we're hearing a lot about collection and use of data by government. we are realizing increasingly that we leave a very big and diverse trail of data behind us as we move through the physical world as well.
10:10 am
it is no longer the case that the kind of pervasive data trail and information collection that we are used to in the digital world is only in the digital world; now the digital world and the real world are fused. we leave these records behind us. this consists partly of data that we provide willingly to people. the data that we provide is sometimes available for collection or capture on a device we own, say data you enter on your phone or computer, or data you send via e-mail. it is sometimes available for collection or capture at our own phone or computer. it typically traverses the internet to some kind of server affiliated with the
10:11 am
service that we are using or that is somehow connected to our activity. that data is then typically at rest at the service, potentially available there for use or to be captured. in transit, the data is also often available to be captured; we rarely see it protected. there is a lot of data that is collected about our behavior and activities. almost any time that you read something online, information is collected about the fact that you read that particular piece of content, whatever it is. information about what we drive and what we do in the physical world is often available for
10:12 am
capture. what does this have to do with big data? does it matter that this collection of data is big? what is important is that data allows inferences to be made, so that when a piece of data is provided it reveals not only what is written on the face of the data but also everything that can be inferred from it. if we are going to understand the privacy implications of data, we need to think carefully about this issue of inference. the problem is that inference is notoriously difficult to model or understand or predict. even experts have a lot of trouble understanding what the implications of releasing that information might be.
10:13 am
one inference can enable more inferences, a kind of chain reaction. part of the reason it is difficult to understand and model, and difficult to protect ourselves, is that it is hard to think about the chains of inference that we can set off by revealing one more not-very-sensitive fact. we have talked about government collection of information versus corporate collection. one of the things we have learned in recent months is that government and corporate collection are really connected more tightly than we thought. it is increasingly clear that government often follows a strategy of piggybacking on top of corporate collection: waiting for a company to collect data and then going to get it, or capturing information off of a network where a company is conveying that data from your computer to their
10:14 am
server. if it is about what you are doing or where you are or who you are, it is susceptible to being captured. probably the biggest thing we have learned from the government revelations lately is that when we talk about big data, big is bigger than we thought. there are datasets being analyzed by governments that go beyond what many of us expected was going on. this sharpens the issue for us as well. let me pass it off to my fellow panelists. thanks. >> thank you. good evening. very happy to be here. my training is as a medievalist. the connection is that
10:15 am
privacy is a byzantine world. i've been working in privacy since 2000, when i headed up the former office of privacy protection in california, which was a small state office created by the legislature. it was originally part of the department of consumer affairs. it was a consumer education and advocacy organization; it did not have any regulatory authority. from its first operations it was part of the beginning of a real flourishing of consumer privacy legislation passed in california, which continues to this day, even as we speak. california has been acclaimed in some circles as the leading state in consumer privacy protection. in some other circles they describe it differently. we have a lot of privacy
10:16 am
statutes. we also have a constitutional right to privacy; it is the right to pursue and obtain not just happiness but privacy, safety, and happiness. i hope we're getting our full share. the office of privacy protection ceased to exist; it was eliminated at the end of 2012. not coincidentally, the attorney general at the time created a new privacy unit, along with some of my former staff. it represented a real advancement in privacy protection by government in california. it carried on the same kind of education and policy advocacy work that was done in the old office and added accountability
10:17 am
and enforcement. in addition to educational activities, we also enforce privacy laws, both state and federal laws that allow attorney general enforcement. it is particularly significant that this level of resources is dedicated to privacy protection by the attorney general's office. most of our privacy statutes in california are left to the attorney general and district attorneys to enforce. most of them do not have a private right of action. the law depends on an initiative that allowed individuals to use it only if they can demonstrate monetary or property damages, which is often difficult to do in describing a privacy harm. that leaves it to enforcement by das and ags.
10:18 am
we are interested in enforcing many of our new and older privacy laws. looking at bringing about compliance with fair information practice principles, good privacy practices, and respectful companies, we use more tools than just enforcement. there is encouraging businesses to practice to a higher standard. we educate consumers and empower them not only with information about their rights but also with information and strategies to protect their
10:19 am
privacy even when they do not have legal rights, which is often the case. finally, enforcement. enforcement can be very educational; it can develop enforcement policy. we are looking at practices that we wish to raise to people's attention, to point out the wrong way to do things and suggest better ways to do things. i guess we will talk about defining big data. here's my definition: a whole lot of data about an individual such that, even if no one of the data points is personally identifiable, the accumulation of them makes the whole stream potentially identifiable. our basic legal structures for
10:20 am
providing consumers or individuals with control over their information depend on consent and deal with restricting it in certain ways. when the data is not necessarily personally identifiable, the whole structure does not work well. that is where we are with big data. on big data and privacy protection in general, i am interested in regulatory innovation. so many things that affect our personal information are changing so fast, not just things like big data but new business practices, that the laws really cannot keep up. there are things we can do to keep refreshing and expanding them, but it is hard for them to keep up.
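The point that individually innocuous data points can accumulate into something identifying can be illustrated with a small sketch; the records and attribute values below are invented for illustration, not drawn from any real dataset:

```python
# Hypothetical toy dataset: no single field is identifying on its own.
records = [
    {"zip": "94301", "birth_year": 1970, "gender": "F"},
    {"zip": "94301", "birth_year": 1970, "gender": "M"},
    {"zip": "94305", "birth_year": 1970, "gender": "F"},
    {"zip": "94301", "birth_year": 1985, "gender": "F"},
]

def matches(dataset, **attrs):
    """Return the records matching every given attribute."""
    return [r for r in dataset if all(r[k] == v for k, v in attrs.items())]

# One attribute alone leaves several candidates...
print(len(matches(records, zip="94301")))  # 3
# ...but the accumulated combination narrows to a single record.
print(len(matches(records, zip="94301", birth_year=1970, gender="F")))  # 1
```

This mirrors the familiar quasi-identifier result: combinations of fields like ZIP code, birth year, and gender can single out an individual even when each field by itself looks anonymous.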
10:21 am
there are proposals being discussed for alternatives to command-and-control laws and regs. before i pass it to the shameless promoter of her book, i was very disappointed at a technological and marketing failure on the part of amazon, who i understood was going to anticipate what i would want to order in advance. they should have gotten it to me in advance; i actually had to order it. >> there was no drone? >> i was hoping for a drone strike filled with privacy manifestos. thank you for coming. there are so many friends in the audience. i am excited to be here. i am the chief privacy officer at mcafee. anything i say can be attributed
10:22 am
to me. [laughter] i started out as a young lawyer at a great law firm, as an intellectual property lawyer looking at data. one of my first big cases was in the realm of the aids cocktail drugs and the pharmaceutical hunt for ownership over molecules. the molecule that actually worked was special because it had a mirror-image version: if you hold up your two hands, they are alike but not identical. it turns out that one version was effective and one was not. it was new and not obvious. that was the case. that is interesting when it
10:23 am
comes to datasets. things that everyone knows can sometimes result in analytics that create absolutely, fundamentally new advances. it became a path where we could treat a disease, a malady with no therapies available at that time. here we are, talking about big data all these years later. i wasn't going to shamelessly plug, but now i will. i wrote this book with jones and fox, who some of you may know and love. i think big data is the ultimate in bringing together artistry, law, policy, and the consumerization of policy.
10:24 am
i'm going to read you a couple of paragraphs that shaped my thinking in this area. the world is certainly flat. everyone said so. the church said so. your wise old aunt said so. everyone except for a few explorers, scientists, and artists who looked at the sky and noticed that the ground and the sky always met, but for brief glimpses before the observer wandered closer. it seemed to suggest that there might be something more than dragons beyond the edge of the world. the world was not flat. there was an endless set of new possibilities to discover. privacy is certainly dead. everyone said so. rich people with big boats that sold stuff to the cia said so
10:25 am
. you know who you are. founders of hardware companies; he knows who he is. someone who blogged said so, or tweeted it. everyone said privacy was dead. everyone except for a few explorers, inventors, children, parents, and government regulators who looked at the endless sea of data and still could see how a person can be distinguished from a pile of metadata. this is true for people who wish to decide for themselves the story they wish to tell about themselves and to whom. these people see a different horizon. the privacy engineer sees this when they are combined to create value.
10:26 am
this is how we start out the book. i was not going to shamelessly plug it tonight, but it really is the culmination of 15 years of thinking in partnership with security architects. our foreword is written by the ceo of an artificial intelligence analytics company. i think there are more and more identifiable moments that are happening out there. there was a hearing here. we have the elements for doing something that could be enriching. >> i have been a litigator in
10:27 am
boston since 1985. i am intrigued that we're going to work on even defining what we call big data. the intellectual property connection is part of this for me. many years ago i started with a trade secret case for the person who invented the pump sneaker. there is a trade secret law principle that carries through to this context for me. it is a very fundamental rule: in order to assert your rights, you have to keep it secret. if you do not keep it secret, it is gone. your rights go with it.
10:28 am
just like data: once your data is out there, it is gone. whether it is even possible to reel rights back is part of what everybody here is talking about. i have been working on data breach and data security cases ever since the tjx case seven years ago, which started a lot of attention in this area. it was the largest cyberattack of its kind. it stayed that way until the payment systems case came around in 2009. we had the good fortune to work on that case. this includes the criminal intrusions at sony a couple of years ago.
10:29 am
a motion in the case that followed was granted yesterday. we work for wyndham, where my colleague is challenging the authority and the areas that are claimed by the ftc: the effort to claim the ability to enforce reasonable security standards in the hotel franchise context. it is a very interesting time to be here and talk about some of the things that are in the news, and that have been debated in the news, in the last few years.
10:30 am
my general perspective is that i look at these big data issues from the perspective of someone in the trenches as a litigator dealing with attacks against bodies of company data, aggregations of company data. we get called in at a moment where companies are faced with the challenges of dealing with the reality of finding out what has happened and what exact data is at issue. there are regulatory inquiries, class action complaints, third-party complaints, and obligations. there is extreme pressure to act very quickly and to do it right.
10:31 am
it is a very challenging and interesting task that we have gone through a number of times. my introductory comment on this is that in the litigation context, which is where i live, analysis all starts with injury. if any information about an individual is taken or exposed or lost, of course we will say "what is the injury?" the recent clapper decision, involving a challenge to fisa provisions and surveillance, came down in march of this year. the supreme court makes clear that in the federal courts we're not talking about a risk of injury. we are not talking about the
10:32 am
possibility of misuse, not even an objectively reasonable expectation of misuse. to get into federal court you need actual injury, or injury that is impending. that is article three, to get your foot in the door. even if you can satisfy that standard, you still need to have enough of an impact, enough of a recognizable injury, to meet the elements of the cause of action that you are trying to assert, whether it is negligence or breach of contract or a level of injury that might be necessary under the statute. the fact that you are presenting a grievance does not create a
10:33 am
situation where a court will intervene. courts will not engineer the situation for you with a remedy unless you have this. that is what litigation is, and companies know how to deal with that. cases revolve around whether there is real injury, and a neutral judge decides what the standards are. companies can work with that. the other main actors in this space are the regulators. they are usually operating under statutes prescribing unfair acts and practices. there are more specific statutes about privacy and data
10:34 am
security issues, with different language. it is different, something we can discuss as well. in that context, where you are talking about the practices, the injury discussion often seems different. in a deception case you talk about materiality instead of injury. many of the cases do not talk about out-of-pocket loss. you sometimes hear concern not about what happened but about what might happen. as professor felten said earlier, as we deal with big data, inferences are things that are difficult to model and project.
10:35 am
that is what you hear people expressing concern about. you hear discussion and use of words like inconvenience and unpredictability. creepiness. worry. warranting a basis for possible action. that would not be enough for a court. there are almost no judicial decisions in the regulatory sphere validating or addressing the difference in approach as it is applied by regulators. you do not have a body of case law where courts have reviewed and adjudicated determinations under that rubric, not in the same way as
10:36 am
you do in courts, obviously, at least in the data security context. i think that lack of judicial authority explains the difference you see in the approach being discussed, and sometimes motivating what is being implemented, in the regulatory sphere. it is part of the difficulty that big data faces. predictability is important for any business. these companies are growing hand over fist. what are the rules? what are the risks? what can i do with this data? how responsible will i be if a breach occurs? it is less clear in the regulatory arena than it is in
10:37 am
the traditional area. >> thank you. perhaps we should pick up from that theme. you all touched upon the harm piece of it. one of the things that seems to drive a lot of our thinking is: what is the harm of data collection and analytics and putting it to use in some way? you mentioned the inferences, as well as the shrugging that occurred after it was revealed that only about 30% of phone traffic was being collected.
10:38 am
does society think there is no harm in this? what are the potential harms that you are concerned about as you think about regulating? to either of you, to take it from a regulatory perspective. >> let me talk some about this from a technology standpoint. as i said before, it can be difficult to understand and figure out how data might be used, and to tell if it is being used in ways that are adverse to your interests. maybe data is published in a way that people do not want it published. you may not know about the job offer that never comes, or the invitation to something that
10:39 am
does not come, whatever it is. risk is really the rational way to think about a situation in which causality is believed to exist but it is difficult to tell the exact mechanism or way in which a negative consequence can manifest itself. it is a reasonable way to think in terms of risk: to think of a body of data being collected, being maintained, being used as being like oil in an underground oil tank. there are a lot of things you can do: inspect it to make sure it is safe, replace it periodically, so you minimize the risk that the oil will end up
10:40 am
leaking into the ground. you can never make that risk zero. if you want to minimize the risk, the best way to do it is to not have an oil tank in the first place. that does not mean you should not have oil tanks ever. it means you need to think in terms of risk and disclosure. that is the way many people think about these issues, and i think it makes a lot of sense. >> i would like to step back from the regulatory position to sort of a policy philosophy perspective. when we are talking about protecting from harm, what are we trying to protect? is it purely a matter of protecting individuals from harm, or the abuse or use that they did not want of their information?
10:41 am
there are also societal dimensions to privacy; respect for individual privacy, or the lack of it, has an impact on society as a whole. for a healthy functioning society we need to be able to treat individuals with a certain amount of respect for our dignity and our autonomy so that we can operate freely and we can deliberate. it is connected to innovation and academic freedom. there are harms to society as a whole from the massive collection of information about individuals that amounts to pervasive surveillance: harms to how we live and develop and how we operate as a society.
10:42 am
i think we can run ourselves down the wrong pathway if we define what we are protecting as simply protecting an individual from harm. that is one of the things we need to bear in mind: there are societal values and benefits to respecting privacy. what is the harm to individuals? i'm not speaking as a regulator, but i do not think our courts necessarily have come to recognize privacy harms that many of us would perceive. harms have been categorized into
10:43 am
things like surveillance and collection, like disclosure. there is the distinction between information that is public about me, such as stuff published in phone books or in public records that is left in a record or book someplace, compared to the same information being posted on the world wide web. now it is exposed to more people; it is searchable; it is a different degree of exposure. i cannot tell you how often people call us about that kind of information.
10:44 am
they are sure that there is a law, that it cannot be there without their permission, put there by someone they have no relationship with, exposed to all the world. i do not have the legal answers for how we can start to expand our definition of harm, both in the impact on individuals and on society. >> to follow up on that, don't we have laws already that help protect us from some of these things? it goes directly to some of the data collection issues, when you think about these data issues being used for various purposes. >> we do have laws that speak to some very specific types of harm, but not to some of the others that have a real impact on our lives and our society. >> i think that is an important point.
10:45 am
we are talking very provincially right now; this is only the u.s. view. if you are a practitioner with a .com after your name, you're walking onto a global stage, and it is a very different global stage. it is based on human rights and infringement upon your freedom; it is much broader. >> the future of ukraine must be decided by the people of ukraine. ukraine's sovereignty and territorial integrity must be respected. international law must be upheld. russia's decision to send troops into crimea has drawn global condemnation.
10:46 am
from the start, the united states has mobilized the international community in support of ukraine, with our allies and partners. we saw this international unity again over the weekend, when russia stood alone defending its actions in crimea. as i told president putin yesterday, the referendum was a clear violation of ukrainian and international law. it will not be recognized by the international community. today i'm announcing a series of measures that will continue to increase the cost on russia and those responsible for what is happening in ukraine. as authorized by the executive order i signed two weeks ago, we are imposing sanctions on individuals responsible for undermining ukraine's integrity. we're making it clear that there
10:47 am
are consequences for their actions. i have signed a new executive order that expands the scope of our sanctions. as an initial step, i'm authorizing sanctions on russian officials, on those operating in the arms sector in russia, and on individuals who provide material support to senior officials of the russian government. if russia continues to interfere in ukraine, we stand ready to impose further sanctions. we are continuing our close consultations with our european partners, who have moved ahead with their own sanctions against russia. vice president biden meets with leaders of our allies in lithuania, and i will be traveling to europe next week. our message will be clear: we have a solemn commitment to our collective defense, and we will uphold this commitment. further escalation will only
10:48 am
serve to further isolate russia and diminish its place in the world. the international community will continue to stand together. continued russian military intervention in ukraine will only deepen russia's diplomatic isolation and exact a greater toll on the russian economy. going forward, we can calibrate our response based on whether russia chooses to escalate or de-escalate the situation. i still believe there is a path to resolve this diplomatically in a way that addresses the interests of both russia and ukraine. that includes russia pulling its forces back to their bases, supporting the deployment of international monitors in ukraine, and engaging in dialogue with the ukrainian government, which has indicated its openness to pursuing constitutional reform as they move forward toward
10:49 am
elections this spring. throughout this process we are going to stand firm in our support. as i told him last week, the united states stands with ukraine as its people determine their own destiny. we will continue working with our partners to offer ukraine economic support. going forward, we will continue to look at a range of ways we can help our ukrainian friends achieve the universal prosperity and dignity they deserve. thank you very much. we'll be available for questions. >> [captioning performed by national captioning institute] [applause] [captions copyright national cable satellite corp. 2014] >> president obama making a
10:50 am
statement on the referendum in crimea, the people of the crimea region voting to join russia. the president also took the extra step of imposing a series of sanctions on a number of russian officials, all for their material support. we're asking for your facebook comments about the situation in ukraine; go to facebook.com/cspan to leave your comments. we plan to have more on this later on the network. but now to a discussion on privacy and security. >> ...is causing people to avoid data centers in the united
10:51 am
states. there is a difference in the rules that companies in the eu have to worry about; they worry about where the data will be. that is why we fought a war with england. that does not necessarily mean we have to do this. it lets us exploit the opportunity from big data. you have to make a choice between collecting the data in the first place so that it can be analyzed for its maximum value, and limiting the collection of data in deference to privacy concerns, eliminating it from
10:52 am
visibility for analysis. i think we have gone more in the other direction. following the floods in indonesia a couple of years ago, researchers were able to use data collected from people's cell phones to analyze where people were going and how they were reacting in relation to the floods. it has been a tremendous boon to studies to be able to get that data; otherwise we would not have this information. the concern is that we will be more restrictive in the united states. if we're going to focus on issues with the use of data, let's focus on them in a context
10:53 am
specific way instead of only at the collection stage. >> i think i hear in this discussion that a piece of it is risk mitigation, trying to get that risk as close to zero as possible. i think that most of us would agree there's not going to be a company that would take it to zero and say let's not put the tank in the ground. there is the harm of surveillance and the potential for harm. we have some laws that protect us. various folks in washington can take a look at big data to see how to balance these risks and potential harms with the opportunity. there is early experience as a
10:54 am
breach litigator, and there are geographic trends. how would you advise the president to balance that risk and opportunity with the harm? how would you do that? >> i think it is too much to ask the president to solve this, but there are some important public policy things that can be done. start with some pretty fundamental concepts: try to strengthen the idea of consent for consumers
10:55 am
and citizens, especially in a commercial context. right now we often have this phenomenon that i call privacy theater, in which someone pretends to disclose to me how my data is going to be used, and then i pretend that i have understood and consented to that. that does not really help anybody. i think getting closer to something like real consent is helpful, instead of allowing people to pretend there has been a meeting of the minds about what the expectations are. i think that outside of policy we're going to see more technical self-help by end users: devices or software that encrypt or that add noise to the data, by either obscuring information or even behaving in a way that is
10:56 am
causing incorrect data to get collected. we are going to see all types of tactics play out. we're going to see a rise of privacy-protecting projects online. we're going to see more sophisticated ones. when it comes to big data, there's not only sophisticated theories on how to analyze the data, there are also sophisticated theories on how to modify and manage the data in ways that further protect privacy. i think we'll see consumers and end users taking matters into their own hands, sometimes in subtle and common-sense ways, as when we see people avoiding medical care because they do not want something in their record. other people do more sophisticated things, orchestrating or simulating certain online behaviors in order to give the impression that certain things about them are true which are not.
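[editor's note: the noise-adding self-help the panelist describes can be sketched in code. this is a minimal illustration only, not anything endorsed on the panel: a user adds laplace noise -- the mechanism used in differential privacy -- to a numeric value before reporting it, so any single report reveals little while averages over many reports still track the truth.]

```python
import math
import random

def add_laplace_noise(value: float, scale: float) -> float:
    """Return value plus Laplace(0, scale) noise.

    A larger scale hides the true value better but makes each
    individual report less useful; averages over many noisy
    reports still converge to the truth.
    """
    u = random.random() - 0.5            # uniform on (-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    # Inverse-CDF sampling for the Laplace distribution.
    return value - scale * sign * math.log(1.0 - 2.0 * abs(u))
```

[a data subject would report `add_laplace_noise(true_value, scale)` instead of the raw value; the scale is a tuning knob, not anything prescribed here.]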
10:57 am
>> i totally agree. to the point of asking the president to solve everything -- which president? it would be a great conversation: come as a technical community and have a closed-door session. let's talk about what this is. someone said, don't you think we need a little more technical expertise? it does not mean you have to be up to date on everything. he says, we have ed felten. hats off to you. but the fact that you could name one guy is problematic for our government. i do not think he could do it all. hats off to you, the one-man technical
10:58 am
awareness engine for the u.s. government. i applaud you for that. they must have thought a lot of him or they would not have given him the job. the thing that is really exciting is that, as it turns out, people are not that interesting. it seems really fun, but it turns out that we have about 3,000 years of big data on what people do when they want to entertain themselves, etc. just as we learned how to sandbox and use dummy data to test applications -- everyone said we cannot possibly do this, and it turns out you can -- you can have simulated datasets.
10:59 am
think of it in terms of big data and how people behave in malls, how they shop: men walk in and turn right. we do not know why. you do not need to pay $5,000 per shopper to have a bluetooth beacon and a communicator and a phone and a spy thing and permissions all over the mall to find out men turn right. you just put sporting goods stores to the right and you're off to the races. it sounds silly, but what you can do is actually simulate the various patterns and datasets that you want to combine so you do not have to get live cdc data. back in the early 2000s, we said, instead of getting all this data and putting sensors in airports and things, what would it take if there was a massive
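[editor's note: the simulation idea described here -- answering "do shoppers turn right?" without tracking anyone -- can be sketched as follows. the turn bias, shopper count, and function name are all invented for illustration; the point is only that a synthetic dataset lets analysts test a pipeline with no real surveillance.]

```python
import random

def simulate_shoppers(n_shoppers: int, right_bias: float, seed: int = 42) -> float:
    """Generate a synthetic dataset of mall entries with a known
    right-turn bias, and return the observed fraction turning right.

    No real shopper is tracked; the whole dataset is simulated,
    and the seed makes runs reproducible for testing.
    """
    rng = random.Random(seed)
    turns = ["right" if rng.random() < right_bias else "left"
             for _ in range(n_shoppers)]
    return turns.count("right") / n_shoppers
```

[an analytics pipeline built against this synthetic feed can later be pointed at real, properly governed data without redesign.]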
11:00 am
outbreak, where would it hit? a lot of information was learned about where information was not being shared, where it should have been, and where there was too much sharing. for the smaller practitioner, that will continue to come, and companies that can sell modeling using large data sets will be huge winners in this space. it is a hot topic, and it takes a lot of risk off the table to do things in silico. the other thing is we also need to have some building standards. we need our engineering schools. it is absolutely irresponsible that any engineer graduates without a course in fair information practice
11:01 am
principles. it makes me aghast to know they don't have to learn about security. there are graduate programs available, but it should be like law: no lawyer gets to practice in the u.s. without a license. especially for people creating critical infrastructure and collecting data. it is not a big silver bullet, but i think every engineer and technical person must be conversant in uses of data and collection strategies and transparency strategies -- all of the known fair information practices we have had around this since the 1960s. they may not fit our technology perfectly, but the ethics and framework we have not tried to exploit yet. i refuse to believe that something that has been that stable over time and tested through various incarnations cannot help. i will give you a real-life example without
11:02 am
a company attached to it. we had a crisis where one of our important research centers was cut off by a raging wildfire. we wanted to find out quickly, in real time, that kids needed to be picked up at school and families needed to know their loved ones were safe, and we had to evacuate and find out who was traveling that day. we had to crack open hr databases, in some cases personal databases, where we had noncustodial parents that now had access to kids. the point of that was that limiting that context in time was possible because we planned to do that. we knew there was a beginning, a middle and an end. i knew when we shut down access to the databases just as i knew when we opened access, and that is the critical point. we are always in a crisis, and people throw things at a wall, but you have to practice for the end of the crisis, when everyone is bored and exhausted and going
11:03 am
home, to make sure that data -- which is critical, or you would not have been touching it in an emergency in the first place -- has been closed again. it is the boring details that really amount to safety. it's the big data equivalent of saying it's really great that all this is available, but handwashing turns out to be the number one thing that lowers disease in large facilities. it is a mix of low-tech and high-tech and an understanding of the components, which is where innovation needs to happen. >> what about yourself? >> this is an extension maybe of the sandboxing. there is a lot you can do with big data, and a lot of benefits, when it is not identified. you can learn a lot from big data about climate change, no pii involved.
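[editor's note: the discipline in the wildfire anecdote -- emergency access is opened with an end planned in advance, and closed when the crisis ends -- can be sketched as a small access-control object. the class and method names are hypothetical, invented here for illustration.]

```python
from datetime import datetime, timedelta

class EmergencyAccessGrant:
    """Time-boxed access to a sensitive database during a crisis.

    Access has an explicit expiry fixed when it is opened, every
    read re-checks that the window is still live, and the grant
    can be closed early the moment the crisis ends.
    """
    def __init__(self, opened_at: datetime, duration: timedelta):
        self.opened_at = opened_at
        self.expires_at = opened_at + duration
        self.closed = False

    def is_active(self, now: datetime) -> bool:
        return (not self.closed) and self.opened_at <= now < self.expires_at

    def read(self, record: dict, now: datetime) -> dict:
        # Every access re-checks the window; nothing stays open by default.
        if not self.is_active(now):
            raise PermissionError("emergency access window is closed")
        return record

    def close(self) -> None:
        # Called when the crisis ends, even before the expiry time.
        self.closed = True
```

[the design choice mirrors the panelist's point: the end of access is decided at the beginning, not left to whoever remembers to revoke it later.]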
11:04 am
there are different buckets, or different types of data, and contexts in which they have different rules. the kind of massive quantities of data used for medical research is probably going to be pretty well identified in most cases, much more than if it is used for another purpose like marketing. you actually need to know more about the individual in many cases for it to be possible to do medical research. that is a world in which there are not only laws involving how the data is to be handled but governance models like institutional review boards and a multithousand-year history of a culture of confidentiality regarding the information. the rules in that world for big data are stronger already.
11:05 am
in a marketing context, there aren't very many rules. you want to look at what the ultimate purpose is and how does that stack up, and might you establish some rules that would significantly reduce the risk of using big data in a context that has fewer public benefits and where a lot can be done without the data being very personally identified. >> when you say establishing rules, do you mean industry or government? >> however you do it. there could be any number of ways. in the medical context, there are many sources of the rules, some of which are cultural. there are also laws and there are also governance procedures. there are lots of ways to establish rules. we can learn from some of those pieces in other contexts.
11:06 am
there's an interesting proposal that builds on that precedent, in which it is proposed that companies create ethical review boards, like irb's, for big data projects -- an internal thing, not a government thing, but a board appropriately representational within the company that has a basis in a policy that has some values, and big data projects are then reviewed the way that research involving human subjects is reviewed, where you look at a risk-benefit analysis and look at the risks not to the company but to the data subject. >> that sounds like what we advise clients to do to set up good governance within their organizations.
11:07 am
are you seeing that market move a little bit toward companies thinking more about that? >> yes, definitely. that is one of the primary roles that i play. my team does a look at the privacy impact assessment, which now includes design guidelines and all of the rules that make good requirements for buildout. when you put those elements together, whether we are using predictive analytics or big data, then we will put these things together and i will be the first-line arbiter, and when we have to escalate above that, we do. more and more people are doing that. the hard part is that some of this stuff is happening for smaller organizations at the procurement level. maybe an attorney, maybe a contract manager, is more and more
11:08 am
facing some kind of question about liability for a vendor who will be providing data. they may or may not know anything about these risks and rewards or what the controls are. they may have a sense that there should be some security liability training, but this is where the loss in translation is alarming. what does that look like on the backend when something goes wrong? how good is governance at that point? >> that is something that people look at when you are in that situation and people are trying to evaluate whether the incident has occurred at a company that is diligently taking care of its information or not. my reaction in dealing with companies that have to struggle with these situations is that a lot of these steps that are being discussed, like governance and all the protections, are very expensive and time consuming, and they do
11:09 am
get in the way of doing the business you are trying to do to serve your customers or engage in your research activities. we need to think carefully about pushing that model into areas where the value of the data that you are protecting is not really worth it. on the other hand, people used to always say that there are two types of companies -- companies that have been breached already and companies that will inevitably be breached, because everybody inevitably will be breached. i saw a terrific variation of that the other day in "the wall street journal."
11:10 am
this was from kleiner perkins. an analyst at the firm said, i am a believer that there are only two kinds of companies -- those who have been breached and know it and those who have been breached and don't know it. most of what we do in security is around prevention, prevention, prevention. great. just know it won't work. know that they will get in. that really is the world we live in. that is what we see dealing with people who come to us for advice. if you are a big data company, you want to recognize that, and you have to have a good, intuitive incident response, and that colors your overall approach to security, and you want to have a sense of how to explain the situation to regulators and people in the
11:11 am
public. whenever there is a breach, there is always, by definition, a broken link, a weak link somewhere in the process; somebody got in, in a way that maybe you would have been able to prevent by design or by luck. somebody has gotten in, and what is interesting in this area is that that's where the plaintiffs in the lawsuit and the regulators are going to focus. they will focus on the weakest link. our job as counsel is to persuade the decision-makers and make sure they understand that imperfect security does not mean unreasonable security. if you look at the totality of this, if you have a data center that is in a secure physical
11:12 am
facility with guards and biometric passcodes, if you've got firewalls and segmentation throughout your network, you've got strong access and authentication controls, you've got security and privacy training throughout the company and you hire outside experts who come in and advise you on various aspects of your design, you do vulnerability scanning and follow up on that, you have performance monitoring and logging, you have integrity controls, ips or ids -- you put all that in place and spend a lot on security -- you will still be at risk of being hacked. and you will be, according to many analysts. and yet, whenever these things happen, the focus is always on the weak link. we have to gain a mature approach to these kinds of
11:13 am
situations and recognize that we are in a world where imperfect security may be inevitable -- we all agree that you will never bring it down to zero. that approach has got to carry through in the way people react to these incidents when they occur. the overreaction, with the intense amount of pressure placed on a company by the brand damage and by the cost of the litigation -- there is a cost to that. it is great for lawyers, but the companies themselves bear a substantial burden. instead of striving toward reasonable security, you have to put more into it because of the concern you will have that if something goes wrong, the brand damage will be substantial. >> taking security from the
11:14 am
marketing department to analytics, taking the governance model and moving that out, the question is how much investment you can make. that's part of what industry is struggling with. one more topic before we open it up to discussion -- you made the excellent observation that there seems to be a lot of technological self-help arising among consumers, and the argument is about consent. are they really viable, or will i go out and buy that smart refrigerator and buy that smart car anyway and just know it will track stuff unless i physically turn it off? is that kind of where the
11:15 am
consumer market is going, and the regulators are behind, or is business behind at that point, or are we all just deluding ourselves in some sense in thinking we have the ability to self-regulate? >> first of all, i love when people overspend on security. [laughter] please indulge. lawyers, too. that said, privacy is not security and security is not privacy, so that's an important point to make. the more savvy we are about data as an asset, the more we will make asset-based discovery decisions and development decisions and investment decisions. as it turns out -- it's in chapter eight -- you can work with a 17-year-old boy who wants to create an across-the-country running app, and in an afternoon you can have a fair information practice principles data model program.
11:16 am
you have to start somewhere. you have to figure out what your priorities are. it might take an extra afternoon for you to think about your deal. i also think instead of the cost of goods sold, we'll start to have the cost of data sold. that is the equation that is missing in many americans' contextual risk and reward discussions before there is a breach. after a breach, there is no choice. that's what happened with sox. we had bad guys, and we had to protect the data which we had, which was our financials. i think the cost of data sold, the cost of data added to your repertoire, will inform how badly people want cars that tell them they are cold. i already know i am cold.
11:17 am
i don't need to have someone memorize that. sometimes i go 85 miles per hour. i don't need someone to tell you that. when you have the knowledge of that exact configuration in your automobile, or the exact location of your milk -- i know where my milk is. i don't need a smart refrigerator. i may need the smart refrigerator to tell me that a filter needs to be changed. and the nice thing about being in the regulatory environment we currently have is that there is some practicing that will happen to sort out what is truly creepy and what is a feature worth paying for. before we can have a real analysis of what is worth it, we need to talk about worth. it's difficult because data is a
11:18 am
human right and it's about dignity, so when you bring accountants in the room, people start getting their own version of creepy. i think money spurs innovation and investment in companies that know how to do more than just say it is really hard. the guy from kleiner perkins will not fund you, but others will. the gadget is great, but usually the gadget and the technology itself has to be modified to fit into my large environment. i am either looking for the people or the data or both. if you don't have a handle on your data asset and you are a young company and want to combine, you could be losing and leaving money on the table for the investor who blindly says there is no privacy, get over it, or it is too expensive. >> at a workshop we did recently for app developers in santa monica, we had two venture
11:19 am
capitalists who spoke to the audience of mostly young developers, not too many lawyers. they were talking about the growing market for privacy-enhancing technology in the wake of mr. snowden's revelations. people have been talking about this from the beginning, and now these technologies are actually coming on the market. they also talked about how these two vc's in particular were concerned about investigating the data practices of projects they would invest in, because if they have a breach, or a great big privacy scrub at the beginning, they have lost their money. they recognize it as an investment. the other equation is the way that the data breach notification law has redistributed cost and revealed
11:20 am
the cost of certain kinds of bad practices that were, before, borne by the consumer or the data subjects, whatever they turned out to be. instead, the cost of doing the notification and all that's involved, which i certainly know will be quite costly, falls on the entity who has more ability to be an appropriate steward of the information and make changes that can protect them and reduce the cost. it's an interesting privacy law and security law. it does not prescribe to do this or that; it says when you lose control, you have to tell. >> one more point -- mobile is revolutionizing
11:21 am
everything. it's not just because they are all combined and they are small. it's a product defect that i have to buy a cover so that my device won't break, but steve jobs never got the memo on that. the point would be that this changes everything, and that's location. location, location, and with that comes context. with context comes really big data that can allow really interesting and exciting things. possibly scary things, but once per year, if you are a privacy person, you have to watch the i have a dream speech and remember it's about people. i'm assuming you read it to your kids at christmas; i do. it's about technology and foreseeing the ability to have a mobile location-based technology
11:22 am
that would reveal things in context for people who could not control the context. it sounds pretty relevant to me today, as relevant as it was 50 years ago. we are talking about corporate issues. in california, privacy is a right, and a right in europe and certain asian countries, but otherwise, it's part of the struggle. >> we are one of the few outliers that does not have a comprehensive federal privacy law. as has been pointed out, we are more regulated than almost any other place on the planet. it is not fictitious; they will come get you. >> we have some time for some questions if folks want to ask questions of any of our knowledgeable panelists. the microphone is there. >> thank you.
11:23 am
the elephant in the room is the edward snowden revelations and what that has done to expectations. you made a good defense that we should not use the fears of breaches to stop collection or stop the assembly of data, which are very important. the rules may not need to be on the infrastructure, the technology and collection -- the rules need to be on use. there seems to be a belief, that i think i share, that the rules don't apply to the government at all, and that a rule of law that might apply to how the data are used will not result in any real punishments of people who abuse the rules.
11:24 am
is there a shortcoming that the rules of use are not going to be effective at all unless there is true enforcement of the rules? you talked about class actions for breaches and failures to disclose, but it seems that those are gnats compared to the governmental problem. how do people feel about that? >> ed? [laughter] >> let me address that a little bit. there are several ways to look at this. it is clear that the rules are different for government. if we learned anything from what we have learned about the actions of the government and the legal opinions that have been declassified about the scope of government eavesdropping authority, it is that the rules are very different.
11:25 am
if that's the case, then we need to be thinking about oversight and how to make an oversight process work, to ensure that the actions of governments, even if they are legal, are within the scope of what is good policy. i think there are very serious questions about whether that process is able to function or has been able to function effectively, whether the overseers have the information they need in order to exert the kind of guidance they are supposed to exert. that's one of the biggest questions and one of the most difficult questions raised by the snowden revelations and the discussion that came after. it's pretty clear that the overseers did not really know the ins and outs of what was happening. >> i have the brian krebs rule: if you mention edward snowden, you should mention brian krebs, who is trying to
11:26 am
reveal things to help people. i think it's a gray-matter case that we can talk about for a long time, what that person's intent was, based on the nature of the revelations. i have my own personal opinions, but i think there are heroes that help us and speak out and look at what is going on and at the information that is readily available, if we but look, and brian krebs is one of those people. i use it as a rule because it is hard for me to be surprised by those revelations. i am saddened by those revelations and agree there should definitely be consequences. it also highlights that there is a burden of collection and a burden of fiduciary duty that has not been put on the shoulders of the right people in the right places.
11:27 am
i also think about the power of social engineering. if you peel back how all the intrusions were made by someone like that, a lot of it was social engineering, even within a group of people -- and there are a lot of people in that organization who are true patriots -- and there was a code of honor breached here, and i think it was the code of honor that helped him. i take the education from that into my own context as a leader in my organization. i talk as much about being aware of sharing information and collecting information, but also about being polite and sharing information with people who call you and ask you. that's where a lot of those documents were collected. it was not necessarily all available through normal channels, nor was hacking involved here.
11:28 am
that is a whole different thing of how that went down, and i am not privy to any of that. >> when you think about the government versus industry, there are differences. that is a fact of life that we deal with. not quite two years ago, i had the joy and pleasure of sitting for another bar examination, in california. the shocking thing to me, over the 20 years that had passed since the last test i had taken, was the fact that the fourth amendment has been completely eroded. if you look at the cases and think of the great cases in the fourth amendment field and some of the surveillance questions that come up -- the cases that we studied in criminal procedure -- and then look at the cases people study now, you realize that there is no exception that will not
11:29 am
swallow the rule. the government has very broad rights at the moment. from the snowden perspective, as a lawyer, the way i look at that is that there were different rules, creating an interesting backlash. organizations are trying to deal with the government from a friendly perspective and an antagonistic perspective, and that's creating industry-government relations issues which will be interesting to see how they play out. the fourth amendment changes incrementally. it will take a long time. the good news is, we get to assess the rules for government. i think there is now quite a lot of discussion, that maybe should have taken place years ago, about doing an actual risk analysis about how much privacy you give up in exchange for security
11:30 am
and how much data is obtained via various privacy-invasive measures. a more thorough analysis of some policies is certainly called for. >> our current system of data collection and management is dominated by hubs, let us say, so the same information about a person is in many places, in many collections. is it feasible to imagine a regime where you are not allowed to hold information about a person unless it relates to your
11:31 am
relationship with that person? that is, imagine a system of edges, a graph, many lines among the various actors, rather than something dominated by hubs? >> back to the fair information practices, this is about proportionality. these are the things that have not been technologically innovated against well enough. absolutely, you can create proportionality. does it create more metadata? probably in the short term, but proportionality and purpose, and whether you have a fiduciary right to have the information or control it, is certainly within the realm of possibility if we use something like the unified modeling language.
11:32 am
>> it is an academic question. i do not think it is done as effectively as anyone would like. as a practical matter, it is possible to do more to reduce and control how much information somebody has. there is a tendency to take the easy path and collect everything and keep it because it might turn out to be useful, as opposed to being very thoughtful from the beginning about what you really need and how you can boil the data down to pertain to the things you need for your business model rather than keeping everything around just in case. >> is part of the problem that we in this room are the only curious ones?
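[editor's note: the "collect everything just in case" tendency described here can be countered mechanically at the point of ingestion. this hypothetical sketch whitelists only the fields a stated business purpose actually needs and drops everything else before storage; the purposes and field names are invented for illustration.]

```python
# Hypothetical whitelist of fields per business purpose.
ALLOWED_FIELDS = {
    "shipping": {"name", "street", "city", "zip"},
    "analytics": {"zip", "product_id"},
}

def minimize(record: dict, purpose: str) -> dict:
    """Keep only the fields needed for the stated purpose.

    Anything not on the whitelist -- phone numbers, birthdates,
    whatever else arrived with the record -- is dropped before
    the record is ever stored.
    """
    allowed = ALLOWED_FIELDS.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}
```

[an unknown purpose yields an empty record, which makes "why are we collecting this?" an explicit decision rather than a default.]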
11:33 am
maybe people care when they get a breach notice from target or when someone finds their name unexpectedly on the no-fly list, but otherwise people don't really care about this? if you agree with that, what do we do to increase public care about the subject? >> a lot of people care at different times. it is a hard issue to research. it is a hard emotional thing that you're tapping into. there is interesting research done at berkeley on what people understand or believe are their online privacy rights and what their off-line privacy rights are. in both cases, they segmented the people, using another methodology, by how privacy-concerned they were, from not very much to very concerned. what they found was that those who were the least concerned
11:34 am
about privacy had the mistaken belief that they had a lot of legal rights, and they don't. those most concerned had a greater understanding of the rights we don't have. >> in our school safety programs, i think about defining what privacy actually is -- it's not covering up or concealing or encrypting. we had this conversation at the engineering level: encrypt and get out of liability. it really is proportional sharing and processing throughout the lifecycle of data, and we don't quite have a language for it yet. when you ask someone if they like privacy and they say sure, and then you ask them if they would like a candy bar, you see this split in behavior.
11:35 am
the conclusion has been that we will give away our privacy for a candy bar. but if they really understood the trade-offs -- you have two dollars in your pocket; go get yourself a candy bar. it requires reaching us at multiple levels for us to have the analysis and find out whether people care. some of the bad things that happen create a market for security and privacy. i think that is the silver lining in dark days. it's not just that the business is doing better, but it's the awareness of the marketplace. when one industry moves -- one retailer did not want the chip card because it has more of a delay at the checkout line. more and more, those guys are saying sign me up, because
11:36 am
they believe it will help them in many different ways. i'm not pointing at any one company. >> i think people do care about these things, but they often feel that there is not much they can do to affect their privacy. if you believe that the choices you make will not really serve to protect you, you will not act differently. you might believe that your actions don't have an effect because you believe the law already protects you when it doesn't, or you might just believe that your information is out there already. if you see people willing to sell their social security number for a dollar, that is an entirely rational thing for them to do if they believe that their social security number is already on sale elsewhere for a dollar. i might as well get that dollar myself as opposed to having it go to someone else. we need to be careful about how we interpret the way people
11:37 am
behave. we need to bear in mind what their expectations are. >> as a follow-up, perhaps there is a space for maybe a somewhat more paternalistic approach by government, the way the european governments do it. >> you cannot tell people how to care about privacy in that way. you cannot say, you will care and behave differently. if we want to show value, we have to be more transparent about what is going on, with real choices and contextual understanding. i have done the graphic novel and i have the written policy, but we also know that people don't read
11:38 am
them. it ought to be easier for people to deal with the amount of sharing done. i think that's where we are reaching out to our students and musicians. for every sitcom, you have a laugh track, and if anyone falls in love, you have to have violins. when you read a contract, there is no music for it. i don't know how to feel and when, and it's an important decision. if you want someone to feel something and know it is their moment, we should think about music and think about context. it's not just pop-ups. there is huge room for innovation in how people experience it. the yin and the yang example. >> there is some interesting work that deirdre mulligan did,
11:39 am
called privacy on the books and on the ground. they studied a few data protection authorities in europe and companies in europe and then american companies, and what they found was -- with a couple of exceptions -- there is actually more internalized implementation of privacy governance in the u.s. companies, with a much lighter system of law, than in europe, where most of the privacy programs sat in compliance -- privacy was not really built as a corporate value through implementation. the exception was germany, which was much more like the u.s. and internalized privacy within the company. >> what we are dealing
11:40 am
with is making sure we have the flexibility to allow the innovation, so that it does not become a process of thinking about compliance which freezes in place or stops innovation -- allowing the flexibility of having innovation while also protecting people's rights when we can see that there is a potential for real harm, and we should have laws protecting that. >> any more questions? >> some of the research literature on big data and legal institutions was positing that big data or small data can be used to address the consumer market or get legal institutions to give better access to people who need legal services. i want to know your opinion on whether big data or small data
11:41 am
should provide more to individuals who cannot go to law firms or are not covered by legal aid. where would you get the data? what kind of data? >> i think data can do a lot of things. the analogy i am hearing you ask about is like webmd, which is the bane of physicians but also the boon of patients. you can go in and say, i have malaria, and they can say no. it also helps people be more informed and engaged in the process. it might be similar in legal services. i don't know. health care information is where we see the ability of people bringing information to deliver health care services that are better to those who might not be able to pay for it.
11:42 am
part of that is trying to get information exchanges started. we don't have that yet in different industries. it is a disparate, almost balkanized system. >> there was some concern after the revelations that u.s. companies were going to lose business because of fears of u.s. government surveillance. stewart baker said very forcefully that there is no evidence that u.s. companies have lost any business. on the other hand, there are reliable reports that non-u.s. companies have reported a spike in business. what do we know and can say about the business effects of
11:43 am
this? >> stewart is a friend. he does not say anything non-forcefully. i think we have had at least a quarter since then, and the financial results speak for themselves, and i will leave it there. >> i would be interested in at least some comments from the panel about culture. there were hints at that, and it seems like the backdrop for whatever will happen legally is culture. for one thing, as we have talked about, it is different country to country, even region to region within a country, let alone parts of the world and traditions. looking at our culture, most of us were groomed on the ten commandments.
11:44 am
it was a simple list that was drilled into all of us. i don't recall one of them being thou shalt not put thy nose in other people's business. our mothers may have told us that, but, in fact, it was observed in the breach. everyone put that out there. some suggested that is part of our culture. young people are very free to put their information out knowing that other people are going to know their business. culturally, it seems they're more willing than previous generations to do that. >> i think it is not just a cultural thing. i think there are generational differences, and i agree with that 100%. i signed up for some web service, and a couple months later i could not log in. i called them up, and they said
11:45 am
you must have turned off cookies on your machine. i got up on the wrong side of bed that day, and i said i would not turn them on and i want my subscription money back, to see what would happen. he said ok. they returned the rest of my subscription and i was amazed. everything was fine until i told my daughter, and she tore my head off because she used the service. she said get over it, what's your problem? what's the big deal with a cookie? and about two months later i got pretty frustrated that i did not have the service anymore. i think there is a generational thing here. i'm curious to see whether people feel there is a generational issue, with a difference of views, going on in other countries as well.
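the login story above turns on a mechanical detail: with cookies blocked, the site has no session identifier with which to recognize a returning subscriber. a minimal sketch of that mechanism in python (the class and names are invented for illustration, not any real service's code):

```python
import secrets

# Illustrative sketch: why blocking cookies breaks a login.
# The server only knows who you are via a session cookie;
# SessionStore, login, and handle_request are hypothetical names.

class SessionStore:
    def __init__(self):
        self._sessions = {}  # session id -> username

    def login(self, username):
        """Create a session and return the Set-Cookie header value."""
        sid = secrets.token_hex(16)
        self._sessions[sid] = username
        return f"session={sid}"

    def handle_request(self, cookie_header):
        """Resolve a request to a user, or None if no valid cookie."""
        if not cookie_header:
            return None  # cookies disabled: server cannot recognize you
        for part in cookie_header.split(";"):
            name, _, value = part.strip().partition("=")
            if name == "session" and value in self._sessions:
                return self._sessions[value]
        return None

store = SessionStore()
cookie = store.login("alice")
print(store.handle_request(cookie))  # recognized as the subscriber
print(store.handle_request(None))    # cookies off: looks like no login at all
```

with the cookie presented, the server resolves the user; with cookies turned off, every request looks anonymous, which is why the subscription appeared broken.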
11:46 am
as we all get more used to the notion that your data is out there, like my trade secret story, once you put your data out there, it is out there. maybe 10 years ago, people were surprised at the notion that you could be driving down the street and there would be an electronic ad targeted at you because they knew where you were driving. 10 years ago, people asked how they could do that. people now understand it, and people understand that the expectation of privacy is eroding. i think younger people are more likely to be less interested. it could be the debate exists because the rest of us older people are concerned. >> cultural values evolve over
11:47 am
time. i also think people learn over time. their own experiences can instruct them over time, and younger people are not always aware of the consequences in many arenas. i don't know that there will not be some much more common understanding of everything being out there. i see a lot of privacy instincts in younger people. they certainly are concerned about protecting their privacy against their parents, for example. i don't think we can say the next generation will not care. i don't think we know that. in fact, i think we know it's not true. >> there is pretty good data on
11:48 am
this that shows that young people are more likely to try to adjust the privacy settings on facebook than older people. they are more likely to adjust the settings at all than old people, and they are cagey about what they say online. if you have a teenage kid, do not be fooled into thinking you can look at their facebook page and know the stuff about their life they don't want you to know. the fact is, kids are able to communicate with each other online and share details of their lives with each other in forms that are very difficult for the people they really want to maintain privacy from, meaning their parents and teachers, to figure out. i don't think it is right that young people act like they don't care about privacy. that's not what i see from our students, and all the people who have studied this in detail seem to find that they do care.
11:49 am
in fact, they just don't care about the data we care about. we certainly see that if governments around the world behaved the way a little girl with a secret crush does, we would not have this edward snowden incident. these girls are very sensitive. they have a fake site that is different than their party girl site. another issue is a callout written by jay klein from mpc, and he writes a whole article on sacred references. this goes way back into the book of genesis and the koran and writings from mohammed. we also have paragraph 2477 of the catechism of the
11:50 am
catholic church. it says respect for the reputation of persons forbids certain attitudes and words. it talks about what is sacred, which informed the united nations in 1947 following the largest such disaster in history, the holocaust. you cannot organize the death of that many people without computers, and that's exactly what happened. that's where you see some of this stuff. it goes back in time. the biggest data set is our cultural history going back 3,000 years. i think people are not that interesting. >> this is fascinating. i am curious about legal developments, particularly with
11:51 am
creative commons licensing. it's very much about potentially providing a complement to copyright-related things. it is an emergent law that relates to this developing transparency of the distributed engine of the worldwide web. i'm asking this in the context of creating an online university. can you see any developments with creative commons, or something parallel, around privacy law that would rewrite some of the questions of the proprietary nature of these questions? >> i think yes is the answer, and
11:52 am
where you see it happening is in the procurement phase. the european model clauses are a good example. i don't think they're a perfect example, because they are nonnegotiable, which makes them [indiscernible], but it is more a system of law, and more because people are trying to push liability around. we have an obligation, if we have a global environment, to ensure the processing of data regardless of the circle of vendors and other third parties that touch it. we are coming up with more and more common language so we can have shorter negotiation cycles when we buy software or share data. >> as a law student today, this seems like two sides of the same coin at the moment. one thing to watch out for or think about is who you are
11:53 am
building the creative commons model around. if someone else has collected data, or has put it around yourself, maybe it is not creative commons but more the gnu type of model. in that respect, we already have that going on with the do not track settings in browsers. it seems like most sites have gone out of their way, even given the california statute, to say we ignore that. they ignore it because we don't know what these contracts mean. you have individuals trying to have an individual contract, but it is difficult to enforce that. that is the cultural issue you will have to bridge at that point. >> a quick question on
11:54 am
commercial uses and ad targeting. how would you support targeting of ads for an individual on an individual basis? it might be their reading habits or fingerprinting. >> i'm not sure of the question. >> when i deliver ads, i can deliver them to a person or to people. what are you browsing? you might be browsing the sports pages, so i can show you ads for nike shoes, as opposed to if you are browsing using apple. i was broadly categorizing what kind
11:55 am
of technology you use, and on that basis, i can target ads. >> the gray area is how much you can target and when it becomes personal. the second scenario, of looking at someone's operating system and making inferences about them, may or may not be targeting. if you're taking that and adding their mobile data and location and any e-mail account they may have, that is certainly regulated. >> how would you suggest these three services can make money? from a privacy point of view, if google, yahoo!, and ebay will not store data, then how do you suggest they should make money?
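the distinction the panelist draws, showing shoe ads on sports pages versus profiling the individual, can be sketched as purely contextual targeting, where the ad depends only on what the page is about. the categories and ad copy below are invented for illustration:

```python
# Illustrative sketch of contextual ad targeting: the ad is chosen
# from what the page is about, not from who the reader is.
# CONTEXT_ADS and pick_ad are hypothetical names, not a real ad API.

CONTEXT_ADS = {
    "sports": "running shoes",
    "technology": "laptop accessories",
    "cooking": "cookware",
}

def pick_ad(page_keywords):
    """Return an ad based only on the page's content keywords."""
    for word in page_keywords:
        if word in CONTEXT_ADS:
            return CONTEXT_ADS[word]
    return "generic brand ad"  # no personal fallback data at all

print(pick_ad(["sports", "scores"]))  # shoe ad on a sports page
print(pick_ad(["weather"]))           # no match: untargeted ad
```

nothing about the reader is stored or inferred here; swap in behavioral data such as location, e-mail, or mobile identifiers, and you cross into the regulated, personal territory the panel describes.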
11:56 am
>> advertising continues to pay. the percentage of online advertising dollars that comes from targeted advertising and tracking was under 20%. there is a lot of advertising on the web that does not depend on collecting personal information. >> i want to make two suggestions that don't have anything to do with targeted ads. one is a fiction book by cory doctorow called "little brother." i think it goes a little too far, and i will not give it away.
11:57 am
it's about a kid who is trying to avoid government surveillance, and it's a very dark world of over-surveillance. it is interesting, and it talks a lot about how location data, plus local and overzealous governments, can have an impact. it's an interesting fiction view. another important book, which is "trillions," talks about how there are about one billion sensing places that can send or receive information. before 2020 (the book went to print last year, and it will probably be even faster) we will have one trillion nodes of sensor-type data. to put that in context, one billion seconds ago was about 30 years
11:58 am
ago. one trillion seconds ago was 30,000 years ago. that is a lot of human history that has happened in those time spans, and there is a lot of humanity that lives within one. the book comes up with policies and scenarios running from everyday experience to a data armageddon. i think it's an excellent read. it is not a law book. it is a what-if book. >> and that world is very nontransparent. >> yes, it is, absolutely. >> if there are no further questions, please join me in thanking our panel members. [laughter] [applause] it was a fascinating panel. thank you very much. [captioning performed by national captioning institute] [captions copyright national
11:59 am
cable satellite corp. 2014] >> light snow continues to fall in washington, d.c., and the entire region is digging out from an unexpected 5-10 inches of snow, making this a special st. patrick's day. the federal government shut down, congress is away on a scheduled break, and members will return next week for legislative work. president obama made news issuing executive orders imposing sanctions on russian officials to make it clear there are consequences for their actions in crimea.
12:00 pm
the president added that more sanctions could be coming depending on circumstances. the order froze the u.s. assets of seven russian officials. the treasury department is also imposing sanctions on four others. in a statement, the white house says today's action sends a strong message to the russian government. here is what the president had to say in his briefing with reporters only about an hour ago. >> good morning, everybody. in recent months, the citizens of ukraine have made their voices heard. the future of ukraine must be decided by the people of ukraine. that means