
tv   Key Capitol Hill Hearings  CSPAN  March 18, 2014 6:00am-7:01am EDT

6:00 am
we had a crisis where one of our important research centers was cut off by a raging wildfire. we wanted to find out quickly, in real-time, that kids could be picked up at school and families knew their loved ones were safe, and we had to evacuate and find out who was traveling that day. we had to crack open hr databases, in some cases personal databases where we had noncustodial parents that now had access to kids. the point of that was that during that context in time, because we planned to do that, we knew there was a beginning, a middle, and an end. i knew when we shut down access to the databases just as i knew when we opened access, and that is the
6:01 am
critical point as well. we are always in a crisis and people throw things at a wall, but you have to practice for the end of the crisis, when everyone is bored and exhausted and going home, to make sure that data which is critical (or you would not have been touching it in an emergency in the first place) has been closed again. it is the boring details that really amount to safety. it's kind of the big data thing of saying it's really great that penicillin is available, it is terrific, but handwashing turns out to be the number one thing that lowers diseases in large facilities. there is a low-tech and a high-tech and an understanding of the components, which is where innovation needs to happen. >> what about yourself? >> this is an extension maybe of the sandboxing.
6:02 am
there is a lot you can do with big data, and a lot of benefits, when it is not identified. we can learn a lot from big data about climate change with no pii involved. there are different buckets, different types of data and contexts, in which they have different rules. the kind of massive quantities of data used for medical research is probably going to be pretty well identified in most cases, much more than if it is used for another purpose like marketing. you actually need to know more about the individual in many cases for it to be possible to do medical research. that is a world in which there are not only laws governing how the data is to be handled but governance models like institutional review boards and a multithousand-year history of a culture of confidentiality regarding the information.
6:03 am
the rules in that world for big data are stronger already. whereas in a marketing context, there aren't very many rules. you would want to look at what the ultimate public benefit is and what the ultimate purpose is and how that stacks up, and whether you might establish some rules that would significantly reduce the risk of using big data in a context that has less public benefit and where a lot can be done without the data being very personally identified. >> when you say establishing rules, do you mean industry or government? >> however you do it. there could be any number of ways. in the medical context, there are many sources of the rules, some of which are cultural. there are also laws and there are also governance procedures. there are lots of ways to
6:04 am
establish rules. we can learn from some of those pieces in other contexts. there's an interesting proposal in which it is suggested that companies create irbs, ethical review boards, for big data projects. an internal thing, not a government thing, but a board appropriately representational within the company, with a basis in a policy that has some values underlying it, and big data projects are then reviewed in the way that research involving human subjects is reviewed, where you do a risk-benefit analysis and look at the risks not to the company but to the data subject.
6:05 am
>> that sounds like what we advise clients to do to set up good governance within their organizations. are you seeing that market move a little bit toward companies thinking more about that? >> yes, definitely. that is one of the primary roles that i play: my team will look at the privacy and practices assessment, which now includes design guidelines and all of the rules that make good requirements for buildout. when you put those elements together, whether we are using a vendor or predictive analytics or big data, then we will put these things together and i will be the first-line arbiter, and when we have to escalate above that, we do. but i think more and more people are doing that. the hard part is that some of this stuff is happening for smaller organizations at the
6:06 am
procurement level. it is a young attorney, maybe, usually a contract manager, who is facing some kind of question about liability for a vendor who will be providing data. they may or may not know anything about these risks and rewards or what the controls are. they may have a sense that there should be some security liability training, but this is where the loss in translation is alarming. >> what does that look like on the backend when something goes wrong? how good is governance at that point? >> governance is something that people look at when you are in that situation and trying to evaluate whether this incident has occurred at a company that is diligently taking care of its information or not. my reaction in dealing with companies that have to struggle with these situations is that a lot of these steps that are
6:07 am
being discussed, like governance and all the protections, are very expensive and time consuming, and they do get in the way of doing the business you are trying to do to serve your customers or engage in your research activities. we all need to think carefully about pushing that model into areas where the value of the data that you are protecting is not really worth it. on the other hand, people used to always say that there are two types of companies: companies that have been breached already and companies that will inevitably be breached, because everybody inevitably will be breached. i saw a terrific variation
6:08 am
in "the wall street journal." this was from kleiner perkins. one analyst said, i am a firm believer that there are only two kinds of companies: those who have been breached and know it and those who have been breached and don't know it. most of what we do in security is around prevention, prevention, prevention. great. just know it won't work. know that they will get in. that really is the world we live in. that is what we see dealing with people who come to us for advice. if you are a big data company, you want to recognize that, and you have to have a good, intuitive incident response, and that colors your overall approach to security, and you want to have a sense of
6:09 am
explaining the situation to regulators and people in the public. what we see is that whenever there is a breach, there is always, by definition, a broken link, a weak link somewhere in the process; somebody got in in a way that maybe you would have been able to prevent by design or by luck. somebody has gotten in, and what is interesting in this area is that that is where the plaintiffs and the lawsuits and the regulators are going to focus. they will focus on the weakest link. our job as counsel is to persuade the decision-makers and make sure they understand that imperfect security does not mean unreasonable security. if you look at the totality of
6:10 am
the circumstances: you have a data center that is an important physical facility with guards and biometric passcodes, you have firewalls and segmentation throughout your network, you have strong access and authentication controls, you have security policies and privacy policies and training all throughout the company, you hire outside experts who come in and advise you on various aspects of your design, you do vulnerability scanning and follow up on that, you have performance monitoring and logging, you have integrity monitoring and ids or ips; you put all that in place and spend a lot on security, and you will still be at risk of being hacked. and you will be, according to many analysts. and yes, whenever these things
6:11 am
happen, the focus is always on the weakest link. we have to gain a mature approach to these kinds of situations and recognize that we are in a world of imperfect security. we all agree that you will never bring the risk down to zero; we never will. that approach has got to carry through in the way people react to these incidents when they occur. the overreaction, with an intense amount of pressure placed on a company by the brand damage and by the cost of the litigation, has a cost of its own. it is great for lawyers, but the companies themselves bear substantial costs. instead of striving toward reasonable security, you have to put more into it because of the concern you will have that the
6:12 am
brand damage will be substantial if something goes wrong. >> just to extrapolate that, it sounds like taking the governance model for security and moving it out, from the marketing department to analytics, is a question of how much investment you can make. that's part of what industry is struggling with. one more topic before we open it up to discussion. you made the excellent observation that there seems to be a lot of technological self-help arising among consumers; the argument is about consent. are those approaches really viable, or will i go out and buy that smart refrigerator and buy that smart car anyway and just know it will track stuff unless i physically
6:13 am
turn it off? is that kind of where the consumer marketplace is going, and the regulators are behind, or is business behind at that point, or are we all just deluding ourselves in some sense in thinking we have the ability to self-regulate? >> first of all, i love when people overspend on security. [laughter] please, indulge. and lawyers, too. that said, privacy is not security and security is not privacy, so that's an important point to make. the more savvy we are with data as an actual asset, the more we will make asset-based discovery decisions and development decisions and investment decisions. as it turns out, in chapter eight, you can work with a 17-year-old boy who wants to create a cross-country running app, and in an afternoon you can have a data model program that respects fair information principles.
6:14 am
and if my nephew can do it, so can you. you have to start somewhere. you have to figure out what your priorities are. it might take an extra afternoon for you to think about your deal. i also think that, alongside the cost of goods sold, we'll start to have the cost of data sold. that is the equation that is missing in many contextual risk and reward discussions before there is a breach. after a breach, there is no choice. that is what happened with sox. we had bad guys, and we protected the most precious data we had, which was our financials. i think the cost of data sold, the cost of data added to your repertoire, will inform us how
6:15 am
badly people want cars that tell them they are cold. i already know i am cold. i don't need something that memorizes that 69 degrees felt good to me yesterday. sometimes i go 85 miles per hour. i don't need my car to tell anyone that. when you have the knowledge of that exact configuration in your automobile, or the exact location of your milk... i know where my milk is. i don't need a smart refrigerator to tell me that. i may need the smart refrigerator to tell me that a filter needs to be changed. we will figure it out, and the nice thing about being in the regulatory environment we currently have is that there is some sorting out that will happen to say what is truly creepy and what is a feature worth paying for. but i think before we can have a
6:16 am
real analysis of what is worth it, we need to talk about worth. it's difficult because data is a human right and it's about dignity, so when you bring accountants into the room, people start getting their own version of creepy. but i do think money spurs innovation and investment in companies that know how to do more than just say it is really hard. the guy from kleiner perkins will not fund you, but others will. and for the buyers of companies on the m&a side, the gadget is great, but usually the gadget and the technology itself has to be modified to fit into my large environment. i am either looking for the people or the data or both. if you don't have a handle on your data asset and you are a young company and want to combine, you could be losing and leaving money on the table for a competitor who very blithely
6:17 am
says there is no privacy, get over it, or that it is too expensive. >> at a workshop we did recently for app developers in santa monica, we had two venture capitalists who spoke to the audience of mostly young developers, not too many lawyers. they were talking about the growing market for privacy-enhancing technology in the wake of mr. snowden's revelations. they have been talking about this from the beginning, and now these technologies are actually coming on the market. these two vc's in particular also talked about how they were concerned about investigating the data practices of projects they would invest in, because if those projects have a great big privacy screwup at the beginning, the vc's have lost their money. they recognize it as an investment risk. the other equation is the way
6:18 am
that the data breach notification law has redistributed costs and revealed the cost of certain kinds of bad practices that, before, were borne by the consumer or the data subjects, whoever they turned out to be. instead, the cost of doing the notification and all that's involved, which i certainly know can be quite costly, falls on the entity that has more ability to be an appropriate steward of the information and make changes that can protect people and reduce the cost. i think it's an interesting kind of privacy law and security law. it does not prescribe doing this or that; it says when you lose control, you have to tell. >> one more point --
6:19 am
i am sorry to keep going back to innovation. these things are revolutionizing everything because they are all combined and they are small. i think it is a product defect that i have to buy a cover so that my device won't break, but steve jobs never got the memo on that. the idea would be that this changes everything, and that's location. location, location, location, and with that comes context. with context comes really big data that can allow really interesting and exciting things, and really scary things. but once per year, if you are a privacy person, you have to watch the i have a dream speech and remember it's about people. and then you have to read warren and brandeis.
6:20 am
i'm assuming you read it to your kids at christmas; i do. it's about technology and kodaks, the ability to have a mobile, location-based technology that would reveal things in context for people who could not control the context. it sounds pretty relevant to me today, as relevant as it was at the turn of the last century. >> interestingly enough, we are talking about corporate issues. privacy is a right in california, and a right in europe and certain asian countries, but otherwise, it's part of the struggle. >> we are one of the few outliers that does not have a comprehensive federal privacy law. as has been pointed out, we are more regulated than almost any other place on the planet. it is not fictitious; they will come get you. >> we have some time for some questions if folks want to ask questions of any of our knowledgeable panelists. the microphone is there. >> thank you.
6:21 am
the elephant in the room is the edward snowden revelations and what that has done to expectations. i have heard a good defense that we should not use the fears of breaches to stop collection or stop the assembly of data, which are very important. the rules may not need to be on the infrastructure, the technology, and collection; the rules need to be on use. i think that was made very clear. my concern is there seems to be a belief, which i think i share, that the rules don't apply to the government at all, and that the rule of law that might apply
6:22 am
to how the data are used will not result in any real punishments of people who abuse the rules. is there a shortcoming in that the rules of use are not going to be effective at all unless there is true enforcement of the rules? we can talk about class actions for breaches and failures to disclose, but it seems that those are gnats compared to the governmental problem. how do people feel about that? >> ed? [laughter] >> let me address that a little bit. there are several ways to look at this. it is clear that the rules are different for government. if we learned anything from the actions of the government and the legal opinions that have been
6:23 am
declassified about the scope of government eavesdropping authority, it is that the rules are very different. if that's the case, then we need to be thinking about oversight and how to make an oversight process work to make sure that the actions of governments, even if they are legal, are within the scope of what is good policy. i think there are very serious questions about whether that process is able to function, or has been able to function effectively, and whether the overseers have the information they need in order to exert the kind of guidance they are supposed to exert. to me that's one of the biggest and most difficult questions raised by the snowden revelations and the discussion that came after. it's pretty clear that the overseers did not really know the ins and outs of what was happening.
6:24 am
>> i have the brian krebs rule: every time you mention edward snowden, you should mention brian krebs, who is trying to reveal things to help people. i think it's a gray-area case that we can talk about for a long time, what that person's intent was, based on the nature and the cadence of the revelations. i have my own personal opinions, but i think there are heroes that help us and speak out and look at what is going on, at the information that is readily available, if we but look, and brian krebs is one of those people. he is not affiliated with mcafee at all. i use it as a rule because it is hard for me to be surprised by those revelations. i am saddened by those revelations, and i agree that there should definitely be consequences. it also highlights that there is a burden of collection and the
6:25 am
burden of fiduciary duty that has not been put on the shoulders of the right people in the right places, throughout industry and government alike. i also think about the power of social engineering. if you peel back how all those intrusions were made by someone like that, a lot of it was social engineering, even within a group of people, and there are a lot of people in that organization who are true patriots; there was a code of honor breached here, and i think it was that code of honor that helped. when i take the education from that into my own context as a leader in my organization, i talk as much about being aware of sharing information and collecting information as about being too polite and sharing information with people who call you and ask you. that is where a lot of those
6:26 am
documents were collected. it was not necessarily available through normal channels; there was hacking involved here, though i am not privy to any of that inside information. >> one thing to keep in mind when we think about the government versus industry is that there are different goals, right? that is just a fact of life that we deal with. i moved to california not quite two years ago and enjoyed the pleasure of sitting for yet another bar examination. the shocking thing to me, over the 20 years that had passed since the last test i had taken, is the fact that the fourth amendment has been completely eroded. if you think of the great cases in fourth amendment law and some of the surveillance questions that come up, the cases that we studied in criminal procedure, and then look at the cases people
6:27 am
study now, you realize that there is no exception that will not swallow the rule. the government has very broad rights at the moment. from the snowden perspective, as a lawyer, the way i look at that is that there were different rules, creating an interesting backlash. organizations are trying to deal with the government from a friendly perspective and an antagonistic perspective, and that's creating industry-government relations issues; it will be interesting to see how they spill out. the fourth amendment changes incrementally and slowly. it will take a long time for those rules to change. >> the good news is, we get to assess the rules for government. i think there is now quite a lot of discussion that maybe should have taken place years ago about doing an actual risk
6:28 am
analysis about how much privacy you need to give up for security, how much is actually obtained by various privacy-invasive measures; a more thorough analysis of some of these policies is clearly called for. >> our current system of data collection and management is organized by hubs, let's say, so the same information about a person is in many places, in many collections. is it technically and
6:29 am
economically feasible to imagine a regime where you are not allowed to hold information about a person unless it relates to your relationship with that person? that is, to allow a system of many lines, a graph among the various actors, rather than something dominated by hubs? >> it's about purpose and proportionality. these are two of the things that have not yet been innovated against well enough technologically. can you create proportionality? absolutely, you can. does it create more metadata? yes, probably in the short term, but proportionality and purpose, and whether you have a fiduciary duty to control and manage that information, are certainly within
6:30 am
the realm of possibility if we use fair information practice principles like the unified modeling language. it is an academic question at this point; i don't think it is done as simply as anyone would like. >> as a technical matter, it is possible to reduce and control how much information somebody has. there is a tendency to take the easy path and collect everything and keep it because it might turn out to be useful, as opposed to being very thoughtful from the beginning about what you really need and how you can boil the data down to retain the things you need for your business model rather than keeping everything around just in case. >> is part of the problem that we as the general public are the
6:31 am
only curious ones? maybe people care after target, or when someone finds their name unexpectedly on the no-fly list, but otherwise people don't really care about this? if you agree with that, what do we do to increase public care about this subject? >> a lot of people care at different times. it is a hard issue to research; it is a hard emotional thing that you're tapping into. there is interesting research done at berkeley on what people understand or believe their online privacy rights are and what their off-line privacy rights are. in both cases they segmented the people, using another methodology, by how privacy-concerned they were, from not very much to very concerned.
6:32 am
what they found was that those who were the least concerned about privacy had the mistaken belief that they had a lot of legal rights, and they don't. those most concerned had a greater understanding of the rights they did not have. >> i think that is true, and certainly in our school safety programs. i think defining what privacy actually is matters: it is not covering up, concealing, encrypting to get out of liability. it really is proportional and agreed-to sharing and processing throughout the lifecycle of data, and we don't quite have a language for it yet. when you ask someone if they value privacy and they say sure, and
6:33 am
then you ask them if they would trade it for a candy bar, you see this split in behavior. the conclusion has been that we will give away our privacy for a candy bar. but if they really understood the trade-offs, or if you just said, you have two dollars in your pocket, go get yourself a candy bar... i don't think that dialogue has occurred yet. it needs to happen at multiple levels for us to have the analysis to find out whether people care. some of the bad things that happen create a market for security and privacy. i think that is the silver lining in dark days. it's not just that the business is doing better but that the awareness of the marketplace is growing. once one industry moves... one retailer did not want chip-and-pin, which has more of a delay at the checkout line.
6:34 am
more and more, those guys are saying sign them up, because they believe it will help them in many different ways. i'm not pointing at any one company; i am talking about history at large. >> i think people do care about these things, but they often feel that there is not much they can do to affect their privacy. if you believe that the choices you make will not really serve to protect you, you will not act differently. you believe your actions do not have effect because you believe that the law protects you when in fact it doesn't, or you might believe your information is out there and there is nothing you can do about it. we have to bear in mind that is a rational thing for them to do if they believe their social security number is already on sale somewhere for a dollar.
6:35 am
i might as well get that dollar myself. we need to be careful about how we interpret the way people behave, bearing in mind what their expectations are. >> as a follow-up, perhaps there is a space for a somewhat more paternalistic approach by government, as the europeans have. michelle, you have to comply with all these rules around the world. >> i think you cannot tell people how to care about privacy in that way. you cannot say you will care. it behooves us, if we want to show value, to be more serious about what is going on: real choices, contextual understanding. i have a graphic novel instead of
6:36 am
writing. i do have the written policy, but i also have the graphic novel, because i know that people do not read privacy policies. i think that is where we are reaching out to our art students and musicians. for every sitcom, you have a laugh track. whenever anyone falls in love, when the making out is not enough, you have to have violins. but when you read a contract, there is no music for it. i don't know how to feel and when, and i think it is an important decision. i think we should think about context, and it is not just pop-ups; there is huge room for innovation. >> you gave the example of the u.s. versus
6:37 am
european approach. >> there is some interesting work that was done called privacy on the books and on the ground. they studied a few data protection authorities in europe and companies in europe, and then american companies, and what they found was, with a couple of exceptions, that there is actually more internalized implementation of privacy governance in the u.s. companies, with a much lighter system of law, than in europe, where most of the privacy programming in most companies was directed at filling out forms and complying with bureaucratic processes, not building it into the ethics and corporate values and implementation. the one exception was germany,
6:38 am
which was much more like the u.s., much more internalized in the companies, with resources to support it and build it into their processes. >> that is what we deal with: making sure that we have the flexibility to allow innovation, so that it does not become a compliance process which freezes things in place or stops innovation, allowing the flexibility of having innovation while also protecting people's rights when we can see that there is a potential for real harm, where we should have laws protecting against that. >> any more questions? >> some of the research literature regarding big data and legal institutions was positing that small data can be used to address the consumer market or to help legal institutions get better access to people
6:39 am
who need legal services. i want to know your opinion on whether big data or small data sets could be used in providing more legal services for individuals who cannot afford to go to law firms or are not covered by legal aid. >> how would you get the data? >> what kind of data? >> i think data can do a lot of things. the analogy i am hearing you ask about is like webmd, which is the bane of physicians but also the boon of patients. you can go in and say, i have malaria, and they can say no. it also helps people be more informed and engaged in the process. in legal services, there may be a similar possibility. i don't know. >> health care information is where we see the ability of people bringing information to deliver health care services
6:40 am
to those who may not be able to pay for it. we see that in many aspects of the affordable care act, trying to get information exchanges started. we don't have that yet in other industries; it's a very disparate, almost balkanized system of providing services to people. >> i know there was some concern after the revelations that u.s. companies were going to lose business because of fears of u.s. government surveillance. i know that stewart baker has said very forcefully that there is no evidence that u.s. companies have lost any business. i have heard reliable reports, on the other hand, that
6:41 am
non-us companies have reported a spike in business. what do you know and can say about the business effects of this? >> stewart is a friend. he does not say anything non-forcefully. that is from someone who does not say anything non-forcefully herself. i think we have had at least a quarter, and the financial results speak for themselves, and i will leave it there. >> i would be interested in some comments from the panel about culture. there were hints at that, and it seems like the backdrop for whatever will happen legally is culture. for one thing, as we have talked about, that is different country to country, even regions within a country, let alone parts of the world and traditions. looking at our culture for a
6:42 am
moment, most of us were groomed on the 10 commandments. it was a simple list that was drilled into all of us. i don't recall one of them saying that thou shalt not put thy nose in other people's business. our mothers may have told us that but, in fact, it was observed in the breach; everyone puts their nose into other people's business. i am suggesting that is part of our culture, and young people are very free to put their information out, knowing that other people are going to know their business, and culturally they seem to be much more willing than previous generations to do that. >> it is not just a cultural thing; i think there are generational differences. i agree with that 100%. i signed up for some web service, and a couple months later went
6:43 am
to log in again, and i could not log in and could not figure out what the problem was. it turned out i had turned off cookies on my machine and had to turn them back on. i got up on the wrong side of the bed that day, and i said, i am not going to do it, and i want my subscription back, just to see what they would do. they returned the rest of the balance of my subscription, and i was amazed. ok, i thought everything was fine until i told my daughter, and she tore my head off because she used the service. she said, dad, get over it, what is your problem? they want to put a cookie, what is the big deal? i did the father thing and said, i am your father, and then two months later i got frustrated that i did not have the service anymore and i put the cookies back on. i
6:44 am
think there is a generational thing. a difference of views going on in other countries as well, and i think as we all get more used to the notion of the fact that your data is out there, it is the trade secret story at the beginning. maybe 10 years ago people were surprised at the notion that you could be driving down the street and there would be an electronic ad targeted at you because they knew you were driving down the street. people 10 years ago would say how could they do that and where did they get the data? it does not surprise me that people now understand it. inhibition privacy or the sense of privacy or expectations is eroding. people are likely to be more -- less interested in worrying about it.
6:45 am
than the rest of the older people are -- i don't know, we will see. values evolve over time, and i also think people grow up over time, and their own experiences can instruct them over time, and young people are not always aware of the consequences in many arenas of what they do. that said, i don't know that there will not be some very much more common understanding of everything being out there, but i do think we see a lot of privacy instincts, and people are certainly concerned about protecting their privacy against their parents, for example. i don't think we can say the
6:46 am
next generation will not care. i don't think we know that. >> i think we know it is not true. there is pretty good data on this now that shows that young people are more likely to tighten the privacy settings on facebook. they are more likely to adjust the settings at all than older people are. young people are cagey about what they do and say online. if you have a teenaged kid, don't be fooled into thinking that you can look at their facebook page and know the stuff about their life that they do not want you, the parent, to know. kids are able to communicate with each other online and share details with each other in ways that are difficult for the people they really want to maintain privacy from, meaning their parents and teachers, to figure out. so i do not think it is right that young people act like they
6:47 am
do not care about privacy. that is not what i see from our students. all of the people who have studied this in detail seem to find that in fact they do care. >> they don't care about the data sets that we care about, necessarily, but certainly if, for example, any of the world's governments behaved like a 15-year-old girl with a secret crush, we would not have the snowden revelations. they switch sites on and off, they are very choosy, and if a company ticks them off, they go away. i want to address another issue. this is a callout that was written by jay cline, from mpc. he wrote an article on sacred references to privacy, and this turns out to go way back into the book of genesis.
6:48 am
we have writings from mohamed, and we also have chapter 27 of the catechism of the catholic church stating -- and this is a long time ago -- on respect for the reputation of persons: he becomes guilty of rash judgment who, even tacitly, assumes as true without foundation. and it goes on and talks about the sacred texts and how they informed what the united nations did for their treaty of human rights in 1947, following the largest human rights disaster in history, the holocaust. you cannot organize the death of that many people without computers, and that's exactly what happened. it does go back in time. the biggest data set is our cultural history going back 3,000 years. i think people are not that interesting.
6:49 am
we continue to make the same mistakes again and again. >> this is fascinating. i am curious about legal developments, particularly with creative commons licensing. it's very much about potentially providing a complement to copyright-related things. it is an emergent law that relates to this developing transparency of the distributed internet that is worldwide. i'm asking this in the context of creating an online university or school. can you see any developments with creative commons licensing, or something parallel around a privacy law, that would rewrite some of the questions of the proprietary nature of these issues, which you all have been quite
6:50 am
eloquent about. >> i think yes is the answer, and where you see it happening is in the procurement phase. the european model clauses are a good example. i don't think they are a perfect example, because they are nonnegotiable, which makes them, to me, adhesion contracts under our system of law. but more and more, because people are trying to push liability around the plate, and because we have an obligation in a global environment to ensure proper processing of data regardless of the circle of vendors and other third parties that touch it, we are coming up with more and more common language so we can have shorter negotiation cycles when someone is either coming in as a service provider or we buy software or share data. if i were a law student today, this is the two sides of the
6:51 am
cathedral moment for you. >> one thing to watch out for, or think about, is who you are building the creative commons model around. if someone else has collected it, versus you trying to put it around yourself, maybe it is not creative commons but more the gnu type of model. in that respect, we have already had that going on with the do-not-track settings in browsers. it seems like most sites have gone out of their way, even given the california statute, to say we ignore that. they ignore it because we don't know what these contracts mean. you have individuals trying to have an individual contract with a huge corporation, and it is difficult to enforce that. that is the cultural issue you will have to bridge at that point.
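As an aside on the mechanics the panelist is referring to: the do-not-track browser setting works by attaching a "DNT: 1" header to each request, which a site may honor or, as noted above, ignore. A minimal sketch of the honoring side, in Python (the function name and dictionary shape are illustrative assumptions, not from the panel):

```python
# The browser's do-not-track preference arrives as the HTTP header "DNT: 1".
# A site that chooses to honor it would check the header before tracking.
# Real web frameworks expose request headers differently; this is a sketch.

def should_track(headers: dict) -> bool:
    """Return False when the client sent DNT: 1, i.e. the user opted out."""
    return headers.get("DNT") != "1"

# A privacy-conscious browser versus one expressing no preference:
print(should_track({"DNT": "1"}))  # the site should not set tracking cookies
print(should_track({}))            # no preference was expressed
```

The legal ambiguity discussed above is precisely that nothing obliges a site to consult this header at all.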
6:52 am
>> i have a quick question on commercial uses and ad targeting. how would you support targeting of ads for an individual on an individual basis -- like their reading habits or surfing habits -- versus targeting based on using microsoft internet explorer or windows? >> i'm not sure of the question. >> when i deliver ads, i can deliver them to a person on two bases. what are you browsing? you might be browsing the sports page, so i can show you ads for
6:53 am
nike shoes. as opposed to that, if you are browsing using apple, i am broadly categorizing your kind of technology, and on that basis i can target ads. >> i don't have a good answer. it's a whole area of how much you can target and when it becomes personal. the second scenario, of looking at someone's operating system and making inferences about them, may or may not be targeting. if you are taking that and adding their mobile data and location and e-mail account that they may have, that certainly is regulated. >> let me flip this. how would you protect all of these free services? if they are not storing data,
6:54 am
from a privacy point of view -- if ebay, google, yahoo!, and paypal are not going to store data, how would you suggest they make money? >> at the last advertising panel i participated in, the percentage of online advertising dollars that comes from targeted advertising, based on tracking, was under 20%. there is a lot of advertising going on on the web that does not rely on collecting personal information and using it for targeted ads. let me make two reading suggestions that have nothing to do with targeted ads. one is a fiction book by cory doctorow, "little brother." it is good reading for your high school age kids, until you get to
6:55 am
the ending -- spoiler alert -- which i think goes a little bit too far. it is about a kid trying to avoid government surveillance, and it is a dark world of over-surveillance. but i think it is interesting, and it talks about parental use of location and the in-loco-parentis overreach government can have. there is another book important to this discussion tonight, the book "trillions." before 2020 -- the book went into print last year, and it will probably be faster, by 2016 -- we will have one trillion nodes of sensor-type data.
6:56 am
a billion seconds ago was about 30 years ago, when we were all young and doing funky things, but one trillion seconds ago was about 30,000 years ago. that is a lot of humanity that has happened in those time spans. it is an interesting kind of what-if book. >> and it is all very nontransparent. >> right now it is, absolutely. >> all right, if there are no further questions, then please join me in thanking our panelists. [applause]
6:57 am
a fascinating panel. thank you very much. i have concluded -- yes, i would be glad to yield. yes, i was on the floor, sir. >> i looked to both sides, and the only gentleman on that side who even made a move was mr. walker. the gentleman did not stand, the gentleman did not rise. i resent the statement of the gentleman. >> well, i'm sorry. had anybody stood, i would have recognized them. >> sir, i did not mean to
6:58 am
suggest that you were not acting with fairness at all. i suggested we have a consensus bill that you have worked out with the president of the united states -- >> that is not what you said. on a normal procedure, on a bill of this type, i would never do a thing like that. i asked for the opportunity not only on the previous question but on the rule. there was not a man on either side of the aisle who stood. >> i respectfully suggest that i did not mean to offend you, and it was not -- >> you have offended me, but i will accept your apology. >> find more highlights from 35 years of house floor coverage on our facebook page. c-span, created by america's cable companies 35 years ago, and brought to you today by your cable or satellite provider as a public service. >> this morning, a look at the state of the u.s. economy.
6:59 am
and americans for tax reform's grover norquist. fcc commissioner mignon clyburn. live coverage starts at 9:00 eastern. then, the state of israeli-palestinian negotiations, with saeb erekat. coming up this hour, "wall street journal" reporter jeff will talk about general motors' recall of 1.6 million vehicles because of faulty ignition switches. and harry holzer on president
7:00 am
obama's overtime rule changes. we will also look at today's news and take your phone calls. >> i have signed a new executive order that expands the scope of our sanctions. i'm authorizing sanctions on russian officials and those operating in the arms sector in russia, and individuals who provide material support to senior officials of the russian government. if russia continues to interfere in ukraine, we stand ready to impose further sanctions. ♪ host: president obama announcing the sanctions yesterday. today, we will get the official response from russian president vladimir putin. good morning. the white house