
tv   Facebook Data Privacy Practices  CSPAN  April 6, 2018 1:02am-2:33am EDT

1:02 am
next week, facebook ceo mark zuckerberg will testify before senate and house committees on facebook's handling of user information and data privacy, tuesday at 2:15 eastern and wednesday at 10:00 eastern on c-span3. watch live coverage on c-span3 and c-span.org.
1:03 am
♪ [indistinct conversations] >> all right, folks, i think we will get started. great. good afternoon. i'm the director of the open technology institute at new america, which works to ensure everyone has access to an internet that is open and secure. thank you for joining us today for our conversation: what should we do next? if you're not sure what i'm talking about, you might be in the wrong room. once upon a time, there was a fast-growing social network called facebook that hoped to grow even faster by becoming a platform for other apps, so in 2010, it launched an api, an
1:04 am
application programming interface, that allowed app developers to use data from facebook users who had signed up to use their apps. but there was a big privacy catch: not only could app developers obtain data from users, but also from all the friends of those users. though nominally facebook notified users of the setup through its privacy policies, and there was a not-particularly-easy-to-find privacy setting for adjusting what data your friends could share about you, the default on that setting was for apps to have incredibly broad access to friends' data, and most ordinary users had little understanding of what was going on. so for about four years, until facebook tightened up access to friends' data with an updated version 2.0 of the api in 2014 and 2015, untold thousands of app developers siphoned tons of data from people who did not even use their apps. the primary guardrails protecting
1:05 am
the data from misuse after it left facebook's platform were simply facebook's terms of service, telling developers they should only use the data for providing the service that users had signed up for, and that they shouldn't, for example, sell it all to a spooky political consulting company that wanted to build profiles of users in order to better manipulate them. of course, we know that is exactly what happened: in 2014, a researcher named aleksandr kogan used a survey app called this is your digital life and was able to attract 270,000 facebook users, and through access to those users' friends' data, was able to obtain personal information about -- well, we are not sure exactly, but we have heard from facebook -- up to 87 million facebook users. he then sold the data to cambridge analytica, a political consulting firm that worked with the trump presidential campaign and the brexit campaign, has bragged about influencing other
1:06 am
political outcomes in mexico, australia, and kenya, and, based on recently released undercover recordings, has apparently used sex workers as part of its toolbox for influencing political candidates. this brings us to last month, when we learned about how cambridge analytica had obtained the data. we also learned that facebook had known about the passing of data to cambridge analytica since late 2015 but did little to confirm that the misappropriated data had been deleted, other than demanding that cambridge analytica certify that it had done so. facebook also continued to allow cambridge analytica to advertise on its platform until just before last month's story broke. this has led to a firestorm of renewed concern, just as controversy has already been raging for over a year about how several big tech platforms have been
1:07 am
subverted to help spread foreign propaganda during the u.s. presidential election, and in other elections as well since then. now, as facebook is losing billions of dollars in stock value due to lost public trust, and as it is promising to make extensive business changes to regain that trust, policymakers in the u.s. and europe are rattling the saber of regulation, and ordinary folks only now seem to be starting to understand how facebook actually works, or at least how it worked five years ago, and what that means for privacy. the simple question is: what now? what should facebook do, what should policymakers do, what should users demand they do, with regard to facebook or internet platforms generally? i will be talking to commissioner mcsweeny about these questions and more generally about the state of online privacy and how to improve it. before we do that, i wanted to pass the mic to my colleague who
1:08 am
runs an independent project called ranking digital rights, dedicated to answering another question very relevant to today's proceedings: how well are companies like facebook respecting users' rights? she will briefly give a preview of how the latest annual corporate accountability index, being released later this month, will answer that question. then we will move on to my conversation with commissioner mcsweeny and then our panel with the experts. thank you. >> thank you. i do not want to take too much time, other than to let you know the 2018 corporate accountability index will be launched on april 25 -- we have a flyer here -- with an event here, right in this room, on april 27. we are planning to talk about it
1:09 am
in person with the people who are not in new york. the 2017 index can be found on our website, so you can see how we evaluated companies last year. the index ranks 22 of the world's most powerful internet, mobile, and telecommunications companies on commitments and policies affecting users' rights to freedom of expression and privacy. there are indicators looking specifically at facebook's and other companies' policies affecting how they handle user data, and it will not surprise you -- you can see on the website that last year facebook did not perform well, both in the policies it disclosed and in what it did and did not disclose. you will not be shocked to hear
1:10 am
there was not a revolutionary change between 2017 and now. you can see the report when it comes out online on april 25 for all the details, the downloadable data, and everything else, including the analysis. we will have the event in new york on the 25th and a similar event here on the 27th to discuss it in person, and people will be able to go through and discuss the results in great detail. one other point in relation to user data: generally, the industry is doing poorly, but facebook's disclosures were toward the bottom of its cohort. so that is just a little preview. thank you. >> thank you. we look forward to reading all about that. i would like to welcome terrell mcsweeny to talk about the issue.
1:11 am
terrell: hi, there. thank you for having me. kevin: of course. i will start with a question i will also pose to the expert panelists, which is: is this a tipping point? is this a snowden moment, as in the context of surveillance, where we might actually see changes in policy, or is this more of a moment where we will see a lot of noise but not a lot of action? terrell: i think you just made your own point. let me start by saying thank you for having me here today. i will give you my own perspective and not the official views of the federal trade commission. i will not pull any punches, but i will be careful not to talk
1:12 am
beyond what has been confirmed, which is that there is an open investigation into at least some of the conduct alleged here. but i think we should have a policy conversation. i appreciate your first question, which is: ok, maybe 87 million people's information was misused -- is it a big deal? i would say we're not even talking about the fact that 250 million people's very detailed information was breached not even a year ago, and that incident unfortunately did not have the policy impact that i had hoped. i certainly hope this is a moment of change. i think it is also a powerful moment because the general data protection regulation in europe is coming into implementation in may. changes are being made in response to that. i think that has a big impact at
1:13 am
the same time that this news cycle is having a big impact on the story. if it has an impact on just one thing, what i would really like it to have an impact on are the people who say to me, when we have been talking for years about data protections or consumer protections for the digital age, that american consumers just don't care. that is demonstrably false, and i think we now have evidence that people do care, and that consumer trust is incredibly important and ought to be at the top of everybody's list in terms of what they are concerned about for businesses. i think it is also underscoring to me the fact that consumers are not necessarily understanding or anticipating fully all of the risks of transacting in their data on these platforms. i do not personally believe we
1:14 am
ought to be putting all of that risk onto individual consumers to anticipate what might happen to their data, and i think that is part of the policy conversation we ought to be having. kevin: currently, under data protection rules in the u.s., the ftc is the primary consumer privacy cop on the beat, going after unfair and deceptive trade practices. indeed, you have gone after facebook before -- the ftc negotiated with them in 2011 over alleged deceptions around their last privacy transition. it leads to a question: how did this happen if the ftc, the cop on the beat, already had this in place and presumably was policing facebook? terrell: i think that is 100% the right question. i'm a sitting federal trade
1:15 am
commissioner and i love the ftc. i think the people, the staff, are doing an incredible job with pretty antiquated authority. it is a 104-year-old agency using that authority to protect consumers from unfair practices, and it has been able to adapt it to the online environment. the agency itself has always called for stronger tools. i think this set of facts underscores that the ftc is not strong enough, as it is currently configured with its current authorities and its current resources, to be the kind of consumer protection agency required for a moment in which we are connecting every part of our lives to the internet and each other. kevin: how do we fix that? terrell: i would start by making sure it is adequately resourced. at the end of the obama administration, they did call for more resources for the agency.
1:16 am
it has never been funded near that level, so it is under-resourced, and that is easy to fix. i think it needs to think about its configuration. one thing it had been doing -- and i am proud of the ftc for this -- is we have been bringing more technologists into our work and bringing more researchers on staff. we have an office of technology research and investigation that i think is a great first step in that direction. i think we need to think about institutional design and whether that kind of capability ought to be significantly expanded, maybe by the creation of a bureau of technology, like the bureau of economics, so there is even more horsepower within the agency. it also needs additional authority to contract with outside experts to really have the resources to evaluate what it is being told. it needs in-house expertise and additional resources to bring
1:17 am
that in when it does not have it. i think beyond that, it has called for penalty authority, not just for data security and breach violations but for privacy as well. it needs rulemaking authority it can use for privacy and data security. it has also been studying some conduct that it finds very concerning -- looking at the data broker industry, for example. it called for more transparency and accountability for data brokers, and i think that is really important. beyond that, you could also be making the case for the consumer rights that we need in the digital age, which include things like interoperability. those are meaningfully procompetitive as well. kevin: one of the big limits on what the ftc can do, and what sticks it has to work with, is that your primary tool in regard to privacy has been deceptive trade practices.
1:18 am
if someone misrepresents what they are doing with your data, that is within its reach, but if they are doing something with your data that is awful but tell you about it and nominally you consent, that is ok. how do we get past that? is notice and consent a workable model? facebook certainly has argued that its users and the ftc had consented to and had notice of -- this is how it works, this is what the product was -- so where do you go from there? terrell: the idea that notice and consent is a framework that can protect consumers in this environment has been described as quaint, and i think that is correct. no, i do not think we can
1:19 am
continue to rely solely on that framework. i do not even think the ftc itself is advocating we rely fully on that framework. the ftc does have limits to its authority, and it does look for deception, which we can see here: if you are not telling people truthfully what is happening to their information and how it is being used, i think that is very important. it is looking at the role of consent and how that has played out in the marketplace. it has been emphasizing best practices around requiring notice that is clear and timely and outside of long terms-of-service and privacy policy agreements for a number of years. those choices need to be offered around the collection of sensitive information in particular. i think it has consistently laid out best practices; whether the industry has actually been following them, i think, is a different question. it raises the issue again of whether the ftc is strong enough, and i am obviously arguing that it is not. i do not think it needs to be
1:20 am
doing this all by itself. we are sitting here one year after congress repealed the stronger broadband privacy rule, and i think there was no justification for that. we need more than one consumer protection cop on the beat. kevin: the privacy settings are confusing. this is not unique to facebook, but it has gotten more complex. i want to share an anecdote that describes and illustrates some of the problem. when the scandal broke, i went back and looked at the settings. i have been working to some extent around facebook privacy for some time and i am familiar with it, but i went into the settings and found a setting that said "apps that your friends use" -- what your friends can share with an app, subject to all of these checkboxes, and
1:21 am
90% of them are checked by default, but you can uncheck them. it seems facebook did not update that privacy setting when it updated the api in 2015, so there is a vestigial setting, and i asked facebook if the setting means anything at this point, and they were like, we hadn't gotten around to fixing it or changing it or getting rid of it, in part because there are some cases where some of those checkboxes do matter. they named one, and i was like, is that the only one? and it was not clear they were sure whether there were others. they are clearly sorting that out now. if they are not even clear on how their own settings work, how can we be clear on how their settings work? terrell: again, you are identifying the weaknesses of the model we are using.
1:22 am
i do not know if anyone else in the room has had this experience: you know, shortly after the latest news broke, for example, i had the opportunity to sit down with my mom and go over her settings. she was very concerned but also wanted some help figuring out how the privacy settings work. i walked her through it -- we opened the app and i said, you know, this is where you view the settings -- and she said, turn it all off. i do not know if anyone else has had that conversation with their mom, but it is a familiar situation to me, and it suggests to me that people are trying to exert choices over how their data is being used and how it is flowing out of the first-party relationship they have with whatever service they're using, and they think they are exerting choices, but they may not necessarily know that there are more choices they need to be looking at. the ftc has been looking at the issue.
1:23 am
it looked at this recently in a case and said you can't have a default setting that takes several different options to navigate to keep something private. if you are trying to keep something private, and you think you set it to private and then have to do three more steps, that is a problem. the ftc was looking at whether consumers could really navigate the settings they were being offered. it also looked at what i think of more as trickery -- clever technical workarounds to privacy choices. if a consumer has said do not track my geolocation in an app, then running a program that is triangulating your location using wi-fi is probably going around that privacy setting. we have to find ways to make sure the technology is following the privacy choices of consumers.
1:24 am
i think we have been looking at these issues. the fact that we have had more than one case already on this suggests to me there are problems out there that we need to continue to be very active about. kevin: in trying to protect consumers' privacy, what role is played by the ftc's unfairness authority? terrell: it is particularly influential in cases involving whether security practices are reasonable or not. it is a really important authority, and it is challenging to use: we have to show a clear likelihood of harm. we have talked about harms that are not just economic, but harms that involve privacy invasions -- turning cameras on in people's bedrooms -- and emotional harms with revenge porn and things like that. it
1:25 am
can be tricky for the ftc to reach some conduct using just the unfairness authority. this is an area where, i continue to reiterate, the ftc cannot go it alone. one thing that has happened is that when it has started to use its unfairness authority aggressively, congress in the past has stepped in and limited it pretty severely. the agency has been cautious in developing how it uses the authority, but with good reason. kevin: what about market harms? you are also a competition authority. there is a lot of talk about platform monopoly, or breaking up these companies, or a variety of other ideas to deal with the fact that they are big. what role do you see there?
1:26 am
terrell: i think more competition would definitely benefit consumers. one of the tricky parts has to do with the economics of how these markets work. it is not completely obvious to me that just getting more competition would yield better outcomes and better protections for consumers. that is why we need additional regulations to help really direct the marketplace toward the outcomes we want in data use and security and privacy. more competition is good, and i think the ftc using its competition authorities aggressively is terrific. it should be advocating procompetitive policies like interoperability. i think we need to be mindful of
1:27 am
the fact that we cannot just rely on competition as a market force to correct for all of the problems we are potentially seeing here. kevin: you talked about what congress could do to help strengthen your agency. what else could congress do when we are talking about privacy online? terrell: i think congress could start thinking about what laws it needs to pass to protect consumer privacy. one thing is to stop passing laws that undermine it. we could start there and build on that by really, you know, taking a look again at a number of ideas -- a number of people here have been talking about this for a while -- comprehensive privacy legislation and also real comprehensive data security legislation and cybersecurity legislation. i would argue again for more transparency and accountability
1:28 am
for data brokers. obviously, more resources and strengthening the ftc specifically as an agency -- i think that is very important. i also think talking about some rights that consumers really deserve here -- the right to control your data, meaningful control, meaningful interoperability -- those are important conversations that we need to be having. that is the consumer protection angle. i want to emphasize one thing we are seeing play out in all of the stories about facebook and cambridge analytica, which is that this has to do with bigger issues and bigger risks
1:29 am
than just the consumer protection harm we're all concerned about. the potential for technology and disinformation campaigns to distort our institutions -- i'm personally worried, for example, about the fake comments submitted to the fcc in the open internet order proceeding. manipulation of democratic institutions is a deeply harmful thing. that will require more than just addressing privacy and consumer protection issues. kevin: there is a new comprehensive data protection regulation in europe that will come into force in may. does that strengthen or weaken -- how does that impact the argument for trying to get a data protection bill done in the u.s.? terrell: i do hope it strengthens it. if what in fact is happening is that the major technology companies are coming into compliance with gdpr, and they are complying across platforms -- the same in the u.s. as in europe -- it seems to me a lot of the opposition to the burdens
1:30 am
is eliminated. i think it could have the effect of making it easier for congress to think about right-sizing consumer protection. kevin: i do hope you're right about that. i appreciate you taking the time to chat today, commissioner. thank you. i am going to invite the panel up right now. [applause] >> hey gang, how are you all? i'm going to let everybody introduce themselves. we will go down the line. what i would like is for everyone to introduce themselves and then briefly answer the same question i put to the commissioner: is this a tipping point? >> starting with me.
1:31 am
it looks like this issue is going to have legs for a little while. i think it has some legs. it has been in the works for a long time, and there have been privacy questions we have thought about for a long time. i think this will make a difference. the ftc investigation will be ongoing. a lot depends also on the company's response to this, and also on any impact on competition. we can get into this later, but i think we need to insist on very important consumer protections. we also need to be mindful of any unintended consequences. overcorrections could inhibit some of the things we love about the internet -- about openness and
1:32 am
ensuring that competition remains vibrant. >> and who are you? >> i am karen, and i am a mozilla fellow working on competition. >> my name is harlan yu. i think the answer to your question is: i sure hope so. i guess we really have no way of knowing, but it does seem like, especially because it is facebook and because of the links it has to political campaigns and political groups that people seem to have interest in, the story may have more legs.
1:33 am
they are willing to make some changes, and hopefully positive changes, but i think that is going to require an ongoing conversation between advocates and the company. >> hi everyone, i am michelle. thank you for having me. i think the answer lies in your framing of it being a tipping point. the reason i say that is because i think there has been a chipping away at the public's trust, and i think that has had the effect of a crescendo. there is a great amount of angst that exists in the country. here is a company that portrays itself as being friendly to
1:34 am
consumers, free for their use. that is the value proposition they are getting. facebook says, we will make this free to you because we want to connect you. i think all of those things crescendo into this moment. i am an optimist. i think facebook will have to face the music, will have to change its practices, and will have to become, at the very least, more transparent. >> i am david and i teach law at georgetown law school. i am not sure this is a tipping point, but it will be a significant moment in a couple of ways. facebook has a lot to answer for. there are going to be public hearings next week. mark zuckerberg is going to testify.
1:35 am
i think a lot depends on what path facebook decides to follow. this is the first major consent order breach -- yes, there was a dustup with google. in my view, there are major issues about facebook's willingness to comply with federal orders. i think one of the things to watch closely is: what do zuckerberg and facebook say about that? this may be a tipping point because it will force the agency's hand. kevin: in terms of facebook's response, they have announced a whole bunch of changes trying to regain user trust. they will be simplifying their privacy settings. they are clarifying the terms of service and privacy explanations,
1:36 am
but they are not yet talking about substantially changing any of those terms. they stopped working with online data brokers, which is good. they closed a gaping privacy hole in their people search. just yesterday they announced they will be restricting what data is available to app developers. it seems like they are doing what they can in the short term. my question, first to harlan, but then to the rest of the panel: what else should facebook be doing now? if you were in the war room at facebook at this moment, what would you be advocating? harlan: the scandal raised two related issues. obviously we will be talking a lot on this panel about user
1:37 am
privacy, and the scope of sensitive user information facebook makes available through its api to app developers. as facebook starts to raise its walls, how is the public going to then scrutinize what is happening inside those walls -- in particular, how facebook's business model is vulnerable to potential abuse, and how the public finds out about and addresses some of those issues? this story obviously got a lot of attention because of its links to global groups and political campaigns. i think what the story did was intensify people's interest in the way that facebook data and the facebook platform were potentially being used to manipulate both our elections
1:38 am
and the political discourse in this country and elsewhere. from that, facebook has promised some amount of additional transparency, especially after its own investigation of russian interference. it wants to establish a new standard for ad transparency that would help everyone, especially watchdog groups and reporters, keep advertisers accountable for who they say they are and what they say. i just want to spend a few minutes talking not about the user privacy side, but actually about ad targeting. just about any message on facebook can be touched by money: anyone, whether you are a company or not, can spend money to boost your message. your mind probably gravitates
1:39 am
towards consumer products, but we are obviously talking also about political campaigns and other groups that want to spread intentionally false information or exploit certain political and social divisions in our society. we are also talking about ways that advertisers might be using facebook as a targeting platform to prey on vulnerable consumers -- for example, ads that are trying to find patients for fraudulent opioid rehab centers, or deceptive ads for for-profit colleges.
1:40 am
these are just some of the kinds of abuses; with targeting, there is a range of abuses possible in the system. this goes to the core of facebook's business model, which is finding certain segments of facebook users and targeting them with specific messages. the main question that i have is: how will the public in the future be able to scrutinize this targeting behavior, and what is facebook going to do to help address the issue? how are they going to be transparent about what is happening? facebook has made a few small promises so far, but there is a lot more that the company can do.
1:41 am
i will just quickly go over four potential things facebook can do. first, facebook has slowly started to make advertisements on its platform available for scrutiny. they have been doing a pilot project in canada where, if you are a user and you go to an advertiser's page, you can actually see the list of ads that advertiser is currently running. in principle, all ads are visible, but if you are a researcher it makes it very difficult for you to know the universe of all ads a particular advertiser is running. in addition, many advertisers have thousands of ads. the first thing i think facebook could do is, in the same way they have built a very robust api for user data, they could build a
1:42 am
robust api for advertisers that allows researchers and journalists to scrutinize ads. the second thing is that facebook has started to make enhanced transparency promises, especially around election ads, but that is very narrow. they're looking at political ads that specifically mention a candidate or election, but that ignores the broader range of abuses i have described. they should really be turning their attention to all ads on facebook's platform. third, in order to have effective accountability, it is important to not just know what the content of those ads is, but the scope and reach of those ads. i am talking about what the
1:43 am
explicit targeting criteria were for the ad campaign: who is that advertiser trying to target? it is complicated because some advertisers do not use explicit targeting criteria. they upload a list of voters or consumers, and they look for facebook users that have the same features. so what facebook needs to do, in addition to making the ads themselves more transparent, is to expose the targeting criteria and information about the audience a particular ad actually reached. how many people? what are the demographics of that audience? fourth, they are doing their internal enforcement to take down bad ads. they should give the public a
1:44 am
detailed accounting of all of the ads they are taking down and for what reason. those are the kinds of steps i think would provide real transparency and real accountability, and these are the kinds of steps that will help raise the public's trust. it is not just facebook telling us they are doing these things to stop nefarious behaviors on the platform, but actually letting the public scrutinize this and verify it is actually happening. i would just plug a report our team is releasing in the next couple of weeks that hopefully will serve as a public advocate's guide to ad transparency and what we should be pushing for. >> thanks. i look forward to reading that. we are building a means of manipulating people right into the internet that we do not fully understand and can't control. in terms of transparency,
1:45 am
this reminds me that i should make sure to disclose that facebook has provided financial support. although we clearly disagree on some things, we are aligned on important issues. in terms of my wish list, i would add greater transparency in the sense of venues like this. we did invite facebook; they are busy. they have done a lot more press calls than usual lately. obviously, they are testifying to congress -- i do not feel they had a whole lot of choice in that. i think public engagement would be great. what is on other people's wish lists if they were in the war room -- any other things, or should we move on? >> let me sort of say what i think they should talk about doing.
1:46 am
one of the real problems in the debacle is how little control facebook was exercising over third parties, particularly app developers. facebook has recently acknowledged that they do not have contracts with third-party app developers. they do not have any remedy in case of deliberate overharvesting or oversharing. they obviously did not do due diligence on any of the developers. when it comes to how do you solve the problem, part of it is there has to be much greater oversight, control, and auditing of third-party developers. as this episode has proven, one part of the consent decree required facebook to identify threats to privacy and
1:47 am
to plug those threats. since third-party developers had access to user data, that was an obvious vulnerability. one would suspect there would be some controls in place. as this debacle unfolds, it becomes clear that there really were not. just go down the list: the kinds of contractual lockups that give them power to require audits, oversight, some sort of certification -- audits done by facebook or an outside party to ensure there was compliance. we are weeks and months into this, and facebook cannot assure
1:48 am
us that the information is not still floating around, or that the researchers have actually destroyed it. in law school we teach students how to enforce these kinds of agreements, and it does not appear that facebook has any remedies at all that are effective to discipline third-party developers that have very broad access to consumer data. what i would like to hear from facebook is: what are we going to do to control this? yogi berra's famous line was, this is deja vu all over again. these things were designed to avoid exactly this problem. i would like to see facebook come before the senate, as zuckerberg is going to do next week,
1:49 am
with a real list of things to control this part of the problem. i agree with harlan. there are lots of other problems. >> now that you bring it up, the consent decree you helped negotiate -- you indicated earlier you believe it has been violated. i would love to hear a little bit more about what that was about and what you expect or want to see from the ftc in regards to that now. >> again, this goes back to third-party access. in 2009 facebook made two changes to its privacy settings that pushed a lot of private information into the public. it also gave third-party apps
1:50 am
access to information they were not supposed to have. one of the things that is ironic about the complaint: part of the deception was allowing third parties to get access without users' consent. i think i have seen this movie before. one of the things the ftc did was try to rein in third-party collection. it draws a line between users who actually post things and third parties who harvest information. the goal was to limit third-party access unless there is clear notice and clear consent. now facebook is going to say that the settings they had allowed sharing. on the other hand, the question
1:51 am
the ftc was asking is, what are consumers' reasonable expectations about what that means? one question to ask mark zuckerberg next week is, do you really believe any of the friends thought something like this would happen to them? were your notices back then, whenever this happened, clear about that? i do not think they meet that test at all. that will be part of the ftc's inquiry. so, i think that is part of my concern. one section of the consent decree is devoted basically to forcing facebook to look at where consumer privacy is in jeopardy and plugging
1:52 am
those holes. it is clear in the aftermath that facebook paid no attention to that part of the decree, because there are no controls on third-party downloading, there is no remedy for harvesting data that you have no consent for, or for sharing that data with third parties. which is why this is such a scandal. this is two or three years, maybe four years, since facebook has known about this problem, and yet it still has done nothing to fix it. a lot of the things facebook has announced, the new platform policies mark zuckerberg is talking about -- we have heard all this stuff before. the question is, is facebook serious about moving forward
1:53 am
now? >> i want to jump back to what facebook would argue. they would say, when we all negotiated the settlement, this is how it worked and these are the disclosures made to the users, these are the settings they had. what changed to make that not ok anymore? our product is working the way it was designed. >> the consent decree was designed to avoid problems with people like that. it required facebook to give clear and better notices. that was part of the consent decree. section four was looking at vulnerabilities. that was the key provision. in my mind, facebook did not pay
1:54 am
any attention to that. the question for facebook users, at any time after the consent decree was entered into, is: did friends understand the scope of the harvesting of their data? put the question to the 87 million facebook users who had their data taken. the answer, i think, is plainly no. i think in terms of ftc enforcement, i do not think it really matters, because i do not think facebook has any argument that this is not a violation of section five. section five turns on what consumers reasonably expect. i think they also do not have a defense to my view, which is that they violated the consent decree.
1:55 am
there's going to be a very substantial civil penalty. at the time of the google case, the civil penalty statute provided for $16,000 per violation. that has changed; it jumped from $16,000 to $40,000. multiply $40,000 times 87 million people. that would not be the starting point for the agency, but i think there is likely to be a very substantial penalty. we're talking about what the ftc might do. there's also the question of what congress might do or should do. to start that conversation i will move over to michelle.
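for scale, the arithmetic the panelist is gesturing at can be checked directly. a minimal sketch, assuming each affected user counts as one violation -- an illustrative simplification, not the ftc's actual counting methodology:

```python
# Back-of-the-envelope check of the civil penalty math discussed above.
# Counting one violation per affected user is an assumption for
# illustration only, not how the agency would actually calculate it.
PENALTY_PER_VIOLATION_OLD = 16_000   # per-violation cap at the time of the Google case
PENALTY_PER_VIOLATION_NEW = 40_000   # per-violation cap after the statutory adjustment
AFFECTED_USERS = 87_000_000          # figure Facebook reported

theoretical_max = PENALTY_PER_VIOLATION_NEW * AFFECTED_USERS
print(f"${theoretical_max:,}")  # → $3,480,000,000,000
```

the theoretical ceiling comes out to $3.48 trillion, which is why the speaker stresses it would not be the agency's actual starting point.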
1:56 am
>> what should congress do? i think yelling at mark zuckerberg is a start. it is not necessarily going to make change, though. at some point i want to go back to the consent decree. perhaps congress could make some fixes to give consent decrees more teeth. for example, in the google case, the fine was $24 million. so, to make it really matter and actually have some heft -- what congress should do is not remake the gdpr. this is not in line with what advocates are saying. the gdpr is fantastic.
1:57 am
i do not think that we need to duplicate it. that's not to say that there are not elements that could be great in baseline privacy laws, but i do not think enacting a baseline privacy law should replicate the failure of consent. instead, what we should be looking at is expectations. i think the way congress should imbue a baseline privacy law is with the idea of, what is a person's expectation? what sort of transparency is available, and what sort of accountability is attached? the way we interact with these platforms is obscured; in other words, you cannot meaningfully consent for the most part. all of this is by design.
1:58 am
we need to push to the forefront the idea that if you're going to build your expectations into the platform and use these values of accountability, some of those changes have to come from the design side. some design standards. it is more about what sort of interactions make clear to a person what the true value proposition is. one person i know put it very well and said, when companies leverage your data 100 times, that is like a price increase you do not know about. you get nothing out of that. i think that argument is ringing hollow now. there are design principles that allow for more transparency and accountability, not just the gdpr.
1:59 am
i think also the idea of accountability is so crucial. the idea of making public disclosures, drawing on some of the other laws that exist. public disclosures on a quarterly basis. it forces the ceo to have skin in the game. i think there are discrete ways that would make a huge difference in privacy protection, not just baseline privacy. >> what you described is not a modest proposal. >> no, it is not, considering there was a much
2:00 am
worked-on proposal from the obama white house in 2015. it does raise the question of: some sort of gdpr-light, or what? you mentioned a couple of targeted things, impact assessments. there are staff who are wondering, what can i write that my boss could introduce and look impactful on this issue? what can they do that is strong, that will strike fear into the heart of facebook and other companies? perhaps impact their behavior? >> to be frank, striking fear in their hearts is not what i'm concerned with. my eye is on the ball:
2:01 am
how do we get protection for people? with protection should come accountability. i think strengthening consent decrees would be great. it would strengthen the ftc, which, as everybody knows, truly needs more tools and more resources, particularly with what is going on. more public transparency, penalties. something we are going to be coming out with is, here are specific recommendations on consent decrees. you can look at the republicans' reform playbook and use the ideas to get something like that to be more palatable. another area, and david has touched on this, is the idea of data access by researchers. it is something i feel has been avoided because it is a very tricky subject. we don't want to shut down
2:02 am
innovation. we don't want to shut down open access. there are ways to create obligations for researchers that do not exist right now. if you are a federally funded academic researcher, you follow the common rule, which means there are ethical guidelines and you go through an institutional review board. those institutional review boards are fairly worthless -- not for any reason of the people on them, who act in good faith, but they do not ask for things like terms of service reviews. that is not to say that if you review the terms of service and it says you should not do this, researchers should not do a lot of what they do, but it is imperative for there to be a review of that, some accountability for the researcher. there should also be certifications for the researchers so they are held to some obligations, not just for what they intend to do but how they are protecting the privacy and security of the data. facebook's data sharing agreement was very light on
2:03 am
details and light on accountability. i am not sure that was not by design. i think the idea is, let the data go, and then we do not have liability; we do not want to create liability. the other aspect would be creating a chain of command of liability in this ecosystem, which would not be easy, but you start with the platform of the service, the user, the benefit and the risk, and decide and assign the rules and liabilities. those can be chopped up in small ways, and maybe consent decrees are part of that, maybe certifications are a part of that. >> i am guessing, david, you are all for strengthening the ftc. >> yes, i am happy to repeat everything she said before. she is absolutely spot on. i would say this: there may be smaller pieces of
2:04 am
the privacy issue congress might tackle, such as data brokers. this would get at many but not all of the problems. given some of the breaches that have happened and some of the problems with large data brokers, there needs to be something like a fair credit reporting act, but much stronger, for data brokers. the fact is people are worried about what the nsa knows about them, but data brokers actually know so much more, and so do facebook and other companies, and we do not have regulatory tools. these are massive data pools. there is real risk there. >> going back to the platforms and away from the brokers, i am somewhat skeptical that we will see specific use
2:05 am
restrictions in law or requirements about consent, but i think the possibility of much stronger transparency requirements is definitely in the answer. they have been talking about it in the context of advertisements. there is also the question of political viability and timing and what-not. for example, i have a crazy idea, which is there is a single law that is the strongest privacy law in the united states, called the video privacy protection act, and it protects records of what you watch on netflix and at the video store. that got passed after video rental records were obtained by journalists, and congress freaked out thinking it might be them. hence, the strongest privacy law ever. i do not see why that should not extend to the content i interact with online. i shared this with a staffer and
2:06 am
they were like, sure, but that is in the criminal code, meaning it would go through judiciary, and nothing is going to happen in judiciary; we need something that will go through commerce. so what is actually possible right now? what is the timing? they are not going to pass anything this year. they are basically already done because of the election. but what seeds do we need to plant, and where might we best plant them? >> what happens in the fall is huge. it will decide if democrats retake the gavel, retake congress, and then the possibilities are much greater for a baseline privacy law or any kind of updates to privacy in general. it is funny, the jurisdictional question, because privacy is notorious for being in 100 different committees, or people believing it should be in 100
2:07 am
different committees, especially when you have a high-profile case like this. they are all scrambling to figure out how they can fit it into agriculture. [laughter] great, fine. this is the way our democracy bumbles along. but i think, to the extent we can, we should provide staffers with the correct facts about what happened, first of all. that is something i have noticed, even excellent journalists using words like scraping and access interchangeably. those are not the same at all. making sure that we, as advocates, create a fact-based situation -- we need to make sure we have that. and offering different committees different solutions; i think that is up to groups like cdt and other groups to work hard to make sure the committees have information. and we need to bring in republicans who are interested in this issue. they had not said much for a
2:08 am
while, but now they are. that is a really important development, especially in d.c., and i think it is important to engage both sides to explain this is a truly bipartisan issue, or should be. >> moving on to caroline and the issue of competition, which is your expertise: what role does competition law, antitrust law, play in addressing a situation like this, if any? >> so i think one of the questions we have heard is, well, wait a minute, maybe if we had more competition, things would be better. we have seen a lot of headlines talking about the power of big tech, the concentration, the consolidation that has occurred. and can't the antitrust laws do something about this? the follow-on is, maybe if there was more competition and they were doing their jobs, maybe things would not be so bad. i am here to say antitrust can play a limited role in some of
2:09 am
the bigger market structure questions; it is probably not as good at the consumer privacy question. yes, if we had multiple social platforms, we might see competition on the basis of privacy. the trick about social platforms is we really do not want to have to go to six different social platforms and find all of our friends on each one. the value of the service is that the more people who are on it, the more valuable it becomes. so competition is not necessarily something that consumers really would want in the real world. you might want options for competition, and i can talk about some ideas for promoting competition. but i do want to talk about the limits of antitrust law. antitrust law is a law enforcement tool that can keep
2:10 am
monopolies from taking actions that will harm the competitive process, and it can stop mergers that will lead to a lessening of competition. this could have some positive impact on how the platforms compete and the actions they take. another important tool besides the antitrust laws is the ftc's section five authority, which i believe encompasses something more than just the antitrust laws -- congress must have meant something by it; it is not just the antitrust laws. it prohibits unfair methods of competition. so, looking at how the agency can use that authority if it finds actions facebook is taking that
2:11 am
violate this principle of unfair competition. they are somewhat limited tools. they may be able to get at bad conduct. but how do we promote competition? how do we put competitive pressures on a company like facebook? is that through better data portability -- creating a meaningful way to port your data to some other application or service that would be able to create some sort of networking service that consumers want to go to? this goes into some of the questions that i am exploring and trying to think about, and i think i still have more questions than answers on this: the importance of data portability, interoperability, and how the use of api's works into this. as a side note, i have been thinking about how more open api's promote competition, promote innovation, and ensure
2:12 am
the openness of the internet that we all want and get a lot of innovation from. then cambridge analytica happened. i thought, oh, shoot, i need to step back and think about: what is the process behind api's? is it more responsible use? how do these feed into the digital ecosystem? a warning, and something we should be looking for -- and the ftc is well placed to do this because they have the office of technology -- is to make sure that facebook, as it starts to review its api's and policies, and i think there are a lot of things they should do to protect privacy and security to make sure that people cannot willy-nilly get access to all of our consumer data, at the same time makes sure there is not an overcorrection in terms of shutting down access to
2:13 am
data -- not personally identifiable information, but data that helps developers come up with exciting new programs and apps that people want to use, that could ultimately compete with facebook. facebook could have the incentive and the ability, using its api's, to say, you know, let us not let this developer access data; we are worried this is a threat to us, so let's shut this down. >> this is something a number of us have been thinking about, including the commissioner. you can go look at some tweets about it if you like. there is portability, getting your data out. facebook does have a tool for getting your timeline out, but it was built expressly for downloading your own data. it is not really suitable for uploading into another service, assuming there are services out there.
2:14 am
>> they built it. >> but not for you to take it somewhere else. gdpr is going to require some level of portability, including machine readability, so you can move it somewhere else. it remains to be seen how people are going to implement that, and that will be really interesting. then there's also the issue of, how do we come up with an environment where something can actually get big enough that you will actually even want to move your data to it? then you get into interoperability. from where i am sitting, one of the more plausible versions of competition at this point -- now that facebook has bought the two networks that were getting big enough to compete, instagram and whatsapp -- is to be able to leave facebook while still being able to communicate with people on facebook. they are not selling the platform, it is the people on the platform. if no one feels like they can leave because everyone is there,
2:15 am
then a hashtag movement means nothing and there's no pressure to change. how do we do that? what are the tools, other than consumer outrage and a bully pulpit, to push facebook in the direction of what it probably considers its most mortal threat: building doors into its walled garden? >> you would have to find the market power, but doing things that would require that sort of a remedy, to say, look, you need to make this open -- i think one of the solutions is more likely legislative, or giving the ftc more authority to do rulemaking. the important thing is it is not just helpful to have my pictures and my posts; the most valuable pieces were my friends. we have to also think about privacy. my friends are my friends, but
2:16 am
if i'm going to port it over to some other service, are my friends ok getting pinged by the new service to join? i think there are a lot of questions about what meaningful data portability to another service actually means, and the devil is really going to be in the details. i think it would be great to hear from developers and others who are trying to think about what is the next great social network they want to create, and what they would need to be able to meaningfully port the data, or get the data, capture the data, and then build a service off of that. >> what you are really talking about is forcing facebook to become a common carrier, and that is fraught with all sorts of subsidiary problems. interoperability -- i just do not know how you force it except through legislation. >> there is some precedent, like
2:17 am
when aol bought time warner, there was a condition about aol having to make its messenger interoperable with its biggest competitor. at this point, whether facebook would put itself in a position where it would have to accede to such a demand -- >> besides amazon and google. >> when will the authorities have that type of hammer? >> maybe it is small parts of facebook, like messenger. maybe that is not that small. but looking at specific communication aspects, like the plug-in or something. >> it also seems to me there is a core tension between privacy and interoperability. yesterday, they announced that they were closing off more parts of the api. privacy advocates will cheer, but competition advocates will not. so how to get both at the same time will probably have to do more with legal and policy
2:18 am
solutions, rather than tweaking of the knob in terms of how much of the user data the api exposes or not. >> and i think this is going to require a lot of study. it will be helpful to have the research and empirical data. to respond a little bit to common carrier: we really do need to think, is that what you want to do? because you need to keep in mind that we want to ensure the incentives to build the next great whatever platform or facebook is going to be. if you create a policy that says, if it is going to be big and have lots of people, then you need to open everything up -- that is likely to chill innovation. and maybe the flipside is that it will force the services to be the best they can. competition is good. you are going to want them to say,
2:19 am
come to me, because i have the best privacy policy, whatever it is. it is going to be an important consideration: what trade-offs might there be for getting certain policies we think will benefit consumers, and what impact will they have on innovation? >> as that occurs, i will admit that my greatest fear right now is that this is not only going to push the decision-makers at facebook but also the rest of the industry toward, like, it is not worth trying to play in that field. i worry that there is a bit of a briar patch dynamic, and facebook is like, no, please don't make us lock down our api's even more. >> there were some ideas around data collaboratives that i think could be interesting for people to look at -- the idea that maybe part of the data, part of some
2:20 am
of the big platforms' data, could be put into data cooperatives or in trusted intermediaries. maybe there are ways to segment some of it so it doesn't create problems for research. >> what is the minimum viable amount of data you actually need? >> a really good question. >> all these api's are not actually relevant. it is really about, can i get my friend list, and a piece of data that will allow me to re-identify them? but you get into a privacy problem. they say, well, gdpr will let us do that. >> gdpr will not apply to american data. >> yes, in europe, though. >> anyway, a lot of rich discussion there. not to overpromise, but i
2:21 am
believe oti will be doing an event on interoperability and portability within the next few months. knock on wood. in the meantime, we have a few more minutes for some questions, and there are hands shooting up. there is a mic going around, so please do not speak until you have that, for the benefit of folks watching on c-span or online. >> i'm from access now. david, you spoke about the consent decree, and michelle, you spoke about audits. audits are in this consent decree, and they would have had to have gone through at least one, probably two, between then and now, and it clearly did not do anything to push us forward. is an audit actually something we should be pushing for? was it effective with the consent decree? if it was not, which it seems not to have been, how can you make them better? >> the word audit is a word of many meanings. the consent decree requires
2:22 am
biannual filings by a third party to essentially make sure that facebook is adhering to the commitments it made in the consent decree. so there have been at least two, maybe three, of them submitted since the consent decree was filed. i have not seen one in several years, but that is not really what i am talking about when i talk about audits. i am talking about facebook trying to make sure that third-party apps stick to whatever commitments they have made in terms of what they are going to download and not share with third parties. one of the real problems with cambridge analytica is facebook really did not know what kogan was downloading, nor does facebook have any means of making sure that data was not shared with third parties or sold. there was no control and
2:23 am
no audit. so what i am talking about is ways of overseeing third parties who have access to data. there need to be some contractual lockups, and there need to be some ways of, after the fact, making sure that the third parties behave as they promised. at the moment, facebook is simply depending on wishful thinking that third-party apps are going to do whatever is in the terms of service. but the cambridge analytica debacle has shown there really are no controls. when i use the word audit, that is what i am trying to refer to. >> from what i understand, these are really assessments, which is different. so you have the private assessor who is sort of cataloging benchmarks. there are ways for companies to game this. they will change their practices right before, for example, to look better
2:24 am
for the assessor. they will often not -- obviously, this is not public -- list material changes, things that would be relevant to a consent decree. the assessor does not have access to them, or the company does not tell them about it because of the timing. a formal audit would require it to be public. it would require the company to sign off on material changes to their policy and practices before and after the audit happens. and it would create a sense of accountability tied to the consent decree. >> it is not at all clear that facebook reported what happened with kogan and cambridge analytica to the ftc, but i think it would have been required in one of the biannual reports. >> thank you, my name is
2:25 am
courtney, i am the advocacy director of the committee to protect journalists. it is interesting, because in several conversations around the countering violent extremism debate and the fake news debate, one of the big concerns is the lack of access to data by researchers. so it seems like there is this tension between access to all this personal data, yet facebook and other companies not turning over lists of content that they have removed or censored, etc. so i think that we should be careful about making broad brush strokes about institutional review boards. at least they have to go through something. definitely they need to be more technologically apt. but how can we balance the need for more oversight and auditing potential, both by researchers but also potentially by the judiciary, as they are deciding what content is illegal or not outside the judicial process, with the need to protect private data? >> i can take that.
2:26 am
at least in terms of the recommendations i was talking about, it is true, especially when talking about countering violent extremism, that how to balance privacy and transparency is a tough problem. the recommendations i was talking about apply to ads. facebook already considers ads to be public -- with the pilot they are trying, any post that money has been spent to boost is already considered public and not private information. and so, at least in terms of -- it will not get at all of the problem, but for a large part of the problem, beyond just elections, more transparency about ads will probably do a lot of good without really risking individual privacy. >> i will just add that i think your question is one we all need
2:27 am
to be asking, and it is relevant to the interoperability question: how to balance the need for privacy with all the other things we need. you have tom wheeler writing in "the new york times" about needing more open api's, but at the same time, we have a push to close them down because of privacy. to be fair to facebook, i would be like, what the hell do you want us to do? >> we can look at things that already exist. maybe it just needs to be codified -- the fair information practices. it is a data governance framework. most privacy advocates and experts are very well-versed in it. it is outdated in some ways, but it offers a good way to think about how to govern data well. when i say obligations on researchers, that does not mean restrictions in terms of, this is
2:28 am
a good project or this is not a good project. it is about privacy and security restrictions and requirements: maybe limiting the amount of the data, the scope of the data, the reason you're collecting the data. those kinds of requirements really do not exist right now. that would be a step forward. >> we have time for two more questions. >> thank you, my name is david, i am a researcher who has been looking into this cambridge analytica situation for over a year. i am pretty familiar with the internals. one of the things you might want to consider as part of your advocacy around the consent decree, to mitigate the situation, is kind of a technical solution, which is to simply deprecate -- for facebook to internally deprecate -- all of the user id numbers that have been exposed. one of the problems here is that there is a horse-left-the-barn-in-2015 issue, and then you simply cannot target those user
2:29 am
id numbers anymore. it would not totally solve the problem, but it would mitigate it in large part. >> that is really interesting. >> that is just my thinking. if you have not been thinking about that, it occurred to me. it is something facebook will not like very much, so it will have a punitive aspect to it, but it will not crush them and will not fundamentally disrupt legitimate ongoing activity from people who are obeying the rules. >> thank you very much. >> i would be happy to talk to you more about it. >> that would be great. >> one more question. >> my name is bob, i am a privacy consultant here in washington, d.c. i have a simple suggestion or thought. what if, as a result of whatever comes out of the ftc investigation, the ftc were to require facebook to divest instagram or some other application? that would create competition, solve some of the problems --
2:30 am
there could be other requirements, but that would be something facebook could say is punitive, along the lines of the last question. could the ftc do that? >> you would have to ask. when i was at the ftc in the dark ages, i don't know whether the commission would have thought of a remedy like that for a deceptive act, as opposed to one that defeated competition. i suspect the agency would still have problems doing that. you know, it's an interesting idea. bob always has interesting ideas. it's an interesting idea people should think about. the remedies that the agency has are basically equitable, and equity can, at times, force people to do all sorts of things they don't want. it is an intriguing idea.
2:31 am
>> i'm sure there will be other intriguing developments ongoing, including the testimony next week, which i hope everyone enjoys or finds interesting. it should give us new things to chew on. thank you, everybody, for coming, and thank you to the panelists. [applause] [captioning performed by the national captioning institute, which is responsible for its caption content and accuracy. visit ncicap.org] announcer: tomorrow, a discussion about key security and political developments in somalia and the economic and political future of the country. live coverage from the brookings institution starts at 10:00 a.m. eastern on c-span.