Facebook Data Privacy Practices CSPAN April 5, 2018 3:33pm-5:07pm EDT
3:33 pm
>> all right, folks, i think we will get started. great. good afternoon. i'm the director of the open technology institute at new america, where we work to ensure everyone has access to an internet that is open and secure. thank you for joining us today for this conversation about facebook, cambridge analytica, and what we should do next. if you're not sure what i'm talking about, you might be in the wrong room.
3:34 pm
once upon a time, there was a fast-growing social network called facebook that hoped to grow even faster by becoming a platform for other apps. in 2010, it launched an api, an application programming interface, that allowed app developers to use data from facebook users who had signed up to use their apps. but there was a big privacy catch: app developers could obtain data not only from those users, but also from all the friends of those users. though nominally facebook notified users of this setup through its privacy policies, and there was a not particularly easy to find privacy setting for adjusting what data your friends could share about you, the default on that setting gave apps incredibly broad access to friends' data, and most ordinary users had little understanding of what was going on.
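to make that mechanism concrete, here is a minimal sketch in python, assuming a generic http client, of what a graph api v1.0-era app could do once a single consenting user granted it the old friends_* style permissions. the endpoint shapes and field names below are illustrative of that era, not a verbatim reproduction of the retired api.

import requests

GRAPH = "https://graph.facebook.com"  # v1.0-era base url (illustrative)
USER_TOKEN = "ACCESS-TOKEN-FROM-ONE-CONSENTING-USER"  # placeholder

# 1. list the consenting user's friends; the friends themselves take no action
friends = requests.get(
    f"{GRAPH}/me/friends",
    params={"access_token": USER_TOKEN},
).json().get("data", [])

# 2. pull whatever the friends_* permissions exposed about each friend,
#    even though those friends never installed the app or saw its consent screen
for friend in friends:
    profile = requests.get(
        f"{GRAPH}/{friend['id']}",
        params={"fields": "name,location,likes", "access_token": USER_TOKEN},
    ).json()
    print(profile.get("name"), profile.get("location"))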
3:35 pm
so for about four years, until facebook tightened up access to friends' data with an api upgrade rolled out in 2014 and 2015, untold thousands of app developers siphoned off tons of data from people who did not even use their apps. the primary guardrails protecting that data from misuse after it left facebook's platform were simply facebook's terms of service, telling developers they should only use the data to provide the service that users had signed up for, and that they shouldn't, for example, sell it all to a spooky political consulting company that wanted to build profiles of people in order to better manipulate them. of course, we now know that is exactly what happened: in 2014, a researcher named aleksandr kogan used a survey app called this is your digital life to attract about 270,000 facebook users, and through access to those users' friends he was able to obtain personal information about, well, we are not sure exactly how many, but we have heard
3:36 pm
from facebook that the number could be as high as 87 million facebook users. kogan then sold the data to cambridge analytica, a political consulting firm that worked with the trump presidential campaign and the brexit campaign, has bragged about influencing other political outcomes in mexico, australia, and kenya, and, based on recently released undercover recordings, apparently has a whole toolbox of tactics for influencing political candidates. this brings us to last month, when we learned about how cambridge analytica had obtained the data. we also learned that facebook had known about the passing of data to cambridge analytica since late 2015 but did little to confirm that the misappropriated data had been deleted, other than demanding that cambridge analytica certify that it had done so. facebook also continued to allow cambridge analytica to advertise on its platform until just
3:37 pm
before last month's story broke. this has led to a firestorm over facebook's privacy practices, just as controversy has already been raging for over a year about how several big platforms have been used to help spread foreign propaganda during the u.s. presidential election and in other elections since then. facebook is losing billions of dollars in stock value due to lost public trust, and it is promising to make extensive business changes to regain that trust. policymakers in the u.s. and europe are rattling the saber of regulation, and ordinary folks only now seem to be starting to understand how facebook actually works, or at least how it worked four or five years ago, and what that means for their privacy. so the simple question is: what now? what should facebook do, what should policymakers do, and what should users demand they do, with regard to facebook or internet
3:38 pm
platforms generally? i will be talking to commissioner mcsweeny about these questions and more generally about the state of online privacy and how to improve it. before that, i wanted to pass the mic to my colleague who runs an independent project called ranking digital rights, dedicated to answering another question very relevant to today's proceedings: how well are companies like facebook respecting users' rights? she will briefly give a preview of how the latest annual corporate accountability index, being released later this month, will answer that question. then we will move on to my conversation with the commissioner and then our panel with the experts. thank you. >> thank you. i do not want to take too much time other than to let you know the 2018 corporate accountability index will be launched
3:39 pm
on april 25, and we have a flyer here for april 27th, an event right here in this room where we are planning to talk about it in person for the people who are not in new york. the 2017 index can be found on our website, and you can see how we evaluated companies last year. the index ranks 22 of the world's most powerful internet, mobile, and telecommunications companies on commitments and policies affecting users' rights to freedom of expression and privacy. there are indicators looking at facebook and other company policies affecting how they handle user data, and it will not surprise you that, as you
3:40 pm
can see on the website, facebook did not perform well last year, both in the policies it disclosed and in what it did and did not disclose. you will not be shocked to hear there has not been a revolutionary change between 2017 and now. you can see the report when it comes out online on april 25 for all the details, downloadable data, and the analysis. we will have the launch event in new york on the 25th and a similar event here on the 27th to discuss it in person, and people will be able to go through and discuss the results in great detail. one other point in relation to handling of user data generally: facebook is doing poorly. its disclosures were toward the bottom of its cohort.
3:41 pm
just a little preview. thank you. >> thank you. we look forward to reading all about that. i would like to welcome commissioner terrell mcsweeny to talk about the issue. terrell: hi, there. thank you for having me. kevin: of course. i will start with the same question i will also pose to our expert panelists, which is: is this a tipping point? is this a snowden moment, like in the context of surveillance, where we might actually see changes in policy, or is this more of a moment where we will see a lot of noise but not a lot of action? terrell: i think you just made your own point. let me start by saying thank you
3:42 pm
for having me here today. i will give you my own perspective and not the official views of the federal trade commission. i will not pull any punches, but i will be careful not to talk beyond what has been confirmed, which is that there is an open investigation into at least some of the conduct alleged here. still, i think we should have a policy conversation. i appreciate your first question, which is, ok, maybe 87 million people's information was misused, is it a big deal? i would say we're not even talking anymore about the fact that 250 million people's very detailed information was breached not even a year ago, and that incident unfortunately did not have the policy impact that i had hoped.
3:43 pm
i certainly hope this is a moment of change. i think it is also a powerful moment because the general data protection regulation in europe is coming into implementation in may, and changes are being made in response to that. i think that is having a big impact at the same time that this news cycle is having a big impact on the story. if it has an impact on just one thing, what i would really like it to have an impact on are the people who say to me, when we have been talking for years about data protections or consumer protections for the digital age, that american consumers just don't care. i think that is demonstrably false, and i think we have evidence that people do care and that consumer trust is incredibly important and ought to be at the top of everybody's list in terms of what they are concerned about for businesses. i think it is also underscoring to me the fact that consumers
3:44 pm
are not necessarily understanding or anticipating fully all of the risks of transacting in their data on these platforms. i do not personally believe we should be trying to put all of that risk onto individual consumers to anticipate what might happen to their data, and i think that is part of the policy conversation we ought to be having. kevin: currently, under data protection rules in the u.s., the ftc is the primary consumer privacy cop on the beat, going after unfair and deceptive trade practices. indeed, you have gone after facebook before, going so far as to negotiate a consent decree with them in 2011 over alleged deceptions around their last big privacy transition. which raises the question, how did
3:45 pm
this happen if the ftc, the cop on the beat, already had this consent decree in place and presumably was policing facebook? terrell: i think that is 100% the right question. i'm a sitting federal trade commissioner and i love the ftc. i think the people, the staff, are doing an incredible job with pretty antiquated authority. it is a 104-year-old agency using its authority to protect consumers from unfair and deceptive practices, and it has been able to adapt that to the online environment. the agency itself has always called for stronger tools. i think this set of facts underscores that the ftc is not strong enough, as it is currently configured with its current authorities and its current resources, to be the kind of consumer protection agency required for a moment in which we are connecting every part of our lives to the internet and to each other.
3:46 pm
kevin: how do we fix that? terrell: i would start by making sure it is adequately resourced. by the end of the obama administration, they did call for more resources for the agency. it has never been funded near that level, so it is under-resourced, and that is easy to fix. i think it also needs to think about its configuration. one thing it has been doing, and i am proud of the ftc for this, is that we have been bringing more technologists into our work and bringing more researchers on staff. we have an office of technology research and investigation that i think is a great first step in that direction. i think we need to think about institutional design and whether that kind of capability ought to be significantly expanded, maybe by the creation of a bureau of technology
3:47 pm
like the bureau of economics, so there is even more horsepower within it. it also needs additional authority to contract with outside experts so it really has the resources to evaluate what it is being told. it needs in-house expertise and additional resources to bring that in when it does not have it. beyond that, it has called for civil penalty authority, not just for data security and breach violations but for privacy as well. it needs rulemaking authority it can use for privacy and data security. it has also been studying some conduct that it finds very concerning, looking at the data broker industry for example, and it has called for more transparency and accountability for data brokers, and i think that is really important. beyond that, you could also be making the case for consumer rights that we need in the digital age, which include things like interoperability.
3:48 pm
those are meaningfully procompetitive as well. kevin: one of the big limits on what the ftc can do, and what sticks it has to work with, is that your primary tool with regard to privacy has been deceptive trade practices. if someone misrepresents what they are doing with your data, that is within your reach, but if they are doing something with your data that is awful but they tell you about it and nominally you consent, that is ok. how do we get past that? is notice and consent a workable model? facebook certainly relies on that model and has argued that its users, and the ftc, had consented to and had notice of how this works, that this is what the product was. so where do you go from there? terrell: the idea that notice and
3:49 pm
consent is a framework that can protect consumers in this environment has been described as quaint, and i think that is correct. i do not think we can continue to rely solely on that framework, and i do not even think the ftc itself is advocating we rely fully on it. the ftc does have limits to its authority, and it does look for deception, which we can reach if you are not telling people truthfully what is happening to their information and how it is being used, and i think that is very important. it has also been looking at the role of consent and how that has played out in the marketplace. it has been emphasizing best practices around requiring notice that is clear and timely and outside of long terms-of-service or privacy policy agreements for a number of years, and that choices need to be offered around the collection of sensitive information in particular. i think it has consistently laid
3:50 pm
out best practices; whether the industry has actually been following them, i think, is a different question. it raises the issue again of whether the ftc is strong enough, and i am obviously arguing that it is not. i also do not think it needs to be doing this all by itself. we are sitting here one year after congress repealed the stronger broadband privacy rules, and i think there was no justification for that. we need more than one consumer protection cop on the beat. kevin: the privacy settings are confusing. this is not unique to facebook, but its settings have gotten more complex. i want to share an anecdote that describes that and illustrates some of the problem. when the scandal broke, i went back and looked at the settings. i have been working to some extent around facebook privacy for some time and i am familiar
3:51 pm
with it, but i went into the settings and found a setting about apps that your friends use, governing what they can share with an app, subject to all of these checkboxes, and 90% of them are checked by default, though you can uncheck them. facebook did not update that privacy setting when it updated the api in 2015, so there is a vestigial setting, and i asked facebook if the setting means anything at this point, and they were like, we haven't gotten around to changing it or getting rid of it, in part because there are some cases where some of those checkboxes do matter. they named one, and i was like, is that the only one, and it was not clear they were sure whether there were others. they are clearly sorting that out now. so they are not even clear how their own settings work. how can we be clear on how their settings
3:52 pm
work? terrell: again, you are identifying the weaknesses of the model we are using. i do not know if anyone else in the room has had this experience, but shortly after the story broke, for example, i had the opportunity to sit down with my mom and go over her settings. she was very concerned but also wanted some help figuring out how the privacy settings work. i walked her through it, and we went in and i said, you know, this is where you view the settings, and she said, turn it all off. i do not know if anyone else has had that conversation with their mom, but it is a familiar situation to me, and it suggests to me that people are trying to exert choices over how their data is being used and how it is flowing out of the first-party relationship they have with whatever service they're using.
3:53 pm
they think they are exerting choices, and they may not necessarily know that there are more choices they need to be looking at. the ftc has been looking at this issue. we looked at it only recently in our paypal-venmo case, where we said you can't have a default setting that requires navigating several different options to keep something private. if you are trying to keep something private, and you think you have set it to private but then have to take three more steps, that is a problem. the ftc was looking at whether consumers could really navigate the settings they were being offered. it has also looked at what i think of as cases more about trickery and clever technical workarounds to privacy choices. if a consumer has said do not track my geolocation in an app, then a program that is
3:54 pm
triangulating your location using wi-fi is probably going around that privacy setting. we have to find ways to make sure the technology is following the privacy choices of consumers. i think we have been looking at these issues, and the fact that we have had more than one case already on this suggests to me there are problems out there that we need to continue to be very active about. kevin: in trying to protect consumers, what role does the ftc's unfairness authority play? terrell: it is particularly influential in cases involving whether security practices are reasonable or not. it is a really important authority, but it is challenging to use, so we have to have a
3:55 pm
clear likelihood of harm. we have talked about harms that are not just economic, but harms that involve privacy invasions, turning cameras on in people's bedrooms, and emotional harms with revenge porn and things like that, but it can be tricky for the ftc to reach some conduct using just the unfairness authority. this is an area where i continue to reiterate that the ftc cannot go it alone. one thing that has happened is that when it starts to use its unfairness authority aggressively, congress in the past has stepped in and limited it pretty severely. the agency has been cautious in developing how it uses the authority, but with good reason. kevin: what about market harms? you also have competition
3:56 pm
authority. when people talk about platform monopoly or breaking up these companies or a variety of other ideas to deal with the fact that they are big, what role do you see there? terrell: i think more competition would definitely benefit consumers. part of this has to do with the economics of how these markets work. it is not completely obvious to me that just getting more competition would yield better outcomes and better protections for consumers. that is why we need additional regulations to help orient the marketplace toward the outcomes we want in data use and security and privacy. more competition is good, and i
3:57 pm
think the ftc using its competition authorities aggressively is terrific. it should be advocating procompetitive policies like interoperability. but i think we need to be mindful of the fact that we cannot just rely on competition as a market force to correct for all of the problems we are potentially seeing here. kevin: you talked about what congress could do to help strengthen your agency. what about when we are talking about privacy online more broadly? terrell: i think congress could start thinking about what laws it needs to pass to protect consumer privacy. one thing is to stop passing laws that undermine it. we could start there and build on that by really, you know, taking another look at a number of ideas that a number of people here have been talking
3:58 pm
about for a while: comprehensive privacy legislation and also real comprehensive data security and cybersecurity legislation. i would argue again for more transparency and accountability for data brokers. obviously, more resources, more researchers, and strengthening the ftc specifically as an agency, i think that is very important. and talking about some rights that consumers really deserve here, the right to control over your data, meaningful control and meaningful interoperability, those are important conversations that we need to be having. that is the consumer protection angle. but i want to emphasize one thing we are seeing play out in all of these stories: it has to do with bigger issues and bigger risks than just the consumer protection harms we're all concerned about.
3:59 pm
i mean the potential for technology and disinformation campaigns to undermine democratic institutions. i'm personally worried about the implications for the open internet, and manipulation of democratic institutions is a deeply harmful thing. addressing it will require more than just addressing privacy and consumer protection issues. kevin: there is a new comprehensive data protection regulation in europe, the gdpr, that will come into force next month. does that strengthen or weaken your hand, and how does it impact the argument for trying to get a privacy protection bill done in the u.s.? terrell: i do hope it strengthens it. if what in fact is happening is that the major technology companies are coming into compliance with the gdpr, and they are offering the same rights to american consumers as they are offering to consumers in
4:00 pm
europe, it seems to me a lot of the opposition based on the burden of allowing u.s. consumers to codify those rights is eliminated. i think it could have the effect of making it easier for congress to really think about rightsizing consumer protection for the digital age. kevin: i hope you're right about that, and i appreciate you taking the time to chat, commissioner. we will invite the rest of the panel up now. [applause] >> how are you all? i'm going to let everybody introduce themselves. we will go down the line. what i would like is for everyone to introduce themselves and briefly answer the same question i posed to the
4:01 pm
commissioner: is this a tipping point or an equifax moment? >> i think this scandal will have legs for a little while. i think it has been in the works for a long time; people have been asking these questions for a long time. i think this will make a difference. the ftc investigation will be ongoing, and i hope a spotlight will remain on how the company responds to this, whether its response is sufficient to protect consumers' interest in privacy and their data, but also any potential impacts the response might have. it is important to insist on
4:02 pm
consumer protections, but we also need to be mindful of unintended consequences or overcorrections that could inhibit things we love about the internet, and of ensuring that competition remains vibrant and that facebook does not use this as an opportunity to shut down competition in the name of privacy and security. >> and who are you? >> i'm caroline holland, a tech policy fellow working on competition in the digital ecosystem and a healthy internet. >> my name is harlan yu. i'm with upturn, where we focus on technology and civil rights issues. to answer the question, i hope so, but we have no way of knowing. it does seem, especially because it is facebook and because of the links that it has to political campaigns and political groups that people
4:03 pm
seem to have intense interest in, that this story may have more legs. it seems like facebook, given the public pressure they are under, is willing to make some changes, and hopefully positive ones. i think that's going to require an ongoing conversation to see what those changes actually are. >> i'm michelle, director of the privacy and data project at the center for democracy and technology. i think the answer lies in your framing of a tipping point. there have been little chips away at the public trust in the internet, even in digital systems writ large. this has the feel of a crescendo, and when it was tied to a political campaign, where there already is a great amount of
4:04 pm
distrust, you had the fact that this is a company that portrays itself as being friendly to consumers and free for their use, and that is the value proposition users think they are getting: facebook says, we will make this free for you because we want to connect you. all of those things crescendoed into this moment. something will happen; i'm an optimist. i think facebook will have to face the music. it will have to change practices and become, at the very least, more transparent. >> i'm david. i teach law at georgetown law school. i'm not sure this is a tipping point, but it will be a significant moment in a couple of ways. it depends on what facebook does. facebook has a lot to answer for.
4:05 pm
there will be public hearings next week where mark zuckerberg will testify. i think a lot depends on what path facebook decides to follow. this is the first major alleged breach of a consent decree with an internet giant; there was a dustup with google shortly after it entered its consent decree, but this, in my view, raises major issues about facebook's willingness to comply with a federal order. what will zuckerberg's and facebook's response to the ftc be? this may be a tipping point because it could force the agency's hand. >> in terms of facebook's response, they have announced a whole bunch of changes trying to regain user trust.
4:06 pm
they will be simplifying privacy settings, although a lot of that was already planned for gdpr compliance. they are clarifying their terms of service and privacy explanations, but are apparently not yet talking about substantially changing any of those terms. they stopped working with offline data brokers to facilitate ad targeting, which is good. they closed the privacy gap in the people-search function that was allowing mass scraping of parts of profiles. and just yesterday they announced they will be narrowing what data is available to app developers across the range of its apis. it seems clear they are doing anything and everything they can, short of significantly changing their business model or terms of service, to tamp down concerns. what else should facebook be doing now? if you were in the war room at facebook at this moment, what would you be advocating for?
4:07 pm
>> the cambridge analytica scandal raised two related but distinct issues. obviously we will be talking a lot on this panel about user privacy and the scope of sensitive user information facebook makes available to third-party developers. but there is an equally vital conversation we need to be having, one that commissioner mcsweeny alluded to. as facebook starts to raise the walls of its walled garden, how is the public going to scrutinize what's happening inside that walled garden? in particular, how facebook's business model is vulnerable to potential abuse, and how the company handles those issues. the cambridge analytica story got a lot of attention because of its links to political groups and political campaigns.
4:08 pm
i think what the story did was intensify people's interest in the ways that facebook data and the facebook platform were potentially being used to manipulate elections and the political discourse in this country and elsewhere. facebook has promised some amount of additional transparency, especially after its own internal investigation of russian interference. it wants to establish a new standard for ad transparency that would "help everyone, especially political watchdog groups and reporters, keep advertisers accountable for who they say they are and what they say to different groups." i want to spend a few minutes talking not about the user privacy side but about ad targeting. when i talk about ads, i'm really talking about any message on facebook that's touched by money.
4:09 pm
whether you are a company or not, anyone can spend money to boost a message to a certain segment of facebook users. your mind probably gravitates toward consumer products: chase has a new credit card it wants to target to certain consumers. but we are also talking about political campaigns and political groups that want to run issue ads. i'm talking about other nation-states that want to spread misinformation intentionally or to exploit certain divisions in our society. i'm also talking about ways that advertisers might be using facebook's ad targeting platform to prey on vulnerable consumers. these are ads that are trying to recruit patients for fraudulent opioid rehab centers or sketchy
4:10 pm
loan repayment schemes, or ad campaigns that drive illegal discrimination, especially if we are talking about advertising in finance, employment, and housing. alongside legitimate ad targeting, there is a range of abuses that are possible in the system. this goes to the core of facebook's business model, which is finding certain segments of facebook users and targeting paid messages to those users. the main question i have is, how is the public in the future going to be able to scrutinize this ad targeting behavior and address these abuses? what is facebook going to do to help address these issues? how will they be transparent
4:11 pm
about what happened? facebook has made a few small promises, but it seems clear to us that there is a lot more that the company can do. there are four potential things facebook could do. first, facebook has slowly started to make advertisements on the platform available to public scrutiny. they've been doing a pilot project in canada where, if you are a user and you go to an advertiser's page, you can actually see a list of the ads that advertiser is currently running. so in principle, all ads are visible, but it's also a very manual process. if you are a researcher, it makes it difficult to know the universe of all ads that a particular advertiser is running. many advertisers have hundreds of ads, and you have no idea what the scope of those ads and the reach of those
4:12 pm
ads are. so the first thing i think facebook could do is, in the same way that they built a very robust api for user data, build a robust api for advertisements that allows the public, researchers, and journalists to scrutinize ads more effectively. second, facebook has started to make enhanced transparency promises, especially around elections, but that's a narrow slice of the problem: looking at federal election ads that mention a candidate or an election or that urge people to vote. it ignores the broader range of abuses i described. they should be turning their attention not just to election ads but really to all ads run on the facebook platform. third, to have effective accountability it's important not just to know what the
4:13 pm
content of those individual ads is, but also to know the scope and reach of those ads. i'm talking about exactly what the explicit targeting criteria were for the ad campaign: who is that advertiser trying to target? this gets complicated. they might use something like a custom audience or a lookalike audience, where they upload a list of existing voters or consumers and facebook builds a lookalike audience that has the same features. so what facebook needs to do also, in addition to making the ads themselves more transparent, is to expose the targeting criteria and information about the audience a particular ad actually reached. how many people? what are the demographics? and fourth, to the extent that facebook is using its
4:14 pm
internal enforcement to take down bad ads, they should disclose to the public a detailed accounting of all the bad ads they are taking down and for what reasons they are taking them down. those are the steps that i think would provide real transparency and accountability, and these are the kinds of steps that will help raise the public's trust. it's not just facebook telling us they are doing these things to try to stop these nefarious behaviors, but actually letting the public scrutinize this and verify that this is actually the case. i will put in a plug for a report our upturn team is releasing in the next couple of weeks that hopefully will serve as an advocate's guide to ad transparency and what we should be pushing for. >> looking forward to reading that.
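as a thought experiment on the ad-transparency api the panelist describes, here is a hypothetical sketch of the kind of machine-readable record such an api could expose per ad: the creative, the explicit targeting criteria, the audience actually reached, and any enforcement action. facebook offers no such api today, and every field name below is invented for illustration.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class AdDisclosure:
    advertiser_id: str                        # page or entity that paid for the ad
    creative_text: str                        # the boosted message users saw
    spend_usd: float                          # money spent on the campaign
    targeting_criteria: Dict[str, List[str]]  # e.g. custom or lookalike audiences, interests
    impressions: int                          # how many people the ad actually reached
    reach_demographics: Dict[str, float]      # share of impressions by age, gender, region
    removed_for: str = ""                     # stated enforcement reason, if taken down

# the kind of record a researcher or journalist could pull and analyze in bulk
example = AdDisclosure(
    advertiser_id="page-12345",
    creative_text="Struggling with loans? Call now.",
    spend_usd=1500.0,
    targeting_criteria={"custom_audience": ["uploaded-consumer-list"],
                        "interests": ["debt relief"]},
    impressions=250_000,
    reach_demographics={"age_55_plus": 0.72},
)
print(example.advertiser_id, example.impressions)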
4:15 pm
it seems that we are building a manipulation layer into the internet that we do not understand or control. in the interest of transparency, this reminds me i should disclose that facebook provides financial support for some of our work, and although clearly we disagree on some things, we are also aligned on some important issues around security. in terms of my wish list, i would add greater transparency in the sense of appearing in venues like this. we did invite facebook, but they are busy. they have done a lot more press calls than usual lately, releasing transcripts and stuff like that. seeing more of that public engagement would be great. as far as other people's wish lists, if they were in the
4:16 pm
war room, any other things, or shall we move on to another question? >> let me say what i think they should talk about doing. one of the real problems in the cambridge analytica debacle is how little control facebook was exercising over third parties, particularly app developers. facebook has recently acknowledged that they don't really have contracts with third-party app developers. they don't have remedies in case of deliberate overharvesting or sharing. they obviously did no due diligence on third-party app developers. when it comes to how you solve the cambridge analytica problem, part of it is there has to be greater oversight and control, as this whole episode has proven.
4:17 pm
that was one of the aims of the consent decree. one part required facebook to identify threats to privacy and to plug those threats. since third-party app developers had access to this data, that was an obvious vector for privacy violations. one would have expected that there would have been some controls placed on app developers. yet, as this debacle unfolds, it becomes more clear there really were none. go down the list: some kind of due diligence about who has access to the data; some kind of contractual lockups that give facebook the power to require oversight; some sort of certification of no over-collection or sharing; audits done by facebook or an outside
4:18 pm
party to ensure that there was compliance. we are weeks or months into this, and facebook cannot assure us that the cambridge analytica data is not still floating around, or that cambridge analytica or kogan, the researcher, have actually destroyed it. i teach law school; we teach students how to enforce these kinds of promises. it does not appear facebook has any remedies at all to discipline third-party apps that have broad access to consumer data. i could go on. what i would like to hear from facebook is, what are we going to do to control this? yogi berra's famous line is, this is deja vu all over again. we saw these problems in
4:19 pm
2011. the consent decree was designed to avoid the cambridge analytica problem. one of the things i would like to see is facebook come before the senate, as zuckerberg will do next week, and come in with a list of things that are going to control this part of the problem. i agree with harlan, there are lots of other problems, but in terms of providing minimal safeguards for consumer information, those are some of the things i want to see facebook talk about. >> the consent decree that you helped negotiate in 2011, can you talk about it? you seem to indicate that you believe it's been violated. i would love it if you could enlighten the audience on what that was about and what you expect or want to see from the ftc in regard to that now. >> again, this goes back in part to third-party access. in 2009, facebook made two changes to its privacy settings
4:20 pm
that pushed a lot of private information to be public. it also gave third-party apps access to information they were not supposed to have. one of the things that's ironic about the ftc complaint is that the ftc said part of the deception was allowing third parties to get access to how people expressed their political views without their consent. i think i've seen this movie before, and it did not end well. one of the things the ftc did was try to rein in third-party access. the consent decree draws the line between users and third parties who actually harvest data. the goal was to limit third-party access unless there is clear notice and consent. facebook is going to say the
4:21 pm
settings they had allowed sharing, mass sharing. on the other hand, the question the ftc was asking is, what are consumers' reasonable expectations about what that means? one question to ask zuckerberg at next week's hearings is, do you really believe any of the friends thought that something like cambridge analytica was going to happen to them? back then, back in 2013 or whenever this happened, was facebook clear about that? i looked at those notices. i don't think they meet that test at all. that will be part of the ftc inquiry, in terms of whether the consent decree was violated or whether there were fresh violations of section five. i think that is part of my concern. the other part is, one section of the consent decree is devoted to forcing facebook
4:22 pm
to look at vulnerabilities that put privacy at jeopardy and plug those holes. that was designed to respond to downloading by third-party apps. it is quite clear in the aftermath of cambridge analytica that facebook paid no attention to that part of the consent decree, because there are no controls on third-party downloading. there is no remedy for harvesting data that you have no consent for, or for sharing that data with third parties. which is why cambridge analytica is such a scandal. this is months and months, two or three years, maybe four years, since facebook has known about this problem, and it has done nothing to fix it. a lot of the things facebook has announced, the new platform
4:23 pm
policies that mark zuckerberg has talked about, we've heard all of this stuff before. is facebook really serious about it this time? >> on the subject of the consent decree, i want to jump back into what facebook would argue. i think they would say, when you all negotiated the settlement in 2011, comparing that to what was going on in 2014, they would say, when we negotiated that settlement this is how graph api 1.0 worked. these are the disclosures that were made. these are the settings we had. what changed to make that not ok? it was kogan who violated the rules. our product was working the way it was designed until we changed it. >> the consent decree was meant to avoid problems with people like kogan. it was designed to force facebook to give clear and better notices. that was part of the consent
4:24 pm
decree. section four, which was about looking at vulnerabilities, that is the key provision, and in my view facebook did not pay any attention to it. the question is, in 2013 or anytime after the consent decree was entered into, did friends of friends understand the scope of the harvesting of their data? the 87 million facebook users who had cambridge analytica take their data, would they have understood that they gave permission for that? the answer plainly is no. >> what happens if the answer is no? >> for ftc enforcement, i don't think it really matters. i don't think facebook has any argument that this is not a violation of section five. section five turns on what
4:25 pm
consumers reasonably expected. and i think they don't have that defense, because they violated multiple provisions of the consent decree. if there is a violation of the consent decree, there will be a substantial civil penalty. at the time we did the google case, the civil penalty statute provided for $16,000 per violation, and the ftc has considered a violation to be harm to an individual consumer. if you measure roughly $40,000 per violation today times 87 million people, only harlan would be able to figure out what the answer to that would be. >> that has changed -- >> you're talking about an astronomical civil penalty. that would not be the starting point for the agency, but i think there's likely to be a substantial civil penalty in this case.
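for a rough sense of why that theoretical maximum is called astronomical, here is the back-of-envelope arithmetic, using the roughly $40,000 per-violation figure cited on the panel and facebook's own 87 million estimate; it is an upper bound no agency would actually seek.

per_violation = 40_000        # approximate per-violation civil penalty cited on the panel
affected_users = 87_000_000   # facebook's estimate of affected users

# one violation per affected consumer, as the panelist frames it
print(f"${per_violation * affected_users:,}")  # $3,480,000,000,000, about $3.5 trillion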
4:26 pm
>> we have been talking about what the ftc might do. there's also the question of what congress might or should do. congress, what should you do? >> i think yelling at mark zuckerberg is a start, but it's not necessarily going to make change, though. i want to go back to consent decrees. in some ways, this illustrates weaknesses in the consent decrees themselves, where perhaps congress could, maybe in a more discreet way, make some fixes to give them more teeth. in the google case, i think the fine was $24 million, which is about half a day's profits. this is my point: to make it really matter, it has to actually have some heft when the ftc levies a fine. what congress should not do is remake the gdpr.
4:27 pm
for those of you in advocacy, you might know this is not exactly in line with what advocates are saying. i think the gdpr is fantastic. it has forced companies to incorporate a lot of user rights into their services, and i don't think we need to duplicate it for that reason. that's not to say that there are not elements of it that are constructive. but the danger in enacting privacy law now is to replicate the failure of consent. instead, what we should be looking at is expectations. i think the way congress should imbue baseline privacy law is with the idea of what a person's expectations are, as defined by what kind of user agency they have, what sort of transparency is available, and what accountability is attached to those things. the fact is that the way
4:28 pm
that we interact with these platforms is obscured. you really can't give meaningful consent for the most part. you don't see the hundreds of eyes looking at you as you post something on facebook. if you're going to imbue expectations into your platform, and you're going to use these values of agency, transparency, and accountability, some of those changes have to come from the design side. it is less about creating some law that says you have to have a particular thing, and more about what sorts of interactions will make clear to a person what the true value proposition is. a person that i know put it well and said, when the companies leverage your data 100 times, that's like a price increase you don't know about. you get nothing out of that except a free service. that argument, i think, is ringing hollow now. and to the extent that there are
4:29 pm
ways to get there, there are design principles that allow for more transparency and more agency, not just the gdpr. and going back to accountability, i think that is so crucial: making public disclosures, and, drawing on some of the laws that exist, making ceos certify public disclosures on a quarterly basis. it forces the ceo to have skin in the game. other areas are auditing requirements. those things can be and are doable. they are not easy technically, but i think there are discrete ways that would make a difference. >> what you described at the front end is not a modest proposal.
4:30 pm
i will admit i share skepticism about the value and political viability of comprehensive baseline privacy legislation at this stage, considering that a lot of smart people focused on the proposal from the obama white house in 2015, and i am not sure the calculus has changed that much. so it is a question of, if not something like the gdpr for the u.s., what? you mentioned a couple of good targeted things, the ceo certification idea and the impact assessments. congressional staff are wondering, what can i write right now that my boss can introduce? or what can we do that is really strong, that will strike fear in the hearts of facebook and other companies and perhaps change
4:31 pm
their behavior? >> to be frank, striking fear in their hearts is not really what i am concerned with. for me, my eye is on the ball of how to get protection for people, and with protection should come accountability. i think strengthened consent decrees would be great. strengthening the ftc, which we have heard about and everybody here probably knows, it truly needs more tools and resources to be able to do its job. more public transparency around consent decrees, more penalties for violations. what we're coming out with is something with specific recommendations on consent decrees, and hopefully that can get bipartisan support. in some ways, you can look at the republican regulatory reform playbook and use ideas of good governance to get something like that to be more palatable.
4:32 pm
another area, and david touched on this, is the idea of data access by researchers. it is something that has been sort of avoided, partly because it is a tricky subject. we do not want to shut down innovation or open access; that is what the internet is built on. but there are ways to create obligations for researchers that do not exist today. if you are a federally funded academic researcher, you follow the common rule, which means there are ethical guidelines and you go through an institutional review board. those institutional review boards are fairly worthless. the people who sit on them do so in good faith, but they do not ask for things like terms-of-service reviews. that is not to say that if you review the terms of service and it says you should not do this, researchers should not do a lot of what they do, but there appears to be a need for a review of that, some accountability for the researcher.
4:33 pm
there should also be certifications for the researchers so they are held to some obligations, not just for what they intend to do but for how they are protecting the privacy and security of the data. facebook's data-sharing agreement was very light on details and light on accountability, and i am not sure that was not by design. i think the idea is, let the data go and then we do not have liability. you want to create liability. the other aspect would be creating a chain of liability in the ecosystem, which would not be easy. but you start with the platform or the service, then the app creating the risk, then the user, and you look at the benefit and the risk all down the line and decide and assign the rules and liabilities. those can be chopped up in small ways, and maybe consent decrees are part of that, maybe certifications. >> i am guessing, david, you are all for strengthening the ftc.
4:34 pm
>> yes, i am happy to repeat everything she said before; it is absolutely spot on. there are maybe smaller pieces of the privacy issue, like data brokers. this would get at some but not all of the problems. given some of the breaches that we have had and some of the problems with large data pools, there needs to be something like a fair credit reporting act, but much stronger, for data brokers. the fact is people are worried about what the nsa knows about them, but data brokers actually know so much more, and so do facebook and other companies, but we do not have regulatory tools. these are massive data pools. there is real risk there.
4:35 pm
>> going back to the platforms and away from the brokers, i am somewhat skeptical that we will see specific use restrictions in law or requirements about consent, but i think the possibility of much stronger transparency requirements is definitely in the answer, and i suspect they are talking about it. there is also the question of political viability and timing and what-not. for example, i have a crazy idea, which is that there is a single law called the video privacy protection act, and it protects records of what you watch on netflix and at the video store. it was passed after video rental records were obtained by
4:36 pm
journalists, and congress freaked out thinking it might be them. hence, the strongest privacy law ever. i do not see why that should not extend to what you watch online. i shared this with a staffer and they were like, sure, but that is in the criminal code, so amending it would go through judiciary, and nothing is going to happen in judiciary, and we need something that will go through congress. so what is actually possible right now? they are not going to pass anything this year. they are basically already done because of the election. but where should we plant this? >> what happens in the fall is huge. it will decide if democrats will retake congress, and then the possibilities are much greater for a baseline
4:37 pm
privacy law or any kind of updates to privacy in general. it is funny, the jurisdictional question, because privacy is notorious for touching 100 different committees, or for people believing it should be in 100 different committees, especially when a high-profile case exists. they are all scrambling to figure out how they can fit it into agriculture. [laughter] great, fine. this is the way our democracy bumbles along. to the extent we can, we should provide staffers with the correct facts about what happened, first of all. i have seen journalists use words like scraping and access interchangeably; those are not the same at all. as advocates, we need to make sure they have that. and offering different committees different solutions, i think that is up to our groups and other groups to work hard to
4:38 pm
make sure the committees have the information they need. and we need to bring in the republicans who are interested in this issue. they had not said much for a while, but now they are saying something. that is a really important development, especially in d.c., and i think it is important to explain to both sides that this is a truly bipartisan issue, or it should be. >> caroline, on the issue of competition, which is your expertise, what role does antitrust law play in addressing a situation like this, if any? >> i think one of the questions we have heard is, well, wait a minute, maybe if we had more competition, things would be better. we have seen a lot of headlines talking about the power of big tech, the concentration, the consolidation that has occurred. can't the antitrust laws do
4:39 pm
something about this? the follow-on is, maybe if the antitrust enforcers were doing their jobs and there was more competition, maybe things would not be so bad. antitrust could play a bigger role in some of the bigger questions, but it is probably not as good at the consumer privacy question. yes, things might be better if we had multiple social platforms, but the trick about social platforms is we really do not want to have to go to six different social platforms and find all of our friends on each one. part of the value in this service is that the more people who are on it, the more valuable it becomes. so more competition is not necessarily something that consumers really would want in the real world. you might want options for competition, and i can talk
4:40 pm
about some ideas for promoting competition, but there is a limit on antitrust law. antitrust enforcement can keep these companies from taking actions that will harm the competitive process, and it can stop mergers that will lead to a lessening of competition. this could have some positive impact on how the platforms compete and the actions they take. another important tool besides the antitrust laws is the ftc's section five authority, which i believe encompasses something more than just the antitrust laws; congress must have meant something by it. it is not just the antitrust
4:41 pm
laws. so the agency could look at how it can use that authority if it finds actions facebook is taking that violate this principle. but these are somewhat limited tools. on promoting competition, how do we put competitive pressures on a company like facebook? is that through better data portability, creating a meaningful way to port data to some other application or service that would be able to create some sort of networking service that consumers want to go to? this goes to some of the questions that i am exploring and trying to think about, and there are still more questions than answers on this. it is the importance of data portability, interoperability,
4:42 pm
and how apis work into this. as a side note, i have been thinking about how more open apis promote competition, promote innovation, and ensure the openness of the internet that we all want and get a lot of innovation from. then cambridge analytica happened, and i thought, oh, shoot, i need to think harder about the process behind apis. is it about more responsible use? how does it fit into the digital ecosystem? one warning, and something we should be looking for, and the ftc will do this because they have the office of technology research and investigation: they will make sure that facebook, as it starts to change its apis and policies, and i think there are a lot of things it should do to protect privacy and security and to make sure that people willy-nilly
4:43 pm
cannot get access to all of our consumer data, but at the same time make sure there is not an overcorrection in terms of shutting down access to data that helps developers come up with exciting new programs and uses that people want on facebook, and that could ultimately compete with facebook. facebook could have the incentive and the ability to say, you know, we do not like how this developer is accessing data, and we are worried that this is a threat to us, so let's shut this down. this is something a number of us have been thinking about, including the commissioner; you can go look at some of her tweets about it if you like. there is portability, getting your data out. facebook does have a tool for getting your timeline out.
4:44 pm
it is not really suitable for uploading into another service, assuming there were services out there to upload it to. >> they would build them. >> but it is not built for you to take it somewhere else. the gdpr is going to require some level of portability, including machine readability, so you can move your data somewhere else. it remains to be seen how people are going to implement that, and that will be really interesting.
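to make the machine-readability idea concrete, here is a hypothetical sketch of the kind of structured export gdpr-style portability contemplates; the field names are invented and are not facebook's actual download format, which the panel notes is not built for re-import into another service.

import json

portable_profile = {
    "profile": {"name": "Example User", "joined": "2009-06-01"},
    "friends": [{"name": "Friend One", "connected": "2010-02-14"}],
    "posts": [
        {"created": "2018-03-20T12:00:00Z",
         "text": "hello world",
         "audience": "friends"},
    ],
}

# a competing service could, in principle, parse a file like this on import
with open("portable_profile.json", "w") as f:
    json.dump(portable_profile, f, indent=2)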
4:45 pm
then there's also the issue of how to come up with an environment where something can actually get big enough that you will actually even want to move your data to it. then you get into interoperability. from where i am sitting, one of the only possible paths to a network big enough at this point, now that facebook has bought the two networks that were getting big enough to compete, instagram and whatsapp, is to be able to leave facebook while still communicating with people on facebook. it is the people on the platform that matter. if no one feels like they can leave because everyone is there, then a hashtag like #deletefacebook means nothing and there's no pressure to change. how do we do that? what are the tools, other than consumer outrage and a bully pulpit, to get facebook to do what it probably considers the most mortal threat, building doors into its walled garden? >> you would have to find market power, and to do things that would require that sort of remedy to make this open. i think one of the solutions is more likely legislative, or giving the ftc more authority to do rulemaking.
4:46 pm
the important thing is that it is not just helpful to have my pictures and my posts; the most valuable piece is my friends. but we also have to think about privacy. my friends are my friends, but if i go over to some other service, are my friends ok with being pinged by the new service to join? there are a lot of questions about what meaningful data portability to another service actually means, and the devil is really going to be in the details. it would be great to hear from developers and others who are trying to think about the next great social platform they want to create and what they would need to be able to meaningfully port the data, or get the data, capture the data, and then build a service off of that. >> what you are really talking about is forcing facebook to become a common carrier. that is full of subsidiary
4:47 pm
problems. i just do not know how you force it except through legislation. >> there is some precedent: when aol bought time warner, there was a condition about aol having to make its messenger interoperable with its biggest competitor. but they would have to see a deal of that size. >> what would they buy, besides amazon and google? when would the authorities have that type of hammer? >> maybe it is small parts of facebook, like messenger. maybe that is not that small. but you could look at specific communication aspects, like the plug-ins or something. >> there is also a core tension between privacy and interoperability. yesterday, they announced that they were closing off more parts
4:48 pm
of the api. privacy advocates will cheer, but competition advocates will not. so how to get both at the same time will probably have more to do with people and policy solutions than with tweaking the knob in terms of how much of the user data the api exposes or not. >> this is going to require a lot of study. it will be helpful to have the research and empirical data. we really do need to think, is that what you want to do, because you need to keep in mind that we want to ensure the incentives to build the next great whatever platform or facebook there is going to be. if you create a policy that says, once something gets big and has lots of people we need to open everything up, that might chill
4:49 pm
innovation, and maybe the flip side is that it will force the services to be the best they can be. competition is good; you are going to want them to be competing on having the best privacy policy, whatever it is. it is going to be an important consideration what the trade-offs are going to be for certain policies we think will benefit consumers and what impact they will have on innovation. >> as that occurs, i will admit that my greatest fear right now is that this will push not only the decision-makers at facebook but also the rest of the industry toward, like, it is not worth trying to play in that field. i worry that there is a bit of a br'er rabbit in the briar patch dynamic, and facebook is like, no, please don't make us lock down our
4:50 pm
apis even more. >> there are ideas around data collaboratives that i think would be interesting for people to look at, the idea that maybe some of the data, part of the big platforms' data, could be put into data cooperatives or entrusted to intermediaries. and there are ways to segment some of it. >> what is the minimum viable amount of data you actually need? >> a really good question. most of these apis are not actually relevant; it is about getting my friends and a piece of data that will allow me to identify and reach them. but then you get into a privacy problem. >> would the gdpr even let us do that? >> the gdpr will not apply to americans' data. >> not yet, and it does apply in europe.
4:51 pm
do that. click gdpr will not apply to american data. >> yet, in europe. anyway, a lot of rich discussion there. not to overpromise, but i believe oti will be doing an event on interoperability and portability within the next few months. knock on wood. meantime, we have a a few more minutes for some questions. and there are hands shooting up. there is a mic going around, so please wait for that for the benefit of folks watching on c-span or online. >> hello, i have from access now. david, you spoke about the dissent -- consent decree, and michelle, you spoke about audits . audits are in this consent decree, and they would have had to have gone through at least one, probably two, between then and now, and he clearly did not do anything to push us forward.
4:52 pm
is an audit actually something we should be pushing for? was it effective with the consent decree? if it was not, which it seems not to be, how can you make them better? >> the word audit is a word of many meanings. the consent decree requires biannual filings by a third party to essentially make sure facebook has lived up to the commitments it made in the consent decree. so there have been at least two, maybe three, of them. i have not seen one in several years, but that is not really what i am talking about when i talk about audits. i am talking about facebook trying to make sure that third-party apps stick to whatever commitments they have made in terms of what they are going to download and not share with third parties. one of the real problems with cambridge analytica is facebook really did not know what kogan
4:53 pm
was downloading, nor did facebook have any means of making sure that data was not shared with third parties or sold. there was no control and no way to know about it. so what i am talking about is ways of overseeing third parties who have access to data. there need to be some contractual lockups, and there need to be some ways of, after the fact, making sure that the third parties behaved as they promised. at the moment, facebook is simply depending on wishful thinking that third-party apps are going to do whatever is in the terms of service. but the cambridge analytica debacle has shown there really are no controls. when i use the word audit, that is what i am trying to refer to.
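A hedged sketch of the after-the-fact oversight being described: compare what an app actually pulled, according to the platform's own access logs, against the fields it committed to in its data-use agreement. The app id, field names, and log format are hypothetical.

```python
from collections import defaultdict

# What each app declared it would access (illustrative).
declared_fields = {"quiz_app_123": {"public_profile", "email"}}

# What the platform's access logs say the app actually pulled (illustrative).
access_log = [
    {"app_id": "quiz_app_123", "field": "public_profile"},
    {"app_id": "quiz_app_123", "field": "friends_likes"},  # never declared
]

def find_violations(log, declared):
    """Group undeclared field accesses by app for follow-up enforcement."""
    violations = defaultdict(set)
    for entry in log:
        if entry["field"] not in declared.get(entry["app_id"], set()):
            violations[entry["app_id"]].add(entry["field"])
    return dict(violations)

print(find_violations(access_log, declared_fields))
# -> {'quiz_app_123': {'friends_likes'}}
```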
4:54 pm
>> from what i understand, these are really assessments, which is different. so you have the private assessor that is sort of cataloging benchmarks, and there are ways for companies to game this. they will change their practices right before, for example, to look better. and, obviously this is not public, but they will also omit material changes, things that would be relevant to a consent decree. the assessor does not have access to them, or the company does not tell them about it because of the timing. a formal audit would require it to be public. it would require the company to sign off on material changes to their policies and practices before and after the audit happens. and it would create a sense of accountability tied to the consent decree. >> it is not at all clear that facebook reported what happened with kogan and
4:55 pm
cambridge analytica to the ftc, but i think it would have been required in one of the biannual reports. >> i am the advocacy director of the committee to protect journalists. it is interesting, because in several conversations around the countering violent extremism debates and the fake news debate, one of the big concerns is the lack of access to data by researchers. so it seems like there is this tension between access to personal data, yet facebook and other companies not turning over lists of content that they have removed or censored, etc. so i think that we should be careful about making broad brush strokes about institutional review boards. at least they have to go through something. definitely they need to be more technologically apt. but how can we balance the need for more oversight and auditing potential, both by researchers
4:56 pm
but also potentially by the judiciary, and balance that with the need to protect private data? >> i can take that. at least in terms of the recommendations i was talking about, that is especially true when talking about countering violent extremism. balancing that is a problem. the recommendation i was talking about applies to ads. facebook already considers ads to be public, and with the pilot they are trying, any post that money has been spent to boost is already considered public and not private information. it will not get at all of the problem, but for a large part of the problem, beyond just elections,
4:57 pm
more transparency about ads will probably do a lot of good without really risking individual privacy.
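One way to picture that ad-transparency idea is a public, machine-readable archive entry for each boosted post. The fields below are assumptions for illustration only, not any platform's actual ad-archive schema.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class AdArchiveEntry:
    ad_id: str
    sponsor: str             # who paid for the boost
    spend_range_usd: str     # bucketed, e.g. "100-499"
    targeting_summary: str   # coarse description, no individual-level data
    first_shown: str
    last_shown: str

entry = AdArchiveEntry(
    ad_id="ad_0001",
    sponsor="example advocacy group",
    spend_range_usd="100-499",
    targeting_summary="adults 18+, interested in local news",
    first_shown="2018-03-01",
    last_shown="2018-03-15",
)
print(json.dumps(asdict(entry), indent=2))
```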
4:58 pm
>> i would add that i think your question is one we all need to be asking, and it is relevant to the interoperability question: how to balance the need for privacy with all the other things we need. tom wheeler was writing in "the new york times" about being more open and honest, but at the same time, we have a push to close things down because of privacy. to be fair to facebook, i would be like, what the hell do you want us to do? >> what needs to be codified is fair information practices. it is a data governance framework. most privacy advocacy experts are very well-versed in it. it offers a good way to think about how to govern data well. when i say obligations on researchers, that does not mean restrictions in terms of, this is a good project or this is not a good project. it is about privacy and security restrictions and requirements. it may be limiting the amount of the data, the scope of the data, the reason you are collecting the data. those kinds of requirements really do not exist right now. >> we have time for two more questions. >> i am a researcher that has been looking into this cambridge analytica situation since over a year ago. i am pretty familiar with the internals. one of the things you might want to consider, in terms of advocacy around the consent decree, is kind of a technical solution, which is to simply require facebook to
4:59 pm
internally deprecate all of the user id numbers that have been exposed. one of the problems here, even with the 2015 changes, is that those user id numbers can still be targeted; deprecating them would mean they cannot be targeted anymore. it would not totally solve the problem but would mitigate it in large part.
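The questioner's suggestion might look something like the sketch below: a denylist of the exposed IDs applied before any custom-audience match is attempted. The IDs and function names are made up; this is an illustration of the idea, not a description of how any real ad platform works.

```python
# IDs known to have been exposed would come from the platform's own records.
exposed_ids = {"1002", "1003", "1007"}

def scrub_custom_audience(uploaded_ids):
    """Drop deprecated IDs so audiences built on leaked data stop matching."""
    usable = [uid for uid in uploaded_ids if uid not in exposed_ids]
    return usable, len(uploaded_ids) - len(usable)

audience, dropped = scrub_custom_audience(["1001", "1002", "1003", "1004"])
print(audience, f"({dropped} deprecated ids removed)")
# -> ['1001', '1004'] (2 deprecated ids removed)
```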
5:00 pm
>> that is really interesting. >> that is just my thinking. it occurred to me. it is something facebook will not like very much, so it will have a punitive aspect to it, but it will not crush them and will not fundamentally disrupt legitimate ongoing activity from people who are obeying the rules. >> thank you very much. >> i would be happy to talk to you more about it. >> that would be great. >> i am a privacy consultant here in washington, d.c. i have a simple suggestion, though. whatever the ftc investigation comes out with, the ftc could require facebook to invest in other applications, and that would create competition and solve some of the problems, along with other requirements, of course. it would be something facebook would be strongly against, but it is one of the ideas. >> couldn't the ftc do that? >> you would have to ask darryl. when i was at the ftc back in the dark ages, i do not know whether the commission would have thought it could do that for something like a deceptive act, as opposed to one that defeated competition. i suspect the agency would have problems doing that now. but, you know, it is an interesting idea. bob always has interesting
5:01 pm
ideas. it is yet another interesting idea from bob that i think people should think about. you know, the remedies that the agency has are basically equitable remedies, and we do, at times, force people to do all kinds of things they do not want, like occupational bans, so this is an intriguing idea. >> well, there will be plenty of intriguing discussion ongoing, i am sure, including the testimony everyone is expecting next week, which should give us all kinds of new things to chew on. thank you, everybody, for coming. thank you to the panelists. [applause]
5:03 pm
a debate on nationalism and globalism at the university of maryland college park starts live at 6:00 p.m. eastern here on c-span. on c-span2, it is a discussion on foreign policy in the trump administration and america's influence in the world. that event, hosted by the council on foreign relations, gets underway live, again on c-span2. here is a look at primetime on the c-span networks. beginning at 8:00 p.m. eastern on c-span, a look at what local, state, and federal governments are doing to combat the opioid epidemic. on c-span2, it is book tv with authors and books on world war ii. and on american history tv, a look back to when new york senator robert kennedy announced his decision to run for the democratic nomination for president. this weekend on c-span, saturday
5:04 pm
at 8:30 p.m. eastern, the 50th anniversary of "60 minutes." at 9:30 p.m. eastern, hillary clinton at rutgers university. on book tv on c-span2, saturday, the annual national black writers conference in brooklyn. sunday at 1:00 p.m. eastern, j.d. vance talks about a new book on tribalism in america. on american history tv on c-span3, saturday at 10:00 a.m. eastern, the 50th anniversary of the assassination of dr. martin luther king jr., and sunday, lincoln's war secretary, edwin stanton, president lincoln's assassination, and the aftermath. this weekend on the c-span networks. sunday on c-span's q&a,
5:05 pm
the theoretical physicist talks about his career in science and his book "the future of humanity." >> the norm for mother nature is extinction. if you dig under our feet, you see the bones of the 99.9% that no longer walk the surface of the earth. now we're different. we have self-awareness. we can see the future. we plot. we scheme. we plan. and so perhaps we're going to avoid that conundrum and survive, because we need an insurance policy. the other books talk about the steps. where is this pot of gold out there? >> q&a, sunday night at 8:00 eastern on c-span. facebook ceo mark zuckerberg will be on capitol hill next week for two hearings on the way facebook has used data.
5:06 pm
developers may have gotten information from as many as 87 million facebook users. mr. zuckerberg will appear at a rare joint hearing of the senate judiciary and commerce committees, and that is tuesday, april 10th. c-span3 will have live coverage at 2:15 eastern. then on wednesday, mr. zuckerberg will testify before the house energy and commerce committee, and live coverage begins at 10:00 a.m. eastern on c-span3. >> steven brill is back with us on washington journal this morning to discuss his latest venture, newsguard, which uses journalism to fight fake news. steven brill, define that term for us: fake news. >> guest: well, in our case, what we are trying to solve is news that purports to be news but is fabricated or