Facebook Data Privacy Practices CSPAN April 10, 2018 11:33am-1:04pm EDT
11:33 am
c-span, where history unfolds daily. in 1979, c-span was created as a public service by america's cable television companies. today, we continue to bring you unfiltered coverage of congress, the white house, the supreme court, and public policy events in washington, d.c., and around the country. c-span is brought to you by your cable or satellite provider. >> this week, facebook's ceo mark zuckerberg will testify before senate and house committees on facebook's handling of user information and data privacy. today at 2:15 p.m. eastern on c-span 3, he'll answer questions during a joint senate judiciary and commerce hearing, and on wednesday on c-span 3 he will appear before the house energy and commerce committee.
11:34 am
watch live coverage on c-span 3 and on-line at c-span.org, and listen live with the free c-span radio app. >> an ftc commissioner joins a discussion with people in the technology industry on facebook's data and privacy practices after cambridge analytica misused data. also here on c-span 3, live at 2:15, the senate hearing with facebook ceo mark zuckerberg. >> good afternoon. i'm kevin bankston, the director of the open technology institute here at new america, which is dedicated to ensuring everyone has access to an internet both open and secure. i want to thank you for joining us here today at new america for our conversation about facebook after cambridge analytica: what should we do next? if you're not sure what i'm talking about, you might be in
11:35 am
the wrong room. once upon a time there was a fast-growing social network called facebook that hoped to grow even faster by becoming a platform for other apps. and so in 2010, it launched the graph api, an application programming interface that allowed app developers to access and use data from facebook users who signed up to use their apps. but there was a privacy catch. not only could they obtain data from their users but also from all of the friends of those users. and although nominally facebook notified users of this setup through their privacy policies, and there was a not-particularly-easy-to-find privacy setting for adjusting what data your friends could share about you, the default on that setting was for apps to have incredibly broad access to friends' data, and most ordinary users had little
11:36 am
understanding of what was going on. and so for about four years, until facebook tightened up access to friends' data with an updated graph 2.0, untold thousands of app developers siphoned data off facebook from people who didn't even use their apps. and the primary guardrails protecting that data from misuse after it left facebook's platform were simply facebook's terms of service for those app developers, telling them they should only use the data for providing the service users signed up for, and that they shouldn't, for example, sell it all to a spooky political consulting company that wanted to build profiles on voters in order to better manipulate them. of course, now we know that is exactly what happened: that in 2014 a researcher named aleksandr kogan used a survey app called this is your digital life and was able to attract 270,000 facebook users, and through access to those users' friends'
11:37 am
data was able to obtain personal information on, as we heard from facebook yesterday, up to 87 million facebook users. kogan then sold that data to cambridge analytica, a political consulting firm that worked with the trump presidential campaign and the brexit campaign, has bragged about influencing other political outcomes in mexico, australia and kenya, and, based on recently released undercover recordings, has used bribes and sex workers as part of its toolbox for influencing political candidates. which brings us to last month, when we learned about how cambridge analytica obtained this data. we also learned that facebook has known about kogan's passing of data to cambridge analytica since 2015 but did little to confirm that this misappropriated data had been deleted, other than demanding that cambridge analytica certify it had done so, while facebook also continued to
11:38 am
allow cambridge analytica to advertise on its platform until just before last month's story broke. the story has led to a firestorm of renewed concern over the state of privacy on-line generally and on facebook specifically, just as controversy has already been raging for over a year about how several of the big tech platforms were subverted to help spread foreign state-sponsored propaganda during the u.s. election and other elections since then. so now, as facebook is losing billions of dollars in stock value due to lost public trust and is promising to make extensive business changes to regain that trust, as policymakers in the u.s. and europe are rattling the saber of regulation, and as ordinary folks only now seem to be starting to understand how facebook works, or how it worked four to five years ago, and what that means for their privacy, the simple question is what now: what should facebook do, what should policymakers do, what should
11:39 am
users demand that they do in regard to facebook or internet platforms generally. i will be talking to ftc commissioner mcsweeney and a panel of experts about those questions and generally about the state of on-line privacy and how we can improve it, but before we do that, i wanted to pass the mic to my colleague rebecca mackinnon, who runs a project housed within oti called ranking digital rights, dedicated to answering another question relevant to today's proceedings: just how well are companies like facebook protecting their users' rights? she'll give a preview of how rdr's latest corporate accountability index, being released this month, will answer that question, and then we'll move on to my conversation with commissioner mcsweeney and our panel of experts. thank you. >> thanks very much, kevin. i don't want to take too much of your time other than to let you know that the ranking digital
11:40 am
rights 2018 corporate accountability index is going to be launched on april 25th in new york, and we have a flyer here, and then on april 27th there will be an event right here in this room where we're planning to talk about it in person with people who aren't in new york. the index, the 2017 index, can be found on our website at rankingdigitalrights.org, so you can see how we evaluated companies last year. the index ranks 22 of the world's most powerful internet, mobile, and telecommunications companies on their commitments and disclosed policies affecting users' human rights, freedom of expression and privacy, and so there's a set of indicators that are looking very specifically at facebook and other companies' policies affecting how they handle user
11:41 am
data, and it will not surprise you, as you can see on our website from last year, that facebook did not perform well on those indicators last year in terms of the policies that it disclosed, the quality of those policies, and also what it disclosed and didn't disclose, and you will not be shocked to hear that there wasn't a revolutionary change between 2017 and now. you can see our report when it comes out on-line on april 25th for all the details, all the downloadable data, everything else, the analysis. we'll have the event in new york on the 25th and then a similar event here on the 27th to discuss it in person, and people will be able to go through and discuss all the results in great detail. but one other point, just in relation to this issue of user data: the industry generally is doing poorly, so -- but facebook's disclosures were
11:42 am
towards the bottom of its cohort. so that's just a little preview. thank you. >> thank you. we look forward to reading all about that. and now i would love to welcome terrell mcsweeney, ftc commissioner, to chat about this issue. >> all right. >> hi there. >> how are you doing? >> thanks for having me. >> of course. i'm going to start with a question that i'm also going to pose as the first question to our expert panelists, which is: is this a tipping point, say like a snowden moment in the context of surveillance, where we might see significant changes in policy, or is this maybe more of a moment where we will see a lot of noise but not a lot of action?
11:43 am
did i say -- >> i think you made your own point. >> let me start by just saying thanks so much for having me here today. i will give you my own perspective, not the official views of the federal trade commission. i'm not going to pull any punches. and i'm also going to be careful not to talk about what the ftc has confirmed is an open investigation into at least some of the conduct that is alleged here. i think we should have a policy conversation. so i appreciate your first question, which is more or less: okay, so maybe 87 million people's information was misused, is this a big deal? and, you know, one of the things i was going to say is, wow, we're not even talking about the fact that 250 million people's very detailed information was breached not even a year ago, and that incident didn't have the
11:44 am
policy tail that i hoped for. i hope that this is a moment of change. i think it's also a powerful moment because the general data protection regulation in europe is coming into effect in may, so changes are already being made in response to that, and i think that has had a big impact at the same time this news cycle has had an impact on the story. if it has an impact on just one thing, though, what i would really, really like it to have an impact on are the people that say to me, when we have been talking for years about better consumer protections for the digital age, that american consumers just don't care. okay. i think that is demonstrably false. i think we're getting good evidence that people do care, that consumer trust is incredibly important and ought to be at the top of everybody's list in terms of what they are concerned about for their
11:45 am
businesses, and i think it's also really underscoring to me the fact that consumers are not necessarily understanding or anticipating fully all of the risks of transacting in their data on these platforms. i don't personally believe that we should be trying to put all of that risk onto individual consumers to anticipate what might happen to their data, and i think that's part of the policy conversation we ought to be having. >> so currently, lacking any kind of comprehensive data protection rules in the u.s., the ftc is the primary consumer privacy cop on the beat through your authority to go after unfair and deceptive trade practices, and you have gone after facebook before. there's a consent decree negotiated with them in 2011 over some alleged deceptions around their last privacy
11:46 am
transitions. which leads to the question: how did this happen if the ftc, the cop on the beat, already had this consent decree in place and presumably was policing facebook? >> that's 100% the right question, and look, i'm a sitting federal trade commissioner, i love the ftc. i think the people at the ftc, the staff at the ftc, are doing an incredible job with some pretty antiquated authority. it's a 100-year-old -- 104-year-old agency using authority to protect consumers from unfair, deceptive acts and practices, and it's been able to adapt that to the on-line environment, but at the same time the agency itself has always called for stronger tools, and i think this set of facts underscores that the ftc is not strong enough, as it is currently configured with its current authorities and current resources, to be the kind of consumer protection agency that is required for a moment in which we are connecting every part of our lives to the
11:47 am
internet and to each other. >> so how do we fix that? >> well, i would start by making sure the agency's adequately resourced. it's been more or less flat funded for the last few years. at the end of the obama administration, the obama administration did call for an increase in resources for the agency, but that's -- it has never been funded near that level. so it's first of all under-resourced. that's easy to fix. i think it needs to think about its configuration. one of the things that it was -- has been doing, and i'm very proud of having shared, you know, time at the ftc with this, is we've been bringing more technologists into our work and more researchers on staff. we have an office of technology research and investigations called otech that i think is a great first step in that direction, but i think we need to think about institutional design and whether that kind of capability ought to be significantly expanded, maybe by
11:48 am
the creation of a bureau of technology, just like the bureau of economics, so that there's more horsepower within the ftc. the ftc itself also needs additional authority to contract with outside experts so that it can really have resources to evaluate what it's being told. it needs in-house expertise and additional resources to bring that expertise in when it doesn't have it. beyond that, i think it has consistently called -- and this is really important -- for civil penalty authority, not just for data security and data breach violations but for privacy as well. it needs rule-making authority it can use for privacy and data security. it's also been studying some of the conduct that it finds very concerning, so it's been looking at the data broker industry and has called for more transparency and accountability for data brokers, and i think that's really important. beyond that, i think it could be making the case for the consumer rights that we need in the digital age, which include
11:49 am
things like data portability and interoperability. those are also meaningfully pro-competitive as well. >> so one of the big limits to what the ftc can do, and what sort of sticks it has to work with, is that your primary tool in regard to privacy has been around deceptive trade practices. so if someone misrepresents what they are doing with your data, that is within the ftc's ambit, but if they're doing something awful with your data but telling you about it and you have consented, that's okay. how do we get past that? i mean, first off, as a starter, is notice and consent at this point a workable model? facebook will argue, and has argued, that its users, and frankly the ftc, were aware of and had consented to and had notice of how graph 1.0 worked; this is what the product was. where do you go from there?
11:50 am
>> i mean, all right, so the idea that notice and consent is a framework that can adequately protect consumers in this environment has been described as quaint, and i think that's correct. okay. so no, i don't think we can continue to rely solely on that framework. now, i don't even think the ftc itself is advocating that it rely solely on that framework, but the ftc does have limits to its authority. so it does look for deception, which means if you're not telling people truthfully what's happening to their information and how it's being used, then that's actionable. i think that's very important. but of course, it's been looking at the role of consent and how that's been playing out in the marketplace. it's been emphasizing, for a number of years, best practices around requiring notice that is clear and timely and outside of just a long terms-of-service privacy policy agreement. and it's been saying those choices need to be offered
11:51 am
around the collection and use of sensitive information in particular. i think it's been consistently laying out best practices; whether the industry has actually been following them, i think, is a different question, and raises the issue of whether the ftc is strong enough. i'm arguing, obviously, that it's not. i also don't think the ftc needs to be doing this all by itself. we are sitting here a year after congress repealed the fcc's stronger broadband privacy rules, and you know, i think there was no justification for that. we need more than one consumer protection cop on this beat. >> the privacy settings are confusing. this is not a problem unique to facebook, but it's gotten more complex. i want to share an anecdote that describes that. when this scandal broke, i went back to look at the settings.
11:52 am
i've been working to some extent around facebook privacy for a long time. i'm fairly familiar with it. when i went into the settings, i found a setting that said apps your friends use. it was all about your friends, whatever they can see on facebook, they can share with an app, subject to these check boxes. all these check boxes, like 90% of them are checked by default. but you can uncheck them. which means that facebook didn't actually update its privacy settings when it updated graph in 2014, 2015. so there's this weird setting. when i asked facebook, does this setting mean anything at this point, they were like, well, you know, we hadn't gotten around to fixing it or changing it or getting rid of it, in part because there are some edge cases where some of those check boxes do still matter. and they named one. i was like, is that the only one? they weren't -- it wasn't clear whether they were sure. so they're clearly sorting that out now. if they're not even clear on how their settings work, how can we
11:53 am
be clear on how their settings work? >> again, i mean, i think you're identifying some of the weaknesses of the model that we're using. what i think is interesting about this -- i don't know if anybody else in the room had this experience -- you know, shortly after this broke, i had the opportunity to sit down with my mom to go over her settings, because she was very concerned but also wanted some help in figuring out what her privacy settings actually were. so i was walking her through the flow. we went into apps and platform, and i said, this is where you do the settings and can turn off platform. oh, turn it all off, she said. i don't know if anyone else has had that conversation with their mom, but it's kind of a familiar conversation at this point for me. what it suggests to me is that people are trying to exert choice over how their data is being used and how it's flowing
11:54 am
out of that first-party relationship they have with whatever service they're using. and they think they're exerting some choices, and they may not necessarily know that there are more choices that they need to control and need to be looking at. now, the ftc has been looking at this issue. we looked at it in our paypal/venmo case. we said, you can't have default settings where you have to navigate through several different options. if you're trying to keep something private and you think it's private but you have to do three more steps, that's going to be problematic. that's a good case; in that particular set of facts, the ftc was looking at whether consumers could really navigate the settings they were being offered. and the ftc has also looked at -- i think of these cases more as trickery, clever technical workarounds. in one of our cases, for example, if a
11:55 am
consumer has said don't track my geolocation, then running a program to serve ads that triangulates your geolocation using your wi-fi is probably going around that asserted privacy setting. so we have to find ways to make sure the technology is following the asserted privacy choice of consumers. i think the ftc has been looking at these issues. the fact that we have more than one case already on this kind of thing suggests to me there are some problems out there and that we need to continue to be very active here. >> so what role might your unfairness authority play? >> well, unfairness has been particularly influential in our data security cases. these are cases involving whether security practices were reasonable or not. unfairness is a really important authority. it's also challenging for the ftc to use, partly because of the way courts have limited it over the years.
11:56 am
we have to have a fairly clear likelihood of harm. while we've been talking about harms that are not just economic harms -- harms that involve invasion of private spaces, such as turning cameras on in people's bedrooms, and emotional harms associated with revenge porn and things like that -- it can be very tricky for the ftc to reach some of the conduct using just the unfairness authority. it also -- i mean, i think this is an area where i continue to reiterate the ftc cannot go it alone. it needs to partner with other agencies as well, because one of the things that happens to the ftc when it starts to use its unfairness authority very aggressively is that congress in the past has stepped in to try to limit it pretty severely. so the agency has been incremental and cautious in developing how it uses that authority, but with good reason.
11:57 am
>> so what about market harms? you are also a competition authority. there's a lot of talk in the air about platform monopoly, or breaking these companies up, or a variety of other ideas to try and deal with the fact they are big. what role overall do you see for the ftc there? or how do you see competition intersecting with this issue? >> i think competition is incredibly valuable. i think more competition would definitely benefit consumers. one of the tricks here, though, has to do with the economics of how these markets work. because of network effects and because of the incentives within them, it's not completely obvious to me that just getting more competition is going to yield better outcomes and better protections for consumers. that's why we need additional regulation to help really direct the marketplace towards the outcomes that we want around
11:58 am
consumer data use, data security, and privacy. so more competition is good. i think the ftc using its competition authorities aggressively is terrific. it should also be advocating, i think, pro-competitive policies like data portability and interoperability. but i think we need to be mindful of the fact that we can't just rely on competition as a market force to correct for all of the problems that we are potentially seeing here. >> well, then there's perhaps congress. you've talked about what congress could do to help strengthen your agency; what might it do here to strengthen consumers' hand when we're talking about their privacy online? >> well, i think that congress could start really thinking about what laws it needs to pass in order to better protect consumer privacy. so one thing it could definitely do is stop passing laws that undermine it, like the repeal of the broadband privacy rule. so we could start there. and we could build on that by really taking a look again at a
11:59 am
number of ideas that -- i'm looking around the room -- a number of people here have been talking about for a while, which is comprehensive privacy legislation, but also real comprehensive data security legislation, cybersecurity legislation. i would argue again for more transparency and accountability for data brokers. obviously i've been talking about more resources and strengthening the ftc specifically as an agency. i think that's very important. i also think talking about some of the rights that consumers really deserve here -- rights to and control over your data, meaningful control, meaningful ways to port it around, meaningful interoperability -- are important conversations we need to be having. and that's the consumer protection angle. i do want to emphasize that one of the things that we're seeing play out in all of these stories about facebook and cambridge analytica has to do with bigger issues and bigger risks than
12:00 pm
just the consumer protection harms we're all concerned about. the potential for the technology to be used in disinformation campaigns, to undermine democratic institutions -- i mean, i'm personally very worried about the use of bots in filling the fcc's comment docket, you know, for the repeal of the open internet order. so manipulation of democratic institutions is a deeply harmful thing. that's going to require more responses than just addressing privacy and consumer protection issues. >> so there is this new comprehensive general data protection regulation that's about to come into force later this month in europe. does that strengthen, weaken -- how does that impact the argument for trying to get a comprehensive data protection bill done here in the u.s.? >> i really do hope it strengthens it, because if what, in fact, is happening is that the major technology companies are coming into compliance with
12:01 pm
gdpr and are complying across their platforms globally, offering the same choices to consumers in the u.s. as they are in europe, then it seems to me a lot of the opposition, on burden grounds, to codifying those rights for u.s. consumers is eliminated. so i think it could have the effect of making it easier for congress to really think about right-sizing consumer protection for the digital age. >> well, i do hope you're right about that. i appreciate you coming and taking the time to chat. >> thank you. >> we're going to invite the rest of the panel up right now. [ applause ] hey, gang. how are y'all? so i'm going to let everybody introduce themselves. we'll just go down the line. what i'd like is for everyone to
12:02 pm
introduce themselves and then briefly answer the same question that i posed to the commissioner: is this a tipping point, or is this an equifax moment? please. >> oh, starting with me. so i think this issue is going to have legs for a little while, subject to some other massive breaking-news issue that might come up, but i think it's got some legs. i think it's been in the works for a long time, because there have been privacy questions that we've all thought about for a long time. i think this will make a difference. the ftc investigation will be ongoing, and i hope a spotlight remains on how the company responds to this, whether its responses are sufficient to protect consumer interests and privacy and their data, but also any potential impacts their responses might have on competition. we can get into this later, but
12:03 pm
i think we need to insist on very important consumer protections, but i think we also need to be mindful of any unintended consequences or overcorrection that could inhibit some of the things we love about the internet, about openness and ensuring that competition remains vibrant and facebook doesn't use this as an opportunity to sort of shut down competition in the name of privacy and security. >> and who are you? >> oh, sorry. i'm caroline holland. i'm currently a mozilla tech policy fellow, working on competition in the digital ecosystem. >> okay, thank you. >> hi. my name is harlan yu, part of a nonprofit here that focuses on civil rights issues. i guess to answer your question, i hope so. we have no way of knowing, but it seems like, especially because it's facebook and because of the links that it has to, you know,
12:04 pm
political campaigns and political groups that people seem to have intense interest in, that the story may have more legs. it certainly seems like facebook, you know, given the public pressure they're under, is willing to make some changes, and hopefully positive changes, but i think that's going to require, you know, an ongoing conversation between advocates and the company to see what those changes actually are. >> hi, everyone. i'm michelle, director of privacy and data at the center for democracy and technology. thanks for having me. i think the answer lies in your framing of it being a tipping point. the reason i say that is because i think there have been little chips away at the public's trust in the internet and digital systems writ large, probably, it's fair to say. i think that has had this effect of sort of building to a crescendo. when it was tied to a political
12:05 pm
campaign, where there already is a great amount of -- let me think of the right word -- >> angst. >> angst, yes, yes. the fact there was a company that sort of portrays itself as being friendly to consumers and being for their use -- and in fact, that's the value proposition they are getting. you know, facebook says, here, we'll make this free because we want to connect you. i think all of those things crescendoed into this moment. so something will happen. my hope is it will be baseline privacy legislation in the united states, but i'm an optimist. i think regardless, facebook will have to face the music, will have to change its practices, and will have to become, at the very, very least, more transparent. >> i'm david, i teach law at georgetown law school. i'm the former director of the bureau of consumer protection at the ftc, and i was there when we did our investigation of facebook. i'm not sure this is a tipping point, but it will be a
12:06 pm
significant moment in a couple ways. one is it depends on what facebook says, and facebook has a lot to answer for. there are going to be public hearings next week. mark zuckerberg is going to testify. i think a lot depends on what path facebook decides to follow. i also think it's a tipping point in this sense: this is the first major breach of a consent decree the ftc entered with an internet giant. yes, there was a dust-up with google shortly after it entered a consent decree. but here, in my view, there are major issues about facebook's willingness to comply with the federal order. and i think one of the things that we're going to watch closely is what do zuckerberg and facebook say about that, and how do they respond to the ftc? if this investigation goes on until next year, this may be a tipping point, because it's going to force the agency's hand. >> in terms of facebook's
12:07 pm
response, you know, they've now announced a whole bunch of changes, trying to regain user trust. they'll be simplifying their privacy settings, although a lot of that was already planned for gdpr compliance. they're clarifying terms of service and privacy explanations, but not yet apparently talking about actually substantially changing any of those terms. they stopped working with offline data brokers to facilitate ad targeting, which is good. they closed a gaping privacy gap in their people-search function that was allowing mass scraping of the public parts of profiles. i'm wondering why they didn't fix that sooner. and just yesterday, they announced they're going to be severely narrowing what data is available to app developers over a range of their apis. it seems like they're doing anything and everything they can in the short term, at least short of significantly changing their business model or their terms of service, to tamp down concerns. so i guess the question, first to harlan, then to the rest of the panel, is: what else should facebook
12:08 pm
be doing now? and if you were in the war room at facebook at this moment, what would you be advocating for? >> so the cambridge analytica scandal raised two related but i think distinct issues. obviously we're going to be talking a lot about user privacy and the scope of sensitive user information that facebook makes available through its api to its app developers. but i think there's an equally vital conversation we need to be having that commissioner mcsweeney alluded to. as facebook starts to raise the walls of its walled garden, how is the public going to then scrutinize what's happening inside that walled garden? in particular, how facebook's business model is vulnerable to potential abuse and how the public finds out about and addresses some of those issues. so the cambridge analytica story
12:09 pm
got a lot of attention because of political campaigns. i think what the story did is intensify people's interest in the ways that facebook data and the facebook platform were potentially being used to manipulate both our elections and the political discourse in this country and elsewhere. so from that, you know, facebook has promised some amount of additional transparency, especially after its own internal investigation of russian interference. it recently said it wants to establish a new standard for ad transparency that would, quote, help everyone, especially political watchdog groups and reporters, keep advertisers accountable for who they say they are and what they say to different groups. so i just want to spend a few minutes talking not about the user privacy side but actually about ad targeting. so when i talk about ads here,
12:10 pm
i'm really talking about any message on facebook that's touched by money. anyone, whether you're a company or not, can spend money to boost your message to a certain segment of facebook's users. so you know, your mind probably gravitated towards consumer products: chase has a new credit card that it wants to target to certain consumers. but we're obviously also talking about political campaigns, political groups that want to do issue ads, and we're talking about, you know, perhaps other nation-states that want to intentionally spread misinformation or to exploit certain divisions, political and social divisions, in our society. but i'm also talking about ways that advertisers might be using facebook's ad targeting platform to prey on vulnerable consumers. you know, these are, for example, ads that are trying to find patients for fraudulent
12:11 pm
opioid rehab centers, or aggressive ads for for-profit colleges, or sketchy loan repayment schemes, or ads or ad campaigns that drive illegal discrimination, especially if we're talking about advertising in finance, in employment, and in housing. so these are the kind -- you know, among legitimate ad targeting, there's a range of abuses that are possible in the system. this goes to the core of facebook's business model, which is finding certain segments of facebook's users and targeting specific paid messages to those users. the main question that i have is: how is the public in the future going to be able to scrutinize this ad targeting behavior and address a lot of these abuses? what is facebook going to do to
12:12 pm
help address these issues? how are they going to be transparent about what's happening? so facebook has made a few small promises so far. it seems clear to us at upturn that there's a lot more that the company can do. i'll just go over four potential things facebook could do real quick. first, facebook has slowly started to make advertisements on its platform available to public scrutiny. they've been doing a pilot project in canada where, if you're a user and you go to an advertiser's page, you can actually see the list of ads that that advertiser is currently running. so in principle, all ads are visible. but it's also a very manual process. if you're a researcher, it makes it really difficult for you to even know the universe of all ads that a particular advertiser is running.
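to make the contrast with that manual process concrete, here is a minimal sketch of the kind of searchable ad-archive api harlan goes on to propose. everything in it -- the endpoint, parameters, and response fields -- is an assumption for discussion, not any real facebook interface.
    # a minimal sketch, assuming a hypothetical ad-archive service;
    # the endpoint and field names below are invented for discussion,
    # not facebook's actual api.
    import requests

    def search_ads(advertiser_id, keyword=None):
        """return every ad an advertiser is running, following pagination."""
        url = "https://ad-archive.example.org/v1/ads"   # hypothetical endpoint
        params = {"advertiser": advertiser_id, "q": keyword}
        ads = []
        while url:
            resp = requests.get(url, params=params)
            resp.raise_for_status()
            page = resp.json()
            ads.extend(page["ads"])    # each ad: text, run dates, spend, etc.
            url = page.get("next")     # next page url, or None when done
            params = None              # the next url already encodes the query
        return ads

    # a researcher could then enumerate an advertiser's full universe of ads:
    # ads = search_ads("advertiser-123", keyword="loan")
the point of the sketch is simply that pagination plus search turns the per-page browsing of the canada pilot into something researchers can actually audit at scale.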
12:13 pm
in addition, many advertisers have thousands of ads where they're testing different messages, and you have no idea what the scope and reach of those ads are. the first thing i think facebook could do is, in the same way they've built a very robust api for user data, also build a robust api with search functionality for advertisements that allows the public, researchers, and journalists to scrutinize ads more effectively. the second thing is that, you know, facebook has started to make enhanced transparency promises, especially around election ads. but that's a very narrow slice of the problem, in my opinion. they're looking at federal election ads that specifically mention a candidate, specifically mention an election, or are trying to get people to vote. but that ignores the broader range of abuses that i've described. so they should really be turning their attention not just to election ads but to all ads being run on facebook's
12:14 pm
platform. third, in order to have effective accountability, it's important not just to know what the text or the content of those individual ads is, but it's really important to know the scope and reach of those ads. so here i'm talking about, for example, exactly what the explicit targeting criteria were for the ad campaign. here's an ad: who is that advertiser trying to target? it gets a little complicated technically here, because some advertisers don't use explicit targeting criteria. they might use something like a custom audience, where they upload a list of existing voters or consumers and facebook uses its special sauce to find a lookalike audience of facebook users that have the same features. and so what facebook also needs to do, in addition to making the ads themselves more transparent, is to, you know, expose targeting criteria and also information about the audience that a particular ad actually reached, right? how many people, what are the
12:15 pm
demographics of that audience. and fourth, finally, to the extent that facebook is already doing enforcement, right, where they're doing their internal enforcement to take down bad ads, they should really disclose to the public a detailed accounting of all the bad ads they're taking down and for what reasons they're taking those ads down. so those are the kinds of steps that i think would provide real transparency and real accountability. these are, i think, the kinds of steps that will help raise the public's trust, because it's not just facebook telling us that they're doing these things to try to stop these nefarious behaviors on the platform, but actually letting the public scrutinize this and verify that this is actually the case. so i'll just put in a plug for a report that the upturn team is releasing in the next couple weeks that hopefully will serve as a public and advocate's guide to ad transparency and
12:16 pm
what we should be pushing facebook for. >> thanks. look forward to reading that. it does seem like we're building a manipulation layer into the internet that we don't fully understand and can't control. greater transparency would be helpful in grappling with that. in the interest of transparency, this reminds me as well, i should make sure to disclose that facebook has provided and does provide financial support for some of oti's work. although clearly we disagree on some things, we also are aligned on some important issues for internet openness and security, including especially the encryption debate. so in terms of my wish list, i would add greater transparency in the sense of appearing in venues like this. we did invite facebook, but they are busy. but i will say, you know, they've done a lot of -- a lot more press calls than usual lately, on the record and releasing transcripts and stuff like that. obviously they are testifying to congress, though i don't really feel they had a whole lot of
12:17 pm
choice in that. seeing more public engagement would be great. as far as other people's wish lists, if they were in the war room, any other things, or shall we move on to another question? >> let me sort of say what i think they should talk about doing. i mean, one of the real problems the cambridge analytica debacle sort of shows is how little control facebook was exercising over third parties, particularly app developers. facebook has recently acknowledged that they don't really have contracts with third-party app developers. they don't have any remedies in case there is, you know, deliberate overharvesting or sharing. they obviously did no due diligence on any of the third-party app developers. so when they come -- when it comes to how do you solve the cambridge analytica problem, part of it is there has to be much greater oversight, control,
12:18 pm
and auditing of third-party developers. and yet, this whole episode has proven that that -- and that was one of the aims of the consent decree. one part of the consent decree required facebook to identify threats to privacy and to plug those threats. and since third-party app developers, you know, have access to this data, that was an obvious vector for privacy violations. and so one would have expected that there would have been some controls placed on app developers. yet, as this debacle unfolds, it becomes more clear that there really were none. so just go down the list. some kind of due diligence about who's getting access to the data. some kind of contractual lock-ups that give them power to require audits, oversight, some
12:19 pm
sort of certification of non-overcollection or non-sharing. audits done by facebook or an outside party to ensure that there was compliance. and you know, we're weeks or months into this, and facebook cannot assure us that the cambridge analytica data isn't still floating around, or that cambridge analytica or the researcher have actually destroyed it. i teach law school. we teach students how to enforce these kinds of promises. and it doesn't appear that facebook has any remedies at all that are effective to basically discipline third-party apps that have very broad access to consumer data. so i could go on, but what i'd like to hear from facebook is: what are we going to do to control this? because, as yogi's famous line goes, this is déjà vu all over again.
12:20 pm
we saw all of these problems back in 2011. the consent decree was designed to avoid exactly the cambridge analytica problem. so one of the things i'd like to see is facebook come before the senate, as zuckerberg is going to do next week, and come in with a real list of things that are going to control this part of the problem. i realize -- i agree with harlan, there are lots of other problems. but in terms of providing minimal safeguards for consumer information, those are some of the things that i want to see facebook talk about. >> now that you bring it up, the consent decree that you helped negotiate in 2011, can you talk a little bit about it -- because you seemed to indicate earlier that you do believe it's been violated. >> oh, yes. >> so i'd love to hear, first, if you could enlighten the audience a little more about what that was and what that was about, and then what you expect or want to see from the ftc in regard to it now. >> so again, this goes back to,
12:21 pm
in part, third-party access. in november and december of 2009, facebook made two changes to its privacy settings that pushed a lot of private information to be public and also gave third-party apps access to information they were not supposed to have. and one of the things that's ironic about the ftc's complaint is the ftc said part of the deception was allowing third parties to get access to how people exercise their political views without their consent. and so this is why i think this is -- i've seen this movie before, and it didn't end well. so one of the things the ftc did was try to rein in third-party collection. if you look at the consent decree, it draws a bright line between users, who are people who actually post things, and third parties, who actually harvest things. the goal was to limit
12:22 pm
third-party access unless there's clear notice and clear consent. now, facebook's going to say, well, the settings they had allowed sharing, mass sharing, if you used the friends-of-friends setting. on the other hand, the question the ftc was asking is: what are consumers' reasonable expectations about what that means? so one question to ask zuckerberg next week at his hearings is: do you really believe any of the friends thought that something like cambridge analytica was going to happen to them? were your notices back then, back in 2013 or whenever this happened, 2014, clear about that? i've looked at those notices. i don't think they meet that test at all. but that will be part of the ftc's inquiry, both in terms of whether the consent decree was violated and whether there were fresh violations of section five. so i think that's part of my concern.
12:23 pm
the other part is that one section is devoted basically to forcing facebook to look at vulnerabilities -- where is consumer privacy in jeopardy -- and to plugging those holes. that was designed really to respond to downloading by third-party apps. it is quite clear in the aftermath of cambridge analytica that facebook paid no attention to that part of the consent decree, because there are no controls on third-party downloading, there's no remedy for harvesting data that you haven't -- that you have no consent for, or for sharing that data with third parties, which is why cambridge analytica is such a scandal. this is months and months -- this is two or three years, maybe four years, since facebook has known about this problem.
12:24 pm
yet, it still has done nothing to fix it. so a lot of the things facebook has announced, the new platform policies mark zuckerberg has talked about -- we've heard all this stuff before. the question is, is facebook really serious about moving forward now. >> on the subject of the consent decree, i want to jump back into what facebook would argue, i expect. i think they would say, when y'all negotiated this settlement in 2011, comparing that to what was going on in 2014 -- they would basically say, when we negotiated that settlement, this is how graph 1.0 worked. these are the disclosures that were made to the users. these are the settings that they had. what changed between 2011 and 2014 to make that not okay anymore? because, you know, their position is, we didn't -- our product was working the way it was designed. >> right, but the consent decree was to avoid problems with
12:25 pm
people like that. section four was looking at vulnerabilities; that's the key provision. in my view, facebook did not pay any attention to that. so the real question is, look, the question for facebook users is: in 2013, or any time after the consent decree was entered into, would friends of friends understand the scope of the harvesting of their data? that's the question for the 57 million, 87 million, you know, facebook users who had cambridge analytica take their data. >> what happens if the answer is no? >> so i think in terms of ftc enforcement, i don't think it really matters. i don't think facebook has any
12:26 pm
argument this isn't a violation of section five. section five turns on what consumers reasonably expected. i also think they don't have a defense to my view, which is that they violated multiple provisions of the consent decree. what turns on that is, if there's a violation of the consent decree, there's going to be a very substantial civil penalty. at the time we did the google case, the civil penalty statute provided for $16,000 per violation. the ftc has always considered a violation to be harm to an individual consumer. so now if you measure -- if you multiply $40,000 times 87 million, only harlan would be able to figure out what the answer to that would be. >> the jump from 16 to 40, that has changed? >> it's statutory. so you're talking about an astronomical civil penalty. obviously that would not be the starting point for the agency.
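for scale, the arithmetic he's gesturing at works out to trillions of dollars as a statutory ceiling, not a predicted fine; a quick sketch:
    # back-of-the-envelope only: the statutory maximum, not a likely fine.
    penalty_per_violation = 40_000    # dollars per violation, current cap
    affected_users = 87_000_000       # users whose data reached cambridge analytica
    print(penalty_per_violation * affected_users)
    # 3480000000000 -> $3.48 trillion as the theoretical ceiling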
12:27 pm
but i think there's likely to be a very substantial civil penalty in this case. >> so we're talking about what the ftc might do. there's also the question of what congress might do or should do. to start that conversation, i'll move over to michelle. >> so congress, what should you do? i think yelling at mark zuckerberg is a start. it's not necessarily going to make change, though. and at some point, i want to go back to consent decrees. i think this illustrates where there are some vulnerabilities and weaknesses in the consent decrees themselves. perhaps congress could, in maybe a more discrete way, more discrete than baseline privacy legislation, make fixes to give them more teeth. for example, in the google case, i think the fine was $24 million, which was like half a day's profit. >> but it was the most we could get. >> i know. this is my point. so to make it really matter and actually have some heft when the
12:28 pm
ftc levies a fine. what congress should not do is remake the gdpr. you might know this is not exactly in line with a lot of what advocates are saying. i think the gdpr is fantastic. i think it's coming into force, and it's forced companies to incorporate a lot of user rights into their platforms, products, and services. but i don't think we need to duplicate it, for that reason. that's not to say there aren't elements of the gdpr that are instructive and could inform baseline privacy law. but it shouldn't replicate the failure of consent. what we should be looking at is expectation. so i think the way that congress should imbue a baseline privacy law is with the idea of what a person's expectations are, as defined by what kind of user agency they have, what sort of transparency is available, and what sort of accountability
12:29 pm
is attached to those things. so, sort of looking at the idea that the way we interact with these platforms is obscured. in other words, you can't really give meaningful consent for the most part, and everybody knows this. you don't see the hundreds of eyes looking at you as you post something on facebook. all of this is by design. the point is to push to the forefront the idea that if you're going to imbue expectations into your platform, and you're going to use these values of agency, transparency, and accountability, some of those changes have to come from the design side. so, creating some design standards. now, this isn't to create some paternalistic law that says you have to have fonts this size and that kind of thing. it's more about what sorts of interactions will make clear to a person what the true value proposition is. right now, you know, a person that i know put it very well and said, when the companies leverage your data a hundred times, that's like a price increase you don't know about. you get nothing out of that except the free service.
12:30 pm
but that, i think, is ringing hollow now. so to the extent there are ways to do this -- and there are design principles that allow for more transparency, more accountability, more agency, not just in the gdpr -- those need to be imbued into a law. i think also the idea of -- and just going back to accountability, because i think that's so crucial -- the idea of making public disclosures, drawing on some of the other laws that exist. for example, making ceos certify public disclosures on a quarterly basis. it forces the ceos to have skin in the game. i think other areas are auditing requirements, making the companies do data impact assessments. those things are doable. they're not easy technically, for sure. there are challenges. but i think that they are discrete ways that would make a huge difference in americans' privacy protections. so i think that's -- just that.
12:31 pm
>> what you described at the front end is not a modest proposal. >> no, no, it's not. >> i'll admit, i share some skepticism about the value and political viability of comprehensive baseline privacy in the u.s. at this stage. >> sure. >> considering there was a much worked on -- a lot of smart people focused on it, a proposal from the obama white house that wasn't good enough for either side of the debate. i'm not sure if the calculus has changed that much. but it does raise the question of if not that, if not some sort of gdpr light for the u.s., what? you've mentioned a couple of neat targeted things, the ceo certification idea and impact assessments. what other more targeted -- i mean, there are staff who are wondering, what can i write right now that my boss could introduce and look impactful on this issue? >> i may have talked to some of those. >> or what can we do that's
12:32 pm
really strong that will strike fear into the heart of facebook and other companies and perhaps impact their behavior? what should we be doing? >> to be frank, you know, striking fear in their hearts is not really what i'm concerned with. to me, my eye is on the ball of how we get protections for people. of course, with protection should come accountability. i think strengthening consent decrees would be great. it would strengthen the ftc, which, as we heard, and as probably everyone in this room knows, truly needs more resources, more tools to be able to do its job. more public transparency around consent decrees, more significant penalties for violations. actually, just a plug: something we're coming out with has very specific recommendations on consent decrees. hopefully that's something that could get bipartisan support. i think in some ways you can look at the republicans' reform playbook and use some of the
12:33 pm
ideas of good governance to get something like that to be more palatable. i think also, another area, and david has touched on this, is the idea of data access by researchers. it's something that i feel like has been sort of avoided, and it's partly because it's a very tricky subject. we don't want to shut down innovation. we don't want to shut down open access. that's what the internet is built on. but there are ways to create obligations for researchers that don't exist right now. for example, if you're a federally funded academic researcher, you know, you follow the common rule, which means there are ethical guidelines and you go through an institutional review board. if anybody has ever gone through them, they're fairly worthless, okay. not for any reason of the people who sit on them, but they don't ask for things like terms-of-service review. and almost all the platforms' terms of service say, by the way, that you shouldn't do a lot of what researchers do. but it's
12:34 pm
imperative for there to be a review of that, some accountability for the researcher. there should be -- somebody mentioned certifications for the researchers, so they're held to some obligations not just for what they intend to do but for how they're protecting the privacy and security of that data. facebook's data sharing agreement was very light on details and very light on accountability. i'm not sure that wasn't by design. i think the idea is, let the data go, and then we don't have liability. we don't want to create liability. the other aspect would be creating a chain of command of liability in this ecosystem, which again would not be easy, but i think at the end of the day, you start with the platform or the product or service that is creating the risk for the user -- the benefit and the risk. then you go down the line and you decide and assign what the rules are and what the liabilities are. i think those can be chunked off in small ways. maybe consent decrees are a part of that. maybe the certifications are another part of that. >> other ideas?
12:35 pm
i'm guessing, david, you're all for strengthening the ftc. >> yes, i'm happy to repeat everything said before. she's absolutely spot on. i would say this: to the extent there may be smaller pieces of the privacy issue that congress might tackle, one is data brokers. this would get at many but not all of the problems. i think given some of the breaches that we've had and some of the problems with large data brokers, there needs to be something like the fair credit reporting act, but much stronger, for data brokers. the fact is, people are worried about what the nsa knows about them, but acxiom knows way more, and so do facebook and lots of other companies. yet we don't have any effective regulatory tools for any of them. these are massive data pools. to the extent that they get
12:36 pm
merged, you have massive data seas. that's where the real risks to consumers lie. >> going back to this, back to the platforms and away from the brokers: i'm somewhat skeptical that we'll see strong, specific use restrictions in law, or requirements about consent, but i think the possibility of much stronger transparency requirements is definitely in the offing. i'm certain they've been talking about that in the context of ads. i expect they are talking about it in the context of who actually gets your data. but there's also the question of political viability and timing and whatnot. for example, i have a crazy idea, which is we already have -- there's a single law that is the strongest privacy law in the united states. it's called the video privacy protection act. it protects records about what you watched or rented at a video store, and now protects what you watch on netflix. that got passed after a
12:37 pm
supreme court nominee's video rental records were obtained by journalists and congress freaked out that the same thing might happen to them. hence, the strongest privacy law ever. i don't see any principled reason why that shouldn't extend to the content i interact with online. i shared this idea with a staffer, who was like, sure, but that's in the criminal code, and that means it would go through judiciary, and nothing is going to go through judiciary. what is actually possible right now? what's the timing? clearly they're not going to pass anything this year. they're basically already done because of the election. but what seeds do we need to plant now, and where might we best plant them? >> i mean, i'll just say briefly and let others speak, but what happens in the fall is huge. it will decide if the democrats retake the gavel, retake
12:38 pm
congress. then the possibilities are much greater for a baseline privacy law or any kind of updates to privacy in general. the jurisdictional question is funny, because privacy is notorious for being in a hundred different committees, or at least for people believing it should be in a hundred different committees, especially when you have a high-profile case like this. they're all scrambling over how they can fit it into agriculture and all kinds of crazy stuff. great, fine. this is the way our democracy bumbles along. first of all, i think we should provide staffers with the correct facts about what happened. that's something i've noticed -- even excellent journalists using words like scraping and access interchangeably. those aren't the same thing at all. so making sure that we, as advocates, create a fact-based record and that they have it. then offering different committees different solutions. i think that is up to groups like cdt and other groups to
12:39 pm
really work hard on, making sure the committees have information about what they could do next. i also think it's important to bring in republicans who are interested in this issue. you know, they hadn't said much for a little while, but now they are. i think that's a really important development that can get ignored, especially in d.c. for both parties, this is a truly bipartisan issue, or should be. >> moving on to caroline and the issue of competition, which is your expertise: what role does competition law, anti-trust law, play in addressing a situation like this, if any? >> so i think one of the questions we've heard after this is, wait a minute, what if we had more competition? things would be better. we've been seeing for months a lot of headlines talking about the power of big tech, the concentration, the consolidation
12:40 pm
that has occurred. can the anti-trust laws do something about this? and the follow-on is, maybe if there were more competition, things wouldn't be so bad. i'm here to say anti-trust can play a limited role in some of these bigger market structure questions. it's probably not going to be so good at addressing the consumer privacy questions. yes, if we had multiple social platforms, in an ideal world you might see competition on the basis of privacy. the trick about social platforms is we really don't want to have to go to six different social platforms and find all of our friends on each one. the value in this kind of service is that the more people who are on it, the more valuable it becomes. so competition is not necessarily something that consumers really would want in
12:41 pm
the real world. you might want options. i'll talk about ideas for promoting competition, but i want to take a second to talk about the limits of anti-trust law. anti-trust law is a law enforcement function that prohibits firms with substantial market power -- a monopoly, if you will -- from taking actions that harm the competitive process, and it can also stop mergers that would lead to a lessening of competition. this can have some positive impact on how these platforms compete and the actions that they take. another important tool besides the anti-trust laws -- unless you consider it one of the anti-trust laws -- is the ftc's section five authority. section five, i believe, has
12:42 pm
more reach than the anti-trust laws alone. section five prohibits unfair methods of competition. so query how the agency could use that authority: can it find actions that facebook is taking that violate this principle of unfair competition? those are somewhat limited tools. they might be able to nibble around the edges and get at bad conduct. one of the things i'm thinking about is how to promote more competition. how do we put competitive pressure on a company like facebook? is that through better data portability -- creating a meaningful way to port your data, in machine-readable form, to some other application or service that could build some sort of networking service consumers want to go to? this goes into some of the questions i'm now exploring and trying to think about. i think i still have more questions than answers on this.
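to make the machine-readable portability idea concrete, here is a minimal sketch of what a portable export might look like. the schema, field names, and data are hypothetical, invented purely for illustration; no platform's actual export format is implied.

```python
import json

# Hypothetical schema for a portable, machine-readable profile export.
# Every field name here is invented for illustration; it is not any
# platform's real format.
def build_portable_export(profile, posts, friends):
    """Bundle a user's data into a document a competing service
    could, in principle, ingest."""
    return {
        "format_version": "0.1",
        "profile": {
            "display_name": profile["name"],
            "email": profile["email"],
        },
        "posts": [{"created_at": p["ts"], "text": p["text"]} for p in posts],
        # the friends list is the valuable part -- and the fraught part,
        # since each entry identifies someone who never agreed to the move
        "friends": [
            {"display_name": f["name"], "contact_hint": f.get("email")}
            for f in friends
        ],
    }

export = build_portable_export(
    {"name": "alice", "email": "alice@example.com"},
    [{"ts": "2018-04-01T12:00:00Z", "text": "hello"}],
    [{"name": "bob", "email": "bob@example.com"}],
)
print(json.dumps(export, indent=2))  # json keeps the export machine-readable
```

the hypothetical contact_hint field is exactly the re-identifier the panel returns to below: without it, a rival service cannot reconnect the social graph; with it, the friends' privacy, not just the user's, is implicated.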
12:43 pm
among them: the importance of data portability, interoperability, and the use of apis. i had been thinking about how more open apis help promote competition, promote innovation, and ensure the openness of the internet that we all want and that we get a lot of innovation from. then cambridge analytica happened. i was like, oh, shoot, i need to step back and think about what the right concept behind apis is. is it more responsible use of apis? and how do these feed into the digital ecosystem? so one of the warnings i want to put out there -- something i think we should all be looking for, and i think the ftc is well tooled to do this because it has the office of technology and the competition bureau -- is to watch facebook as it starts to review its apis and its api policies. and again, i want to be careful to
12:44 pm
say i think there are a lot of things they should do to protect privacy, to protect security, to make sure people can't willy-nilly get access to all our consumer data. but at the same time, make sure there isn't an overcorrection that cuts off data that helps developers come up with exciting new programs and apps that people want to use on facebook -- apps that could ultimately compete with facebook. and facebook could have the incentive and the ability, using its apis, to say, you know, let's not let this developer access data, because i really am worried that's a threat to us. let's shut that down. >> let's pause on this. this is something a number of us have been thinking about, including the commissioner. you can look at some tweets about it, if you like. let's distinguish a few things. there's portability, which is
12:45 pm
getting your data out. facebook does right now have a tool for getting your timeline out, but it's built expressly for you to browse and not really suitable for uploading into another service -- let's assume there are such services out there. >> and they built it for that reason. >> yeah, they built it for you to have your data but not for you to take it somewhere else. gdpr is going to require some level of portability, including machine readability, so that you can move it somewhere else. it remains to be seen how people are going to implement that, and that'll be really interesting. but then there's also the issue of how we come up with an environment where something can actually get big enough that you'll even want to move your data to it. that's where you get into interoperability. one of the more plausible -- or maybe the only plausible -- paths for a network to get big enough at this point, now that facebook has already bought the two networks that were
12:46 pm
getting big enough to compete, is to be able to leave facebook while still being able to communicate with people on facebook. what facebook is selling right now is not its platform, it's the people on the platform. if no one feels like they can leave because everyone is there, then #deletefacebook means nothing and there's not a lot of consumer pressure on them to change. but how do we do that? what are the tools, other than consumer outrage or bully pulpiting, to push facebook in the direction of what it probably considers its most mortal threat, which is building doors into its walled garden? >> i would say anti-trust is a tough tool for that. you'd have to find that they have substantial market power, which maybe you could find, and that they're doing things that would warrant that sort of remedy -- to say, look, you need to make this open. the solution is more likely legislative, or, if we
12:47 pm
were to give the ftc more authority, rulemaking. it's not just helpful to have my pictures and posts. the most valuable piece is, who are my friends? when you think about data portability, you also still have to think about privacy. my friends are my friends, but if i'm going to port that list over to some other service, are my friends okay with getting pinged by that new service to say, hey, come join? so i think there are a lot of questions about what meaningful data portability to another service actually means. the devil is really going to be in the details. i think it would be great to hear from developers and others who are thinking about the next great social platform they want to create: what would they need to be able to meaningfully port this data, or meaningfully get this data, capture this data, and then build a service off of that? >> david? >> well, what you're really talking about is forcing
12:48 pm
facebook to become a common carrier. that's fraught with all sorts of, you know, subsidiary problems. so i'm all in favor of interoperability, but i don't know how you force it, except through legislation. >> there is some precedent with aim: when aol bought time warner, there was a merger condition requiring aol to make its messenger interoperable with its biggest competitor. at this point, it's a question whether facebook would put itself in a position where it would have to accede to such a demand. >> when would the authorities have that type of hammer? >> maybe it's small parts of facebook, like messenger. maybe that's not that small, but looking at specific communication aspects, like the plug-in or something. >> it seems to me that there's also a core tension between privacy and
12:49 pm
interoperability. take yesterday's announcement closing off more parts of the api: privacy advocates will cheer, but competition advocates will not. so getting both at the same time will probably have to do with legal and policy solutions rather than any, you know, tweaking of the knob in terms of how much data the api exposes. >> and just to add, i think this is going to require a lot of study. i think it will be helpful to have research and empirical data. to respond a little to what david said about a common carrier: we really do need to think, is that what you want to do? because you need to keep in mind that we want to ensure the incentives to build the next great whatever -- platform, or whatever the next facebook is going to be -- and if you create some sort of policy that says, well,
12:50 pm
as soon as you get big and have lots of people, then you need to open everything up, might that -- and i don't know -- chill innovation? maybe the flip side is, well, that's just going to force everyone in these services to be the best they can, and competition is good. you want them to be able to say, everything's open, but you're going to want to come with me because i have the best privacy policy, ad targeting, whatever it is i provide that's great. so an important consideration is going to be what trade-offs there might be between getting certain policies we think could benefit consumers and the impact those will have on innovation. >> well, as that weighing occurs, i'll admit that my greatest fear right now is that this is going to push not only the decisionmakers at facebook but also the rest of the industry toward, let's just close it all the way down, like it's not worth trying to play in that field. and indeed, i worry that to some extent there's a br'er
12:51 pm
rabbit in the briar patch dynamic, where they go, please, don't make us lock down our apis more, when ultimately that would serve their competitive position. >> there were ideas around data collaboratives that i think could be interesting for people to look at -- the idea that maybe part of some of the big platforms' data could be put into data collaboratives or trusted intermediaries, where it could be accessed by researchers. i'm not exactly sure how policy would deal with this, but maybe there are ways to segment some of it so that it doesn't create the problem, at least for certain uses like research. >> yeah, well, i think there's also a question of what's the minimum amount of viable data you really need -- >> that's a really good question. >> in terms of the social network, all these apis aren't necessarily relevant. it's more, can i get my friends list out along with a piece of data for each of them that will allow me to reidentify them? >> right. >> but if the most likely piece of data there is a phone number or e-mail address, then you're into a privacy problem,
12:52 pm
and they can say, well, gdpr won't let us do that, or that might be violating what -- >> well, gdpr is not going to apply to american data, so -- >> yeah, in europe, though. >> just saying. oh, okay. >> but a lot of rich discussion there. not to overpromise too much, but i do believe that oti will be doing an event on interoperability and portability within the next few months, knock on wood. but in the meantime, we have a few more minutes for some questions, and there are hands shooting up. there's a mike going around. please don't speak until you have the mike, for the benefit of folks who are watching on c-span or online. >> hi. amy stepanovich from access now. david, you brought up the consent decree and spoke about it, and michelle brought up the idea of audits, but audits were in the consent decree, and facebook would have had to go through at least one, probably two, between then
12:53 pm
and now, and it clearly didn't do anything to push this forward. is an audit actually something we should be pushing for in law? was it effective in the consent decree, and if it wasn't effective, which it seems not to be, how can you make them better? >> the word audit is a word of many meanings. what the consent decree requires are biannual filings by a third party to essentially make sure facebook is adhering to the commitments it made in the consent decree. so there have been at least two, maybe three of them submitted to the ftc since the consent decree was filed. they're not public. i haven't seen one in several years. but that's not really what i'm talking about when i talk about audits. i'm talking about facebook trying to make sure that third-party apps stick to whatever commitments they've made in terms of what they're going to download and not share
12:54 pm
with third parties. so one of the real problems with cambridge analytica is facebook didn't really know what kogan was downloading, nor did facebook have any means of making sure that data wasn't shared with third parties or sold. there was no control and there was no audit. and so what i'm talking about is ways of overseeing third parties who have access to data. there need to be both contractual lock-ups, which don't really exist, and some way of making sure, after the fact, that the third parties behaved as they promised. at the moment, facebook has simply been depending on wishful thinking -- that third-party apps are going to do whatever the terms of service provide. but the cambridge analytica debacle has shown there really are no controls. so when i use the word audit,
12:55 pm
that's what i was referring to. >> and from what i understand, these are really assessments, which is different. >> yes. >> so you have a private assessor who's sort of cataloging benchmarks about the company. and there are ways for companies to game this. they will change their practices right before, for example, to look better for the assessor. obviously, none of this is public, but they will also not list material changes -- things that would be relevant to a consent decree -- that the assessor doesn't have access to, or that the company isn't disclosing because of the timing of the assessment. so a formal audit would require the results to be public, would require the company to sign off on material changes to its policies and practices before and after the audit happened, and would create a sense of accountability tied to the consent decree, which hopefully would carry more fining
12:56 pm
authority. >> it's not at all clear that facebook reported what happened with kogan and cambridge analytica to the ftc, even though i think it would have been required in one of the biannual reports. >> my name is courtney radsch, at the committee to protect journalists. i think it's interesting, because in several conversations around the countering-extremism debate and the fake news debate, one of the big concerns is the lack of access to data by researchers. so it seems like there's this tension between access to all this personal data and, you know, facebook and other companies not turning over lists of content that they've removed or censored, et cetera. so i think we should be careful about making broad brush strokes about institutional review boards. at least they have to go through something. definitely, they need to be more technologically adept. but how can we balance the need
12:57 pm
for more oversight and auditing -- both by researchers, but also potentially of the platforms as they decide what content is illegal or not outside of a judicial process -- with the need to protect private data? >> i can take that first. at least in terms of the recommendations that i was talking about -- and it is true, especially when we're talking about countering violent extremism, that how to balance privacy and transparency is a really tough problem. the recommendations i was talking about apply to ads, and facebook already considers ads to be all public, right, with the pilot that they're already trying. you know, any post that money has been spent to boost is already considered public and not private information. and so, at least in terms of a lot of -- i mean, it's not going
12:58 pm
to get at all of the problems, but at least for a large fraction of the problems, beyond just elections, more transparency about ads will probably do a lot of good without really risking individual privacy that much. >> i'll just add -- it's not an answer to your question -- i think your question is one that we all need to be asking, and it's relevant to the portability question and interoperability question as well: how do we balance the need for privacy with all the other things we need? you have tom wheeler, the former fcc chairman, writing in "the new york times" about how we need more open apis so we can do research on how problematic content is spreading on these networks. >> right. >> yet, at the same time, we have a big push to close them down because of privacy. to be fair to facebook, i'd be like, what the hell do you want us to do? >> but i think you can look to some things that exist, right? maybe this is where fips makes sense -- >> where what? >> sorry, fips. fips makes sense -- >> for the folks in the audience. >> fair information practice
12:59 pm
principles. so it's just a data governance framework, and most privacy advocates are very well versed in fips. it's sort of outdated in some ways but offers a really good way to think about how to govern data well. so, for example, when i say obligations on researchers, that doesn't mean restrictions in terms of whether this is a good project or not, per se. it's about privacy and security restrictions and requirements -- maybe limiting the amount of data, the scope of the data, the reason you're collecting the data, right? those types of requirements, which really don't exist right now. that, i think, would be a step forward.
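for a sense of how fips-style scope and purpose limits could look in practice, here is a rough sketch of a data-access gate that checks a researcher's request against a declared purpose and an approved field list. the grant structure, names, and limits are all hypothetical, invented for illustration.

```python
# A hypothetical purpose-limitation gate in the spirit of the fair
# information practice principles: a data request is served only if it
# matches a declared purpose and stays inside an approved field scope.
APPROVED_GRANTS = {
    "survey-study-2018": {          # grant id -- invented for illustration
        "purpose": "academic_research",
        "allowed_fields": {"age_bracket", "region", "responses"},
        "max_records": 10_000,
    },
}

def check_request(grant_id, purpose, fields, n_records):
    """Return (allowed, reason) for a proposed data pull."""
    grant = APPROVED_GRANTS.get(grant_id)
    if grant is None:
        return False, "no such grant"
    if purpose != grant["purpose"]:
        return False, "purpose does not match the declared one"
    extra = set(fields) - grant["allowed_fields"]
    if extra:
        return False, f"fields outside approved scope: {sorted(extra)}"
    if n_records > grant["max_records"]:
        return False, "request exceeds approved record count"
    return True, "ok"

ok, why = check_request(
    "survey-study-2018", "academic_research",
    ["age_bracket", "friend_emails"], 500,
)
print(ok, why)  # False -- friend_emails is outside the approved scope
```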
1:00 pm
>> if we're fast, we have room for two more questions, and this gentleman's been raising his hand. >> thank you. my name's david troy. i'm a researcher who's been looking into this cambridge analytica situation for over a year, so i'm pretty familiar with the internals of it. one of the things that occurred to me that you might want to consider, as part of your advocacy around the consent decree and how to possibly mitigate the situation, is kind of a technical solution: for facebook to internally deprecate all of the user i.d. numbers that have been exposed so far. because clearly, one of the problems here is a horse-left-the-barn-in-2015 issue, and i think that while deprecating those user i.d. numbers -- so that you simply couldn't target them anymore -- would not totally solve the problem, it would mitigate it in large part. >> that's a really interesting comment. >> that was my thinking -- just something that occurred to me, in case you hadn't been thinking about it. and it's something facebook won't like very much, so it will have a punitive aspect for them. >> right. >> but it won't crush them, and it also won't fundamentally disrupt legitimate, ongoing activity from people who are obeying the rules. >> thank you very much. >> yeah, sure. i'd be happy to talk to you more about that. >> that'd be great.
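as a rough sketch of what the questioner's deprecation idea might look like mechanically -- a deny-list of exposed ids consulted before any targeting or lookup -- consider the following. every name, id, and function here is hypothetical, not a real platform api.

```python
# Sketch of the deprecation idea: ids known to have been exposed are
# retired, so targeting or api lookups against them simply fail.
EXPOSED_IDS = {10001, 10002, 10003}  # ids presumed leaked via the old api

class DeprecatedIdError(Exception):
    pass

def resolve_target(user_id):
    """Refuse to resolve any user id on the exposed list."""
    if user_id in EXPOSED_IDS:
        raise DeprecatedIdError(f"user id {user_id} has been retired")
    return {"user_id": user_id}  # a normal lookup would happen here

def build_audience(candidate_ids):
    """Drop retired ids so stale data sets can no longer target them."""
    audience = []
    for uid in candidate_ids:
        try:
            audience.append(resolve_target(uid))
        except DeprecatedIdError:
            continue  # exposed id: silently excluded from targeting
    return audience

print(len(build_audience([10001, 20002, 20003])))  # 2 -- the exposed id is dropped
```

as the questioner notes, this would not claw back data already taken; it would only blunt the ongoing usefulness of the exposed identifiers.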
1:01 pm
>> one more question. >> my name is bob gelman. i'm a privacy consultant here in washington, d.c. i have a simple suggestion, thought, whatever. what if, as a result of whatever ftc investigation comes out, the ftc were to require facebook to divest instagram or maybe some other application? that would create competition, solve some of the problems, and of course there could be other requirements, but it would be something facebook would see as punitive, along the lines of the last questioner's idea. >> could the ftc do that? >> you'd have to ask terrell about that. when i was at the ftc back in the dark ages, i don't know whether the commission would have thought of a remedy like that for a deceptive act, as opposed to one that impeded competition. and i suspect the agency would still have problems doing that now. but, you know, it's an interesting idea. bob always has interesting ideas. it's yet another interesting idea from bob that i think people should think about.
1:02 pm
you know, the remedies that the agency has are basically equitable remedies, and we do at times force people to do all sorts of things they don't want to do, like occupational bans. so this is an intriguing idea. >> well, there will be plenty, i'm sure, of other intriguing developments ongoing, including the testimony next week, which i hope everyone enjoys or finds interesting, and which should give us all kinds of new things to chew on. thank you, everybody, for coming, and thank you to the panelists for the interesting conversation.
1:03 pm
facebook's ceo mark zuckerberg is on capitol hill today to answer questions about the way facebook has handled users' data. yesterday he did the rounds of senators' offices. this afternoon he'll be the sole witness at a joint hearing of the senate judiciary and commerce committees. we'll have live coverage at 2:15 p.m. eastern. this morning we talked to a tech reporter about what to expect from today's hearing. >> here to tell us about that hearing that will take place on the senate side today with mark zuckerberg is tony romm, joining us on the phone from the "washington post." he is their tech policy reporter. good morning to you. >> hey, good morning. >> give us a sense of what's going to play out today during this hearing. >> yeah, this is the most serious political test that facebook ceo mark zuckerberg has probably ever faced, and it's his first ever testimony on capitol hill. he'll face off with more than 40 senators -- that's almost half the entire chamber -- who have lots of questions for him about cambridge analytica, t